It's based on version 0.0.4 of http://hyunsik.github.io/hdfs-rs to provide the libhdfs binding library and Rust APIs which safely wrap the libhdfs binding APIs.

The compiled libhdfs is based on version 3.3.1 of the hadoop repository. For Rust usage, a few changes are also applied.

Add this to your Cargo.toml:
```toml
[dependencies]
fs-hdfs = "0.1.3"
```
Firstly, we need to add the library path for the JVM-related dependencies. An example for macOS:

```sh
export DYLD_LIBRARY_PATH=$JAVA_HOME/jre/lib/server
```

Here, `$JAVA_HOME` needs to be specified and exported.
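On Linux, the dynamic linker uses a different variable, and the location of the JVM shared library varies by JDK version and layout. A sketch, assuming a modern JDK where `libjvm.so` lives under `$JAVA_HOME/lib/server` (older JDK 8 layouts often use `$JAVA_HOME/jre/lib/amd64/server` instead):

```sh
# Linux equivalent: the JVM shared library is located via LD_LIBRARY_PATH.
# Adjust the subdirectory to match your JDK's actual layout.
export LD_LIBRARY_PATH=$JAVA_HOME/lib/server
```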
Since our compiled libhdfs is a JNI native implementation, it requires the proper `CLASSPATH`. An example:

```sh
export CLASSPATH=$CLASSPATH:`hadoop classpath`
```
The tests also require the `CLASSPATH`. In case the java class `org.junit.Assert` can't be found, refine the `$CLASSPATH` as follows:

```sh
export CLASSPATH=$CLASSPATH:`hadoop classpath`:$HADOOP_HOME/share/hadoop/tools/lib/*
```

Here, `$HADOOP_HOME` needs to be specified and exported.
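Putting the steps above together, a typical environment setup on macOS before running the tests might look like the following; the `JAVA_HOME` and `HADOOP_HOME` values are placeholders for your local installations:

```sh
# Placeholders: point these at your local JDK and Hadoop installations.
export JAVA_HOME=/path/to/jdk
export HADOOP_HOME=/path/to/hadoop

# Let the dynamic linker find the JVM library (macOS; use LD_LIBRARY_PATH on Linux).
export DYLD_LIBRARY_PATH=$JAVA_HOME/jre/lib/server

# JNI needs the Hadoop classes on the CLASSPATH, plus the tools jars
# so the JUnit classes used by the tests can be found.
export CLASSPATH=$CLASSPATH:`hadoop classpath`:$HADOOP_HOME/share/hadoop/tools/lib/*
```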
Then you can run:

```bash
cargo test
```
A usage example:

```rust
use hdfs::hdfs::HdfsFs;

let fs: HdfsFs = HdfsFs::new("hdfs://localhost:8020/").ok().unwrap();
match fs.mkdir("/data") {
    Ok(_) => println!("/data has been created"),
    Err(_) => panic!("/data creation has failed"),
}
```