It's based on version 0.0.4 of http://hyunsik.github.io/hdfs-rs and provides the libhdfs binding library along with Rust APIs that safely wrap the libhdfs binding APIs. The bundled libhdfs is compiled from version 2.7.3 of the Hadoop repository, with a few changes applied for Rust usage.

Add this to your Cargo.toml:
```toml
[dependencies]
fs-hdfs = "0.1.12"
```
We need to specify `$JAVA_HOME` to make the Java shared library available for building. Since the compiled libhdfs is a JNI-based implementation, it requires the Hadoop-related classes to be available through `CLASSPATH`. For example:
```sh
export CLASSPATH=$CLASSPATH:`hadoop classpath --glob`
```
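Putting the pieces together, a sketch of the full environment setup (the JDK and Hadoop installation paths below are placeholders for illustration; substitute your own directories):

```shell
# Placeholder paths -- adjust to your installation.
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export HADOOP_HOME=/opt/hadoop
export PATH=$PATH:$HADOOP_HOME/bin

# Hadoop-related classes must be on CLASSPATH for the JNI-based libhdfs:
export CLASSPATH=$CLASSPATH:`hadoop classpath --glob`
```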
Also, we need to specify the JVM dynamic library path so that the application can load the JVM shared library at runtime.

For JDK 8 on macOS:

```sh
export DYLD_LIBRARY_PATH=$JAVA_HOME/jre/lib/server
```

For JDK 11 (or later JDKs) on macOS:

```sh
export DYLD_LIBRARY_PATH=$JAVA_HOME/lib/server
```

For JDK 8 on CentOS:

```sh
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/amd64/server
```

For JDK 11 (or later JDKs) on CentOS:

```sh
export LD_LIBRARY_PATH=$JAVA_HOME/lib/server
```
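If you're unsure which directory holds the JVM shared library for your particular JDK layout, you can search for it under `$JAVA_HOME` and point the library path variable at its parent directory (a generic sketch; the library is named `libjvm.dylib` on macOS and `libjvm.so` on Linux):

```shell
# Locate the JVM shared library under the current JDK.
# macOS: libjvm.dylib -> set DYLD_LIBRARY_PATH to its directory.
# Linux: libjvm.so    -> set LD_LIBRARY_PATH to its directory.
find "$JAVA_HOME" -name 'libjvm.*'
```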
The tests also require the `CLASSPATH` and `DYLD_LIBRARY_PATH` (or `LD_LIBRARY_PATH`) to be set. In case the Java class `org.junit.Assert` can't be found, refine the `$CLASSPATH` as follows:
```sh
export CLASSPATH=$CLASSPATH:`hadoop classpath --glob`:$HADOOP_HOME/share/hadoop/tools/lib/*
```
Here, `$HADOOP_HOME` needs to be specified and exported. Then you can run the tests:

```bash
cargo test
```
```rust
use std::sync::Arc;
use hdfs::hdfs::{get_hdfs_by_full_path, HdfsFs};

// The namenode URL is an example; adjust host and port to your cluster.
let fs: Arc<HdfsFs> = get_hdfs_by_full_path("hdfs://localhost:8020").ok().unwrap();
```
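Building on the snippet above, a hedged sketch of a round trip through the file system. It assumes a running HDFS cluster reachable at the example URL, and method names (`create`, `open`, `write`, `read`, `close`) inherited from the hdfs-rs 0.0.4 API this crate derives from; verify the exact signatures against the crate docs before relying on them:

```rust
// Sketch only: requires a live namenode at hdfs://localhost:8020 (example URL)
// and assumes the hdfs-rs-style HdfsFs/HdfsFile API described in the lead-in.
use std::sync::Arc;
use hdfs::hdfs::{get_hdfs_by_full_path, HdfsFs};

fn main() {
    let fs: Arc<HdfsFs> = get_hdfs_by_full_path("hdfs://localhost:8020")
        .ok()
        .unwrap();

    // Write a small file...
    let file = fs.create("/tmp/hello.txt").ok().unwrap();
    file.write(b"hello hdfs").ok().unwrap();
    file.close().ok();

    // ...then read it back into a buffer.
    let file = fs.open("/tmp/hello.txt").ok().unwrap();
    let mut buf = vec![0u8; 32];
    let n = file.read(&mut buf).ok().unwrap();
    println!("read {} bytes", n);
}
```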