# fs-hdfs

It is based on version 0.0.4 of http://hyunsik.github.io/hdfs-rs and provides the libhdfs binding library together with Rust APIs that safely wrap the libhdfs binding APIs.

## Current Status

## Documentation

## Requirements

## Usage

Add this to your `Cargo.toml`:

```toml
[dependencies]
fs-hdfs = "0.1.12"
```

## Build

We need to specify `$JAVA_HOME` to make the Java shared library available for building.
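For example (the JDK path below is an assumption; adjust it to your own installation):

```sh
# The JDK path is an assumption -- point it at your own installation.
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk
cargo build
```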

## Run

Since the compiled libhdfs is a JNI-based implementation, it requires the Hadoop-related classes to be available via `CLASSPATH`. For example:

```sh
export CLASSPATH=$CLASSPATH:`hadoop classpath --glob`
```

We also need to specify the JVM dynamic library path so that the application can load the JVM shared library at runtime.

For JDK 8 on macOS:

```sh
export DYLD_LIBRARY_PATH=$JAVA_HOME/jre/lib/server
```

For JDK 11 (or later) on macOS:

```sh
export DYLD_LIBRARY_PATH=$JAVA_HOME/lib/server
```

For JDK 8 on CentOS:

```sh
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/amd64/server
```

For JDK 11 (or later) on CentOS:

```sh
export LD_LIBRARY_PATH=$JAVA_HOME/lib/server
```

## Testing

The tests also require `CLASSPATH` and `DYLD_LIBRARY_PATH` (or `LD_LIBRARY_PATH`) to be set. In case the Java class `org.junit.Assert` can't be found, refine the `$CLASSPATH` as follows:

```sh
export CLASSPATH=$CLASSPATH:`hadoop classpath --glob`:$HADOOP_HOME/share/hadoop/tools/lib/*
```

Here, `$HADOOP_HOME` needs to be specified and exported.
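For example (the Hadoop path below is an assumption; use your own installation):

```sh
# The Hadoop path is an assumption -- point it at your own installation.
export HADOOP_HOME=/opt/hadoop
```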

Then you can run

```bash
cargo test
```

## Example

```rust
use std::sync::Arc;
use hdfs::hdfs::{get_hdfs_by_full_path, HdfsFs};

let fs: Arc<HdfsFs> = get_hdfs_by_full_path("hdfs://localhost:8020/").ok().unwrap();
match fs.mkdir("/data") {
    Ok(_) => { println!("/data has been created") },
    Err(_) => { panic!("/data creation has failed") }
};
```
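Building on this, a minimal sketch of cleaning the directory up again. It assumes a namenode listening at hdfs://localhost:8020, and that `HdfsFs` exposes `exist` and `delete` methods as inherited from the hdfs-rs 0.0.4 API; verify the exact signatures against the crate documentation.

```rust
use std::sync::Arc;
use hdfs::hdfs::{get_hdfs_by_full_path, HdfsFs};

let fs: Arc<HdfsFs> = get_hdfs_by_full_path("hdfs://localhost:8020/").ok().unwrap();

// `exist` and `delete` are assumptions carried over from the hdfs-rs 0.0.4
// API that this crate is based on; check the generated docs before relying
// on them. The second argument of `delete` toggles recursive deletion.
if fs.exist("/data") {
    fs.delete("/data", false).expect("/data deletion has failed");
}
```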