This repo provides a Rust API encapsulating the work of https://github.com/victordibia/handtracking. Basically, this library wraps the required TensorFlow interactions and provides an easy-to-use API for simply detecting hands.
This is very much a WIP and I have only tested it with single images. My aim here is basically to be able to detect hands in real-time video, just like the JS version of this little library at https://github.com/victordibia/handtrack.js.
```rust
// Import the image.
let image = Image::from_file(project_dir).unwrap();

// Construct detection options.
let score_threshold = 0.7f32;
let max_hands = 1;
let detection_opts = DetectionOptions::new(max_hands, score_threshold);

// Run the detection.
let detection = detect(image, detection_opts).unwrap();
let detection_box = &detection[0];
```
As can be seen from the example above, the `detect` function requires an `Image` and `DetectionOptions`. Currently it is possible to specify the desired maximum number of hands to detect and the score threshold for classifying an object as a hand.
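For instance, asking for more hands with a lower threshold might look like the sketch below. It only reuses the calls from the example above (`Image::from_file`, `DetectionOptions::new`, `detect`); iterating over the result and debug-printing each box are assumptions for illustration, not a confirmed part of the API.

```rust
// A minimal sketch, assuming the same API surface as the example above.
let image = Image::from_file(project_dir).unwrap();

// Ask for up to two hands and accept lower-confidence detections.
let detection_opts = DetectionOptions::new(2, 0.5f32);

let detections = detect(image, detection_opts).unwrap();
for detection_box in &detections {
    // Assumes the detection box type implements Debug.
    println!("{:?}", detection_box);
}
```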
Although this is a small library, it is missing lots of features, and contributions are more than welcome! As this is at a very early stage I do not have set contribution guidelines, but I do have some CI checks in place just in case, which are:
- `clippy` linting
- `cargo fmt` checking
- `Cargo.toml` linting (dependencies must be in alphabetical order etc.)
- `cargo test` check
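Before opening a PR you can run the usual cargo equivalents of these checks locally; note this is only an approximation of the CI setup, and the `Cargo.toml` ordering lint is enforced by CI rather than by a plain cargo subcommand.

```sh
# Rough local equivalents of the CI checks above.
cargo clippy          # clippy linting
cargo fmt -- --check  # formatting check without rewriting files
cargo test            # test suite
```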