Fast data dupe finder
fddf is a small Rust command-line tool that recursively finds duplicate files in a directory. It uses a thread pool to calculate file hashes in parallel.
Duplicates are detected in stages: files are first grouped by size, then partial (Blake2) hashes of same-sized files are compared, and finally candidates are confirmed with a byte-for-byte comparison.
Install directly from crates.io with `cargo install fddf`.

From a checkout:

```
cargo build --release
cargo run --release
```
```
fddf [-s] [-t] [-m SIZE] [-M SIZE] [-v]

-s: report dupe groups in a single line
-t: produce a grand total
-m: minimum size (default 1 byte)
-M: maximum size (default unlimited)
-v: verbose operation
```
PRs welcome!