When fetching a file from a web server via GET, a request may specify a range of bytes to receive. This makes it possible to issue multiple GET requests against the same URL, each fetching a different byte range, to increase the throughput of the transfer. Once all parts have been fetched, they are concatenated into a single file.
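To illustrate the idea, here is a minimal sketch of how a file's length might be split into contiguous byte ranges, one per request. The `byte_ranges` helper is hypothetical and not part of this crate's API; it only shows the arithmetic behind range partitioning.

```rust
// Sketch: split a file of `len` bytes into `parts` contiguous byte ranges,
// as a parallel downloader would before issuing one ranged GET per part.
// `byte_ranges` is a hypothetical helper for illustration only.
fn byte_ranges(len: u64, parts: u64) -> Vec<(u64, u64)> {
    let chunk = len / parts;
    (0..parts)
        .map(|i| {
            let start = i * chunk;
            // The final part absorbs any remainder bytes.
            let end = if i + 1 == parts { len - 1 } else { start + chunk - 1 };
            (start, end)
        })
        .collect()
}

fn main() {
    // A 10-byte file split across 3 requests yields the ranges
    // used in "Range: bytes=start-end" request headers.
    for (start, end) in byte_ranges(10, 3) {
        println!("Range: bytes={}-{}", start, end);
    }
}
```

Each `(start, end)` pair maps directly onto an HTTP `Range: bytes=start-end` header, with both endpoints inclusive.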
This crate makes it trivial to set up such a parallel GET request, with an API that provides a configurable number of threads and an optional callback for monitoring the progress of a transfer.
```rust
extern crate parallel_getter;
extern crate reqwest;

use parallel_getter::ParallelGetter;
use reqwest::Client;
use std::fs::File;
use std::path::PathBuf;
use std::sync::Arc;

let client = Arc::new(Client::new());
let mut file = File::create("newfile").unwrap();
ParallelGetter::new("urlhere", &mut file)
    // Additional mirrors that can be used.
    .mirrors(&["mirrora", "mirrorb"])
    // Optional client to use for the request.
    .client(client)
    // Optional path to store the parts.
    .cache_path(PathBuf::from("/a/path/here"))
    // Number of threads to use.
    .threads(5)
    // Threshold (length in bytes) to determine when multiple threads are required.
    .threshold_parallel(1 * 1024 * 1024)
    // Threshold for defining when to store parts in memory or on disk.
    .threshold_memory(10 * 1024 * 1024)
    // Callback for monitoring progress.
    .callback(16, Box::new(|progress, total| {
        println!(
            "{} of {} KiB downloaded",
            progress / 1024,
            total / 1024
        );
    }))
    // Commit the parallel GET requests.
    .get();
```