Multithreaded web crawler written in Rust.
On Debian or other DEB-based distributions:

```bash
$ sudo apt install openssl libssl-dev
```
On Fedora and other RPM-based distributions:

```bash
$ sudo dnf install openssl-devel
```
Add this dependency to your Cargo.toml file:

```toml
[dependencies]
spider = "1.3.1"
```
Then you'll be able to use the library. Here is a simple example:
```rust
extern crate spider;

use spider::website::Website;

fn main() {
    let mut website: Website = Website::new("https://choosealicense.com");
    website.crawl();

    for page in website.get_pages() {
        println!("- {}", page.get_url());
    }
}
```
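Internally, a multithreaded crawler like this fetches pages on several worker threads and collects the results back on the calling thread. This is a minimal stdlib-only sketch of that pattern, not spider's actual implementation; `fetch_page` is a hypothetical stand-in for a real HTTP fetch:

```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical stand-in for an HTTP fetch; a real crawler would
// download and parse the URL body here.
fn fetch_page(url: &str) -> String {
    format!("<html>fetched {}</html>", url)
}

// Spawn one thread per URL and gather the fetched bodies over a channel.
// A real crawler would cap the thread count (cf. the concurrency setting).
fn crawl_concurrently(urls: Vec<String>) -> Vec<String> {
    let (tx, rx) = mpsc::channel();
    for url in urls {
        let tx = tx.clone();
        thread::spawn(move || {
            tx.send(fetch_page(&url)).unwrap();
        });
    }
    drop(tx); // close the channel so the receiver loop below terminates
    rx.into_iter().collect()
}

fn main() {
    let pages = crawl_concurrently(vec![
        "https://choosealicense.com/".to_string(),
        "https://choosealicense.com/licenses/".to_string(),
    ]);
    println!("fetched {} pages", pages.len());
}
```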
You can use the `Configuration` object to configure your crawler:
```rust
// ..
let mut website: Website = Website::new("https://choosealicense.com");
website.configuration.blacklist_url.push("https://choosealicense.com/licenses/".to_string());
website.configuration.respect_robots_txt = true;
website.configuration.verbose = true; // Defaults to false
website.configuration.delay = 2000; // Defaults to 250 ms
website.configuration.concurrency = 10; // Defaults to 4
website.configuration.user_agent = "myapp/version"; // Defaults to spider/x.y.z, where x.y.z is the library version
website.on_link_find_callback = |s| { println!("link target: {}", s); s }; // Callback to run on each link find

website.crawl();
```
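The `on_link_find_callback` hook receives each discovered link and returns the link the crawler should keep, so it can be used to filter or rewrite URLs in flight. A stdlib-only sketch of that callback shape, with a hypothetical rewrite that upgrades `http://` links to `https://`:

```rust
// Same shape as the on_link_find_callback closure: take the found
// link, return the (possibly rewritten) link to continue with.
fn force_https(link: String) -> String {
    match link.strip_prefix("http://") {
        Some(rest) => format!("https://{}", rest),
        None => link,
    }
}

fn main() {
    for link in ["http://choosealicense.com/licenses/mit/", "https://choosealicense.com/about/"].iter() {
        println!("link target: {}", force_https(link.to_string()));
    }
}
```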
You can get a working example at example.rs and run it with:

```sh
cargo run --example example
```
I am open to any contribution. Just fork and commit on another branch.