xwde: robotxt


The implementation of the robots.txt (or URL exclusion) protocol in the Rust programming language, with support for the crawl-delay, sitemap, and universal `*` match extensions (according to the RFC specification).

Examples

```rust
use robotxt::Robots;

fn main() {
    let txt = r#"
        User-Agent: foobot
        Allow: /example/
        Disallow: /example/nope.txt
    "#
    .as_bytes();

    let r = Robots::from_slice(txt, "foobot");
    assert!(r.is_match("/example/yeah.txt"));
    assert!(!r.is_match("/example/nope.txt"));
}
```
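
The description above also mentions the universal `*` match, crawl-delay, and sitemap extensions, which the example does not show. Below is a minimal sketch of how the `*` group might apply to an agent without its own group, assuming the same `from_slice` and `is_match` calls as the example; the crawl-delay and sitemap accessors named in the comments are assumptions, not confirmed by this README.

```rust
use robotxt::Robots;

fn main() {
    let txt = r#"
        User-Agent: *
        Disallow: /private/
        Crawl-Delay: 5

        Sitemap: https://example.com/sitemap.xml
    "#
    .as_bytes();

    // "barbot" has no dedicated group, so the universal `*` group applies.
    let r = Robots::from_slice(txt, "barbot");
    assert!(r.is_match("/public/index.html"));
    assert!(!r.is_match("/private/secret.txt"));

    // Hypothetical accessors for the crawl-delay and sitemap extensions;
    // check the crate docs for the actual method names.
    // let _delay = r.crawl_delay();
    // let _sitemaps = r.sitemaps();
}
```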

Note: the builder is not yet implemented.


Links

Notes

The parser is based on Smerity/texting_robots.