The implementation of the robots.txt (or URL exclusion) protocol in the Rust programming language, with support for the `crawl-delay`, `sitemap` and universal `*` match extensions (according to the RFC 9309 specification).
Parse the rules for the given `user-agent` in the provided `robots.txt` file:

```rust
use robotxt::Robots;

fn main() {
    let txt = r#"
        User-Agent: foobot
        Allow: /example/
        Disallow: /example/nope.txt
    "#
    .as_bytes();

    let r = Robots::from_slice(txt, "foobot");
    assert!(r.is_match("/example/yeah.txt"));
    assert!(!r.is_match("/example/nope.txt"));
}
```
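The `crawl-delay`, `sitemap` and universal `*` match extensions mentioned above can be exercised with the same calls. Below is a minimal sketch, assuming `is_match` reports whether a path is allowed (as in the example above) and that `*` follows RFC 9309 wildcard matching; the accessors for the parsed crawl-delay and sitemap values are not part of this sketch and should be looked up in the crate documentation:

```rust
use robotxt::Robots;

fn main() {
    // A robots.txt that uses the crawl-delay, sitemap and universal `*`
    // match extensions mentioned in the description.
    let txt = r#"
        User-Agent: foobot
        Disallow: /private/*.json
        Crawl-delay: 5
        Sitemap: https://example.com/sitemap.xml
    "#
    .as_bytes();

    let r = Robots::from_slice(txt, "foobot");

    // The `*` wildcard matches any run of characters inside the path,
    // so every .json file under /private/ is disallowed for foobot.
    assert!(!r.is_match("/private/data.json"));

    // Paths not covered by any rule stay allowed by default (RFC 9309).
    assert!(r.is_match("/private/readme.txt"));
}
```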
Build the new `robots.txt` file from the provided directives:

Note: the builder is not yet implemented.
The parser is based on Smerity/texting_robots.