A regex-based lexer (tokenizer) in Rust.
```rust
enum Token {
    Num(usize),
    // ...
}

let lexer = regexlexer::LexerBuilder::new()
    .token(r"[0-9]+", |num| Some(Token::Num(num.parse().unwrap())))
    .token(r"\s+", |_| None) // skip whitespace
    // ...
    .build();

let tokens = lexer.tokens(/* source */);
```
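For illustration, here is a minimal sketch of how a rule-driven, regex-based lexer like this can be built on the `regex` crate. The `Rule` struct and `lex` function are hypothetical, written for this example only, and are not this crate's internals:

```rust
// Cargo.toml: regex = "1"
use regex::Regex;

#[derive(Debug)]
enum Token {
    Num(usize),
}

/// A rule pairs a compiled regex with an action that may produce a token.
/// Actions returning `None` (e.g. for whitespace) consume input silently.
struct Rule {
    pattern: Regex,
    action: fn(&str) -> Option<Token>,
}

fn lex(source: &str, rules: &[Rule]) -> Result<Vec<Token>, String> {
    let mut tokens = Vec::new();
    let mut pos = 0;
    while pos < source.len() {
        // Try each rule in order; take the first one that matches exactly at `pos`.
        let mut matched = false;
        for rule in rules {
            if let Some(m) = rule.pattern.find_at(source, pos) {
                // `find_at` may match later in the input, so require the match
                // to start at `pos`; reject empty matches to guarantee progress.
                if m.start() == pos && !m.as_str().is_empty() {
                    if let Some(tok) = (rule.action)(m.as_str()) {
                        tokens.push(tok);
                    }
                    pos = m.end();
                    matched = true;
                    break;
                }
            }
        }
        if !matched {
            return Err(format!("unexpected character at byte {pos}"));
        }
    }
    Ok(tokens)
}

fn main() {
    let rules = [
        Rule {
            pattern: Regex::new(r"[0-9]+").unwrap(),
            action: |s| Some(Token::Num(s.parse().unwrap())),
        },
        Rule {
            pattern: Regex::new(r"\s+").unwrap(),
            action: |_| None, // skip whitespace
        },
    ];
    // Prints: [Num(1), Num(2), Num(34)]
    println!("{:?}", lex("1 2 34", &rules).unwrap());
}
```

Trying rules in declaration order, as sketched above, means earlier rules take priority on overlapping patterns (e.g. keywords before a general identifier rule).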
Licensed under either of

 * Apache License, Version 2.0
 * MIT license

at your option.
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.