# regex-lexer


A regex-based lexer (tokenizer) in Rust.

## Basic Usage

```rust
enum Tok {
    Num,
    // ...
}

let lexer = regex_lexer::LexerBuilder::new()
    .token(r"[0-9]+", Tok::Num)
    .ignore(r"\s+") // skip whitespace
    // ...
    .build();

let tokens = lexer.tokens(/* source */);
```

## License

Licensed under either of

- Apache License, Version 2.0
- MIT license

at your option.

## Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.