The lexical analysis component of Tusk.
This crate provides the `Lexer` and `Token` implementations used in Tusk. It allows you to provide a `&str` of input and stream `Token` instances on demand.
To use the crate, first add it to your `Cargo.toml`:

```toml
[dependencies]
tusk_lexer = "0.2.*"
```
To create a new `Lexer`, import the struct and use the `Lexer::new()` method.
```rust
use tusk_lexer::Lexer;

fn main() {
    let mut lexer = Lexer::new("$hello = 'cool'");
}
```
To get the next token from the input, use the `Lexer::next()` method:
```rust
use tusk_lexer::Lexer;

fn main() {
    let mut lexer = Lexer::new("$hello = 'cool'");

    let maybe_some_token = lexer.next();
}
```
This method returns a `Token`. This struct has three fields:
```rust
// The lifetime parameter is shown so the borrowed `slice` field is valid Rust.
struct Token<'source> {
    pub kind: TokenType,
    pub slice: &'source str,
    pub range: TextRange,
}
```
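Putting these pieces together, the loop below drains the lexer and prints each token's fields. This is a minimal sketch: it assumes `Lexer::next()` returns `Option<Token>` (as the `maybe_some_token` naming above suggests) and that `TokenType` and `TextRange` implement `Debug`.

```rust
use tusk_lexer::Lexer;

fn main() {
    let mut lexer = Lexer::new("$hello = 'cool'");

    // Assumes `next()` yields `Option<Token>`; stop once the input is exhausted.
    while let Some(token) = lexer.next() {
        // `slice` is the matched source text, `range` its position in the input.
        println!("{:?}: {:?} at {:?}", token.kind, token.slice, token.range);
    }
}
```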
For information about contributing, please read the CONTRIBUTING document.
This repository is distributed under the MIT license. For more information, please read the LICENSE document.