# laps


Lexer and parser collections.

With laps, you can build parsers by just defining ASTs and deriving the `Parse` trait for them.

## Usage

Add `laps` to your project by running `cargo add`:

```shell
cargo add laps
```

## Example

Implement a lexer for S-expressions:

```rust
use laps::prelude::*;

#[token_kind]
enum TokenKind {
  /// Atom.
  Atom(String),
  /// Parentheses.
  Paren(char),
  /// End-of-file.
  Eof,
}

type Token = laps::token::Token<TokenKind>;

struct Lexer<T>(laps::reader::Reader<T>);

impl<T: std::io::Read> Tokenizer for Lexer<T> {
  type Token = Token;

  fn next_token(&mut self) -> laps::span::Result<Self::Token> {
    // skip spaces
    self.0.skip_until(|c| !c.is_whitespace())?;
    // check the current character
    Ok(match self.0.peek()? {
      // parentheses
      Some(c) if c == '(' || c == ')' => Token::new(c, self.0.next_span()?.clone()),
      // atom
      Some(_) => {
        let (atom, span) = self
          .0
          .collect_with_span_until(|c| c.is_whitespace() || c == '(' || c == ')')?;
        Token::new(atom, span)
      }
      // end-of-file
      None => Token::new(TokenKind::Eof, self.0.next_span()?.clone()),
    })
  }
}
```

And the parser and ASTs (or actually CSTs):

```rust
token_ast! {
  macro Token(mod = crate, Kind = TokenKind) {
    [atom] => (TokenKind::Atom(_), "atom"),
    [lpr] => (TokenKind::Paren('('), _),
    [rpr] => (TokenKind::Paren(')'), _),
    [eof] => (TokenKind::Eof, _),
  }
}

#[derive(Parse)]
#[token(Token)]
enum Statement {
  Elem(Elem),
  End(Token![eof]),
}

#[derive(Parse)]
#[token(Token)]
struct SExp(Token![lpr], Vec<Elem>, Token![rpr]);

#[derive(Parse)]
#[token(Token)]
enum Elem {
  Atom(Token![atom]),
  SExp(SExp),
}
```

The above implementation is very close in form to the corresponding EBNF representation of the S-expression:

```ebnf
Statement ::= Elem | EOF;
SExp ::= "(" {Elem} ")";
Elem ::= ATOM | SExp;
```
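For a sense of what `#[derive(Parse)]` generates for you, here is a hand-rolled recursive-descent parser for the `SExp`/`Elem` part of the same grammar, using only the standard library. This is an illustrative sketch with made-up names (`split_tokens`, `parse_elem`), not laps' actual generated code:

```rust
// Hand-written recursive descent for:
//   SExp ::= "(" {Elem} ")";
//   Elem ::= ATOM | SExp;
#[derive(Debug, PartialEq)]
enum Elem {
    Atom(String),
    SExp(Vec<Elem>),
}

// Crude tokenizer: pad parentheses with spaces, then split on whitespace.
fn split_tokens(input: &str) -> Vec<String> {
    input
        .replace('(', " ( ")
        .replace(')', " ) ")
        .split_whitespace()
        .map(str::to_string)
        .collect()
}

fn parse_elem(tokens: &[String], pos: &mut usize) -> Result<Elem, String> {
    match tokens.get(*pos).map(String::as_str) {
        // SExp ::= "(" {Elem} ")"
        Some("(") => {
            *pos += 1; // consume "("
            let mut elems = Vec::new();
            while tokens.get(*pos).map(String::as_str) != Some(")") {
                if *pos >= tokens.len() {
                    return Err("unclosed '('".to_string());
                }
                elems.push(parse_elem(tokens, pos)?);
            }
            *pos += 1; // consume ")"
            Ok(Elem::SExp(elems))
        }
        Some(")") => Err("unexpected ')'".to_string()),
        // Elem ::= ATOM
        Some(atom) => {
            *pos += 1;
            Ok(Elem::Atom(atom.to_string()))
        }
        None => Err("unexpected end of input".to_string()),
    }
}

fn main() {
    let tokens = split_tokens("(add (mul 2 3) 4)");
    let mut pos = 0;
    println!("{:?}", parse_elem(&tokens, &mut pos));
}
```

With laps, each production above becomes one struct or enum, and the token-matching, lookahead, and error-reporting boilerplate is derived instead of written by hand.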

## More Examples

See the examples directory in the repository.

## Changelog

See CHANGELOG.md.

## License

Copyright (C) 2022-2023 MaxXing. Licensed under either of Apache 2.0 or MIT at your option.