The multi-tool of lexical analysis and tokenization.
Install from PyPI with `pip install plrs`, or build the extension from source with `maturin build`.
EOF_TOKEN
```
Tokens
Settings

Token
- part
- token
- set_part
- set_token
- __str__
- __repr__

Lexer
- new
- char_forward
- skip_over_char_set
- next
```
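The Lexer methods listed above suggest a cursor-based design: create a lexer over a source string, advance character by character, skip runs of uninteresting characters, and pull tokens with `next`. As an illustration only — this is a hand-written pure-Python sketch, not plrs's Rust implementation, and the exact semantics of each method are assumptions:

```python
class SketchLexer:
    """Illustrative cursor-based lexer; NOT plrs's actual implementation.
    Method names mirror the listing above, but signatures are guesses."""

    def __init__(self, source):
        # `new` in the listing is assumed to map to construction.
        self.source = source
        self.pos = 0

    def char_forward(self):
        """Consume and return the next character, or None at end of input."""
        if self.pos >= len(self.source):
            return None
        ch = self.source[self.pos]
        self.pos += 1
        return ch

    def skip_over_char_set(self, char_set):
        """Advance the cursor past any run of characters in `char_set`."""
        while self.pos < len(self.source) and self.source[self.pos] in char_set:
            self.pos += 1

    def next(self):
        """Return the next whitespace-delimited token, or None when exhausted."""
        self.skip_over_char_set({" ", "\t", "\n"})
        start = self.pos
        while self.pos < len(self.source) and not self.source[self.pos].isspace():
            self.pos += 1
        return self.source[start:self.pos] or None
```

For example, repeatedly calling `next()` on `SketchLexer("let x = 1")` yields `"let"`, `"x"`, `"="`, `"1"`, then `None`.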
Module-level helper functions:

- is_char_symbol
- is_char_operator
- is_char_whitespace
- is_char_numeric
- is_single_quote
- is_double_quote
- ends_token
- is_part_numeric
- tokenize
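The helper names above suggest a classify-then-accumulate design: characters are buffered into a partial token until a character that ends a token (whitespace, an operator, or a symbol) is seen. The following is a rough hypothetical sketch of that idea, not plrs's code — the operator and symbol character sets here are assumptions:

```python
OPERATORS = set("+-*/=<>!&|%^")  # assumed operator set, not plrs's definition
SYMBOLS = set("()[]{},.;:")      # assumed symbol set, not plrs's definition

def is_char_operator(ch):
    return ch in OPERATORS

def is_char_symbol(ch):
    return ch in SYMBOLS

def is_char_whitespace(ch):
    return ch.isspace()

def ends_token(ch):
    # A whitespace, operator, or symbol character terminates the
    # token currently being accumulated.
    return is_char_whitespace(ch) or is_char_operator(ch) or is_char_symbol(ch)

def tokenize(source):
    """Split `source` into identifier/number runs, operators, and symbols."""
    tokens, part = [], ""
    for ch in source:
        if ends_token(ch):
            if part:                      # flush the accumulated partial token
                tokens.append(part)
                part = ""
            if not is_char_whitespace(ch):
                tokens.append(ch)         # operators/symbols are tokens themselves
        else:
            part += ch                    # keep accumulating the current token
    if part:
        tokens.append(part)
    return tokens
```

Under these assumed character sets, `tokenize("x = foo(1)+2")` returns `["x", "=", "foo", "(", "1", ")", "+", "2"]`.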