This crate helps you to parse input strings and slices by building on the `Iterator` interface.
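
To make that concrete, here is a minimal sketch of the basic flow, using only the calls that appear in the fuller example further down (`into_tokens()`, the `Iterator`-style `next()`, and `remaining()`); the input string is just an arbitrary illustration:

```rust
use yap::{ Tokens, IntoTokens };

// Strings and slices can be converted into something implementing `Tokens`:
let mut tokens = "abc123".into_tokens();

// `Tokens` builds on `Iterator`, so tokens can be consumed one at a time:
assert_eq!(tokens.next(), Some('a'));
assert_eq!(tokens.next(), Some('b'));
assert_eq!(tokens.next(), Some('c'));

// Whatever wasn't consumed is still available for further parsing:
assert_eq!(tokens.remaining(), "123");
```
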
The aim of this crate is to provide transparent, flexible and easy to understand parsing utilities which work on arbitrary slices and strings. Some goals of this library are:

- Be transparent: all of the parsing methods live on the `Tokens` trait, and everything used internally to implement those functions can be used by you, too.

Have a look at the `Tokens` trait for all of the parsing methods, and examples for each.
Here's what it looks like:
```rust
use yap::{
    // This trait has all of the parsing methods on it:
    Tokens,
    // Allows you to use `.into_tokens()` on strings and slices,
    // to get an instance of the above:
    IntoTokens
};
// Step 1: convert some tokens into something implementing `Tokens`
// ================================================================
let mut tokens = "1+2/3-4,foobar".into_tokens();

// Step 2: Parse some things from our tokens
// =========================================

enum Op { Plus, Minus, Divide }
enum OpOrDigit { Op(Op), Digit(u32) }
// The `Tokens` trait builds on `Iterator`, and so looks similar,
// as well as having all of the normal `Iterator` methods on it.
fn parse_op(mut t: impl Tokens<Item=char>) -> Option<Op> {
    match t.next()? {
        '+' => Some(Op::Plus),
        '-' => Some(Op::Minus),
        '/' => Some(Op::Divide),
        _ => None
    }
}

// We also get parsing-specific methods; here, digits are gathered into
// a string and then parsed into a number.
fn parse_digit(mut tokens: impl Tokens<Item=char>) -> Option<u32> {
    let s: String = tokens
        .tokens_while(|c| c.is_digit(10))
        .collect();
    s.parse().ok()
}

// Combinator functions exist which accept functions that consume tokens,
// and combine them. Here, we parse digits separated by operators, leaving
// any input that does not match this.
//
// These functions themselves tend to return iterators, so that you can
// collect up the results however you choose. No input is consumed beyond
// that which was successfully parsed.
let output: Vec<_> = tokens.sep_by_all(
    |t| parse_digit(t).map(OpOrDigit::Digit),
    |t| parse_op(t).map(OpOrDigit::Op)
).collect();

// Step 3: do whatever you like with the rest of the input!
// =========================================================

// This is available on the concrete type that strings
// are converted into (rather than on the `Tokens` trait):
let remaining = tokens.remaining();

assert_eq!(remaining, ",foobar");
```
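
Since combinators like `sep_by_all` hand back ordinary iterators, the collected results can be processed with plain Rust afterwards. The following is a self-contained sketch (not part of the library) of one way to consume values like those parsed above: it repeats the `Op`/`OpOrDigit` types locally, hard-codes the sequence that parsing `1+2/3-4` is expected to produce, and uses a hypothetical `eval` helper that evaluates strictly left to right, ignoring operator precedence:

```rust
#[derive(Debug, PartialEq)]
enum Op { Plus, Minus, Divide }
#[derive(Debug, PartialEq)]
enum OpOrDigit { Op(Op), Digit(u32) }

// Fold a parsed sequence of alternating digits and operators into a number,
// evaluating strictly left to right (no precedence, no divide-by-zero check).
fn eval(items: &[OpOrDigit]) -> Option<i64> {
    let mut iter = items.iter();
    // The sequence must start with a digit.
    let mut acc = match iter.next()? {
        OpOrDigit::Digit(n) => *n as i64,
        OpOrDigit::Op(_) => return None,
    };
    // Then consume (operator, digit) pairs, updating the accumulator.
    while let Some(item) = iter.next() {
        let op = match item {
            OpOrDigit::Op(op) => op,
            OpOrDigit::Digit(_) => return None,
        };
        let rhs = match iter.next()? {
            OpOrDigit::Digit(n) => *n as i64,
            OpOrDigit::Op(_) => return None,
        };
        acc = match op {
            Op::Plus => acc + rhs,
            Op::Minus => acc - rhs,
            Op::Divide => acc / rhs,
        };
    }
    Some(acc)
}

// The sequence that parsing "1+2/3-4" should yield:
let parsed = vec![
    OpOrDigit::Digit(1),
    OpOrDigit::Op(Op::Plus),
    OpOrDigit::Digit(2),
    OpOrDigit::Op(Op::Divide),
    OpOrDigit::Digit(3),
    OpOrDigit::Op(Op::Minus),
    OpOrDigit::Digit(4),
];

// ((1 + 2) / 3) - 4 == -3 when evaluated left to right.
assert_eq!(eval(&parsed), Some(-3));
```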