A low-level, performance-oriented parser for EU4 save files and other PDS-developed titles.
Jomini is the cornerstone of Rakaly, an EU4 achievement leaderboard and save file analyzer. This library is also used in the Paradox Game Converters project to parse ironman EU4, CK3, and Imperator saves.
Below is a demonstration of parsing plaintext data with jomini.
```rust
use jomini::{JominiDeserialize, TextDeserializer};

#[derive(JominiDeserialize, PartialEq, Debug)]
pub struct Model {
    human: bool,
    first: Option<u16>,
    #[jomini(alias = "forth")]
    fourth: u16,
    #[jomini(alias = "core", duplicated)]
    cores: Vec<String>,
    names: Vec<String>,
}

let data = br#"
    human = yes
    forth = 10
    core = "HAB"
    names = { "Johan" "Frederick" }
    core = FRA
"#;

let expected = Model {
    human: true,
    first: None,
    fourth: 10,
    cores: vec!["HAB".to_string(), "FRA".to_string()],
    names: vec!["Johan".to_string(), "Frederick".to_string()],
};

let actual: Model = TextDeserializer::from_windows1252_slice(data)?;
assert_eq!(actual, expected);
```
Parsing data encoded in the binary format is done in a similar fashion, but with an extra step. Tokens are encoded as 16-bit integers, so one must provide a map from these integers to their textual representations.
```rust
use jomini::{JominiDeserialize, BinaryDeserializer};
use std::collections::HashMap;

#[derive(JominiDeserialize, PartialEq, Debug)]
struct MyStruct {
    field1: String,
}

// Token 0x2d82, '=', then a length-prefixed string "ENG".
let data = [0x82, 0x2d, 0x01, 0x00, 0x0f, 0x00, 0x03, 0x00, 0x45, 0x4e, 0x47];

let mut map = HashMap::new();
map.insert(0x2d82, "field1");

let actual: MyStruct = BinaryDeserializer::from_eu4(&data[..], &map)?;
assert_eq!(actual, MyStruct { field1: "ENG".to_string() });
```
When done correctly, one can use the same structure to represent both the plaintext and binary data without any duplication.
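A rough sketch of this reuse, building on the structure from the binary example (the plaintext payload here is an assumed equivalent of the binary data):

```rust
use jomini::{BinaryDeserializer, JominiDeserialize, TextDeserializer};
use std::collections::HashMap;

#[derive(JominiDeserialize, PartialEq, Debug)]
struct MyStruct {
    field1: String,
}

// Binary payload and token map from the example above.
let binary = [0x82, 0x2d, 0x01, 0x00, 0x0f, 0x00, 0x03, 0x00, 0x45, 0x4e, 0x47];
let mut map = HashMap::new();
map.insert(0x2d82, "field1");

// Assumed plaintext equivalent of the binary payload.
let text = br#"field1 = "ENG""#;

// The same struct deserializes from either representation.
let from_binary: MyStruct = BinaryDeserializer::from_eu4(&binary[..], &map)?;
let from_text: MyStruct = TextDeserializer::from_windows1252_slice(text)?;
assert_eq!(from_binary, from_text);
```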
One can configure the behavior when a token is unknown (i.e., fail immediately or try to continue).
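A hypothetical sketch of such a configuration, assuming a builder-style entry point (`eu4_builder` and `on_failed_resolve` are illustrative names; `FailedResolveStrategy` is the strategy enum, but check the crate docs for the exact builder API in your version):

```rust
use jomini::{BinaryDeserializer, FailedResolveStrategy, JominiDeserialize};
use std::collections::HashMap;

#[derive(JominiDeserialize, PartialEq, Debug)]
struct MyStruct {
    field1: String,
}

let data = [0x82, 0x2d, 0x01, 0x00, 0x0f, 0x00, 0x03, 0x00, 0x45, 0x4e, 0x47];
let mut map = HashMap::new();
map.insert(0x2d82, "field1");

// Assumption: builder method names are illustrative. The chosen strategy
// decides whether an unresolved token aborts deserialization or is skipped.
let actual: MyStruct = BinaryDeserializer::eu4_builder()
    .on_failed_resolve(FailedResolveStrategy::Ignore)
    .from_slice(&data[..], &map)?;
assert_eq!(actual, MyStruct { field1: "ENG".to_string() });
```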
The caller is responsible for:

- Determining whether the data is in the plaintext or binary format
- Stripping off any header that may be present (e.g. `EU4txt` / `EU4bin`); a minimal sketch of this step follows the list
- Providing the token resolver for the binary format
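A minimal sketch of the header-stripping step (the `strip_eu4_header` helper and the exact header literal are purely illustrative):

```rust
// Illustrative helper: drop a known plaintext header before handing the
// body to the parser; other games and formats would use different prefixes.
fn strip_eu4_header(data: &[u8]) -> &[u8] {
    data.strip_prefix(b"EU4txt").unwrap_or(data)
}

assert_eq!(strip_eu4_header(b"EU4txthuman = yes"), b"human = yes");
```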
If the automatic deserialization via `JominiDeserialize` is too high level, there is a mid-level API where one can easily iterate through the parsed document and interrogate fields for their information.
```rust
use jomini::TextTape;

let data = b"name=aaa name=bbb core=123 name=ccc name=ddd";
let tape = TextTape::from_slice(data).unwrap();
let mut reader = tape.windows1252_reader();

while let Some((key, _op, value)) = reader.next_field() {
    println!("{:?}={:?}", key.read_str(), value.read_str().unwrap());
}
```
At the lowest layer, one can interact with the raw data directly via `TextTape` and `BinaryTape`.
```rust
use jomini::{TextTape, TextToken, Scalar};

let data = b"foo=bar";

assert_eq!(
    TextTape::from_slice(&data[..])?.tokens(),
    &[
        TextToken::Unquoted(Scalar::new(b"foo")),
        TextToken::Unquoted(Scalar::new(b"bar")),
    ]
);
```
If one will only use `TextTape` and `BinaryTape`, then `jomini` can be compiled without default features, resulting in a build without dependencies.
There are two targeted use cases for the write API. One is when a text tape is on hand. This is useful when one needs to reformat a document (note that comments are not preserved):
```rust
use jomini::{TextTape, TextWriterBuilder};

let tape = TextTape::from_slice(b"hello = world")?;
let mut out: Vec<u8> = Vec::new();
let mut writer = TextWriterBuilder::new().from_writer(&mut out);

// Re-emit the parsed document through the writer.
writer.write_tape(&tape)?;
// `out` now contains the reformatted document.
```
The writer normalizes any formatting issues. It is not able to losslessly write all parsed documents, but the failing cases are limited to truly esoteric situations and should be resolved in future releases.
The other use case is geared more towards incremental writing, as found in melters or when crafting documents by hand. These use cases need to drive the writer manually:
```rust
use jomini::TextWriterBuilder;

let mut out: Vec<u8> = Vec::new();
let mut writer = TextWriterBuilder::new().from_writer(&mut out);
writer.write_unquoted(b"hello")?;
writer.write_unquoted(b"world")?;
writer.write_unquoted(b"foo")?;
writer.write_unquoted(b"bar")?;
assert_eq!(&out, b"hello=world\nfoo=bar\n");
```
Benchmarks are run with the following commands:
```bash
cargo clean
cargo bench -- '/ck3'
find ./target -wholename "*/new/raw.csv" -print0 | xargs -0 xsv cat rows > assets/jomini-benchmarks.csv
```
The results can be analyzed with the R script found in the assets directory.
Below is a graph generated from benchmarking on an arbitrary computer.