EVTX

A cross-platform parser for the Windows XML EventLog format


Features

Installation (associated binary utility):
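Prebuilt binaries are typically available with the project's releases; alternatively, assuming a Rust toolchain is installed, the utility can be built and installed from crates.io:

```bash
cargo install evtx
```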

evtx_dump (Binary utility):

The main binary utility provided with this crate is `evtx_dump`; it offers a quick way to convert `.evtx` files to different output formats.

Some examples:

- `evtx_dump <evtx_file>` will dump the contents of evtx records as XML.
- `evtx_dump -o json <evtx_file>` will dump the contents of evtx records as JSON.
- `evtx_dump -f <output_file> -o json <input_file>` will dump the contents of evtx records as JSON to the given file.

`evtx_dump` can be combined with `fd` for convenient batch processing of files:

- `fd -e evtx -x evtx_dump -o jsonl` will scan a folder and dump all evtx files to a single jsonlines file.
- `fd -e evtx -x evtx_dump -f "{.}.xml"` will create an XML file next to each evtx file, for all files in the folder recursively!
- If the source of the file needs to be added to the JSON, `xargs` (or `gxargs` on mac) and `jq` can be used: `fd -a -e evtx | xargs -I input sh -c "evtx_dump -o jsonl input | jq --arg path "input" '. + {path: \$path}'"`

Note: by default, `evtx_dump` will try to utilize multithreading, which means that records may be returned out of order.

To force single-threaded usage (which also ensures record order), `-t 1` can be passed.
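For example, the flags above can be combined to produce ordered JSON output written to a file (the file names here are placeholders):

```bash
# -t 1 forces a single thread (preserving record order), -o json selects JSON output,
# and -f writes the result to the given file instead of stdout.
evtx_dump -t 1 -o json -f security.json security.evtx
```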

Example usage (as library):

```rust
use evtx::EvtxParser;
use std::path::PathBuf;

fn main() {
    // Change this to a path of your .evtx sample.
    let fp = PathBuf::from(format!(
        "{}/samples/security.evtx",
        std::env::var("CARGO_MANIFEST_DIR").unwrap()
    ));

    let mut parser = EvtxParser::from_path(fp).unwrap();
    for record in parser.records() {
        match record {
            Ok(r) => println!("Record {}\n{}", r.event_record_id, r.data),
            Err(e) => eprintln!("{}", e),
        }
    }
}
```

The parallel version is enabled when compiling with the "multithreading" feature (enabled by default).
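As a minimal sketch, a dependency entry might look like this (the version number below is illustrative; check crates.io for the current release):

```toml
[dependencies]
# Default features include "multithreading", which enables the parallel record iteration.
evtx = "0.8"

# To build without the parallel code path, opt out of default features instead:
# evtx = { version = "0.8", default-features = false }
```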

Benchmarking

Initial benchmarking I've performed indicates that this implementation is probably the fastest one available 🍺.

I'm using a real-world, 30MB sample which contains ~62K records.

This is benchmarked on my 2017 MBP.

Comparison with other libraries:

Caveats

If the parser errors on any node it doesn't yet support, feel free to open an issue or drop me an email with a sample.

License

Licensed under either of

- Apache License, Version 2.0
- MIT License

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.