RINEX Cli

Command line tool to handle, manage and analyze RINEX files


This command line interface is built on the latest Rinex crate and makes RINEX file manipulation easy.


Why this tool?

RINEX files are very common worldwide; they are used in GNSS :artificial_satellite:, timing :clock1: :satellite: and navigation :rocket: :earth_americas: applications.

RINEX files are complex: several kinds exist and they differ a lot from one another.
This tool is powerful enough to manage almost all revisions and the most common RINEX kinds, without compromising ease of use.

Notes on data & RINEX

Read this section before getting started

Supported RINEX

File naming conventions

File names are disregarded by this tool: you can parse & analyze files that do not follow standard naming conventions.

Compressed files

RINEX files are compressed most of the time.

This tool supports CRINEX (compressed RINEX) natively: you can pass a CRINEX file and parse it directly.

This tool does not support extra compression layers (like .gz for instance). It is up to the user to decompress such files prior to analysis.
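For instance, a gzip compressed file can be decompressed with standard tools before being passed to this tool (a sketch; the file name is only an example):

```bash
# decompress the gzip layer first: this is not handled by the tool itself
gunzip CBW100NLD_R_20210010000_01D_MN.rnx.gz
# then pass the resulting (possibly CRINEX) file directly
cargo run -- -f CBW100NLD_R_20210010000_01D_MN.rnx
```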

RINEX Revisions

Many RINEX revisions exist; as noted above, this tool handles almost all of them.

Getting started

You can run the application with cargo, for instance:

```bash
cargo run -- --help
```
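If you prefer not to go through cargo run every time, you can also build the application once and invoke the binary directly (a sketch, assuming the binary is named rinex-cli, matching the crate name):

```bash
# build an optimized binary once
cargo build --release
# the binary name is assumed here; adjust it to whatever cargo actually produces
./target/release/rinex-cli --help
```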

Command line argument order does not matter.
The (input) file path is the only mandatory argument; all others are optional. The help menu tells you which arguments have a shortened version. Here is an example of how to use a shortened argument:

```bash
cargo run -- --filepath /tmp/amel010.21g
cargo run -- -f /tmp/amel010.21g
```

Some arguments, like filepath or obscodes, can take an array of values. In this case, we use a comma-separated list, like this:

```bash
cargo run -- -f /tmp/amel010.21g,/mnt/CBW100NLD_R_20210010000_01D_MN.rnx
```

Output format

This tool displays everything in the terminal (stdout). Pipe the output to a file in order to store it.

This tool uses the JSON format to expose data, which makes it easy to import into external tools for further calculations and processing, like Python scripts for instance.

At any moment, add the --pretty option to make the output more readable if desired; it remains valid JSON.
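For instance, a readable report can be redirected to a file and later loaded by any JSON-aware tool (an illustrative sketch reusing the example file and the --obscodes flag described below):

```bash
# store a human readable (yet valid JSON) report
cargo run -- -f /tmp/data.obs --obscodes --pretty > /tmp/report.json
```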

Epoch identification

--epoch or -e is used to export the epochs (sampling timestamps) identified in the given RINEX records. When data is extracted, it is always associated to a sampling timestamp; therefore, this flag should only be used when studying epoch events or sampling events specifically.

Example:

```bash
cargo run -- -f /tmp/data.obs --epoch --pretty
cargo run -- -f /tmp/data.obs -e > /tmp/epochs.json
```

OBS / DATA code identification

--obscodes or -o is used to identify which data codes (not necessarily OBS codes) are present in the given records. This flag is very useful because it lets the user understand which data (physics) is present, and efficient data filters can be built from that information.

```bash
cargo run -- -f /tmp/data.obs --obscodes --pretty
cargo run -- -f /tmp/data.obs -o > /tmp/data-codes.json
```

This flag name can be misleading, as it is also possible to use it to identify NAV data fields!
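For instance, running the same flag against the navigation file used earlier exposes the navigation data fields it contains (illustrative):

```bash
# works on NAV files too: lists the (navigation) data fields present in the record
cargo run -- -f /mnt/CBW100NLD_R_20210010000_01D_MN.rnx --obscodes --pretty
```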

Record resampling

Record resampling is work in progress!

Epoch filter

Some RINEX files, like Observation Data, associate an epoch flag to each epoch.
A non-Ok epoch flag describes a special event or external perturbations that happened at that sampling date. We provide the following arguments to easily discard unusual events, or focus on them to figure things out:

Example:

```bash
cargo run -- -f /tmp/data.obs -c C1C,C2C             # huge set
cargo run -- -f /tmp/data.obs --epoch-ok -c C1C,C2C  # reduced set
cargo run -- -f /tmp/data.obs --epoch-nok -c C1C,C2C # focus on unusual events
```

Satellite vehicle filter

--sv is one way to focus on specific satellite vehicles or constellations of interest. Use a comma-separated description!

Example:

```shell
cargo run -- --filepath /tmp/data.obs --epoch-ok \
    --sv G01,G2,E06,E24,R24
```

This will only retain data that has an EpochFlag::Ok from the GPS 1+2, GAL 6+24 and GLO 24 vehicles.

Constellation filter

Constellation filtering is not feasible at the moment; the only current workaround is the --sv filter.
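In practice, to focus on a single constellation today, one can simply enumerate vehicles of that constellation with --sv (an illustrative sketch; the vehicle list is arbitrary):

```bash
# poor man's "Galileo only" filter: list the Galileo vehicles of interest
cargo run -- -f /tmp/data.obs --sv E01,E06,E24
```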

LLI : RX condition filter

Observation data might have LLI flags attached to them.
It is possible to retain only data that have a matching LLI flag; for instance, 0x01 means Loss of Lock at the given epoch:

```shell
cargo run -- -f /tmp/data.obs --lli 1 --sv R01 > output.txt
```

SSI: signal "quality" filter

Observation data might have an SSI indication attached to them. It is possible to filter data according to this value and retain only data of a certain "quality".

For example, with the following value we only retain data with SSI >= 5, which means at least 30 dB SNR:

```shell
cargo run -- -f /tmp/data.obs --ssi 5 --sv R01 > output.txt
```

Data filter

We use the -c or --code argument to filter data and retain only the codes of interest.

Example:

```bash
cargo run -- -f CBW100NLD_R_20210010000_01D_MN.rnx -c L1C,S1P
```

Cumulated filters

Because all arguments can be cumulated, one can build efficient data filters and focus on the data of interest:

```bash
cargo run -- -f CBW100NLD_R_20210010000_01D_MN.rnx \
    --lli 0 \
    --ssi 5 \
    -c C1C,C2C,C1X \
    --sv G01,G2,G24,G25
```

Here --lli 0 retains "OK" RX conditions, --ssi 5 discards weak signals, -c C1C,C2C,C1X retains pseudo range (PR) measurements only, and --sv focuses on the GPS vehicles of interest.

teqc operations

This tool supports special operations that, to date, only teqc provides; it can therefore be an efficient alternative to that program.

All of the special operations actually create an output file.

Merge special operation

It is possible to perform merging operations with -m or --merge, in a teqc-like fashion.

When merging, if analyses are to be performed, they are performed on the resulting (merged) record.

For example:

```bash
cargo run -- -f file1.rnx,/tmp/file2.rnx --merge
```
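Since analyses apply to the merged record, the merge flag can be combined with the identification and filtering flags described above (an illustrative sketch reusing the file names from the previous example):

```bash
# merge both records, then identify the data codes present in the merged result
cargo run -- -f file1.rnx,/tmp/file2.rnx --merge --obscodes --pretty
```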