qsv: Ultra-fast CSV data-wrangling CLI toolkit

qsv is a command line program for indexing, slicing, analyzing, splitting, enriching, validating & joining CSV files. Commands are simple, fast and composable.

NOTE: qsv is a fork of the popular xsv utility, merging several pending PRs since xsv 0.13.0's release. It has numerous new features & 20 additional commands (for a total of 40). See FAQ for more details.

Available commands

| Command | Description |
| --- | --- |
| apply[^1] | Apply a series of string, date, currency & geocoding transformations to a CSV column. It also has some basic NLP functions (similarity, sentiment analysis, profanity, eudex & language detection). |
| behead | Drop headers from a CSV. |
| cat | Concatenate CSV files by row or by column. |
| count[^2] | Count the rows in a CSV file. (Instantaneous with an index.) |
| dedup[^3] | Remove redundant rows. |
| enum | Add a new column enumerating rows by adding a column of incremental or uuid identifiers. Can also be used to copy a column or fill a new column with a constant value. |
| exclude[^2] | Removes a set of CSV data from another set based on the specified columns. |
| explode | Explode rows into multiple ones by splitting a column value based on the given separator. |
| fetch | Fetches HTML/data from web pages or web services for every row in a URL column. |
| fill | Fill empty values. |
| fixlengths | Force a CSV to have same-length records by either padding or truncating them. Can also be used to inspect if a CSV is well-formed. |
| flatten | A flattened view of CSV records. Useful for viewing one record at a time. e.g. qsv slice -i 5 data.csv \| qsv flatten. |
| fmt | Reformat a CSV with different delimiters, record terminators or quoting rules. (Supports ASCII delimited data.) |
| foreach[^1] | Loop over a CSV to execute bash commands. (not available on Windows) |
| frequency[^2] | Build frequency tables of each column. (Uses multithreading to go faster if an index is present.) |
| generate[^1] | Generate test data by profiling a CSV using Markov decision process machine learning. |
| headers | Show the headers of a CSV. Or show the intersection of all headers between many CSV files. |
| index | Create an index for a CSV. This is very quick & provides constant time indexing into the CSV file. |
| input | Read a CSV with exotic quoting/escaping rules. |
| join[^2] | Inner, outer, cross, anti & semi joins. Uses a simple hash index to make it fast. |
| jsonl | Convert newline-delimited JSON (JSONL/NDJSON) to CSV. |
| lua[^1] | Execute a Lua script over CSV lines to transform, aggregate or filter them. |
| partition | Partition a CSV based on a column value. |
| pseudo | Pseudonymise the values of the given column by replacing them with an incremental identifier. |
| py[^1] | Evaluate a Python expression over CSV lines to transform, aggregate or filter them. Python's f-strings are particularly useful for extended formatting (Python 3.8+ required). |
| rename | Rename the columns of a CSV efficiently. |
| replace | Replace CSV data using a regex. |
| reverse[^3] | Reverse order of rows in a CSV. Unlike the sort --reverse command, it preserves the order of rows with the same key. |
| sample[^2] | Randomly draw rows (with optional seed) from a CSV using reservoir sampling (i.e., use memory proportional to the size of the sample). |
| schema | Infer schema from CSV data and output in JSON Schema format. See validate command. (WIP) |
| search | Run a regex over a CSV. Applies the regex to each field individually & shows only matching rows. |
| searchset | Run multiple regexes over a CSV in a single pass. Applies the regexes to each field individually & shows only matching rows. |
| select | Select, re-order, duplicate or drop columns. |
| slice[^2] | Slice rows from any part of a CSV. When an index is present, this only has to parse the rows in the slice (instead of all rows leading up to the start of the slice). |
| sort | Sorts CSV data in alphabetical, numerical, reverse or random (with optional seed) order. |
| split[^2] | Split one CSV file into many CSV files of N chunks. |
| stats[^2][^4] | Infer data type & compute descriptive statistics for each column in a CSV (sum, min/max, min/max length, mean, stddev, variance, quartiles, IQR, lower/upper fences, skew, median, mode, cardinality & nullcount). |
| table[^3] | Show aligned output of a CSV using elastic tabstops. |
| transpose[^3] | Transpose rows/columns of a CSV. |
| validate | Validate CSV data with JSON Schema. See schema command. |
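
Commands compose over pipes, so a typical session chains a few of them together. A few illustrative invocations (the file name data.csv is just an example):

```bash
# create an index so row counts and slices are (near) instantaneous
qsv index data.csv
# count the rows
qsv count data.csv
# infer data types & compute summary statistics, aligned for easy reading
qsv stats data.csv | qsv table
# view the 6th record vertically, one field per line
qsv slice -i 5 data.csv | qsv flatten
```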

Installation

Pre-built binaries for Windows, Linux and macOS are available from GitHub.

There are two versions of qsv: qsvlite has all optional features disabled, while qsv supports optional features, with the pre-built binaries enabling all features valid for each platform[^5].

Alternatively, you can compile from source by installing Cargo (Rust's package manager) and installing qsv using Cargo:

```bash
cargo install qsv
```

If you encounter compilation errors, ensure you're using the exact version of the dependencies qsv was built with by issuing:

```bash
cargo install qsv --locked
```

Compiling from this repository also works similarly:

```bash
git clone git://github.com/jqnatividad/qsv
cd qsv
cargo build --release

# or, if you encounter compilation errors
cargo build --release --locked
```

The compiled binary will end up in ./target/release/qsv.
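
As a quick sanity check, run the freshly built binary's version banner (discussed further under Performance Tuning below):

```bash
# prints the version, enabled allocator and job/CPU counts
./target/release/qsv --version
```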

To enable optional features, use cargo --features or --all-features (see Feature Flags for more info):

```bash
cargo install qsv --features apply,generate,lua,foreach,python
# or
cargo install qsv --all-features

# or, when compiling from a local repo
cargo build --release --features apply,generate,lua,foreach,python
# or
cargo build --release --all-features
```

NOTE: The pre-built binaries do not enable the python feature on some platforms (aarch64, i686, and arm), as GitHub's action runners lack access to a native python interpreter for those platforms. Compile natively on those platforms with Python 3.8+ installed if you want to enable the python feature.

Minimum Supported Rust Version

Building qsv requires Rust version 1.58.1 or later.

Tab Completion

qsv's command-line options are quite extensive. Thankfully, since it uses docopt for CLI processing, we can take advantage of docopt.rs' tab completion support to make it easier to use qsv at the command-line (currently, only bash shell is supported):

```bash
# install docopt-wordlist
cargo install docopt

# IMPORTANT: run these commands from the root directory of your qsv git repository
# to setup bash qsv tab completion
echo "DOCOPT_WORDLIST_BIN=\"$(which docopt-wordlist)\"" >> $HOME/.bash_completion
echo "source \"$(pwd)/scripts/docopt-wordlist.bash\"" >> $HOME/.bash_completion
echo "complete -F _docopt_wordlist_commands qsv" >> $HOME/.bash_completion
```

Recognized file formats

qsv recognizes CSV (.csv file extension) and TSV files (.tsv and .tab file extensions). CSV files are assumed to have "," (comma) as a delimiter, and TSV files, "\t" (tab). The delimiter is a single ASCII character that can be set either by the --delimiter command-line option or with the QSV_DEFAULT_DELIMITER environment variable.
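
For instance, a quick sketch (the semicolon-delimited file name here is just an example):

```bash
# treat ';' as the delimiter for a single invocation
qsv stats --delimiter ';' sales.csv
# or set a session-wide default for all qsv commands
export QSV_DEFAULT_DELIMITER=';'
```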

JSONL/NDJSON files are also recognized and converted to CSV with the jsonl command.

Environment Variables

Several dependencies also have environment variables that influence qsv's performance & behavior.

NOTE: To get a list of all qsv-relevant environment variables, run qsv --envlist.

Feature Flags

qsv has several features:

The following "power-user" commands can be abused and present "foot-shooting" scenarios:

* lua - enable lua command.
* foreach - enable foreach command (not valid for Windows).
* python - enable py command (requires Python 3.8+). Note that qsv will automatically use the currently activated python version when run in a virtual environment.

NOTE: qsvlite, as the name implies, always has non-default features disabled. qsv can be built with any combination of the above features using the cargo --features, --all-features & --no-default-features flags. The pre-built qsv binaries have all applicable features enabled for the target platform[^5].

Performance Tuning

CPU Optimization

Modern CPUs have various features that the Rust compiler can take advantage of to increase performance. If you want the compiler to take advantage of these CPU-specific speed-ups, set this environment variable BEFORE installing/compiling qsv:

On Linux and macOS:

```bash
export CARGO_BUILD_RUSTFLAGS='-C target-cpu=native'
```

On Windows PowerShell:

```powershell
$env:CARGO_BUILD_RUSTFLAGS='-C target-cpu=native'
```

Do note though that the resulting binary will only run on machines with the same architecture as the machine you installed/compiled from.
To find out your CPU architecture and other valid values for target-cpu:

```bash
rustc --print target-cpus

# to find out what CPU features are used by the Rust compiler WITHOUT specifying target-cpu
rustc --print cfg | grep -i target_feature

# to find out what additional CPU features will be used by the Rust compiler when you specify target-cpu=native
rustc --print cfg -C target-cpu=native | grep -i target_feature

# to get a short explanation of each CPU target-feature
rustc --print target-features
```

Memory Allocator

By default, qsv uses an alternative allocator - mimalloc, a performance-oriented allocator from Microsoft. If you want to use the standard allocator, use the --no-default-features flag when installing/compiling qsv, e.g.:

```bash
cargo install qsv --no-default-features
```

or

```bash
cargo build --release --no-default-features
```

To find out what memory allocator qsv is using, run qsv --version. After the qsv version number, the allocator used is displayed ("standard" or "mimalloc"). Note that mimalloc is not supported on the x86_64-pc-windows-gnu and arm targets, and you'll need to use the "standard" allocator on those platforms.

Buffer size

Depending on your filesystem's configuration (e.g. block size, file system type, SSD or rotating magnetic disks, whether you're writing to remote file systems such as sshfs, efs or nfs, etc.), you can also fine-tune qsv's read/write buffers.

By default, the read buffer size is 16k; you can change it by setting the QSV_RDR_BUFFER_CAPACITY environment variable (in bytes).

The same is true for the write buffer (default: 64k) with the QSV_WTR_BUFFER_CAPACITY environment variable.
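
For example, a minimal sketch (the sizes are illustrative; both variables take a value in bytes):

```bash
# use a 128k read buffer and a 512k write buffer
export QSV_RDR_BUFFER_CAPACITY=131072
export QSV_WTR_BUFFER_CAPACITY=524288
```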

Multithreading

Several commands support multithreading - stats, frequency and split.

Previously, these commands spawned a number of jobs equal to the number of logical processors. After extensive benchmarking, it turns out that doing so often makes the multithreaded runs slower than single-threaded ones.

Multithreaded jobs do increase performance - to a point. After a certain number of threads, there are not only diminishing returns; the multithreading overhead actually makes runs slower.

Starting with qsv 0.22.0, a heuristic of setting the maximum number of jobs to the number of logical processors divided by 3 is applied. The user can still manually override this using the --jobs command-line option or the QSV_MAX_JOBS environment variable, but testing shows negative returns start at around this point.
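
For example (the job count and file name are illustrative):

```bash
# cap a multithreaded command at 4 jobs for a single run
qsv stats --jobs 4 data.csv
# or override the heuristic for the whole session
export QSV_MAX_JOBS=4
```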

These observations were gathered using the benchmark script on a relatively large file (a 520MB, 41-column, 1M-row sample of NYC's 311 data). Performance will vary based on environment - CPU architecture, amount of memory, operating system, I/O speed, and the number of background tasks - so this heuristic will not work for every situation.

To find out your jobs setting, call qsv --version. The second-to-last number is the number of jobs qsv will use for multithreaded commands, and the last number is the number of logical processors detected by qsv.

Benchmarking for Performance

Use and fine-tune the benchmark script when tweaking qsv's performance to your environment. Don't be afraid to change the benchmark data and the qsv commands to something that is more representative of your workloads.

Use the generated benchmark TSV files to meter and compare performance across platforms. You'd be surprised how performance varies across environments - e.g. qsv's join performs abysmally on Windows' WSL running Ubuntu 20.04 LTS, taking 172.44 seconds. On the same machine, running the same Ubuntu version in a VirtualBox VM, join finished in 1.34 seconds - two orders of magnitude faster!

However, stats runs about twice as fast on WSL as in the VirtualBox VM - 2.80 seconds vs 5.33 seconds for the stats_index benchmark.

License

Dual-licensed under MIT or the UNLICENSE.

Sponsor

qsv was made possible by datHere - Data Infrastructure Engineering.
Standards-based, best-of-breed, open source solutions to make your Data Useful, Usable & Used.

Naming Collision

This project is unrelated to Intel's Quick Sync Video.