Rstats crates.io GitHub last commit Actions Status

Author: Libor Spacek

This crate is written in 100% safe Rust.

Usage

Insert Rstats = "^1" in the Cargo.toml file, under [dependencies].

Use in your source files any of the following structs, as and when needed:

```rust
use Rstats::{RE, RError, TriangMat, MStats, MinMax};
```

and any of the following traits:

```rust
use Rstats::{Stats, Vecg, Vecu8, MutVecg, VecVec, VecVecg};
```

and any of the following auxiliary functions:

```rust
use Rstats::{noop, fromop, sumn, t_stat, unit_matrix, re_error};
```

The latest (nightly) version is always available in the github repository Rstats. It may sometimes be a little ahead of the crates.io release versions in some details.

It is highly recommended to read and run tests.rs for examples of usage. To run all the tests, use a single thread in order not to print the results in confusing mixed-up order:

```bash
cargo test --release -- --test-threads=1 --nocapture
```

However, geometric_medians, which compares multithreading performance, should be run separately in multiple threads, as follows:

```bash
cargo test -r geometric_medians -- --nocapture
```

Alternatively, just to get a quick idea of the methods provided and their usage, read the output produced by an automated test run. Test logs are generated for each new push to the github repository. Click the latest (top) one, then Rstats, and then Run cargo test ... The badge at the top of this document lights up green when all the tests have passed; clicking it takes you to these logs as well.

Any compilation errors arising out of the rstats crate most likely indicate that some of the dependencies are out of date. Issuing the cargo update command will usually fix this.

Introduction

Rstats has a small footprint. Only the best methods are implemented, primarily with Data Analysis and Machine Learning in mind. They include multidimensional ('nd' or 'hyperspace') analysis, i.e. characterising clouds of n points in space of d dimensions.

Several branches of mathematics (statistics, information theory, set theory and linear algebra) are combined in this one consistent crate, based on the abstraction that they all operate on the same data objects (here Rust Vecs). The only difference is that an ordering of their components is sometimes assumed (in linear algebra, set theory) and sometimes it is not (in statistics, information theory, set theory).

Rstats begins with basic statistical measures, information measures, vector algebra and linear algebra. These provide self-contained tools for the multidimensional algorithms but they are also useful in their own right.

Non-analytical (non-parametric) statistics is preferred, whereby the 'random variables' are replaced by vectors of real data. Probability densities and other parameters are preferentially obtained from the real data (as pivotal quantities), not from some assumed distributions.

Linear algebra uses the generic data structure Vec<Vec<T>>, capable of representing irregular matrices.
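For illustration only (this is not crate-specific code), an irregular matrix is simply a Vec whose rows may differ in length:

```rust
fn main() {
    // Rows of a Vec<Vec<T>> need not all have the same length.
    let irregular: Vec<Vec<f64>> = vec![
        vec![1.0, 2.0, 3.0],
        vec![4.0],
        vec![5.0, 6.0],
    ];
    assert_eq!(irregular[1].len(), 1); // second row has a single element
}
```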

Struct TriangMat is defined and used for symmetric, anti-symmetric, and triangular matrices, and their transposed versions, saving memory.

Our treatment of multidimensional sets of points is constructed from first principles. Some original concepts, not found elsewhere, are defined and implemented here (see the next section).

Zero median vectors are generally preferred to commonly used zero mean vectors.

In n dimensions, many authors 'cheat' by using quasi medians (one-dimensional (1d) medians along each axis). Quasi medians are a poor start to stable characterisation of multidimensional data. They are also actually slower to compute than our gm (the true geometric median), as soon as the number of dimensions exceeds trivially small values.

Specifically, all such 1d measures are sensitive to the choice of axis and thus are affected by their rotation.

In contrast, our methods based on gm are axis (rotation) independent. Also, they are more stable, as medians have a 50% breakdown point (the maximum possible).

We compute geometric medians by methods gmedian and its parallel version par_gmedian in trait VecVec and their weighted versions wgmedian and par_wgmedian in trait VecVecg. It is mostly these efficient algorithms that make our new concepts described below practical.

Additional Documentation

For more detailed comments, plus some examples, see rstats in docs.rs. You may have to go directly to the module sources. These traits are implemented for the existing ('out of this crate') Rust Vec type, and rust docs do not display 'implementations on foreign types' very well.

New Concepts and their Definitions

Previously Known Concepts and Terminology

Implementation Notes

The main constituent parts of Rstats are its traits. The different traits are determined by the types of objects to be handled. The objects are mostly vectors of arbitrary length/dimensionality (d). The main traits implement methods applicable to:

The traits and their methods operate on arguments of their required categories. In classical statistical parlance, the main categories correspond to the number of 'random variables'.

Vec<Vec<T>> type is used for rectangular matrices (could also have irregular rows).

struct TriangMat is used for symmetric / antisymmetric / transposed / triangular matrices and wedge and geometric products. All instances of TriangMat store only n*(n+1)/2 items in a single flat vector, instead of n*n, thus almost halving the memory requirements. Their transposed versions only set a flag kind >= 3 that is interpreted by the software, instead of unnecessarily rewriting the whole matrix, thus saving some processing as well. All this is put to good use in our implementation of the matrix decomposition methods.
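As a sketch of the memory saving, the snippet below shows one common row-major packing of a lower triangle into a flat vector; it illustrates the n*(n+1)/2 storage idea only and is not necessarily TriangMat's exact internal layout:

```rust
/// Index into a row-major packed lower-triangular matrix.
/// Row i (0-based) starts at offset i*(i+1)/2, so element (i, j) with j <= i
/// lives at i*(i+1)/2 + j. An n x n symmetric matrix thus needs n*(n+1)/2 items.
fn packed_index(i: usize, j: usize) -> usize {
    debug_assert!(j <= i);
    i * (i + 1) / 2 + j
}

fn main() {
    // Lower triangle of the symmetric 3x3 matrix
    // [[2, 1, 0],
    //  [1, 3, 1],
    //  [0, 1, 4]]
    let packed = vec![2.0, 1.0, 3.0, 0.0, 1.0, 4.0];
    assert_eq!(packed.len(), 3 * (3 + 1) / 2);
    // Symmetric read: swap the indices when j > i.
    let get = |i: usize, j: usize| {
        if j <= i { packed[packed_index(i, j)] } else { packed[packed_index(j, i)] }
    };
    assert_eq!(get(0, 2), get(2, 0));
}
```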

The vectors' end types (of the actual data) are mostly generic: usually some numeric type. Copy trait bounds on these generic input types have been relaxed to Clone, to allow you to clone your own complex data end types in any way you choose. For ordinary simple types, this makes no difference to users.

The computed results end types are mostly f64.

Errors

The Rstats crate produces the custom error RError:

```rust
pub enum RError<T> where T: Sized + Debug {
    /// Insufficient data
    NoDataError(T),
    /// Wrong kind/size of data
    DataError(T),
    /// Invalid result, such as prevented division by zero
    ArithError(T),
    /// Other error converted to RError
    OtherError(T),
}
```

Each of its enum variants also carries a generic payload T. Most commonly this will be a String message, giving a more helpful explanation, e.g.:

```rust
if dif <= 0_f64 {
    return Err(RError::ArithError(format!(
        "cholesky needs a positive definite matrix {}", dif
    )));
};
```

format!(...) is used to insert values of variables into the payload String, as shown. These errors are returned and can then be automatically converted (with ?) to users' own errors. Some such error conversions are implemented at the bottom of the errors.rs file and used in tests.rs.
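For illustration, a user crate might wrap RE in its own error type along these lines. MyError and my_analysis are hypothetical names, not part of Rstats, and RError is assumed to derive Debug:

```rust
use Rstats::{Stats, RE}; // RE = RError<String>

/// Hypothetical user-side error type, shown only to illustrate `?` conversion.
#[derive(Debug)]
enum MyError {
    Stats(RE),
}

impl From<RE> for MyError {
    fn from(e: RE) -> Self {
        MyError::Stats(e)
    }
}

fn my_analysis(v: &[f64]) -> Result<f64, MyError> {
    // `?` converts any RE returned by amean into MyError automatically.
    Ok(v.amean()?)
}
```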

There is a type alias shortening return declarations to, e.g.: Result<Vec<f64>,RE>, where

```rust
pub type RE = RError<String>;
```

Convenience function re_error can be used to construct these errors with either String or &str payload messages, as follows:

```rust
if denom == 0. {
    return Err(re_error("arith", "Attempted division by zero!"));
};
```

Structs

struct MStats

holds the central tendency of 1d data, e.g. some kind of mean or median, and its spread measure, e.g. standard deviation or 'mad'.

struct TriangMat

holds triangular matrices of all kinds, as described in the Implementation Notes section above. Beyond the usual conversion to full matrix form, a number of (the best) Linear Algebra methods are implemented directly on TriangMat, in module triangmat.rs, such as:

Some methods implemented for VecVecg also produce TriangMat matrices, specifically the covariance/comediance calculations: covar and wcovar. Their results are positive definite, which makes the most efficient Cholesky-Banachiewicz decomposition applicable.
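For reference, the Cholesky-Banachiewicz recurrence factorises a positive definite A as A = L·Lᵀ, computing L row by row. The sketch below is a generic dense-matrix version written for clarity; it is not the TriangMat implementation, which works on the packed triangular storage:

```rust
/// Plain Cholesky-Banachiewicz decomposition of a dense symmetric
/// positive definite matrix. Returns the lower-triangular factor L with
/// A = L * L^T, or None when a non-positive pivot shows A is not positive definite.
fn cholesky(a: &[Vec<f64>]) -> Option<Vec<Vec<f64>>> {
    let n = a.len();
    let mut l = vec![vec![0.0; n]; n];
    for i in 0..n {
        for j in 0..=i {
            let sum: f64 = (0..j).map(|k| l[i][k] * l[j][k]).sum();
            if i == j {
                let d = a[i][i] - sum;
                if d <= 0.0 { return None; } // not positive definite
                l[i][j] = d.sqrt();
            } else {
                l[i][j] = (a[i][j] - sum) / l[j][j];
            }
        }
    }
    Some(l)
}
```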

Quantify Functions (Dependency Injection)

Most methods in the medians::Median trait and some methods in the indxvec crate, e.g. hashort, find_any and find_all, require an explicit closure to be passed to them, usually to tell them how to quantify input data of any type T into f64. A variety of different quantifying methods can then be dynamically employed.

For example, in text analysis (&str type), it can be the word length, or the numerical value of its first few bytes, or the numerical value of its consonants, etc. We can then sort the words or find their means / medians / spreads under these different measures. We do not necessarily want to explicitly store all such quantifications, as the data can be voluminous. Rather, we want to be able to compute any of them on demand.
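Two such quantifying closures for &str data might look like this. They are illustrations only, written in plain Rust, and are not closures defined by Rstats:

```rust
fn main() {
    let words = ["statistics", "median", "vector", "entropy"];

    // Quantify each word by its length.
    let by_len = |s: &&str| s.len() as f64;

    // Quantify each word by the numerical value of its first byte (0 if empty).
    let by_first_byte = |s: &&str| *s.as_bytes().first().unwrap_or(&0) as f64;

    // Either closure turns the same data into f64 values on demand,
    // without storing the quantifications permanently.
    let lengths: Vec<f64> = words.iter().map(by_len).collect();
    let first_bytes: Vec<f64> = words.iter().map(by_first_byte).collect();
    println!("{lengths:?} {first_bytes:?}");
}
```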

noop

is a shorthand dummy function to supply to these methods when the data is already of the f64 end type. The second line below is the full equivalent version that can be used instead:

```rust
noop
|f:&f64| *f
```

asop

When T is a wide primitive type, such as i64, u64, usize, that can only be converted to f64 by explicit truncation, we can use:

```rust
|f:&T| *f as f64
```

fromop

When T is a narrow numeric type, or is convertible by an existing From implementation, and f64:From<T> has been duly added everywhere as a trait bound, then we can pass in one of these:

```rust
fromop
|f:&T| (*f).clone().into()
```

All other cases previously required a manual implementation of the (global) From trait for each type and each different quantification method, whereby the different quantifications would conflict with each other. Now the user can simply pass in a custom 'quantify' closure. This generality is obtained at the price of a small inconvenience: having to supply closures of the above signatures even in simple cases.

Auxiliary Functions

Trait Stats

One dimensional statistical measures implemented for all numeric end types.

Its methods operate on one slice of generic data and take no arguments. For example, s.amean()? returns the arithmetic mean of the data in slice s. These methods are checked and will report RError(s), such as on an empty input. This means that you have to apply ? to their results to pass the errors up, or explicitly match them to take recovery actions, depending on the error variant.
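A minimal usage sketch, assuming the trait is imported as shown in the Usage section and that amean returns Result<f64, RE> as just described:

```rust
use Rstats::{Stats, RE};

fn centre_of(data: &[f64]) -> Result<f64, RE> {
    // `?` propagates the RError, e.g. NoDataError for an empty slice.
    let mean = data.amean()?;
    Ok(mean)
}

fn main() {
    match centre_of(&[1.0, 2.0, 3.0, 4.0]) {
        Ok(m) => println!("arithmetic mean: {m}"),
        Err(e) => eprintln!("rstats error: {e:?}"),
    }
}
```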

Included in this trait are:

Note that a fast implementation of 1d 'classic' medians is, as of version 1.1.0, provided in a separate crate medians.

Trait Vecg

Generic vector algebra operations between two slices &[T], &[U] of any (common) length (dimensions). Note that it may be necessary to invoke some of them using the 'turbofish' ::<type> syntax to indicate the type U of the supplied argument, e.g.:

```rust
datavec.somemethod::<f64>(arg)
```

Methods implemented by this trait:

Note that our median correlation is implemented in a separate crate medians.

Some simpler methods of this trait may be unchecked (for speed), so some caution with data is advisable.

Trait MutVecg

A select few of the Stats and Vecg methods (e.g. mutable vector addition, subtraction and multiplication) are reimplemented under this trait, so that they can mutate self in-place. This is more efficient and convenient in some circumstances, such as in vector iterative methods.

However, these methods do not fit in with the functional programming style, as they do not explicitly return anything (their calls are statements with side effects, rather than expressions).
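The trade-off can be illustrated with two hypothetical free functions, not Rstats methods: one returns a new vector (functional style), the other mutates its first argument in place:

```rust
// Functional style: allocates and returns a new vector (an expression).
fn add_new(a: &[f64], b: &[f64]) -> Vec<f64> {
    a.iter().zip(b).map(|(x, y)| x + y).collect()
}

// Mutating style: updates `a` in place (a statement with a side effect),
// avoiding one allocation per step of an iterative vector method.
fn add_in_place(a: &mut [f64], b: &[f64]) {
    for (x, y) in a.iter_mut().zip(b) {
        *x += y;
    }
}

fn main() {
    let mut v = vec![1.0, 2.0];
    let w = vec![3.0, 4.0];
    let u = add_new(&v, &w);   // v is unchanged
    add_in_place(&mut v, &w);  // v is mutated
    assert_eq!(u, v);
}
```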

Trait Vecu8

Some vector algebra, as above, that can be more efficient when the end type happens to be u8 (bytes). These methods have u8 appended to their names to avoid confusion with Vecg methods. These specific algorithms are different from their generic equivalents in Vecg.

Trait VecVec

Relationships between n vectors in d dimensions. This (hyper-dimensional) data domain is denoted here as (nd). It is in nd that the main original contribution of this library lies. The true geometric median (gm) is found by fast and stable iteration, using the improved Weiszfeld's algorithm gmedian. This algorithm solves Weiszfeld's convergence and stability problems in the neighbourhoods of existing set points. Its variant, par_gmedian, employs multithreading for faster execution and otherwise gives the same result.
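For orientation, the classic (unimproved) Weiszfeld iteration is sketched below. It is not the crate's gmedian: this naive version simply skips points that coincide with the current estimate, which is exactly the stability problem the improved algorithm addresses:

```rust
/// Classic Weiszfeld iteration for the geometric median of n points in d dimensions.
/// Each step moves the estimate to the inverse-distance-weighted mean of the points.
fn weiszfeld(points: &[Vec<f64>], iterations: usize) -> Vec<f64> {
    let d = points[0].len();
    // Start from the arithmetic mean (centroid).
    let mut gm: Vec<f64> = (0..d)
        .map(|j| points.iter().map(|p| p[j]).sum::<f64>() / points.len() as f64)
        .collect();
    for _ in 0..iterations {
        let mut num = vec![0.0; d];
        let mut den = 0.0;
        for p in points {
            let dist = p.iter().zip(&gm).map(|(a, b)| (a - b).powi(2)).sum::<f64>().sqrt();
            if dist > f64::EPSILON {
                let w = 1.0 / dist; // inverse distance weighting
                for j in 0..d { num[j] += w * p[j]; }
                den += w;
            }
        }
        if den > 0.0 {
            for j in 0..d { gm[j] = num[j] / den; }
        }
    }
    gm
}
```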

Trait VecVecg

Methods which take an additional generic vector argument, such as a vector of weights for computing weighted geometric medians (where each point has its own weight). Also matrix multiplications.

Appendix: Recent Releases