egobox-moe provides a Rust implementation of the mixture of experts algorithm. It is a Rust port of the mixture of experts method of the SMT Python library.
egobox-moe is a library crate in the top-level package egobox.
egobox-moe currently implements a mixture of Gaussian processes provided by egobox-gp, with clustering handled by the GMM implementation from linfa-clustering (linfa-clustering/gmm).

There are some usage examples in the examples/ directory. To run one, use:
$ cargo run --release --example clustering
Licensed under the Apache License, Version 2.0 (http://www.apache.org/licenses/LICENSE-2.0).