Rust toolbox for Efficient Global Optimization algorithms inspired by SMT.
`egobox` is twofold:

1. for developers: a set of Rust libraries useful to implement Bayesian optimization (EGO-like) algorithms,
2. for end-users: a Python module exposing the Python bindings of the implemented EGO-like optimizer, named `Egor`, and of the surrogate model `Gpx`, a mixture of Gaussian processes.
The `egobox` Rust libraries consist of the following sub-packages:
| Name | Description |
| :--- | :---------- |
| doe  | sampling methods; contains LHS, FullFactorial, Random methods |
| gp   | Gaussian process regression; contains Kriging and PLS dimension reduction |
| moe  | mixture of experts using GP models |
| ego  | efficient global optimization with basic constraints and mixed-integer handling |
Depending on the sub-packages you want to use, add the following declarations to your Cargo.toml:
```text
[dependencies]
egobox-doe = { version = "0.8.0" }
egobox-gp = { version = "0.8.0" }
egobox-moe = { version = "0.8.0" }
egobox-ego = { version = "0.8.0" }
```
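With those dependencies in place (plus `ndarray`), here is a minimal sketch exercising `egobox-doe`. It assumes the `Lhs::new(&xlimits).sample(n)` API and the `SamplingMethod` trait used in the doe `samplings` example; names may differ slightly across versions.

```rust
// Minimal sketch (assumed API, see the doe `samplings` example):
// draw 10 points with Latin Hypercube Sampling over a 2D design space.
use egobox_doe::{Lhs, SamplingMethod};
use ndarray::arr2;

fn main() {
    // One [min, max] row per design variable.
    let xlimits = arr2(&[[0.0, 10.0], [-5.0, 5.0]]);
    // `sample` is provided by the `SamplingMethod` trait.
    let samples = Lhs::new(&xlimits).sample(10);
    println!("{}", samples);
}
```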
### serializable-gp

The `serializable-gp` feature enables the serialization of GP models using the serde crate.
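As an illustration of what this enables, the hedged sketch below fits a Kriging model (using the `Kriging::params().fit(...)` construction from the gp `kriging` example) and serializes it with `serde_json`. It assumes the model types derive serde's `Serialize`/`Deserialize` once `serializable-gp` is enabled; exact names or array shapes may differ between versions.

```rust
// Hypothetical sketch: fit a GP and dump it to JSON.
// Assumes `egobox-gp` is built with the `serializable-gp` feature and that
// `linfa`, `ndarray` and `serde_json` are declared as dependencies.
use egobox_gp::Kriging;
use linfa::prelude::*;
use ndarray::array;

fn main() {
    let xt = array![[0.0], [1.0], [2.0], [3.0], [4.0]];
    let yt = array![0.0, 1.0, 1.5, 0.9, 1.0];
    let gp = Kriging::params()
        .fit(&Dataset::new(xt, yt))
        .expect("GP fitted");
    // Serialization is assumed to be available once the feature is enabled.
    let json = serde_json::to_string(&gp).expect("GP serialized");
    println!("{}", json);
}
```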
### persistent-moe

The `persistent-moe` feature enables `save()` and `load()` methods for the MoE model to/from a JSON file using the serde crate.
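For illustration, here is a hypothetical sketch of the save/load round trip. It assumes a `Moe::params().fit(...)` construction as in the moe examples, and that `save` takes a JSON file path while `load` is an associated function; exact signatures may differ.

```rust
// Hypothetical sketch: persist a mixture of experts to JSON and reload it.
// Assumes `egobox-moe` is built with the `persistent-moe` feature and that
// `linfa` and `ndarray` are declared as dependencies.
use egobox_moe::Moe;
use linfa::prelude::*;
use ndarray::array;

fn main() {
    let xt = array![[0.0], [1.0], [2.0], [3.0], [4.0]];
    let yt = array![0.0, 1.0, 1.5, 0.9, 1.0];
    let moe = Moe::params()
        .fit(&Dataset::new(xt, yt))
        .expect("MoE fitted");
    // Assumed signatures enabled by the `persistent-moe` feature.
    moe.save("moe.json").expect("MoE saved");
    let _reloaded = Moe::load("moe.json").expect("MoE loaded");
}
```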
Examples (in each sub-package's examples/ folder) are run as follows:
```bash
$ cd doe && cargo run --example samplings --release
```

```bash
$ cd gp && cargo run --example kriging --release
```

```bash
$ cd moe && cargo run --example clustering --release
```

```bash
$ cd ego && cargo run --example ackley --release
```
`egobox` relies on the linfa project for methods like clustering and dimension reduction, and it also tries to adopt, as far as possible, the same coding structures.

As with `linfa`, the linear algebra routines used in `gp`, `moe` and `ego` are provided by the pure-Rust linfa-linalg crate, the default linear algebra provider.

Otherwise, you can choose an external BLAS/LAPACK backend available through the ndarray-linalg crate. In this case, you have to specify the `blas` feature and a `linfa` BLAS/LAPACK backend feature (more information in linfa features).
Thus, for instance, to use `gp` with the Intel MKL BLAS/LAPACK backend, you could specify in your Cargo.toml the following features:
```text
[dependencies]
egobox-gp = { version = "0.8.0", features = ["blas", "linfa/intel-mkl-static"] }
```
or you could run the `gp` example as follows:
```bash
$ cd gp && cargo run --example kriging --release --features blas,linfa/intel-mkl-static
```
## `egobox` Python binding

The Python binding is built with the PyO3 project, which makes Rust well suited for building Python extensions. You can install the Python package using:
```bash
$ pip install egobox
```
See the tutorial notebooks for usage of the `Egor` optimizer and the `Gpx` surrogate model (a mixture of Gaussian processes).
If you find this project useful for your research, you may cite it as follows:
```text
@article{Lafage2022,
  author = {Rémi Lafage},
  title = {egobox, a Rust toolbox for efficient global optimization},
  journal = {Journal of Open Source Software},
  year = {2022},
  doi = {10.21105/joss.04737},
  url = {https://doi.org/10.21105/joss.04737},
  publisher = {The Open Journal},
  volume = {7},
  number = {78},
  pages = {4737},
}
```
Additionally, you may consider adding a star to the repository. This positive feedback improves the visibility of the project.