Implementing neural networks in Rust from scratch, plus utilities for data manipulation.
This was made for my own learning; it is obviously not a production-ready library by any means.
Feel free to give feedback.
Add this to your project's `Cargo.toml` file:

```toml
[dependencies]
neural_networks_rust = "*"
```
Simple regression workflow example:
```rust
// Loading a model specification from JSON
let mut model = Model::from_json_file("my_model_spec.json");

// Applying a data pipeline on it according to its dataset specification
let mut pipeline = Pipeline::basic_single_pass();
let (updated_dataset_spec, data) = pipeline
    .add(AttachIds::new("id"))
    .run("./dataset", &model.dataset);

let model = model.with_new_dataset(updated_dataset_spec);

// Training it using k-fold cross validation
// + extracting test & training metrics per folds & per epochs
// + extracting all predictions made during final epoch
let kfold = model.trainer.maybe_kfold().expect("We only do k-folds here!");
let (validation_preds, model_eval) = kfold
    .attach_real_time_reporter(|report| println!("Perf report: {:#?}", report))
    .run(&model, &data);

// Reverting the pipeline on the predictions & data to get interpretable values
let validation_preds = pipeline.revert_columnswise(&validation_preds);
let data = pipeline.revert_columnswise(&data);

// Joining the data and the predictions together
let data_and_preds = data.inner_join(&validation_preds, "id", "id", Some("pred"));

// Saving it all to disk
data_and_preds.to_file("my_model_preds.csv");
model_eval.to_json_file("my_model_evals.json");
```
You can then plot the results using a third-party crate like `gnuplot` (recommended), `plotly` or even `plotters`.
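For instance, here is a minimal sketch using the third-party `gnuplot` crate; the placeholder values and column names are assumptions, so adapt them to whatever your predictions CSV actually contains:

```rust
use gnuplot::{Caption, Figure};

fn main() {
    // Placeholder values: in practice, read the actual and predicted target columns
    // (e.g. "price" and its "pred" counterpart -- names depend on your dataset)
    // from my_model_preds.csv.
    let actual = vec![450_000.0, 520_000.0, 610_000.0];
    let predicted = vec![470_000.0, 505_000.0, 590_000.0];

    let mut fg = Figure::new();
    fg.axes2d()
        .points(&actual, &predicted, &[Caption("predicted vs. actual")]);
    // `show()` spawns a gnuplot process; ignoring its return value keeps this
    // compatible across crate versions that return different types here.
    let _ = fg.show();
}
```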
But first you would need to write or generate your model's specification.
Here is an example of generating it with code (recommended):
```rust
// Including all features from some CSV dataset
let mut dataset_spec = Dataset::from_csv("kc_house_data.csv");
dataset_spec
    // Removing useless features for both the model & derived features
    .remove_features(&["id", "zipcode", "sqft_living15", "sqft_lot15"])
    // Setting up the price as the "output" predicted feature
    .add_opt_to("price", Out)
    // Setting up the date format
    .add_opt_to("date", DateFormat("%Y%m%dT%H%M%S"))
    // Converting the date to a date_timestamp feature
    .add_opt_to("date", AddExtractedTimestamp)
    // Excluding the date from the model
    .add_opt_to("date", Not(&UsedInModel))
    // Mapping yr_renovated to yr_built if equal to 0
    .add_opt_to(
        "yr_renovated",
        Mapped(
            MapSelector::Equal(0.0.into()),
            MapOp::ReplaceWith(MapValue::Feature("yr_built".to_string())),
        ),
    )
    // Converting relevant features to their log10
    .add_opt(Log10.only(&["sqft_living", "sqft_above", "price"]))
    // Adding ^2 features of all input features
    // (including the added ones like the timestamp)
    .add_opt(AddSquared.except(&["price", "date"]).incl_added_features())
    // Filtering rows according to feature's outliers
    .add_opt(FilterOutliers.except(&["date"]).incl_added_features())
    // Normalizing everything
    .add_opt(Normalized.except(&["date"]).incl_added_features());

// Creating our layers
let h_size = dataset_spec.in_features_names().len() + 1;
let nh = 8;

let mut layers = vec![];
for _ in 0..nh {
    layers.push(LayerSpec::from_options(&[
        OutSize(h_size),
        Activation(ReLU),
        Optimizer(adam()),
    ]));
}
let final_layer = LayerSpec::from_options(&[
    OutSize(1),
    Activation(Linear),
    Optimizer(adam()),
]);

// Putting it all together
let model = Model::from_options(&[
    Dataset(dataset_spec),
    HiddenLayers(layers.as_slice()),
    FinalLayer(final_layer),
    BatchSize(128),
    Trainer(Trainers::KFolds(8)),
    Epochs(300),
]);

// Saving it all
model.to_json_file("my_model_spec.json");
```
Other features:

- `Vector<f64>` statistics & manipulation utils
- Linear algebra backends selectable via Cargo features: `nalgebra` (enabled by default), `linalg`, `linalg-rayon` and `faer`
- Switching the `Scalar` type from `f32`-backed (default) to `f64`-backed with the `f64` feature
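For example, a `Cargo.toml` entry selecting the `linalg-rayon` backend and the `f64` scalar type might look like the sketch below; the exact feature names and whether the default backend needs to be disabled are assumptions to verify against the crate's documentation:

```toml
[dependencies]
# Assumed feature names taken from the list above; check the crate's docs.
neural_networks_rust = { version = "*", default-features = false, features = ["linalg-rayon", "f64"] }
```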