
This library aims to be a complete deep learning framework with extreme flexibility, written in Rust. The goal is to satisfy researchers as well as practitioners, making it easier to experiment with, train, and deploy your models.

Features

Details

Example

A full example showing most of burn's features is available in the MNIST example.

Components

Knowing the main components will be of great help when you start playing with burn.

Backend

Almost everything is based on the Backend trait, which allows you to run tensor operations with different implementations without having to change your code. A backend does not necessarily have autodiff capabilities, so you can use ADBackend when you need them.
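To show the idea in miniature, here is a std-only sketch of the pattern: a hypothetical `Backend` trait (far simpler than burn's real one, which defines tensor storage and operations) with two implementations, and generic code that runs on either without change. All names below are illustrative, not burn's API.

```rust
// Hypothetical stand-in for a backend trait: each implementation
// supplies its own way of executing a primitive operation.
trait Backend {
    fn add(a: f32, b: f32) -> f32;
}

struct CpuBackend;
impl Backend for CpuBackend {
    fn add(a: f32, b: f32) -> f32 {
        a + b
    }
}

struct OtherBackend;
impl Backend for OtherBackend {
    fn add(a: f32, b: f32) -> f32 {
        a + b
    }
}

// Code written against the trait runs unchanged on any backend.
fn sum3<B: Backend>(a: f32, b: f32, c: f32) -> f32 {
    B::add(B::add(a, b), c)
}

fn main() {
    // Swapping backends is just a change of type parameter.
    assert_eq!(sum3::<CpuBackend>(1.0, 2.0, 3.0), 6.0);
    assert_eq!(sum3::<OtherBackend>(1.0, 2.0, 3.0), 6.0);
}
```

This is the same mechanism that lets the rest of the examples below stay generic over `B: Backend`.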

Tensor

The Tensor struct is at the core of the burn framework. It takes two generic parameters: the Backend and the number of dimensions D.

```rust
use burn::tensor::{Tensor, Shape, Data};
use burn::tensor::backend::{NdArrayBackend, TchBackend};

let my_ndarray_matrix = Tensor::<NdArrayBackend<f32>, 2>::ones(Shape::new([3, 3]));
let my_tch_matrix = Tensor::<TchBackend<f32>, 2>::from_data(
    Data::from([[1.0, 7.0], [13.0, -3.0]])
);
```

Note that Data is not specific to any backend.
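The idea can be sketched with plain std types: a backend-agnostic container holds raw values, and each backend converts it into its own representation. Everything here (`Data`, `from_data`, the backends) is a hypothetical miniature, not burn's actual types.

```rust
// Hypothetical: Data carries plain values with no backend attached.
#[derive(Clone)]
struct Data {
    values: Vec<f32>,
}

// Each backend defines its own tensor representation and conversion.
trait Backend {
    type TensorRepr;
    fn from_data(data: Data) -> Self::TensorRepr;
}

struct VecBackend;
impl Backend for VecBackend {
    type TensorRepr = Vec<f32>;
    fn from_data(data: Data) -> Vec<f32> {
        data.values
    }
}

struct BoxedBackend;
impl Backend for BoxedBackend {
    type TensorRepr = Box<[f32]>;
    fn from_data(data: Data) -> Box<[f32]> {
        data.values.into_boxed_slice()
    }
}

fn main() {
    let data = Data { values: vec![1.0, 7.0, 13.0, -3.0] };

    // The same Data feeds both backends.
    let a: Vec<f32> = VecBackend::from_data(data.clone());
    let b: Box<[f32]> = BoxedBackend::from_data(data);
    assert_eq!(a.as_slice(), &*b);
}
```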

Module

The Module derive lets you create your own neural network module, similar to PyTorch.

```rust
use burn::nn;
use burn::module::{Param, Module};
use burn::tensor::backend::Backend;

#[derive(Module, Debug)]
struct MyModule<B: Backend> {
    my_param: Param<nn::Linear<B>>,
    repeat: usize,
}
```

Note that only the fields wrapped inside Param are updated during training, and the other ones should implement Clone.

Forward

The Forward trait can also be implemented by your module.

```rust
use burn::module::Forward;
use burn::tensor::Tensor;

impl<B: Backend> Forward<Tensor<B, 2>, Tensor<B, 2>> for MyModule<B> {
    fn forward(&self, input: Tensor<B, 2>) -> Tensor<B, 2> {
        let mut x = input;

        for _ in 0..self.repeat {
            x = self.my_param.forward(x);
        }

        x
    }
}
```

Note that you can implement the Forward trait multiple times with different inputs and outputs.
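A std-only miniature of this overloading pattern, using a hypothetical `Forward<In, Out>` trait that merely mirrors the shape of burn's: one type can implement it once per input/output pair, and the call site picks the right implementation from the argument type.

```rust
// Hypothetical trait mirroring the shape of burn's Forward.
trait Forward<In, Out> {
    fn forward(&self, input: In) -> Out;
}

struct Doubler;

// Same type, two Forward implementations with different In/Out pairs.
impl Forward<f32, f32> for Doubler {
    fn forward(&self, input: f32) -> f32 {
        input * 2.0
    }
}

impl Forward<Vec<f32>, Vec<f32>> for Doubler {
    fn forward(&self, input: Vec<f32>) -> Vec<f32> {
        input.into_iter().map(|x| x * 2.0).collect()
    }
}

fn main() {
    let m = Doubler;

    // The argument type selects the implementation.
    let scalar: f32 = m.forward(3.0_f32);
    let batch: Vec<f32> = m.forward(vec![1.0, 2.0]);

    assert_eq!(scalar, 6.0);
    assert_eq!(batch, vec![2.0, 4.0]);
}
```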