Zyx is an open source tensor library. It defines the struct Variable, which adds a gradient to any datatype. A multidimensional array type is provided that can optionally use the matrixmultiply crate for faster execution.
From the user's perspective it works similarly to PyTorch, and function names are mostly the same, so you can quickly pick up this library if you are already familiar with PyTorch.
We want to provide a way to do automatic differentiation and backpropagation for any datatype, whether scalars, arrays, matrices, or tensors. This library aims to be a zero cost abstraction and to use plain Rust syntax for autodiff and backprop.
By passing a datatype into the .with_grad() function you create a Variable. A Variable stores your datatype and adds a gradient to it; the gradient has the same type as your datatype. Access to the gradient is managed through UnsafeCell, because the gradient must be accessed from different places during backpropagation.
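As a rough illustration (a simplified sketch, not zyx's actual definition), such a wrapper can pair a value with a gradient of the same type behind an UnsafeCell:

```rust
// Simplified sketch only, not zyx's real Variable: a value paired with a
// gradient of the same type. UnsafeCell lets the backward pass accumulate
// into the gradient through shared references held by different nodes of
// the graph.
use core::cell::UnsafeCell;

pub struct Variable<T> {
    data: T,
    grad: UnsafeCell<T>,
}

impl<T: Default> Variable<T> {
    pub fn new(data: T) -> Self {
        Self { data, grad: UnsafeCell::new(T::default()) }
    }
}
```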
A Tensor is the result of a mathematical or other operation performed on a Variable. Tensors build the graph needed for backpropagation at compile time.
All operations are executed eagerly.
TL;DR: by zero cost abstraction we mean zero dyn, zero Rc, zero RefCell and a minimal number of branches.
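To illustrate the idea (a hypothetical sketch, not zyx's real types): each operation can return a node whose type records which inputs it was built from, so the graph structure exists purely at compile time while the forward value is computed eagerly.

```rust
// Hypothetical sketch, not zyx's API: the graph node for an addition is an
// ordinary struct that borrows its inputs, so no dyn, Rc or RefCell is
// needed and the graph shape is fixed at compile time.
struct AddNode<'a> {
    lhs: &'a f32,
    rhs: &'a f32,
    out: f32, // forward result, computed eagerly on construction
}

fn add<'a>(lhs: &'a f32, rhs: &'a f32) -> AddNode<'a> {
    AddNode { lhs, rhs, out: lhs + rhs }
}

fn main() {
    let x = 3.0;
    let y = 5.0;
    let z = add(&x, &y);
    // The node keeps borrows of its inputs for a later backward pass.
    println!("forward: {}, inputs: {} {}", z.out, z.lhs, z.rhs);
}
```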
The syntax you will be using as a user is very close to PyTorch. Although the graph is created at compile time, it behaves completely dynamically (i.e. RNNs are easy), and you never need to call anything like graph.compile or graph.execute. Tensor and Variable are both immutable.
For examples of linear and recurrent neural networks, see the examples directory.
If you want to accelerate matrix multiplication using the matrixmultiply crate, use --features=matrixmultiply.
```rust
use zyx::prelude::*;
use zyx::accel::cpu::Buffer;

let x = Buffer::uniform((2usize, 3, 2, 3), -1., 1.).with_grad();
// The shape of y was missing in the original; (2usize, 3, 3, 4) is an
// illustrative choice that makes the matmul below valid.
let y = Buffer::uniform((2usize, 3, 3, 4), -1., 1.).with_grad();
let z = x.matmul(&y).sum((0i32, 1, 2, 3));
z.backward();
println!("{}", x.grad());
println!("{}", y.grad());
```
Want to use scalars? Just give them gradients!
```rust
use zyx::prelude::*;

let x = 3f32.with_grad();
let y = 5.;
let z = (&x + y).relu();
z.backward();
println!("{}", x.grad());
```
Want to use ndarray? Just give it gradients and use --features=ndarray!
Note that reduce and movement ops are not yet implemented for ndarray. Support for binary operations is limited.
```rust
use zyx::prelude::*;
use ndarray::array;

let x = array![[2., 3., 4.], [3., 4., 2.]];
let x = x.with_grad();
x.exp().backward();
println!("{}", x.grad());
```
The library is available on crates.io: https://crates.io/crates/zyx
Not all features are implemented yet and not all tests are written, so this library cannot be considered stable, but we are getting closer to a stable release as the main API will not change much anymore. Preliminary support for convolution is done. Most functionality is implemented and working as intended.
The modules, ordered from most to least important:
1. ops
2. tensor
3. accel
4. optim
5. module
6. shape
7. nn
8. init
Any opinions, issue reports, feature requests as well as code contributions are very welcome.