Zyx is an open-source tensor library. It defines the struct `Variable`, which adds a gradient to any datatype, and provides a multidimensional array that can optionally use the matrixmultiply crate for faster execution.
From a user's perspective, it works similarly to PyTorch. Function names are mostly the same, so if you are familiar with PyTorch, you can pick up this library quickly.
For examples of linear and recurrent neural networks, see the examples directory.
If you want to accelerate matrix multiplication using the matrixmultiply crate, use --features=matrixmultiply.
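The feature can also be enabled permanently in `Cargo.toml` instead of being passed on the command line (the version string below is a placeholder, not a specific release):

```toml
[dependencies]
# Enable the matrixmultiply backend for zyx; replace "*" with the desired version.
zyx = { version = "*", features = ["matrixmultiply"] }
```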
```rust
use zyx::prelude::*;
use zyx::accel::cpu::Buffer;

let x = Buffer::uniform((2, 3, 2, 3), -1., 1.).with_grad();
// The initializer for y was missing in the original; any shape compatible
// with matmul against x works here.
let y = Buffer::uniform((2, 3, 3, 4), -1., 1.).with_grad();
let z = x.matmul(&y).sum(());
z.backward();
println!("{}", x.grad());
println!("{}", y.grad());
```
Want to use scalars? Just give them gradients!
```rust
use zyx::prelude::*;

let x = 3f32.with_grad();
let y = 5.;
let z = (&x + y).relu();
z.backward();
println!("{}", x.grad());
```
Want to use ndarray? Just give it gradients and use --features=ndarray!
Note that reduce and movement ops are not yet implemented for ndarray.
```rust
use zyx::prelude::*;
use ndarray::array;

let x = array![[2., 3., 4.], [3., 4., 2.]];
let x = x.with_grad();
x.exp().backward();
println!("{}", x.grad());
```
The library is available on crates.io: https://crates.io/crates/zyx
Not all features are implemented yet, and not all tests are written, so this library cannot be considered stable. However, we are getting closer to a stable release, as the main API will not change much anymore. Preliminary support for convolution is done, and most functionality is implemented and working as intended.
This is the order of modules from most to least important:

1. ops
2. tensor
3. accel
4. optim
5. module
6. shape
7. nn
8. init
To all the users and contributors: without you, this library would have no reason to exist.
Opinions, issue reports, feature requests, and code contributions are all very welcome.