# Elara Math

Elara Math is a Rust-native math library, with (current or planned) support for:

*: GPU tensors are not available yet; GPU acceleration is planned for a future release.

As an example, here is a small working neural network built with elara-math:

```rs
use elara_math::prelude::*;

const EPOCHS: usize = 10000;
const LR: f64 = 1e-5;

fn main() {
    let train_data = tensor![
        [0.0, 0.0, 1.0],
        [1.0, 1.0, 1.0],
        [1.0, 0.0, 1.0],
        [0.0, 1.0, 1.0]
    ];
    let train_labels = tensor![[0.0], [1.0], [1.0], [0.0]].reshape([4, 1]);
    let mut weights = Tensor::rand([3, 1]);
    for epoch in 0..EPOCHS {
        let output = train_data.matmul(&weights).relu();
        let loss = elara_math::mse(&output, &train_labels);
        println!("Epoch {}, loss: {:?}", epoch, loss);
        loss.backward();
        let adjustment = weights.grad() * LR;
        weights = weights - Tensor::new(adjustment);
        weights.zero_grad();
    }
    let pred_data = tensor![[1.0, 0.0, 0.0]];
    let pred = &pred_data.matmul(&weights).relu();
    println!("Weights after training: {:?}", weights);
    println!("Prediction [1, 0, 0] -> {:?}", pred.borrow().data);
}
```

## Developing

To develop elara-math, first clone the repository:

```sh
git clone https://github.com/elaraproject/elara-math
```

Then, copy over the pre-commit githook:

```sh
cp .githooks/pre-commit .git/hooks/pre-commit && chmod a+x .git/hooks/pre-commit
```
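The actual hook contents live in `.githooks/pre-commit` in the repository. As a rough illustration only (not the real hook), a pre-commit hook for a Rust project typically runs the formatter and linter and aborts the commit if either fails:

```shell
#!/bin/sh
# Hypothetical sketch -- see .githooks/pre-commit for the real hook.
# Exit on the first failing command so a bad commit is rejected.
set -e
cargo fmt --all -- --check
cargo clippy -- -D warnings
```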

You should then be all set to start making changes!