Corgi

A neural network and tensor dynamic automatic differentiation implementation for Rust.

Build: GitHub Workflow | Download: crates.io | Documentation: docs.rs | Licence: MIT
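As a quick illustration of the tracked-array API used in the examples below, here is a minimal sketch of computing a gradient; the `use` path and `main` wrapper are assumptions added for completeness, so consult docs.rs for the authoritative module layout.

```rust
// a minimal sketch; the import path below is an assumption — see docs.rs
use corgi::array::*;

fn main() {
    // operations on tracked arrays are recorded so gradients can flow back through them
    let a = arr![3.0].tracked();
    let b = arr![2.0].tracked();

    // c = a * b, built from the overloaded operators on array references
    let mut c = &a * &b;

    // seed the backward pass with a delta of ones
    c.backward(None);

    // dc/da = b and dc/db = a
    assert_eq!(a.gradient(), arr![2.0]);
    assert_eq!(b.gradient(), arr![3.0]);
}
```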


BLAS

Important Design Notes

Examples

```rust
for _ in 0..iterations {
    let mut input = vec![0.0; input_size * batch_size];
    let mut target = vec![0.0; output_size * batch_size];

    // set inputs, and targets

    // arrays in Corgi should not be mutated after creation, so we initialise the values first
    let input = Array::from((vec![batch_size, input_size], input));
    let target = Array::from((vec![batch_size, output_size], target));

    let _result = model.forward(input.clone());
    let loss = model.backward(target.clone());
    // update the parameters, and clear gradients (backward pass only sets gradients)
    model.update();

    println!("loss: {}", loss);
}
```
* Dynamic computational graph:
```rust
let a = arr![5.0].tracked();
let b = arr![2.0].tracked();
let mut c = arr![0.0].tracked();

for _ in 0..10 {
    c = &c + &(&a * &b);
    if c[0] > 50.0 {
        c = &c * &a;
    }
}

assert_eq!(c, arr![195300.0]);

c.backward(None);
assert_eq!(c.gradient(), arr![1.0]);
assert_eq!(b.gradient(), arr![97650.0]);
assert_eq!(a.gradient(), arr![232420.0]);
```
* Custom operation (still needs some work).

Design

Tracked Arrays

Backward Pass

Informal UML sequence diagram

Name

Acknowledgements

Licence