adelsz/oxigrad

Oxigrad - toy automatic differentiation engine in Rust

This is a self-contained toy automatic differentiation engine written in Rust.
Built for learning purposes and partially inspired by micrograd.
The API is focused on ease of use and educational clarity; performance was not a priority.

Usage

    #[test]
    fn backprop_test() {
        let a = Value::new(2.0);
        let b = Value::new(1.0);
    
        // z = (a * b + a) * (a * b + b) = a^2 * b^2 + a^2 * b + a * b^2 + a * b
        let z = (&a * &b + &a) * (&a * &b + &b);
    
        backprop(&z);
    
        // dz/da = 2ab^2 + 2ab + b^2 + b = 2*2*1 + 2*2 + 1 + 1 = 10
        assert_eq!(a.grad().get(), 10.0);
    }
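The gradient asserted above can be sanity-checked numerically, independent of the engine: a central finite difference over the same expression should land very close to the analytic value. A minimal, self-contained sketch in plain Rust (no oxigrad types involved):

```rust
// z(a, b) = (a*b + a) * (a*b + b), the expression from the test above.
fn z(a: f64, b: f64) -> f64 {
    (a * b + a) * (a * b + b)
}

fn main() {
    let (a, b) = (2.0, 1.0);
    let h = 1e-6;
    // Central difference: dz/da ≈ (z(a+h, b) - z(a-h, b)) / 2h
    let dz_da = (z(a + h, b) - z(a - h, b)) / (2.0 * h);
    println!("dz/da ≈ {:.4}", dz_da); // ≈ 10.0000, matching the analytic derivative
}
```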

Basic neural network training

There is a basic NN training example in src/neuron.rs that also saves a visualization of training progress as a GIF file.
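For readers who want the overall shape of such a training loop without pulling in the crate, here is a minimal, self-contained sketch: gradient descent on a single linear neuron with hand-derived gradients. (The actual example in src/neuron.rs uses the engine's autodiff instead of manual gradients; the data and names below are purely illustrative.)

```rust
fn main() {
    // Toy dataset generated by y = 2x + 1.
    let xs = [0.0, 1.0, 2.0, 3.0];
    let ys = [1.0, 3.0, 5.0, 7.0];
    let (mut w, mut b) = (0.0f64, 0.0f64);
    let lr = 0.05;

    for _ in 0..2000 {
        // Accumulate gradients of the mean squared error L = err^2 / 2.
        let (mut dw, mut db) = (0.0, 0.0);
        for (&x, &y) in xs.iter().zip(&ys) {
            let err = w * x + b - y; // dL/dprediction
            dw += err * x;           // chain rule: dL/dw
            db += err;               // chain rule: dL/db
        }
        // Gradient-descent step on the averaged gradients.
        w -= lr * dw / xs.len() as f64;
        b -= lr * db / xs.len() as f64;
    }
    println!("w ≈ {:.2}, b ≈ {:.2}", w, b); // converges toward w = 2, b = 1
}
```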
