This is a from-scratch implementation of a feedforward neural network in Rust, built without using high-level machine learning libraries.
Medium Blog: https://medium.com/@dkjain2005co/neural-network-in-rust-on-mnist-dataset-from-scratch-f42971eaead3
A detailed explanation of how it works is given below.
Sample output from two training runs (the second run trained for more iterations):

```
DATA: 784, 60000
LABELS: 1, 60000
[[5, 0, 4, 1, 9, ..., 8, 3, 5, 6, 8]]
Iteration: 0
Accuracy: 10.86%
Iteration: 50
Accuracy: 56.97%
Iteration: 100
Accuracy: 69.91%
Iteration: 150
Accuracy: 75.45%
Iteration: 200
Accuracy: 78.56%
```

```
DATA: 784, 60000
LABELS: 1, 60000
[[5, 0, 4, 1, 9, ..., 8, 3, 5, 6, 8]]
Iteration: 0
Accuracy: 12.46%
Iteration: 50
Accuracy: 47.05%
Iteration: 100
Accuracy: 61.53%
Iteration: 150
Accuracy: 69.01%
Iteration: 200
Accuracy: 73.28%
Iteration: 250
Accuracy: 76.48%
Iteration: 300
Accuracy: 78.93%
Iteration: 350
Accuracy: 80.81%
Iteration: 400
Accuracy: 82.38%
Iteration: 450
Accuracy: 83.53%
Iteration: 500
Accuracy: 84.48%
```
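The accuracy figures printed above are the fraction of training samples whose predicted class (the argmax of the network's output probabilities) matches the label. A minimal sketch of that computation, using plain `Vec` instead of `ndarray` for self-containment (the function names here are illustrative, not the repository's API):

```rust
// Index of the largest value in a probability vector.
fn argmax(v: &[f64]) -> usize {
    v.iter()
        .enumerate()
        .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())
        .map(|(i, _)| i)
        .unwrap()
}

// Fraction of samples where the predicted class equals the label.
fn accuracy(predictions: &[Vec<f64>], labels: &[usize]) -> f64 {
    let correct = predictions
        .iter()
        .zip(labels)
        .filter(|(p, &y)| argmax(p) == y)
        .count();
    correct as f64 / labels.len() as f64
}

fn main() {
    let preds = vec![vec![0.1, 0.9], vec![0.8, 0.2], vec![0.3, 0.7]];
    let labels = vec![1, 0, 0];
    // Two of three predictions match their labels.
    println!("{:.2}", accuracy(&preds, &labels)); // 0.67
}
```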
Features:
- Manual forward and backward propagation
- ReLU and softmax activation functions
- One-hot encoding of labels
- Gradient descent for training
- Accuracy evaluation
- Model parameter export to CSV using polars
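The activation functions and one-hot encoding listed above can be sketched as follows. This uses plain `Vec<f64>` for brevity rather than the `ndarray` types the project itself uses:

```rust
// ReLU: clamp negative values to zero, elementwise.
fn relu(z: &[f64]) -> Vec<f64> {
    z.iter().map(|&v| v.max(0.0)).collect()
}

// Numerically stable softmax: subtract the max before exponentiating
// so large inputs don't overflow.
fn softmax(z: &[f64]) -> Vec<f64> {
    let max = z.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = z.iter().map(|&v| (v - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.into_iter().map(|e| e / sum).collect()
}

// One-hot encode a digit label (0..=9 for MNIST) into a length-10 vector.
fn one_hot(label: usize, classes: usize) -> Vec<f64> {
    let mut v = vec![0.0; classes];
    v[label] = 1.0;
    v
}

fn main() {
    println!("{:?}", relu(&[-1.0, 2.0]));  // [0.0, 2.0]
    println!("{:?}", softmax(&[1.0, 1.0])); // [0.5, 0.5]
    println!("{:?}", one_hot(3, 10));
}
```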
Dependencies:
- ndarray (stores 2-D arrays of data)
- ndarray-rand (generates the initial random weights (w) and biases (b))
- polars (reads and writes CSV data)
Architecture:
- Input layer, one hidden layer, output layer
- Input: 784-dimensional MNIST images
- Hidden layer: 10 neurons with ReLU activation
- Output layer: 10 neurons with softmax activation for multi-class classification
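A minimal sketch of the forward pass through this input → hidden (ReLU) → output (softmax) architecture. Plain `Vec`-based linear algebra stands in for `ndarray`, and the parameter names (`w1`, `b1`, `w2`, `b2`) are illustrative assumptions, not the repository's API:

```rust
// y = W x + b for one layer: each row of w dotted with x, plus a bias.
fn matvec(w: &[Vec<f64>], b: &[f64], x: &[f64]) -> Vec<f64> {
    w.iter()
        .zip(b)
        .map(|(row, &bi)| row.iter().zip(x).map(|(wi, xi)| wi * xi).sum::<f64>() + bi)
        .collect()
}

fn relu(z: &[f64]) -> Vec<f64> {
    z.iter().map(|&v| v.max(0.0)).collect()
}

fn softmax(z: &[f64]) -> Vec<f64> {
    let max = z.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = z.iter().map(|&v| (v - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.into_iter().map(|e| e / sum).collect()
}

// z1 = W1 x + b1; a1 = ReLU(z1); z2 = W2 a1 + b2; a2 = softmax(z2)
fn forward(w1: &[Vec<f64>], b1: &[f64], w2: &[Vec<f64>], b2: &[f64], x: &[f64]) -> Vec<f64> {
    let a1 = relu(&matvec(w1, b1, x));
    softmax(&matvec(w2, b2, &a1))
}

fn main() {
    // Tiny 3 -> 2 -> 2 instance just to exercise the shapes;
    // the real network is 784 -> 10 -> 10.
    let w1 = vec![vec![0.1; 3]; 2];
    let b1 = vec![0.0; 2];
    let w2 = vec![vec![0.5, -0.5]; 2];
    let b2 = vec![0.0; 2];
    let probs = forward(&w1, &b1, &w2, &b2, &[1.0, 0.0, 1.0]);
    // Softmax output always sums to 1.
    println!("{:?}", probs);
}
```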
Project structure:
- `main.rs`: Training loop and evaluation
- `lib.rs`: Core model logic (forward, backward, update, softmax, etc.)
- `final_config/`: Stores final weights and biases
- `mnistdata/`: Contains the input dataset
Make sure the MNIST dataset is placed in `mnistdata/`.