Neural Network from Scratch
Draw a digit below — watch pure math classify it.
This project builds a neural network from first principles using MNIST (60,000 train, 10,000 test) and serves the learned weights directly in-browser for real-time digit recognition.
Each MNIST image is 28x28 grayscale, flattened into a 784-length vector with values normalized to 0-1. This vector is the raw numeric signal the network learns from.
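As a minimal sketch of that input encoding (using a random stand-in array rather than a real MNIST digit):

```python
import numpy as np

# Stand-in for one MNIST image: 28x28 grayscale, pixel values 0-255
image = np.random.randint(0, 256, size=(28, 28), dtype=np.uint8)

# Flatten to a 784-length vector and normalize to the range 0-1
x = image.astype(np.float32).reshape(784) / 255.0

print(x.shape)  # (784,)
```

Every image the network ever sees, whether from the training set or the drawing canvas, is reduced to this same 784-float representation.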
The first weight matrix projects the 784 input pixels into 128 hidden units. ReLU zeroes out negative pre-activations and passes positive ones through, suppressing weak or noisy responses; over the course of training, this nonlinearity is what lets the hidden units become digit-sensitive feature detectors rather than random projections.
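The hidden layer can be sketched in a few lines of NumPy; the 784 → 128 shapes come from the text, while the small-Gaussian initialization here is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

# First layer: 784 input pixels -> 128 hidden units (init scale is an assumption)
W1 = rng.normal(0.0, 0.01, size=(128, 784))
b1 = np.zeros(128)

x = rng.random(784)        # one normalized input vector, values in 0-1

z1 = W1 @ x + b1           # linear projection into the 128-unit hidden space
h = np.maximum(0.0, z1)    # ReLU: keep positive activations, zero out the rest
```

Note that `h` is guaranteed non-negative: everything ReLU suppresses contributes exactly zero downstream, which is what makes the surviving activations read as detected features.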
Hidden activations are mapped to 10 logits, one for each digit class (0-9). Softmax converts logits into probabilities, and the highest probability is selected as the prediction.
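A sketch of the output layer, assuming the same kind of weight matrix as before and using the standard max-subtraction trick for a numerically stable softmax:

```python
import numpy as np

rng = np.random.default_rng(1)

# Second layer: 128 hidden units -> 10 logits (init scale is an assumption)
W2 = rng.normal(0.0, 0.01, size=(10, 128))
b2 = np.zeros(10)

h = rng.random(128)               # hidden activations from the previous layer

logits = W2 @ h + b2              # one raw score per digit class 0-9

# Softmax: exponentiate and normalize; subtracting the max avoids overflow
exp = np.exp(logits - logits.max())
probs = exp / exp.sum()

pred = int(np.argmax(probs))      # highest-probability class is the prediction
```

The ten probabilities always sum to 1, so the network's output reads directly as "how confident am I in each digit".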
Error is computed as actual - prediction, which gives each weight update its direction. Output-layer updates are scaled by the hidden activations and the learning rate, then the error is propagated back to the input-layer weights, so every update reflects that weight's contribution to the final error.
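One full update step might look like the following sketch. It assumes a two-layer 784 → 128 → 10 network with softmax output and a one-hot target; with cross-entropy loss, the gradient at the logits is exactly prediction − actual, so defining error as actual − prediction lets the updates be written as additions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical two-layer network: 784 -> 128 -> 10
W1 = rng.normal(0.0, 0.01, size=(128, 784)); b1 = np.zeros(128)
W2 = rng.normal(0.0, 0.01, size=(10, 128));  b2 = np.zeros(10)
lr = 0.1                              # learning rate (assumed value)

x = rng.random(784)                   # one normalized input vector
y = np.zeros(10); y[3] = 1.0          # one-hot target: the digit "3"

# Forward pass
z1 = W1 @ x + b1
h = np.maximum(0.0, z1)               # ReLU
logits = W2 @ h + b2
exp = np.exp(logits - logits.max())
probs = exp / exp.sum()               # softmax

# Error = actual - prediction (direction of the update)
err = y - probs

# Backprop the error through W2 and the ReLU gate *before* updating W2
delta1 = (W2.T @ err) * (z1 > 0)

# Output-layer update: scaled by hidden activations and learning rate
W2 += lr * np.outer(err, h)
b2 += lr * err

# Input-layer update: each weight moves by its contribution to the error
W1 += lr * np.outer(delta1, x)
b1 += lr * delta1
```

The `(z1 > 0)` mask is the ReLU derivative: hidden units that were shut off in the forward pass pass no error backward, so only the units that actually contributed to the prediction get adjusted.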
Canvas input is cropped, scaled to fit a 20x20 box, centered by center-of-mass in a 28x28 frame, and normalized before inference. Matching MNIST-style preprocessing is what makes the deployed model robust enough to reach ~95% test accuracy.
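The preprocessing pipeline described above might be sketched like this; the crop/scale/center steps come from the text, while the nearest-neighbor resize and the exact function shape are assumptions (a real deployment would read pixels from the canvas and may use a smoother resampling method):

```python
import numpy as np

def preprocess(canvas: np.ndarray) -> np.ndarray:
    """MNIST-style preprocessing sketch: crop to the ink, scale to fit a
    20x20 box, center by center of mass in a 28x28 frame, normalize to 0-1.
    Assumes the canvas contains at least one non-zero (drawn) pixel."""
    img = canvas.astype(np.float32)

    # Crop to the bounding box of non-zero pixels
    rows = np.any(img > 0, axis=1)
    cols = np.any(img > 0, axis=0)
    img = img[rows.argmax():len(rows) - rows[::-1].argmax(),
              cols.argmax():len(cols) - cols[::-1].argmax()]

    # Nearest-neighbor scale so the longer side becomes 20 pixels
    h, w = img.shape
    scale = 20.0 / max(h, w)
    nh, nw = max(1, round(h * scale)), max(1, round(w * scale))
    ri = (np.arange(nh) / scale).astype(int).clip(0, h - 1)
    ci = (np.arange(nw) / scale).astype(int).clip(0, w - 1)
    img = img[np.ix_(ri, ci)]

    # Place in a 28x28 frame so the center of mass lands at (14, 14)
    frame = np.zeros((28, 28), dtype=np.float32)
    total = img.sum()
    cy = (img * np.arange(img.shape[0])[:, None]).sum() / total
    cx = (img * np.arange(img.shape[1])[None, :]).sum() / total
    top = min(max(int(round(14 - cy)), 0), 28 - img.shape[0])
    left = min(max(int(round(14 - cx)), 0), 28 - img.shape[1])
    frame[top:top + img.shape[0], left:left + img.shape[1]] = img

    return frame / max(float(frame.max()), 1e-8)  # normalize to 0-1
```

Skipping this step is the classic failure mode for demos like this: a digit drawn in a corner of the canvas looks nothing like the centered, size-normalized digits the network was trained on.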