Neural Network Demo

This demo implements a small feed-forward neural network that classifies handwritten digits from the MNIST dataset. Each 28x28 image is flattened into a 784-dimensional vector and passed through several fully connected layers with ReLU activations, followed by a softmax output layer that produces a probability for each of the digits 0-9.
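The forward pass described above can be sketched as follows. This is a minimal illustration, not the demo's actual code: the layer sizes (784 -> 128 -> 10, a single hidden layer) and the weight initialization are assumptions chosen for brevity.

```python
import numpy as np

def relu(z):
    # Elementwise ReLU activation: max(0, z)
    return np.maximum(0.0, z)

def softmax(z):
    # Numerically stable softmax over the last axis
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical layer sizes: 784 -> 128 -> 10 (one hidden layer for brevity)
rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.01, (784, 128)); b1 = np.zeros(128)
W2 = rng.normal(0.0, 0.01, (128, 10));  b2 = np.zeros(10)

def forward(x):
    # x: batch of flattened 28x28 images, shape (batch, 784)
    h = relu(x @ W1 + b1)
    return softmax(h @ W2 + b2)

probs = forward(rng.random((4, 784)))  # shape (4, 10), each row sums to 1
```

Each row of `probs` is a probability distribution over the ten digit classes.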

The network is trained using cross-entropy loss and mini-batch gradient descent. During training, backpropagation computes how each weight contributed to the error, allowing the model to update its parameters and gradually improve its predictions.
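One useful fact behind this training setup: when softmax is followed by cross-entropy, the gradient of the loss with respect to the logits simplifies to `probs - onehot(y)`, which is where backpropagation starts. A small sketch of this identity (the function names here are illustrative, not the demo's):

```python
import numpy as np

def cross_entropy(probs, y):
    # Mean negative log-probability of the true class;
    # probs: (batch, 10) softmax outputs, y: integer labels
    return -np.log(probs[np.arange(len(y)), y]).mean()

def softmax_ce_grad(probs, y):
    # dL/dlogits = probs - onehot(y), averaged over the batch
    g = probs.copy()
    g[np.arange(len(y)), y] -= 1.0
    return g / len(y)

probs = np.full((2, 10), 0.1)      # uniform predictions over 10 classes
y = np.array([3, 7])
loss = cross_entropy(probs, y)     # log(10) when predictions are uniform
grad = softmax_ce_grad(probs, y)
```

Each row of `grad` sums to zero, since increasing one logit necessarily lowers the softmax probability of the others.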

The entire implementation is written from scratch in NumPy to illustrate the fundamental mechanics of neural networks and to show that such models can be built without relying on a machine learning framework.
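To make the mechanics concrete, here is a self-contained sketch of a full training loop in the same spirit: one hidden layer, mini-batch gradient descent, and hand-written backpropagation. The synthetic data, layer sizes, and learning rate are all assumptions standing in for MNIST, chosen so the example runs instantly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for MNIST: 256 random "images", 2 classes (not the real dataset)
X = rng.random((256, 784))
y = (X.sum(axis=1) > 392).astype(int)

W1 = rng.normal(0.0, 0.1, (784, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.1, (32, 2));   b2 = np.zeros(2)
lr, batch_size = 0.5, 64

losses = []
for epoch in range(10):
    perm = rng.permutation(len(X))
    epoch_loss = 0.0
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        xb, yb = X[idx], y[idx]

        # Forward pass: ReLU hidden layer, softmax output
        h = np.maximum(0.0, xb @ W1 + b1)
        z = h @ W2 + b2
        e = np.exp(z - z.max(axis=1, keepdims=True))
        p = e / e.sum(axis=1, keepdims=True)
        epoch_loss += -np.log(p[np.arange(len(yb)), yb]).sum()

        # Backpropagation: softmax+cross-entropy gradient, then chain rule
        dz = p.copy(); dz[np.arange(len(yb)), yb] -= 1.0; dz /= len(yb)
        dW2 = h.T @ dz; db2 = dz.sum(axis=0)
        dh = dz @ W2.T; dh[h <= 0] = 0.0      # ReLU derivative
        dW1 = xb.T @ dh; db1 = dh.sum(axis=0)

        # Mini-batch gradient descent update
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    losses.append(epoch_loss / len(X))
```

On this easy synthetic task the average loss drops over the ten epochs, which is a quick sanity check that the gradients are wired up correctly.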