# MLP

Multi-Layer Perceptron (MLP) from scratch to classify digits in the MNIST dataset

This project implements a Multi-Layer Perceptron (MLP) from scratch to classify digits in the MNIST dataset. Each layer has both forward and backward pass implementations. No built-in functions from PyTorch or similar libraries are used for the components below.

## Implementation

- Fully-connected layer with bias.
- ReLU and Sigmoid activations.
- Mini-batch stochastic gradient descent (SGD) and cross-entropy loss.
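Since the source code itself is not included here, the following is a minimal NumPy sketch of what such components typically look like. All names (`Linear`, `relu`, `softmax_cross_entropy`) are illustrative assumptions, not the project's actual API:

```python
import numpy as np

class Linear:
    """Fully-connected layer with bias: y = x @ W + b (hypothetical sketch)."""
    def __init__(self, n_in, n_out, rng):
        self.W = rng.standard_normal((n_in, n_out)) * np.sqrt(2.0 / n_in)
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x  # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out, lr):
        # Gradients of the loss w.r.t. parameters and input
        dW = self.x.T @ grad_out
        db = grad_out.sum(axis=0)
        grad_in = grad_out @ self.W.T
        # Plain mini-batch SGD update
        self.W -= lr * dW
        self.b -= lr * db
        return grad_in

def relu(z):
    return np.maximum(0.0, z)

def relu_backward(z, grad_out):
    # Pass gradients only where the forward input was positive
    return grad_out * (z > 0)

def softmax_cross_entropy(logits, labels):
    """Mean cross-entropy loss and its gradient w.r.t. the logits."""
    shifted = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
    n = logits.shape[0]
    loss = -np.log(probs[np.arange(n), labels]).mean()
    grad = probs.copy()
    grad[np.arange(n), labels] -= 1.0
    return loss, grad / n
```

In a training loop, each mini-batch flows forward through the layers, the loss gradient flows backward through `backward`, and the SGD update happens in place.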

The source file is not uploaded, in accordance with Eindhoven University of Technology (TU/e) assignment guidelines.