MLP

Multi-Layer Perceptron (MLP) from scratch to classify digits in the MNIST dataset

This project implements a Multi-Layer Perceptron (MLP) from scratch to classify digits in the MNIST dataset. Each of its layers has both a forward and a backward pass implementation. No built-in functions from PyTorch or similar libraries are used for the components listed below.

Implementation

• Fully-connected layer with bias.

• ReLU and Sigmoid activations.

• Mini-batch stochastic gradient descent (SGD) and cross-entropy loss (a brief sketch of these components follows below).
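
Since the actual assignment code is not published (see the note below), the following is only a minimal NumPy sketch of how these components could fit together. The class names, layer sizes, and learning rate are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

# Illustrative sketch only; names and hyperparameters are assumptions.

class FullyConnected:
    """Fully-connected layer with bias: y = x @ W + b."""
    def __init__(self, in_features, out_features):
        rng = np.random.default_rng(0)
        self.W = rng.normal(0.0, 0.01, size=(in_features, out_features))
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x                        # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out, lr):
        grad_W = self.x.T @ grad_out      # dL/dW
        grad_b = grad_out.sum(axis=0)     # dL/db
        grad_x = grad_out @ self.W.T      # dL/dx, passed to the previous layer
        self.W -= lr * grad_W             # mini-batch SGD update
        self.b -= lr * grad_b
        return grad_x


class ReLU:
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

    def backward(self, grad_out, lr=None):
        return grad_out * self.mask


class Sigmoid:
    def forward(self, x):
        self.out = 1.0 / (1.0 + np.exp(-x))
        return self.out

    def backward(self, grad_out, lr=None):
        return grad_out * self.out * (1.0 - self.out)


def cross_entropy_with_softmax(logits, labels):
    """Softmax followed by cross-entropy; returns loss and gradient w.r.t. logits."""
    shifted = logits - logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
    n = logits.shape[0]
    loss = -np.log(probs[np.arange(n), labels] + 1e-12).mean()
    grad = probs.copy()
    grad[np.arange(n), labels] -= 1.0
    return loss, grad / n


# Usage on a random mini-batch (MNIST images flattened to 784 features).
layers = [FullyConnected(784, 128), ReLU(), FullyConnected(128, 10)]
x = np.random.rand(32, 784)               # stand-in for a mini-batch of images
y = np.random.randint(0, 10, size=32)     # stand-in for digit labels

out = x
for layer in layers:                      # forward pass through all layers
    out = layer.forward(out)
loss, grad = cross_entropy_with_softmax(out, y)
for layer in reversed(layers):            # backward pass with SGD updates
    grad = layer.backward(grad, lr=0.01)
print(f"loss: {loss:.4f}")
```

The key design point illustrated here is that each layer caches what it needs during the forward pass, so that in the backward pass it only requires the incoming gradient to compute its parameter updates and the gradient it hands to the previous layer.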

The implementation file is not uploaded, in accordance with Eindhoven University of Technology (TU/e) assignment guidelines.
