Assignment submission for the course Fundamentals of Deep Learning (CS6910) in the Spring 2022 Semester, under Prof. Mitesh Khapra
Repository with the submissions for the 'Fundamentals of Optimization' course, where gradient descent and its variants are implemented. These include gradient descent with a fixed step size (alpha), Nesterov GD with a fixed step size, GD with a decreasing step size, and GD with diagonal scaling and a fixed step size.
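As a minimal sketch of the Nesterov GD variant listed above (the function name and parameters are illustrative, not taken from the repository):

```python
import numpy as np

def nesterov_gd(grad, x0, alpha=0.1, mu=0.9, iters=100):
    """Nesterov accelerated gradient descent with a fixed step size.

    grad:  function returning the gradient at a point
    x0:    starting point
    alpha: fixed step size; mu: momentum coefficient
    """
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(iters):
        # Key Nesterov idea: evaluate the gradient at the
        # "look-ahead" point x + mu * v, not at x itself.
        v = mu * v - alpha * grad(x + mu * v)
        x = x + v
    return x

# Example: minimize f(x) = x^2, whose gradient is 2x
x_min = nesterov_gd(lambda x: 2 * x, np.array([5.0]), alpha=0.1, mu=0.9, iters=200)
```

Plain GD with the same fixed step differs only in evaluating the gradient at `x`; the look-ahead term is what gives Nesterov its acceleration.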
This repository contains various machine learning tools, such as gradient descent, implemented from scratch using packages like NumPy, Matplotlib, and pandas.
Package used for mathematical optimization.
🧠Implementation of a Neural Network from scratch in Python for the Machine Learning Course.
SVM algorithms implementation from scratch for AI539 class project
Digit recognition neural network using the MNIST dataset. Features include a full GUI, convolution, pooling, momentum, Nesterov momentum, RMSProp, batch normalization, and deep networks.
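A hedged sketch of the RMSProp update mentioned above, in NumPy (names and hyperparameters are illustrative, not from the repository):

```python
import numpy as np

def rmsprop(grad, x0, lr=0.01, decay=0.9, eps=1e-8, iters=500):
    """RMSProp: divide the step by a running RMS of past gradients."""
    x = np.asarray(x0, dtype=float)
    s = np.zeros_like(x)
    for _ in range(iters):
        g = grad(x)
        s = decay * s + (1 - decay) * g * g    # running mean of squared gradients
        x = x - lr * g / (np.sqrt(s) + eps)    # per-coordinate adaptive step
    return x

# Example: minimize f(x) = x^2, whose gradient is 2x
x_min = rmsprop(lambda x: 2 * x, np.array([2.0]), lr=0.01, iters=500)
```

Because each coordinate is scaled by its own gradient history, RMSProp takes roughly uniform-sized steps regardless of the raw gradient magnitude.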
Implemented optimization algorithms, including Momentum, AdaGrad, RMSProp, and Adam, from scratch using only NumPy in Python. Implemented the Broyden-Fletcher-Goldfarb-Shanno (BFGS) optimizer and conducted a comparative analysis of its results with those obtained using Adam.
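The Adam optimizer named above combines momentum with RMSProp-style scaling; a minimal NumPy sketch (illustrative names and defaults, not the repository's code):

```python
import numpy as np

def adam(grad, x0, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, iters=1000):
    """Adam: bias-corrected first- and second-moment estimates of the gradient."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # first moment (mean of gradients)
    v = np.zeros_like(x)  # second moment (mean of squared gradients)
    for t in range(1, iters + 1):
        g = grad(x)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        m_hat = m / (1 - b1 ** t)   # bias correction: moments start at zero,
        v_hat = v / (1 - b2 ** t)   # so early estimates are rescaled upward
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Example: minimize f(x) = x^2, whose gradient is 2x
x_min = adam(lambda x: 2 * x, np.array([3.0]))
```

BFGS, by contrast, is a quasi-Newton method that builds an approximation of the inverse Hessian from gradient differences, which is what makes the comparison with Adam interesting.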
Implementation of SVD without using external packages.
"Simulations for the paper 'An Overview of Gradient Descent Optimization Algorithms' by Sebastian Ruder"
ISANet is a Neural Network Library.
Primal-Dual algorithm for smooth regularization of non-smooth optimization functions
ASCPD: an accelerated algorithm for CPD
Neural Networks and optimizers from scratch in NumPy, featuring newer optimizers such as DemonAdam or QHAdam.
NAG-GS: Nesterov Accelerated Gradients with Gauss-Seidel splitting
Hands-on implementation of gradient-descent-based optimizers in raw Python
From linear regression towards neural networks...
[Python] [arXiv/cs] Paper "An Overview of Gradient Descent Optimization Algorithms" by Sebastian Ruder
A collection of various gradient descent algorithms implemented in Python from scratch
Educational deep learning library in plain NumPy.