
Adam_Vs_Vanilla_GradientDescent_From_Scratch

Implement Adam and compare it with Vanilla GD

Objective of this notebook

  • Implement Adam and compare its convergence with that of Vanilla Gradient Descent
  • Details of the problem statement, dataset, summary of the code/solution, sample output/prediction from the program, and the final result of the project are listed in the sections that follow.

Problem Statement/Prologue

This notebook builds upon our previous notebook "Vanilla Gradient Descent From Scratch", which is available in my repository. In this notebook, we build on top of that work, implement Adam, and observe its effect on convergence. We jump straight into Adam, so if the reader needs more background on gradient descent, I would recommend going through the previous notebook (which has detailed comments on Gradient Descent); a brief recap of the vanilla update is sketched below.
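As a quick refresher, a minimal sketch of one vanilla gradient descent step for a simple linear model y = m*x + c with mean squared error loss might look like this; the function name, variable names, and learning rate are illustrative assumptions and not necessarily the notebook's exact code.

```python
import numpy as np

def vanilla_gd_step(m, c, x, y, lr=0.01):
    """One vanilla gradient descent step for y_hat = m * x + c with MSE loss.

    Illustrative sketch: names and the learning rate are assumptions,
    not necessarily what the notebook uses.
    """
    error = (m * x + c) - y
    # Gradients of the mean squared error with respect to m and c
    grad_m = 2.0 * np.mean(error * x)
    grad_c = 2.0 * np.mean(error)
    # Move against the gradient, scaled by the learning rate
    return m - lr * grad_m, c - lr * grad_c
```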

Data Description:

The dataset is manually created for the purpose of this exercise.
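The exact values are not shown in this README, but a manually created linear toy dataset of this kind could be generated as follows; the slope, intercept, noise level, sample count, and random seed are all assumed purely for illustration.

```python
import numpy as np

# Illustrative toy dataset: points scattered around a straight line.
# The true slope/intercept, noise level, and sample count are assumptions,
# not the notebook's actual values.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.5 * x + 1.0 + rng.normal(scale=2.0, size=x.shape)
```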

Domain:

Deep Learning: Proof of concept

Summary of the Solution/Code:

  • We implement Vanilla Gradient Descent on a dataset and log observations
  • We implement Adam on the same dataset and log observations (a minimal sketch of the Adam update follows this list)
  • We compare results and demonstrate which algorithm is superior
  • Refer to the Python notebook Adam_Vs_VanillaGD.ipynb for the solution
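For reference, here is a minimal sketch of the standard Adam update applied to the same two-parameter linear model; the hyperparameter defaults are the commonly used ones (lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8), and the function and variable names, toy data, and iteration count are assumptions rather than the notebook's exact code.

```python
import numpy as np

def adam_step(params, grads, state, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; `params` and `grads` are arrays of the same shape.

    `state` holds the running first/second moment estimates and step counter
    and must be passed back in on every call. Names and defaults follow the
    standard Adam formulation, assumed here for illustration.
    """
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grads        # first moment
    state["v"] = beta2 * state["v"] + (1 - beta2) * grads ** 2   # second moment
    m_hat = state["m"] / (1 - beta1 ** state["t"])               # bias correction
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    return params - lr * m_hat / (np.sqrt(v_hat) + eps)

# Usage sketch: fit [slope, intercept] of y_hat = m * x + c with MSE loss
# on an assumed toy dataset.
x = np.linspace(0, 10, 50)
y = 2.5 * x + 1.0
params = np.array([0.0, 0.0])                          # [m, c]
state = {"m": np.zeros(2), "v": np.zeros(2), "t": 0}
for _ in range(2000):
    error = (params[0] * x + params[1]) - y
    grads = np.array([2.0 * np.mean(error * x), 2.0 * np.mean(error)])
    params = adam_step(params, grads, state, lr=0.01)
```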

Visualizing Input:

[Figure: input data with the random starting line (red) and the best-fit target line (blue)]

The above image depicts the following (a minimal sketch of how such a plot could be produced appears after this list):

  • We start at a random line (marked in RED)
  • The BLUE line indicates the best-fit line, which is our target
  • We need to arrive at this line through the Gradient Descent algorithm
  • Let us see how changing the learning rate affects this convergence
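A plot like the one above could be produced along these lines; the data, the red line's starting parameters, and the styling are assumptions for illustration only.

```python
import numpy as np
import matplotlib.pyplot as plt

# Assumed toy data and an assumed random starting line
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.5 * x + 1.0 + rng.normal(scale=2.0, size=x.shape)
m0, c0 = 0.5, -1.0                                   # arbitrary starting guess

# Best-fit (target) line via ordinary least squares
slope, intercept = np.polyfit(x, y, 1)

plt.scatter(x, y, label="data")
plt.plot(x, m0 * x + c0, color="red", label="random starting line")
plt.plot(x, slope * x + intercept, color="blue", label="best-fit target line")
plt.legend()
plt.show()
```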

Result with Vanilla Gradient Descent:

[Figures: results with Vanilla Gradient Descent]

Result with Adam:

[Figures: results with Adam]

Compare Results:

[Figure: comparison of convergence for Adam vs Vanilla Gradient Descent]

