
Variants of RMSProp and Adagrad

Keras implementation of the SC-Adagrad, SC-RMSProp and RMSProp algorithms proposed here

A short version, accepted at ICML 2017, can be found here

I wrote a blog/tutorial here describing Adagrad, RMSProp, Adam, SC-Adagrad and SC-RMSProp in simple terms, so that it is easy to grasp the gist of each algorithm.

Usage

Suppose you have created a deep network using Keras and now want to train it with the above algorithms. Copy the file "new_optimizers.py" into your repository. Then, in the file where the model is created (and compiled), add the following:

from new_optimizers import *

# Say, for example, you want to use SC-Adagrad;
# create the optimizer object as follows.

sc_adagrad = SC_Adagrad()

# Similarly for SC-RMSProp and RMSProp (Ours):

sc_rmsprop = SC_RMSProp()
rmsprop_variant = RMSProp_variant()

Then, in the code where you compile your Keras model, set optimizer=sc_adagrad. You can do the same with the SC-RMSProp and RMSProp objects.
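For example, a minimal sketch of compiling a model with SC-Adagrad (the small dense classifier below is purely illustrative; substitute your own architecture, loss and data):

from keras.models import Sequential
from keras.layers import Dense

from new_optimizers import SC_Adagrad

# Hypothetical model used only for illustration; replace with your own network.
model = Sequential([
    Dense(128, activation='relu', input_shape=(784,)),
    Dense(10, activation='softmax'),
])

# Pass the SC-Adagrad optimizer object when compiling the model.
sc_adagrad = SC_Adagrad()
model.compile(optimizer=sc_adagrad,
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Training then proceeds as usual, e.g.:
# model.fit(x_train, y_train, epochs=10, batch_size=128)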

Overview of Algorithms