A-mini-batch-Stochastic-Gradient-Descent-implementation

A Python implementation of the mini-batch SGD algorithm for linear regression. Performance is compared with scikit-learn's SGDRegressor.
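The repository's source is not shown here, so the following is only a minimal sketch of the technique the description names: mini-batch stochastic gradient descent on the mean-squared-error loss of a linear model. All names (`minibatch_sgd`, the learning rate and batch-size defaults, the synthetic data) are illustrative assumptions, not the author's code.

```python
import numpy as np

def minibatch_sgd(X, y, lr=0.1, batch_size=32, epochs=200, seed=0):
    """Fit linear-regression weights and bias by mini-batch SGD on MSE loss.

    Illustrative sketch, not the repository's implementation.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        # Shuffle once per epoch, then walk through the data in mini-batches
        perm = rng.permutation(n)
        for start in range(0, n, batch_size):
            idx = perm[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            err = Xb @ w + b - yb
            # Gradients of the mean squared error over this mini-batch
            grad_w = Xb.T @ err / len(idx)
            grad_b = err.mean()
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b

# Synthetic check: data generated as y = 3*x + 2 + noise
rng = np.random.default_rng(42)
X = rng.uniform(-1, 1, size=(500, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(scale=0.1, size=500)
w, b = minibatch_sgd(X, y)
print(w, b)
```

A comparison along the lines the README describes could then fit `sklearn.linear_model.SGDRegressor` on the same `X`, `y` and compare the recovered coefficients or test-set error.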
