
Neural Network Optimization Methods Comparison

This repository contains a Jupyter notebook that implements and compares three popular optimization methods for training a three-layer neural network: Gradient Descent, Momentum, and Adam. The network is trained on a non-linearly separable dataset generated with the sklearn.datasets.make_moons() function.
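
For reference, the three parameter update rules being compared can be sketched in a few lines of NumPy. This is a minimal illustration with generic names and default-style hyperparameters, not the notebook's exact code:

```python
import numpy as np

def gd_update(param, grad, lr=0.01):
    """Plain gradient descent: step directly against the gradient."""
    return param - lr * grad

def momentum_update(param, grad, velocity, lr=0.01, beta=0.9):
    """Momentum (one common formulation): keep an exponentially weighted
    average of past gradients and step along that average."""
    velocity = beta * velocity + (1 - beta) * grad
    return param - lr * velocity, velocity

def adam_update(param, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: momentum on the gradient plus a per-parameter adaptive step size,
    with bias correction for the early iterations (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad          # first moment estimate
    v = beta2 * v + (1 - beta2) * grad**2       # second moment estimate
    m_hat = m / (1 - beta1**t)                  # bias-corrected first moment
    v_hat = v / (1 - beta2**t)                  # bias-corrected second moment
    return param - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```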

Features

  • Implementation of a three-layer neural network using NumPy (a minimal setup sketch follows this list).
  • Comparison of Gradient Descent, Momentum, and Adam optimization methods.
  • Visualization of the cost function and training progress over epochs.
  • Evaluation of the trained model's accuracy on the training data.
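
To give a feel for the setup, here is a minimal sketch of generating the two-moons dataset and running a forward pass through a three-layer NumPy network. The layer sizes, noise level, and activation choices below are illustrative assumptions, not necessarily the notebook's values:

```python
import numpy as np
from sklearn.datasets import make_moons

# Illustrative dataset: 300 two-moons points (settings are assumptions).
X, y = make_moons(n_samples=300, noise=0.2, random_state=3)
X, y = X.T, y.reshape(1, -1)          # columns are examples: X is (2, 300), y is (1, 300)

# Three-layer network: 2 -> 10 -> 5 -> 1 (hidden sizes chosen for illustration).
sizes = [2, 10, 5, 1]
rng = np.random.default_rng(0)
params = {}
for l in range(1, len(sizes)):
    params[f"W{l}"] = rng.standard_normal((sizes[l], sizes[l - 1])) * np.sqrt(2 / sizes[l - 1])
    params[f"b{l}"] = np.zeros((sizes[l], 1))

def relu(z):
    return np.maximum(0, z)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def forward(X, params):
    """Two ReLU hidden layers followed by a sigmoid output unit."""
    A1 = relu(params["W1"] @ X + params["b1"])
    A2 = relu(params["W2"] @ A1 + params["b2"])
    return sigmoid(params["W3"] @ A2 + params["b3"])

def cost(A3, y):
    """Binary cross-entropy averaged over the training examples."""
    m = y.shape[1]
    return -np.sum(y * np.log(A3 + 1e-12) + (1 - y) * np.log(1 - A3 + 1e-12)) / m

A3 = forward(X, params)
print("initial cost:", cost(A3, y))
print("initial accuracy:", np.mean((A3 > 0.5).astype(int) == y))
```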

Contents

  • neural_network_optimization.ipynb: Jupyter notebook with the implementation and comparison of optimization methods.
  • README.md: Detailed information about the notebook and its contents.

How to Use

  1. Clone the repository to your local machine using git clone <repository-url>.
  2. Open the Jupyter notebook neural_network_optimization.ipynb using Jupyter Notebook or Jupyter Lab.
  3. Run the cells in the notebook to see the implementation of the neural network and the comparison of optimization methods.
  4. Observe the cost function's progress over epochs and the model's accuracy on the training data (an illustrative plotting and evaluation sketch follows this list).
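
For step 4, the plotting and accuracy evaluation might be organized along these lines; the helper names and argument shapes here are assumptions for illustration, not the notebook's exact code:

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_costs(cost_histories):
    """Plot one cost-per-epoch curve per optimizer.
    cost_histories maps an optimizer name to a list of recorded costs."""
    for name, history in cost_histories.items():
        plt.plot(history, label=name)
    plt.xlabel("epoch")
    plt.ylabel("cost")
    plt.legend()
    plt.show()

def training_accuracy(probabilities, labels):
    """Threshold the sigmoid outputs at 0.5 and compare with the labels."""
    predictions = (probabilities > 0.5).astype(int)
    return np.mean(predictions == labels)
```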

Dependencies

  • Python 3.x
  • NumPy
  • Matplotlib
  • scikit-learn (for generating the dataset)

Author

Ali Jabbari

Feel free to use and modify this notebook to experiment with different optimization methods for neural networks. If you have questions or suggestions, please reach out or contribute to the repository.

Happy learning and coding!
