# Neural Network Optimization Methods Comparison

This repository contains a Jupyter notebook that implements and compares several optimization methods for training a three-layer neural network. The notebook covers three popular optimization algorithms: Gradient Descent, Momentum, and Adam. The network is trained on a non-linearly separable dataset generated with `sklearn.datasets.make_moons()`.
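The three update rules being compared follow their standard formulations. The minimal NumPy sketch below is illustrative only; the function names and default hyperparameters are assumptions, not necessarily those used in the notebook:

```python
import numpy as np

def gd_update(param, grad, lr=0.01):
    # Plain gradient descent: step opposite the gradient.
    return param - lr * grad

def momentum_update(param, grad, velocity, lr=0.01, beta=0.9):
    # Momentum: accumulate an exponentially weighted average of past gradients.
    velocity = beta * velocity + (1 - beta) * grad
    return param - lr * velocity, velocity

def adam_update(param, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: combine a momentum term (m) with a per-parameter adaptive scale (v).
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)  # bias correction for the first moment
    v_hat = v / (1 - beta2 ** t)  # bias correction for the second moment
    return param - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```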

## Features

- Implementation of a three-layer neural network using NumPy (a minimal sketch follows this list).
- Comparison of the Gradient Descent, Momentum, and Adam optimization methods.
- Visualization of the cost function and training progress over epochs.
- Evaluation of the trained model's accuracy on the training data.
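As a rough illustration of the first feature, the sketch below generates the moons dataset and runs a forward pass through a small three-layer network. The layer sizes, activations, and helper names are assumptions made for illustration, not the notebook's exact architecture:

```python
import numpy as np
from sklearn.datasets import make_moons

# Non-linearly separable dataset (parameters here are illustrative).
X, y = make_moons(n_samples=300, noise=0.2, random_state=3)

def relu(z):
    return np.maximum(0, z)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def init_params(layer_dims, seed=3):
    # Small random weights and zero biases, one (W, b) pair per layer.
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        params[f"W{l}"] = rng.standard_normal((layer_dims[l], layer_dims[l - 1])) * 0.01
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params

def forward(X, params):
    # Three-layer network: ReLU -> ReLU -> sigmoid output.
    A1 = relu(params["W1"] @ X + params["b1"])
    A2 = relu(params["W2"] @ A1 + params["b2"])
    A3 = sigmoid(params["W3"] @ A2 + params["b3"])
    return A3

params = init_params([2, 5, 3, 1])
probs = forward(X.T, params)               # columns are examples
predictions = (probs > 0.5).astype(int)    # binary class predictions
```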

## Contents

- `neural_network_optimization.ipynb`: Jupyter notebook with the implementation and comparison of the optimization methods.
- `README.md`: Detailed information about the notebook and its contents.

## How to Use

1. Clone the repository to your local machine using `git clone <repository-url>`.
2. Open `neural_network_optimization.ipynb` in Jupyter Notebook or JupyterLab.
3. Run the cells in the notebook to see the implementation of the neural network and the comparison of the optimization methods.
4. Observe the cost function's progress and the model's accuracy on the training data (helper sketches for this step follow the list).
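For step 4, helpers along these lines can be used to compare the recorded cost histories and compute training accuracy. The function names are illustrative and not taken from the notebook:

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_costs(cost_histories):
    # cost_histories: dict mapping an optimizer name to its per-epoch cost list.
    for name, history in cost_histories.items():
        plt.plot(history, label=name)
    plt.xlabel("Epoch")
    plt.ylabel("Cost")
    plt.legend()
    plt.show()

def accuracy(predictions, labels):
    # Fraction of training examples classified correctly.
    return float(np.mean(predictions == labels))
```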

## Dependencies

- Python 3.x
- NumPy
- Matplotlib
- scikit-learn (for generating the dataset)

## Author

Ali Jabbari

Feel free to use and modify this notebook to explore and experiment with different optimization methods for neural networks. If you have any questions or suggestions, please reach out or contribute to the repository.

Happy learning and coding!