CIFAR_10_Image_Classification



📍 Overview

This project was developed for the Computational Intelligence course in spring 2023. It includes code for training neural networks with gradient descent, training neural networks with neuroevolution, Neural Architecture Search (NAS), and Self-Organizing Maps (SOM). The project's core functionality is image classification on the CIFAR-10 dataset; its purpose is to explore and compare different techniques for improving the accuracy and efficiency of image classification models.


📂 Project Structure


🔎 Code Details

Gradient descent is an optimization algorithm commonly used for training neural networks. It iteratively adjusts the parameters of the neural network (such as weights and biases) to minimize a defined loss function. By computing gradients and updating the parameters in the direction of steepest descent, gradient descent helps the neural network gradually improve its performance over time.
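The idea can be illustrated with a minimal sketch (not the project's actual training code): fit the weights of a linear model by repeatedly stepping against the gradient of a mean-squared-error loss. The data, learning rate, and iteration count here are arbitrary illustrative choices.

```python
import numpy as np

# Toy gradient-descent loop: learn w so that X @ w matches y.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w

w = np.zeros(3)      # parameters to optimize
lr = 0.1             # learning rate (step size)
for _ in range(200):
    grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of the MSE loss w.r.t. w
    w -= lr * grad                         # step in the direction of steepest descent
```

After enough iterations, `w` converges to `true_w`; training a real neural network follows the same loop, with gradients computed by backpropagation.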

Neuroevolution combines neural networks with evolutionary algorithms to optimize parameters such as weights and biases. A population of neural networks is evolved through mutation, crossover, and selection, much as in genetic algorithms, to improve performance on the image classification task.
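A minimal sketch of the idea (illustrative only, not the project's implementation): evolve the weight vector of a single linear "neuron" toward a target function using only Gaussian mutation and truncation selection, with no gradients. Population size and mutation scale are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
target_w = np.array([2.0, -1.0])
y = X @ target_w

def fitness(w):
    # Negative mean-squared error: higher is better.
    return -np.mean((X @ w - y) ** 2)

pop = rng.normal(size=(20, 2))                     # initial population of weight vectors
for _ in range(100):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-5:]]         # truncation selection: keep the 5 fittest
    children = np.repeat(parents, 4, axis=0)
    children += 0.1 * rng.normal(size=children.shape)  # Gaussian mutation
    pop = children

best = max(pop, key=fitness)
```

In the full setting, each individual encodes all weights of a network and fitness is classification accuracy, but the evolve-mutate-select loop is the same.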

Neural architecture search is a technique used to automatically discover the architecture of a neural network that performs well on a given task. It involves searching through a large space of possible network architectures with evolutionary algorithms to find the most suitable one.
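The search loop can be sketched as follows (illustrative only). Each candidate is a tuple of hidden-layer widths, and `evaluate` is a hypothetical stand-in for the real objective, which would be validation accuracy after training each candidate on CIFAR-10.

```python
import random

random.seed(0)
WIDTHS = [16, 32, 64, 128]   # allowed layer widths (assumed search space)

def evaluate(arch):
    # Hypothetical proxy score: reward capacity, penalize parameter count.
    return sum(arch) - 0.002 * sum(a * b for a, b in zip(arch, arch[1:]))

def mutate(arch):
    arch = list(arch)
    i = random.randrange(len(arch))
    arch[i] = random.choice(WIDTHS)    # resample one layer's width
    return tuple(arch)

population = [tuple(random.choice(WIDTHS) for _ in range(3)) for _ in range(10)]
for _ in range(30):
    population.sort(key=evaluate, reverse=True)
    survivors = population[:5]         # keep the best half
    population = survivors + [mutate(random.choice(survivors)) for _ in range(5)]

best = max(population, key=evaluate)
```

Replacing the proxy score with a train-and-validate step turns this sketch into a basic evolutionary NAS loop.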

A Self-Organizing Map (SOM) is an unsupervised learning algorithm used for clustering and visualization of high-dimensional data. It maps the input data onto a lower-dimensional grid while preserving the topological relationships between the data points.
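A minimal SOM sketch (illustrative, with arbitrary grid size and schedules): each sample pulls its best-matching unit (BMU) and that unit's grid neighbours toward it, so nearby grid units end up representing nearby regions of the data.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.uniform(size=(200, 2))     # input samples in [0, 1]^2
grid = rng.uniform(size=(4, 4, 2))    # 4x4 grid of 2-D weight vectors

for t in range(500):
    x = data[rng.integers(len(data))]
    # Best-matching unit: the grid cell with the closest weight vector.
    d = np.linalg.norm(grid - x, axis=2)
    bi, bj = np.unravel_index(np.argmin(d), d.shape)
    lr = 0.5 * (1 - t / 500)                 # decaying learning rate
    sigma = 2.0 * (1 - t / 500) + 0.5        # decaying neighbourhood radius
    for i in range(4):
        for j in range(4):
            # Neighbourhood kernel: units closer to the BMU on the grid move more.
            h = np.exp(-((i - bi) ** 2 + (j - bj) ** 2) / (2 * sigma ** 2))
            grid[i, j] += lr * h * (x - grid[i, j])
```

For CIFAR-10, the inputs would be (flattened or feature-extracted) images rather than 2-D points, but the BMU-and-neighbourhood update is identical.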


🚀 Getting Started

✔️ Requirements

Before you begin, ensure that you have the packages in requirements.txt installed.

📦 Installation

  1. Clone the CIFAR_10_Image_Classification repository:
git clone https://github.com/kianmajl/CIFAR_10_Image_Classification.git
  2. Change to the project directory:
cd CIFAR_10_Image_Classification
  3. Install the dependencies:
pip install -r ./Codes/requirements.txt

🎮 Using CIFAR_10_Image_Classification

Now you can train a neural network with gradient descent or neuroevolution, search for the best network architecture with NAS, and classify images with the method of your choice.


🤝 Collaborators

Kian Majlessi and Audrina Ebrahimi
