Fun Activations

Welcome to Fun Activations! This repository is dedicated to exploring the activation functions used in neural networks. Our goal is to understand when and why different activation functions are helpful, and in which scenarios each one works best.

Table of Contents

  • Introduction
  • Activation Functions
  • Usage
  • Installation
  • Examples
  • Comparative Analysis
  • Contributing
  • License

Introduction

Activation functions are a crucial component of neural networks: they introduce the non-linearity that lets a network learn complex mappings, and the choice of function shapes how the network trains and performs. In this repository, we will dive into a variety of activation functions, analyze their properties, and compare their performance on different tasks.

Activation Functions

We will cover a wide range of activation functions, including but not limited to:

  • Sigmoid
  • Tanh
  • ReLU (Rectified Linear Unit)
  • Leaky ReLU

Additions in progress:

  • PReLU (Parametric ReLU)
  • ELU (Exponential Linear Unit)
  • SELU (Scaled Exponential Linear Unit)
  • Swish
  • Mish
  • Softmax

Each activation function will have its own detailed explanation, implementation, and use cases; a few of them are sketched below for quick reference.
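
For reference, here is a minimal, self-contained NumPy sketch of a few of the functions listed above (Sigmoid, Tanh, ReLU, Leaky ReLU). It is an illustrative example, not the implementation shipped in this repository; see the per-function write-ups for those.

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1); handy for gates and binary outputs.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centered squashing into (-1, 1).
    return np.tanh(x)

def relu(x):
    # Passes positive values through unchanged and zeroes out negatives.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but keeps a small slope for negative inputs to avoid "dead" units.
    return np.where(x > 0, x, alpha * x)

x = np.linspace(-3, 3, 7)      # [-3, -2, -1, 0, 1, 2, 3]
print(relu(x))                 # [0. 0. 0. 0. 1. 2. 3.]
print(leaky_relu(x))           # small negative slope below zero
```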

Usage

You can use this repository to:

  • Learn about different activation functions and their properties.
  • Experiment with various activation functions in your own projects (see the short sketch after this list).
  • Compare the performance of different activation functions in different scenarios.
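
As a small illustration of the second point, the sketch below plugs two different activations into the hidden layer of a tiny NumPy forward pass. The layer sizes and random data are made up purely for demonstration.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def tanh(x):
    return np.tanh(x)

def forward(x, w1, b1, w2, b2, activation):
    # One hidden layer; the activation is a plain callable,
    # so swapping it is a one-argument change.
    hidden = activation(x @ w1 + b1)
    return hidden @ w2 + b2

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))                      # 4 samples, 3 features
w1, b1 = rng.normal(size=(3, 8)), np.zeros(8)    # hidden layer weights
w2, b2 = rng.normal(size=(8, 1)), np.zeros(1)    # output layer weights

for act in (relu, tanh):
    out = forward(x, w1, b1, w2, b2, act)
    print(act.__name__, out.ravel())
```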

Installation

To get started, clone this repository to your local machine:

git clone https://github.com/PujanMotiwala/fun_activations.git
cd fun_activations

Install the required dependencies:

pip install -r requirements.txt

Examples

We provide several example scripts demonstrating the use of different activation functions. You can find them in the examples directory.

Here’s how you can run an example:

python examples/example_script.py

Each example script includes a detailed explanation and results analysis.

Comparative Analysis

Check out our comparative analysis notebook to see how different activation functions perform on the same dataset. This notebook includes visualizations and performance metrics to help you understand the strengths and weaknesses of each activation function.
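
If you want a quick visual comparison before opening the notebook, a rough stand-in is to overlay the curves with matplotlib, as in the sketch below (the notebook itself goes further with dataset-level metrics).

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-4, 4, 400)
activations = {
    "sigmoid": 1.0 / (1.0 + np.exp(-x)),
    "tanh": np.tanh(x),
    "relu": np.maximum(0.0, x),
    "leaky_relu": np.where(x > 0, x, 0.01 * x),
}

# Overlay all curves on one axis to compare saturation, range, and linearity.
for name, y in activations.items():
    plt.plot(x, y, label=name)

plt.legend()
plt.title("Activation function shapes")
plt.xlabel("input")
plt.ylabel("output")
plt.show()
```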

Contributing

We welcome contributions from the community! If you have an idea for a new activation function or an improvement to an existing one, feel free to open an issue or submit a pull request.

1. Fork the repository.
2. Create a new branch (git checkout -b feature-branch).
3. Make your changes.
4. Commit your changes (git commit -am 'Add new feature').
5. Push to the branch (git push origin feature-branch).
6. Create a new Pull Request.

License

This project is licensed under the MIT License. See the LICENSE file for more details.