myID3 and myC45 modules implementation (Tubes1B), myMLP module implementation with mini-batch gradient descent (Tubes1C) and 10-fold cross validation scheme implementation (Tubes1D)
Updated Apr 8, 2020 - Jupyter Notebook
🥼Clothes Classification, Artificial Intelligence course, University of Tehran
This GitHub repository explores the importance of MLP components using the MNIST dataset. Techniques like Dropout, Batch Normalization, and optimization algorithms are experimented with to improve MLP performance. Gain a deeper understanding of MLP components and learn to fine-tune for optimal classification performance on MNIST.
🏡💲 Stochastic, full and mini-batch gradient descent for ridge regression using California Housing Dataset
Linear Regression with TensorFlow 2 (using Mini-Batch Gradient Descent)
Fully connected neural network using mini-batch gradient descent and softmax for classification on the MNIST dataset
Various methods for Deep Learning, SGD and Neural Networks.
This repository contains my solutions and implementations for assignments assigned during the Machine Learning course.
Gradient Descent is an iterative technique for fitting machine learning models with differentiable loss functions: it repeatedly computes the first-order derivative (gradient) of the loss with respect to the parameters and updates the parameters in the direction that reduces the loss.
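The update rule above can be sketched in a few lines of NumPy. This is a minimal, hypothetical illustration (function name and hyperparameters are not from any repository listed here) of the mini-batch variant applied to linear regression with a mean-squared-error loss:

```python
import numpy as np

def minibatch_gd(X, y, lr=0.1, epochs=200, batch_size=32, seed=0):
    """Mini-batch gradient descent for linear regression (MSE loss).

    Illustrative sketch: each epoch shuffles the data, then for every
    small batch computes the gradient of the loss and steps against it.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        idx = rng.permutation(n)          # shuffle once per epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            err = Xb @ w + b - yb         # residuals on this batch
            grad_w = 2 * Xb.T @ err / len(batch)  # dL/dw on the batch
            grad_b = 2 * err.mean()               # dL/db on the batch
            w -= lr * grad_w              # first-order parameter update
            b -= lr * grad_b
    return w, b
```

Setting `batch_size=1` recovers stochastic gradient descent, and `batch_size=n` recovers full-batch gradient descent, which is why the three variants often share one implementation.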
3-layer linear neural network to classify the MNIST dataset using TensorFlow
Logistic regression using JAX to support GPU acceleration
CUDA implementation of the best model in the Robust Mini-batch Gradient Descent repo
Machine learning major assignment: mini-batch gradient descent
🐚 Abalone Age Prediction: Dive into Data, Surf on Insights! 📊 Unleash the power of predictive analytics on abalone age estimation! From meticulous data exploration to a showdown of optimization methods, this repo is your gateway to accurate age predictions from physical measurements using PySpark. 🌊🔮
Robust Mini-batch Gradient Descent models
Regression models on Boston Houses dataset
Implement a Linear Regression class and experiment with Batch, Mini-Batch, and Stochastic Gradient Descent
Numerical Optimization for Machine Learning & Data Science
Implementing ML Algorithms using Python and comparing with Standard Library functions
Add a description, image, and links to the mini-batch-gradient-descent topic page so that developers can more easily learn about it.
To associate your repository with the mini-batch-gradient-descent topic, visit your repo's landing page and select "manage topics."