🥼Clothes Classification, Artificial Intelligence course, University of Tehran
Updated Sep 5, 2022 - HTML
This repository explores the importance of MLP components using the MNIST dataset. Techniques such as Dropout, Batch Normalization, and different optimization algorithms are tested to improve MLP performance, giving a deeper understanding of MLP components and how to fine-tune them for optimal classification performance on MNIST.
Implementation of the myID3 and myC45 modules (Tubes1B), the myMLP module with mini-batch gradient descent (Tubes1C), and a 10-fold cross-validation scheme (Tubes1D)
This repository contains my solutions and implementations for assignments from the Machine Learning course.
Linear Regression with TensorFlow 2 (using Mini-Batch Gradient Descent)
Fully connected neural network using mini-batch gradient descent and softmax for classification on the MNIST dataset
Just exploring Deep Learning
🏡💲 Stochastic, full-batch, and mini-batch gradient descent for ridge regression on the California Housing dataset
Gradient descent is a technique for fitting machine learning models with differentiable loss functions: it iteratively computes the first-order derivative (gradient) of the loss with respect to the parameters and adjusts the parameters in the direction that reduces the loss (a minimal NumPy sketch appears after this listing).
3-layer linear neural network to classify the MNIST dataset using TensorFlow
Python implementation of a multilayer perceptron neural network trained with mini-batch gradient descent
Regression models on the Boston Housing dataset
Implementation of a Linear Regression class with experiments on Batch, Mini-Batch, and Stochastic Gradient Descent
A simplified explanation of gradient descent for linear regression in Python using NumPy
Custom multilayer perceptron (MLP)
CUDA implementation of the best model in the Robust Mini-batch Gradient Descent repo
Machine learning course project on mini-batch gradient descent
Robust Mini-batch Gradient Descent models
A fully connected neural network implemented in Python using NumPy, with an option to save a run as a JSON file. The network uses mini-batch gradient descent
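The repositories above all revolve around mini-batch gradient descent; the following is a minimal NumPy sketch of the idea applied to linear regression with a mean-squared-error loss. The synthetic data, learning rate, batch size, and epoch count are illustrative assumptions and are not taken from any of the listed repositories.

```python
import numpy as np

# Minimal sketch of mini-batch gradient descent for linear regression.
# The synthetic data and hyperparameters below are assumptions for demonstration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))                    # 1000 samples, 3 features
true_w, true_b = np.array([2.0, -1.0, 0.5]), 0.3
y = X @ true_w + true_b + 0.1 * rng.normal(size=1000)

w, b = np.zeros(3), 0.0
lr, batch_size, epochs = 0.05, 32, 20

for epoch in range(epochs):
    order = rng.permutation(len(X))               # reshuffle the data every epoch
    for start in range(0, len(X), batch_size):
        batch = order[start:start + batch_size]
        Xb, yb = X[batch], y[batch]
        err = Xb @ w + b - yb                     # residuals on this mini-batch
        grad_w = 2.0 * Xb.T @ err / len(batch)    # gradient of the mean squared error
        grad_b = 2.0 * err.mean()
        w -= lr * grad_w                          # step opposite the gradient
        b -= lr * grad_b

print("learned weights:", w, "learned bias:", b)
```

Each epoch shuffles the data once and then sweeps it in small batches, so every parameter update uses only a slice of the data; this trades a noisier gradient estimate for many more updates per pass than full-batch gradient descent.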