hyperparam_methods

Learning how to use various Python hyperparameter optimization platforms for machine learning models. We implement:

  1. Grid Search
  2. Random Search
  3. Bayesian Optimization
  4. Hyperopt
  5. Optuna
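
As a rough illustration of the first two methods, here is a minimal scikit-learn sketch; the model, parameter ranges, and synthetic data are placeholders rather than the exact setup used in this repo.

```python
# Minimal sketch of grid search vs. random search with scikit-learn.
# The model, parameter ranges, and synthetic data are illustrative only.
from scipy.stats import randint
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Grid search: exhaustively evaluates every combination in the grid.
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [100, 200, 400], "max_depth": [4, 8, 16]},
    cv=5,
)
grid.fit(X, y)

# Random search: samples a fixed budget of candidates from distributions.
rand = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions={"n_estimators": randint(100, 500), "max_depth": randint(2, 20)},
    n_iter=20,
    cv=5,
    random_state=0,
)
rand.fit(X, y)

print(grid.best_params_, grid.best_score_)
print(rand.best_params_, rand.best_score_)
```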

Bayesian Optimization, Hyperopt, and Optuna all give similar performance, and any of them should be preferred over a simple grid search. In general, experiment with the model first to figure out the range each parameter should cover before launching a large number of trials. This speeds up the process, since the search space should be kept as tight as possible.
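
For example, a minimal Optuna sketch with a deliberately tight search space could look like the following; the model, data, and parameter bounds are assumptions for illustration, not this repo's exact configuration.

```python
# Minimal Optuna sketch; note the narrow parameter ranges, chosen as if
# after a few exploratory runs. Model, data, and bounds are illustrative only.
import optuna
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=10, random_state=0)


def objective(trial):
    params = {
        "n_estimators": trial.suggest_int("n_estimators", 100, 300),
        "max_depth": trial.suggest_int("max_depth", 4, 12),
    }
    model = RandomForestClassifier(**params, random_state=0)
    return cross_val_score(model, X, y, cv=3).mean()


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=25)
print(study.best_params, study.best_value)
```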

Download the data from the following location: https://www.kaggle.com/datasets/iabhishekofficial/mobile-price-classification
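
Once downloaded, the training file can be loaded and split for the searches above; the `train.csv` file name and the `price_range` target column follow the Kaggle dataset's description, and the path itself is a placeholder.

```python
# Sketch of loading the Kaggle mobile price data for the searches above.
# "train.csv" and the "price_range" target column follow the dataset's
# description; adjust the path to wherever the file was downloaded.
import pandas as pd
from sklearn.model_selection import train_test_split

df = pd.read_csv("train.csv")
X = df.drop(columns=["price_range"])
y = df["price_range"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)
```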
