📊 Comet HPO (#563)
* added hpo

* lint fixed

* Update hyperparameter_optimization.rst

* fixed file lint

* fixed documentation images

* added sweep doc image

* updated hpo docs to include images

* fixed linting errors

* added config folder to store sample sweeps

* fixed docs for new location of config files

* not needed. moved to config directory

* not needed moved to config directory

* renamed to configs

* changed to "configs"

* fixed grammar
sherpan committed Sep 26, 2022
1 parent de1bea2 commit 353d981
Showing 6 changed files with 106 additions and 11 deletions.
41 changes: 35 additions & 6 deletions docs/source/guides/hyperparameter_optimization.rst
@@ -3,12 +3,34 @@
Hyperparameter Optimization
===========================

The default configuration for the models will not always work on a new dataset. Additionally, to increase performance, learning rate, optimizers, activation functions, etc. need to be tuned/selected. To make it easier to run such broad experiments that isolate the right combination of hyperparameters, Anomalib supports hyperparameter optimization using weights and biases.
The default configuration for the models will not always work on a new dataset. Additionally, to increase performance, hyperparameters such as the learning rate, optimizer, and activation function need to be tuned. To make it easier to run such broad experiments that isolate the right combination of hyperparameters, Anomalib supports hyperparameter optimization using Comet or Weights and Biases.

YAML file
**********

A sample configuration file for hyperparameter optimization is provided at ``tools/hpo/sweep.yaml`` and is reproduced below:
A sample configuration file for hyperparameter optimization with Comet is provided at ``tools/hpo/configs/comet_sweep.yaml`` and is reproduced below:

.. code-block:: yaml

    algorithm: "bayes"
    spec:
      maxCombo: 10
      metric: "image_F1Score"
      objective: "maximize"
    parameters:
      dataset:
        category: capsule
        image_size:
          type: discrete
          values: [128, 256]
      model:
        backbone:
          type: categorical
          values: ["resnet18", "wide_resnet50_2"]

``maxCombo`` defines the total number of experiments to run, ``algorithm`` selects the optimization method, and ``metric`` is the metric used to evaluate the performance of the model. The ``parameters`` section lists the hyperparameters to be optimized. For details on other possible configurations of Comet's Optimizer, refer to the `Comet Optimizer <https://www.comet.com/docs/v2/api-and-sdk/python-sdk/introduction-optimizer/>`_ documentation.
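
For a rough picture of what happens under the hood, the minimal sketch below shows how a configuration like this can drive Comet's ``Optimizer`` directly. It mirrors the flow in ``tools/hpo/sweep.py``; the project name, the metric value, and the omitted training step are placeholders, and it assumes ``comet_ml`` is installed and Comet credentials are configured.

.. code-block:: python

    from comet_ml import Optimizer

    # Sweep specification equivalent to comet_sweep.yaml, with the parameter
    # keys flattened to dotted paths, as the sweep script does internally.
    sweep_config = {
        "algorithm": "bayes",
        "spec": {"maxCombo": 10, "metric": "image_F1Score", "objective": "maximize"},
        "parameters": {
            "model.backbone": {"type": "categorical", "values": ["resnet18", "wide_resnet50_2"]},
        },
    }

    opt = Optimizer(sweep_config)

    # Each experiment returned by the optimizer carries one suggested parameter set.
    for experiment in opt.get_experiments(project_name="padim_mvtec"):
        backbone = experiment.get_parameter("model.backbone")
        # ... train a model with this backbone and compute its image-level F1 score ...
        experiment.log_metric("image_F1Score", 0.95)  # placeholder value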

A sample configuration file for hyperparameter optimization with Weights and Biases is provided at ``tools/hpo/configs/wandb_sweep.yaml`` and is reproduced below:

.. code-block:: yaml
@@ -26,33 +48,40 @@ A sample configuration file for hyperparameter optimization is provided at ``too
        backbone:
          values: [resnet18, wide_resnet50_2]

The observation budget defines the total number of experiments to run. The method is the optimization method to be used. The metric is the metric to be used to evaluate the performance of the model. The parameters are the hyperparameters to be optimized. For details on methods other than ``bayes`` and parameter values apart from list, refer the `Weights and Biases <https://docs.wandb.ai/guides/sweeps/quickstart>`_ documentation. Everything under the ``parameters`` key overrides the default values defined in the model configuration. Currently, only the dataset and model parameters are overridden for the HPO search.
The observation budget defines the total number of experiments to run, the method is the optimization method to be used, and the metric is the metric used to evaluate the performance of the model. The parameters are the hyperparameters to be optimized. For details on methods other than ``bayes`` and parameter values other than lists, refer to the `Weights and Biases <https://docs.wandb.ai/guides/sweeps/quickstart>`_ documentation.

Everything under the ``parameters`` key (in both configuration formats) overrides the default values defined in the model configuration. In these examples, only the dataset and model parameters are overridden for the HPO search.
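
As a rough illustration of that override, here is a minimal, self-contained sketch of how a dotted sweep parameter such as ``model.backbone`` can be written back into the nested model configuration. The ``set_in_nested_config`` function below is a simplified stand-in for the helper used by ``tools/hpo/sweep.py``, and the config values are illustrative only.

.. code-block:: python

    from omegaconf import OmegaConf

    # Nested model configuration holding the defaults that the sweep overrides.
    config = OmegaConf.create(
        {"dataset": {"category": "bottle", "image_size": 256}, "model": {"backbone": "resnet18"}}
    )

    # One trial suggested by the optimizer, keyed by dotted parameter names.
    trial_params = {"dataset.image_size": 128, "model.backbone": "wide_resnet50_2"}

    def set_in_nested_config(cfg, keys, value):
        """Walk the key path and overwrite the leaf value (simplified stand-in)."""
        for key in keys[:-1]:
            cfg = cfg[key]
        cfg[keys[-1]] = value

    for dotted_key, value in trial_params.items():
        set_in_nested_config(config, dotted_key.split("."), value)

    print(OmegaConf.to_yaml(config))  # image_size and backbone now reflect the trial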

Running HPO
************

.. note::

You will need to have logged into a wandb account to use HPO search and view the results.
You will need to have logged into a Comet or wandb account to use HPO search and view the results.

To run the hyperparameter optimization, use the following command:

.. code-block:: bash

    python tools/hpo/sweep.py --model padim \
        --model_config ./path_to_config.yaml \
        --sweep_config tools/hpo/sweep.yaml
        --sweep_config tools/hpo/configs/comet_sweep.yaml

If ``model_config`` is not provided, the script uses the default config location for that model.

.. code-block:: bash

    python tools/hpo/sweep.py --sweep_config tools/hpo/sweep.yaml
    python tools/hpo/sweep.py --sweep_config tools/hpo/configs/comet_sweep.yaml
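
The same entry point also launches Weights and Biases sweeps; point ``--sweep_config`` at the wandb configuration instead:

.. code-block:: bash

    python tools/hpo/sweep.py --sweep_config tools/hpo/configs/wandb_sweep.yaml
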
Sample Output
**************

.. figure:: ../images/logging/comet_sweep.png
   :alt: Sample configuration of a Comet sweep

   Sample Comet sweep on Padim


.. figure:: ../images/logging/wandb_sweep.png
   :alt: Sample configuration of a wandb sweep
4 changes: 2 additions & 2 deletions docs/source/guides/logging.rst
@@ -51,7 +51,7 @@ Anomalib allows you to save predictions to the file system by setting ``log_imag

Logging images to Comet, TensorBoard and wandb won't work if you don't have ``logger: [comet, tensorboard, wandb]`` set as well. This ensures that the respective logger is passed to the trainer object.

.. figure:: ../images/logging/comet_media.jpg
.. figure:: ../images/logging/comet_media.png
   :alt: comet dashboard showing logged images

   Comet Images in TensorBoard Dashboard
@@ -84,7 +84,7 @@ Anomalib makes it easier to log your model graph to Comet, TensorBoard or Weight
logger: [comet, tensorboard]
log_graph: true
.. figure:: ../images/logging/comet_graph.jpg
.. figure:: ../images/logging/comet_graph.png
   :alt: comet dashboard showing model graph

   Model Graph in Comet Dashboard
Binary file added docs/source/images/logging/comet_sweep.png
15 changes: 15 additions & 0 deletions tools/hpo/configs/comet_sweep.yaml
@@ -0,0 +1,15 @@
algorithm: "bayes"
spec:
  maxCombo: 10
  metric: "image_F1Score"
  objective: "maximize"
parameters:
  dataset:
    category: capsule
    image_size:
      type: discrete
      values: [128, 256]
  model:
    backbone:
      type: categorical
      values: ["resnet18", "wide_resnet50_2"]
File renamed without changes.
57 changes: 54 additions & 3 deletions tools/hpo/sweep.py
@@ -1,4 +1,4 @@
"""Run wandb sweep."""
"""Run hpo sweep."""

# Copyright (C) 2022 Intel Corporation
# SPDX-License-Identifier: Apache-2.0
@@ -8,9 +8,10 @@
from typing import Union

import pytorch_lightning as pl
from comet_ml import Optimizer
from omegaconf import DictConfig, ListConfig, OmegaConf
from pytorch_lightning import seed_everything
from pytorch_lightning.loggers import WandbLogger
from pytorch_lightning.loggers import CometLogger, WandbLogger
from utils import flatten_hpo_params

import wandb
@@ -71,6 +72,51 @@ def sweep(self):
        trainer.fit(model, datamodule=datamodule)


class CometSweep:
    """Comet sweep.

    Args:
        config (DictConfig): Original model configuration.
        sweep_config (DictConfig): Sweep configuration.
    """

    def __init__(self, config: Union[DictConfig, ListConfig], sweep_config: Union[DictConfig, ListConfig]) -> None:
        self.config = config
        self.sweep_config = sweep_config

    def run(self):
        """Run the sweep."""
        flattened_hpo_params = flatten_hpo_params(self.sweep_config.parameters)
        self.sweep_config.parameters = flattened_hpo_params

        # Comet's Optimizer takes a plain dict as input, not a DictConfig
        std_dict = OmegaConf.to_object(self.sweep_config)

        opt = Optimizer(std_dict)

        project_name = f"{self.config.model.name}_{self.config.dataset.name}"

        for exp in opt.get_experiments(project_name=project_name):
            comet_logger = CometLogger()

            # allow pytorch-lightning to use the experiment from the optimizer
            comet_logger._experiment = exp  # pylint: disable=W0212
            run_params = exp.params
            for param in run_params.keys():
                set_in_nested_config(self.config, param.split("."), run_params[param])
            config = update_input_size_config(self.config)

            model = get_model(config)
            datamodule = get_datamodule(config)
            callbacks = get_sweep_callbacks(config)

            # Disable saving checkpoints, as all checkpoints from the sweep would get uploaded
            config.trainer.checkpoint_callback = False

            trainer = pl.Trainer(**config.trainer, logger=comet_logger, callbacks=callbacks)
            trainer.fit(model, datamodule=datamodule)


def get_args():
"""Gets parameters from commandline."""
parser = ArgumentParser()
@@ -89,5 +135,10 @@ def get_args():
    if model_config.project.seed != 0:
        seed_everything(model_config.project.seed)

    sweep = WandbSweep(model_config, hpo_config)
    # check hpo config structure to see whether it adheres to comet or wandb format
    sweep: Union[CometSweep, WandbSweep]
    if "spec" in hpo_config.keys():
        sweep = CometSweep(model_config, hpo_config)
    else:
        sweep = WandbSweep(model_config, hpo_config)
    sweep.run()
