
Batch Inverse-Variance Weighting: Deep Heteroscedastic Regression

ICML 2021 Workshop: Uncertainty in Deep Learning.

Paper available here: https://arxiv.org/abs/2107.04497

Introduction

The performance of deep supervised learning methods is impacted when the training dataset, on which the parameters are optimized, and the testing dataset, which evaluates the performance of the model on the task, are not sampled from identical distributions. In heteroscedastic regression, the label for each training sample is corrupted by noise coming from a different distribution. In some cases, an estimate of the variance of the noise is available for each label, quantifying how much that label contributes to the misalignment between the datasets. We propose an approach that includes this privileged information in the loss function, together with dataset statistics inferred from the mini-batch, to mitigate the impact of the dataset misalignment. We adapt the idea of the Fisher-information weighted average to function approximation and propose Batch Inverse-Variance (BIV) weighting. We show the validity of this approach: it significantly improves the performance of the network when confronted with high, input-independent noise.
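To make the weighting concrete, here is a minimal sketch of a BIV-style loss in PyTorch. It illustrates the idea only; the function name, signature, and the additive use of epsilon are assumptions for illustration, not this repository's implementation:

    import torch

    # Sketch of a Batch Inverse-Variance (BIV) style loss (illustrative only).
    def biv_loss(y_pred, y_true, noise_var, epsilon=0.5):
        # Inverse-variance weight for each sample; epsilon keeps labels with
        # near-zero noise variance from dominating the mini-batch.
        weights = 1.0 / (noise_var + epsilon)
        # Normalize the weights using statistics of the current mini-batch.
        weights = weights / weights.sum()
        # Weighted sum of squared label errors.
        return (weights * (y_pred - y_true) ** 2).sum()

    # Example: the noisiest label (variance 4.0) gets the smallest weight.
    y_true = torch.tensor([1.0, 2.0, 3.0])
    y_pred = torch.tensor([1.1, 1.8, 2.5])
    noise_var = torch.tensor([0.01, 0.5, 4.0])
    print(biv_loss(y_pred, y_true, noise_var))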

Prerequisites

To run the code, we packaged all the required libraries inside a Singularity container, which you can download here. To build your environment manually using Anaconda, we provide the yml file here.
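If you build the environment manually, the usual Anaconda workflow applies; the filename below is a placeholder for the yml file we provide, and the environment name is the one defined inside that file:

    conda env create -f environment.yml
    conda activate <environment-name>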

Run the Code

To run the code locally:

python main.py --experiment_settings="exp_tag,7159,utkf,True,16000" --model_settings="vanilla_cnn,cutoffMSE" \
    --noise_settings="True,binary_uniform" --params_settings="meanvar_avg,0.5,3000" --parameters="True,0.5,1,0.3,0"

To run the code locally inside singularity container:

singularity exec --nv -H $HOME:/home/ -B ./your_dataset_directory:/datasets/ -B ./your_outputs_directory:/final_outps/ ./your_environments_directory/pytorch_f.simg python /path/to/main.py \
    --experiment_settings="exp_tag,7159,utkf,True,16000" --model_settings="vanilla_cnn,cutoffMSE" \
    --noise_settings="True,binary_uniform" --params_settings="meanvar_avg,0.5,3000" --parameters="True,0.5,1,0.3,0"

To run the code on a cluster that supports the Slurm workload manager, use this starter script:

#!/bin/bash
#SBATCH -o /path/to/logs/noise_%j.out   # Change this!
#SBATCH --cpus-per-task=4
#SBATCH --gres=gpu:1
#SBATCH --mem=32Gb

# 1. Load CUDA (not needed if it is enabled by default).
module load cuda/10.0
# 2. Load Singularity (not needed if it is enabled by default).
module load singularity
# 3. Copy the container to the local disk of the compute node.
rsync -avz /path/to/pytorch_f.simg $SLURM_TMPDIR     # Change this!
# 4. Copy your dataset to the compute node.
rsync -avz /path/to/your_dataset/ $SLURM_TMPDIR        # Change this!
# 5. Execute your code with Singularity.
singularity exec --nv -H $HOME:/home/ -B $SLURM_TMPDIR:/datasets/ -B $SLURM_TMPDIR:/final_outps/ $SLURM_TMPDIR/pytorch_f.simg python /path/to/main.py --experiment_settings=$1 --model_settings=$2 --noise_settings=$3 --params_settings=${4-"None"} --parameters=${5-"None"}
# 6. Move the results back to the login node.
rsync -avz $SLURM_TMPDIR --exclude="your_dataset" --exclude="pytorch_f.simg" /path/to/outputs  # Change this!

# Note: $SLURM_TMPDIR is the local scratch directory of the compute node.

Then run the script with sbatch:

sbatch --gres=gpu:rtx8000:1 ./path/to/main.sh  "exp_tag,7159,utkf,True,16000" "vanilla_cnn,cutoffMSE" "True,binary_uniform" "meanvar_avg,0.5,3000" "True,0.5,1,0.3,0"

Examples

  • To run a vanilla CNN with dataset normalization, where the loss function is MSE:

    python main.py --experiment_settings="exp_tag,7159,utkf,True,16000" --model_settings="vanilla_cnn,mse,0.5" --noise_settings="False"
  • To run ResNet-18 with BIV loss (epsilon=0.5), where the noise variance comes from a single uniform distribution:

    python main.py --experiment_settings="exp_tag,7159,utkf,True,16000" --model_settings="resnet,biv,0.5" --noise_settings="True,uniform" \
    --params_settings="boundaries" --parameters="0,1"
  • To run ResNet-18 with BIV loss (epsilon=0.5), where the noise variance comes from a single uniform distribution whose variance equals the maximum heteroscedasticity:

    python main.py --experiment_settings="exp_tag,7159,utkf,True,16000" --model_settings="resnet,biv,0.5" --noise_settings="True,uniform" \
    --params_settings="meanvar" --parameters="True,0.5,0.083"
  • To run ResNet-18 with BIV loss (epsilon=0.5), where the noise variance comes from a bimodal (uniform) distribution in which both distributions contribute with equal weight (0.5):

    python main.py --experiment_settings="exp_tag,7159,utkf,True,16000" --model_settings="resnet,biv,0.5" \
    --noise_settings="True,binary_uniform" --params_settings="boundaries,0.5" --parameters="0,1,1,4"
  • To run ResNet-18 with MSE loss, where the noise variance comes from a bimodal (uniform) distribution specified by the mean and variance of each mode:

    python main.py --experiment_settings="exp_tag,7159,utkf,True,16000" --model_settings="resnet,mse" --noise_settings="True,binary_uniform" --params_settings="meanvar,0.5" --parameters="False,0.5,1,0.083,0"
  • To run ResNet-18 with BIV loss (epsilon=0.5), where the noise variance comes from a bimodal (uniform) distribution whose average mean is 2000:

    python main.py --experiment_settings="exp_tag,7159,utkf,True,16000" --model_settings="resnet,biv,0.5" \
    --noise_settings="True,binary_uniform" --params_settings="meanvar_avg,0.5,2000" --parameters="False,0.5,1,0.083,0"
  • To run ResNet-18 with cutoff MSE loss, where the noise variance comes from a bimodal (uniform) distribution and the noise threshold is 1 (a sketch of the cutoff idea follows this list):

    python main.py --experiment_settings="exp_tag,7159,utkf,True,16000" --model_settings="resnet,cutoffMSE,1" --noise_settings="True,binary_uniform" --params_settings="meanvar_avg,0.5,2000" --parameters="False,0.5,1,0.08,0"
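For intuition, a cutoff-style MSE discards the samples whose noise variance exceeds the threshold and averages the squared error over the rest. The sketch below is an illustrative assumption of how such a loss can be written in PyTorch, not this repository's exact implementation:

    import torch

    # Hypothetical sketch of an MSE-with-cutoff loss: samples whose label-noise
    # variance exceeds the threshold are excluded from the mini-batch average.
    def cutoff_mse(y_pred, y_true, noise_var, threshold=1.0):
        # Boolean mask: keep only samples whose noise variance is below the cutoff.
        keep = noise_var <= threshold
        if not keep.any():
            # Degenerate batch: every label is too noisy; contribute zero loss.
            return (y_pred * 0.0).sum()
        return ((y_pred[keep] - y_true[keep]) ** 2).mean()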

Command-line Arguments

1] Flow Chart

2] Table:

Group: experiment_settings
  • Tag — Experiment wandb tag. (click here for more details)
    Value: any. Data type: string.
  • Seed — Experiment seed.
    Value: any. Data type: int, float.
  • Dataset — The available datasets:
    1- UTKFace (click here for more details)
    2- Wine Quality (click here for more details)
    Value: 1- utkf, 2- wine. Data type: string.
  • Normalization — Enable dataset normalization.
    Value: True or False. Data type: boolean.
  • Dataset Training Size — Size of the training dataset.
    Value: between 0 and (dataset size - test set size). Data type: int.

Group: model_settings
  • Model Type — The available models:
    1- Vanilla ANN (click here for more details)
    2- Vanilla CNN (click here for more details)
    3- ResNet-18 (click here for more details)
    Value: 1- vanilla_ann, 2- vanilla_cnn, 3- resnet. Data type: string.
  • Loss Type — The available loss functions:
    1- Mean squared error (MSE)
    2- MSE with cutoff (threshold)
    3- Inverse variance (IV)
    4- Batch inverse variance (BIV)
    Value: 1- mse, 2- cutoffMSE, 3- iv, 4- biv. Data type: string.
  • Epsilon or Threshold Value — Interpreted according to the loss type:
    1- Epsilon: a parameter that prevents the BIV loss from taking excessively high values.
    2- Threshold Value: the cutoff (noise threshold) of the cutoffMSE loss.
    Value: [0, +inf). Data type: float.

Group: noise_settings
  • Noise — Enable noise addition to the labels.
    Value: True or False. Data type: boolean.
  • Noise Type — The available noise variance distributions:
    1- Uniform distribution
    2- Binary uniform distribution
    3- Gamma distribution
    Value: 1- uniform, 2- binary_uniform, 3- gamma. Data type: string.

Group: params_settings
  • Params Type — The current baseline supports the following settings for the noise variance distributions:
    1- Uniform boundaries: the boundaries of the uniform distribution are provided.
    2- Gamma's parameters: alpha and beta are provided.
    3- Mean and variance: the mean (mu) and variance (v) of the selected distribution are provided, and its parameters are estimated indirectly (meanvar, or meanvar_avg when the average mean is also given).
    Value: 1- boundaries, 2- alphabeta, 3- meanvar, 4- meanvar_avg. Data type: string.
  • Noise Distributions Ratio (p) — Probability function over the noise variance distributions, used to study the contribution of low and high noise variance distributions.
    Value: [0, 1]. Data type: float.
  • Average Mean Variance (X) — Average over the means of the (two) noise variance distributions:
    X = p * mu_1 + (1 - p) * mu_2
    where X is the average mean variance, p is the ratio above, and mu_1 and mu_2 are the means of the first and second distributions.
    Value: any. Data type: float.

Group: parameters
  • Parameters — Parameters of the noise variance distributions:
    1- uniform: (a, b)
    2- binary_uniform: (a_1, b_1, a_2, b_2)
    3- gamma: (alpha, beta)
    Or, for the mean-and-variance settings, mu and v of the noise variance distributions:
    1- uniform: (var_scale, mu, v)
    2- binary_uniform: (var_scale, mu_1, mu_2, v_1, v_2)
    3- gamma: (var_scale, mu, v)
    Note: when the "Params Type" is not boundaries, the first parameter in the list (var_scale) controls maximum heteroscedasticity; if var_scale is True, v is treated as the heteroscedasticity scale, and in this case 0 < v <= 1.
    Data type: list.
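
Each --*_settings flag packs its group's arguments into one comma-separated string, in the order listed above. As a rough illustration of how such a string decomposes, the helper below is hypothetical and not part of this repository:

    # Hypothetical helper illustrating how an --experiment_settings string maps
    # to the arguments in the table above; not part of this repository.
    def parse_experiment_settings(settings):
        tag, seed, dataset, normalize, train_size = settings.split(",")
        return {
            "tag": tag,                        # wandb tag
            "seed": int(seed),                 # experiment seed
            "dataset": dataset,                # "utkf" or "wine"
            "normalize": normalize == "True",  # dataset normalization
            "train_size": int(train_size),     # training set size
        }

    print(parse_experiment_settings("exp_tag,7159,utkf,True,16000"))
    # {'tag': 'exp_tag', 'seed': 7159, 'dataset': 'utkf', 'normalize': True, 'train_size': 16000}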