
# Neural Optimal Transport Autoencoders

## Installation

Create the environment and install the dependencies:

```bash
conda create -n not_ae python=3.8
curl -sSL https://install.python-poetry.org | python3 -
poetry config virtualenvs.create false

conda activate not_ae
poetry install
```
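As a quick sanity check that the environment resolved correctly, the core imports should succeed (a minimal sketch, assuming the usual PyTorch-based stack for this kind of project; substitute the packages from your lockfile):

```bash
# Assumes PyTorch is among the installed dependencies; adjust if your lockfile differs.
python -c "import torch; print(torch.__version__)"
```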

Make the bash scripts executable:

```bash
chmod -R +x scripts/*.sh
```

## Prepare

Precompute the FID statistics for the validation and test splits of each dataset:

```bash
python tools/compute_fid_stats.py CelebADataset stats/celeba_fid_stats_{val, test}.npz --split {val, test}
python tools/compute_fid_stats.py ArtBench10 stats/artbench_fid_stats_{val, test}.npz --split {val, test}
```
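Here `{val, test}` is a placeholder meaning "run once per split"; for CelebA, for example, the template expands to two commands:

```bash
python tools/compute_fid_stats.py CelebADataset stats/celeba_fid_stats_val.npz --split val
python tools/compute_fid_stats.py CelebADataset stats/celeba_fid_stats_test.npz --split test
```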

## Usage

Train a baseline autoencoder:

```bash
python train.py train configs/train_{celeba / artbench}_{l1 / l2}_ae.yml
```
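For instance, expanding the template for the L2 baseline on CelebA:

```bash
python train.py train configs/train_celeba_l2_ae.yml
```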

Train a NOT-AE:

```bash
python train.py train configs/train_{celeba / artbench}_{l1 / l2 / perceptual}_cost.yml
```
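For instance, the perceptual-cost NOT-AE on ArtBench expands to:

```bash
python train.py train configs/train_artbench_perceptual_cost.yml
```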

## Examples

| Method | Cost | test LPIPS ($\downarrow$) | test FID ($\downarrow$) |
| ------ | ---- | ------------------------- | ----------------------- |
| AE     | L2   | $0.23$                    | $71.8$                  |
| NOT-AE | L2   | $\mathbf{0.14}$           | $\mathbf{58.4}$         |

Vanilla Autoencoder with MSE loss:

*(sample reconstructions)*

NOT-Autoencoder with L2 cost:

*(sample reconstructions)*

## TODO

- fix artbench (add train / val / test split)
- extend to VAE?
## Citation

```bibtex
@article{korotin2022neural,
  title={Neural optimal transport},
  author={Korotin, Alexander and Selikhanovych, Daniil and Burnaev, Evgeny},
  journal={arXiv preprint arXiv:2201.12220},
  year={2022}
}
```
