
Code for our CVPR'23 paper: "Polynomial Implicit Neural Representations For Large Diverse Datasets"


Rajhans0/Poly_INR



The libraries are borrowed from the StyleGAN-XL repository. Big thanks to the authors for the wonderful code.

Requirements

  • 64-bit Python 3.8 and PyTorch 1.9.0 (or later)
  • CUDA toolkit 11.1 or later.
  • GCC 7 or later compiler.
  • Use the following commands with Miniconda3 to create and activate your Python environment:
     conda env create -f environment.yml
     conda activate polyinr

Data Preparation

python dataset_tool.py --source=./data/location --dest=./data/dataname_256.zip --resolution=256x256 --transform=center-crop
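The resulting zip follows the StyleGAN dataset convention: image files plus an optional `dataset.json` holding class labels. A minimal sketch for sanity-checking a prepared archive (the helper name `summarize_dataset` is ours, not part of the repo):

```python
import json
import zipfile

def summarize_dataset(zip_path):
    """Count images and labels in a dataset_tool.py-style zip.

    Assumes the StyleGAN convention: image files at arbitrary paths,
    plus an optional dataset.json of the form
    {"labels": [[filename, class_id], ...]}.
    """
    with zipfile.ZipFile(zip_path) as z:
        names = z.namelist()
        images = [n for n in names if n.lower().endswith((".png", ".jpg", ".jpeg"))]
        labels = None
        if "dataset.json" in names:
            labels = json.loads(z.read("dataset.json"))["labels"]
    return len(images), None if labels is None else len(labels)
```

Image and label counts should match; a mismatch usually means the `--source` folder layout was not what `dataset_tool.py` expected.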

Training initial resolution

python train.py --outdir=./training-runs/dataname --data=./data/dataname_32.zip --gpus=4 --batch=64 --mirror=1 --snap 10 --batch-gpu 8 --kimg 10000
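In StyleGAN-style training loops, `--batch` is the total batch size and `--batch-gpu` caps the per-GPU batch per forward pass, so the remainder is made up by gradient accumulation. A quick sketch of that arithmetic (a reading of the flags, not code from the repo):

```python
def accumulation_rounds(batch, gpus, batch_gpu):
    """Gradient-accumulation rounds implied by the training flags.

    Each of `gpus` GPUs processes `batch_gpu` samples per forward pass,
    so the total batch is assembled over batch // (gpus * batch_gpu)
    accumulation rounds.
    """
    assert batch % (gpus * batch_gpu) == 0, \
        "--batch must be divisible by --gpus * --batch-gpu"
    return batch // (gpus * batch_gpu)
```

With the command above (`--batch=64 --gpus=4 --batch-gpu 8`), each optimizer step accumulates over 2 rounds; lowering `--batch-gpu` trades speed for GPU memory without changing the effective batch size.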

Training super-resolution

python train.py --outdir=./training-runs/dataname --data=./data/dataname_64.zip --gpus=4 --batch=64 --mirror=1 --snap 10 --batch-gpu 8 --kimg 10000 \
  --superres --path_stem training-runs/dataname/00000-gmgan-dataname_32-gpus8-batch64/best_model.pkl

To generate samples, run:

python gen_images.py --outdir=out --trunc=0.6 --seeds=1-20 --batch-sz 1 --class 135 --network=path/to/best_model.pkl
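The `--seeds` flag accepts a range like `1-20`. StyleGAN-family `gen_images.py` scripts expand such specs into a list of integer seeds; a hypothetical re-implementation of that parsing (not the repo's exact helper):

```python
import re

def parse_seed_range(spec):
    """Expand a seed spec like '1-20' or '1,3,5-7' into a list of ints.

    Mirrors the num_range-style helpers used in StyleGAN-family
    generation scripts; written here for illustration.
    """
    seeds = []
    for part in spec.split(","):
        m = re.fullmatch(r"(\d+)-(\d+)", part)
        if m:
            seeds.extend(range(int(m.group(1)), int(m.group(2)) + 1))
        else:
            seeds.append(int(part))
    return seeds
```

Each seed deterministically selects one latent vector, so `--seeds=1-20` with `--class 135` yields 20 distinct samples of that ImageNet class.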

Pretrained checkpoints

ImageNet-128x128
ImageNet-256x256
ImageNet-512x512
