CVPR2023-DANI-Net

DANI-Net: Uncalibrated Photometric Stereo by Differentiable Shadow Handling, Anisotropic Reflectance Modeling, and Neural Inverse Rendering
Zongrui Li, Qian Zheng, Boxin Shi, Gang Pan, Xudong Jiang

This is the official implementation of our CVPR 2023 paper. Given a set of observed images captured under varying, parallel lights, DANI-Net recovers the light conditions (directions and intensities), surface normals, anisotropic reflectance, and a soft shadow map.

Updates

  • [2024-04-17] We have fixed some bugs in the scale-invariant error calculation used for light intensity evaluation; these bugs could lead to unexpectedly large errors. We have also updated the configuration files for the DiLiGenT 10^2 dataset.
  • [2024-04-10] We have fixed some bugs in the silhouette loss calculation on the DiLiGenT 10^2 dataset.

Our Relighting Results

Dependencies

We use Anaconda to install the dependencies with the following commands:

# Create a new python3.8 environment named dani-net
conda env create -f environment.yml
conda activate dani-net

Train

Train on benchmark datasets.

DANI-Net uses wandb.ai to record training logs. Please register an account first if you would like to use it; otherwise, change the 'logger_type' entry of the config file to 'tensorboard'.
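
Only the 'logger_type' field is named in this README; the excerpt below is a minimal sketch of that change, and any other keys follow whatever is already in your config file (see 'configs/template.yml'):

# configs/diligent/YOUR_OBJ_NAME.yml (excerpt)
logger_type: tensorboard   # switch back to the wandb option from 'configs/template.yml' to log to wandb.ai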

The datasets can be downloaded according to the table below:

Dataset                     Link
DiLiGenT Benchmark          Link
DiLiGenT 10^2 Benchmark     Link
Gourd & Apple               Link
Light Stage Data Gallery    Link

Please download and unzip the datasets to the 'data' folder in the root directory. To train DANI-Net on a particular object, you may run:

python train.py --config configs/diligent/YOUR_OBJ_NAME.yml --exp_code YOUR_EXP_TAG
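
For example, to train on a single DiLiGenT object you would point --config at the matching file in 'configs/diligent/' and choose any tag for --exp_code. The config file name below ('bear.yml') is only an illustration; use whatever config names ship in that folder:

python train.py --config configs/diligent/bear.yml --exp_code demo_run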

To train DANI-Net on multiple objects in a particular dataset, please run:

# DiLiGenT
sh scripts/train_diligent.sh

# DiLiGenT 10^2
sh scripts/train_diligent100.sh

# Gourd & Apple
sh scripts/train_apple.sh

# Light Stage
sh scripts/train_lightstage.sh

Train on your own datasets.

Please create a data loader for your own dataset in 'utils/dataset_loader/' and add the corresponding code to 'utils/dataset_utils/'. You should also create a config file for your dataset following the template in 'configs/template.yml'.
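
As a starting point, the sketch below shows the kind of loader you might add under 'utils/dataset_loader/'. The class name, method name, and file layout (one folder per object containing PNG observations plus a 'mask.png') are assumptions; the actual interface is defined by the existing loaders in that folder.

import os
import glob
import numpy as np
import imageio.v2 as imageio

class MyObjectLoader:
    """Hypothetical loader: one folder per object, one image per light, plus a binary mask."""

    def __init__(self, data_dir):
        self.data_dir = data_dir

    def load(self):
        # Collect every observation captured under a different light, excluding the mask image.
        paths = sorted(p for p in glob.glob(os.path.join(self.data_dir, "*.png"))
                       if os.path.basename(p) != "mask.png")
        # Stack observations into an (N, H, W, 3) float array normalized to [0, 1].
        imgs = np.stack([imageio.imread(p).astype(np.float32) / 255.0 for p in paths])
        # Binary object mask marking valid pixels.
        mask = imageio.imread(os.path.join(self.data_dir, "mask.png")) > 0
        return imgs, mask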

Test

We provide all trained models at this link; download and unzip them to the 'runs' folder in the root directory.

To test on a particular dataset, please run:

# DiLiGenT
sh scripts/test_diligent.sh

# DiLiGenT 10^2
sh scripts/test_diligent100.sh

# Gourd & Apple
sh scripts/test_apple.sh

# Light Stage
sh scripts/test_lightstage.sh

Acknowledgement

Part of the code is based on the Neural-Reflectance-PS, nerf-pytorch, and SCPS-NIR repositories.

Citation

@inproceedings{li2023dani,
  title={DANI-Net: Uncalibrated Photometric Stereo by Differentiable Shadow Handling, Anisotropic Reflectance Modeling, and Neural Inverse Rendering},
  author={Li, Zongrui and Zheng, Qian and Shi, Boxin and Pan, Gang and Jiang, Xudong},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year={2023}}
