Code implementation of HINT

Code implementation of the paper Accurate Interpolation for Scattered Data through Hierarchical Residual Refinement.

The implementation of this work builds on three existing projects: NIERT, NeuralSymbolicRegressionThatScales, and TFR-HSS-Benchmark.

Preparation

  1. We recommend using the conda package manager to create and manage the project environment:
conda create -n hint python=3.7
conda activate hint
  2. Install the required third-party libraries with the following commands:
pip install torch==1.8.1+cu111 torchvision==0.9.1+cu111 -f https://download.pytorch.org/whl/torch_stable.html
pip install -r requirements.txt
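
After installation, a quick sanity check can confirm that the pinned PyTorch build sees your GPUs (a minimal sketch; it assumes only the torch 1.8.1+cu111 wheel installed above):

# verify the PyTorch version and CUDA availability
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"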

Construction of Mathit-2D Dataset

The Mathit-2D dataset construction process builds upon the work of NeuralSymbolicRegressionThatScales and NIERT.

Follow the steps below to create the dataset:

# generate training equations set
python3 -m src.data.mathit.run_dataset_creation --number_of_equations 1000000 --no-debug

# generate testing equations set
python3 -m src.data.mathit.run_dataset_creation --number_of_equations 150 --no-debug

mkdir -p mathit_data/test_set

# convert the newly created validation dataset to CSV format
python3 -m src.data.mathit.run_dataload_format_to_csv raw_test_path=mathit_data/data/raw_datasets/150

# remove the validation equations from the training set
python3 -m src.data.mathit.run_filter_from_already_existing --data_path mathit_data/data/raw_datasets/1000000 --csv_path mathit_data/test_set/test_nc.csv

python3 -m src.data.mathit.run_apply_filtering --data_path mathit_data/data/raw_datasets/1000000

By following these steps, you will generate the Mathit-2D dataset.
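
To verify the construction, the generated artifacts can be inspected directly (a sketch; the exact directory layout and CSV columns follow the NeuralSymbolicRegressionThatScales/NIERT export format and may differ from what is shown here):

# list the raw training equations and peek at the test CSV
ls mathit_data/data/raw_datasets/1000000
head -n 3 mathit_data/test_set/test_nc.csv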

Accessing the PTV and TFRD Datasets

The PTV dataset can be obtained from here. It is used for interpolating particle velocities and reconstructing velocity fields from scattered measurements.

The TFRD datasets can be obtained from here. They are designed for reconstructing temperature fields from measurements taken by scattered temperature sensors.

Training

Follow the commands below to train HINT:

# Training on Mathit-2D dataset
CUDA_VISIBLE_DEVICES="0,1" python main.py --config_path ./config/config_Mathit.yml

# Training on Perlin dataset
CUDA_VISIBLE_DEVICES="0" python main.py --config_path ./config/config_Perlin.yml

# Training on TFRD-ADlet dataset
CUDA_VISIBLE_DEVICES="0,1" python main.py --config_path ./config/config_TFR_adlet.yml

# Training on PTV dataset
CUDA_VISIBLE_DEVICES="0,1" python main.py --config_path ./config/config_PTV.yml
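
The visible GPUs are selected through CUDA_VISIBLE_DEVICES, so the same entry point also works on a single-GPU machine (a sketch based on the commands above; whether the batch size should be adjusted for one device is determined by the corresponding config file):

# Training on Mathit-2D dataset with a single GPU
CUDA_VISIBLE_DEVICES="0" python main.py --config_path ./config/config_Mathit.yml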

Testing

For the Mathit dataset, we first need to fix an interpolation-task test set derived from the equation-skeleton test set:

python main.py -m save_Mathit_testdataset_as_file

Then we can evaluate HINT on this test set:

CUDA_VISIBLE_DEVICES="0,1" python main.py -m test_Mathit --resume_from_checkpoint path_of_hint_checkpoint

For evaluation on other datasets, just run:

CUDA_VISIBLE_DEVICES="0,1" python main.py -m test_<dataset_name> --resume_from_checkpoint path_of_hint_checkpoint
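
For example, evaluation on the PTV dataset would look like the following (checkpoints/hint_ptv.ckpt is a purely illustrative path; substitute the checkpoint produced by your own training run):

# Evaluate a trained HINT checkpoint on the PTV test set
CUDA_VISIBLE_DEVICES="0,1" python main.py -m test_PTV --resume_from_checkpoint checkpoints/hint_ptv.ckpt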
