RESULTS.md

low resolution = 128×416; high resolution = 256×640

- 5 epochs, depth and motion, low resolution = 2.487 (90 ms/step)
- 5 epochs, depth and motion, high resolution = 2.3258 (140 ms/step)
- 5 epochs, depth and motion, low resolution, fully connected = 2.487 (should have the same result as the 1x1 conv) (84 ms/step)
- 5 epochs, depth and motion, low resolution, fully connected, no mean = 2.5549
- 5 epochs w/ pose augmentation, depth and motion, low resolution, fully connected = 2.3956 (1.2701 after 15 epochs)
- 5 epochs w/ pose augmentation and imagenet standardization, depth and motion, low resolution, fully connected = 2.066 (1.1928 after 15 epochs)
- 5 epochs w/ pose augmentation, imagenet standardization and color augmentation, depth and motion, low resolution, fully connected = 2.432 (1.457 after 15 epochs)
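The "should have the same result as the 1x1 conv" note rests on the fact that a 1×1 convolution is exactly a fully connected layer applied independently at every spatial position. A minimal numpy sketch of that equivalence (function and variable names are illustrative, not from this repo):

```python
import numpy as np

def conv1x1(x, w, b):
    # x: (H, W, C_in), w: (C_in, C_out), b: (C_out,).
    # A 1x1 convolution is a matrix multiply over the channel axis,
    # applied independently at each spatial position.
    return x @ w + b

def fully_connected_per_pixel(x, w, b):
    # Flatten spatial positions to a batch, apply one dense layer, reshape back.
    h, wd, c = x.shape
    out = x.reshape(-1, c) @ w + b
    return out.reshape(h, wd, -1)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 4, 3))
w = rng.standard_normal((3, 8))
b = rng.standard_normal(8)
assert np.allclose(conv1x1(x, w, b), fully_connected_per_pixel(x, w, b))
```

With shared weights across positions the two parameterizations are identical, which is why the losses match to the reported precision.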

- 5 epochs, resnet18, low resolution = 2.0806 (200 ms/step)

- 5 epochs, resnet18 + resnet further downsampling, low resolution = 2.1518 (smells like overfitting)
- 5 epochs, resnet18 + resnet further downsampling w/ fancy downscaling, low resolution = 1.8149 (lowest val 1.6473, still overfitting)
- 5 epochs, resnet18 + resnet further downsampling w/ fancy downscaling residual branch, low resolution = 1.9803 (lowest val 1.5707, still overfitting)
- 5 epochs, resnet18 + resnet further downsampling w/ fancy downscaling conv branch, low resolution = 1.7693 (lowest val 1.6617, still overfitting)

- 5 epochs, resnet18 + resnet further downsampling + dropout (to reduce overfitting), low resolution = 1.9951
- 5 epochs, resnet18 + resnet further downsampling w/ fancy downscaling + weight decay (1e-5), low resolution = had terrible results, didn't finish
- 5 epochs, resnet18 + resnet further downsampling w/ fancy downscaling + weight decay (5e-6), low resolution = 2.2502
- 5 epochs, resnet18 + resnet further downsampling w/ fancy downscaling + dropout, low resolution = 1.8251
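For reference, the weight-decay coefficients above (1e-5, 5e-6, etc.) refer to the standard per-step shrinkage of the weights toward zero. A minimal sketch of a decoupled-weight-decay SGD step (names hypothetical, not the repo's optimizer):

```python
import numpy as np

def sgd_step_with_weight_decay(w, grad, lr, weight_decay):
    # Decoupled weight decay: shrink the weights toward zero,
    # then take the usual gradient step.
    return w * (1.0 - lr * weight_decay) - lr * grad

# With a zero gradient, only the decay term acts on the weight.
w = np.array([1.0])
w_next = sgd_step_with_weight_decay(w, np.array([0.0]), lr=0.1, weight_decay=1e-5)
```

At these magnitudes each step shrinks weights by a factor of roughly `1 - lr * wd`, which is why small changes in the coefficient (1e-5 vs 5e-6) can flip a run between diverging and merely underperforming.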

- 5 epochs w/ pose augmentation, resnet18 + resnet further downsampling w/ fancy downscaling, low resolution = 0.7265
- 5 epochs w/ pose augmentation, resnet18 + resnet further downsampling w/ fancy downscaling + weight decay (1e-7), low resolution = 1.0815
- 5 epochs w/ pose augmentation, resnet18 + resnet further downsampling + stochastic depth, low resolution = 1.0628
- 5 epochs w/ pose augmentation, resnet18 + resnet further downsampling, low resolution = 0.7762
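The stochastic-depth variant above regularizes the residual net by randomly skipping whole residual branches during training. A minimal numpy sketch of the technique (a generic illustration, not this repo's implementation):

```python
import numpy as np

def residual_block_stochastic_depth(x, branch, survival_prob, training, rng):
    """Compute y = x + branch(x), randomly dropping the branch in training.

    At eval time the branch output is scaled by its survival probability
    so the expected activation matches what the network saw in training.
    """
    if training:
        if rng.random() < survival_prob:
            return x + branch(x)
        return x  # branch dropped: the block reduces to the identity
    return x + survival_prob * branch(x)

rng = np.random.default_rng(0)
x = np.ones(3)
branch = lambda t: 2.0 * t
y_train = residual_block_stochastic_depth(x, branch, 1.0, True, rng)
y_eval = residual_block_stochastic_depth(x, branch, 0.5, False, rng)
```

With `survival_prob = 1.0` the block behaves like a plain residual block; lower survival probabilities shorten the effective network depth on average, which is the regularizing effect being tested.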

- 5 epochs w/ pose augmentation, resnet18 + resnet further downsampling w/ fancy downscaling + squeeze excite, low resolution = 0.7963
- 5 epochs w/ more pose augmentation, resnet18 + resnet further downsampling w/ fancy downscaling, low resolution = 1.2209
- 5 epochs w/ deduplicated pose augmentation, resnet18 + resnet further downsampling w/ fancy downscaling, low resolution = 0.6665 (lowest val 0.4578)
- 5 epochs w/ deduplicated pose augmentation, resnet18 + resnet further downsampling w/ fancy downscaling + weight decay (1e-8), low resolution, half precision = 0.8372 (lowest val 0.4710)
- 5 epochs w/ deduplicated pose augmentation, resnet18 + no further convs w/ fancy downscaling, low resolution, half precision = 0.9144 (lowest val 0.4767)
- 5 epochs w/ deduplicated pose augmentation, imagenet standardization, resnet18 w/ fancy downscaling, low resolution = 0.7345 (lowest val 0.4581)
- 5 epochs w/ deduplicated pose augmentation, imagenet standardization and color augmentation, resnet18 + resnet further downsampling w/ fancy downscaling, low resolution = 0.9221
- 5 epochs w/ deduplicated pose augmentation, imagenet standardization, resnet18 w/ fancy downscaling, high resolution = 0.5202 (lowest val 0.3162)
- 5 epochs w/ deduplicated pose augmentation, no reverse augmentation, imagenet standardization, resnet18 w/ fancy downscaling, high resolution = 0.3584 (lowest val 0.2967)
- 5 epochs w/ deduplicated pose augmentation, no reverse augmentation, imagenet standardization, resnet18 w/ fancy downscaling, low resolution = 0.5075 (lowest val 0.3752)
- 5 epochs w/ deduplicated pose augmentation, no reverse augmentation, imagenet standardization, resnet18 w/ fancy downscaling + weight decay (1e-8), low resolution = 0.5591 (lowest val 0.3813)
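"imagenet standardization" in these runs presumably means normalizing inputs with the usual ImageNet channel statistics, which matters when starting from ImageNet-pretrained backbones. A minimal sketch, assuming RGB inputs scaled to [0, 1]:

```python
import numpy as np

# Channel-wise ImageNet mean and std (RGB, for inputs in [0, 1]).
IMAGENET_MEAN = np.array([0.485, 0.456, 0.406])
IMAGENET_STD = np.array([0.229, 0.224, 0.225])

def imagenet_standardize(images):
    # images: (..., H, W, 3) float array with values in [0, 1].
    # Returns zero-mean, unit-variance channels w.r.t. the ImageNet stats.
    return (images - IMAGENET_MEAN) / IMAGENET_STD

# An image that equals the ImageNet mean everywhere standardizes to zero.
x = np.broadcast_to(IMAGENET_MEAN, (2, 2, 3))
z = imagenet_standardize(x)
```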

- 5 epochs, resnet34, low resolution = 2.2646 (280 ms/step) (smells like overfitting)

- 5 epochs w/ pose augmentation, resnet34 + resnet further downsampling w/ fancy downscaling + squeeze excite, low resolution = 0.7098
- 5 epochs w/ deduplicated pose augmentation, resnet34 + resnet further downsampling w/ fancy downscaling, low resolution = 0.5678
- 5 epochs w/ deduplicated pose augmentation, resnet34 + resnet further downsampling w/ fancy downscaling, low resolution, half precision = 0.5779 (lowest val 0.4343)
- 5 epochs w/ deduplicated pose augmentation, imagenet standardization, resnet34 w/ fancy downscaling, high resolution = didn't train
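The "squeeze excite" variants add a squeeze-and-excitation block after each residual branch: global average pooling ("squeeze"), a small two-layer bottleneck gate ("excite"), then channel-wise rescaling. A minimal numpy sketch with hypothetical weights, not the exact block used in these runs:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def squeeze_excite(x, w1, w2):
    # x: (H, W, C); w1: (C, C // r) reduction; w2: (C // r, C) expansion.
    s = x.mean(axis=(0, 1))                     # squeeze: global avg pool -> (C,)
    e = sigmoid(np.maximum(s @ w1, 0.0) @ w2)   # excite: FC -> ReLU -> FC -> sigmoid
    return x * e                                # channel-wise rescale

# With zero gate weights the sigmoid outputs 0.5, so every channel is halved.
x = np.ones((2, 2, 4))
y = squeeze_excite(x, np.zeros((4, 2)), np.zeros((2, 4)))
```

The extra parameters are cheap (two small matrices per block), which is consistent with the modest per-step cost difference between the SE and plain runs.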

- 5 epochs w/ deduplicated pose augmentation, resnet50 + resnet further downsampling w/ fancy downscaling, low resolution = 0.6043

- 5 epochs, regnet-y 400m w/ fancy downscaling, low resolution = 1.7567
- 5 epochs, regnet-y 400m w/o fancy downscaling, low resolution = 2.5458 (seems like fancy downscaling is having a huge regularizing effect)
- 5 epochs w/ pose augmentation, regnet-y 400m + resnet further downsampling w/ fancy downscaling, low resolution = didn't even train, not sure why

- 5 epochs w/ deduplicated pose augmentation, imagenet standardization, regnet-x 600m no further convs, low resolution = 0.9005 (lowest val 0.5871)
- 5 epochs w/ deduplicated pose augmentation, imagenet standardization, regnet-x 600m no further convs w/ fancy downscaling residual branch, low resolution = 1.9306 (lowest val 0.6266)