This repository has been archived by the owner on Jan 10, 2023. It is now read-only.

Detailed procedure for preprocessing CelebA-HQ 128x128 images #46

Open
kuc2477 opened this issue Mar 4, 2020 · 2 comments


@kuc2477

kuc2477 commented Mar 4, 2020

Thank you for the extensive experiments and reliable implementations, which are hard to find these days!

I have a few questions about the CelebA-HQ 128x128 dataset preprocessing mentioned in "A Large-Scale Study on Regularization and Normalization in GANs" (Kurach et al., ICML 2019).

In Section 2.6 of the paper, the authors mention that the images were preprocessed by running the 128x128x3 version of the code provided in the PGGAN repository.

Can you give some detail on how exactly the "128x128x3 version" was implemented?

The two possibilities that come to mind are:

(a) replace all occurrences of 1024 in the code with 128, or
(b) resize the preprocessed 1024x1024x3 images (the original CelebA-HQ images) to 128x128x3
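For option (b), a minimal sketch of what I have in mind, assuming Pillow and a Lanczos filter (the filter choice is exactly what I am unsure about):

```python
from PIL import Image

def downscale(img: Image.Image, size: int = 128) -> Image.Image:
    """Downscale a square CelebA-HQ image to size x size.

    LANCZOS (Pillow's old ANTIALIAS) is only a guess here; the paper
    does not state which resampling filter was used.
    """
    return img.resize((size, size), Image.LANCZOS)

# Stand-in for a real 1024x1024x3 CelebA-HQ image:
demo = Image.new("RGB", (1024, 1024), color=(128, 128, 128))
print(downscale(demo).size)  # (128, 128)
```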

My questions are:

  1. If (a), can you provide us example code for reference?
  2. If (b), can you tell us the resize method you used? (BILINEAR, ANTIALIAS, etc)
  3. How did you split the images into training (27,000) and test (3,000) images? (e.g. sort by index and use first 27,000 images as training set and use last 3,000 images as test images)
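For question 3, the split I would guess (an assumption, not something the paper confirms) is sorting by index and slicing:

```python
# Hypothetical: 30,000 CelebA-HQ images named by index, e.g. 00000.png ... 29999.png.
filenames = [f"{i:05d}.png" for i in range(30_000)]
filenames.sort()  # sort by index

# First 27,000 for training, last 3,000 for testing -- my assumption only.
train, test = filenames[:27_000], filenames[27_000:]
print(len(train), len(test))  # 27000 3000
```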

Thanks again for your invaluable contribution!

@a7b23

a7b23 commented Apr 29, 2020

+1 to the above issue. Would appreciate it if someone could reply.

@KomputerMaster64

I wanted to know how the preprocessing should change for CelebA at different resolutions, such as 64x64, 128x128, 256x256, 512x512, and 1024x1024.
