Hi. I was wondering about the results of your experiments. In the codebook size ablation study in Section 3.1, you show that usage increases from 75% to 97% when the size increases from 8192 to 16384. It's very interesting that at size 4096 the usage is 100%, it then drops to 75% at size 8192, and rises back to 97% at size 16384. Do you have any insights or conclusions about this non-monotonic result?
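For reference, the "usage" numbers discussed here are typically computed as the fraction of codebook entries that are selected at least once when quantizing a dataset. A minimal sketch (the function name and toy indices are illustrative, not from the paper's code):

```python
import numpy as np

def codebook_usage(code_indices, codebook_size):
    """Fraction of codebook entries that appear at least once
    among the quantized indices (the 'usage' metric above)."""
    used = np.unique(np.asarray(code_indices))
    return len(used) / codebook_size

# toy example: 3 distinct codes out of a size-8 codebook
indices = [0, 2, 2, 5, 0]
print(codebook_usage(indices, 8))  # -> 0.375
```

With this definition, 100% usage at size 4096 means every code was hit at least once, while 75% at size 8192 means roughly 2048 entries went unused ("dead codes").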
Hi. I was also wondering about the codebook-dimension experiments. I trained the VQGAN with a downsampling factor of 4, but I do not get similar results. Did you observe similar results with a downsampling factor of 8 or 4? Thanks.