Replaced KL divergence estimation with deterministic KL #760

Merged

Conversation

nmichlo
Contributor

@nmichlo commented Oct 28, 2021

What does this PR do?

Changed the VAE to compute the KL divergence in closed form during training, instead of estimating it by sampling from the posterior and prior distributions.

Fixes #565
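
For context, this swaps a Monte Carlo estimate of the KL term for the closed-form KL between two diagonal Gaussians. Below is a minimal sketch of the difference, assuming a standard-normal prior and a diagonal-Gaussian posterior parameterised by `mu` and `log_var`; the names and shapes are illustrative and not taken from the PR diff:

```python
import torch
from torch.distributions import Normal, kl_divergence

# Hypothetical encoder outputs for a batch of 8 latents of size 16.
mu, log_var = torch.randn(8, 16), torch.randn(8, 16)
p = Normal(torch.zeros_like(mu), torch.ones_like(mu))  # prior p(z) = N(0, 1)
q = Normal(mu, torch.exp(0.5 * log_var))               # posterior q(z|x) = N(mu, sigma)

# Before: single-sample Monte Carlo estimate of KL(q || p) from a reparameterised sample.
z = q.rsample()
kl_estimate = (q.log_prob(z) - p.log_prob(z)).sum(dim=-1)

# After: closed-form ("deterministic") KL between the two Gaussians.
kl_exact = kl_divergence(q, p).sum(dim=-1)
```

With the closed-form version the KL term no longer carries sampling noise; only the reconstruction term still depends on the reparameterised sample `z`.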

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests? [not needed for typos/docs]
  • Did you verify new and existing tests pass locally with your changes? (I struggled to get them to run)
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

  • Is this pull request ready for review? (if not, please submit in draft mode)

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

@github-actions github-actions bot added the model label Oct 28, 2021
@nmichlo nmichlo marked this pull request as ready for review October 28, 2021 13:57
@Borda Borda merged commit 51942bc into Lightning-Universe:master Nov 8, 2021
@mergify mergify bot added the ready label Nov 8, 2021
Development

Successfully merging this pull request may close these issues.

VAE KL Divergence