Basic GANs with variety of loss functions as an exercise for my Thesis with Prof. Randy Paffenroth. KL, Reverse-KL, JS and Wasserstein GAN.
Updated Mar 3, 2018 · Jupyter Notebook
The Dirichlet Mechanism for Differentially Private KL Divergence Minimization
This repository includes some detailed proofs of "Bias Variance Decomposition for KL Divergence".
Implementation of a Denoising Diffusion Probabilistic Model with some mathematical background.
Kullback-Leibler divergence in Python
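A minimal sketch of what such a computation typically looks like (the function name and example distributions are illustrative, not taken from any of the listed repositories):

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(p || q) in nats.

    Terms where p[i] == 0 contribute 0 by convention; q must be
    nonzero wherever p is nonzero for the divergence to be finite.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: KL is asymmetric, so D(p || q) generally differs from D(q || p)
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))
```

Note that `scipy.special.rel_entr` or `scipy.stats.entropy` can compute the same quantity element-wise with more careful edge-case handling.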
My lab work from the “Generative AI with Large Language Models” course offered by DeepLearning.AI and Amazon Web Services on Coursera.
Using Monte Carlo-simulated datasets, a fully transparent Boltzmann machine is trained on 1-D Ising chain data to predict model couplers when past coupler values are unavailable, demonstrating machine-learning methods applied to theoretical physics.
In this project, we explore how entropy and information measures can be used in language models and optimized for generative tasks.
Python information theory computation
Implementation of the Non-negative Multiple Matrix Factorization (NMMF) algorithm proposed in Takeuchi et al., 2013, with some modifications. A native Python version (NMMFlexPy) and an R wrapper (NMMFlexR) are available.
My MSc project on applying, tuning, and modifying the PPO and A2C algorithms for the two-player poker game in the PettingZoo MARL library.
Forward sampling conversion of a Bayesian network (BN).
Change point detection using KL divergence
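One common formulation of KL-based change point detection compares distributions fitted to sliding windows before and after each candidate point. The sketch below (illustrative; window size, Gaussian fits, and function names are assumptions, not taken from the listed repository) scores each index by the closed-form KL divergence between univariate Gaussians fitted to the two adjacent windows:

```python
import numpy as np

def kl_gaussian(mu0, var0, mu1, var1):
    # Closed-form KL( N(mu0, var0) || N(mu1, var1) ) for univariate Gaussians
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def kl_change_scores(x, window=50):
    """Score index t by the KL divergence between Gaussians fitted to
    the window just before t and the window just after t."""
    x = np.asarray(x, dtype=float)
    scores = np.zeros(len(x))
    for t in range(window, len(x) - window):
        left, right = x[t - window:t], x[t:t + window]
        # Small epsilon guards against zero variance in a flat window
        scores[t] = kl_gaussian(left.mean(), left.var() + 1e-12,
                                right.mean(), right.var() + 1e-12)
    return scores

# Synthetic signal with a mean shift at index 200: the score peaks near it
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 200), rng.normal(3, 1, 200)])
print(int(np.argmax(kl_change_scores(x))))
```

Thresholding the score series (rather than taking a single argmax) generalizes this to multiple change points.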
Generating grayscale images of numerical digits using a variational autoencoder.
Text Analytic Projects
average-KL-divergence-calculator.py is a Python script that calculates the average KL divergence for each FASTA file in a directory, producing a separate output file per input as well as a combined output file with the results.
Replication of the research paper “Auto-Encoding Variational Bayes”.
Scheduling TRPO's KL Divergence Constraint
Novel technique to fit a target distribution with a class of distributions using SVI (via NumPyro). Unlike standard SVI, our "data" is a distribution rather than a finite collection of samples.
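A toy illustration of the objective behind this kind of fit, without NumPyro: reverse KL, the quantity SVI minimizes, can be estimated by Monte Carlo from samples of the variational distribution q. Everything here (the mixture target, parameter values, function names) is an assumed sketch, not the repository's method:

```python
import numpy as np

# Hypothetical target: an equal mixture of N(-4, 1) and N(4, 1)
def log_p(x):
    return (np.logaddexp(-0.5 * (x + 4) ** 2, -0.5 * (x - 4) ** 2)
            + np.log(0.5) - 0.5 * np.log(2 * np.pi))

def reverse_kl(mu, sigma, n=20000, seed=0):
    """Monte Carlo estimate of KL(q || p) for q = N(mu, sigma^2),
    using samples drawn from q -- the reverse-KL objective of SVI."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, sigma, n)
    log_q = (-0.5 * ((x - mu) / sigma) ** 2
             - np.log(sigma) - 0.5 * np.log(2 * np.pi))
    return float(np.mean(log_q - log_p(x)))

# Reverse KL is mode-seeking: a Gaussian locked onto one mode scores
# far better than a broad Gaussian that covers both modes.
print(reverse_kl(4.0, 1.0))   # close to log 2 for well-separated modes
print(reverse_kl(0.0, 4.1))   # much larger
```

In an SVI framework such as NumPyro, the same expectation is estimated with reparameterized samples and minimized by gradient descent on the variational parameters instead of being evaluated at fixed ones.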
Factorized variational approximation using a univariate Gaussian distribution over a single variable x.