Variational autoencoders using Keras's modular design
Basic GANs with a variety of loss functions (KL, reverse KL, JS, and Wasserstein), as an exercise for my thesis with Prof. Randy Paffenroth.
Information-theory computations in Python
Code for enumerating and evaluating numerical methods for Langevin dynamics using near-equilibrium estimates of the KL-divergence. Accompanies https://doi.org/10.3390/e20050318
Implementations of basic concepts under the reinforcement-learning umbrella. This project is a collection of assignments from CS747: Foundations of Intelligent and Learning Agents (Autumn 2017) at IIT Bombay
Experiments with variational autoencoders in Julia
This repository summarizes techniques for the KL-divergence vanishing problem.
Implementation of KL Divergence and inverted vector model for plagiarism detection in text files
Text Analytic Projects
A collection of summarizer algorithms
Machine Learning algorithms built from scratch for AMMI Machine Learning course
Implementing various measures of paraphrase detection on the Microsoft Paraphrase Corpus and checking their performance on the original high-dimensional TF-IDF matrix and its low-dimensional approximation
NLP implementations, including information-theoretic measures of distributional similarity, text preprocessing with shell commands, a Naive Bayes text-categorization model, and Cocke-Younger-Kasami parsing.
Relative entropy, mutual information, and KL divergence of two given images 🖼
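As a minimal sketch of the idea behind the repo above (not its actual code; all names are illustrative), the KL divergence between two grayscale images can be computed by treating each image's intensity histogram as a discrete probability distribution:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for two discrete distributions, e.g. image histograms.

    A small epsilon avoids log(0) / division by zero for empty bins.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Synthetic stand-ins for two grayscale images (values 0-255).
rng = np.random.default_rng(0)
img_a = rng.integers(0, 256, size=(64, 64))
img_b = rng.integers(0, 256, size=(64, 64))

# Grey-level histograms serve as the two distributions.
hist_a, _ = np.histogram(img_a, bins=256, range=(0, 256))
hist_b, _ = np.histogram(img_b, bins=256, range=(0, 256))

print(kl_divergence(hist_a, hist_b))  # non-negative; ~0 for matching histograms
```

Note that KL divergence is asymmetric: `kl_divergence(hist_a, hist_b)` and `kl_divergence(hist_b, hist_a)` generally differ.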
PyTorch implementation of α-geodesical skew divergence
Factorized variational approximation using a univariate Gaussian distribution over a single variable x.
Coordinate-ascent mean-field variational inference (CAVI) that iteratively applies the optimal variational-factor parameter updates to maximize the evidence lower bound (ELBO), applied to clustering.
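The two repos above refer to the standard mean-field machinery; a minimal sketch of CAVI for the classic univariate-Gaussian example (factorized approximation q(μ)q(τ) with a Gaussian factor for the mean μ and a Gamma factor for the precision τ, as in Bishop's PRML §10.1.3) might look like the following. The priors and synthetic data are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=0.5, size=500)  # synthetic data, true mean 2.0
N, xbar = len(x), x.mean()

# Broad conjugate priors (illustrative values):
# mu ~ N(mu0, (lam0 * tau)^-1),  tau ~ Gamma(a0, b0)
mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0

E_tau = 1.0  # initial guess for E_q[tau]
for _ in range(50):
    # Update q(mu) = N(mu_n, 1/lam_n), holding q(tau) fixed.
    mu_n = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam_n = (lam0 + N) * E_tau
    # Update q(tau) = Gamma(a_n, b_n), holding q(mu) fixed.
    a_n = a0 + (N + 1) / 2
    # E_q[sum_i (x_i - mu)^2] = sum_i (x_i - mu_n)^2 + N * Var_q(mu)
    E_sq = np.sum((x - mu_n) ** 2) + N / lam_n
    b_n = b0 + 0.5 * (E_sq + lam0 * ((mu_n - mu0) ** 2 + 1 / lam_n))
    E_tau = a_n / b_n

# Posterior mean should land near 2.0, posterior scale near 0.5.
print(mu_n, 1 / np.sqrt(E_tau))
```

Each coordinate update is the exact maximizer of the ELBO with the other factor held fixed, so every iteration is guaranteed not to decrease the ELBO.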
Replication of the research paper titled Auto-Encoding Variational Bayes.