A pytorch package for non-negative matrix factorization.
A PyTorch implementation of "Generating Sentences from a Continuous Space" (Bowman et al., 2015).
Implementations of basic concepts under the Reinforcement Learning umbrella. A collection of assignments from CS747: Foundations of Intelligent and Learning Agents (Autumn 2017) at IIT Bombay.
IJCAI 2021, "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation"
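The entry above compares KL-divergence and mean-squared-error losses for knowledge distillation. As a hedged illustration (a minimal stdlib-only sketch, not code from the listed repository), the two losses can be computed over temperature-softened teacher and student distributions like this:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a list of logits.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    # KL(p || q) = sum_i p_i * log(p_i / q_i); terms with p_i == 0 contribute 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mse(p, q):
    # Mean squared error between two probability vectors.
    return sum((pi - qi) ** 2 for pi, qi in zip(p, q)) / len(p)

# Illustrative logits (made up for this sketch, not from the paper).
teacher_logits = [2.0, 1.0, 0.1]
student_logits = [1.5, 1.2, 0.3]

p = softmax(teacher_logits, temperature=2.0)  # soft teacher targets
q = softmax(student_logits, temperature=2.0)  # soft student predictions

print("KL :", kl_divergence(p, q))  # non-negative; zero iff p == q
print("MSE:", mse(p, q))
```

In distillation, KL weights errors by the teacher's probability mass, while MSE penalizes all coordinates uniformly; that asymmetry is what comparisons like the one above study.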
Machine Learning algorithms built from scratch for AMMI Machine Learning course
Code for enumerating and evaluating numerical methods for Langevin dynamics using near-equilibrium estimates of the KL-divergence. Accompanies https://doi.org/10.3390/e20050318
A collection of summarizer algorithms
NLP implementations like information-theoretic measures of distributional similarity, text preprocessing using shell commands, Naive Bayes text categorization model, Cocke-Younger-Kasami parsing.
Experiments with variational autoencoders in Julia
This repository contains the lab work for Coursera course on "Generative AI with Large Language Models".
Implementation of KL Divergence and inverted vector model for plagiarism detection in text files
This repository summarizes techniques for the KL-divergence vanishing problem.
Relative entropy, mutual information, and KL divergence of two given images 🖼
PyTorch implementation of α-geodesical skew divergence
Variational autoencoders built on Keras's modular design
Implementing various paraphrase-detection measures on the Microsoft Paraphrase Corpus and comparing their performance on the original high-dimensional TF-IDF matrix and its low-dimensional approximation
Basic GANs with a variety of loss functions, as an exercise for my thesis with Prof. Randy Paffenroth: KL, reverse-KL, JS, and Wasserstein GAN losses.
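The entry above names the three f-divergences commonly contrasted in GAN training. As a hedged sketch (stdlib-only, for two discrete distributions; the example values are made up), the three quantities and their key properties look like this:

```python
import math

def kl(p, q):
    # Forward KL(p || q): mode-covering; heavily penalizes q ~ 0 where p > 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def reverse_kl(p, q):
    # Reverse KL(q || p): mode-seeking; the direction minimized in some GAN analyses.
    return kl(q, p)

def js(p, q):
    # Jensen-Shannon divergence: symmetric, bounded above by log 2.
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.7, 0.2, 0.1]  # "data" distribution (illustrative)
q = [0.5, 0.3, 0.2]  # "generator" distribution (illustrative)

print("KL        :", kl(p, q))
print("reverse KL:", reverse_kl(p, q))
print("JS        :", js(p, q))
```

KL is asymmetric (forward and reverse generally differ), while JS is symmetric and bounded, which is part of why the original GAN objective relates to JS and why Wasserstein distance was later proposed as a better-behaved alternative on disjoint supports.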
The Dirichlet Mechanism for Differentially Private KL Divergence Minimization