Language modeling, LSTM, Attention models, Transformers, Parsing and Tagging in NLP, EM algorithm, Auto-encoders implemented in Python using PyTorch. The assignments are part of the course Natural Language Processing.


Natural-Language-Processing

CS 533, Instructor: Karl Stratos, Rutgers University

Course Outline:

The topics covered are:

  • Language modeling: n-gram models, log-linear models, neural models
  • Deep learning in NLP: RNNs, LSTMs, attention models, Transformers, BERT (transfer learning)
  • Structured Prediction in NLP: Tagging and Parsing (Constituency and Dependency)
  • Unsupervised Learning in NLP: Latent-Variable Generative Models, EM Algorithm, Autoencoders, VAEs
  • Information Extraction
  • Large-Scale Transfer Learning

Project:

Gated Attention Network

Assignments:

All the assignments are implemented in Python using PyTorch.

Assignment 1:

  • N-gram models: Relative Frequency Lemma, Maximum Likelihood Estimation (MLE) of the Trigram Language Model
  • Preliminary probability and statistics, Linear Algebra, Optimization
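The MLE of a trigram model is just relative frequency: q(w | u, v) = count(u, v, w) / count(u, v). A minimal sketch (not the assignment's actual code; the padding symbols and corpus are illustrative):

```python
from collections import Counter

def mle_trigram(corpus):
    """Estimate trigram probabilities q(w | u, v) by relative frequency:
    q(w | u, v) = count(u, v, w) / count(u, v)."""
    trigrams, bigrams = Counter(), Counter()
    for sentence in corpus:
        tokens = ["<s>", "<s>"] + sentence + ["</s>"]
        for i in range(2, len(tokens)):
            trigrams[(tokens[i - 2], tokens[i - 1], tokens[i])] += 1
            bigrams[(tokens[i - 2], tokens[i - 1])] += 1
    return lambda u, v, w: trigrams[(u, v, w)] / bigrams[(u, v)] if bigrams[(u, v)] else 0.0

corpus = [["the", "dog", "barks"], ["the", "dog", "runs"]]
q = mle_trigram(corpus)
# q("the", "dog", "barks") == 0.5, since "the dog" occurs twice, once followed by "barks"
```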

Assignment 2:

  • Log-linear language models
  • NLTK tokenizer
  • Feedforward Neural Language Model
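A feedforward neural language model embeds a fixed-size context window, passes the concatenated embeddings through a hidden layer, and scores the next word. A minimal Bengio-style PyTorch sketch (all sizes are illustrative, not the assignment's specification):

```python
import torch
import torch.nn as nn

class FeedforwardLM(nn.Module):
    """Feedforward language model: embed a fixed context window,
    concatenate, then one hidden layer to next-word logits."""
    def __init__(self, vocab_size, context_size=2, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.ff = nn.Sequential(
            nn.Linear(context_size * embed_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, vocab_size),
        )

    def forward(self, context):          # context: (batch, context_size) word ids
        e = self.embed(context)          # (batch, context_size, embed_dim)
        return self.ff(e.flatten(1))     # logits: (batch, vocab_size)

model = FeedforwardLM(vocab_size=100)
logits = model(torch.tensor([[4, 7]]))  # one two-word context
```

Training would minimize cross-entropy between these logits and the true next word, exactly as for the log-linear model but with learned features.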

Assignment 3:

  • Backpropagation in neural networks
  • Self-attention and LSTM models
  • Transformers (encoder/decoder models)
  • BLEU (bilingual evaluation understudy) metric
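The core of self-attention is scaled dot-product attention: softmax(QKᵀ / √d)V. A bare-bones single-head sketch with no learned projections (Q = K = V = x), just to show the mechanics:

```python
import math
import torch

def self_attention(x):
    """Scaled dot-product self-attention over a batch of sequences,
    with Q = K = V = x (no learned projections, single head)."""
    d = x.size(-1)
    scores = x @ x.transpose(-2, -1) / math.sqrt(d)  # (batch, seq, seq)
    weights = torch.softmax(scores, dim=-1)          # rows sum to 1
    return weights @ x                               # (batch, seq, d)

x = torch.randn(1, 5, 8)
out = self_attention(x)   # same shape as x: each position is a weighted mix of all positions
```

A real Transformer layer adds learned query/key/value projections, multiple heads, residual connections, and layer normalization around this operation.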

Assignment 4:

  • Structured Prediction in NLP: Tagging
  • Hidden Markov Models (HMMs)
  • Conditional Random Fields (CRFs)
  • Structured Prediction in NLP: Constituency and Dependency Parsing
  • Probabilistic Context-Free Grammars (PCFGs)
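HMM tagging comes down to Viterbi decoding: find the tag sequence maximizing start, transition, and emission log-probabilities. A sketch with made-up toy parameters (the dict-of-dicts parameterization is an assumption, not the assignment's interface):

```python
import math

def viterbi(words, tags, log_pi, log_trans, log_emit):
    """Most probable tag sequence under an HMM with start (log_pi),
    transition (log_trans), and emission (log_emit) log-probabilities."""
    V = [{t: log_pi[t] + log_emit[t].get(words[0], -math.inf) for t in tags}]
    back = []
    for w in words[1:]:
        scores, ptrs = {}, {}
        for t in tags:
            best = max(tags, key=lambda s: V[-1][s] + log_trans[s][t])
            scores[t] = V[-1][best] + log_trans[best][t] + log_emit[t].get(w, -math.inf)
            ptrs[t] = best
        V.append(scores)
        back.append(ptrs)
    path = [max(tags, key=lambda t: V[-1][t])]
    for ptrs in reversed(back):          # follow backpointers
        path.append(ptrs[path[-1]])
    return list(reversed(path))

# Toy HMM (illustrative numbers) tagging "the dog" as D(eterminer), N(oun):
tags = ["D", "N"]
log_pi = {"D": math.log(0.9), "N": math.log(0.1)}
log_trans = {"D": {"D": math.log(0.1), "N": math.log(0.9)},
             "N": {"D": math.log(0.5), "N": math.log(0.5)}}
log_emit = {"D": {"the": math.log(0.9)}, "N": {"dog": math.log(0.9)}}
path = viterbi(["the", "dog"], tags, log_pi, log_trans, log_emit)
# path == ["D", "N"]
```

CRF decoding uses the same dynamic program over per-position scores; CKY for PCFGs is the analogous maximization over parse trees.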

Assignment 5:

  • Expectation Maximization (EM) Algorithm
  • Variational Autoencoders (VAE)
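EM alternates an E-step (posterior responsibilities under current parameters) and an M-step (re-estimate parameters from expected counts). A sketch on the classic two-coin mixture (data and initial biases are illustrative, not from the assignment):

```python
def em_two_coins(flips, theta_a=0.6, theta_b=0.5, iters=20):
    """EM for a mixture of two coins with unknown biases.
    Each row of `flips` is a (heads, tails) count from one unknown coin."""
    for _ in range(iters):
        heads_a = tails_a = heads_b = tails_b = 0.0
        for h, t in flips:
            # E-step: posterior responsibility of coin A (uniform prior over coins)
            la = theta_a ** h * (1 - theta_a) ** t
            lb = theta_b ** h * (1 - theta_b) ** t
            ra = la / (la + lb)
            # accumulate expected counts for the M-step
            heads_a += ra * h; tails_a += ra * t
            heads_b += (1 - ra) * h; tails_b += (1 - ra) * t
        # M-step: MLE from expected counts
        theta_a = heads_a / (heads_a + tails_a)
        theta_b = heads_b / (heads_b + tails_b)
    return theta_a, theta_b

flips = [(9, 1), (8, 2), (4, 6), (5, 5), (7, 3)]
ta, tb = em_two_coins(flips)   # ta drifts toward the high-heads coin, tb toward the fair one
```

A VAE makes the same latent-variable idea amortized and gradient-based: an encoder network replaces the exact E-step, and the ELBO replaces the expected complete-data log-likelihood.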
