This repo contains my work for Andrej Karpathy's "Neural Networks: Zero to Hero" course.
It is a fantastic series of lectures and exercises that dives deep into the fundamentals of neural nets and deep learning in a very hands-on, first-principles way. For example, in the very first lecture, we implement backpropagation from scratch without using any libraries (no PyTorch, NumPy, etc.).
I highly recommend the course to anyone who wants to acquire foundational skills for building and training neural networks.
https://karpathy.ai/zero-to-hero.html
https://github.com/karpathy/nn-zero-to-hero
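To give a flavor of the from-scratch approach in Lecture 1, here is a minimal scalar autograd sketch in plain Python. This is my own illustrative simplification, not the course's micrograd code: a `Value` class that tracks gradients through `+` and `*` and runs backpropagation over the computation graph.

```python
class Value:
    """A scalar that tracks its gradient through + and * (illustrative sketch)."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # how to propagate grad to children
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            # d(out)/d(self) = 1, d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad

        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            # d(out)/d(self) = other.data, d(out)/d(other) = self.data
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad

        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)

        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()


# Example: d = a*b + a  =>  dd/da = b + 1 = 4, dd/db = a = 2
a, b = Value(2.0), Value(3.0)
d = a * b + a
d.backward()
print(a.grad, b.grad)  # 4.0 2.0
```

The real micrograd in Lecture 1 adds more operations (e.g. `tanh`, `pow`) and builds small MLPs on top of this same idea.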
| Link | Title |
|---|---|
| Lecture 1 | The spelled-out intro to neural networks and backpropagation: building micrograd |
| Lecture 2 | The spelled-out intro to language modeling: building makemore |
| Lecture 3 | Building makemore Part 2: MLP |
| Lecture 4 | Building makemore Part 3: Activations & Gradients, BatchNorm |
| Lecture 5 | Building makemore Part 4: Becoming a Backprop Ninja |
| Lecture 6 | Building makemore Part 5: Building WaveNet |
| Lecture 7 | Let's build GPT: from scratch, in code, spelled out |