Full Stack Deep Learning - Course Spring 2021
Info
This is the page for the 2021 edition of the course. For the 2022 edition, click here.
We've updated and improved our materials for our 2021 course taught at UC Berkeley and online.
Synchronous Online Course
We offered a paid synchronous option for those who wanted weekly assignments, a capstone project, Slack discussion, and a certificate of completion.
Enter your email below or follow us on Twitter to be the first to hear about future offerings of this option.
Week 1: Fundamentals
We do a blitz review of the fundamentals of deep learning, and introduce the codebase we will be working on in labs for the remainder of the class.
Reading:
How the backpropagation algorithm works
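To make the backpropagation reading concrete, here is a minimal NumPy sketch of one forward and backward pass through a one-hidden-layer network. It is a toy illustration with made-up shapes, not code from the course labs.

```python
import numpy as np

# Illustrative only: a one-hidden-layer network trained on a single
# example with squared-error loss. Shapes and names are our own,
# not the course codebase's.
rng = np.random.default_rng(0)
x, y = rng.normal(size=3), np.array([1.0])          # one training example
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(1, 4))

# Forward pass
z1 = W1 @ x                 # hidden pre-activation, shape (4,)
a1 = np.maximum(z1, 0.0)    # ReLU
y_hat = W2 @ a1             # output, shape (1,)
loss = 0.5 * np.sum((y_hat - y) ** 2)

# Backward pass: chain rule, layer by layer
dy = y_hat - y              # dL/dy_hat
dW2 = np.outer(dy, a1)      # dL/dW2, shape (1, 4)
da1 = W2.T @ dy             # dL/da1, shape (4,)
dz1 = da1 * (z1 > 0)        # ReLU passes gradient only where z1 > 0
dW1 = np.outer(dz1, x)      # dL/dW1, shape (4, 3)

# One gradient-descent step
lr = 0.1
W1 -= lr * dW1
W2 -= lr * dW2
```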
Week 2: CNNs
We cover CNNs and Computer Vision Applications, and introduce a CNN in lab.
Reading:
A brief introduction to Neural Style Transfer
Improving the way neural networks learn
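As a taste of what the lab introduces, a minimal PyTorch CNN for 28x28 grayscale images might look like the sketch below; the layer sizes are placeholders, not the lab's actual architecture.

```python
import torch
from torch import nn

# A minimal CNN for 28x28 grayscale inputs (e.g., MNIST-like data).
# Layer sizes are illustrative, not the lab's actual model.
class SimpleCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 28x28 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 14x14
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

logits = SimpleCNN()(torch.randn(8, 1, 28, 28))  # shape: (8, 10)
```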
Week 3: RNNs
We cover RNNs and applications in Natural Language Processing, and start doing sequence processing in lab.
Reading:
The Unreasonable Effectiveness of Recurrent Neural Networks
Attention Craving RNNS: Building Up To Transformer Networks
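For sequence processing of the kind the lab starts on, a minimal character-level recurrent model in PyTorch could look like this; the vocabulary size and dimensions are made up:

```python
import torch
from torch import nn

# A minimal character-level sequence model: embed tokens, run an LSTM,
# and predict a distribution over the vocabulary at each time step.
class CharRNN(nn.Module):
    def __init__(self, vocab_size: int = 50, hidden_size: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, 32)
        self.lstm = nn.LSTM(input_size=32, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(self.embed(tokens))  # (batch, seq, hidden)
        return self.head(out)                   # (batch, seq, vocab)

logits = CharRNN()(torch.randint(0, 50, (4, 20)))  # shape: (4, 20, 50)
```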
Week 4: Transformers
We talk about the successes of transfer learning and the Transformer architecture, and start using it in lab.
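The core operation of the Transformer architecture is scaled dot-product attention. Here is a minimal single-head version in PyTorch, without the masking or multi-head projections of the full architecture:

```python
import math
import torch

def scaled_dot_product_attention(q, k, v):
    """softmax(Q K^T / sqrt(d)) V -- single head, no masking."""
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    return torch.softmax(scores, dim=-1) @ v

# Self-attention: queries, keys, and values all come from the same input.
x = torch.randn(2, 10, 64)                   # (batch, seq, dim)
out = scaled_dot_product_attention(x, x, x)  # shape: (2, 10, 64)
```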
Week 5: ML Projects
Our synchronous online course begins with the first "Full Stack" lecture: Setting up ML Projects.
- Lecture 5: Setting up ML Projects (with detailed notes)
Reading:
ML Yearning (and subscribe to Andrew Ng's newsletter)
Those in the synchronous online course will have their first weekly assignment: Assignment 1, available on Gradescope.
Week 6: Infra & Tooling
We tour the landscape of infrastructure and tooling for deep learning.
- Lecture 6: Infrastructure & Tooling (with detailed notes)
Reading:
Machine Learning: The High-Interest Credit Card of Technical Debt
Those in the synchronous online course will work on Assignment 2.
Week 7: Troubleshooting
We talk about how to best troubleshoot training. In lab, we learn to manage experiments.
- Lecture 7: Troubleshooting DNNs (with detailed notes)
- Lab 5: Experiment Management
Those in the synchronous online course will work on Assignment 3.
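The heart of experiment management is recording each run's config and metrics in a tracker so that runs are comparable later. Below is a minimal sketch using Weights & Biases, one such tracker; the project name and the stand-in training loop are placeholders, not the lab's code.

```python
import random
import wandb  # requires `pip install wandb` and `wandb login`

# Record the config up front, then log metrics as training progresses.
config = {"lr": 3e-4, "batch_size": 64, "epochs": 3}
run = wandb.init(project="fsdl-example", config=config)  # hypothetical project name

for epoch in range(config["epochs"]):
    # Stand-in metric in place of a real training loop.
    train_loss = 1.0 / (epoch + 1) + random.random() * 0.05
    wandb.log({"epoch": epoch, "train_loss": train_loss})

run.finish()
```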
Week 8: Data
We talk about Data Management, and label some data in lab.
- Lecture 8: Data Management (with detailed notes)
- Lab 6: Data Labeling
Reading:
Emerging architectures for modern data infrastructure
Those in the synchronous online course will work on Assignment 4.
Week 9: Ethics
We discuss ethical considerations. In lab, we move from recognizing lines of text to recognizing paragraphs.
- Lecture 9: AI Ethics (with detailed notes)
- Lab 7: Paragraph Recognition
Those in the synchronous online course will have to submit their project proposals.
Week 10: Testing
We talk about Testing and Explainability, and set up Continuous Integration in lab.
- Lecture 10: Testing & Explainability (with detailed notes)
- Lab 8: Testing & CI
Those in the synchronous online course will work on their projects.
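The tests wired into CI in a lab like this are typically cheap structural checks rather than accuracy benchmarks. A pytest-style sketch, with a stand-in model in place of the course codebase's:

```python
import torch
from torch import nn

# Stand-in for the real model under test.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))

def test_output_shape():
    # The model should map a batch of images to one logit per class.
    x = torch.randn(4, 1, 28, 28)
    assert model(x).shape == (4, 10)

def test_eval_mode_is_deterministic():
    # In eval mode, identical inputs should give identical outputs.
    model.eval()
    x = torch.randn(4, 1, 28, 28)
    with torch.no_grad():
        assert torch.equal(model(x), model(x))
```

Run locally with `pytest`; the CI job then runs the same command on every push.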
Week 11: Deployment
We cover Deployment and Monitoring, and package up our model for deployment in lab.
- Lecture 11: Deployment & Monitoring (with detailed notes)
- Lab 9: Web Deployment
Those in the synchronous online course will work on their projects.
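Packaging a model for the web generally means wrapping inference in an HTTP endpoint. Here is a generic sketch using FastAPI, one common choice and not necessarily the lab's exact stack; with the file saved as app.py, it runs under `uvicorn app:app`.

```python
from typing import List

import torch
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = torch.nn.Linear(4, 2)  # stand-in for a real trained model
model.eval()

class PredictRequest(BaseModel):
    features: List[float]  # expects 4 values for this stand-in model

@app.post("/predict")
def predict(req: PredictRequest):
    # Run inference without tracking gradients and return the top class.
    with torch.no_grad():
        logits = model(torch.tensor(req.features))
    return {"prediction": int(logits.argmax().item())}
```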
Week 12: Research
We talk about research directions, and set up robust monitoring for our model.
- Lecture 12: Research Directions (with detailed notes)
- Lab 10: Monitoring
Those in the synchronous online course will work on their projects.
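Monitoring often starts with simple distribution checks on model outputs. A toy sketch of a confidence-drift alert; the data, metric, and threshold are all made up for illustration:

```python
import numpy as np

def mean_confidence_drift(baseline: np.ndarray, live: np.ndarray) -> float:
    """Absolute difference in mean prediction confidence."""
    return float(abs(baseline.mean() - live.mean()))

rng = np.random.default_rng(0)
baseline = rng.beta(8, 2, size=1000)  # model was confident during validation
live = rng.beta(4, 4, size=200)       # production traffic looks different

if mean_confidence_drift(baseline, live) > 0.1:
    print("Alert: prediction confidence has drifted; inspect recent inputs.")
```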
Week 13: Teams
We discuss ML roles and team structures, as well as big companies vs startups.
- Lecture 13: ML Teams & Startups (with detailed notes)
- Panel Discussion: Do I need a PhD to work in ML?
Weeks 14-16: Projects
Those in the synchronous online course will submit 5-minute videos of their projects and associated write-ups by May 15.
Check out the course projects showcase.
Other Resources
Fast.ai is a great free two-course sequence aimed at first getting hackers to train state-of-the-art models as quickly as possible, and only afterward delving into how things work under the hood. Highly recommended for anyone.
Dive Into Deep Learning is a great free textbook with Jupyter notebooks for every part of deep learning.
NYU's Deep Learning course has excellent PyTorch breakdowns of everything important going on in deep learning.
Stanford's ML Systems Design course has lectures that parallel those in this course.
The Batch by Andrew Ng is a great weekly update on progress in the deep learning world.
/r/MachineLearning/ is the best community for staying up to date with the latest developments.