This is an introductory undergraduate course in machine learning. The class will briefly cover linear regression, linear classification, fully connected neural networks, convolutional neural networks, recurrent neural networks, and deep learning.
Prerequisites: You should understand basic probability and statistics, and college-level algebra and calculus. For the programming assignments, you should have some programming background; knowing Python and PyTorch will be helpful.
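As a quick way to confirm the programming environment is ready (a suggested check, not part of the official setup documentation), the snippet below prints the installed Python and PyTorch versions and runs a tiny tensor computation.

```python
# A minimal environment sanity check (a suggestion, not official course code):
# confirm that Python and PyTorch are installed and working before starting
# the programming assignments.
import sys
import torch

print("Python version:", sys.version.split()[0])
print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())  # False is expected for the CPU-only install

# A tiny tensor computation to confirm the install works end to end.
x = torch.arange(6, dtype=torch.float32).reshape(2, 3)
print((x @ x.T).shape)  # expected output: torch.Size([2, 2])
```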
1 Introduction
1.1 Computer vision
1.2 Deep learning
1.3 Natural language processing
2 Learning Foundations
2.1 Model evaluation
2.2 Model parameter selection
2.3 Supervised learning
2.4 Unsupervised learning
2.5 Gradient descent optimization
3 Linear Models
3.1 Linear models for regression
3.2 Linear models for classification
4 Fully Connected Neural Networks
4.1 Neural networks basics
4.2 Feed-forward network functions
4.3 Error backpropagation
4.4 A simple example
5 Convolutional Neural Networks
5.1 Convolutional neural networks basics
5.2 Neural computations in a CNN
5.3 The equations of a forward pass through a CNN
5.4 The equations of backpropagation used to train CNNs
5.5 Classic network architectures
5.6 A classification example
6 Recurrent Neural Networks
6.1 Recurrent neural networks basics
6.2 The equations of a forward pass through an RNN
6.3 The equations of backpropagation used to train RNNs
6.4 RNN architectures
6.5 A natural language processing example
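As a preview of topics 2.5 (gradient descent optimization) and 3.1 (linear models for regression), here is a minimal sketch, not course material: a linear regression model fit with batch gradient descent on synthetic data. The data, learning rate, and step count are assumptions chosen only for illustration.

```python
# Minimal sketch: linear regression trained with batch gradient descent.
# All data and hyperparameters below are made up for illustration.
import torch

torch.manual_seed(0)

# Synthetic data: 100 samples, 2 features, generated from known parameters.
true_w = torch.tensor([2.0, -3.4])
true_b = 4.2
X = torch.randn(100, 2)
y = X @ true_w + true_b + 0.01 * torch.randn(100)

# Model parameters, initialized at zero.
w = torch.zeros(2)
b = torch.tensor(0.0)
lr, steps = 0.1, 200

for _ in range(steps):
    y_hat = X @ w + b                  # forward pass: predictions
    err = y_hat - y
    grad_w = 2 * X.T @ err / len(y)    # gradient of the mean squared error w.r.t. w
    grad_b = 2 * err.mean()            # gradient of the mean squared error w.r.t. b
    w = w - lr * grad_w                # gradient descent update (topic 2.5)
    b = b - lr * grad_b

print(w, b)  # should be close to true_w and true_b
```

The same loop structure (forward pass, loss, gradient, update) carries over to the neural network chapters; only the model and the gradient computation change.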
Week | Topic | Material | Assignments |
---|---|---|---|
Feb 21-28 | 1 Introduction 1.1 Computer vision 1.2 Deep learning 1.3 Natural language processing | Asynchronous class: [Lv21] Chap 1 (slides) (handouts); Linear Algebra and Probability Review (part 1: Linear Algebra, part 2: Probability) | Python Documentation; Python Documentation (Chinese version); PyTorch Setup Documentation (installation instructions) (CPU version) (GPU version) |
Mar 7-14 | 2.1 Model evaluation 2.2 Model parameter selection 2.3 Supervised learning 2.4 Unsupervised learning 2.5 Gradient descent optimization | Asynchronous class: [Lv21] Chap 3 (slides) (updated handouts) | PyTorch Documentation; PyTorch Documentation (Chinese version) |
Mar 21 - Apr 4 | 3.1 Linear models for regression 3.2 Linear models for classification | Asynchronous class: [Lv21] Chap 5 (slides) (handouts) | Linear regression implementation from scratch; Implementation of Softmax Regression from Scratch |
Apr 11-25 | 4.1 Neural networks basics 4.2 Feed-forward network functions 4.3 Error backpropagation 4.4 A simple example | Asynchronous class: [Lv21] Chap 6 (slides); [Bis06] Chap 6 (handouts) | Implementation of Multilayer Perceptrons from Scratch |
May 2-30 | 5.1 Convolutional neural networks basics 5.2 Neural computations in a CNN 5.3 The equations of a forward pass through a CNN 5.4 The equations of backpropagation used to train CNNs 5.5 Classic network architectures 5.6 A classification example | Asynchronous class: [Lv21] Chap 7 (slides); [Gonz18] Chap 7 (handouts); [Fag21] Representation Learning and Deep Learning (slides); ConvNetJS demos; Feature visualization | |
Jun 6-20 | 6.1 Recurrent neural networks basics 6.2 The equations of a forward pass through an RNN 6.3 The equations of backpropagation used to train RNNs 6.4 RNN architectures 6.5 A natural language processing example | Asynchronous class: [Lv21] Chap 8 (slides) (handouts) | |
Jun 25 | Final Project | | |
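The from-scratch assignments in the schedule (linear regression, softmax regression, multilayer perceptrons) all follow the same pattern: define parameters, run a forward pass, compute a loss, backpropagate, and take a gradient descent step. The sketch below is not an assignment solution; it is a minimal one-hidden-layer fully connected network trained on random data with PyTorch autograd, and the layer sizes, learning rate, and data are assumptions for illustration only.

```python
# Minimal sketch of a one-hidden-layer fully connected network (Chapter 4)
# trained with softmax cross-entropy and plain SGD. Not assignment code;
# the data, layer sizes, and hyperparameters are made up for illustration.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Fake classification data: 256 samples, 20 features, 3 classes.
X = torch.randn(256, 20)
y = torch.randint(0, 3, (256,))

# Parameters of a 20 -> 64 -> 3 network, tracked by autograd.
W1 = (0.1 * torch.randn(20, 64)).requires_grad_()
b1 = torch.zeros(64, requires_grad=True)
W2 = (0.1 * torch.randn(64, 3)).requires_grad_()
b2 = torch.zeros(3, requires_grad=True)
params = [W1, b1, W2, b2]
lr = 0.5

for epoch in range(100):
    hidden = torch.relu(X @ W1 + b1)   # forward pass: hidden layer (4.2)
    logits = hidden @ W2 + b2          # forward pass: output scores
    loss = F.cross_entropy(logits, y)  # softmax + negative log-likelihood
    loss.backward()                    # error backpropagation (4.3)
    with torch.no_grad():
        for p in params:
            p -= lr * p.grad           # gradient descent step (2.5)
            p.grad = None              # clear gradients for the next epoch

print("final training loss:", loss.item())
```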