Course description
Instructor
Jingli Gao
Associate Professor
School of Software Engineering
Pingdingshan University
Course goal
This course is an introductory undergraduate course in machine learning. The class will briefly cover topics in linear regression, linear classification, fully connected neural networks, convolutional neural networks, recurrent neural networks, and deep learning.
Prerequisites: You should understand basic probability and statistics, as well as college-level algebra and calculus. For the programming assignments, you should have some background in programming; familiarity with Python and PyTorch is helpful.
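If you want to gauge whether your background is sufficient, the following is a minimal sketch of the style of code used in the labs: a toy linear regression fitted with gradient descent. It assumes PyTorch is installed; the data, learning rate, and step count are illustrative choices, not values prescribed by the course.

```python
# A minimal sketch, assuming PyTorch is installed. Fits y = 2x + 1 with
# gradient descent; all hyperparameters here are illustrative.
import torch

x = torch.linspace(0, 1, 100).unsqueeze(1)    # 100 inputs, shape (100, 1)
y = 2 * x + 1 + 0.05 * torch.randn_like(x)    # noisy linear targets

model = torch.nn.Linear(1, 1)                 # one weight and one bias
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

for _ in range(500):
    optimizer.zero_grad()          # clear gradients from the previous step
    loss = loss_fn(model(x), y)    # mean squared error over the batch
    loss.backward()                # compute gradients by backpropagation
    optimizer.step()               # one gradient descent update

print(model.weight.item(), model.bias.item())  # should approach 2 and 1
```

If this runs and the printed values are close to 2 and 1, your Python and PyTorch setup is ready for the programming assignments.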
Course topics
1 Introduction
1.1 Computer vision
1.2 Natural language processing
2 Learning Foundations
2.1 Machine learning in daily life
2.2 Key components of machine learning
2.3 Types of machine learning problems
3 Linear Models
3.1 Linear models for regression
3.2 Linear models for classification
4 Fully Connected Neural Networks
4.1 Neural networks basics
4.2 Feed-forward network functions
4.3 Error backpropagation
4.4 A simple example (a code sketch follows this topic list)
5 Convolutional Neural Networks
5.1 Convolutional neural networks basics
5.2 Neural computations in a CNN
5.3 The equations of a forward pass through a CNN
5.4 The equations of backpropagation used to train CNNs
5.5 Classic network architectures
5.6 A classification example
6 Recurrent Neural Networks
6.1 Recurrent neural networks basics
6.2 The equations of a forward pass through an RNN
6.3 The equations of backpropagation used to train RNNs
6.4 RNN architectures
6.5 A natural language processing example
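As a taste of Topic 4, here is a minimal sketch of a feed-forward pass and one error backpropagation step in PyTorch. The layer sizes, batch size, and number of classes are illustrative assumptions, not values used in the course.

```python
# A minimal sketch of Topics 4.2-4.3: a forward pass through a small fully
# connected network and one backpropagation step. All sizes are illustrative.
import torch

net = torch.nn.Sequential(
    torch.nn.Linear(4, 8),   # input features -> hidden units
    torch.nn.ReLU(),         # nonlinearity between layers
    torch.nn.Linear(8, 3),   # hidden units -> scores for 3 classes
)

x = torch.randn(16, 4)                 # a batch of 16 random inputs
labels = torch.randint(0, 3, (16,))    # random class labels for the batch
scores = net(x)                        # feed-forward network function (4.2)
loss = torch.nn.functional.cross_entropy(scores, labels)
loss.backward()                        # error backpropagation (4.3)

# Each parameter now holds its gradient; an optimizer would apply the update.
print(net[0].weight.grad.shape)        # torch.Size([8, 4])
```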
Evaluation and grading policy
- Assignments 50%
- Final project 50%
Course resources
References
- [Lv21] 吕云翔 (Lv, Y.) and 刘卓然 (Liu, Z.). PyTorch深度学习实战:微课视频版 (PyTorch Deep Learning in Practice: Micro-course Video Edition), Tsinghua University Press, 2021.
- [Xu22] 徐亦达 (Xu, Y.). Machine Learning Class.
- [Gon18] Gonzalez, R. C. and Woods, R. E. Digital Image Processing, 4th Edition, Pearson, 2018.
- [Gut22] Gutmann, M. U. Pen and Paper Exercises in Machine Learning, 2022.
- [Alp14] Alpaydin, E. Introduction to Machine Learning, 3rd Edition, The MIT Press, 2014.
- [Fag21] González, F. A. Machine Learning, 2021.
- [Mur12] Murphy, K. P. Machine Learning: A Probabilistic Perspective, The MIT Press, 2012.
- [Bar13] Barber, D. Bayesian Reasoning and Machine Learning, Cambridge University Press, 2013.
- [Bis06] Bishop, C. Pattern Recognition and Machine Learning, Springer-Verlag, 2006.
- [HTF09] Hastie, T., Tibshirani, R., and Friedman, J. The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd Edition, Springer, 2009.
- [GBC16] Goodfellow, I., Bengio, Y., and Courville, A. Deep Learning, MIT Press, 2016.
- [Zha21] Zhang, A., Lipton, Z. C., Li, M., and Smola, A. J. Dive into Deep Learning, 2021.
- [Mit97] Mitchell, T. M. Machine Learning, 1st Edition, McGraw-Hill Higher Education, 1997.
- [DHS00] Duda, R. O., Hart, P. E., and Stork, D. G. Pattern Classification, 2nd Edition, Wiley-Interscience, 2000.
- [SC04] Shawe-Taylor, J. and Cristianini, N. Kernel Methods for Pattern Analysis, Cambridge University Press, 2004.
- [SS02] Schölkopf, B. and Smola, A. J. Learning with Kernels, MIT Press, 2002.
- [OCW-ML] 6.867 Machine Learning, Fall 2006, MIT OpenCourseWare.
- [STANFD-ML] Ng, A. CS229 Machine Learning, Stanford University.
Additional resources
- SciPy: a scientific, mathematical, and engineering package for Python
- scikit-learn: a machine learning add-on for SciPy
- Kaggle: a data science competition platform with many interesting data sets and competitions with prizes.
- CMU Deep Learning Systems: Algorithms and Implementation: taught by Tianqi Chen and Zico Kolter.
- Stanford Deep Learning for Computer Vision Course: one of the best computer vision courses, taught by Fei-Fei Li.
- Coursera Machine Learning Course: one of the first (and still one of the best) machine learning MOOCs taught by Andrew Ng.
- Stanford Statistical Learning Course: an introductory course with a focus on supervised learning, taught by Trevor Hastie and Rob Tibshirani.
- Introduction to Machine Learning: the course content comes from specific chapters of Pattern Recognition and Machine Learning by Chris Bishop.
Course schedule
| Week | Topic | Material | Labs |
|---|---|---|---|
| Feb 21-28 | 1.1 Introduction 1.2 Computer vision 1.3 Deep learning 1.4 Natural language processing | Asynchronous class: [Lv21] Chap 1 (slides1) (handouts); Linear Algebra and Probability Review (part 1 Linear Algebra, part 2 Probability) | Python Documentation, Python Documentation (Chinese version), PyTorch Documentation, PyTorch Documentation (Chinese version), PyTorch Setup Documentation (installation guide) (CPU version) (GPU version) |
| Mar 7-14 | 2.1 Model evaluation 2.2 Model parameter selection 2.3 Supervised learning 2.4 Unsupervised learning 2.5 Gradient descent optimization | Asynchronous class: [Lv21] Chap 3 (slides) (updated handouts) | Lab1-Tensor (Lab 1) |
| Mar 21-28, Apr 4 | 3.1 Linear models for regression 3.2 Linear models for classification | Asynchronous class: [Lv21] Chap 5 (slides) (handouts) | Lab2-Linear-Model (Lab 2) |
| Apr 11-25 | 4.1 Neural networks basics 4.2 Feed-forward network functions 4.3 Error backpropagation 4.4 A simple example | Asynchronous class: [Lv21] Chap 6 (slides) [Bis06] Chap 6 (handouts) | Lab3-Multilayer-Perceptrons-Model (Lab 3) |
| May 2-30 | 5.1 Convolutional neural networks basics 5.2 Neural computations in a CNN 5.3 The equations of a forward pass through a CNN 5.4 The equations of backpropagation used to train CNNs 5.5 Classic network architectures 5.6 A classification example | Asynchronous class: [Lv21] Chap 7 (slides) [Gon18] Chap 7 (handouts) | Lab4-Convolutional-Neural-Networks-Model (Lab 4) |
| Jun 6-20 | 6.1 Recurrent neural networks basics 6.2 The equations of a forward pass through an RNN 6.3 The equations of backpropagation used to train RNNs 6.4 RNN architectures 6.5 A natural language processing example | Asynchronous class: [Lv21] Chap 8 (slides) (handouts) | Lab5-Recurrent-Neural-Networks-Model (Lab 5) |
| Jun 25 | Final project | | |