1. Basic Information
- Instructor: Yangfeng Ji
- Semester: Spring 2023
- Location: Rice 130
- Time: Tuesday and Thursday 11:00 AM – 12:15 PM
- TA: TBA
- Office Hours: TBA
2. Course Description
- The goal of this course is to understand basic concepts and algorithms in machine learning.
- Teaching how to use machine learning packages (e.g., sklearn) will not be a focus of this course, although we do provide demos built on these packages to explain the concepts and algorithms.
- Deep learning (or neural network modeling) will only be a small part of this course.
Most of the course materials (and assignments) are adapted from Shalev-Shwartz and Ben-David's textbook on machine learning. In particular, the topics covered in this course are:
- Introduction to learning theory
- Linear classification and regression
- Support vector machines and kernel methods
- Model selection and validation
- Neural networks and deep learning
- Generative modeling
For more information about this course, please check out the [schedule](schedule.
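To preview the kind of algorithm covered under linear classification, here is a minimal perceptron sketch in plain Python. This is our own illustrative sketch (the toy data and function name are not from the course materials):

```python
# Minimal perceptron for linearly separable 2-D toy data.
def perceptron(data, labels, epochs=20):
    # data: list of (x1, x2) points; labels: +1 / -1
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(data, labels):
            # update weights only when the current prediction is wrong
            if y * (w[0] * x1 + w[1] * x2 + b) <= 0:
                w[0] += y * x1
                w[1] += y * x2
                b += y
    return w, b

# Toy data: points above the line x2 = x1 are labeled +1.
data = [(0.0, 1.0), (1.0, 2.0), (2.0, 0.0), (3.0, 1.0)]
labels = [1, 1, -1, -1]
w, b = perceptron(data, labels)
preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else -1 for x1, x2 in data]
print(preds)  # matches labels, since the toy data are separable
```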
3. Prerequisites
In addition to programming skills, basic probability theory and linear algebra are highly recommended.
- Programming and Algorithms: CS 2150 or CS 3100 with a grade of C- or better
- We will use Python for our homework assignments
- Probability: APMA 3100, APMA 3110, MATH 3100, or equivalent
- Example topics: definition of probability, basic probability distributions (e.g., Gaussian, Bernoulli, and multinomial)
- Linear Algebra: Math 3350 or APMA 3080 or equivalent
- Example topics: matrix-vector multiplication, special matrices, linear transformations, matrix factorization (e.g., SVD)
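As a quick self-check on two of the example topics above (not part of the official materials), here is a short NumPy sketch of matrix-vector multiplication and SVD:

```python
import numpy as np

# Matrix-vector multiplication: y = A x
A = np.array([[1.0, 2.0], [3.0, 4.0]])
x = np.array([1.0, -1.0])
y = A @ x  # [1 - 2, 3 - 4] = [-1, -1]

# SVD: factor A into U, singular values s, and V^T,
# then reconstruct A from its factors.
U, s, Vt = np.linalg.svd(A)
A_rebuilt = U @ np.diag(s) @ Vt
print(y, np.allclose(A, A_rebuilt))
```

If expressions like these look unfamiliar, reviewing the linear algebra prerequisite before the course starts is a good idea.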
Recommended reference books:
- [UML] Shalev-Shwartz and Ben-David. Understanding Machine Learning: From Theory to Algorithms. 2014
- [DL] Goodfellow, Bengio, and Courville. Deep Learning. 2016
- [PRML] Bishop. Pattern Recognition and Machine Learning. 2006
- [MRT] Mohri, Rostamizadeh, and Talwalkar. Foundations of Machine Learning. 2nd Edition. 2018
4. Assignments and Final Project