ECE-5424G / CS-5824: Advanced Machine Learning Spring 2019

Course Information


  • Where: Room 118C, Surge Space Building

  • When: Monday and Wednesday 2:30 – 3:45 PM

Instructor & Teaching Assistants

Jia-Bin Huang (Instructor)
Chen Gao (Teaching Assistant)
Shih-Yang Su (Teaching Assistant)


Prerequisites

  • Proficiency in Python: We will use NumPy and PyTorch for class assignments. Check out the Python review.
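As a rough self-check of the expected NumPy proficiency, here is a short illustrative sketch (not an actual assignment): computing nearest-neighbor distances with broadcasting, the kind of vectorized operation the homework will rely on.

```python
# Illustrative self-check, not course code: vectorized Euclidean distances
# from one query point to a small set of training points via broadcasting.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((5, 3))   # 5 "training" points in R^3
q = rng.random(3)        # one query point

# Broadcasting subtracts q from every row of X at once; then square,
# sum over the feature axis, and take the root to get distances.
dists = np.sqrt(((X - q) ** 2).sum(axis=1))
nearest = int(np.argmin(dists))  # index of the closest training point
```

If an expression like the broadcasting step above is unfamiliar, the Python review linked above is worth working through before the first assignment.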

Textbook and optional references

Lectures are not based on any particular textbook. Useful references include

  • Pattern Recognition and Machine Learning by Christopher Bishop [Link]

  • Machine Learning: A Probabilistic Perspective by Kevin Murphy [Link]

  • Deep Learning by Ian Goodfellow and Yoshua Bengio and Aaron Courville [Link]

  • Probabilistic Graphical Models by Daphne Koller and Nir Friedman. [Link]


Please use Piazza for all communications. Please do not email the instructor or TAs directly.

Disability-related academic adjustments

To obtain disability-related academic adjustments and/or auxiliary aids, students with disabilities must contact the course instructor and Services for Students with Disabilities (SSD) as soon as possible. To contact SSD, you may visit Suite 310 in Lavery Hall or reach SSD via email.


Regular attendance is expected. Lecture slides will be posted on the course website, but they will be difficult to interpret without attending lectures.

Office hours

Name                 VT Email    Office                 Office Hours
Jia-Bin Huang        jbhuang     440 Whittemore Hall    Mon 3:45 - 4:45
Chen Gao (TA)        chengao     XXX Whittemore         TBD
Shih-Yang Su (TA)    shihyang    XXX Whittemore         TBD

Academic integrity

Feel free to discuss homework with your classmates, but please refrain from showing or sharing any code. Existing code from the Internet may not be used in your project assignments unless specifically approved by the course instructor. Be sure to acknowledge any help you receive from other students or outside sources, even if it is just a small suggestion. Note that violations of academic integrity will go on record at the university and will result in zero points for the entire assignment. Please read the following Honor Code pledge.

The Undergraduate Honor Code pledge that each member of the university community agrees to abide by states:

“As a Hokie, I will conduct myself with honor and integrity at all times. I will not lie, cheat, or steal, nor will I accept the actions of those who do.”

Students enrolled in this course are responsible for abiding by the Honor Code. A student who has doubts about how the Honor Code applies to any assignment is responsible for obtaining specific guidance from the course instructor before submitting the assignment for evaluation. Ignorance of the rules does not exclude any member of the University community from the requirements and expectations of the Honor Code.

For additional information about the Honor Code, please visit the Virginia Tech Honor System website.


Assignments and Grading

  • Homeworks (50%): There are seven homework assignments in total (HW0 - HW6).

  • Midterm exam (10%)

  • Final exam (15%)

  • Final project (25%)

Due dates

All problem sets/reports are to be submitted through Canvas by the due date noted on the assignment. Deadlines are firm.

Late policy

You are expected to complete assignments on time. Late assignments will incur a penalty of 10% per day. Throughout the term you have an allowance of six free late days for your submissions, meaning you can accrue up to six days of late submissions with no penalty.
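The policy's arithmetic can be sketched as follows. This is an illustrative interpretation, not official grading code; in particular, it assumes free late days waive the penalty day-for-day and are consumed before any penalty applies.

```python
# Illustrative sketch of the late policy (assumed interpretation, not the
# official grading script): 10% penalty per day late, with free late days
# consumed first and waiving the penalty day-for-day.
def late_score(raw_score, days_late, free_days_remaining):
    """Return (adjusted_score, free_days_left) after applying the policy."""
    free_used = min(days_late, free_days_remaining)
    penalized_days = days_late - free_used
    penalty = 0.10 * penalized_days
    adjusted = max(0.0, raw_score * (1 - penalty))
    return adjusted, free_days_remaining - free_used

# Example: 2 days late with 1 free day left -> one day penalized at 10%.
score, left = late_score(100, 2, 1)
print(score, left)  # 90.0 0
```

Under this reading, once your six free days are exhausted, every further late day costs 10% of the assignment's score.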

Final project

The final project is a chance to explore a topic of interest in greater depth. Groups of up to four are highly encouraged; more is expected of larger groups. Each project will include a project report webpage. Various types of projects are possible: you could implement a paper that you find interesting, build on something discussed in class, significantly extend one of the course projects, or design something entirely your own. The work does not have to be of publishable originality. However, you are encouraged to submit revised versions of strong projects to top machine learning conferences.

  • Research project: Perform a project on a topic of your choice. Formulate a goal, devise an approach, and evaluate it. When proposing, indicate what dataset you will use for evaluation. For example, you could base your project on an existing paper and try to improve its accuracy or speed with some modification. You could also apply existing algorithms to your own field (e.g., robotics).

  • Review and implement a paper: Choose a paper or set of papers and write a scholarly review. Then, implement and evaluate the algorithm. If done in a group, more than one paper should be implemented and compared. Reviews should be written independently by each person, but the group can collaborate on implementation and evaluation.

Schedule

Jan 23 (Wed): Introduction
  - Machine learning overview
  - Course logistics
  Materials: [PPT] [PDF]. Events: Homework 0 Out.

Jan 28 (Mon): K-Nearest Neighbor
  - Learning from data
  - Curse of dimensionality

Jan 30 (Wed): Linear Regression
  - Cost function
  - Gradient descent
  - Features and polynomial regression

Feb 04 (Mon): Naive Bayes
  - Conditional independence
  - Naive Bayes: why and how
  Materials: [PPT] [PDF]. Events: Homework 0 Due, Homework 1 Out.

Feb 06 (Wed): Logistic Regression
  - Maximizing conditional likelihood
  - Multi-class classification

Feb 11 (Mon): Regularization
  - Over-fitting/under-fitting
  - Regularized linear/logistic regression

Feb 13 (Wed): SVM I
  - Linear SVM: primal and dual forms
  Materials: [PPT] [PDF].

Feb 18 (Mon): SVM II
  - Kernel methods
  Materials: [PPT] [PDF]. Events: Homework 1 Due, Homework 2 Out.

Feb 20 (Wed): Deep Neural Networks I
  - Model representation
  Materials: [PPT] [PDF].

Feb 25 (Mon): Deep Neural Networks II
  - Model training: backpropagation
  Materials: [PPT] [PDF].

Feb 27 (Wed): Diagnosing ML systems
  - Hypothesis evaluation
  - Bias-variance tradeoff
  - Model/feature selection

Mar 04 (Mon): Midterm Review
  Materials: [PPT] [PDF]. Events: Homework 2 Due, Homework 3 Out.

Mar 06 (Wed): Midterm exam

Mar 11 (Mon): Spring break
Mar 13 (Wed): Spring break

Mar 18 (Mon): Clustering
  - Introduction to unsupervised learning
  - K-means

Mar 20 (Wed): No class

Mar 25 (Mon): EM and GMM
  - Expectation-maximization algorithm
  - Gaussian mixture model

Mar 27 (Wed): Dimensionality reduction
  - Motivation
  - Principal component analysis
  Materials: [PPT] [PDF]. Events: Homework 3 Due, Homework 4 Out.

Apr 01 (Mon): Anomaly Detection
  - Developing anomaly detection algorithms
  - Anomaly detection vs. supervised learning

Apr 03 (Wed): Recommender systems
  - Content-based recommendation systems
  - Collaborative filtering

Apr 08 (Mon): Semi-supervised learning
  - Label-propagation-based methods
  - Consistency-based methods

Apr 10 (Wed): Ensemble methods
  - Bagging
  - Gradient boosting
  - AdaBoost
  Materials: [PPT] [PDF] [Ensembles]. Events: Homework 4 Due, Homework 5 Out.

Apr 15 (Mon): Generative Models I
  - Variational auto-encoder
  - Auto-regressive methods
  Materials: [PPT] [PDF] [GenerativeModels].

Apr 17 (Wed): Generative Models II
  - Generative adversarial networks
  Materials: [PPT] [PDF].

Apr 22 (Mon): Sequence prediction models
  - RNN, LSTM, GRU, Transformer
  Materials: [PPT] [PDF]. Events: Homework 5 Due.

Apr 24 (Wed): Markov Decision Process
  - Introduction to reinforcement learning
  - Bellman equations
  - Value iteration and policy iteration
  Materials: [PPT] [PDF] [Reinforcement Learning].

Apr 29 (Mon): Q-learning, Policy Gradient, Actor-Critic
  - Tabular Q-learning
  - Value function approximation
  - Policy search

May 01 (Wed): Course Summary
  Materials: [PPT] [PDF].

May 06 (Mon): Final Project Presentation I

May 08 (Wed): Final Project Presentation II

Credits and Course Notes

The course material builds upon many preceding efforts, including excellent course projects and wonderful course notes. Feel free to use and modify any of the slides for academic and research purposes, but please credit the original sources where appropriate.