**Course Information**

- Instructor: Yangfeng Ji
- Semester: Fall 2021
- Location: Rice 340
- Time: Monday and Wednesday 3:30 PM - 4:45 PM
- TA: Wanyu Du
- Office Hours:
  - Yangfeng Ji: Tuesday 1:30 PM - 2:30 PM, Location: Rice 510/Zoom
  - Wanyu Du: Thursday 1:30 PM - 2:30 PM, Location: Zoom (temporarily)

- The lectures will also be streamed and recorded on Zoom. Students can find the Zoom link on Collab.

- Schedule
- Campuswire for online discussion. By the time of our first class, all students registered for this course should have received an invitation from Campuswire. Please let the instructor know if you haven't gotten one.
- Submission template for homework assignments

Natural language processing (NLP) seeks to provide computers with the ability to process and understand human language intelligently. Examples of NLP techniques include (i) automatically translating from one natural language to another, (ii) analyzing documents to answer related questions or make related predictions, and (iii) generating text to assist with story writing or to build conversational agents. This course, consisting of one fundamental part and one advanced part, will give an overview of modern NLP techniques.

This course will mainly focus on applying machine learning (particularly deep learning) techniques to natural language processing. NLP topics covered in this course include:

- Text classification
- Language modeling
- Word embeddings
- Machine translation and sequence-to-sequence models
- Some advanced topics: large-scale pre-trained language models (e.g., BERT), natural language generation, interpretability in NLP
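To give a flavor of the first topic above, here is a minimal text-classification sketch in Python with scikit-learn (one of the packages used in this course). The toy corpus and labels are purely illustrative, not course material:

```python
# Minimal text classification: bag-of-words features + logistic regression.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy corpus with binary sentiment labels (illustrative only).
texts = ["great movie", "terrible plot", "wonderful acting", "boring and dull"]
labels = [1, 0, 1, 0]

# Turn each document into a sparse bag-of-words count vector.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(texts)

# Fit a logistic regression classifier on the vectors.
clf = LogisticRegression()
clf.fit(X, labels)

# Predict the label of a new document.
pred = clf.predict(vectorizer.transform(["great acting"]))
print(pred[0])
```

The course goes well beyond this pipeline (e.g., replacing count vectors with learned word embeddings and neural classifiers), but homework code will follow this general fit/predict pattern.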

**Proficiency in Python**

This course requires some programming in both the homework assignments and the final project. The preferred programming language for this course is Python (with some additional packages such as SciPy, scikit-learn, and PyTorch).

**Calculus and Linear Algebra**

Multivariable derivatives, matrix/vector notation and operations, singular value decomposition, etc.

**Probability and Statistics**

Mean and variance, multinomial distribution, conditional dependence, maximum likelihood estimation, Bayes' theorem, etc.

**Foundations of Machine Learning**

Logistic regression, cross-validation, optimization with gradient descent, bias-variance decomposition, etc.
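As a rough self-check for the background above, you should be comfortable reading a sketch like the following: logistic regression fit by gradient descent on the cross-entropy loss, written with NumPy on synthetic toy data (all names and data here are illustrative, not from the course):

```python
# Logistic regression trained by gradient descent (toy data, NumPy only).
import numpy as np

rng = np.random.default_rng(0)

# Toy 2D data: class-1 points are shifted by +2 in both dimensions.
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)
b = 0.0
lr = 0.1

# Gradient descent on the average negative log-likelihood.
for _ in range(500):
    p = sigmoid(X @ w + b)           # predicted probabilities
    grad_w = X.T @ (p - y) / len(y)  # d(loss)/dw
    grad_b = np.mean(p - y)          # d(loss)/db
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(round(accuracy, 2))
```

If deriving `grad_w` and `grad_b` from the cross-entropy loss feels familiar, you have the calculus and ML background the course assumes.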

**Textbook**

- [JE] Eisenstein, Natural Language Processing, 2018

**Supplemental materials**

- Jurafsky and Martin, Speech and Language Processing, 3rd Edition, 2019
- Shalev-Shwartz and Ben-David, Understanding Machine Learning: From Theory to Algorithms, 2014
- Goodfellow, Bengio and Courville, Deep Learning, 2016

**Homework (80%)**

- There will be **five** homework assignments, each worth 16%.
- Students are allowed to discuss homework with their classmates. If you discuss with classmates, please disclose their names in your submission.
- Directly copying answers from others is considered plagiarism.

**Project (20%)**

There is only one course project, and the credit breaks down into three parts:

- Project proposal: 6%
- Final project presentation: 7%
- Final project report: 7%
- Students should team up for this project; each group should have 2-3 students.

- In both the homework and the final project, apart from using machine learning libraries such as scikit-learn, PyTorch, and TensorFlow, students need to implement the rest of the proposed model themselves. Copying code from any resource (e.g., GitHub, Bitbucket, or GitLab) is prohibited and will be considered plagiarism.

**Last updated on August 24, 2021**