Website | https://www.isi.edu/~jonmay/cs662_fa20_web/ |
Lectures | https://usc.zoom.us/j/92532775397 (password on Piazza), Mondays and Wednesdays 10:00–11:50 am |
Instructor & office hours | Jonathan May, Online, Mondays and Wednesdays 9:00–10:00 am or by appointment (same room as lectures) |
TAs & office hours | Mozhdeh Gheini, Tuesdays and Thursdays, 1:00–2:00 pm, Online |
| Meryem M'Hamdi, Mondays and Wednesdays, 4:00–5:00 pm, Online |
Textbook | Required: Natural Language Processing - Eisenstein1 |
| Required: Selected papers from the NLP literature, see (evolving) schedule |
| Optional: Introduction to Deep Learning - Charniak2 |
| Optional: Speech and Language Processing, 3rd edition - Jurafsky & Martin3 |
Grading | 10%: In-class participation |
| 10%: Posted questions before each in-class selected paper presentation |
| 10%: In-class selected paper presentation |
| 30%: Three homeworks (10% each) |
| 40%: Project, comprising proposal (10%), final conference-quality paper (15%), and 15-minute in-class presentation (15%) (may be done in small groups). Final report is due December 7, 2020, 10:00 AM PST |
Contact us | On Piazza or in class/office hours. Please do not email (unless notified otherwise). |
Topics (subject to change per instructor/class whim) (will not be presented in this order):
Date | Material | Reading | Presentation | Other | TA attending |
8/24 | intro, applications | Eisenstein 1 (not mandatory) | | | Meryem & Mozhdeh |
8/26 | end of intro, probability basics | Eisenstein Appendix A, Goldwater probability tutorial 4 | | project assignment out (due 9/9) | Meryem |
8/31 | probability, ethics, naive bayes | Eisenstein 2, Nathan Schneider's unix notes5, Unix for poets6, sculpting text7 | The Social Impact of Natural Language Processing8 Presenter: Jon | - | |
9/2 | Perceptron, Logistic Regression | Eisenstein 3, Charniak 1. | Thumbs up? Sentiment Classification using Machine Learning Techniques9 Presenter: Zekun | HW1 out (due 9/30) | Meryem |
9/7 | LABOR DAY NO CLASS | ||||
9/9 | Nonlinear classifiers, backpropagation, gradient descent | Eisenstein 7 | Fast Semantic Extraction Using a Novel Neural Network Architecture10 Presenter: Tooraj | project proposal due | Mozhdeh |
9/14 | POS tags, HMMs | Eisenstein 9.2, 10. | Part-of-Speech Tagging for Twitter: Annotation, Features, and Experiments11 Presenter: Negar | | Mozhdeh |
9/16 | CKY, constituencies, treebank | Eisenstein 11 | Building a Large Annotated Corpus of English: The Penn Treebank12 Presenters: Ani, Sabyasachee | | Meryem |
9/21 | Viterbi CKY, restructuring, dependencies, shift-reduce | Eisenstein 4.5. | A Fast and Accurate Dependency Parser using Neural Networks13 Presenter: Shweta. | | Mozhdeh |
9/23 | arc-eager, evaluation, human annotation | Eisenstein 13, 14. | An Empirical Investigation of Statistical Significance in NLP 14 Presenter: Nikolaos | HW2 out (due 10/21) | Meryem |
9/28 | NO CLASS | ||||
9/30 | Mechanical Turk, semantics: word sense, PropBank, AMR, distributional | Eisenstein 7 | Linguistic Regularities in Continuous Space Word Representations15 Presenter: Hongkuan. The word analogy testing caveat16 Presenter: Jihoon | HW1 due | Meryem |
10/5 | language models: n-gram, feed-forward, recurrent; Machine Translation history, evaluation | Eisenstein 18.1, 18.2 | Bleu: a Method for Automatic Evaluation of Machine Translation17 Presenter: Paras. Towards a Literary Machine Translation: The Role of Referential Cohesion18 Presenter: Katy | | Mozhdeh |
10/7 | Statistical, Neural Machine Translation, summarization, generation | Eisenstein 18.3, 19.1, 19.2 | Effective Approaches to Attention-based Neural Machine Translation19 Presenter: Xiou. Neural Machine Translation by Jointly Learning to Align and Translate20 Presenter: Soumya | Revised proposals due (no late days) | Meryem |
10/12 | Transformers | Attention is all you need 21, Illustrated Transformer 22 | Six Challenges for Neural Machine Translation23 Presenter: Yuchen. Get To The Point: Summarization with Pointer-Generator Networks24 Presenter: Qi. | | Mozhdeh |
10/14 | Large Contextualized Language Models (ELMo, BERT, GPT-N, etc.) | Illustrated BERT, ELMo, and co.25 | Universal Neural Machine Translation for Extremely Low Resource Languages 26 Presenter: Amirhesam. Defending Against Neural Fake News 27 Presenter: Yizhou. | HW3 out (due 11/11) | Meryem |
10/19 | Guest Lecture (Xuezhe Ma) | | Language Models are Unsupervised Multitask Learners 28 Presenter: I-Hung. Language Models are Few-Shot Learners29 Presenters: Yufei, Wenxuan. | | Mozhdeh |
10/21 | Information Extraction: Entity/Relation, CRF | Eisenstein 17.1, 17.2 | 25 years of IE30 Presenter: Justin. | HW2 Due | Meryem |
10/26 | Information Extraction: Events, Zero-shot | Eisenstein 17.3 | Events are Not Simple: Identity, Non-Identity, and Quasi-Identity31 Presenter: Basel | | Mozhdeh |
10/28 | Blade Runner NLP/Bertology | | GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding32 Presenter: Prateek | | Meryem |
11/2 | Text Games and Reinforcement Learning | | The Bottom-up Evolution of Representations in the Transformer: A Study with Machine Translation and Language Modeling Objectives33 Presenter: Shuai | | Mozhdeh |
11/4 | Dialogue | Eisenstein 19.3. | A Diversity-Promoting Objective Function for Neural Conversation Models34 Presenter: Peifeng. Personalizing Dialogue Agents: I have a dog, do you have pets too?35 Presenter: Akshat | | Meryem |
11/9 | Power and Ethics | | Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data36 Presenter: Ani. Energy and Policy Considerations for Deep Learning in NLP 37 Presenter: Ali A. | | Mozhdeh |
11/11 | How to write a paper | Neubig slides on Piazza | On Measuring Social Biases in Sentence Encoders 38 Presenter: Katy | HW3 Due | Meryem |
11/16 | Presentations | | | | Meryem & Mozhdeh |
11/18 | Presentations | | | | Meryem & Mozhdeh |
11/23 | Presentations | | | | Meryem & Mozhdeh |