Jonathan May
| Website | https://www.isi.edu/~jonmay/cs662_fa19_web/ |
| Lectures | LVL (Leavey Library) 16, Mondays and Wednesdays, 3:30-5:20 pm |
| Instructor & office hours | Jonathan May, LVL (exact location on Piazza), Mondays and Wednesdays 2-3 pm or by appointment |
| TA & office hours | Samar Haider, TBD |
| Textbook | Required: Natural Language Processing - Eisenstein [1]. Selected papers from the NLP literature; see the (evolving) schedule. Optional: Introduction to Deep Learning - Charniak [2]. Optional: Speech and Language Processing, 3rd edition - Jurafsky & Martin [3]. |
| Grading | 10%: in-class participation. 10%: in-class quizzes. 40%: four homeworks. 40%: project, including an extended abstract, a final conference-quality paper, and a 20-minute in-class presentation (may be done in small groups). |
| Contact us | On Piazza or in class/office hours. Please do not email (unless notified otherwise). |
Topics (subject to change at instructor/class whim; not necessarily presented in this order):
| Instructor | Date | Material | Reading | Other |
| JM | 8/26 | intro, applications | Eisenstein 1 (not mandatory) | |
| JM | 8/28 | probability basics | Eisenstein Appendix A, Goldwater probability tutorial [4] | |
| N/A | 9/2 | LABOR DAY - NO CLASS | | |
| JM | 9/4 | corpora, text processing, linear classifiers (naive Bayes, logistic regression, perceptron), part-of-speech tagging | Eisenstein 2, Nathan Schneider's Unix notes [5], Unix for Poets [6], sculpting text [7] | HW1 out (4 weeks) |
| TG | 9/9 | nonlinear classifiers, feed-forward neural networks, backpropagation, gradient descent | Eisenstein 3, Charniak 1 | |
| TG | 9/11 | PyTorch and Google Cloud basics | | |
| JM | 9/16 | POS tags, HMMs, search | Eisenstein 7 | |
| JM | 9/18 | parsing and syntax 1: treebanks, evaluation, CKY, grammar induction, PCFGs | Eisenstein 9, 10 | |
| JM | 9/23 | parsing and syntax 2: dependencies, shift-reduce, Chu-Liu-Edmonds | Eisenstein 11, A Fast and Accurate Dependency Parser using Neural Networks [8] | HW2 out (4 weeks) |
| JM | 9/25 | (moved!) evaluation, annotation, Mechanical Turk | Eisenstein 4.5, Berg-Kirkpatrick 2012 [9], Resnik/Lin chapter [10] | |
| N/A | 9/30 | NO CLASS | | |
| JM | 10/2 | language models: n-gram, feed-forward | Eisenstein 6 | HW1 due |
| JM | 10/7 | RNN LMs | | |
| XY | 10/9 | text games and reinforcement learning | | |
| JM | 10/14 | semantics: word sense, PropBank, AMR, distributional lexical semantics | Eisenstein 13, 14 | HW3 out (4 weeks) |
| JM | 10/16 | machine translation: history, evaluation, statistical | Eisenstein 18.1, 18.2 | project proposal due |
| JM | 10/21 | neural machine translation, summarization, generation | Eisenstein 18.3, 19.1, 19.2 | |
| JM | 10/23 | information extraction: entity/relation, CRF | 25 Years of IE [11], Eisenstein 17.1, 17.2 | |
| JM | 10/28 | information extraction: events, zero-shot | Eisenstein 17.3 | HW2 due |
| JM | 10/30 | Transformers | Attention Is All You Need [12], The Illustrated Transformer [13] | |
| JM | 11/4 | large contextualized language models (ELMo, BERT, GPT(-2), etc.) | The Illustrated BERT, ELMo, and co. [14] | HW4 out (4 weeks) |
| JM | 11/6 | Blade Runner NLP | | |
| JM | 11/11 | power and ethics | Energy and Policy Considerations for Deep Learning in NLP [15] | HW3 due |
| JM | 11/13 | Amazon event (optional); no class | | |
| JM | 11/18 | how to write a paper | Neubig slides on Piazza | |
| JM | 11/20 | creative generation, structure-to-text, text-to-text | Eisenstein 19.1, 19.2 | project paper due (if submitting to ACL) |
| JM | 11/25 | dialogue | Eisenstein 19.3 | |
| N/A | 11/27 | THANKSGIVING - NO CLASS | | |
| Class | 12/2 | presentations | | HW4 due |
| Class | 12/4 | presentations | | |