CSE 5525: Speech and Language Processing

Details
Textbooks:
There are two excellent NLP textbooks that are freely available online. I will assign readings from both - there is a lot of value in seeing multiple perspectives on the same material. If a concept you encounter seems confusing at first, try reading about it in the other textbook to get a different perspective.
Grading

Grading will be based on:

Participation (12.5%)

You will receive credit for asking and answering questions related to the homework on Piazza, engaging in class discussion and participating in the in-class exercises.

Homeworks (62.5%)

The homeworks will include both written and programming assignments. Homework should be submitted to the Dropbox folder in Carmen by 11:59pm on the day it is due (unless otherwise instructed). Each student has 3 flexible days to turn in late homework over the semester; for example, you could turn in the first homework 2 days late and the second homework 1 day late without any penalty. After that, you will lose 20% for each day the homework is late. If you run into any technical issues with submission, please email your homework to the instructor.
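
To make the late policy concrete, here is a small Python sketch of how the deduction could be computed, assuming whole days late and that any remaining flexible days are applied before a penalty; the function is illustrative, not an official calculator.

    # Illustrative sketch of the late policy above (not an official calculator).
    # Assumes whole days late; remaining flexible days absorb lateness first.
    def late_deduction(days_late, flex_days_remaining):
        penalized_days = max(0, days_late - flex_days_remaining)
        return min(1.0, 0.20 * penalized_days)  # fraction of the homework grade lost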

Midterm (25%)

There will be an in-class midterm.

Final Projects (Bonus 10%)

The final project is an open-ended assignment, with the goal of gaining experience applying the techniques presented in class to real-world datasets. Students should work in groups of 3-4. It is a good idea to discuss your planned project with the instructor to get feedback. The final project report should be 4 pages and should describe the problem you are solving, the data you are using, the technique you are applying, and the baseline you compare against.

Grading Scale

Your overall grade is computed as (hw1 + hw2 + hw3) * 62.5/(10 + 15 + 20) + (midterm score) * 25/20 + (participation score) * 12.5/10 + (final project score). This is then mapped to a letter grade based on the standard OSU scale: 93-100 (A), 90-92.9 (A-), 87-89.9 (B+), 83-86.9 (B), 80-82.9 (B-), 77-79.9 (C+), 73-76.9 (C), 70-72.9 (C-), 67-69.9 (D+), 60-66.9 (D), below 60 (E). These cutoffs represent grade minimums; we may adjust grades upward based on the class grade distribution. You pass the class if you receive a D or above.
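
To make the formula concrete, here is a minimal Python sketch of the computation, assuming the three homeworks are scored out of 10, 15, and 20 points, the midterm out of 20, participation out of 10, and the final project bonus already expressed in course points; the function names are illustrative, not part of any official grading script.

    # Minimal sketch of the grade computation above (illustrative only).
    # Assumes hw1/hw2/hw3 are out of 10/15/20 points, the midterm out of 20,
    # participation out of 10, and the project bonus already in course points.
    def overall_score(hw1, hw2, hw3, midterm, participation, project_bonus=0.0):
        score = (hw1 + hw2 + hw3) * 62.5 / (10 + 15 + 20)
        score += midterm * 25 / 20
        score += participation * 12.5 / 10
        return score + project_bonus

    def letter_grade(score):
        # Cutoffs are grade minimums; actual grades may be curved upward.
        cutoffs = [(93, "A"), (90, "A-"), (87, "B+"), (83, "B"), (80, "B-"),
                   (77, "C+"), (73, "C"), (70, "C-"), (67, "D+"), (60, "D")]
        for minimum, letter in cutoffs:
            if score >= minimum:
                return letter
        return "E"

    # Example: strong homeworks, 17/20 midterm, 9/10 participation, 5-point bonus.
    print(letter_grade(overall_score(9, 14, 18, 17, 9, project_bonus=5)))  # -> A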

Resources
  • Piazza (discussion, announcements, etc.). https://piazza.com/osu/spring2020/cse5525
  • Carmen (homework submission + grades). https://osu.instructure.com/courses/75441
  • Academic Integrity
    Any assignment or exam that you hand in must be your own work (with the exception of group projects). However, talking with others to better understand the material is strongly encouraged. Copying a solution or letting someone copy your solution is considered cheating. Everything you hand in must be written in your own words, and any code you hand in must be written by you, with the exception of code provided as part of the assignment. Any collaboration during an exam is considered cheating. Any student who is caught cheating will be reported to the Committee on Academic Misconduct. Please don't take a chance - if you are having trouble understanding the material, let us know and we will be happy to help.
Homework
  • Homework 1 - Naive Bayes for Sentiment Analysis (Due 1/17 11:59pm, submit report and code to Dropbox on Carmen)
  • Homework 2 - Perceptron & Feed Forward Neural Network (Due 2/25 11:59pm, submit report and code to Dropbox on Carmen)
  • Homework 3 - Structured Perceptron & Convolutional Neural Networks (Due 3/30, submit report and code to Dropbox on Carmen)
  • Anonymous Feedback
Tentative Schedule
    Date Topic Required Reading Suggested Reading
    1/8 Course Overview J+M, 3rd Edition Chapter 1
    1/10 Machine Learning (binary classification) [lecture notes] J+M 4, Eisenstein 2.0-2.5, 4.1,4.3-4.5
    1/15 Machine Learning (cont'd) CIML, 4.1-4.4, 4.6-4.7
    1/17 Multiclass Learning J+M 5, Eisenstein 4.2
    1/22 Multiclass Learning (cont'd) J+M 5, Eisenstein 4.2
    1/24 Neural Networks for NLP [lecture notes] Eisenstein 3.1-3.3, J+M 7.1-7.4 Goldberg 1-4
    2/5 Neural Networks for NLP (cont'd) [lecture notes] Backpropagation (J.G. Makin)
    2/7 Sequence Models [lecture notes] J+M 8 Part-of-Speech Tagging (Manning)
    2/12 Viterbi Algorithm [exercise] [solution] Eisenstein 7.0-7.4
    2/14 Conditional Random Fields [lecture notes] Eisenstein 7.5, 8.3 CRF (Sutton & McCallum)
    2/19 Word Embeddings Eisenstein 3.3.4, 14.5, 14.6, J+M 6 Dropout, Initialization, Batch Normalization
    2/21 Word Embeddings (cont'd) Goldberg 5, word2vec, Levy, GloVe
    2/26 Recurrent Neural Networks J+M 9, Goldberg 10,11
    2/28 Recurrent Neural Networks (cont'd) [lecture notes]
    3/4 Convolutional Neural Networks [exercise] [lecture notes] Goldberg 9, Eisenstein 3.4, 7.6 Kim, Collobert and Weston
    3/6 Neural CRF Eisenstein 3.4, 7.6 Neural NER
    3/25 Statistical Machine Translation Eisenstein 18.1, 18.2
    3/27 Statistical Machine Translation (cont'd) Statistical Machine Translation (Koehn)
    4/1 Midterm released
    4/8 Sequence-to-Sequence Model J+M 10
    4/10 Attention and Copy Mechanism Eisenstein 18.3, 18.4
    4/13 Midterm due
    4/15 Neural Machine Translation / Transformer Google NMT, Attention is all you need The Annotated Transformer (Rush)
    4/17 Withdrawal deadline
    4/17 Guest lecture by Alan Ritter
    4/22 Information Extraction Eisenstein 13, 17
    4/24 Speech Recognition CTC Model, Baidu's Deep Speech