This is the site where we will build the lecture notes for Indiana University's CSCI-B 490, Natural Language Processing.

There (probably) won't be slides for the class. Instead, there will be readings from the textbook and discussions in class! I'll structure discussions around the relevant notes, but if we get off-topic, then we get off-topic. Please consider these notes to be one reference among many: another good reference would be your own notes from class.


Part 1: Introduction

Part 2: Classifying individual things

Part 3: Sequences of things

Part 4: Structured things

Part 5: Some other things

Part 6: Machine translation, because MT rules.

    • history
    • rule-based MT (RBMT)
    • example-based MT (EBMT)
    • statistical MT (SMT)


Suggestions for topics:

    • Greg and Karl think it's important to hit the Viterbi algorithm and HMMs!
    • Most importantly, make sure everybody really understands n-gram models, probably as an introduction to why we use probability distributions at all in NLP.
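To give a flavor of that last point, here is a minimal sketch of a bigram model: counting adjacent word pairs in a tiny toy corpus (the corpus and function names here are made up for illustration) and turning the counts into conditional probabilities. This is the simplest case of the n-gram idea, and it shows where probability distributions enter the picture.

```python
from collections import Counter

# Toy corpus, purely for illustration.
corpus = "the cat sat on the mat the cat ate".split()

# Count bigrams (adjacent word pairs) and the words they condition on.
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus[:-1])

def p(next_word, word):
    """Maximum-likelihood estimate of P(next_word | word)."""
    return bigrams[(word, next_word)] / unigrams[word]

# "the" is followed by "cat" in 2 of its 3 occurrences as a left context.
print(p("cat", "the"))
```

For each left-context word, the estimates `p(·, word)` sum to one, so the model really is a probability distribution over next words. Smoothing, which class discussion can pick up, is what to do when a bigram was never observed.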