This course teaches students to understand and apply deep learning methods for natural language processing, with a particular emphasis on large language models (LLMs). Students will spend most of the term exploring neural language models, a key driver in advancing the state of the art in the field. The course is designed for master's students in computer science or informatics who (1) are interested in keeping pace with cutting-edge research developments in NLP and (2) have a solid background in machine learning fundamentals. It covers key algorithmic foundations and applications of advanced natural language processing.
This course is organized and taught by Dr. Jennifer D'Souza.
Teaching Assistants: Mr. Hamed Babaei Giglou and Mr. Sameer Sadruddin (both excellent PhD candidates supervised by Dr. D'Souza and Prof. Auer)
The course is offered to master's students of the "Electrical Engineering and Information Technology" and "Computer Science" programs at the Leibniz University of Hannover and is currently offered in Summer Semester 2026 (SoSe 26).
Course location: Room 405: V405, Building 3109 (Map: https://info.cafm.uni-hannover.de/de/room/3109.004.405)
Course day/time:
Lectures: Tuesdays, 09:00 - 10:30
Exercises: Tuesdays, 10:45 - 12:15, weekly. Each exercise session will last at least 45 minutes.
For each class there will be:
Reading: Most classes will have associated reading material that you are encouraged to read beforehand to familiarize yourself with the topic.
Lecture and Discussion: There will be a lecture and discussion regarding the class material.
Code/Data Walk: After the lectures, there will be exercise sessions managed by the course teaching assistants, who will demonstrate selected key topics from that day's lecture through hands-on code and data walkthroughs. Their code will be publicly available on GitHub at https://github.com/jd-coderepos/advanced-nlp-course/2026
The course grade is based on the final written exam (100%).
Bonus Points: Students in this course will have the opportunity to earn up to 25% bonus points, which will be added to the final exam grade. These are available via the following two avenues:
Homework: There will be three homework assignments in total, comprising both programming and writing tasks. Homework is optional; completed homework will be graded and contributes up to 15% bonus points toward the final exam grade.
Attendance: At the start of each lecture, an attendance sheet will be circulated in the room for students to mark their presence. A 10% participation bonus is reserved for attendance; to earn it, students must attend at least 5 lectures of the course. In the age of AI, our world is becoming increasingly disconnected and fragmented from the human experience of sharing and exchanging ideas; this bonus is intended to encourage better engagement in the course lectures and exercises.
This semester's course is adapted from Advanced NLP Fall 2024, designed by Graham Neubig and taught at Carnegie Mellon University's Language Technologies Institute.
A nice textbook for NLP fundamentals is Jurafsky and Martin, Speech and Language Processing, 3rd ed. For this course, readings will mainly be NLP conference papers (e.g., from ACL, NAACL, and EMNLP). We will post all readings as PDFs.
Other useful texts for NLP include:
Eisenstein, Natural Language Processing. Draft textbook.
Smith, Linguistic Structure Prediction. Free access at UMass. Short book. Excellent coverage of structured prediction inference methods for NLP.
Murphy, Machine Learning: a Probabilistic Perspective. Excellent, though advanced, coverage of most of the machine learning methods we will use.
Bender, Linguistic Fundamentals for NLP. Short book. Focuses on linguistic issues relevant to NLP.
Bird et al., NLP with Python, a.k.a. the NLTK book. Aimed at a more introductory level than this course, but a good, gentle introduction to NLP with a CL (computational linguistics) emphasis. The NLTK software offers easy-to-use data access and some interfaces to (not always state-of-the-art) NLP tools.