CSE 5539: Cutting-Edge Topics in Natural Language Processing (AU20, Wed 10:00-11:50am, Zoom)

Instructor: Yu Su

Level and credits: U/G, 2

Prerequisites: CSE 3521/5521/5243/5525 highly recommended; requires solid understanding of machine/deep learning and natural language processing for meaningful participation


Paper-driven discussion of cutting-edge topics in natural language processing, with a focus on pre-trained language models (BERT and its variants, GPT-3, probing), language interfaces (dialogue systems, semantic parsing, question answering, and robot instruction following), knowledge bases (construction and reasoning), and more.

Grading Plan


No required textbook. Recommended reading:

Health and Safety Statement

See here.

Academic Integrity Policy

Academic integrity is essential to maintaining an environment that fosters excellence in teaching, research, and other educational and scholarly activities. Thus, The Ohio State University and the Committee on Academic Misconduct (COAM) expect that all students have read and understand the University’s Code of Student Conduct, and that all students will complete all academic and scholarly assignments with fairness and honesty. Students must recognize that failure to follow the rules and guidelines established in the University’s Code of Student Conduct and this syllabus may constitute “Academic Misconduct.” For more info, click here.

Course Syllabus and Schedule (Tentative)

Week 1 (08/26): Class outline
Week 2 (09/02):
- Course project [slides]
- Attention is All You Need [paper] [slides]
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding [paper] [slides]
Week 3 (09/09):
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter [paper] [slides]
- BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension [paper] [slides]
Week 4 (09/16):
- Longformer: The Long-Document Transformer [paper] [slides]
- Big Bird: Transformers for Longer Sequences [paper] [slides]
Week 5 (09/23):
- Language Models are Few-Shot Learners [paper] [slides]
Week 6 (09/30): Project proposal
Week 7 (10/07):
- Experience Grounds Language [paper] [slides]
- Deploying Lifelong Open-Domain Dialogue Learning [paper] [slides]
Week 8 (10/14):
- Task-Oriented Dialogue as Dataflow Synthesis [paper] [slides]
- Conversational Semantic Parsing [paper] [slides]
Week 9 (10/21):
- ALFRED: A Benchmark for Interpreting Grounded Instructions for Everyday Tasks [paper] [slides]
- RAT-SQL: Relation-Aware Schema Encoding and Linking for Text-to-SQL Parsers [paper] [slides]
Week 10 (10/28):
- AutoKnow: Self-Driving Knowledge Collection for Products of Thousands of Types [paper] [slides]
- Neural Rule Grounding for Label-Efficient Relation Extraction [paper] [slides]
Week 11 (11/04):
- Embedding Logical Queries on Knowledge Graphs [paper] [slides]
- Query2Box: Reasoning over Knowledge Graphs in Vector Space Using Box Embeddings [paper] [slides]
Week 12 (11/11): Veterans Day observed - no class
Week 13 (11/18):
- Language Models as Knowledge Bases: On Entity Representations, Storage Capacity, and Paraphrased Queries [paper] [slides]
- TeMP: Temporal Message Passing for Temporal Knowledge Graph Completion [paper] [slides]
Week 14 (11/25): Project presentation
Week 15 (12/02): Project presentation