Structured prediction

Published January 12, 2024

Sequence labelling is the task of assigning a class label to each item in an input sequence. Many problems in natural language processing can be cast as sequence labelling over different sets of output labels, including part-of-speech tagging, word segmentation, and named entity recognition. This unit introduces several models for sequence labelling, based on both local and global search.
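
To make the task concrete, here is a minimal sketch of sequence labelling with local (greedy) search, in the spirit of lectures 4.01–4.02. The scoring function and the tiny lexicon are toy stand-ins for a trained model such as the perceptron; all names here are illustrative assumptions, not code from the lectures.

```python
from typing import Callable, List

def greedy_tag(tokens: List[str],
               tagset: List[str],
               score: Callable[[List[str], int, List[str], str], float]) -> List[str]:
    """Label a sequence left to right with local (greedy) search.

    At each position, the single best tag is chosen given the tokens
    and the tags predicted so far; earlier decisions are never revised.
    """
    tags: List[str] = []
    for i in range(len(tokens)):
        tags.append(max(tagset, key=lambda t: score(tokens, i, tags, t)))
    return tags

# Toy stand-ins for a trained model (illustrative assumptions only).
LEXICON = {"the": "DET", "a": "DET", "dog": "NOUN", "barks": "VERB"}

def toy_score(tokens, i, tags, tag):
    s = 1.0 if LEXICON.get(tokens[i]) == tag else 0.0
    if tags and tags[-1] == "DET" and tag == "NOUN":
        s += 0.5  # reward NOUN directly after DET
    return s

print(greedy_tag(["the", "dog", "barks"], ["DET", "NOUN", "VERB"], toy_score))
# ['DET', 'NOUN', 'VERB']
```

Because each decision is made locally, an early mistake can never be revised; the Viterbi sketch further down shows the global alternative.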

Syntactic analysis, also called syntactic parsing, is the task of mapping a sentence to a formal representation of its syntactic structure. In this unit you will learn about two approaches to dependency parsing, where the target representations take the form of dependency trees: the Eisner algorithm, which casts dependency parsing as combinatorial optimisation over graphs, and transition-based dependency parsing, which builds a tree incrementally through a sequence of parser actions and underlies production systems such as Google's SyntaxNet.
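
As a taste of the graph-based approach, the following is a compact sketch of the dynamic programme at the heart of the Eisner algorithm, under the assumption that arc scores are given as a matrix (in practice they would come from a trained model). For brevity it returns only the score of the best projective tree; a full parser would also store backpointers to recover the tree itself.

```python
import numpy as np

def eisner_best_score(scores: np.ndarray) -> float:
    """Score of the best projective dependency tree under given arc scores.

    scores[h, m] is the score of an arc from head h to modifier m;
    token 0 plays the role of the artificial root. Direction index:
    0 = span headed at its right end, 1 = span headed at its left end.
    """
    n = scores.shape[0]
    complete = np.zeros((n, n, 2))
    incomplete = np.zeros((n, n, 2))

    for k in range(1, n):            # span width
        for s in range(n - k):       # span start
            t = s + k                # span end
            # Best split point for adding a new arc between s and t.
            best = max(complete[s, r, 1] + complete[r + 1, t, 0]
                       for r in range(s, t))
            incomplete[s, t, 0] = best + scores[t, s]  # arc t -> s
            incomplete[s, t, 1] = best + scores[s, t]  # arc s -> t
            # Best way to finish the span in each direction.
            complete[s, t, 0] = max(complete[s, r, 0] + incomplete[r, t, 0]
                                    for r in range(s, t))
            complete[s, t, 1] = max(incomplete[s, r, 1] + complete[r, t, 1]
                                    for r in range(s + 1, t + 1))
    return float(complete[0, n - 1, 1])

# Example with random arc scores for a root plus four tokens.
scores = np.random.default_rng(0).normal(size=(5, 5))
print(eisner_best_score(scores))
```

The two nested loops over spans and the inner maximisations over split points give the algorithm its O(n^3) running time.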

Not yet updated

This page has not yet been updated for the 2024 session. It will be available on 2024-02-05.

Video lectures

(these will be reduced to six lectures)

Sequence labelling

Section  Title

4.01     Introduction to sequence labelling
4.02     Sequence labelling with local search
4.03     Part-of-speech tagging with the perceptron
4.04     The perceptron learning algorithm
4.05     Sequence labelling with global search
4.06     The Viterbi algorithm (see the sketch after this table)
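
The following sketch shows global search with the Viterbi algorithm (lectures 4.05–4.06): instead of committing to one tag at a time, it uses dynamic programming to find the globally best tag sequence under position-wise emission scores and tag-to-tag transition scores. The score matrices are assumed inputs here; how they are learned is a separate question.

```python
import numpy as np

def viterbi(emit: np.ndarray, trans: np.ndarray) -> list:
    """Find the globally best tag sequence by dynamic programming.

    emit[i, t]   score of tag t at position i (n x T)
    trans[u, t]  score of moving from tag u to tag t (T x T)
    """
    n, T = emit.shape
    delta = np.empty((n, T))            # best score of a prefix ending in tag t
    back = np.zeros((n, T), dtype=int)  # best predecessor tag for each cell
    delta[0] = emit[0]
    for i in range(1, n):
        cand = delta[i - 1][:, None] + trans + emit[i][None, :]
        back[i] = cand.argmax(axis=0)
        delta[i] = cand.max(axis=0)
    # Trace the backpointers from the best final tag.
    tags = [int(delta[-1].argmax())]
    for i in range(n - 1, 0, -1):
        tags.append(int(back[i, tags[-1]]))
    return tags[::-1]
```

Compared with the greedy tagger above, Viterbi implicitly considers all T^n complete tag sequences yet still runs in O(nT^2) time.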

Syntactic parsing

Section  Title

4.01     Introduction to dependency parsing
4.02     The arc-standard algorithm (see the sketch after this table)
4.03     The Eisner algorithm
4.04     Neural architectures for dependency parsing
4.05     Dynamic oracles
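
To complement the Eisner sketch above, here is a minimal sketch of greedy transition-based parsing with the arc-standard system. The `choose_action` argument is a hypothetical stand-in for a trained classifier that scores transitions from features of the current stack and buffer.

```python
def arc_standard_parse(tokens, choose_action):
    """Greedy dependency parsing with the arc-standard transition system.

    tokens[0] is an artificial root. `choose_action` inspects the
    current configuration and returns one of three transitions:
      SHIFT      move the next buffer token onto the stack
      LEFT-ARC   make the second-top stack item a dependent of the top
                 one and remove it from the stack
      RIGHT-ARC  make the top stack item a dependent of the second-top
                 one and remove it from the stack
    Returns heads, where heads[m] is the head of token m.
    """
    stack = [0]
    buffer = list(range(1, len(tokens)))
    heads = [-1] * len(tokens)
    while buffer or len(stack) > 1:
        action = choose_action(stack, buffer, heads)
        if action == "SHIFT" and buffer:
            stack.append(buffer.pop(0))
        elif action == "LEFT-ARC" and len(stack) > 2:
            dep = stack.pop(-2)   # the root (position 0) may not be a dependent
            heads[dep] = stack[-1]
        elif action == "RIGHT-ARC" and len(stack) > 1:
            dep = stack.pop()
            heads[dep] = stack[-1]
        else:
            raise ValueError(f"action {action!r} is not valid here")
    return heads
```

How to train such a classifier, and what to do when it strays from the gold-standard transition sequence, is the subject of the lectures on neural architectures and dynamic oracles.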

Reading

  • Eisenstein (2019), chapters 7–8, sections 2.3.1–2.3.2
  • Daumé, A Course in Machine Learning, section 4.6
  • Eisenstein (2019), chapter 11