
Natural Language Processing (CSEP 517): Dependency Syntax and Parsing

Noah Smith
© 2017 University of Washington
[email protected]

May 1, 2017

To-Do List

▶ Online quiz: due Sunday
▶ Read: Kübler et al. (2009, ch. 1, 2, 6)
▶ A3 due May 7 (Sunday)
▶ A4 due May 14 (Sunday)

Dependencies

Informally, you can think of dependency structures as a transformation of phrase-structures that

▶ maintains the word-to-word relationships induced by lexicalization,
▶ adds labels to them, and
▶ eliminates the phrase categories.

There are also linguistic theories built on dependencies (Tesnière, 1959; Mel’čuk, 1987), as well as treebanks corresponding to those.

▶ Free(r)-word order languages (e.g., Czech)

Dependency Tree: Definition

Let x = ⟨x1, . . . , xn⟩ be a sentence. Add a special root symbol as “x0.”

A dependency tree consists of a set of tuples ⟨p, c, ℓ⟩, where

▶ p ∈ {0, . . . , n} is the index of a parent
▶ c ∈ {1, . . . , n} is the index of a child
▶ ℓ ∈ L is a label

Different annotation schemes define different label sets L, and different constraints on the set of tuples. Most commonly:

▶ The tuple is represented as a directed edge from xp to xc with label ℓ.
▶ The directed edges form an arborescence (directed tree) with x0 as the root (sometimes denoted “root”).
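The definition above translates directly into a small data structure. Below is a minimal Python sketch (not from the slides; the helper name and the example labels are illustrative) that stores a tree as ⟨parent, child, label⟩ tuples and checks the arborescence constraint: every word has exactly one parent and is reachable from x0.

    from collections import defaultdict

    def is_arborescence(n, arcs):
        """arcs: iterable of (parent, child, label) tuples over words 1..n,
        with index 0 reserved for the root symbol x0. Returns True iff the
        arcs form a directed tree (arborescence) rooted at 0."""
        parents = {}
        children = defaultdict(list)
        for p, c, _ in arcs:
            if c in parents:                  # a word may have only one parent
                return False
            parents[c] = p
            children[p].append(c)
        if set(parents) != set(range(1, n + 1)):
            return False                      # every word needs exactly one parent
        reachable, stack = set(), [0]         # every word must be reachable from x0
        while stack:
            for c in children[stack.pop()]:
                if c not in reachable:
                    reachable.add(c)
                    stack.append(c)
        return reachable == set(range(1, n + 1))

    # "we wash our cats": wash attaches to the root; we and cats to wash; our to cats
    arcs = [(0, 2, "root"), (2, 1, "sbj"), (2, 4, "dobj"), (4, 3, "det")]
    assert is_arborescence(4, arcs)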

Example

[Figure: phrase-structure tree for “we wash our cats”: S → NP VP; the subject NP → Pronoun “we”; VP → Verb “wash” NP; the object NP → Determiner “our” Noun “cats”.]

Phrase-structure tree.

Example

[Figure: the same phrase-structure tree with the head child of each phrase marked: “wash” heads the VP and S, “we” heads the subject NP, “cats” heads the object NP.]

Phrase-structure tree with heads.

Example

[Figure: the lexicalized tree: S_wash over NP_we and VP_wash; NP_we → Pronoun_we “we”; VP_wash over Verb_wash “wash” and NP_cats; NP_cats over Determiner_our “our” and Noun_cats “cats”.]

Phrase-structure tree with heads, lexicalized.

Example

we wash our cats

“Bare bones” dependency tree. [Figure: we and cats attach to wash, our to cats, and wash to the root.]

Example

we wash our cats who stink

[Figure: dependency tree extending the previous one with the relative clause “who stink” attached under cats.]

Example

we vigorously wash our cats who stink

[Figure: dependency tree further extended with the adverb vigorously attached to wash.]

Content Heads vs. Function Heads
Credit: Nathan Schneider

little kids were always watching birds with fish

[Figure: two dependency analyses of the sentence above, one choosing content words as heads and one choosing function words as heads.]

Labels

kids saw birds with fish

[Figure: labeled dependency tree over the sentence: saw is the root; kids carries the sbj label and birds the dobj label; with and fish carry the prep and pobj labels.]

Key dependency relations captured in the labels include: subject, direct object, preposition object, adjectival modifier, adverbial modifier.

In this lecture, I will mostly not discuss labels, to keep the algorithms simpler.

Coordination Structures

we vigorously wash our cats and dogs who stink

The bugbear of dependency syntax.


Example

we vigorously wash our cats and dogs who stink

Make the first conjunct the head?


Example

we vigorously wash our cats and dogs who stink

Make the coordinating conjunction the head?


Example

we vigorously wash our cats and dogs who stink

Make the second conjunct the head?


Dependency Schemes

▶ Transform the treebank: define “head rules” that can select the head child of any node in a phrase-structure tree and label the dependencies.
▶ More powerful, less local rule sets, possibly collapsing some words into arc labels.
  ▶ Stanford dependencies are a popular example (de Marneffe et al., 2006).
▶ Direct annotation.
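To make the head-rules idea concrete, here is a toy sketch (not from the slides): the HEAD_RULES table and the tree encoding are invented for the running “we wash our cats” example, but the percolation logic is the standard one, producing one unlabeled arc from each phrase’s head to each non-head child.

    # Toy head rules: for each phrase label, child labels in priority order.
    HEAD_RULES = {"S": ["VP"], "VP": ["Verb"], "NP": ["Noun", "Pronoun"]}

    def head_word(tree, deps):
        """tree is (label, children) for phrases or (tag, word) for leaves.
        Returns the tree's head word; appends (head, dependent) pairs to deps."""
        label, rest = tree
        if isinstance(rest, str):                 # preterminal: (tag, word)
            return rest
        child_heads = [head_word(child, deps) for child in rest]
        child_labels = [child[0] for child in rest]
        head_idx = 0                              # default: leftmost child heads
        for wanted in HEAD_RULES.get(label, []):
            if wanted in child_labels:
                head_idx = child_labels.index(wanted)
                break
        head = child_heads[head_idx]
        for i, dep in enumerate(child_heads):
            if i != head_idx:
                deps.append((head, dep))          # unlabeled arc head -> dependent
        return head

    tree = ("S", [("NP", [("Pronoun", "we")]),
                  ("VP", [("Verb", "wash"),
                          ("NP", [("Determiner", "our"), ("Noun", "cats")])])])
    deps = []
    root = head_word(tree, deps)
    print(root, deps)   # wash [('cats', 'our'), ('wash', 'cats'), ('wash', 'we')]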


Three Approaches to Dependency Parsing

1. Dynamic programming with the Eisner algorithm.

2. Transition-based parsing with a stack.

3. Chu-Liu-Edmonds algorithm for arborescences.


Dependencies and Grammar

Context-free grammars can be used to encode dependency structures.

For every head word and constellation of dependent children:

Nhead → Nleftmost-sibling . . . Nhead . . . Nrightmost-sibling

And for every v ∈ V: Nv → v and S → Nv.

A bilexical dependency grammar binarizes the dependents, generating only one per rule.

Such a grammar can produce only projective trees, which are (informally) trees in which the arcs don’t cross.
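A quick way to check the projectivity claim on a given tree is the standard “no crossing arcs” test; the sketch below is not from the slides and uses unlabeled ⟨parent, child⟩ index pairs with 0 as the root.

    def is_projective(arcs):
        """arcs: list of (parent, child) index pairs, 0 = root.
        True iff no two arcs cross when drawn above the sentence."""
        spans = [(min(p, c), max(p, c)) for p, c in arcs]
        for i, (l1, r1) in enumerate(spans):
            for l2, r2 in spans[i + 1:]:
                # arcs cross iff exactly one endpoint of one lies strictly inside the other
                if l1 < l2 < r1 < r2 or l2 < l1 < r2 < r1:
                    return False
        return True

    # "we wash our cats" from the running example: projective
    assert is_projective([(0, 2), (2, 1), (2, 4), (4, 3)])
    # a tree in which word 1 hangs off word 4: arcs (0, 2) and (4, 1) cross
    assert not is_projective([(0, 2), (2, 4), (4, 1), (2, 3)])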

Bilexical Dependency Grammar: Derivation

[Figure: derivation of “we wash our cats” under the bilexical grammar: S → Nwash; Nwash → Nwe Nwash; Nwash → Nwash Ncats; Ncats → Nour Ncats; and each Nv rewrites to its word v.]

Naïvely, the CKY algorithm will require O(n^5) runtime. Why?

CKY for Bilexical Context-Free Grammars

[Figure: the two binary combination steps, drawn as chart items. A head item Nxh over span (i, j) combines with a child item Nxc over span (j + 1, k) to give Nxh over (i, k), scored by p(Nxh Nxc | Nxh); symmetrically, a child item Nxc over (i, j) combines with a head item Nxh over (j + 1, k) to give Nxh over (i, k), scored by p(Nxc Nxh | Nxh).]

CKY Example

[Figure: CKY chart for “we wash our cats”: the diagonal holds Nwe, Nwash, Nour, Ncats; Nour and Ncats combine into Ncats, which combines with Nwash into Nwash; adding Nwe yields Nwash over the whole sentence and finally S at the goal.]

Dependency Parsing with the Eisner Algorithm
(Eisner, 1996)

Items: [Figure: four chart item shapes: right- and left-pointing “triangles” with head h and endpoint d, and right- and left-pointing “trapezoids” with head h and child c.]

▶ Both triangles indicate that xd is a descendant of xh.
▶ Both trapezoids indicate that xc can be attached as the child of xh.
▶ In all cases, the words “in between” are descendants of xh.

Dependency Parsing with the Eisner Algorithm
(Eisner, 1996)

Initialization: [Figure: for each i, start with a left triangle and a right triangle of width zero at position i, each with score 1, times p(xi | Nxi).]

Goal: [Figure: a left triangle spanning 1 . . . i and a right triangle spanning i . . . n, combined with p(Nxi | S), yield the goal item.]

Dependency Parsing with the Eisner Algorithm
(Eisner, 1996)

Attaching a left dependent: [Figure: deduction rule combining items over (i, j) and (j + 1, k) into a left-pointing trapezoid over (i, k), attaching xi as a left dependent of xk, scored by p(Nxi Nxk | Nxk).]

Complete a left child: [Figure: deduction rule combining a triangle over (i, j) with a trapezoid over (j, k) into a left-pointing triangle over (i, k).]

Dependency Parsing with the Eisner Algorithm
(Eisner, 1996)

Attaching a right dependent: [Figure: deduction rule combining items over (i, j) and (j + 1, k) into a right-pointing trapezoid over (i, k), attaching xk as a right dependent of xi, scored by p(Nxi Nxk | Nxi).]

Complete a right child: [Figure: deduction rule combining a trapezoid over (i, j) with a triangle over (j, k) into a right-pointing triangle over (i, k), headed by xi.]
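The triangles and trapezoids above define an O(n^3) dynamic program. The sketch below (not from the slides) computes the best projective tree score in the arc-factored view, assuming a matrix score[h][m] for an arc from head h to modifier m, with position 0 as the root; with arc scores set to the corresponding log-probabilities this matches the grammar-based formulation up to the per-word generation terms. Recovering the tree itself would additionally require backpointers.

    import numpy as np

    def eisner_max_score(score):
        """score[h][m] = score of an arc from head h to modifier m (0 is root).
        Returns the max total score of a projective tree rooted at 0.
        complete/incomplete[i][j][d]: best score of a span i..j whose head is
        at the left end (d=1, right-pointing) or right end (d=0, left-pointing)."""
        n = score.shape[0]                       # includes the root at position 0
        NEG = -np.inf
        complete = np.full((n, n, 2), NEG)
        incomplete = np.full((n, n, 2), NEG)
        for i in range(n):
            complete[i][i][0] = complete[i][i][1] = 0.0
        for length in range(1, n):
            for i in range(n - length):
                j = i + length
                # trapezoids: attach an arc between i and j
                best = max(complete[i][k][1] + complete[k + 1][j][0]
                           for k in range(i, j))
                incomplete[i][j][0] = best + score[j][i]   # arc j -> i (left arc)
                incomplete[i][j][1] = best + score[i][j]   # arc i -> j (right arc)
                # triangles: absorb a completed half
                complete[i][j][0] = max(complete[i][k][0] + incomplete[k][j][0]
                                        for k in range(i, j))
                complete[i][j][1] = max(incomplete[i][k][1] + complete[k][j][1]
                                        for k in range(i + 1, j + 1))
        return complete[0][n - 1][1]

This runs in O(n^3) time, in contrast to the naive O(n^5) CKY construction on the earlier slide.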

Eisner Algorithm Example

[Figure: Eisner chart over “we wash our cats”, built up with the rules above until the goal item is reached.]

Three Approaches to Dependency Parsing

1. Dynamic programming with the Eisner algorithm.

2. Transition-based parsing with a stack.

3. Chu-Liu-Edmonds algorithm for arborescences.


Transition-Based Parsing

▶ Process x once, from left to right, making a sequence of greedy parsing decisions.
▶ Formally, the parser is a state machine (not a finite-state machine) whose state is represented by a stack S and a buffer B.
▶ Initialize the buffer to contain x and the stack to contain the root symbol.
▶ The “arc standard” transition set (Nivre, 2004):
  ▶ shift the word at the front of the buffer B onto the stack S.
  ▶ right-arc: u = pop(S); v = pop(S); push(S, v → u).
  ▶ left-arc: u = pop(S); v = pop(S); push(S, v ← u).
  (For labeled parsing, add labels to the right-arc and left-arc transitions.)
▶ During parsing, apply a classifier to decide which transition to take next, greedily. No backtracking.
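A minimal sketch (not from the slides) of the arc-standard transition system just described: it applies a given action sequence and records the unlabeled arcs, following the pop/push semantics above. A real parser would choose each action with a classifier instead of reading it from a list.

    def arc_standard(words, actions):
        """words: the sentence x1..xn; actions: a sequence drawn from
        {"shift", "left-arc", "right-arc"}. Returns the unlabeled arcs as
        (parent_index, child_index) pairs, with index 0 reserved for root."""
        buffer = list(range(1, len(words) + 1))   # word indices, front first
        stack = [0]                               # start with the root symbol
        arcs = []
        for action in actions:
            if action == "shift":
                stack.append(buffer.pop(0))       # front of buffer onto the stack
            elif action == "right-arc":
                u = stack.pop(); v = stack.pop()
                arcs.append((v, u))               # v -> u: v is the parent
                stack.append(v)
            elif action == "left-arc":
                u = stack.pop(); v = stack.pop()
                arcs.append((u, v))               # v <- u: u is the parent
                stack.append(u)
        return arcs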

Transition-Based Parsing: Example

Parsing “we vigorously wash our cats who stink”. The stack starts with root and the buffer with the whole sentence; after an arc is built, the stack entry stands for the whole subtree headed by that word. Each line shows the action taken and the resulting state, with the new arc in parentheses.

Start:      Stack [root]; Buffer [we, vigorously, wash, our, cats, who, stink]
shift:      Stack [root, we]; Buffer [vigorously, wash, our, cats, who, stink]
shift:      Stack [root, we, vigorously]; Buffer [wash, our, cats, who, stink]
shift:      Stack [root, we, vigorously, wash]; Buffer [our, cats, who, stink]
left-arc:   Stack [root, we, wash]; Buffer [our, cats, who, stink]  (vigorously ← wash)
left-arc:   Stack [root, wash]; Buffer [our, cats, who, stink]  (we ← wash)
shift:      Stack [root, wash, our]; Buffer [cats, who, stink]
shift:      Stack [root, wash, our, cats]; Buffer [who, stink]
left-arc:   Stack [root, wash, cats]; Buffer [who, stink]  (our ← cats)
shift:      Stack [root, wash, cats, who]; Buffer [stink]
shift:      Stack [root, wash, cats, who, stink]; Buffer []
right-arc:  Stack [root, wash, cats, who]; Buffer []  (who → stink)
right-arc:  Stack [root, wash, cats]; Buffer []  (cats → who)
right-arc:  Stack [root, wash]; Buffer []  (wash → cats)
right-arc:  Stack [root]; Buffer []  (root → wash)

Full action sequence: shift shift shift left-arc left-arc shift shift left-arc shift shift right-arc right-arc right-arc right-arc.
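Feeding the action sequence from this worked example into the arc_standard sketch above reproduces the tree’s arcs as (parent, child) index pairs (1-based words, 0 for root):

    words = "we vigorously wash our cats who stink".split()
    actions = ["shift", "shift", "shift", "left-arc", "left-arc",
               "shift", "shift", "left-arc", "shift", "shift",
               "right-arc", "right-arc", "right-arc", "right-arc"]
    print(arc_standard(words, actions))
    # [(3, 2), (3, 1), (5, 4), (6, 7), (5, 6), (3, 5), (0, 3)]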

The Core of Transition-Based Parsing: Classification

▶ At each iteration, choose among {shift, right-arc, left-arc}. (Actually, among all L-labeled variants of right- and left-arc.)
▶ Features can look at S, B, and the history of past actions—usually there is no decomposition into local structures.
▶ Training data: an “oracle” transition sequence that gives the right tree converts into 2 · n pairs: ⟨state, correct transition⟩. Each word gets shifted once and participates as a child in one arc.
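A sketch (not from the slides) of one standard way to build those 2 · n training pairs from a gold projective tree: take left-arc or right-arc as soon as the would-be child has already collected all of its own children, otherwise shift. On the running example it reproduces exactly the 14-action sequence shown earlier; the helper name and state encoding are illustrative.

    def oracle_pairs(n, heads):
        """heads[c] = gold parent of word c (1-based); heads[0] is unused (use -1).
        Assumes a projective gold tree. Returns a list of (state, transition)
        pairs, where each state is a snapshot of (stack, buffer)."""
        remaining = [0] * (n + 1)                # children each word still needs
        for c in range(1, n + 1):
            remaining[heads[c]] += 1
        stack, buffer, pairs = [0], list(range(1, n + 1)), []
        while buffer or len(stack) > 1:
            state = (tuple(stack), tuple(buffer))
            if (len(stack) >= 2 and heads[stack[-2]] == stack[-1]
                    and remaining[stack[-2]] == 0):
                action = "left-arc"              # second-from-top becomes a child of top
                remaining[stack[-1]] -= 1
                stack.pop(-2)
            elif (len(stack) >= 2 and heads[stack[-1]] == stack[-2]
                    and remaining[stack[-1]] == 0):
                action = "right-arc"             # top becomes a child of second-from-top
                remaining[stack[-2]] -= 1
                stack.pop()
            else:
                action = "shift"
                stack.append(buffer.pop(0))
            pairs.append((state, action))
        return pairs

    # gold heads for "we vigorously wash our cats who stink"
    heads = [-1, 3, 3, 0, 5, 3, 5, 6]
    assert [a for _, a in oracle_pairs(7, heads)] == [
        "shift", "shift", "shift", "left-arc", "left-arc", "shift", "shift",
        "left-arc", "shift", "shift", "right-arc", "right-arc", "right-arc", "right-arc"]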

Transition-Based Parsing: Remarks

▶ Can also be applied to phrase-structure parsing (e.g., Sagae and Lavie, 2006). Keyword: “shift-reduce” parsing.
▶ The algorithm for making decisions doesn’t need to be greedy; it can maintain multiple hypotheses.
  ▶ E.g., beam search, which we’ll discuss in the context of machine translation later.
▶ Potential flaw: the classifier is typically trained under the assumption that previous classification decisions were all correct.
  ▶ As yet, no principled solution to this problem, but see “dynamic oracles” (Goldberg and Nivre, 2012).


Three Approaches to Dependency Parsing

1. Dynamic programming with the Eisner algorithm.

2. Transition-based parsing with a stack.

3. Chu-Liu-Edmonds algorithm for arborescences.


Acknowledgment

Slides are mostly adapted from those by Swabha Swayamdipta and Sam Thomson.


Features in Dependency Parsing

For the Eisner algorithm, the score of an unlabeled parse y was

s_{\mathrm{global}}(y) = \sum_{c=1}^{n} \Bigg[ \log p(x_c \mid N_{x_c}) + \log \begin{cases} p(N_{x_c}\, N_{x_p} \mid N_{x_p}) & \text{if } \langle p, c \rangle \in y \wedge c < p \wedge p > 0 \\ p(N_{x_p}\, N_{x_c} \mid N_{x_p}) & \text{if } \langle p, c \rangle \in y \wedge c > p \wedge p > 0 \\ p(N_{x_c} \mid S) & \text{if } \langle 0, c \rangle \in y \end{cases} \Bigg]

For transition-based parsing, we could use any past decisions to score the current decision:

s_{\mathrm{global}}(y) = s(\mathbf{a}) = \sum_{i=1}^{|\mathbf{a}|} s(a_i \mid a_{0:i-1})

We gave up on any guarantee of finding the best possible y in favor of arbitrary features.

▶ For a neural network-based model that fully exploits this, see Dyer et al. (2015).

Graph-Based Dependency Parsing
(McDonald et al., 2005)

Every possible directed edge e between a parent p and a child c gets a local score, s(e).

This set, E, contains O(n^2) edges. No incoming edges to x0, ensuring that it will be the root.

First-Order Graph-Based (FOG) Dependency Parsing
(McDonald et al., 2005)

y^{*} = \arg\max_{y \subseteq E} s_{\mathrm{global}}(y) = \arg\max_{y \subseteq E} \sum_{e \in y} s(e)

subject to the constraint that y is an arborescence.

Classical algorithm to efficiently solve this problem: Chu and Liu (1965), Edmonds (1967).

Chu-Liu-Edmonds Intuitions

▶ Every non-root node needs exactly one incoming edge.
▶ In fact, every connected component that doesn’t contain x0 needs exactly one incoming edge.

High-level view of the algorithm:

1. For every c, pick an incoming edge (i.e., pick a parent)—greedily.
2. If this forms an arborescence, you are done!
3. Otherwise, it’s because there’s a cycle, C.
  ▶ Arborescences can’t have cycles, so some edge in C needs to be kicked out.
  ▶ We also need to find an incoming edge for C.
  ▶ Choosing the incoming edge for C determines which edge to kick out.

Chu-Liu-Edmonds: Recursive (Inefficient) Definition

def maxArborescence(V, E, root):
    # returns best arborescence as a map from each node to its parent
    for c in V \ root:
        bestInEdge[c] ← argmax_{e ∈ E : e = ⟨p, c⟩} e.s   # i.e., s(e)
    if bestInEdge contains a cycle C:
        # build a new graph where C is contracted into a single node
        vC ← new Node()
        V′ ← V ∪ {vC} \ C
        E′ ← {adjust(e, vC) for e ∈ E \ C}
        A ← maxArborescence(V′, E′, root)
        return {e.original for e ∈ A} ∪ C \ {A[vC].kicksOut}
    # each node got a parent without creating any cycles
    return bestInEdge

Understanding Chu-Liu-Edmonds

There are two stages:

▶ Contraction (the stuff before the recursive call)
▶ Expansion (the stuff after the recursive call)

Chu-Liu-Edmonds: Contraction

▶ For each non-root node v, set bestInEdge[v] to be its highest scoring incoming edge.
▶ If a cycle C is formed:
  ▶ contract the nodes in C into a new node vC
  ▶ the adjust subroutine on the next slide performs the following:
    ▶ Edges incoming to any node in C now get destination vC
    ▶ For each node v in C, and for each edge e incoming to v from outside of C:
      ▶ Set e.kicksOut to bestInEdge[v], and
      ▶ Set e.s to be e.s − e.kicksOut.s
    ▶ Edges outgoing from any node in C now get source vC
▶ Repeat until every non-root node has an incoming edge and no cycles are formed

Chu-Liu-Edmonds: Edge Adjustment Subroutine

def adjust(e, vC):
    e′ ← copy(e)
    e′.original ← e
    if e.dest ∈ C:
        e′.dest ← vC
        e′.kicksOut ← bestInEdge[e.dest]
        e′.s ← e.s − e′.kicksOut.s
    elif e.src ∈ C:
        e′.src ← vC
    return e′
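The pseudocode above can be made runnable with only a little scaffolding. The sketch below (not from the slides) keeps the same structure: greedy bestInEdge selection, contraction of a detected cycle into a fresh node with rescored edge copies carrying original/kicks_out pointers, and expansion on the way back out of the recursion. Class and helper names, and the toy graph at the end, are invented for illustration.

    class Edge:
        """Scored edge src -> dst. Copies made during contraction remember the
        edge one level down they stand for (original) and the cycle edge they
        would displace if chosen (kicks_out)."""
        def __init__(self, src, dst, score, original=None, kicks_out=None):
            self.src, self.dst, self.score = src, dst, score
            self.original = self if original is None else original
            self.kicks_out = kicks_out

    def find_cycle(best_in, root):
        """Return the set of nodes on a cycle in best_in, or None."""
        for start in best_in:
            path, v = [], start
            while v != root and v not in path:
                path.append(v)
                v = best_in[v].src
            if v != root:                       # stopped on a repeated node: a cycle
                return set(path[path.index(v):])
        return None

    def max_arborescence(nodes, edges, root):
        """Best arborescence over `nodes` rooted at `root`, as {child: Edge}."""
        best_in = {v: max((e for e in edges if e.dst == v), key=lambda e: e.score)
                   for v in nodes if v != root}
        cycle = find_cycle(best_in, root)
        if cycle is None:
            return best_in                      # every node got a parent, no cycle
        vc = object()                           # fresh node standing in for the cycle
        new_nodes = (nodes - cycle) | {vc}
        new_edges = []
        for e in edges:
            if e.src in cycle and e.dst in cycle:
                continue                        # internal to the contracted cycle
            elif e.dst in cycle:                # incoming: rescore, remember kick-out
                kicked = best_in[e.dst]
                new_edges.append(Edge(e.src, vc, e.score - kicked.score, e, kicked))
            elif e.src in cycle:                # outgoing: just redirect the source
                new_edges.append(Edge(vc, e.dst, e.score, e))
            else:
                new_edges.append(Edge(e.src, e.dst, e.score, e))
        sub = max_arborescence(new_nodes, new_edges, root)
        # expansion: translate the contracted solution back to this level's edges
        chosen = {e.original.dst: e.original for e in sub.values()}
        kicked = sub[vc].kicks_out
        for v in cycle:                         # keep the cycle except the kicked edge
            if best_in[v] is not kicked:
                chosen[v] = best_in[v]
        return chosen

    # Toy usage: ROOT plus two words that prefer each other (a cycle to contract).
    edges = [Edge("ROOT", "A", 5), Edge("ROOT", "B", 1),
             Edge("A", "B", 11), Edge("B", "A", 10)]
    tree = max_arborescence({"ROOT", "A", "B"}, edges, "ROOT")
    print({child: e.src for child, e in tree.items()})   # {'A': 'ROOT', 'B': 'A'}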

Contraction Example

[Figure: a worked contraction on a graph with nodes ROOT, V1, V2, V3 and edges a:5, b:1, c:1, d:11, e:4, f:5, g:10, h:9, i:8. Greedily, bestInEdge[V1] = g and bestInEdge[V2] = d, forming a cycle between V1 and V2, which is contracted into a new node V4; incoming edges are rescored (a: 5 − 10, h: 9 − 10, b: 1 − 11, i: 8 − 11) and record kicksOut entries g or d. In the contracted graph, bestInEdge[V3] = f and bestInEdge[V4] = h again form a cycle, contracted into V5 (a: −5 − (−1), b: −10 − (−1), c: 1 − 5, adding kicksOut entries h and f). Finally bestInEdge[V5] = a, and no cycle remains.]

Chu-Liu-Edmonds: Expansion

After the contraction stage, every contracted node will have exactly one bestInEdge. This edge will kick out one edge inside the contracted node, breaking the cycle.

▶ Go through each bestInEdge e in the reverse order that we added them.
▶ Lock down e, and remove every edge in kicksOut(e) from bestInEdge.

Expansion Example

[Figure: the expansion stage on the same example, unwinding the contractions in reverse order. Locking in a = bestInEdge[V5] kicks h and g out of bestInEdge, so the entries for V4 and V1 are replaced by a; expanding V5 and then V4 back to the original graph leaves a, d, and f as the selected incoming edges of V1, V2, and V3, which is the final arborescence.]

Observation

The set of arborescences strictly includes the set of projective dependency trees.

Is this a good thing or a bad thing?


Nonprojective Example

A hearing is scheduled on the issue today .

[Figure: labeled dependency tree with relations ROOT, SBJ, VC, ATT, PC, TMP, and PU; the ATT arc from “hearing” to the prepositional phrase “on the issue” crosses another arc, making the tree nonprojective.]

Chu-Liu-Edmonds: Notes

▶ This is a greedy algorithm with a clever form of delayed backtracking to recover from inconsistent decisions (cycles).
▶ CLE is exact: it always recovers an optimal arborescence.
▶ What about labeled dependencies?
  ▶ As a matter of preprocessing, for each ⟨p, c⟩, keep only the top-scoring labeled edge.
▶ Tarjan (1977) offered a more efficient, but unfortunately incorrect, implementation. Camerini et al. (1979) corrected it. The approach is not recursive; instead it uses a disjoint-set data structure to keep track of collapsed nodes. Even better: Gabow et al. (1986) used a Fibonacci heap to keep incoming edges sorted, and finds cycles in a more sensible way. It also constrains the root to have only one outgoing edge. With these tricks, O(n^2) runtime.
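The labeled-dependency preprocessing in the bullet above is tiny in code; here is a minimal sketch (not from the slides), assuming labeled scores arrive as a dictionary keyed by ⟨parent, child, label⟩:

    def best_labeled_edges(labeled_scores):
        """labeled_scores: {(p, c, label): score}. Returns {(p, c): (label, score)},
        keeping only the top-scoring label for each parent/child pair."""
        best = {}
        for (p, c, label), score in labeled_scores.items():
            if (p, c) not in best or score > best[(p, c)][1]:
                best[(p, c)] = (label, score)
        return best

    scores = {(0, 2, "root"): 4.0, (2, 1, "sbj"): 3.5, (2, 1, "dobj"): 1.2}
    print(best_labeled_edges(scores))   # {(0, 2): ('root', 4.0), (2, 1): ('sbj', 3.5)}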

More Details on Statistical Dependency Parsing

▶ What about the scores? McDonald et al. (2005) used carefully designed features and (something close to) the structured perceptron; Kiperwasser and Goldberg (2016) used bidirectional recurrent neural networks.
▶ What about higher-order parsing? Requires approximate inference, e.g., dual decomposition (Martins et al., 2013).

Important Tradeoffs (and Not Just in NLP)

1. Two extremes:
  ▶ Specialized algorithm that efficiently solves your problem, under your assumptions. E.g., Chu-Liu-Edmonds for FOG dependency parsing.
  ▶ General-purpose method that solves many problems, allowing you to test the effect of different assumptions. E.g., dynamic programming, transition-based methods, some forms of approximate inference.
2. Two extremes:
  ▶ Fast (linear-time) but greedy
  ▶ Model-optimal but slow

▶ Dirty secret: the best way to get (English) dependency trees is to run phrase-structure parsing, then convert.

References

Paolo M. Camerini, Luigi Fratta, and Francesco Maffioli. A note on finding optimum branchings. Networks, 9(4):309–312, 1979.

Y. J. Chu and T. H. Liu. On the shortest arborescence of a directed graph. Science Sinica, 14:1396–1400, 1965.

Marie-Catherine de Marneffe, Bill MacCartney, and Christopher D. Manning. Generating typed dependency parses from phrase structure parses. In Proc. of LREC, 2006.

Chris Dyer, Miguel Ballesteros, Wang Ling, Austin Matthews, and Noah A. Smith. Transition-based dependency parsing with stack long short-term memory. In Proc. of ACL, 2015.

Jack Edmonds. Optimum branchings. Journal of Research of the National Bureau of Standards, 71B:233–240, 1967.

Jason M. Eisner. Three new probabilistic models for dependency parsing: An exploration. In Proc. of COLING, 1996.

Harold N. Gabow, Zvi Galil, Thomas Spencer, and Robert E. Tarjan. Efficient algorithms for finding minimum spanning trees in undirected and directed graphs. Combinatorica, 6(2):109–122, 1986.

Yoav Goldberg and Joakim Nivre. A dynamic oracle for arc-eager dependency parsing. In Proc. of COLING, 2012.

Eliyahu Kiperwasser and Yoav Goldberg. Simple and accurate dependency parsing using bidirectional LSTM feature representations. Transactions of the Association for Computational Linguistics, 4:313–327, 2016.

Sandra Kübler, Ryan McDonald, and Joakim Nivre. Dependency Parsing. Synthesis Lectures on Human Language Technologies. Morgan and Claypool, 2009. URL http://www.morganclaypool.com/doi/pdf/10.2200/S00169ED1V01Y200901HLT002.

André F. T. Martins, Miguel Almeida, and Noah A. Smith. Turning on the turbo: Fast third-order non-projective turbo parsers. In Proc. of ACL, 2013.

Ryan McDonald, Fernando Pereira, Kiril Ribarov, and Jan Hajič. Non-projective dependency parsing using spanning tree algorithms. In Proc. of HLT-EMNLP, 2005. URL http://www.aclweb.org/anthology/H/H05/H05-1066.pdf.

Igor A. Mel’čuk. Dependency Syntax: Theory and Practice. State University Press of New York, 1987.

Joakim Nivre. Incrementality in deterministic dependency parsing. In Proc. of ACL, 2004.

Kenji Sagae and Alon Lavie. A best-first probabilistic shift-reduce parser. In Proc. of COLING-ACL, 2006.

Robert E. Tarjan. Finding optimum branchings. Networks, 7:25–35, 1977.

L. Tesnière. Éléments de Syntaxe Structurale. Klincksieck, 1959.