
61A Lecture 35

Wednesday, December 4


Announcements

• Homework 11 due Thursday 12/5 @ 11:59pm.

• No video of lecture on Friday 12/6.

  - Come to class and take the final survey.

  - There will be a screencast of live lecture (as always).

  - Screencasts: http://www.youtube.com/view_play_list?p=-XXv-cvA_iCIEwJhyDVdyLMCiimv6Tup

• Homework 12 due Tuesday 12/10 @ 11:59pm.

  - All you have to do is vote on your favorite recursive art.

• 29 review sessions next week! Come learn about the topics that interest you the most.

  - See http://inst.eecs.berkeley.edu/~cs61a/fa13/exams/final.html for the schedule.

Natural Language Processing


Ambiguity in Natural Language

Unlike programming languages, natural languages are ambiguous.

Syntactic ambiguity:

  TEACHER STRIKES IDLE KIDS
  HOSPITALS ARE SUED BY 7 FOOT DOCTORS

Semantic ambiguity:

  IRAQI HEAD SEEKS ARMS
  STOLEN PAINTING FOUND BY TREE


Tasks in Natural Language Processing

Research in natural language processing (NLP) focuses on tasks that involve language:

Question Answering. "Harriet Boyd Hawes was the first woman to discover and excavate a Minoan settlement on this island." Watson says, "What is Crete?"

Machine Translation. "Call a spade a spade!" Google Translate says, "Appeler un chat un chat."

Semantic Parsing. "When's my birthday?" Siri says, "Your birthday is May 1st."

Much attention is given to more focused language analysis problems:

Coreference Resolution: Do the phrases "Barack Obama" and "the president" co-refer?

Syntactic Parsing: In "I saw the man with the telescope," who has the telescope?

Word Sense Disambiguation: Does the "bank of the Seine" have an ATM?

Named-Entity Recognition: What names are in "Did van Gogh paint the Bank of the Seine?"

Machine Translation


Machine Translation

Parallel corpus gives translation examples:

  Yo lo haré de muy buen grado  ->  I will do it gladly
  Después lo veras              ->  You will see later

Target language corpus gives examples of well-formed sentences:

  I will get to it later
  See you later
  He will do it

Machine translation system: a model of translation that maps a novel sentence in the source language (Yo lo haré después) to the target language (I will do it later).

Syntactic Agreement in Translation

[Figure: the machine translation system's model of translation, with parse trees (S, NP, VP, MD, VB, PRP, ADV tags) drawn over the parallel examples "Yo lo haré de muy buen grado" / "I will do it gladly" and "Después lo veras" / "You will see later", and over the novel pair "Yo lo haré después" / "I will do it later". The source and target trees agree in syntactic structure.]

Syntactic Reordering in Translation

[Figure: parse tree of "pair added to the lexicon" (S -> NP VP; NP -> NN "pair"; VP -> VBD "added" PP; PP -> TO "to" NP; NP -> DT NN "the lexicon"), built up constituent by constituent and aligned with a Japanese translation that reorders those constituents: 一対が ("pair"), 目録 ("list"), に ("to"), 追加されました ("was added").]

Context-Free Grammars


A Context-Free Grammar Models Language Generation

A grammar contains rules that hierarchically generate word sequences using syntactic tags.

Grammar Rules:            Lexicon:

  S  -> NP VP               PRP -> I
  NP -> PRP                 PRP -> you
  VP -> VB                  VB  -> know
  VP -> VB NP               VB  -> help

Generating "I know you", one expansion at a time:

  S
  NP VP
  PRP VB NP
  I know PRP
  I know you
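The expansion process above is easy to sketch in Python. The code below is a minimal illustration (not the course's actual demo code): the grammar on this slide is stored as a dictionary from each tag to its possible right-hand sides, and any symbol with no rules is treated as a word.

```python
import random

# The grammar and lexicon from the slide, as tag -> list of right-hand sides.
# Lexicon entries are rules whose right-hand side is a single word.
grammar = {
    'S':   [['NP', 'VP']],
    'NP':  [['PRP']],
    'VP':  [['VB'], ['VB', 'NP']],
    'PRP': [['I'], ['you']],
    'VB':  [['know'], ['help']],
}

def generate(tag):
    """Expand tag by choosing one of its rules at random, until only words remain."""
    if tag not in grammar:          # a word: nothing left to expand
        return [tag]
    rhs = random.choice(grammar[tag])
    words = []
    for symbol in rhs:
        words.extend(generate(symbol))
    return words

print(' '.join(generate('S')))      # e.g. "I know you"
```

Because `S -> NP VP` forces a pronoun followed by a verb phrase, every generated sentence is two or three words drawn from the lexicon.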


Probabilistic Context-Free Grammars

Grammar Rules:                Lexicon:

  S  -> NP VP                   PRP -> I
  NP -> PRP                     PRP -> you
  VP -> VB       (0.2)          VB  -> know
  VP -> VB NP    (0.7)          VB  -> help
  VP -> MD VP    (0.1)          MD  -> can

Generating "I can help you" by expanding tags at random:

  S
  NP VP
  PRP MD VP
  I can VB NP
  I can help PRP
  I can help you
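A probabilistic grammar replaces the uniform choice with a weighted one. The sketch below samples from the grammar above; the VP probabilities (0.2, 0.7, 0.1) come from the slide, while the remaining rules are given uniform probabilities here as an assumption.

```python
import random

# Tag -> list of (right-hand side, probability). VP weights are from the
# slide; the others are assumed uniform for this sketch.
pcfg = {
    'S':   [(['NP', 'VP'], 1.0)],
    'NP':  [(['PRP'], 1.0)],
    'VP':  [(['VB'], 0.2), (['VB', 'NP'], 0.7), (['MD', 'VP'], 0.1)],
    'PRP': [(['I'], 0.5), (['you'], 0.5)],
    'VB':  [(['know'], 0.5), (['help'], 0.5)],
    'MD':  [(['can'], 1.0)],
}

def sample(tag):
    """Sample an expansion of tag; return (words, probability of the derivation)."""
    if tag not in pcfg:
        return [tag], 1.0
    rules = pcfg[tag]
    rhs, p = random.choices(rules, weights=[pr for _, pr in rules])[0]
    words, prob = [], p
    for symbol in rhs:
        w, q = sample(symbol)
        words.extend(w)
        prob *= q
    return words, prob

words, prob = sample('S')
print(' '.join(words), prob)
```

The returned probability is the product of the probabilities of every rule used, which is exactly the quantity a parser maximizes on the next slide.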

Learning Probabilistic Context-Free Grammars

(Demo)

Parsing with Probabilistic Context-Free Grammars


Parsing is Maximizing Likelihood

A probabilistic context-free grammar can be used to select a parse for a sentence.

  time flies like an arrow
  fruit flies like bananas

Parse by finding the tree with the highest total probability that yields the sentence.

Algorithm: Try every rule over every span. Match the lexicon to each word.

  time flies like an arrow
 0    1     2    3  4     5

  Lexicon matches:  NN -> time   VBZ -> flies   IN -> like   DT -> an   NN -> arrow

  Rules over spans: S -> NP VP   NP -> NN   VP -> VBZ PP   PP -> IN NP   NP -> DT NN

(Demo)
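The "try every rule over every span" idea can be sketched as a memoized recursion on spans, in the spirit of CKY parsing. This is an illustration only: the rule probabilities below are assumptions (the slide gives none for this grammar), and the function returns just the best probability, not the tree itself.

```python
from functools import lru_cache

# Lexicon and rules from the slide, with assumed probabilities.
lexicon = {('NN', 'time'): 0.1, ('VBZ', 'flies'): 0.2, ('IN', 'like'): 0.3,
           ('DT', 'an'): 0.4, ('NN', 'arrow'): 0.1}
binary = {('S', ('NP', 'VP')): 1.0, ('VP', ('VBZ', 'PP')): 0.5,
          ('PP', ('IN', 'NP')): 1.0, ('NP', ('DT', 'NN')): 0.4}
unary = {('NP', 'NN'): 0.3}

words = 'time flies like an arrow'.split()

@lru_cache(maxsize=None)
def best(tag, i, j):
    """Highest probability of deriving words[i:j] from tag (0 if impossible)."""
    if j - i == 1 and (tag, words[i]) in lexicon:
        return lexicon[tag, words[i]]
    score = 0.0
    for (parent, child), p in unary.items():       # unary rules over the whole span
        if parent == tag:
            score = max(score, p * best(child, i, j))
    for (parent, (left, right)), p in binary.items():
        if parent == tag:
            for k in range(i + 1, j):              # every way to split the span
                score = max(score, p * best(left, i, k) * best(right, k, j))
    return score

print(best('S', 0, len(words)))
```

With these assumed numbers, the only successful derivation is NN/NP "time" + VBZ "flies" + PP "like an arrow", so the result is the product of those rule probabilities.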

Tree Transformations


Reordering Modal Arguments

English vs. Yoda-English: "Help you, I can! Yes! Mm!"

"When 900 years old you reach, look as good, you will not. Hm."

[Figure: the parse tree for "I can help you" (S -> NP VP; NP -> PRP "I"; VP -> MD "can" VP; VP -> VB "help" PRP "you") is transformed by moving the modal's VP complement to the front of the sentence, followed by a comma, so the tree then yields "Help you, I can".]

(Demo)
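The reordering suggested by the slide can be sketched as a tree transformation: move the complement of a modal verb to the front, set off by a comma. The tree representation and the transform below are illustrative assumptions, not the course's demo code; trees are (tag, children) pairs and leaves are (tag, word).

```python
def leaves(tree):
    """The words at the leaves of a tree, left to right."""
    tag, rest = tree
    if isinstance(rest, str):
        return [rest]
    return [w for child in rest for w in leaves(child)]

def yodify(s):
    """Reorder an (S (NP ...) (VP MD VP)) tree into Yoda-English."""
    tag, (np, vp) = s
    vp_tag, (md, inner_vp) = vp
    if vp_tag == 'VP' and md[0] == 'MD':
        front = ' '.join(leaves(inner_vp))        # the modal's VP complement
        rest = ' '.join(leaves(np) + leaves(md))  # subject, then the modal
        return front.capitalize() + ', ' + rest + '!'
    return ' '.join(leaves(s)).capitalize() + '.'

# The parse tree of "I can help you" from the slide.
tree = ('S', [('NP', [('PRP', 'I')]),
              ('VP', [('MD', 'can'),
                      ('VP', [('VB', 'help'), ('PRP', 'you')])])])
print(yodify(tree))  # Help you, I can!
```

Because the transform operates on constituents rather than words, it moves "help you" as a unit, which is what makes the syntactic parse (not just the word sequence) the right input for this kind of rewriting.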