Nov 17, 2005 Learning-based MT 1
Learning-based MT Approaches for Languages with Limited Resources

Alon Lavie
Language Technologies Institute
Carnegie Mellon University

Joint work with: Jaime Carbonell, Lori Levin, Kathrin Probst, Erik Peterson, Christian Monson, Ariadna Font-Llitjos, Alison Alvarez, Roberto Aranovich
Nov 17, 2005 Learning-based MT 2
Outline
• Rationale for learning-based MT
• Roadmap for learning-based MT
• Framework overview
• Elicitation
• Learning transfer rules
• Automatic rule refinement
• Learning morphology
• Example prototypes
• Implications for MT with vast parallel data
• Conclusions and future directions
Nov 17, 2005 Learning-based MT 3
Machine Translation: Where are we today?
• Age of Internet and Globalization – great demand for MT:
  – Multiple official languages of UN, EU, Canada, etc.
  – Documentation dissemination for large manufacturers (Microsoft, IBM, Caterpillar)
• Economic incentive is still primarily within a small number of language pairs
• Some fairly good commercial products in the market for these language pairs
  – Primarily a product of rule-based systems after many years of development
• Pervasive MT between most language pairs still non-existent and not on the immediate horizon
Nov 17, 2005 Learning-based MT 4
Approaches to MT: Vauquois MT Triangle

Example: Mi chiamo Alon Lavie → My name is Alon Lavie
Interlingua: Give-information+personal-data (name=alon_lavie)
Source analysis: [s [vp accusative_pronoun “chiamare” proper_name]]
Target generation: [s [np [possessive_pronoun “name”]] [vp “be” proper_name]]
[Triangle figure: translation paths Direct, Transfer, and Interlingua, with Analysis up the source side and Generation down the target side]
Nov 17, 2005 Learning-based MT 5
Progression of MT
• Started with rule-based systems
  – Very large expert human effort to construct language-specific resources (grammars, lexicons)
  – High-quality MT extremely expensive → only feasible for a handful of language pairs
• Along came EBMT and then SMT…
  – Replaced human effort with extremely large volumes of parallel text data
  – Less expensive, but still only feasible for a small number of language pairs
  – We “traded” human labor for data
• Where does this take us in 5–10 years?
  – Large parallel corpora for maybe 25–50 language pairs
• What about all the other languages?
• Is all this data (with very shallow representation of language structure) really necessary?
• Can we build MT approaches that learn deeper levels of language structure and how they map from one language to another?
Nov 17, 2005 Learning-based MT 6
Why Machine Translation for Languages with Limited Resources?
• We are in the age of information explosion
  – The internet + web + Google → anyone can get the information they want anytime…
• But what about the text in all those other languages?
  – How do they read all this English stuff?
  – How do we read all the stuff that they put online?
• MT for these languages would enable:
  – Better government access to native, indigenous, and minority communities
  – Better minority and native community participation in information-rich activities (health care, education, government) without giving up their languages
  – Civilian and military applications (disaster relief)
  – Language preservation
Nov 17, 2005 Learning-based MT 7
The Roadmap to Learning-based MT
• Automatic acquisition of necessary language resources and knowledge using machine learning methodologies:
  – Learning morphology (analysis/generation)
  – Rapid acquisition of broad-coverage word-to-word and phrase-to-phrase translation lexicons
  – Learning of syntactic structural mappings
    • Tree-to-tree structure transformations [Knight et al], [Eisner], [Melamed] require parse trees for both languages
    • Learning syntactic transfer rules with resources (grammar, parses) for just one of the two languages
  – Automatic rule refinement and/or post-editing
• A framework for integrating the acquired MT resources into effective MT prototype systems
• Effective integration of acquired knowledge with statistical/distributional information
Nov 17, 2005 Learning-based MT 8
CMU’s AVENUE Approach
• Elicitation: use bilingual native informants to produce a small high-quality word-aligned bilingual corpus of translated phrases and sentences
  – Building elicitation corpora from feature structures
  – Feature detection and navigation
• Transfer-rule learning: apply ML-based methods to automatically acquire syntactic transfer rules for translation between the two languages
  – Learn from major language to minor language
  – Translate from minor language to major language
• XFER + Decoder:
  – XFER engine produces a lattice of possible transferred structures at all levels
  – Decoder searches and selects the best-scoring combination
• Rule refinement: refine the acquired rules via a process of interaction with bilingual informants
• Morphology learning
• Word and phrase bilingual lexicon acquisition
Nov 17, 2005 Learning-based MT 9
AVENUE MT Approach
[Vauquois-triangle figure: Source (e.g. Quechua) → Target (e.g. English); levels: Syntactic Parsing, Semantic Analysis, Interlingua, Sentence Planning, Text Generation; translation paths: Direct (SMT, EBMT) and Transfer Rules; AVENUE: automate rule learning]
Nov 17, 2005 Learning-based MT 10
AVENUE Architecture
[Architecture figure: word-aligned elicited data feeds the Learning Module, which produces Transfer Rules; the Run Time Transfer System applies the Transfer Rules and Translation Lexicon, and the Lattice Decoder selects outputs using an English Language Model and word-to-word translation probabilities]
Example learned rule:
{PP,4894}  ;;Score: 0.0470
PP::PP [NP POSTP] -> [PREP NP]
((X2::Y1)
 (X1::Y2))
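To make the decoder component concrete: the run-time transfer system produces a lattice of scored translation fragments over spans of the input, and the decoder picks the best-scoring combination that covers the whole input. The sketch below is a minimal, hypothetical version of that search in Python; the edge format, the toy data, and the simple additive scoring (standing in for a real language-model score) are assumptions of this illustration, not the system's actual implementation.

from collections import defaultdict

# Each lattice edge covers input positions [start, end) and carries a
# candidate translation fragment with a score (higher is better).
edges = [
    (0, 1, "IN", -1.0), (1, 2, "THE", -1.0), (2, 3, "LINE", -1.2),
    (1, 3, "THE LINE", -1.8), (0, 3, "IN LINE", -3.5),
    (0, 4, "IN THE NEXT LINE", -3.0), (3, 4, "NEXT", -1.1),
]
n = 4  # number of source positions

def decode(edges, n):
    """Dynamic program over the lattice: best[i] = best way to cover input[0:i]."""
    by_start = defaultdict(list)
    for s, e, text, score in edges:
        by_start[s].append((e, text, score))
    best = {0: (0.0, [])}            # position -> (score, fragments so far)
    for i in range(n):
        if i not in best:
            continue                 # position not reachable
        base_score, base_frags = best[i]
        for e, text, score in by_start[i]:
            cand = (base_score + score, base_frags + [text])
            if e not in best or cand[0] > best[e][0]:
                best[e] = cand
    return best.get(n)

score, fragments = decode(edges, n)
print(score, " ".join(fragments))    # the single long edge wins on this toy data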
Nov 17, 2005 Learning-based MT 11
Outline
• Rationale for learning-based MT
• Roadmap for learning-based MT
• Framework overview
• Elicitation
• Learning transfer rules
• Automatic rule refinement
• Learning morphology
• Example prototypes
• Implications for MT with vast parallel data
• Conclusions and future directions
Nov 17, 2005 Learning-based MT 12
Data Elicitation for Languages with Limited Resources
• Rationale:
  – Large volumes of parallel text not available → create a small maximally-diverse parallel corpus that directly supports the learning task
  – Bilingual native informant(s) can translate and align a small pre-designed elicitation corpus, using an elicitation tool
  – Elicitation corpus designed to be typologically and structurally comprehensive and compositional
  – Transfer-rule engine and new learning approach support acquisition of generalized transfer rules from the data
Nov 17, 2005 Learning-based MT 13
Elicitation Tool: English-Chinese Example
Nov 17, 2005 Learning-based MT 14
Elicitation Tool: English-Chinese Example
Nov 17, 2005 Learning-based MT 15
Elicitation Tool: English-Hindi Example
Nov 17, 2005 Learning-based MT 16
Elicitation Tool: English-Arabic Example
Nov 17, 2005 Learning-based MT 17
Elicitation Tool: Spanish-Mapudungun Example
Nov 17, 2005 Learning-based MT 18
Designing Elicitation Corpora
• What do we want to elicit?
  – Diversity of linguistic phenomena and constructions
  – Syntactic structural diversity
• How do we construct an elicitation corpus?
  – Typological Elicitation Corpus based on elicitation and documentation work of field linguists (e.g. Comrie 1977, Bouquiaux 1992): initial corpus size ~1000 examples
  – Structural Elicitation Corpus based on a representative sample of English phrase structures: ~120 examples
• Organized compositionally: elicit simple structures first, then use them as building blocks
• Goal: minimize size, maximize linguistic coverage
Nov 17, 2005 Learning-based MT 19
Typological Elicitation Corpus
• Feature detection
  – Discover what features exist in the language and where/how they are marked
    • Example: does the language mark gender of nouns? How and where are these marked?
  – Method: compare translations of minimal pairs – sentences that differ in only ONE feature
• Elicit translations/alignments for detected features and their combinations
• Dynamic corpus navigation based on feature detection: no need to elicit for combinations involving non-existent features
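As a rough illustration of the minimal-pair comparison (not the actual detection code, and with invented example data): if the informant's translations of a minimal pair come back identical, the feature under test is presumably unmarked in the target language; if they differ, the differing positions hint at where and how it is marked.

def feature_marked(translation_a, translation_b):
    """Compare the informant's translations of a minimal pair.

    Returns (is_marked, differing_word_pairs): identical translations suggest
    the feature is not marked in the target language; differing positions
    suggest where it is marked.
    """
    tok_a, tok_b = translation_a.split(), translation_b.split()
    if tok_a == tok_b:
        return False, []
    diffs = [(a, b) for a, b in zip(tok_a, tok_b) if a != b]
    if len(tok_a) != len(tok_b):
        # Extra or missing material (e.g. an added particle) also counts.
        diffs.append(("<length differs>", "<length differs>"))
    return True, diffs

# Hypothetical pair probing gender agreement: the adjective form changes,
# suggesting gender is marked on adjectives in this (toy) target language.
print(feature_marked("el nino es alto", "la nina es alta"))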
Nov 17, 2005 Learning-based MT 20
Typological Elicitation Corpus
• Initial typological corpus of about 1000 sentences was manually constructed
• New construction methodology for building an elicitation corpus using:
  – A feature specification: lists the inventory of available features and their values
  – A definition of the set of desired feature structures
    • Schemas define sets of desired combinations of features and values
    • Multiplier algorithm generates the comprehensive set of feature structures
  – A generation grammar and lexicon: NLG generator generates NL sentences from the feature structures
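A minimal sketch of what the multiplier step might look like, assuming a schema is simply the list of features whose values should be crossed; the feature names and values below are invented for illustration, and the real system additionally supports constraints among features.

from itertools import product

# Hypothetical feature specification: each feature with its possible values.
feature_spec = {
    "num":    ["sg", "pl"],
    "person": ["1", "2", "3"],
    "tense":  ["past", "present"],
}

def multiply(schema, spec):
    """Generate every feature structure (as a dict) licensed by a schema.

    A schema here is just the list of features whose values are crossed.
    """
    features = [(f, spec[f]) for f in schema]
    names = [f for f, _ in features]
    for combo in product(*(vals for _, vals in features)):
        yield dict(zip(names, combo))

# Cross number, person, and tense: 2 x 3 x 2 = 12 feature structures, each of
# which the generation grammar would then turn into an English sentence.
for fs in multiply(["num", "person", "tense"], feature_spec):
    print(fs)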
Nov 17, 2005 Learning-based MT 21
Structural Elicitation Corpus
• Goal: create a compact, diverse sample corpus of syntactic phrase structures in English in order to elicit how these map into the elicited language
• Methodology:
  – Extracted all CFG “rules” from the Brown section of the Penn TreeBank (122K sentences)
  – Simplified POS tag set
  – Constructed frequency histogram of extracted rules
  – Pulled out simplest phrases for the most frequent rules for NPs, PPs, ADJPs, ADVPs, SBARs and sentences
  – Some manual inspection and refinement
• Resulting corpus of about 120 phrases/sentences representing common structures
• See [Probst and Lavie, 2004]
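A rough sketch of the rule-extraction and histogram step, assuming parse trees are given as nested (label, children) tuples rather than in Penn TreeBank bracket format:

from collections import Counter

def productions(tree):
    """Yield CFG productions (parent, (child labels...)) from a nested tree."""
    label, children = tree
    if isinstance(children, str):          # pre-terminal: label over a word
        return
    yield (label, tuple(child[0] for child in children))
    for child in children:
        yield from productions(child)

# A toy parsed sentence: (S (NP (DET the) (N dog)) (VP (V barked)))
tree = ("S", [("NP", [("DET", "the"), ("N", "dog")]),
              ("VP", [("V", "barked")])])

histogram = Counter()
for rule in productions(tree):
    histogram[rule] += 1

# Over a whole treebank, the most frequent NP, PP, ADJP, ... rules are the
# ones for which the simplest example phrases are pulled out.
print(histogram.most_common())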
Nov 17, 2005 Learning-based MT 22
Outline
• Rationale for learning-based MT
• Roadmap for learning-based MT
• Framework overview
• Elicitation
• Learning transfer rules
• Automatic rule refinement
• Learning morphology
• Example prototypes
• Implications for MT with vast parallel data
• Conclusions and future directions
Nov 17, 2005 Learning-based MT 23
Transfer Rule Formalism
(annotated on the slide: type information; part-of-speech/constituent information; alignments; x-side constraints; y-side constraints; xy-constraints, e.g. ((Y1 AGR) = (X1 AGR)))

;SL: the old man, TL: ha-ish ha-zaqen
NP::NP [DET ADJ N] -> [DET N DET ADJ]
((X1::Y1)
 (X1::Y3)
 (X2::Y4)
 (X3::Y2)
 ((X1 AGR) = *3-SING)
 ((X1 DEF) = *DEF)
 ((X3 AGR) = *3-SING)
 ((X3 COUNT) = +)
 ((Y1 DEF) = *DEF)
 ((Y3 DEF) = *DEF)
 ((Y2 AGR) = *3-SING)
 ((Y2 GENDER) = (Y4 GENDER)))
Nov 17, 2005 Learning-based MT 24
Transfer Rule Formalism (II)
(annotated on the slide: value constraints; agreement constraints)

;SL: the old man, TL: ha-ish ha-zaqen
NP::NP [DET ADJ N] -> [DET N DET ADJ]
((X1::Y1)
 (X1::Y3)
 (X2::Y4)
 (X3::Y2)
 ((X1 AGR) = *3-SING)
 ((X1 DEF) = *DEF)
 ((X3 AGR) = *3-SING)
 ((X3 COUNT) = +)
 ((Y1 DEF) = *DEF)
 ((Y3 DEF) = *DEF)
 ((Y2 AGR) = *3-SING)
 ((Y2 GENDER) = (Y4 GENDER)))
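Read as a data structure, a rule like the one above bundles a type signature, the two constituent sequences, the alignments, and the constraints. A minimal sketch of such a container in Python (the field names and the constraint encoding are mine, not the system's):

from dataclasses import dataclass, field

@dataclass
class TransferRule:
    lhs_type: str          # source-side constituent type, e.g. "NP"
    rhs_type: str          # target-side constituent type, e.g. "NP"
    x_side: list           # source constituent sequence
    y_side: list           # target constituent sequence
    alignments: list       # (x_index, y_index) pairs, 1-based as on the slide
    constraints: list = field(default_factory=list)
    # Each constraint is (path, value_or_path): a value constraint such as
    # (("X1", "DEF"), "*DEF"), or an agreement constraint such as
    # (("Y2", "GENDER"), ("Y4", "GENDER")).

old_man = TransferRule(
    lhs_type="NP", rhs_type="NP",
    x_side=["DET", "ADJ", "N"],
    y_side=["DET", "N", "DET", "ADJ"],
    alignments=[(1, 1), (1, 3), (2, 4), (3, 2)],
    constraints=[
        (("X1", "AGR"), "*3-SING"), (("X1", "DEF"), "*DEF"),
        (("X3", "AGR"), "*3-SING"), (("X3", "COUNT"), "+"),
        (("Y1", "DEF"), "*DEF"), (("Y3", "DEF"), "*DEF"),
        (("Y2", "AGR"), "*3-SING"), (("Y2", "GENDER"), ("Y4", "GENDER")),
    ],
)
print(old_man.lhs_type, old_man.x_side, "->", old_man.y_side)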
Nov 17, 2005 Learning-based MT 25
Rule Learning - Overview
• Goal: acquire syntactic transfer rules
• Use available knowledge from the source side (grammatical structure)
• Three steps:
  1. Flat Seed Generation: first guesses at transfer rules; flat syntactic structure
  2. Compositionality Learning: use previously learned rules to learn hierarchical structure
  3. Constraint Learning: refine rules by learning appropriate feature constraints
Nov 17, 2005 Learning-based MT 26
Flat Seed Rule Generation

Learning example (NP):
  Eng: the big apple
  Heb: ha-tapuax ha-gadol

Generated seed rule:
NP::NP [ART ADJ N] -> [ART N ART ADJ]
((X1::Y1)
 (X1::Y3)
 (X2::Y4)
 (X3::Y2))
Nov 17, 2005 Learning-based MT 27
Flat Seed Rule Generation
• Create a “flat” transfer rule specific to the sentence pair, partially abstracted to POS
  – Words that are aligned word-to-word and have the same POS in both languages are generalized to their POS
  – Words that have complex alignments (or not the same POS) remain lexicalized
• One seed rule for each translation example
• No feature constraints associated with seed rules (but mark the example(s) from which each rule was learned)
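A minimal sketch of the seed-generation step for the “the big apple” example above, under the simplifying assumption that the input is already POS-tagged and word-aligned; the generalization criterion used here (generalize a word whenever every word it aligns to shares its POS) is an approximation of the criterion stated on this slide:

def flat_seed_rule(src, tgt, alignments):
    """Build a flat seed rule from one tagged, word-aligned example.

    src, tgt: lists of (word, POS) pairs.
    alignments: (src_index, tgt_index) pairs, 0-based.
    """
    x_side = [word for word, _ in src]
    y_side = [word for word, _ in tgt]
    for i, (word, pos) in enumerate(src):
        partners = [tgt[j][1] for (a, j) in alignments if a == i]
        if partners and all(p == pos for p in partners):
            x_side[i] = pos               # generalize to POS
    for j, (word, pos) in enumerate(tgt):
        partners = [src[i][1] for (i, b) in alignments if b == j]
        if partners and all(p == pos for p in partners):
            y_side[j] = pos
    aligns = [(i + 1, j + 1) for i, j in alignments]   # 1-based, as on the slides
    return x_side, y_side, aligns

src = [("the", "ART"), ("big", "ADJ"), ("apple", "N")]
tgt = [("ha", "ART"), ("tapuax", "N"), ("ha", "ART"), ("gadol", "ADJ")]
alignments = [(0, 0), (0, 2), (1, 3), (2, 1)]
print(flat_seed_rule(src, tgt, alignments))
# -> (['ART', 'ADJ', 'N'], ['ART', 'N', 'ART', 'ADJ'], [(1, 1), (1, 3), (2, 4), (3, 2)])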
Nov 17, 2005 Learning-based MT 28
Compositionality Learning

Initial flat rules:
S::S [ART ADJ N V ART N] -> [ART N ART ADJ V P ART N]
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2) (X4::Y5) (X5::Y7) (X6::Y8))

NP::NP [ART ADJ N] -> [ART N ART ADJ]
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))

NP::NP [ART N] -> [ART N]
((X1::Y1) (X2::Y2))

Generated compositional rule:
S::S [NP V NP] -> [NP V P NP]
((X1::Y1) (X2::Y2) (X3::Y4))
Nov 17, 2005 Learning-based MT 29
Compositionality Learning
• Detection: traverse the c-structure of the English sentence, add compositional structure for translatable chunks
• Generalization: adjust constituent sequences and alignments
• Two implemented variants:
  – Safe Compositionality: there exists a transfer rule that correctly translates the sub-constituent
  – Maximal Compositionality: generalize the rule if supported by the alignments, even in the absence of an existing transfer rule for the sub-constituent
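A toy sketch of the generalization step for the example on the previous slide, assuming rules are reduced to (constituent sequence, alignment) pairs and that the chunk spans covered by already-learned NP rules have been identified; the representation is deliberately simplified relative to the real system:

def compose(x_side, y_side, alignments, chunks):
    """Collapse translatable chunks into constituent labels.

    x_side, y_side: constituent sequences (positions are 1-based in alignments).
    alignments: (x_pos, y_pos) pairs.
    chunks: list of (x_span, y_span, label); spans are inclusive 1-based
    (start, end) pairs covered by an already-learned rule of that label.
    """
    def collapse(side, spans_with_labels):
        covered = {}
        for (start, end), label in spans_with_labels:
            for p in range(start, end + 1):
                covered[p] = (start, label)
        new_side, pos_map = [], {}
        for i in range(1, len(side) + 1):
            if i in covered and covered[i][0] == i:   # chunk begins here
                new_side.append(covered[i][1])
                pos_map[i] = len(new_side)
            elif i in covered:                         # interior of a chunk
                pos_map[i] = pos_map[covered[i][0]]
            else:
                new_side.append(side[i - 1])
                pos_map[i] = len(new_side)
        return new_side, pos_map

    new_x, x_map = collapse(x_side, [(xs, lab) for xs, _, lab in chunks])
    new_y, y_map = collapse(y_side, [(ys, lab) for _, ys, lab in chunks])
    new_aligns = sorted({(x_map[a], y_map[b]) for a, b in alignments})
    return new_x, new_y, new_aligns

# The flat S rule from the previous slide, plus the two NP chunks it contains.
x = ["ART", "ADJ", "N", "V", "ART", "N"]
y = ["ART", "N", "ART", "ADJ", "V", "P", "ART", "N"]
aligns = [(1, 1), (1, 3), (2, 4), (3, 2), (4, 5), (5, 7), (6, 8)]
chunks = [((1, 3), (1, 4), "NP"), ((5, 6), (7, 8), "NP")]
print(compose(x, y, aligns, chunks))
# -> (['NP', 'V', 'NP'], ['NP', 'V', 'P', 'NP'], [(1, 1), (2, 2), (3, 4)])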
Nov 17, 2005 Learning-based MT 30
Constraint Learning

Input: rules and their example sets
S::S [NP V NP] -> [NP V P NP]            {ex1,ex12,ex17,ex26}
((X1::Y1) (X2::Y2) (X3::Y4))
NP::NP [ART ADJ N] -> [ART N ART ADJ]    {ex2,ex3,ex13}
((X1::Y1) (X1::Y3) (X2::Y4) (X3::Y2))
NP::NP [ART N] -> [ART N]                {ex4,ex5,ex6,ex8,ex10,ex11}
((X1::Y1) (X2::Y2))

Output: rules with feature constraints
S::S [NP V NP] -> [NP V P NP]
((X1::Y1) (X2::Y2) (X3::Y4)
 ((X1 NUM) = (X2 NUM))
 ((Y1 NUM) = (Y2 NUM))
 ((X1 NUM) = (Y1 NUM)))
Nov 17, 2005 Learning-based MT 31
Constraint Learning
• Goal: add appropriate feature constraints to the acquired rules
• Methodology:
  – Preserve general structural transfer
  – Learn specific feature constraints from the example set
• Seed rules are grouped into clusters of similar transfer structure (type, constituent sequences, alignments)
• Each cluster forms a version space: a partially ordered hypothesis space with a specific and a general boundary
• The seed rules in a group form the specific boundary of a version space
• The general boundary is the (implicit) transfer rule with the same type, constituent sequences, and alignments, but no feature constraints
Nov 17, 2005 Learning-based MT 32
Constraint Learning: Generalization
• The partial order of the version space:
  Definition: a transfer rule tr1 is strictly more general than another transfer rule tr2 if all f-structures that are satisfied by tr2 are also satisfied by tr1.
• Generalize rules by merging them:
  – Deletion of a constraint
  – Raising two value constraints to an agreement constraint, e.g.
    ((x1 num) = *pl), ((x3 num) = *pl) → ((x1 num) = (x3 num))
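A small sketch of the two merge operations, assuming constraints are encoded as (path, value-or-path) pairs; the co-variation test here is reduced to simple value equality across the two rules, which is cruder than a real version-space merge:

def generalize(constraints_a, constraints_b):
    """Merge two rules' constraint sets into a more general set.

    - A constraint appearing in only one rule is dropped (deletion).
    - Two shared value constraints on different paths with the same value are
      raised to a single agreement constraint between the paths.
    """
    shared = [c for c in constraints_a if c in constraints_b]
    values = [(path, rhs) for path, rhs in shared if isinstance(rhs, str)]
    agreements, used = [], set()
    for i, (p1, v1) in enumerate(values):
        for p2, v2 in values[i + 1:]:
            if v1 == v2 and p1 not in used and p2 not in used:
                agreements.append((p1, p2))
                used.update([p1, p2])
    kept = [(p, v) for p, v in shared if not isinstance(v, str) or p not in used]
    return kept + agreements

a = [(("x1", "num"), "*pl"), (("x3", "num"), "*pl"), (("x1", "def"), "*def")]
b = [(("x1", "num"), "*pl"), (("x3", "num"), "*pl")]
print(generalize(a, b))
# The unshared def constraint is dropped; the two number constraints are
# raised to the agreement constraint (("x1","num"), ("x3","num")).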
Nov 17, 2005 Learning-based MT 33
Automated Rule Refinement
• Bilingual informants can identify translation errors and pinpoint the errors
• A sophisticated trace of the translation path can identify likely sources for the error and do “blame assignment”
• Rule refinement operators can be developed to modify the underlying translation grammar (and lexicon) based on characteristics of the error source:
  – Add or delete feature constraints from a rule
  – Bifurcate a rule into two rules (general and specific)
  – Add or correct lexical entries
• See [Font-Llitjos, Carbonell & Lavie, 2005]
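Schematically, these refinement operators can be thought of as functions over a rule's constraint set; the sketch below is only illustrative and does not reproduce the operators defined in the cited paper:

def add_constraint(rule_constraints, constraint):
    """Add a constraint identified as missing by the blame-assignment trace."""
    return rule_constraints + [constraint]

def delete_constraint(rule_constraints, constraint):
    """Drop an over-restrictive constraint from a rule."""
    return [c for c in rule_constraints if c != constraint]

def bifurcate(rule_constraints, extra_constraint):
    """Split one rule into a general and a specific variant.

    The general rule keeps the original constraints; the specific rule adds
    the new constraint so it fires only in the narrower context that the
    informant's correction applies to.
    """
    general = list(rule_constraints)
    specific = rule_constraints + [extra_constraint]
    return general, specific

original = [(("x1", "def"), "*def")]
print(bifurcate(original, (("x1", "num"), "*pl")))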
Nov 17, 2005 Learning-based MT 34
Outline
• Rationale for learning-based MT
• Roadmap for learning-based MT
• Framework overview
• Elicitation
• Learning transfer rules
• Automatic rule refinement
• Learning morphology
• Example prototypes
• Implications for MT with vast parallel data
• Conclusions and future directions
Nov 17, 2005 Learning-based MT 35
Morphology Learning
• Goal: unsupervised learning of morphemes and their function from raw monolingual data
  – Segmentation of words into morphemes
  – Identification of morphological paradigms (inflections and derivations)
  – Learning the association between morphemes and their function in the language
• Organize the raw data in the form of a network of paradigm candidate schemes
• Search the network for a collection of schemes that represent true morphology paradigms of the language
• Learn mappings between the schemes and features/functions using minimal pairs of elicited data
• Construct an analyzer based on the collection of schemes and the acquired function mappings
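A minimal sketch of the first step, building candidate schemes from a raw vocabulary: every stem/suffix split of every word is proposed, and a scheme collects the c-stems that occur with every c-suffix in a given set. Only singleton and pairwise suffix sets are built here; the real network contains all levels.

from collections import defaultdict
from itertools import combinations

def candidate_schemes(vocabulary):
    """Map each candidate suffix set (frozenset of c-suffixes, "" = null
    suffix) to the set of c-stems that combine with every suffix in it."""
    suffixes_of_stem = defaultdict(set)
    for word in vocabulary:
        for cut in range(1, len(word) + 1):       # every stem/suffix split
            stem, suffix = word[:cut], word[cut:]
            suffixes_of_stem[stem].add(suffix)

    schemes = defaultdict(set)
    for stem, sufs in suffixes_of_stem.items():
        for s in sufs:
            schemes[frozenset([s])].add(stem)
        for pair in combinations(sorted(sufs), 2):
            schemes[frozenset(pair)].add(stem)
    return schemes

vocab = ["blame", "blamed", "blames", "roamed", "roaming",
         "roams", "solve", "solves", "solving"]
schemes = candidate_schemes(vocab)
# The scheme {"", "s"} collects the stems "blame" and "solve", mirroring the
# Ø.s node in the network on the following slides.
print(sorted(schemes[frozenset(["", "s"])]))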
Nov 17, 2005 Learning-based MT 36
[Figure: candidate-scheme network for an example vocabulary]
Example vocabulary: blame, blamed, blames, roamed, roaming, roams, solve, solves, solving
Candidate schemes include Ø.s (blame, solve), Ø.s.d (blame), s (blame, roam, solve), e.es (blam, solv), e.ed (blam), e.es.ed (blam), ed (blam, roam), d (blame, roame), me.mes (bla), me.mes.med (bla), and the Ø node containing all nine full words.
Nov 17, 2005 Learning-based MT 37
[Figure: candidate-scheme network built from a Spanish newswire corpus (40,011 tokens, 6,975 types); nodes include a.as.o.os (african, cas, jurídic, l, …), a.as.os, a.as.o, a.o.os, a.as, a.os, as.os, a.o, as.o, as.o.os, o.os, the single-suffix nodes a, as, o, os, and spurious nodes such as tro, a.tro, and a.as.o.os.tro]
Nov 17, 2005 Learning-based MT 38
[Figure: the same Spanish scheme network, annotated with its C-Suffixes and C-Stems; each node carries its C-Stem type count (e.g. a.as.o.os has 43 stems), and Level 5 = 5 c-suffixes (the node a.as.o.os.tro)]
Nov 17, 2005 Learning-based MT 39
[Figure: the sub-network a.as.o.os / a.as.o.os.tro / a.tro / tro, highlighting a.as.o.os (43 stems: african, cas, jurídic, l, …) as an adjective inflection class]
Nov 17, 2005 Learning-based MT 40
[Figure: the same network, highlighting the nodes that arise from the spurious c-suffix “tro” (tro, a.tro, a.as.o.os.tro) as distinct from the true paradigm candidates]
41
De
cre
asin
g C
-Ste
m C
oun
t
Incr
ea
sin
g C
-Su
ffix
Co
unt
Basic Search Procedure
Nov 17, 2005 Learning-based MT 42
Outline
• Rationale for learning-based MT
• Roadmap for learning-based MT
• Framework overview
• Elicitation
• Learning transfer rules
• Automatic rule refinement
• Learning morphology
• Example prototypes
• Implications for MT with vast parallel data
• Conclusions and future directions
Nov 17, 2005 Learning-based MT 43
AVENUE Prototypes
• General XFER framework under development for the past three years
• Prototype systems so far:
  – German-to-English, Dutch-to-English
  – Chinese-to-English
  – Hindi-to-English
  – Hebrew-to-English
• In progress or planned:
  – Mapudungun-to-Spanish
  – Quechua-to-Spanish
  – Arabic-to-English
  – Native-Brazilian languages to Brazilian Portuguese
Nov 17, 2005 Learning-based MT 44
Challenges for Hebrew MT
• Paucity of existing language resources for Hebrew
  – No publicly available broad-coverage morphological analyzer
  – No publicly available bilingual lexicons or dictionaries
  – No POS-tagged corpus or parse tree-bank corpus for Hebrew
  – No large Hebrew/English parallel corpus
• Scenario well suited for the CMU transfer-based MT framework for languages with limited resources
Nov 17, 2005 Learning-based MT 45
Hebrew-to-English MT Prototype
• Initial prototype developed within a two-month intensive effort
• Accomplished:
  – Adapted available morphological analyzer
  – Constructed a preliminary translation lexicon
  – Translated and aligned the Elicitation Corpus
  – Learned XFER rules
  – Developed a (small) manual XFER grammar as a point of comparison
  – System debugging and development
  – Evaluated performance on unseen test data using automatic evaluation metrics
[System walkthrough figure]
Source Input: בשורה הבאה
  → Preprocessing → Morphology → Transfer Engine → Decoder (with English Language Model) → English Output: in the next line

Transfer Rules:
{NP1,3}
NP1::NP1 [NP1 "H" ADJ] -> [ADJ NP1]
((X3::Y1) (X1::Y2)
 ((X1 def) = +)
 ((X1 status) =c absolute)
 ((X1 num) = (X3 num))
 ((X1 gen) = (X3 gen))
 (X0 = X1))

Translation Lexicon:
N::N |: ["$WR"] -> ["BULL"]
((X1::Y1) ((X0 NUM) = s) ((Y0 lex) = "BULL"))
N::N |: ["$WRH"] -> ["LINE"]
((X1::Y1) ((X0 NUM) = s) ((Y0 lex) = "LINE"))

Translation Output Lattice:
(0 1 "IN" @PREP) (1 1 "THE" @DET) (2 2 "LINE" @N)
(1 2 "THE LINE" @NP) (0 2 "IN LINE" @PP) (0 4 "IN THE NEXT LINE" @PP)
Nov 17, 2005 Learning-based MT 47
Morphology Example
• Input word: B$WRH

0     1     2     3     4
|--------B$WRH--------|
|-----B-----|$WR|--H--|
|--B--|-H--|--$WRH---|
Nov 17, 2005 Learning-based MT 48
Morphology Example
Y0: ((SPANSTART 0) (SPANEND 4) (LEX B$WRH) (POS N) (GEN F) (NUM S) (STATUS ABSOLUTE))
Y1: ((SPANSTART 0) (SPANEND 2) (LEX B) (POS PREP))
Y2: ((SPANSTART 1) (SPANEND 3) (LEX $WR) (POS N) (GEN M) (NUM S) (STATUS ABSOLUTE))
Y3: ((SPANSTART 3) (SPANEND 4) (LEX $LH) (POS POSS))
Y4: ((SPANSTART 0) (SPANEND 1) (LEX B) (POS PREP))
Y5: ((SPANSTART 1) (SPANEND 2) (LEX H) (POS DET))
Y6: ((SPANSTART 2) (SPANEND 4) (LEX $WRH) (POS N) (GEN F) (NUM S) (STATUS ABSOLUTE))
Y7: ((SPANSTART 0) (SPANEND 4) (LEX B$WRH) (POS LEX))
Nov 17, 2005 Learning-based MT 49
Sample Output (dev-data)
maxwell anurpung comes from ghana for israel four years ago and since worked in cleaning in hotels in eilat
a few weeks ago announced if management club hotel that for him to leave israel according to the government instructions and immigration police
in a letter in broken english which spread among the foreign workers thanks to them hotel for their hard work and announced that will purchase for hm flight tickets for their countries from their money
Nov 17, 2005 Learning-based MT 50
Evaluation Results
• Test set of 62 sentences from Haaretz newspaper, 2 reference translations
System BLEU NIST P R METEOR
No Gram 0.0616 3.4109 0.4090 0.4427 0.3298
Learned 0.0774 3.5451 0.4189 0.4488 0.3478
Manual 0.1026 3.7789 0.4334 0.4474 0.3617
Nov 17, 2005 Learning-based MT 51
Hebrew-English: Test Suite Evaluation
Grammar BLEU METEOR
Baseline (NoGram) 0.0996 0.4916
Learned Grammar 0.1608 0.5525
Manual Grammar 0.1642 0.5320
Nov 17, 2005 Learning-based MT 52
Outline
• Rationale for learning-based MT
• Roadmap for learning-based MT
• Framework overview
• Elicitation
• Learning transfer rules
• Automatic rule refinement
• Learning morphology
• Example prototypes
• Implications for MT with vast parallel data
• Conclusions and future directions
Nov 17, 2005 Learning-based MT 53
Implications for MT with Vast Amounts of Parallel Data
• Learning word/short-phrase translations vs. learning long phrase-to-phrase translations
• Phrase-to-phrase MT ill suited for long-range reorderings → ungrammatical output
• Recent work on hierarchical Stat-MT [Chiang, 2005] and parsing-based MT [Melamed et al, 2005]
• Learning general tree-to-tree syntactic mappings is equally problematic:
  – Meaning is a hybrid of complex, non-compositional phrases embedded within a syntactic structure
  – Some constituents can be translated in isolation, others require contextual mappings
Nov 17, 2005 Learning-based MT 54
Implications for MT with Vast Amounts of Parallel Data
• Our approach for learning transfer rules is applicable to the large-data scenario, subject to solutions for several challenges:
  – No elicitation corpus → break down parallel sentences into reasonable learning examples
  – Working with less reliable automatic word alignments rather than manual alignments
  – Effective use of reliable parse structures for ONE language (i.e. English) and automatic word alignments in order to decompose the translation of a sentence into several compositional rules
  – Effective scoring of resulting very large transfer grammars, and scaled-up transfer + decoding
Nov 17, 2005 Learning-based MT 55
Implications for MT with Vast Amounts of Parallel Data
• Example: 他 经常 与 江泽民 总统 通 电话
  Gloss:   He freq with J Zemin Pres via phone
  English: He freq talked with President J Zemin over the phone
Nov 17, 2005 Learning-based MT 56
Implications for MT with Vast Amounts of Parallel Data
• Example: 他 经常 与 江泽民 总统 通 电话
  Gloss:   He freq with J Zemin Pres via phone
  English: He freq talked with President J Zemin over the phone
[Figure: the corresponding noun phrases NP1, NP2, NP3 are bracketed on both the Chinese sentence and its English translation]
Nov 17, 2005 Learning-based MT 57
Conclusions
• There is hope yet for wide-spread MT between many of the world’s language pairs
• MT offers a fertile yet extremely challenging ground for learning-based approaches that leverage diverse sources of information:
  – Syntactic structure of one or both languages
  – Word-to-word correspondences
  – Decomposable units of translation
  – Statistical language models
• Provides a feasible solution to MT for languages with limited resources
• Extremely promising approach for addressing the fundamental weaknesses in current corpus-based MT for languages with vast resources
Nov 17, 2005 Learning-based MT 58
Future Research Directions
• Automatic transfer rule learning:
  – In the “large-data” scenario: from large volumes of automatically word-aligned, uncontrolled parallel text
  – In the absence of morphology or POS-annotated lexica
  – Learning mappings for non-compositional structures
  – Effective models for rule scoring for:
    • Decoding: using scores at runtime
    • Pruning the large collections of learned rules
  – Learning unification constraints
• Integrated XFER engine and decoder
  – Improved models for scoring tree-to-tree mappings, integration with the LM and other knowledge sources in the course of the search
Nov 17, 2005 Learning-based MT 59
Future Research Directions
• Automatic rule refinement
• Morphology learning
• Feature detection and corpus navigation
• …
Nov 17, 2005 Learning-based MT 60
Nov 17, 2005 Learning-based MT 61
Mapudungun-to-Spanish Example
Mapudungun: pelafiñ Maria
Spanish:    No vi a María
English:    I didn’t see Maria
Nov 17, 2005 Learning-based MT 62
Mapudungun-to-Spanish Example
Mapudungun: pelafiñ Maria
            pe -la -fi -ñ                       Maria
            see -neg -3.obj -1.subj.indicative  Maria
Spanish:    No vi a María
            neg see.1.subj.past.indicative acc  María
English:    I didn’t see Maria
Nov 17, 2005 Learning-based MT 63
[Figure sequence, slides 63–84: step-by-step analysis of Mapudungun “pe-la-fi-ñ Maria” and top-down transfer to Spanish “No vi a María”]

Bottom-up analysis of pe-la-fi-ñ Maria:
• V “pe” combines with VSuff “la” (negation = +); all features are passed up to a VSuffG node
• VSuff “fi” (object person = 3) is added; all features are passed up from both children
• VSuff “ñ” (person = 1, number = sg, mood = ind) is added; all features are passed up from both children
• The resulting V checks that negation = + and that tense is undefined
• NP → N “Maria” (person = 3, number = sg, human = +); the VP rule checks that the NP is human = +, and features are passed up to build the S

Transfer to Spanish: top-down
• The S and VP structures are transferred; all features are passed to the Spanish side and down the tree, and object features are passed down
• The accusative marker “a” is introduced on the object because human = +, via the rule:

VP::VP [VBar NP] -> [VBar "a" NP]
((X1::Y1)
 (X2::Y3)
 ((X2 type) = (*NOT* personal))
 ((X2 human) =c +)
 (X0 = X1)
 ((X0 object) = X2)
 (Y0 = X0)
 ((Y0 object) = (X0 object))
 (Y1 = Y0)
 (Y3 = (Y0 object))
 ((Y1 objmarker person) = (Y3 person))
 ((Y1 objmarker number) = (Y3 number))
 ((Y1 objmarker gender) = (Y3 gender)))

• Person, number, and mood features are passed to the Spanish verb and tense = past is assigned; “no” is introduced because negation = +
• The verb “ver” surfaces as “vi” (person = 1, number = sg, mood = indicative, tense = past)
• Features are passed over to the Spanish side for the object NP, giving “María” and the final output: No vi a María (“I didn’t see Maria”)
Nov 17, 2005 Learning-based MT 85