
Page 1: Computational Historical Linguistics

Computational Historical Linguistics

Gerhard Jäger

Current Trends in Linguistics

November 3, 2016

Gerhard Jager Computational Historical Linguistics November 3, 2016 1 / 99

Page 2: Computational Historical Linguistics

Similarity between languages

Gerhard Jager Computational Historical Linguistics November 3, 2016 2 / 99

Page 3: Computational Historical Linguistics

Similarity between languages

Gerhard Jager Computational Historical Linguistics November 3, 2016 3 / 99

Page 4: Computational Historical Linguistics

Similarity between languages

Gerhard Jager Computational Historical Linguistics November 3, 2016 4 / 99

Page 5: Computational Historical Linguistics

Sound laws

Gerhard Jager Computational Historical Linguistics November 3, 2016 5 / 99

Page 6: Computational Historical Linguistics

Sound laws

sound laws are specific for a particular period in language change

they hold nearly universally for all occurrences of the sound in question in the language in question

ideally we have written records of both stages (Latin/Romance languages, Old High German, Middle High German)

in most cases, sound laws must be reconstructed via systematic comparison of related languages

applying sound laws backwards leads to reconstructed vocabulary of the common mother language

Gerhard Jager Computational Historical Linguistics November 3, 2016 6 / 99

Page 7: Computational Historical Linguistics

Language trees

comparative method gives rise to phylogenetic trees of historic development

Gerhard Jager Computational Historical Linguistics November 3, 2016 7 / 99

Page 8: Computational Historical Linguistics

Limits of the comparative method

Similarities between languages may be due to horizontal transfer (loans)

limited time depth (≤ 10,000 years)

Gerhard Jager Computational Historical Linguistics November 3, 2016 8 / 99

Page 9: Computational Historical Linguistics

Limits of the comparative method

Similarities between languages may be due to horizontal transfer (loans)

limited time depth (≤ 10,000 years)

Gerhard Jager Computational Historical Linguistics November 3, 2016 9 / 99

Page 10: Computational Historical Linguistics

Deep genetic relationships

Plethora of proposals beyond well-established families: Nostratic

proposed by Pedersen (1903)
original proposal: Indo-European, Finno-Ugric, Samoyed, Turkish, Mongolian, Manchu, Yukaghir, Eskimo, Semitic, and Hamitic
revived by the "Moscow school" in the 1960s
traditional comparative method, including reconstruction of proto-forms

Gerhard Jager Computational Historical Linguistics November 3, 2016 10 / 99

Page 11: Computational Historical Linguistics

Deep genetic relationships

Plethora of proposals beyond well-established families: Eurasiatic

proposed by Greenberg (2000)
comprises Indo-European, Uralic-Yukaghir, Altaic, Chukotko-Kamchatkan, Eskimo-Aleut, Korean-Japanese-Ainu, Gilyak, Etruscan
multitude of arguments, mostly from morphology and phonology

Gerhard Jager Computational Historical Linguistics November 3, 2016 11 / 99

Page 12: Computational Historical Linguistics

Deep genetic relationships

Plethora of proposals beyond well-established families: Dene-Caucasian

based on work by Sapir, Starostin, Swadesh and others
comprises Na-Dene, Caucasian, Sino-Tibetan, Yeniseian, Burushaski, perhaps Basque and other languages
also a multitude of arguments, mostly from morphology and phonology

Gerhard Jager Computational Historical Linguistics November 3, 2016 12 / 99

Page 13: Computational Historical Linguistics

Deep genetic relationships

Plethora of proposals beyond well-established families: Amerind

proposed by Greenberg (1987)
comprises all American languages except Na-Dene and Eskimo-Aleut
arguments based on mass lexical comparison

Gerhard Jager Computational Historical Linguistics November 3, 2016 13 / 99

Page 14: Computational Historical Linguistics

Deep genetic relationships

Merritt Ruhlen, a student of Greenberg, even claims to have reconstructed a few words of "Proto-World" (for instance the word aqua for water, which miraculously didn't change from the dawn of time till Cicero)

such deep connections are mostly based on suggestive salient features of the languages involved, like pronoun forms

Nostratic pronouns

Amerind pronouns

generally, these approaches neither quantify the probability of chance resemblances nor do they take negative evidence into account

Gerhard Jager Computational Historical Linguistics November 3, 2016 14 / 99

Page 15: Computational Historical Linguistics

Computational methods

this project:

starting from raw word lists (phonetic strings)
automatically assess string similarity
automatically control for chance resemblances
quantify (dis)similarity between word lists
evaluate results by
  comparison to expert language classification
  correlation with phenotypical distances between populations

Gerhard Jager Computational Historical Linguistics November 3, 2016 15 / 99

Page 16: Computational Historical Linguistics

The Automated Similarity Judgment Program

Project at MPI EVA in Leipzig around Søren Wichmann

covers more than 6,000 languages and dialects

basic vocabulary of 40 words for each language, in uniform phonetic transcription

freely available

used concepts: I, you, we, one, two, person, fish, dog, louse, tree, leaf, skin, blood, bone, horn, ear, eye, nose, tooth, tongue, knee, hand, breast, liver, drink, see, hear, die, come, sun, star, water, stone, fire, path, mountain, night, full, new, name

Gerhard Jager Computational Historical Linguistics November 3, 2016 16 / 99

Page 17: Computational Historical Linguistics

Automated Similarity Judgment Project

concept    Latin            English
I          ego              Ei
you        tu               yu
we         nos              wi
one        unus             w3n
two        duo              tu
person     persona, homo    pers3n
fish       piskis           fiS
dog        kanis            dag
louse      pedikulus        laus
tree       arbor            tri
leaf       foly∼u*          lif
skin       kutis            skin
blood      saNgw∼is         bl3d
bone       os               bon
horn       kornu            horn
ear        auris            ir
eye        okulus           Ei
nose       nasus            nos
tooth      dens             tu8
tongue     liNgw∼E          t3N
knee       genu             ni
hand       manus            hEnd
breast     pektus, mama     brest
liver      yekur            liv3r
drink      bibere           drink
see        widere           si
hear       audire           hir
die        mori             dEi
come       wenire           k3m
sun        sol              s3n
star       stela            star
water      akw∼a            wat3r
stone      lapis            ston
fire       iNnis            fEir

Gerhard Jager Computational Historical Linguistics November 3, 2016 17 / 99

Page 18: Computational Historical Linguistics

Determining distances between word lists

two steps:

compute similarity/distance between individual word forms
aggregate word distances to doculect distances

Gerhard Jager Computational Historical Linguistics November 3, 2016 18 / 99

Page 19: Computational Historical Linguistics

Word distances

based on string alignment

baseline: Levenshtein alignment ⇒ count matches and mismatches

too crude as it totally ignores sound correspondences
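
As a rough illustration of this baseline (not the exact ASJP implementation), the following sketch computes a normalized Levenshtein distance between two ASJP-encoded word forms; the example pair is taken from the word list above.

# Minimal sketch of the Levenshtein baseline for word distances.
# Illustration only, not the exact ASJP implementation.

def levenshtein(a, b):
    """Edit distance between two ASJP strings (unit costs)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        cur = [i]
        for j, cb in enumerate(b, start=1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # (mis)match
        prev = cur
    return prev[-1]

def word_distance(a, b):
    """Normalized Levenshtein distance in [0, 1]."""
    return levenshtein(a, b) / max(len(a), len(b))

print(word_distance("hEnd", "manus"))   # English 'hand' vs. Latin 'manus'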

Gerhard Jager Computational Historical Linguistics November 3, 2016 19 / 99

Page 20: Computational Historical Linguistics

Capturing sound correspondences

weighted alignment using Pointwise Mutual Information (PMI, a.k.a. log-odds):

s(a, b) = log( p(a, b) / (q(a) q(b)) )

p(a, b): probability of sound a being etymologically related to sound b in a pair of cognates
q(a): relative frequency of sound a

Needleman-Wunsch algorithm: given a matrix of pairwise PMI scores between individual symbols and two strings, it returns the alignment that maximizes the aggregate PMI score

but first we need to estimate p(a, b) and q(a), q(b) for all sound classes a and b

q(a): relative frequency of occurrence of segment a in all words in ASJP

p(a, b): that’s a bit more complicated...

Gerhard Jager Computational Historical Linguistics November 3, 2016 20 / 99

Pages 21-50: Computational Historical Linguistics

Computing the weighted alignment score

◮ Dynamic Programming: the score matrix is filled cell by cell

       −      m      E      n      S
  −    0   −2.5   −4.1   −5.7   −7.3
  m −2.5   4.13   1.53   0.03  −1.47
  e −4.1   1.53   5.65   3.05   1.55
  n −5.7   0.03   3.05   9.2    6.6
  E −7.3  −1.47   4.75   6.6    7.62
  s −8.9  −2.97   2.15   5.1    8.84

◮ memorizing in each step which of the three cells to the left and above gave rise to the current entry lets us recover the corresponding optimal alignment

m E n - S
m e n E s

Gerhard Jager Computational Historical Linguistics November 3, 2016 50 / 99
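
A compact sketch of the Needleman-Wunsch recursion behind the table above. The PMI scores and the linear gap penalty used here are illustrative stand-ins; the table on the slides suggests a more refined gap model, which is not reproduced.

# Minimal Needleman-Wunsch sketch with a PMI-style scoring function and a
# linear gap penalty. Illustration only; the scoring parameters are invented.

GAP = -2.5            # illustrative gap penalty

def pmi(a, b):
    """Stand-in for the learned PMI scores s(a, b)."""
    return 4.0 if a == b else (1.5 if a.lower() == b.lower() else -1.0)

def needleman_wunsch(x, y):
    n, m = len(x), len(y)
    S = [[0.0] * (m + 1) for _ in range(n + 1)]       # score matrix
    for i in range(1, n + 1):
        S[i][0] = S[i - 1][0] + GAP
    for j in range(1, m + 1):
        S[0][j] = S[0][j - 1] + GAP
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            S[i][j] = max(S[i - 1][j - 1] + pmi(x[i - 1], y[j - 1]),  # (mis)match
                          S[i - 1][j] + GAP,                          # gap in y
                          S[i][j - 1] + GAP)                          # gap in x
    # backtrace to recover one optimal alignment
    ax, ay, i, j = [], [], n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and S[i][j] == S[i - 1][j - 1] + pmi(x[i - 1], y[j - 1]):
            ax.append(x[i - 1]); ay.append(y[j - 1]); i -= 1; j -= 1
        elif i > 0 and S[i][j] == S[i - 1][j] + GAP:
            ax.append(x[i - 1]); ay.append("-"); i -= 1
        else:
            ax.append("-"); ay.append(y[j - 1]); j -= 1
    return S[n][m], "".join(reversed(ax)), "".join(reversed(ay))

print(needleman_wunsch("mEnS", "menEs"))

Running it on mEnS/menEs recovers the alignment shown above, although with a different aggregate score, since the scoring parameters here are invented.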

Page 51: Computational Historical Linguistics

Capturing sound correspondences

First step: automatically compile a list of language pairs that are (fairly) certain to be related

start with a measure for language dissimilarity based on Levenshtein alignment

(figure: density of the pairwise dissimilarity measure dERC over all language pairs)

all language pairs with dissimilarity ≤ 0.7 (ca. 1% of all pairs) qualify as probably related
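
Assuming a word-level distance function like the normalized Levenshtein sketch earlier, the filtering step can be sketched as follows; the dissimilarity used on the slide additionally calibrates against chance similarity between unrelated word pairs, which is omitted here.

# Sketch: aggregate word-level distances into a doculect-level dissimilarity
# and keep language pairs below the 0.7 threshold. word_distance is a callable
# such as the normalized Levenshtein distance from the earlier sketch.

def doculect_dissimilarity(list1, list2, word_distance):
    """list1, list2: dicts mapping concepts to ASJP word forms."""
    shared = sorted(set(list1) & set(list2))
    return sum(word_distance(list1[c], list2[c]) for c in shared) / len(shared)

def probably_related_pairs(doculects, word_distance, threshold=0.7):
    """doculects: dict name -> word list. Returns pairs judged probably related."""
    names = sorted(doculects)
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            d = doculect_dissimilarity(doculects[a], doculects[b], word_distance)
            if d <= threshold:
                pairs.append((a, b, d))
    return pairs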

Gerhard Jager Computational Historical Linguistics November 3, 2016 51 / 99

Page 52: Computational Historical Linguistics

Capturing sound correspondences

doculects probably related (in this sense) to English:

AFRIKAANS, ALSATIAN, BERNESE_GERMAN, BRABANTIC,

CIMBRIAN, DANISH, DUTCH, EASTERN_FRISIAN, FAROESE,

FRANS_VLAAMS, FRISIAN_WESTERN, GJESTAL_NORWEGIAN,

ICELANDIC, JAMTLANDIC, LIMBURGISH, LUXEMBOURGISH,

NORTH_FRISIAN_AMRUM, NORTHERN_LOW_SAXON, NORWEGIAN_BOKMAAL,

NORWEGIAN_NYNORSK_TOTEN, NORWEGIAN_RIKSMAL, PLAUTDIETSCH,

SANDNES_NORWEGIAN, SAXON_UPPER, SCOTS, STANDARD_GERMAN,

STELLINGWERFS, SWABIAN, SWEDISH, WESTVLAAMS, YIDDISH_EASTERN,

YIDDISH_WESTERN, ZEEUWS

these are all and only the Germanic languages

99.9% of all probably related pairs belong to the same family, and 60% to the same genus

Gerhard Jager Computational Historical Linguistics November 3, 2016 52 / 99

Page 53: Computational Historical Linguistics

Capturing sound correspondences

Second step:

let L1 and L2 be probably related
every pair of words w1/w2 from L1/L2 sharing the same meaning is considered potentially cognate
all potential cognate pairs are (Levenshtein-)aligned
the relative frequency of a being aligned with b is used as an estimate of s(a, b)
all potential cognate pairs are Needleman-Wunsch aligned using the PMI scores obtained in the previous step
all potential cognate pairs with an aggregate PMI score ≥ 5.0 are considered probable cognates
s(a, b) is re-estimated using only probable cognate pairs
this is repeated ten times
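
The control flow of this procedure can be sketched as follows; the callables passed in (same_meaning_pairs, levenshtein_align, nw_align, pmi_from_alignments) are hypothetical placeholders for the steps listed above.

# Structural sketch of the iterative PMI estimation described above.
# Only the control flow is shown; the helpers are assumed to be supplied.

def estimate_sound_correspondences(related_pairs, same_meaning_pairs,
                                   levenshtein_align, nw_align,
                                   pmi_from_alignments,
                                   n_iterations=10, cutoff=5.0):
    """related_pairs: list of (L1, L2) word lists judged probably related."""
    # initial alignments: plain Levenshtein over all potential cognate pairs
    alignments = [levenshtein_align(w1, w2)
                  for L1, L2 in related_pairs
                  for w1, w2 in same_meaning_pairs(L1, L2)]
    pmi = pmi_from_alignments(alignments)              # first estimate of s(a, b)
    for _ in range(n_iterations):
        probable = []
        for L1, L2 in related_pairs:
            for w1, w2 in same_meaning_pairs(L1, L2):
                score, alignment = nw_align(w1, w2, pmi)
                if score >= cutoff:                    # aggregate PMI >= 5.0
                    probable.append(alignment)
        pmi = pmi_from_alignments(probable)            # re-estimate from probable cognates
    return pmi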

Gerhard Jager Computational Historical Linguistics November 3, 2016 53 / 99

Page 54: Computational Historical Linguistics

Capturing sound correspondences

only probable cognate between English and Latin: pers3n/persona

probable cognates English/German:

fiS     fiS
laus    laus
bl3d    blut
horn    horn
brest   brust
liv3r   leb3r
star    StErn
wat3r   vas3r
ful     fol

Gerhard Jager Computational Historical Linguistics November 3, 2016 54 / 99

Page 55: Computational Historical Linguistics

Capturing sound correspondences

procedure results in pairwise PMI scores for each pair from the 41 ASJP sound classes

positive PMI-score between a and b: evidence for etymological relatedness

negative PMI-score between a and b: evidence against etymological relatedness

Gerhard Jager Computational Historical Linguistics November 3, 2016 55 / 99

Page 56: Computational Historical Linguistics

      a      e      i      o      u      p      b      d      t      8      s      h
a   1.88  −1.35  −2.35  −1.66  −2.54  −8.49  −8.82  −7.07  −7.03  −4.64  −8.78  −8.40
e  −1.35   2.40  −0.48  −1.52  −2.88  −7.47  −7.80  −7.66  −6.01  −5.01  −7.76  −7.38
i  −2.35  −0.48   2.37  −2.81  −1.32  −6.75  −8.46  −8.33  −8.98  −3.48  −7.04  −6.66
o  −1.66  −1.52  −2.81   2.48  −0.27  −7.08  −8.10  −7.96  −8.61  −5.31  −8.06  −7.68
u  −2.54  −2.88  −1.32  −0.27   2.76  −6.62  −8.05  −7.91  −8.56  −5.26  −8.01  −7.63
p  −8.49  −7.47  −6.75  −7.08  −6.62   3.69   0.36  −6.59  −4.30  −3.94  −2.70  −0.49
b  −8.82  −7.80  −8.46  −8.10  −8.05   0.36   3.62  −4.84  −5.09  −3.58  −5.63  −3.24
d  −7.07  −7.66  −8.33  −7.96  −7.91  −6.59  −4.84   3.41  −0.10   2.52  −2.29  −2.81
t  −7.03  −6.01  −8.98  −8.61  −8.56  −4.30  −5.09  −0.10   3.15   2.11  −1.67  −1.76
8  −4.64  −5.01  −3.48  −5.31  −5.26  −3.94  −3.58   2.52   2.11   5.49   1.92  −0.85
s  −8.78  −7.76  −7.04  −8.06  −8.01  −2.70  −5.63  −2.29  −1.67   1.92   3.50   0.26
h  −8.40  −7.38  −6.66  −7.68  −7.63  −0.49  −3.24  −2.81  −1.76  −0.85   0.26   3.50

Page 57: Computational Historical Linguistics

Capturing sound correspondences

hierarchical clustering of sound classes according to PMI scores:

(figure: dendrogram over the sound classes; leaves, in order: o u a E e 3 i S s h x C c T j z y L Z l r t 8 d f p m b v w 7 k g X G q 5 n N ! 4)

Gerhard Jager Computational Historical Linguistics November 3, 2016 57 / 99

Page 58: Computational Historical Linguistics

Capturing sound correspondences

multidimensional scaling of vowel classes according to PMI scores:

(figure: two-dimensional MDS configuration of the vowel classes a, e, i, o, u, E, 3)

Gerhard Jager Computational Historical Linguistics November 3, 2016 58 / 99

Page 59: Computational Historical Linguistics

Weighted alignment

Gerhard Jager Computational Historical Linguistics November 3, 2016 59 / 99

Page 60: Computational Historical Linguistics

Weighted alignment

alignments German/Latin:

iX-        ego
du         tu
vir--      --nos
ain-s      -unus
cvai       d-uo
--mEnS     homo--
fiS---     piskis
hun-t      kanis
--la-u--s  pedikulus
--baum     arb-or
b-lat      folu-
haut--     k-utis
--blut     saNgis
knoX3n     --os--
-or--      auris
a-ug3-     okulus
naz3-      nasus
can-       dens
cuN-3      liNgE
k-ni       genu
han-t      manus
b--rust    pektus-
leb3r      yekur
triNk3n-   b-i-bere
--ze-3n    widere-
--her3n    audire-
Sterb3n    -mor-i-
kom3n---   w--enire
zon3       sol-
StErn-     ste-la
vas3r      -aka-
Sta-in     -lapis
foi--a-    --iNnis
p--at      viya-
bErk       mons
naxt       noks
f---ol     plenus
no-i-      nowus
nam3-      nomen

Gerhard Jager Computational Historical Linguistics November 3, 2016 60 / 99

Page 61: Computational Historical Linguistics

Weighted alignment

alignments German/Cimbrian:

iX         ix
du         dE
vir        bar
cvai-      sb-en
mEn-S      menEs
hunt       hunt
laus       laus
baum       p-om
blat       -lop
blut       plut
knoX3n     -po-an
horn       horn
o-r        oar
aug3       -ogE
--n--az3   kanipa--
cuN3-----  --gaprext
hant       hant
brus---t   p-uzamEn
leb3r-     lEbara
triNk3n    trink--
ze3n       ze-g
her3n      hor--
Sterb3n    sterb--
kom3n      kEm--
zon3       zuna
StE-rn     stEarn
vas3r      basar
St-ain     stoa-n
foia-      bo-ar
vek---     bEgale
bErk       perg
naxt       naxt
--fol--    gabasEt
noi        noy
nam3       namo

Gerhard Jager Computational Historical Linguistics November 3, 2016 61 / 99

Page 62: Computational Historical Linguistics

Aggregating word similarities

Needleman-Wunsch alignment returns a similarity score for each word pair

not too reliable to identify cognates:

often low scores for genuine cognate pairs ('false negatives'):
  lat. genu/eng. knee: −3.39
  lat. unus/eng. one: −5.00

occasionally high scores for non-cognates ('chance similarities'/'false positives'):
  grm. Blatt ('leaf')/Tilquiapan bldag ('leaf'): 0.22
  lat. oculus ('eye')/Lachixio ikulu ('eye'): 6.72

approach pursued here:
  for each language pair, estimate the amount of chance similarities
  quantify to what degree the observed similarities exceed expected chance similarities

Gerhard Jager Computational Historical Linguistics November 3, 2016 62 / 99

Page 63: Computational Historical Linguistics

Aggregating word distances

English / Swedish

        Ei      yu      wi     w3n      tu     fiS   . . .
yog  −7.77    0.75   −7.68   −7.90   −8.57  −10.50
du   −7.62    0.33   −5.71   −7.41    2.66   −8.57
vi   −2.72   −2.83    4.04   −1.34   −6.45    0.70
et   −5.47   −7.87   −5.47   −6.43   −1.83   −4.70
tvo  −7.91   −4.27   −3.64   −4.57    0.39   −6.98
fisk −7.45  −11.2    −3.07   −9.97   −8.66    7.58
...

values along diagonal give similarity between candidates for cognacy (possibility of meaning change is disregarded)

values off diagonal provide sample of similarity distribution between non-cognates

Gerhard Jager Computational Historical Linguistics November 3, 2016 63 / 99

Page 64: Computational Historical Linguistics

Aggregating word distances

(figure: PMI scores of diagonal vs. off-diagonal word pairs, ranging from about −20 to 10, for English/Swedish and for English/Swahili)

distance between two word lists is a measure for how much the distribution along the diagonal differs from the distribution off the diagonal
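
The slides do not spell out the exact statistic, so the sketch below uses a simple AUC-style comparison of the two distributions as an illustrative stand-in; it will not reproduce the d(A,B) values on the next slide.

# Illustrative stand-in for the aggregation step: compare the PMI scores of the
# same-meaning ("diagonal") word pairs with the off-diagonal scores. This is
# NOT the exact measure used in the project; it only illustrates the idea of
# contrasting the two distributions.

def list_distance(pmi_matrix):
    """pmi_matrix: list of lists, pmi_matrix[i][j] = PMI score of word i of
    language A vs. word j of language B (same concept when i == j)."""
    diag = [row[i] for i, row in enumerate(pmi_matrix)]
    off = [x for i, row in enumerate(pmi_matrix) for j, x in enumerate(row) if i != j]
    # fraction of (diagonal, off-diagonal) pairs where the diagonal score wins
    wins = sum(d > o for d in diag for o in off)
    auc = wins / (len(diag) * len(off))
    return 1.0 - auc     # 0 = diagonal clearly higher, 0.5 = no difference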

Gerhard Jager Computational Historical Linguistics November 3, 2016 64 / 99

Page 65: Computational Historical Linguistics

Aggregating word distances

some examples

A         B            d(A,B)
English   Scots        0.2139
Danish    Swedish      0.2773
English   Swedish      0.3981
English   Frisian      0.4215
English   Dutch        0.4040
Hindi     Farsi        0.6231
English   French       0.7720
English   Hindi        0.7735
Amharic   Vietnamese   0.8566
Swahili   Warlpiri     0.8573
Navajo    Dyirbal      0.8436
Japanese  Haida        0.8504
English   Swahili      0.8901

Gerhard Jager Computational Historical Linguistics November 3, 2016 65 / 99

Page 66: Computational Historical Linguistics

Phylogenetic inference

pairwise distances for all (extant) languages present in ASJP are computed

resulting distance matrix is fed into a distance-based phylogenetic algorithm (Neighbor Joining + Ordinary Least Square Nearest Neighbor Interchange Optimization)

outcome recognizes language families and their internal structure remarkably well
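
A minimal, self-contained neighbor-joining sketch (without the NNI optimization step); the toy distances are loosely based on the example values shown earlier and are for illustration only.

# Plain-Python neighbor joining over a small distance matrix.
# Returns a Newick string; toy data, not the actual ASJP distances.

def neighbor_joining(labels, dist):
    """labels: taxon names; dist: symmetric distance matrix (list of lists)."""
    nodes = list(labels)               # active nodes, kept as Newick subtrees
    D = [row[:] for row in dist]
    while len(nodes) > 2:
        n = len(nodes)
        r = [sum(D[i]) for i in range(n)]
        # pick the pair (bi, bj) minimizing the Q criterion
        best, bi, bj = None, -1, -1
        for i in range(n):
            for j in range(i + 1, n):
                q = (n - 2) * D[i][j] - r[i] - r[j]
                if best is None or q < best:
                    best, bi, bj = q, i, j
        # branch lengths from the new internal node to bi and bj
        li = 0.5 * D[bi][bj] + (r[bi] - r[bj]) / (2 * (n - 2))
        lj = D[bi][bj] - li
        new_label = f"({nodes[bi]}:{li:.3f},{nodes[bj]}:{lj:.3f})"
        # distances from the new node to all remaining nodes
        keep = [k for k in range(n) if k not in (bi, bj)]
        new_row = [0.5 * (D[bi][k] + D[bj][k] - D[bi][bj]) for k in keep]
        D = [[D[a][b] for b in keep] for a in keep]
        for idx, row in enumerate(D):
            row.append(new_row[idx])
        D.append(new_row + [0.0])
        nodes = [nodes[k] for k in keep] + [new_label]
    # join the last two nodes
    return f"({nodes[0]}:{D[0][1] / 2:.3f},{nodes[1]}:{D[0][1] / 2:.3f});"

if __name__ == "__main__":
    labels = ["English", "Scots", "Swedish", "Hindi"]   # toy example
    dist = [[0.00, 0.21, 0.40, 0.77],
            [0.21, 0.00, 0.42, 0.78],
            [0.40, 0.42, 0.00, 0.79],
            [0.77, 0.78, 0.79, 0.00]]
    print(neighbor_joining(labels, dist))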

Gerhard Jager Computational Historical Linguistics November 3, 2016 66 / 99

Page 67: Computational Historical Linguistics

Phylogenetic inference

(figure: inferred phylogeny of the Germanic doculects in ASJP, with bootstrap support values at the internal nodes)

Gerhard Jager Computational Historical Linguistics November 3, 2016 67 / 99

Page 68: Computational Historical Linguistics

Phylogenetic inference

(figure: inferred phylogeny of the Slavic and Baltic doculects in ASJP, with bootstrap support values)

Gerhard Jager Computational Historical Linguistics November 3, 2016 68 / 99

Page 69: Computational Historical Linguistics

Phylogenetic inference

(figure: inferred phylogeny of the Indo-European subfamilies (Indic, Iranian, Armenian, Germanic, Balto-Slavic, Romance, Albanian, Celtic), with support values)

Gerhard Jager Computational Historical Linguistics November 3, 2016 69 / 99

Page 70: Computational Historical Linguistics

Phylogenetic inference

Languages of Eurasia

(figure: inferred grouping of the language families of Eurasia: Uralic, Hmong-Mien, Chukotko-Kamchatkan, Japonic, Dravidian, Austronesian, Nakh-Daghestanian, Tai-Kadai, Tungusic, Sino-Tibetan, Mongolic, Yeniseian, Ainu, Austroasiatic, Nivkh, Indo-European, Turkic, Yukaghir; support values between 96.8% and 100%)

Gerhard Jager Computational Historical Linguistics November 3, 2016 70 / 99

Page 71: Computational Historical Linguistics

Phylogenetic inference

Languages of Eurasia

Gerhard Jager Computational Historical Linguistics November 3, 2016 71 / 99

Page 72: Computational Historical Linguistics

Phylogenetic inference

(figure: world-wide classification of language families (Austronesian, Niger-Congo, Tai-Kadai, Austro-Asiatic, Sino-Tibetan, Uto-Aztecan, Mayan, Quechuan, Altaic, Indo-European, Uralic, Dravidian, Afro-Asiatic, Nilo-Saharan, Khoisan, Kadugli, Timor-Alor-Pantar, Australian, and others) grouped by macro-area: Subsaharan Africa, NW Eurasia, SE Asia, Australia/Papua, America)

Gerhard Jager Computational Historical Linguistics November 3, 2016 72 / 99

Page 73: Computational Historical Linguistics

Distant relationships

(joint work with Cecil Brown, Eric Holman, Johann-Mattis List and Søren Wichmann)

compute aggregate distances between language families

find threshold with false discovery rate of 5%: all family pairs with a distance below this threshold are genuinely related (due to common descent or contact) with a confidence of 95%
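
The slides do not describe how this threshold is determined; one common recipe, sketched here purely as an assumption, compares the observed family-pair distances against distances from some null distribution (e.g. permuted word lists) and picks the largest cutoff whose estimated false discovery rate stays at or below 5%.

# Hedged sketch of one way to pick a distance threshold with an estimated FDR
# of 5%. The null distances are assumed to be given; the slides do not specify
# how they are generated.

def fdr_threshold(observed, null, alpha=0.05):
    """observed, null: lists of family-pair distances (lower = more similar)."""
    best = None
    for t in sorted(observed):
        hits = sum(d <= t for d in observed)
        # expected number of false hits, rescaled to the observed sample size
        false_hits = sum(d <= t for d in null) * len(observed) / len(null)
        if hits and false_hits / hits <= alpha:
            best = t          # largest threshold so far with estimated FDR <= 5%
    return best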

Gerhard Jager Computational Historical Linguistics November 3, 2016 73 / 99

Page 74: Computational Historical Linguistics

Distant relationships

Gerhard Jager Computational Historical Linguistics November 3, 2016 74 / 99

Page 75: Computational Historical Linguistics

Distant relationships

Gerhard Jager Computational Historical Linguistics November 3, 2016 75 / 99

Page 76: Computational Historical Linguistics

Distant relationships

Gerhard Jager Computational Historical Linguistics November 3, 2016 76 / 99

Page 77: Computational Historical Linguistics

Distant relationships

Gerhard Jager Computational Historical Linguistics November 3, 2016 77 / 99

Page 78: Computational Historical Linguistics
Page 79: Computational Historical Linguistics

Words and bones

(joint work with Katerina Harvati and Hugo Reyes-Centeno)

Since Cavalli-Sforza's work: a lot of interest in correlations between genetic and linguistic features of human populations

our work: correlations between phenotypical (cranial) and linguistic (vocabulary-based) features

motivation:

different parts of the cranium respond to different selective pressures
ASJP provides data for computing linguistic distances on an unprecedented scale; this study provides (additional) evidence for the reliability of ASJP-based distances across language family boundaries
part of the general endeavor to disentangle human bio-historical co-evolution

Gerhard Jager Computational Historical Linguistics November 3, 2016 79 / 99

Page 80: Computational Historical Linguistics

Cranial Phenotype Data

  • Whole Cranium: 30 variables

  • Face: 15 variables

  • Neurocranium: 15 variables

Gerhard Jager Computational Historical Linguistics November 3, 2016 80 / 99

Page 81: Computational Historical Linguistics

Does language track population history?

• Hypothesis 1: Language reflects genetic population history if there is a significant relationship with neurocranial morphology and geography

• Hypothesis 2: Language reflects other factors if there is a significant relationship with facial morphology

Gerhard Jager Computational Historical Linguistics November 3, 2016 81 / 99

Page 82: Computational Historical Linguistics

Mapping bones to languages

cranial data from 135 populations

(figure: world map of the locations of the 135 populations with cranial data)

Gerhard Jager Computational Historical Linguistics November 3, 2016 82 / 99

Page 83: Computational Historical Linguistics

Assigning languages to populations

in some cases, assignment is straightforward:

West Aleut → Aleut
South West Alaska → Central Yupik
Serbia → Serbo-Croatian
Gyzeh → Late Egyptian

sometimes, several candidate languages from the same language family or genus

North East Asia → Inupiaq, 3 dialects of Yupik (all Eskimo languages)
Germany → Standard German + 6 German dialects
Recent Italy → Corsican, Friulian, Italian, Sardinian

Gerhard Jager Computational Historical Linguistics November 3, 2016 83 / 99

Page 84: Computational Historical Linguistics

Assigning languages to populations

in many cases, assignment is pure guesswork (based on geography)

PNG, Australia, sub-Saharan Africa, America, India

criteria:

geographic location (according to ASJP) ≤ 300 km from the coordinates of the cranial data (see the sketch below)
for islands (New Caledonia, Hebrides, Torres Strait, ...): Ethnologue information
if cranial data contain ethnic information, these override geography

Han North is mapped to Mandarin, even though several Turkic languages are closer
only Khoisan languages are considered for South Africa

number of candidate languages assigned to single populations ranges from 1 to 535 (for Madang/PNG)

average: 37 languages per population
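
The geographic criterion can be sketched with a plain haversine distance and a 300 km cutoff; coordinates and language names are placeholders to be filled from ASJP and the cranial data set.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def candidate_languages(population, languages, max_km=300):
    """population: (lat, lon); languages: dict name -> (lat, lon) from ASJP."""
    plat, plon = population
    return [name for name, (lat, lon) in languages.items()
            if haversine_km(plat, plon, lat, lon) <= max_km]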

Gerhard Jager Computational Historical Linguistics November 3, 2016 84 / 99

Page 85: Computational Historical Linguistics

Assigning languages to populations

(figure: world map showing the number of candidate languages per population)

Gerhard Jager Computational Historical Linguistics November 3, 2016 85 / 99

Page 86: Computational Historical Linguistics

Assigning languages to populations

(figure: number of candidate languages for each of the populations)

Gerhard Jager Computational Historical Linguistics November 3, 2016 86 / 99

Page 87: Computational Historical Linguistics

Assigning languages to populations

in most cases, candidate languages belong to the same language families

maximum number of candidate families: 46 (for East Sepik, PNG)

mean number of candidate families per population: 3 (median: 1)

(figure: number of candidate language families for each population)

Gerhard Jager Computational Historical Linguistics November 3, 2016 87 / 99

Page 88: Computational Historical Linguistics

Assigning languages to populations

(figure: world map showing the number of candidate language families per population)

in the sequel, the linguistic distance between two populations is computed as the average distance between the corresponding candidate languages
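
This definition translates directly into code; ling_dist is assumed to be the ASJP-based doculect distance described earlier in the talk.

# Population-level linguistic distance: the average pairwise distance over all
# candidate languages of the two populations.

def population_distance(cands_a, cands_b, ling_dist):
    """cands_a, cands_b: candidate language lists; ling_dist(a, b): doculect distance."""
    pairs = [(a, b) for a in cands_a for b in cands_b]
    return sum(ling_dist(a, b) for a, b in pairs) / len(pairs)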

Gerhard Jager Computational Historical Linguistics November 3, 2016 88 / 99

Page 89: Computational Historical Linguistics

Land-based distances

(figure: world map of the populations and the waypoints used for computing land-based distances)

following Atkinson 2011:
Africa/Asia: Cairo
Asia/Europe: Istanbul
Asia/Oceania: Phnom Penh
Asia/North America: Bering Strait
North America/South America: Panama
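
A sketch of how such a land-based distance could be assembled: sum the great-circle legs through the listed waypoints (great_circle_km is the haversine function from the earlier sketch). The routing rule, i.e. which crossings a given pair of populations must use, is an assumption, since the slide only lists the crossing points.

# Land-based distance as the sum of great-circle legs through fixed waypoints.
# Waypoint coordinates are approximate; the routing per population pair is an
# assumption made for this sketch.

WAYPOINTS = {
    ("Africa", "Asia"): [(30.05, 31.25)],                  # Cairo
    ("Europe", "Asia"): [(41.01, 28.98)],                  # Istanbul
    ("Asia", "Oceania"): [(11.56, 104.92)],                # Phnom Penh
    ("Asia", "North America"): [(65.5, -169.0)],           # Bering Strait
    ("North America", "South America"): [(8.98, -79.52)],  # Panama
}

def land_distance(p1, p2, crossings, great_circle_km):
    """p1, p2: (lat, lon); crossings: ordered list of continent-pair keys."""
    chain = [wp for key in crossings for wp in WAYPOINTS[key]]
    stops = [p1] + chain + [p2]
    return sum(great_circle_km(a[0], a[1], b[0], b[1])
               for a, b in zip(stops, stops[1:]))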

Gerhard Jager Computational Historical Linguistics November 3, 2016 89 / 99

Page 90: Computational Historical Linguistics

Correlations

correlations between land-based geographic distances and phenotypical/linguistic distances
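
Since both quantities are pairwise distance matrices, the basic correlation can be sketched as below; a proper analysis would use a Mantel-style permutation test on top of this, which is not shown.

import numpy as np

def upper_triangle(d):
    """Flatten the upper triangle of a square distance matrix (excluding the diagonal)."""
    d = np.asarray(d)
    i, j = np.triu_indices_from(d, k=1)
    return d[i, j]

def matrix_correlation(geo, other):
    """Pearson correlation between two pairwise distance matrices."""
    x, y = upper_triangle(geo), upper_triangle(other)
    return np.corrcoef(x, y)[0, 1]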

(figure: scatter plots of pairwise phenotypical distance against land-based geographic distance (0 to 40,000 km), with separate panels for the whole cranium, the face, and the neurocranium)

● ●●

●●

● ●●

●●

● ●●

●●

●●

●●● ●

●● ●

●●

● ●● ●●●

●●● ●● ●●●

●●● ●

●●

● ●● ● ●● ●●

● ●● ● ●●●● ●●● ●●● ● ●●● ●●● ●●●

●●●

●●●● ●●● ● ●●● ● ● ●●●●

●●●●● ● ●●●● ●●

●●● ●● ● ● ●● ●● ●●

●●

● ● ●● ●● ●●●

●● ●● ●● ●● ●●●

●●●

●●

●●

● ●

●●

● ● ●●●● ●● ●● ●

● ●●

● ●●●●

●●

●●

●●

●● ●●

●●

●●

●● ●●●

●●

●● ●● ● ●●

●● ●●●●● ● ● ●●● ●●● ● ●●● ●●

●●● ●● ● ● ● ●

●●● ●● ●

●●

●● ●●●

●● ●●●●●● ● ●

●●●● ●● ●●●●●● ●●

● ●● ●●●●●

● ●● ●● ●●●●

●● ● ● ●● ●●●● ●●●

●●

●●

●●

●●

●● ●●●●

● ●●

●●

● ●

● ●●

● ●●● ●

● ●●● ●●●

●●

●●●

●●●

●●

●●

● ●●●

● ● ●●●

●●●

● ●●

●●●

● ●●●

● ●●●

●● ●●●● ● ●●

●● ●● ●

●●●

● ●● ●

●● ●●●

●●●●

●●● ●

●●● ●●●●●●● ●

●● ●● ●

●● ● ●

●●●

●● ● ●●●● ●●●●

●● ●

●●

●●

●● ●●

●● ●●●● ●

●● ●●●● ● ●●

●●●● ●

●●●

● ●● ●

●● ●●●

●●●●

●●

● ●●● ●● ●

●●●●● ●●

● ●● ●●

● ● ●●●●

●● ● ●● ●●●●●●

●● ●

●●

●●

●● ●●

●● ●●●● ●

●● ●●●

●●

● ●●●●

●●

● ● ●

●●●●

●●

●●

●●

●●

● ●

●●●

●●●

●●●

● ●●

●●

●●

●● ●● ●

●●●●

● ●

●●●

●●●● ●

●● ●●

●●

●● ●

● ●● ●●●

●● ●●

●●

●●●●

● ●●●● ●

●●● ●●

● ●

●● ●

●●●

●●

●●

●●

●●

●●

●●●

●●●

● ●●●

●● ●

●●

● ●● ●● ●●●

●●● ●●

●●

●●● ● ●● ●

● ●●

● ●●●● ●●● ●●● ● ●●● ●●● ●●● ●●

●●●● ●●● ● ●●● ● ● ●●●●

●●●●● ● ●●●

● ●● ●●● ●● ● ● ●● ●● ●●

●●●

●● ●●

●●●● ●●

● ●● ●●

● ●● ●● ● ●●●

● ● ●●●● ●

●● ●● ● ●●●

●● ●

●● ●●●

●●● ●●● ●●●

●●●

●●● ●●

●●

●● ●● ●● ● ●

● ●● ●

●●

● ●●

●●

●●

●●

●●

●●●

●● ●●●●

●● ●

●●● ●●

● ●●● ●●● ●

●●●

●●

●●

●●●

● ● ●●●

● ●●●●● ● ●●●●

●● ●●● ●● ●

●●●

●●

●●● ●●

● ●●

●●

●●

●●

●●

●●●

●● ●●●●

●● ●

●●● ●●

● ●●● ●●● ●

●●●

●●

●●

●●●

● ● ●●●

● ●●●●● ● ●●●●

●● ●●● ●● ●

●●●

●●

●●● ●●

●● ●● ●● ●●●

●●

●●

● ●● ●● ● ●●● ● ● ●●●● ●

●● ●●● ●● ●●● ●●

● ● ●●●● ● ●●● ●

●●● ●

●●● ●● ●

●●

● ●●●

● ●● ● ●

●●●

●●●

●● ●● ● ●●●

●● ●● ● ●

●●● ●●● ●●

●● ●●●

●●●●●● ●●

● ●●●● ●●● ●●

●● ● ● ●●●

● ●●●●●

●●●●

● ●●●●● ●● ●

● ●● ●● ●●●

● ●●●

●●●

●●

● ● ●

●●

●● ●

●●

● ●

●●

●●

●●

●●

●●

● ●

●●

●●●

●●

●●● ●

●●●

● ●●

●● ●

●●●●

●●

● ●●

●●

●●

● ●

●●

●●● ●

●●

●●

●●●● ●●● ●

●●

● ●●● ●●●

●●● ●

●●●●

●●

●●

●●●

●● ●●●

●●●●● ● ●●●

●● ●●● ●

● ●

● ●● ●● ●●

●● ●● ●●● ●● ●●● ● ●● ●● ●

●●

● ● ●●●●● ●● ●●

●● ● ●● ●

●● ●●●● ●

●●● ● ●●●

●●●●

●●●●

●● ●

●●● ●● ●●●

●● ●

● ●● ●●● ● ●● ●

● ●● ● ●●●

● ●●● ●●●

● ●●● ●●● ●●●

●●●

●●●● ●●●

●●

●● ● ● ●●●

●●●●● ● ●●●

●● ●●● ●● ● ● ●● ●● ●●

●● ●●

●● ●● ● ●●●●

●● ●

●● ●●●●●

●● ●● ●●

●● ● ●●

●● ● ●

● ● ●●●●● ●●

● ●● ●●●●

●● ●● ●

●●●●

● ●● ● ●

●●●● ●

●●

●● ● ●

● ● ●●● ● ●●●

●●

● ●●●

● ●●●

●● ●●

● ●●●

●●

●●

●●●

● ●

●●● ●●●●●

●●

●●

●●

●● ●●●

● ●● ●● ●

●●

●●● ●●●

● ●●

● ●●

●●● ● ● ●●●● ●

●●●

●●●

●● ●●

●●

●●●

● ●●●●

●●●●●

●●●

●● ●●

● ●●●●

●● ● ●● ●●

●●●

●● ●● ● ● ● ●

●●● ●● ●

●● ●

● ●●●●● ●

●●●●● ● ●

●●●● ●● ● ●●●●●

●●● ●● ●●●●

●● ●● ●● ● ●●●● ● ● ● ●● ●●●● ●● ●

● ●

●● ●●●●

●● ●●

●●● ●●● ●

● ●● ●●

●● ●● ●

●●

●●

● ●●●●

●●● ●● ●●●●●

●●●

●●●●●● ●

● ● ● ●●●

●●● ● ●

● ●

● ● ●

●●

●● ●

●●

●●

●●

●● ●

●●

●● ● ●

●●

● ●

●●

●●●●

● ●●

● ●● ●●●●

●●●

●●

●●●●

●●

●● ●●●● ● ● ●

● ●●●●● ●●●

●● ●●● ●●

●●

●● ●●

●●●● ●

●●●

●●●

●● ●

●●

●●●●●

●●

●●

●●

●●● ●●

● ● ●● ●● ●●

● ● ●

●●

●● ●

● ●

●●

●● ●

●●

●●

●●●

●●

●●●●

●●

●●● ●

●●

●●●

●●

●●●●●

●● ●

●● ●●●● ●

●●

●● ●● ●

● ●

●●

●●

●● ● ●

●●● ●●

●●

●●●

●●

● ●●●

●●●● ● ●●●

●● ●

● ●● ●

●●●

●● ●●

●●

●●●●

●● ● ●●●

●●

● ●●

●●

●●

●● ●●●

●●●

●●

●●

●●

●●●●●

●●●

●● ●●●

●●

●● ●● ●● ●●

●● ●● ●● ● ●

●● ● ● ●●●● ●

●● ●●

●●● ●●● ●●● ● ●

●●● ● ●●● ●

●●● ●

●●●

●● ●

●●

● ●●

●● ●● ● ●

● ●● ●●● ●

●● ●●● ● ● ●

●●● ●●●● ●●

● ●●●●● ●

●●

●●●●● ●

●●● ●●

●● ●●●

●●● ●●●

●●● ●● ●

●● ●● ● ●● ●●●

●● ●

● ●

● ●

●●

●● ●

●●●●● ●

●●● ●

● ● ●●●●

●●

●●● ●●●●

●●

●●● ●

●●●●● ● ● ●● ●●●● ●●

●●

●●●● ● ●

●● ● ●● ●●●

●●

● ●● ●●

●●

● ●●●●● ●

●●● ●●●●

●●●●

●●●●

●●

●●●●

●● ●● ●

●●●●● ●

●● ●●● ●● ●● ●

●●● ● ●●

●● ● ●

● ● ●●●

● ●●●

●● ●●●●●

●●●

●●

●●● ● ● ● ● ●● ●●

●● ●

●●

●● ●●●

● ●● ●● ●

●●●

●●●

●●●

● ●●●

● ●●

●●

●●●

●●●

●●●

●●●

●● ●●

●● ●●●

●● ●

●●● ●

●●●● ●●●

● ●

● ●●●

●● ●●

●● ●

●●

● ● ●●● ●●

●●

●●

●● ●●

●●●●

●●●

● ●●●

●●●

●● ●●●●

●● ●●●

●●

●●

●●

● ●●

●●

●● ●

●●●

●●●●●

●●●●●

●● ●●●

● ●●●

●●●● ●

● ●

●● ●

●●

●●

●●

●● ●

●● ●

●●●

●●

●●●

●●

●●

● ●●

●●●●

● ● ●● ●●

●●

●● ●●● ●● ●●●● ●

● ●● ●

●●● ● ●●●

●●

●● ●● ●●●●● ●

●●

●●●

●● ●

● ●● ●●●● ●● ●●●●

●● ●●●●

● ●●

●●

● ●

●●●

●●●

●●

●●

●●

●● ●●● ●

●●

● ●● ●●

●●

●●

●● ●

●●●

● ● ●

●●● ●● ●

●●●

●●

●●

● ●● ●●●●

● ●● ●● ● ●●●

● ● ● ● ●● ●●●● ●

●● ●

●●●●

● ●

●●●

●● ●●

●●●●

●●

●●● ●●

●●

● ●● ●●

●●●●●

● ● ● ●● ●●●●●

●● ●

●●●●

● ●

●●●

●● ●

●●●●●

●●

● ●● ●●●●

●●

● ●● ●

●●●●

●● ● ●● ●

●●● ●●● ●

●●

●● ● ●

●●● ●

● ●●●●●

● ●● ●●●●

● ●● ●● ●●●●●

●●

●●● ●●●● ● ●● ●

●● ● ●●●●● ●●

●●

●●●● ●

●●●

●●●● ●

●●● ●●●●●● ●● ●● ●

● ●● ●●● ● ●

●●

●●● ● ●

● ●●

●● ● ●●●●●

●● ●

●●●●●●

●● ●

●●

●●●● ●● ●

●● ●●

● ● ●● ●●● ●● ●

●● ●●● ● ●●●● ● ●●● ●

●●● ●●●

● ●● ●●

●● ●●

●● ●● ● ●

● ●● ●●

●●●● ●

●● ●●●

●●●

●●

● ●●●●

●●● ●●

●●●

● ● ●

●●●

●●

●●●

●● ● ●● ●●● ●●● ● ●●●● ● ●●● ●●

●●

●●● ●● ●●●● ●● ●● ●

● ●●●

●● ●●

●● ●●

●●

●● ● ●●●

● ●●

●● ● ●●●●●

●●

●●

● ●●●

●●●

● ●● ●●

●●● ● ●●●●● ●

●●

● ●●●

● ● ●●● ●●

●●

●●

●● ●●

●●●●

●●●

● ●●●

●●●● ●

●●

●●

●●

●●

● ●●●●

●●

●●

●●● ●●● ●

●●

● ●●

●●

●● ● ●●●● ●●● ● ●●● ● ● ●●●

●●●●●● ● ●

●●● ●

● ●●● ●● ● ● ●● ●● ●●● ●●● ●●

●●

●●●● ● ●●●●●●●

●●

●●

●● ●●●●

●●●●

●●● ●● ●

●●●●

●●

●●

●●●

●●

●●

● ●●

●●

● ●

●●

● ●

●●●

● ●●

●●

●●

●●

●●

●● ●

●●● ●●

●●

●●

●●

● ●● ●●●

●●

●●●

●● ●●●●

●●

●●●

●●

● ●●●●●

●●●

●●●●

●●

● ●● ●●

●● ●

●●

●●●●

●●●● ● ●

●●●●●●

●● ●● ●

●●●●

● ●● ●● ●●

● ●

●●

● ●

●●● ●●

●●●

●●

●●

●●●●

●●

● ●● ●●●

●●

●●●

● ●●●

●● ●

●●

●●

●●

●●

●●●

●●

●● ●●

●●●

●●

●●

● ●●●

●●

● ●●

●●●

●●

●●●

●●

●●

●●●●

●●

●● ●

●● ●

●●

● ●● ●● ●●

● ● ●●

● ●●●●

●●

●●

●●●

●●

●●● ●● ●●

●●

●● ● ●●●

●●

●●

●●●

●●●

●●

● ●●●●

●●●

●●●

● ●● ●●●●● ●●

●●● ●

● ●● ●● ●●●● ●

●●

●●● ●●

●●

●●

●● ●

●●● ●

● ●

● ●● ●●●●●

● ●●

●●●●

●● ●● ●

●●●● ● ●

● ●● ●

●●● ●● ●● ●●●●

● ●● ●● ● ●●●●

●● ● ●● ●●●● ●● ●●●

●●

●●● ●●

●●●

● ●●●

●● ●●● ●●●●● ● ●●●

●●

●●● ●● ●

● ●● ●● ●●

●●●●● ● ●● ●● ●

●●●●●

● ●●● ●●

● ●● ●●●

●●● ●●●● ●●

●●●

● ●●●

●● ●●

●●● ●● ●●●●

●● ●●●

● ● ●●

● ●● ●●

●●●● ●

●●

●●●●●

●● ●●●

●● ●●

●●

●● ●

●●

● ●● ●●

●●● ● ●

●●

● ●

●●●●

●●

●●● ●

●●

●●

●●

●●●

●●●●●●

●● ●

●●● ●●● ●

●●● ●

●●●●● ●

● ●●●

●●

● ●● ●

● ●●

●●

● ●●● ●● ● ● ●● ●● ●●

●●●

●●

● ●● ●●

●● ●●●● ●● ●

● ●● ●●●● ●●● ●

●●● ●●

● ●● ●

●●

●●● ●●

●● ●●●

●●● ●●

●● ●●●

● ●● ●●

●● ●

● ●● ●●●

●●

●●● ●● ●●

●● ●

●●

●● ●●● ● ●● ●●

●●● ●● ●● ●●● ●● ●●

0.25

0.50

0.75

0 10000 20000 30000 40000land−based distance

Lang

uage


Page 91: Computational Historical Linguistics

Correlations

correlations between land-based geographic distances and phenotypical/linguistic distances

determined via Mantel test

(Spearman) correlation

Whole          0.399 (10⁻⁴)
Face           0.250 (10⁻⁴)
Neurocranium   0.457 (10⁻⁴)
Language       0.246 (10⁻⁴)
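As an illustration of how such figures can be obtained, here is a minimal sketch of a Mantel-style permutation test using Spearman correlation between two distance matrices. This is not the exact script behind the slide; the function name and matrix names are assumptions for the example.

```python
import numpy as np
from scipy.stats import spearmanr

def mantel_spearman(D1, D2, n_perm=10000, seed=0):
    """Spearman correlation between two square distance matrices,
    with a one-sided Mantel permutation p-value (rows and columns of D2
    are permuted jointly, preserving its distance structure)."""
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(D1, k=1)          # upper triangle, excluding diagonal
    r_obs = spearmanr(D1[iu], D2[iu]).correlation
    n = D1.shape[0]
    exceed = 0
    for _ in range(n_perm):
        p = rng.permutation(n)
        r_perm = spearmanr(D1[iu], D2[np.ix_(p, p)][iu]).correlation
        if r_perm >= r_obs:
            exceed += 1
    return r_obs, (exceed + 1) / (n_perm + 1)

# e.g. r, p = mantel_spearman(D_language, D_geography)
```

Note that with 10,000 permutations the smallest attainable p-value under this scheme is roughly 10⁻⁴, which matches the order of magnitude reported in the table above.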


Page 92: Computational Historical Linguistics

Correlations

Correlation of linguistic distances to various cranial distances

[Figure: scatter plots of Whole, Face, and Neurocranium cranial distances against linguistic distance (0.25 to 0.75)]


Page 93: Computational Historical Linguistics

Correlations

Correlation of linguistic distances to various cranial distances

               unconditional    conditioned on geography
Whole          0.296 (10⁻⁴)     0.222 (10⁻⁴)
Face           0.321 (10⁻⁴)     0.276 (10⁻⁴)
Neurocranium   0.246 (10⁻⁴)     0.155 (10⁻⁴)
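The "conditioned on geography" column can be thought of as a partial correlation between linguistic and cranial distances with land-based distance partialled out. Below is an illustrative sketch using the standard partial-correlation formula on Spearman coefficients; the matrix names are placeholders, and significance testing (again by permutation) is omitted.

```python
import numpy as np
from scipy.stats import spearmanr

def partial_spearman(D_x, D_y, D_z):
    """Partial Spearman correlation between distance matrices D_x and D_y,
    controlling for D_z, computed from the three pairwise coefficients."""
    iu = np.triu_indices_from(D_x, k=1)
    r_xy = spearmanr(D_x[iu], D_y[iu]).correlation
    r_xz = spearmanr(D_x[iu], D_z[iu]).correlation
    r_yz = spearmanr(D_y[iu], D_z[iu]).correlation
    return (r_xy - r_xz * r_yz) / np.sqrt((1 - r_xz**2) * (1 - r_yz**2))

# e.g. partial_spearman(D_language, D_face, D_geography)
```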


Page 94: Computational Historical Linguistics

Correlations within language families

intra-family correlation of language with

Whole: 0.290
Face: 0.200
Neurocranium: 0.272
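One straightforward way to compute such within-family correlations (and the across-family ones on the next slide) is to mask the pairwise distances by family membership before correlating. The sketch below is illustrative only; it assumes a vector `families` with one family label per population.

```python
import numpy as np
from scipy.stats import spearmanr

def family_masked_correlation(D_ling, D_pheno, families, within=True):
    """Spearman correlation between linguistic and phenotypic distances,
    restricted to population pairs from the same family (within=True)
    or from different families (within=False)."""
    families = np.asarray(families)
    same = families[:, None] == families[None, :]   # pairwise same-family indicator
    iu = np.triu_indices_from(D_ling, k=1)
    keep = same[iu] if within else ~same[iu]
    return spearmanr(D_ling[iu][keep], D_pheno[iu][keep]).correlation
```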

[Figure: within-family scatter plots of Whole, Face, and Neurocranium distances against linguistic distance (0.4 to 0.8)]


Page 95: Computational Historical Linguistics

Correlations across language families

inter-family correlation of language with

Whole: 0.139
Face: 0.177
Neurocranium: 0.120

[Figure: across-family scatter plots of Whole, Face, and Neurocranium distances against linguistic distance (0.80 to 0.92)]


Page 96: Computational Historical Linguistics

Separating language families

correlation of the degree of non-overlap of the candidate language families of a population pair with

Whole: 0.365
Face: 0.351
Neurocranium: 0.299
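One way to read this measure is as the correlation between a pairwise indicator (1 if the two populations' candidate family sets do not overlap, 0 if they share a candidate family) and the corresponding phenetic distance. The sketch below is an illustration under that reading; `candidate_families` (a list of sets, one per population) is a hypothetical input.

```python
import numpy as np
from scipy.stats import spearmanr

def family_separation_correlation(D_pheno, candidate_families):
    """Correlate phenetic distance with the indicator
    'candidate family sets of the two populations are disjoint' (1) vs.
    'they share at least one candidate family' (0)."""
    n = len(candidate_families)
    disjoint = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            disjoint[i, j] = float(candidate_families[i].isdisjoint(candidate_families[j]))
    iu = np.triu_indices(n, k=1)
    return spearmanr(disjoint[iu], D_pheno[iu]).correlation
```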

[Figure: distributions of Whole, Face, Neurocranium, and Language distances for population pairs from the same (0) vs. different (1) family]


Page 97: Computational Historical Linguistics

Aggregating language families

a population “belongs” to a given language family f if all candidate languages for that population belong to f

the phenetic (Whole, Face, Neurocranium)/geographical distance between the families f1 and f2 is defined as the average distance between the populations belonging to f1/f2 respectively

the linguistic distance between f1 and f2 is the average distance between all languages assigned to populations that belong to f1/f2 respectively (see the sketch below)
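A minimal sketch of this aggregation step, following the definitions above (the variable names and the population-to-family assignment are assumptions for illustration):

```python
import numpy as np

def aggregate_family_distance(D_pop, pop_families, fam1, fam2):
    """Average population-level distance between all populations assigned
    to family fam1 and all populations assigned to family fam2."""
    pop_families = np.asarray(pop_families)
    idx1 = np.where(pop_families == fam1)[0]
    idx2 = np.where(pop_families == fam2)[0]
    return D_pop[np.ix_(idx1, idx2)].mean()

# The linguistic distance between two families would be computed analogously,
# averaging over all languages assigned to the populations of each family.
```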


Page 98: Computational Historical Linguistics

Aggregating language families

aggregated correlations of language with
Whole: 0.198 (p = 0.013)
Face: 0.256 (p < 0.001)
Neurocranium: 0.178 (p = 0.028)

partial correlations, conditioned on land-based distance
Whole: 0.141 (p = 0.089)
Face: 0.219 (p = 0.003)
Neurocranium: 0.116 (p = 0.155)

[Figure: family-level scatter plots of Whole, Face, and Neurocranium distances against linguistic distance (0.82 to 0.90)]


Page 99: Computational Historical Linguistics

Considerations and hypotheses

• Evolutionary rate of change
  – Genes and neurocranium evolve slowly
  – Language and face evolve faster?

• Depth of population history
  – Genes and neurocranium track deep history
  – Language and face track recent history?

• Modes of transmission
  – Genes and neurocranium are vertically transmitted
  – Language and face are horizontally transmitted?

• Selection on face and language?
