Talk held at ESCOM 2009 in Jyväskylä
Melodic Complexity
Klaus Frieler, Universität Hamburg
Musikwissenschaftliches Institut
ESCOM 2009, Jyväskylä
12.8.2009
Melodic Complexity
• Perceived complexity emerges from a complex process within a signal/receiver system
• Hypothesis: Perceived complexity is a function of (objective) signal complexity
Melodic Complexity
• Idea: Test various algorithmic melodic complexity measures in psychological experiments
• If there are significant correlations, build a model
Melodic Complexity
Complexity algorithms for n-gram sequences:
• Entropies
• Zipf complexity
• N-gram redundancy
Algorithm Construction
• Given: Melody as onset/pitch sequences with metrical annotations
• Apply basic transformations
• Here: pitches, intervals, durations, metrical circle map (cf. later)
Algorithm Construction
• Main transformation: n-gram sequences, i.e. sequences of subsequences of length n
• Calculate histograms of n-grams
• Here: n = 1, 2, 3, 4, variable
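The n-gram construction above can be sketched in a few lines of Python (a minimal illustration; the function names are chosen here and do not come from the talk):

```python
from collections import Counter

def ngrams(seq, n):
    """Return the overlapping n-grams (as tuples) of a sequence."""
    return [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]

def ngram_histogram(seq, n):
    """Histogram (counts) of the n-grams of a sequence."""
    return Counter(ngrams(seq, n))
```

The same machinery applies to any of the basic transformations (pitches, intervals, durations), since each is just a sequence of symbols.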
Entropies
• Entropies of the n-gram distribution
• Normalize by maximum entropy
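A minimal sketch of the normalized entropy measure. The talk does not specify the normalization; here maximum entropy is taken as log₂ of the number of distinct observed n-grams (an assumption), which maps the result into [0, 1]:

```python
import math
from collections import Counter

def normalized_entropy(seq, n=1):
    # Histogram of the n-grams of the input sequence
    counts = Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))
    if len(counts) < 2:
        return 0.0  # a single n-gram type carries no information
    total = sum(counts.values())
    # Shannon entropy of the empirical n-gram distribution
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    # Assumption: normalize by the maximum entropy of the observed alphabet
    return h / math.log2(len(counts))
```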
Zipf complexity
Zipf‘s law: The ordered sequence of term frequencies obeys a power law (k = rank, slope s ≤ 0):
h(k) ∝ k^s
log h(k) ∝ s · log k
Zipf complexity
[Figure: Zipf‘s law on a log-log scale; source: Wikipedia]
Zipf complexity
• Ordered n-gram frequencies
• Regression on log-log data with slope s
• Define c := 2^s as Zipf complexity
• s = 0 ⇒ c = 1, s = -1 ⇒ c = 0.5, s → -∞ ⇒ c = 0
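The Zipf complexity computation can be sketched as follows (assuming an ordinary least-squares fit on the log-log data; the function name is invented here):

```python
import math
from collections import Counter

def zipf_complexity(seq, n=1):
    """c = 2**s, where s is the slope of a least-squares fit of
    log-frequency against log-rank of the ordered n-gram counts."""
    counts = Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))
    freqs = sorted(counts.values(), reverse=True)
    if len(freqs) < 2:
        return 1.0  # a single n-gram type: flat distribution, s = 0
    xs = [math.log(k) for k in range(1, len(freqs) + 1)]  # log rank
    ys = [math.log(f) for f in freqs]                     # log frequency
    mean_x = sum(xs) / len(xs)
    mean_y = sum(ys) / len(ys)
    # Ordinary least-squares slope
    s = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    return 2.0 ** s
```

For a flat distribution the slope is 0 and c = 1; the steeper the rank-frequency decay, the closer c gets to 0.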
N-gram redundancy
• The number of distinct elements in a sequence is a simple measure of redundancy: the more distinct elements, the more „complex“ the sequence
N-gram redundancy
• Let |n(s)| be the count of distinct n-grams in a sequence s of length N. Then c_n(s) = |n(s)| / (N − n + 1)
N-gram redundancy
• Extensions: Weighted sum of n-gram redundancies up to a fixed or variable n_max
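A sketch of the redundancy measure and its weighted extension. The slide's formula image did not survive extraction, so the normalization by N − n + 1 (the total number of n-grams) and the uniform default weights are assumptions made here:

```python
def ngram_redundancy(seq, n):
    """Share of distinct n-grams among all N - n + 1 n-grams
    (1 = all distinct, i.e. maximally 'complex')."""
    grams = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    return len(set(grams)) / len(grams)

def weighted_redundancy(seq, n_max, weights=None):
    """Weighted sum of n-gram redundancies up to n_max
    (uniform weights by default -- an assumption)."""
    if weights is None:
        weights = [1.0 / n_max] * n_max
    return sum(w * ngram_redundancy(seq, n)
               for w, n in zip(weights, range(1, n_max + 1)))
```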
Metrical Circle Map
Ex.: „Mandy“ by Barry Manilow
Experiments
– Two listening experiments with a total of 47 subjects
– Stimuli: 12 folk songs and 3 jazz saxophone choruses; 9 melodies identical in both experiments
– Task: Judgement of melodic complexity on a scale from 1 to 7
Results
• Normally distributed, reliable judgements
⇒ Pooling of data from both experiments
⇒ Subject means used for further comparisons
Results
42 complexity measures:
– Note count
– Metrical Markov entropies (0th, 1st order)
– Zipf complexities (int, pitch, dur)
– N-gram redundancies (int, pitch, dur)
– Entropies (int, pitch, dur)
Correlations
Note count: r = .869**
Results
• Note count explains the judgements nearly perfectly!
• Calculating partial correlations for the other measures ⇒ only the metrical entropies remain
Correlations
0th order metrical entropy
r = .934**, r′ = .837**
Correlations
1st order metrical entropy
r = .944**, r′ = .867**
Correlations
Pitch entropy
r = .132, r′ = .092
Linear Regression
• Stepwise regression of the variables with the highest correlations
• Adjusted R² = .929
z_subjmean = .345 · z_notecount + .677 · z_meter1ent
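The regression equation can be written as a tiny helper (a sketch; the variable names are invented here, and the z-scores must be computed relative to the experiment's own means and standard deviations):

```python
def predict_complexity_z(z_note_count, z_meter1_entropy):
    """Predicted (z-scored) complexity judgement from the stepwise
    regression on the slide (adjusted R^2 = .929)."""
    return 0.345 * z_note_count + 0.677 * z_meter1_entropy
```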
Conclusion
• Good agreement with measured complexity
• 1st order metrical Markov entropy shows the highest correlation
• But: Note count accounts for most of the correlation
⇒ Rather simple complexity?!
Conclusion
• No partial correlation with any pitch- or interval-based measure could be found
• Meter is the most important dimension…
Outlook
• We plan experiments with the note count kept constant
• Pretests show that metrical entropies might be suitable for predicting the „hit potential“ of pop melodies
Thank you!
Metrical Intervals
Metrical Markov chain, 0th order
Metrical Markov chains, 1st order