
Chaos and Randomness

    Ryan Marshall and Max Proctor

    April 2, 2014

    Abstract

We explore correlations between the outputs of chaotic mappings and compare them to known statistical theorems.

    1 Independent Random Variables

In our project we will reinforce some basic statistical theorems, which will allow us to contrast and compare independent random variables with sets of numbers from dynamical systems. We know that dynamical systems are prone to chaos (sensitive dependence on initial conditions), so we wish to study the mappings of dynamical systems in order to better determine the behaviour of a system in the presence of chaos.

We will begin our study with an understanding of randomness by considering sets of independent random variables. Although it may seem like the very definition of random suggests that there is no correlation between sets of randomly generated numbers, mathematics can show that there are intimate relationships between these numbers. The ideas we will be studying do not give us a relationship between two elements of the set, but rather describe the behaviour of the set of independent random variables.

Consider the Law of Large Numbers (LLN), which states that the sample average converges almost surely to the expected value as the sample size increases to infinity. This theorem suggests that sample averages become 'predictable' in the long term. This appears to be in stark contrast to chaotic dynamical systems: sensitive dependence on initial conditions is, on some level, the idea that solutions are unpredictable in the long term.

The proof of the law of large numbers is time-consuming and does not lead to a broader understanding of the LLN, so we will omit it. We have, however, taken advantage of computer simulation to gain insight into the LLN. The idea behind our simulation was to generate random samples of different sizes (with some chosen mean and standard deviation), determine the mean of each generated sample, and then plot each sample's mean against its size (i.e. mean vs. $n$). To demonstrate the theorem one would expect a plot that deviates from the expected value for small $n$, with the sample mean converging to the expected value as $n$ increases.

[Figure: plot titled "Law of Large Numbers", showing the sample mean (y-axis, roughly 8 to 12) against n (x-axis, 0 to 10,000).]

The above graph demonstrates the simulation with values as follows:

$\mu = 10$
$\sigma^2 = 3$
$n_{\max} = 10{,}000$

You can clearly infer that as $n$ increases the sample mean converges to $\mu = 10$.
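
The original code for this simulation is not shown here; the following is a minimal Python sketch of such a simulation, assuming normally distributed samples with $\mu = 10$, $\sigma^2 = 3$, and sizes up to $n_{\max} = 10{,}000$ (matching the parameters above):

# LLN simulation sketch: for each sample size n, draw a fresh sample from
# N(mu, sigma^2), compute its mean, and plot the means against n.
import numpy as np
import matplotlib.pyplot as plt

mu, sigma2, n_max = 10, 3, 10_000
rng = np.random.default_rng(seed=1)

ns = np.arange(10, n_max + 1, 10)                      # sample sizes to test
means = [rng.normal(mu, np.sqrt(sigma2), size=n).mean() for n in ns]

plt.plot(ns, means, linewidth=0.5)
plt.axhline(mu, color="grey", linestyle="--")          # the expected value mu = 10
plt.xlabel("n")
plt.ylabel("Mean")
plt.title("Law of Large Numbers")
plt.show()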

The predictable behaviour of these sets of independent random variables prompts us to investigate other relations between them, so we now briefly review the Central Limit Theorem (CLT).

Suppose $X_1, X_2, \dots$ is a sequence of i.i.d. random variables, each with expected value $\mu$ and variance $\sigma^2$, and let $S_n = \sum_{i=1}^{n} X_i$. The CLT states that for large $n$, $S_n$ is approximately a normal random variable with expected value $n\mu$ and variance $n\sigma^2$. As a result, $P\!\left(\frac{S_n - n\mu}{\sigma\sqrt{n}} \le x\right)$ is approximately the standard normal distribution function $\Phi(x)$, with the approximation becoming more exact as $n$ grows larger.

This is more subtle than one might think. The implication of the hypotheses is that the quantity $S_n$ will be approximately a normal random variable with expected value $n\mu$ and variance $n\sigma^2$; the second part of the theorem is essentially a corollary.

To simplify things, the CLT says that if you sum together numerous independent random variables from the same probability distribution, then as you increase the number of terms in the sum (i.e. increase $n$), the distribution of the sum converges to a normal distribution. This also implies that $\frac{S_n - n\mu}{\sigma\sqrt{n}}$ converges to the standard normal distribution.

This is pretty useful. It gives you a way to easily compute many different probabilities from many different distributions.

We attempted to demonstrate the CLT through simulation as follows. First we generate many different samples to create the $S_n$'s, then we calculate the normalized $S_n$ via $\frac{S_n - n\mu}{\sigma\sqrt{n}}$, and finally we compute a probability distribution function (PDF) for these normalized $S_n$'s. From there we can compare this PDF to the known PDF of the standard normal distribution.

[Figure: histogram ("Frequency" against normalized $S_n$) over roughly $-4$ to $4$, with counts up to about 800.]

We ran this simulation with 10,000 $S_n$'s, each formed from one of 10,000 generated Poisson samples. As you can see, the result is very close to the PDF given by the standard normal distribution, reinforcing the validity of our simulation.
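
A minimal Python sketch of a CLT simulation along these lines follows. The Poisson rate used for the figure above is not stated, so $\lambda = 4$ here is an illustrative assumption (for a Poisson distribution, $\mu = \sigma^2 = \lambda$):

# CLT simulation sketch: build many S_n's, each the sum of an i.i.d. Poisson
# sample, normalize them, and compare their histogram to the standard normal PDF.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=1)
lam, n, reps = 4, 10_000, 10_000      # Poisson rate (assumed), sample size, number of S_n's

S = np.array([rng.poisson(lam, size=n).sum() for _ in range(reps)])
Z = (S - n * lam) / np.sqrt(n * lam)  # (S_n - n*mu) / (sigma*sqrt(n)), with mu = sigma^2 = lam

plt.hist(Z, bins=50, density=True, alpha=0.6, label="normalized $S_n$")
x = np.linspace(-4, 4, 400)
plt.plot(x, np.exp(-x**2 / 2) / np.sqrt(2 * np.pi), label="standard normal PDF")
plt.xlabel("normalized $S_n$")
plt.legend()
plt.show()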

From our review of these statistical theorems we can clearly see that there are predictable trends in random numbers. This will help us develop our original question of whether or not mappings of chaotic dynamical systems exhibit similar behaviour, and will guide our study in understanding what the mappings do correlate to.

2 Chaotic Maps

We now consider the dynamical system $T(z) = 4z(1 - z)$. It is worth noting that this is similar to the logistic equation $x_{n+1} = r x_n(1 - x_n)$. A good place to first consider this dynamical system is to look at the bifurcations of the logistic equation. Below is the relevant bifurcation diagram:

We certainly expect to see chaos for the map $T(z) = 4z(1 - z)$. Setting $z_0 = 0.3$ and iterating using $z_n = 4z_{n-1}(1 - z_{n-1})$ we get $z_{1000} = 0.0401$, but if $z_0 = 0.299$ then we get $z_{1000} = 0.8439$. This may not prove that there is chaos, but it definitely looks like that is the case.
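
The iteration itself takes only a few lines; a Python sketch of the comparison above is shown below (the exact digits obtained after 1000 iterations of a chaotic map depend on floating-point rounding, so they may differ slightly from the values quoted):

# Iterate T(z) = 4z(1 - z) n times from a given starting point.
def iterate_map(z0, n=1000):
    z = z0
    for _ in range(n):
        z = 4 * z * (1 - z)
    return z

print(iterate_map(0.3))    # z_1000 starting from 0.3   (reported above as about 0.0401)
print(iterate_map(0.299))  # z_1000 starting from 0.299 (reported above as about 0.8439)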

So if the map $T(z) = 4z(1 - z)$ creates chaos, then maybe we can use this map to compare chaos to independent random variables. We have already discussed the strong law of large numbers, but do the $z_n$ of the map follow this law? As this map creates chaos, we find that as $n$ tends to infinity our values $z_0, \dots, z_n$ are dense in the unit interval $[0, 1]$. It is worth checking how the iterates behave on average. To do this we can compute the $z_n$ iteratively, sum all $n$ of them together, and divide the sum by $n$; this gives the mean of the $z_n$'s. This is easy to do on a computer. We do this for $z_0 = 0.3421$ (chosen arbitrarily).
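
A Python sketch of this computation is below (whether $z_0$ itself is included in the sum, and floating-point rounding, can shift the reported digits slightly):

# Compute sum(z)/n for the orbit z_1, ..., z_n of T(z) = 4z(1 - z) from z_0.
def orbit_mean(z0, n):
    z, total = z0, 0.0
    for _ in range(n):
        z = 4 * z * (1 - z)
        total += z
    return total / n

for n in (10, 1_000, 100_000, 10_000_000):
    print(n, orbit_mean(0.3421, n))   # tends towards 0.5 as n grows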

For $n = 10$: $\mathrm{sum}(z)/n = 0.556455392058685$
For $n = 1{,}000$: $\mathrm{sum}(z)/n = 0.515545540139079$
For $n = 100{,}000$: $\mathrm{sum}(z)/n = 0.50246319958991$
For $n = 10{,}000{,}000$: $\mathrm{sum}(z)/n = 0.500034797296272$

We can see that as $n$ tends to infinity, $\mathrm{sum}(z)/n$ tends towards $0.5$. In fact we get this same result for any $z_0$ chosen on $(0, 1)$ barring $z_0 = \frac{1}{2}$. With the $z_n$'s seeming to be fairly random on the unit interval it is no surprise to expect a mean of $0.5$ for a large number of them, and this result is analogous to the strong law of large numbers for independent random variables. This is certainly a similarity between chaos and randomness.

Again we use our recently acquired statistical knowledge to make another comparison, this time with the CLT. To do this we generated 1000 random variables with $\mu = 0.5$ and $\sigma^2 = 0.25$, in an attempt to keep them within $[0, 1]$, and used them as our initial conditions. We then mapped each initial condition 1000 times, and from these orbits we created our $S_n$'s (analogous to the CLT). From here it was easy to create a histogram to see if it matched up with the PDF of the standard normal distribution.

[Figure: histogram titled "CLT with Chaos" ("Frequency" against normalized $S_n$) over roughly $-4$ to $4$, with counts up to about 60.]

It is clear that even for fairly small $n$ (1000), the data correlates very closely with what we observed earlier while introducing the CLT.
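
A sketch of this experiment in Python follows. The exact initial-condition distribution and the normalization used for the figure above are not spelled out, so this version draws each $z_0$ uniformly on $(0, 1)$ and standardizes the $S_n$'s empirically; both choices are assumptions made for illustration:

# "CLT with chaos" sketch: sum each orbit of length 1000 to form an S_n,
# standardize the S_n's, and plot a histogram of the results.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=1)
n_orbits, n_iter = 1000, 1000

S = np.empty(n_orbits)
for i in range(n_orbits):
    z, total = rng.uniform(0, 1), 0.0
    for _ in range(n_iter):
        z = 4 * z * (1 - z)
        total += z
    S[i] = total

Z = (S - S.mean()) / S.std()       # empirical standardization (an assumption)
plt.hist(Z, bins=30)
plt.title("CLT with Chaos")
plt.xlabel("normalized $S_n$")
plt.ylabel("Frequency")
plt.show()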

Another way we can compare chaos and randomness is to plot $z_n$ against $z_{n+2}$ and $x_n$ against $x_{n+2}$. For this example we produce 1000 binomially distributed random numbers $x_1$ to $x_{1000}$ with parameters $n = 10$, $p = 0.6$. These numbers are independent of one another, and so we expect this to be reflected when we plot $x_n$ against $x_{n+2}$:

The expectation for these random variables is $np = 6$, so this diagram appears reasonable. If we were to produce another 1000 binomial random numbers with the same parameters we would expect a slightly different diagram, despite it still being centred on the expectation. We get a similar diagram if we do this with numbers generated from a different distribution (Poisson, for example), the main difference being that each diagram is roughly centred on its own expectation, as above.
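
A Python sketch of the $x_n$ versus $x_{n+2}$ plot for the binomial case (binomial values are integers, so many points overlap):

# Scatter 1000 Binomial(10, 0.6) draws as (x_n, x_{n+2}) pairs.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=1)
x = rng.binomial(n=10, p=0.6, size=1000)

plt.scatter(x[:-2], x[2:], s=10)   # pair each x_n with x_{n+2}
plt.xlabel("$x_n$")
plt.ylabel("$x_{n+2}$")
plt.show()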

So do we get a similar graph if we do the same for the $z_n$'s output by the map $T(z) = 4z(1 - z)$? One would expect the graph to appear less random, since each $z_n$ is derived from $z_{n-1}$; in other words, they are not truly independent. Using the $z_0, \dots, z_n$ generated from $z_n = 4z_{n-1}(1 - z_{n-1})$ with $z_0 = 0.7884$ (again, chosen arbitrarily), $z_n$ plotted against $z_{n+2}$ looks like:

It looks like all of the points $(z_n, z_{n+2})$ fall in what is roughly an M-shape. In fact, if we plot each of these points as a bubble without connecting them, we see the M-shape much more clearly. We do this below for even larger $n$, here $n = 1{,}000{,}000$:

This is very interesting indeed: each point $(z_n, z_{n+2})$ falls on this M-shape perfectly. This diagram certainly implies the dependence of the $z_n$'s, in striking contrast to what we witnessed for the case of the i.i.d. random numbers.
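
A sketch of the corresponding plot for the chaotic orbit (a smaller $n$ than the $1{,}000{,}000$ used above is enough to reveal the shape, since every point lies on the curve $z_{n+2} = T^2(z_n)$):

# Generate an orbit of T(z) = 4z(1 - z) and scatter (z_n, z_{n+2}).
import numpy as np
import matplotlib.pyplot as plt

n, z0 = 100_000, 0.7884
z = np.empty(n)
z[0] = z0
for i in range(1, n):
    z[i] = 4 * z[i - 1] * (1 - z[i - 1])

plt.scatter(z[:-2], z[2:], s=1)    # every point falls on the M-shaped curve of T^2
plt.xlabel("$z_n$")
plt.ylabel("$z_{n+2}$")
plt.show()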

It may now be of interest to plot $z_n$ against $z_{n+1}$ for the chaotic case:

Here we see that there is one maximum on this graph. Could this be linked to the fact that $z_{n+1} = T(z_n)$? We had two maxima in the previous case, and so it may be worth noting that $z_{n+2} = T(z_{n+1}) = T^2(z_n)$. We can now check how many maxima occur when we plot $z_n$ against $z_{n+3}$ (bearing in mind that $z_{n+3} = T^3(z_n)$):

Clearly our hypothesis regarding the maxima has broken down. What about plotting $z_n$ against $z_{n+4}$? We get:

Now what happens if we make similar plots when the $x_n$ are i.i.d. random numbers? We have already seen what this looks like for $x_n$ plotted against $x_{n+2}$. It turns out that the graph is extremely similar when we plot $x_n$ against $x_{n+1}$, or even if we plot $x_n$ against $x_{n+1000}$. Should this surprise us? Not really: we have already outlined the fact that these numbers are independent. $x_{n+1}$ is not dependent on $x_n$, so clearly $x_{n+2}$ is not dependent on $x_n$ either.

This outlines a major difference between chaos and randomness. Chaos may sometimes appear random, but it really isn't, and this is due to the deterministic nature of a map. We have sensitive dependence on initial conditions, but if we use exactly the same $z_0$ to calculate $z_{1000}$ then we will get exactly the same value for $z_{1000}$ every single time. Clearly this is not the case with independent random numbers: generating two i.i.d. random numbers will give us two different numbers with probability extremely close to 1.

We noted earlier that chaos seems to obey the strong law of large numbers. It would certainly be of interest to see if the same applies to the central limit theorem. Is chaos normally distributed? To check this we find $z_1, \dots, z_{10{,}000}$ starting from $z_0$, before computing $\frac{z_1 + \dots + z_{10{,}000}}{10{,}000}$. We do this $1{,}000$ times using different values of $z_0$ (randomly chosen from a uniform distribution). We then plot all of these values on a histogram.
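
A Python sketch of this check (the shape of the histogram, rather than the exact counts, is what matters):

# For 1000 uniformly chosen z_0's, compute the mean of the first 10,000 iterates
# of T(z) = 4z(1 - z) and histogram these orbit means.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=1)
n_repeats, n_iter = 1000, 10_000

means = np.empty(n_repeats)
for i in range(n_repeats):
    z, total = rng.uniform(0, 1), 0.0
    for _ in range(n_iter):
        z = 4 * z * (1 - z)
        total += z
    means[i] = total / n_iter

plt.hist(means, bins=30)
plt.xlabel("orbit mean")
plt.ylabel("Frequency")
plt.show()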

We can see that this histogram loosely follows the bell-shaped curve of a normal distribution: chaos appears to satisfy the central limit theorem. Now what happens when we do this for $y_n = f(z_n)$ or $y_n = f(x_n)$, where $f$ is an arbitrary function? We find that the sample means of the $f(x_n)$ are approximately normally distributed for i.i.d. $x_n$. To consider the chaos case we take $f(s) = 3s^2 - 2s + 7$. Doing the same as above, but with $\frac{y_1 + \dots + y_{10{,}000}}{10{,}000}$, we get:

Once again it looks like these values are obeying the central limit theorem, a trait of random numbers.
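
The transformed version of the sketch only requires applying $f$ to each iterate before averaging; with $f(s) = 3s^2 - 2s + 7$ as above:

# Same experiment as the previous sketch, but averaging y_n = f(z_n).
import numpy as np
import matplotlib.pyplot as plt

def f(s):
    return 3 * s**2 - 2 * s + 7

rng = np.random.default_rng(seed=1)
n_repeats, n_iter = 1000, 10_000

means = np.empty(n_repeats)
for i in range(n_repeats):
    z, total = rng.uniform(0, 1), 0.0
    for _ in range(n_iter):
        z = 4 * z * (1 - z)
        total += f(z)
    means[i] = total / n_iter

plt.hist(means, bins=30)
plt.xlabel("mean of $y_n = f(z_n)$")
plt.ylabel("Frequency")
plt.show()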

From the above it appears that chaos behaves similarly to randomness, albeit in a more constrained way. We have seen that chaos obeys the strong law of large numbers and the central limit theorem, indicating that the behaviour of both chaos and randomness settles down when we consider a large number of chaotic $z_n$ or random $x_n$. The fundamental difference is that chaos is fully dependent on what has happened in the past, in stark contrast to random numbers, which are independent of one another. This was certainly apparent when plotting $x_n$ against $x_{n+i}$ and $z_n$ against $z_{n+i}$ for various $i$. We also noted that plotting the chaotic $z_n$ against $z_{n+i}$ was consistent with our "constrained" comment, whereas doing the same for random $x_n$ was certainly less constrained. Despite sharing some similarities, it is clear that chaos is not the same as randomness.
