
Stochastic Processes - Ross


STOCHASTIC PROCESSES
Second Edition

Sheldon M. Ross
University of California, Berkeley

JOHN WILEY & SONS, INC.
New York  Chichester  Brisbane  Toronto  Singapore

ACQUISITIONS EDITOR  Brad Wiley II
MARKETING MANAGER  Debra Riegert
SENIOR PRODUCTION EDITOR  Tony VenGraitis
MANUFACTURING MANAGER  Dorothy Sinclair
TEXT AND COVER DESIGN  A Good Thing, Inc.
PRODUCTION COORDINATION  Elm Street Publishing Services, Inc.

This book was set in Times Roman by Bi-Comp, Inc. and printed and bound by Courier/Stoughton. The cover was printed by Phoenix Color.

Recognizing the importance of preserving what has been written, it is a policy of John Wiley & Sons, Inc. to have books of enduring value published in the United States printed on acid-free paper, and we exert our best efforts to that end. The paper in this book was manufactured by a mill whose forest management programs include sustained yield harvesting of its timberlands. Sustained yield harvesting principles ensure that the number of trees cut each year does not exceed the amount of new growth.

Copyright 1996, by John Wiley & Sons, Inc.
All rights reserved. Published simultaneously in Canada.

Reproduction or translation of any part of this work beyond that permitted by Sections 107 and 108 of the 1976 United States Copyright Act without the permission of the copyright owner is unlawful. Requests for permission or further information should be addressed to the Permissions Department, John Wiley & Sons, Inc.

Library of Congress Cataloging-in-Publication Data:
Ross, Sheldon M.
Stochastic processes / Sheldon M. Ross. - 2nd ed.
p. cm.
Includes bibliographical references and index.
ISBN 0-471-12062-6 (cloth : alk. paper)
1. Stochastic processes. I. Title.
QA274.R65 1996
519.2-dc20  95-38012 CIP

Printed in the United States of America
10 9 8 7 6 5 4 3 2

On March 30, 1980, a beautiful six-year-old girl died.
This book is dedicated to the memory of Nichole Pomaras

Preface to the First Edition

This text is a nonmeasure theoretic introduction to stochastic processes, and as such assumes a knowledge of calculus and elementary probability. In it we attempt to present some of the theory of stochastic processes, to indicate its diverse range of applications, and also to give the student some probabilistic intuition and insight in thinking about problems. We have attempted, wherever possible, to view processes from a probabilistic instead of an analytic point of view. This attempt, for instance, has led us to study most processes from a sample path point of view.

I would like to thank Mark Brown, Cyrus Derman, Shun-Chen Niu, Michael Pinedo, and Zvi Schechner for their helpful comments.

SHELDON M. ROSS

Preface to the Second Edition

The second edition of Stochastic Processes includes the following changes:

(i) Additional material in Chapter 2 on compound Poisson random variables, including an identity that can be used to efficiently compute moments, and which leads to an elegant recursive equation for the probability mass function of a nonnegative integer valued compound Poisson random variable;
(ii) A separate chapter (Chapter 6) on martingales, including sections on the Azuma inequality; and
(iii) A new chapter (Chapter 10) on Poisson approximations, including both the Stein-Chen method for bounding the error of these approximations and a method for improving the approximation itself.

In addition, we have added numerous exercises and problems throughout the text. Additions to individual chapters follow:

In Chapter 1, we have new examples on the probabilistic method, the multivariate normal distribution, random walks on graphs, and the complete match problem. Also, we have new sections on probability inequalities (including Chernoff bounds) and on Bayes estimators (showing that they are almost never unbiased). A proof of the strong law of large numbers is given in the Appendix to this chapter.

New examples on patterns and on memoryless optimal coin tossing strategies are given in Chapter 3.
There is new material in Chapter 4 covering the mean time spent in transient states, as well as examples relating to the Gibbs sampler, the Metropolis algorithm, and the mean cover time in star graphs.

Chapter 5 includes an example on a two-sex population growth model.

Chapter 6 has additional examples illustrating the use of the martingale stopping theorem.

Chapter 7 includes new material on Spitzer's identity and using it to compute mean delays in single-server queues with gamma-distributed interarrival and service times.

Chapter 8 on Brownian motion has been moved to follow the chapter on martingales to allow us to utilize martingales to analyze Brownian motion.

Chapter 9 on stochastic order relations now includes a section on associated random variables, as well as new examples utilizing coupling in coupon collecting and bin packing problems.

We would like to thank all those who were kind enough to write and send comments about the first edition, with particular thanks to He Sheng-wu, Stephen Herschkorn, Robert Kertz, James Matis, Erol Pekoz, Maria Rieders, and Tomasz Rolski for their many helpful comments.

SHELDON M.
Ross

Contents

CHAPTER 1. PRELIMINARIES 1
1.1. Probability 1
1.2. Random Variables 7
1.3. Expected Value 9
1.4. Moment Generating, Characteristic Functions, and Laplace Transforms 15
1.5. Conditional Expectation 20
1.5.1. Conditional Expectations and Bayes Estimators 33
1.6. The Exponential Distribution, Lack of Memory, and Hazard Rate Functions 35
1.7. Some Probability Inequalities 39
1.8. Limit Theorems 41
1.9. Stochastic Processes 41
Problems 46
References 55
Appendix 56

CHAPTER 2. THE POISSON PROCESS 59
2.1. The Poisson Process 59
2.2. Interarrival and Waiting Time Distributions 64
2.3. Conditional Distribution of the Arrival Times 66
2.3.1. The M/G/1 Busy Period 73
2.4. Nonhomogeneous Poisson Process 78
2.5. Compound Poisson Random Variables and Processes 82
2.5.1. A Compound Poisson Identity 84
2.5.2. Compound Poisson Processes 87
2.6. Conditional Poisson Processes 88
Problems 89
References 97

CHAPTER 3. RENEWAL THEORY 98
3.1. Introduction and Preliminaries 98
3.2. Distribution of N(t) 99
3.3. Some Limit Theorems 101
3.3.1. Wald's Equation 104
3.3.2. Back to Renewal Theory 106
3.4. The Key Renewal Theorem and Applications 109
3.4.1. Alternating Renewal Processes 114
3.4.2. Limiting Mean Excess and Expansion of m(t) 119
3.4.3. Age-Dependent Branching Processes 121
3.5. Delayed Renewal Processes 123
3.6. Renewal Reward Processes 132
3.6.1. A Queueing Application 138
3.7. Regenerative Processes 140
3.7.1. The Symmetric Random Walk and the Arc Sine Laws 142
3.8. Stationary Point Processes 149
Problems 153
References 161

CHAPTER 4. MARKOV CHAINS 163
4.1. Introduction and Examples 163
4.2. Chapman-Kolmogorov Equations and Classification of States 167
4.3. Limit Theorems 173
4.4. Transitions among Classes, the Gambler's Ruin Problem, and Mean Times in Transient States 185
4.5. Branching Processes 191
4.6. Applications of Markov Chains 193
4.6.1. A Markov Chain Model of Algorithmic Efficiency 193
4.6.2. An Application to Runs: A Markov Chain with a Continuous State Space 195
4.6.3. List Ordering Rules: Optimality of the Transposition Rule 198
4.7. Time-Reversible Markov Chains 203
4.8. Semi-Markov Processes 213
Problems 219
References 230

CHAPTER 5. CONTINUOUS-TIME MARKOV CHAINS 231
5.1. Introduction 231
5.2. Continuous-Time Markov Chains 231
5.3. Birth and Death Processes 233
5.4. The Kolmogorov Differential Equations 239
5.4.1. Computing the Transition Probabilities 249
5.5. Limiting Probabilities 251
5.6. Time Reversibility 257
5.6.1. Tandem Queues 262
5.6.2. A Stochastic Population Model 263
5.7. Applications of the Reversed Chain to Queueing Theory 270
5.7.1. Network of Queues 271
5.7.2. The Erlang Loss Formula 275
5.7.3. The M/G/1 Shared Processor System 278
5.8. Uniformization 282
Problems 286
References 294

CHAPTER 6. MARTINGALES 295
Introduction 295
6.1. Martingales 295
6.2. Stopping Times 298
6.3. Azuma's Inequality for Martingales 305
6.4. Submartingales, Supermartingales, and the Martingale Convergence Theorem 313
6.5. A Generalized Azuma Inequality 319
Problems 322
References 327

CHAPTER 7. RANDOM WALKS 328
Introduction 328
7.1. Duality in Random Walks 329
7.2. Some Remarks Concerning Exchangeable Random Variables 338
7.3. Using Martingales to Analyze Random Walks 341
7.4. Applications to GI/G/1 Queues and Ruin Problems 344
7.4.1. The GI/G/1 Queue 344
7.4.2. A Ruin Problem 347
7.5. Blackwell's Theorem on the Line 349
Problems 352
References 355

CHAPTER 8. BROWNIAN MOTION AND OTHER MARKOV PROCESSES 356
8.1. Introduction and Preliminaries 356
8.2. Hitting Times, Maximum Variable, and Arc Sine Laws 363
8.3. Variations on Brownian Motion 366
8.3.1. Brownian Motion Absorbed at a Value 366
8.3.2. Brownian Motion Reflected at the Origin 368
8.3.3. Geometric Brownian Motion 368
8.3.4. Integrated Brownian Motion 369
8.4. Brownian Motion with Drift 372
8.4.1. Using Martingales to Analyze Brownian Motion 381
8.5. Backward and Forward Diffusion Equations 383
8.6. Applications of the Kolmogorov Equations to Obtaining Limiting Distributions 385
8.6.1. Semi-Markov Processes 385
8.6.2. The M/G/1 Queue 388
8.6.3. A Ruin Problem in Risk Theory 392
8.7. A Markov Shot Noise Process 393
8.8. Stationary Processes 396
Problems 399
References 403

CHAPTER 9. STOCHASTIC ORDER RELATIONS 404
Introduction 404
9.1. Stochastically Larger 404
9.2. Coupling 409
9.2.1. Stochastic Monotonicity Properties of Birth and Death Processes 416
9.2.2. Exponential Convergence in Markov Chains 418
9.3. Hazard Rate Ordering and Applications to Counting Processes 420
9.4. Likelihood Ratio Ordering 428
9.5. Stochastically More Variable 433
9.6. Applications of Variability Orderings 437
9.6.1. Comparison of G/G/1 Queues 439
9.6.2. A Renewal Process Application 440
9.6.3. A Branching Process Application 443
9.7. Associated Random Variables 446
Problems 449
References 456

CHAPTER 10. POISSON APPROXIMATIONS 457
Introduction 457
10.1. Brun's Sieve 457
10.2. The Stein-Chen Method for Bounding the Error of the Poisson Approximation 462
10.3. Improving the Poisson Approximation 467
Problems 470
References 472

ANSWERS AND SOLUTIONS TO SELECTED PROBLEMS 473
INDEX 505

CHAPTER 1
Preliminaries

1.1 PROBABILITY

A basic notion in probability theory is that of a random experiment: an experiment whose outcome cannot be determined in advance. The set of all possible outcomes of an experiment is called the sample space of that experiment, and we denote it by S.

An event is a subset of a sample space, and is said to occur if the outcome of the experiment is an element of that subset. We shall suppose that for each event E of the sample space S a number P(E) is defined and satisfies the following three axioms*:

Axiom (1). 0 ≤ P(E) ≤ 1.
Axiom (2). P(S) = 1.
Axiom (3). For any sequence of events E_1, E_2, ... that are mutually exclusive, that is, events for which E_i E_j = ∅ when i ≠ j (where ∅ is the null set),

P(∪_{i=1}^∞ E_i) = Σ_{i=1}^∞ P(E_i).

We refer to P(E) as the probability of the event E.

Some simple consequences of Axioms (1), (2), and (3) are:

1.1.1. If E ⊂ F, then P(E) ≤ P(F).
1.1.2. P(E^c) = 1 − P(E), where E^c is the complement of E.
1.1.3. P(∪_{i=1}^n E_i) = Σ_{i=1}^n P(E_i) when the E_i are mutually exclusive.
1.1.4. P(∪_{i=1}^∞ E_i) ≤ Σ_{i=1}^∞ P(E_i).

The inequality (1.1.4) is known as Boole's inequality.

* Actually P(E) will only be defined for the so-called measurable events of S. But this restriction need not concern us.

An important property of the probability function P is that it is continuous.
To make this more precise, we need the concept of a limiting event, which we define as follows. A sequence of events {E_n, n ≥ 1} is said to be an increasing sequence if E_n ⊂ E_{n+1}, n ≥ 1, and is said to be decreasing if E_n ⊃ E_{n+1}, n ≥ 1. If {E_n, n ≥ 1} is an increasing sequence of events, then we define a new event, denoted by lim_{n→∞} E_n, by

lim_{n→∞} E_n = ∪_{i=1}^∞ E_i  when E_n ⊂ E_{n+1}, n ≥ 1.

Similarly, if {E_n, n ≥ 1} is a decreasing sequence, then define lim_{n→∞} E_n by

lim_{n→∞} E_n = ∩_{i=1}^∞ E_i  when E_n ⊃ E_{n+1}, n ≥ 1.

We may now state the following:

PROPOSITION 1.1.1
If {E_n, n ≥ 1} is either an increasing or decreasing sequence of events, then

lim_{n→∞} P(E_n) = P(lim_{n→∞} E_n).

Proof. Suppose, first, that {E_n, n ≥ 1} is an increasing sequence, and define events F_n, n ≥ 1, by

F_1 = E_1,
F_n = E_n (∪_{i=1}^{n−1} E_i)^c = E_n E_{n−1}^c, n > 1.

That is, F_n consists of those points in E_n that are not in any of the earlier E_i, i < n. The F_n are mutually exclusive, with ∪_{i=1}^n F_i = E_n and ∪_{i=1}^∞ F_i = ∪_{i=1}^∞ E_i; hence

P(lim_n E_n) = Σ_{i=1}^∞ P(F_i) = lim_n Σ_{i=1}^n P(F_i) = lim_n P(E_n),

and the result is proven. The decreasing case follows by applying the above to the complements.

A consequence of the continuity property is the Borel-Cantelli lemma: if Σ_n P(E_n) < ∞, then

P{an infinite number of the E_n occur} = P{∩_n ∪_{i=n}^∞ E_i} ≤ lim_{n→∞} Σ_{i=n}^∞ P(E_i) = 0.

EXAMPLE 1.1(a). Let X_1, X_2, ... be such that

P{X_n = 0} = 1/n² = 1 − P{X_n = 1}, n ≥ 1.

If we let E_n = {X_n = 0}, then, as Σ_n P(E_n) < ∞, it follows from the Borel-Cantelli lemma that the probability that X_n equals 0 for an infinite number of n is 0. Hence X_n equals 1 for all but a finite number of n, and so, with probability 1, X_n → 1 as n → ∞.

PROPOSITION 1.1.3 (Converse to the Borel-Cantelli Lemma)
If E_1, E_2, ... are independent events such that

Σ_{n=1}^∞ P(E_n) = ∞,

then

P{an infinite number of the E_n occur} = 1.

Proof.

P{an infinite number of the E_n occur} = P{∩_n ∪_{i=n}^∞ E_i} = lim_{n→∞} P(∪_{i=n}^∞ E_i).

Now,

P(∩_{i=n}^∞ E_i^c) = Π_{i=n}^∞ (1 − P(E_i))  (by independence)
≤ Π_{i=n}^∞ e^{−P(E_i)}  (by the inequality 1 − x ≤ e^{−x})
= exp(−Σ_{i=n}^∞ P(E_i))
= 0, since Σ_{i=n}^∞ P(E_i) = ∞ for all n.

Hence the result follows.

EXAMPLE 1.1(c). Let X_1, X_2, ... be independent and such that

P{X_n = 0} = 1/n = 1 − P{X_n = 1}, n ≥ 1.

If we let E_n = {X_n = 0}, then as Σ_n P(E_n) = ∞ it follows from Proposition 1.1.3 that E_n occurs infinitely often. Also, as Σ_n P(E_n^c) = ∞ it also follows that E_n^c occurs infinitely often. Hence, with probability 1, X_n will equal 0 infinitely often and will also equal 1 infinitely often. Hence, with probability 1, X_n will not approach a limiting value as n → ∞.
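Examples 1.1(a) and 1.1(c) can be probed numerically. The sketch below is a seeded simulation under assumed parameters (the horizon N = 200000 and the seed are arbitrary choices): with P{X_n = 0} = 1/n² the number of zeros stays near Σ 1/n² < π²/6 no matter how large N is, while with P{X_n = 0} = 1/n the count keeps growing, roughly like log N.

```python
import random

random.seed(1)
N = 200_000

# P{X_n = 0} = 1/n^2: the sum converges, so only finitely many zeros occur.
zeros_sq = sum(1 for n in range(1, N + 1) if random.random() < 1 / n**2)

# P{X_n = 0} = 1/n: the sum diverges, so zeros keep occurring (Prop. 1.1.3).
zeros_lin = sum(1 for n in range(1, N + 1) if random.random() < 1 / n)

print(zeros_sq)   # typically a handful; expected count is below pi^2/6 + a bit
print(zeros_lin)  # typically around the harmonic sum log(N) + 0.577, about 12.8
```

Raising N leaves the first count essentially unchanged but pushes the second one up, which is exactly the Borel-Cantelli dichotomy.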
1.2 RANDOM VARIABLES

Consider a random experiment having sample space S. A random variable X is a function that assigns a real value to each outcome in S. For any set of real numbers A, the probability that X will assume a value that is contained in the set A is equal to the probability that the outcome of the experiment is contained in X^{-1}(A). That is,

P{X ∈ A} = P(X^{-1}(A)),

where X^{-1}(A) is the event consisting of all points s ∈ S such that X(s) ∈ A.

The distribution function F of the random variable X is defined for any real number x by

F(x) = P{X ≤ x} = P{X ∈ (−∞, x]}.

We shall denote 1 − F(x) by F̄(x), and so

F̄(x) = P{X > x}.

A random variable X is said to be discrete if its set of possible values is countable. For discrete random variables,

F(x) = Σ_{y ≤ x} P{X = y}.

A random variable is called continuous if there exists a function f(x), called the probability density function, such that

P{X is in B} = ∫_B f(x) dx

for every set B. Since F(x) = ∫_{−∞}^x f(y) dy, it follows that

f(x) = (d/dx) F(x).

The joint distribution function F of two random variables X and Y is defined by

F(x, y) = P{X ≤ x, Y ≤ y}.

The distribution functions of X and Y,

F_X(x) = P{X ≤ x} and F_Y(y) = P{Y ≤ y},

can be obtained from F(x, y) by using the continuity property of probabilities: for y_n → ∞ the events {X ≤ x, Y ≤ y_n}, n ≥ 1, are increasing with limit {X ≤ x}, and so F_X(x) = lim_{y→∞} F(x, y), and similarly for F_Y.

[From Section 1.3, Expected Value.] Let A_1, ..., A_n be events and let N denote the number of these events that occur. Hence, if we let

I = 1 if N > 0, 0 if N = 0,

then (1.3.5) and (1.3.6) yield

1 − I = Σ_i (N choose i)(−1)^i,

or

(1.3.7) I = Σ_i (N choose i)(−1)^{i+1}.

Taking expectations of both sides of (1.3.7) yields

(1.3.8) E[I] = E[N] − E[(N choose 2)] + ··· + (−1)^{n+1} E[(N choose n)].

However,

E[I] = P{N > 0} = P{at least one of the A_i occurs},

and

E[N] = Σ_i P(A_i),
E[(N choose 2)] = E[number of pairs of the A_i that occur] = ΣΣ_{i<j} P(A_i A_j),

so that (1.3.8) is the inclusion-exclusion identity for P(∪_i A_i).

Among the common densities are the uniform over (a, b), f(x) = 1/(b − a), a < x < b; the gamma with parameters (n, λ), f(x) = λe^{−λx}(λx)^{n−1}/(n − 1)!, x ≥ 0; and the normal with parameters (μ, σ²), f(x) = (1/(√(2π) σ)) e^{−(x−μ)²/2σ²}, −∞ < x < ∞.

A useful identity is E[N] = Σ_{k=0}^∞ P{N > k}, valid for all nonnegative integer valued random variables N (see Problem 1.1).

EXAMPLE 1.5(I). Classifying a Poisson Number of Events. Suppose that we are observing events, and that N, the total number that occur, is a Poisson random variable with mean λ. Suppose also that each event that occurs is, independent of other events, classified as a type j event with probability p_j, j = 1,
..., k, Σ_{j=1}^k p_j = 1. Let N_j denote the number of type j events that occur, j = 1, ..., k, and let us determine their joint probability mass function.

For any nonnegative integers n_j, j = 1, ..., k, let n = Σ_{j=1}^k n_j. Then, since N = Σ_j N_j, we have that

P{N_j = n_j, j = 1, ..., k}
= P{N_j = n_j, j = 1, ..., k | N = n} P{N = n} + P{N_j = n_j, j = 1, ..., k | N ≠ n} P{N ≠ n}
= P{N_j = n_j, j = 1, ..., k | N = n} P{N = n}.

Now, given that there are a total of N = n events it follows, since each event is independently a type j event with probability p_j, 1 ≤ j ≤ k, that the conditional distribution of (N_1, ..., N_k) is multinomial. Hence

P{N_j = n_j, j = 1, ..., k} = (n!/(n_1! ··· n_k!)) p_1^{n_1} ··· p_k^{n_k} e^{−λ} λ^n/n! = Π_{j=1}^k e^{−λp_j}(λp_j)^{n_j}/n_j!.

That is, the N_j, j = 1, ..., k, are independent Poisson random variables with respective means λp_j.

1.6 THE EXPONENTIAL DISTRIBUTION, LACK OF MEMORY, AND HAZARD RATE FUNCTIONS

A continuous random variable X is said to be exponential with rate λ, λ > 0, if its probability density function is given by

f(x) = λe^{−λx}, x ≥ 0; f(x) = 0, x < 0.

A random variable X is said to be memoryless if

(1.6.2) P{X > s + t | X > t} = P{X > s} for s, t ≥ 0.

If we think of X as being the lifetime of some instrument, then (1.6.2) states that the probability that the instrument lives for at least s + t hours, given that it has survived t hours, is the same as the initial probability that it lives for at least s hours. In other words, if the instrument is alive at time t, then the distribution of its remaining life is the original lifetime distribution. The condition (1.6.2) is equivalent to

F̄(s + t) = F̄(s)F̄(t),

and since this is satisfied when F is the exponential, we see that such random variables are memoryless.

EXAMPLE 1.6(A). Consider a post office having two clerks, and suppose that when A enters the system he discovers that B is being served by one of the clerks and C by the other. Suppose also that A is told that his service will begin as soon as either B or C leaves. If the amount of time a clerk spends with a customer is exponentially distributed with mean 1/λ, what is the probability that, of the three customers, A is the last to leave the post office?

The answer is obtained by reasoning as follows: Consider the time at which A first finds a free clerk. At this point either B or C would have just left and the other one would still be in service.
However, by the lack of memory of the exponential, it follows that the amount of additional time that this other person has to spend in the post office is exponentially distributed with mean 1/λ. That is, it is the same as if he were just starting his service at this point. Hence, by symmetry, the probability that he finishes before A must equal 1/2.

EXAMPLE 1.6(B). Let X_1, X_2, ... be independent and identically distributed continuous random variables with distribution F. We say that a record occurs at time n, n > 0, and has value X_n if X_n > max(X_1, ..., X_{n−1}), where X_0 = −∞. That is, a record occurs each time a new high is reached. Let T_i denote the time between the ith and the (i + 1)th record. What is its distribution?

As a preliminary to computing the distribution of T_i, let us note that the record times of the sequence X_1, X_2, ... will be the same as for the sequence F(X_1), F(X_2), ..., and since F(X) has a uniform (0, 1) distribution (see Problem 1.2), it follows that the distribution of T_i does not depend on the actual distribution F (as long as it is continuous). So let us suppose that F is the exponential distribution with parameter λ = 1.

To compute the distribution of T_i, we will condition on R_i, the ith record value. Now R_1 = X_1 is exponential with rate 1. R_2 has the distribution of an exponential with rate 1 given that it is greater than R_1. But by the lack of memory property of the exponential this means that R_2 has the same distribution as R_1 plus an independent exponential with rate 1. Hence R_2 has the same distribution as the sum of two independent exponential random variables with rate 1. The same argument shows that R_i has the same distribution as the sum of i independent exponentials with rate 1. But it is well known (see Problem 1.29) that such a random variable has the gamma distribution with parameters (i, 1). That is, the density of R_i is given by

f_{R_i}(t) = e^{−t} t^{i−1}/(i − 1)!, t ≥ 0.

Hence, conditioning on R_i yields

P{T_i > k} = ∫_0^∞ P{T_i > k | R_i = t} e^{−t} t^{i−1}/(i − 1)! dt = ∫_0^∞ (1 − e^{−t})^k e^{−t} t^{i−1}/(i − 1)! dt, i ≥ 1,

where the last equation follows since if the ith record value equals t, then none of the next k values will be records if they are all less than t.
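The distribution-free claim in Example 1.6(B) is easy to probe numerically. The sketch below (a seeded simulation; n = 50 observations and 20000 replications are arbitrary choices) counts records for samples drawn from an exponential and from a uniform distribution. In both cases the kth observation is a record with probability 1/k, so the expected number of records in n observations is the harmonic sum Σ_{k=1}^n 1/k regardless of F:

```python
import random

def count_records(xs):
    # A record occurs each time a new maximum is reached.
    best, records = float("-inf"), 0
    for x in xs:
        if x > best:
            best, records = x, records + 1
    return records

random.seed(2)
n, trials = 50, 20_000
h_n = sum(1 / k for k in range(1, n + 1))  # expected number of records

mean_exp = sum(count_records([random.expovariate(1.0) for _ in range(n)])
               for _ in range(trials)) / trials
mean_uni = sum(count_records([random.random() for _ in range(n)])
               for _ in range(trials)) / trials

print(h_n, mean_exp, mean_uni)  # the two sample means hover around h_n
```

Swapping in any other continuous distribution leaves the record-count statistics unchanged, which is precisely why the text may assume F exponential with rate 1.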
It turns out that not only is the exponential distribution "memoryless," but it is the unique distribution possessing this property. To see this, suppose that X is memoryless and let F̄(x) = P{X > x}. Then

F̄(s + t) = F̄(s)F̄(t).

That is, F̄ satisfies the functional equation

g(s + t) = g(s)g(t).

However, the only solutions of the above equation that satisfy any sort of reasonable condition (such as monotonicity, right or left continuity, or even measurability) are of the form

g(x) = e^{−λx}

for some suitable value of λ. [A simple proof when g is assumed right continuous is as follows. Since g(s + t) = g(s)g(t), it follows that g(2/n) = g(1/n + 1/n) = g²(1/n). Repeating this yields g(m/n) = g^m(1/n). Also g(1) = g(1/n + ··· + 1/n) = g^n(1/n). Hence, g(m/n) = (g(1))^{m/n}, which implies, since g is right continuous, that g(x) = (g(1))^x. Since g(1) = g²(1/2) ≥ 0, we obtain g(x) = e^{−λx}, where λ = −log(g(1)).] Since a distribution function is always right continuous, we must have

F̄(x) = e^{−λx}.

The memoryless property of the exponential is further illustrated by the failure rate function (also called the hazard rate function) of the exponential distribution.

Consider a continuous random variable X having distribution function F and density f. The failure (or hazard) rate function λ(t) is defined by

(1.6.3) λ(t) = f(t)/F̄(t).

To interpret λ(t), think of X as being the lifetime of some item, and suppose that X has survived for t hours and we desire the probability that it will not survive for an additional time dt. That is, consider P{X ∈ (t, t + dt) | X > t}. Now

P{X ∈ (t, t + dt) | X > t} = P{X ∈ (t, t + dt), X > t}/P{X > t} = P{X ∈ (t, t + dt)}/P{X > t} ≈ f(t) dt/F̄(t) = λ(t) dt.

That is, λ(t) represents the probability intensity that a t-year-old item will fail.

Suppose now that the lifetime distribution is exponential. Then, by the memoryless property, it follows that the distribution of remaining life for a t-year-old item is the same as for a new item. Hence λ(t) should be constant.
This checks out since

λ(t) = λe^{−λt}/e^{−λt} = λ.

Thus, the failure rate function for the exponential distribution is constant. The parameter λ is often referred to as the rate of the distribution. (Note that the rate is the reciprocal of the mean, and vice versa.)

It turns out that the failure rate function λ(t) uniquely determines the distribution F. To prove this, we note that

λ(t) = (−(d/dt) F̄(t))/F̄(t).

Integration yields

log F̄(t) = −∫_0^t λ(s) ds + k,

or

F̄(t) = c exp{−∫_0^t λ(s) ds}.

Letting t = 0 shows that c = 1 and so

F̄(t) = exp{−∫_0^t λ(s) ds}.

1.7 SOME PROBABILITY INEQUALITIES

We start with an inequality known as Markov's inequality.

Lemma 1.7.1 (Markov's Inequality)
If X is a nonnegative random variable, then for any a > 0,

P{X ≥ a} ≤ E[X]/a.

Proof. Let I{X ≥ a} be 1 if X ≥ a and 0 otherwise. Then it is easy to see, since X ≥ 0, that

a I{X ≥ a} ≤ X.

Taking expectations yields the result.

PROPOSITION 1.7.2 (Chernoff Bounds)
Let X be a random variable with moment generating function M(t) = E[e^{tX}]. Then for a > 0,

P{X ≥ a} ≤ e^{−ta} M(t) for all t > 0,
P{X ≤ a} ≤ e^{−ta} M(t) for all t < 0.

Proof. For t > 0,

P{X ≥ a} = P{e^{tX} ≥ e^{ta}} ≤ E[e^{tX}] e^{−ta},

where the inequality follows from Markov's inequality. The proof for t < 0 is similar.

Since the Chernoff bounds hold for all t of the appropriate sign, we obtain the best bound on P{X ≥ a} by using that t that minimizes e^{−ta} M(t).

EXAMPLE 1.7(A). Chernoff Bounds for Poisson Random Variables. If X is Poisson with mean λ, then M(t) = e^{λ(e^t − 1)}. Hence, the Chernoff bound for P{X ≥ j} is

P{X ≥ j} ≤ e^{λ(e^t − 1)} e^{−tj}, t > 0.

The value of t that minimizes the preceding is that value for which e^t = j/λ. Provided that j/λ > 1, this minimizing value will be positive and so we obtain in this case that

P{X ≥ j} ≤ e^{j−λ}(λ/j)^j = e^{−λ}(eλ/j)^j.

Our next inequality relates to expectations rather than probabilities.

PROPOSITION 1.7.3 (Jensen's Inequality)
If f is a convex function, then

E[f(X)] ≥ f(E[X])

provided the expectations exist.
Proof. We will give a proof under the supposition that f has a Taylor series expansion. Expanding about μ = E[X] and using the Taylor series with a remainder formula yields

f(x) = f(μ) + f′(μ)(x − μ) + f″(ξ)(x − μ)²/2 ≥ f(μ) + f′(μ)(x − μ),

since f″(ξ) ≥ 0 by convexity. Hence,

f(X) ≥ f(μ) + f′(μ)(X − μ).

Taking expectations gives that

E[f(X)] ≥ f(μ) + f′(μ)E[X − μ] = f(E[X]).

1.8 LIMIT THEOREMS

Some of the most important results in probability theory are in the form of limit theorems. The two most important are:

Strong Law of Large Numbers.* If X_1, X_2, ... are independent and identically distributed with mean μ, then

P{lim_{n→∞} (X_1 + ··· + X_n)/n = μ} = 1.

Central Limit Theorem. If X_1, X_2, ... are independent and identically distributed with mean μ and variance σ², then

lim_{n→∞} P{(X_1 + ··· + X_n − nμ)/(σ√n) ≤ a} = (1/√(2π)) ∫_{−∞}^a e^{−x²/2} dx.

Thus if we let S_n = Σ_{i=1}^n X_i, where X_1, X_2, ... are independent and identically distributed, then the Strong Law of Large Numbers states that, with probability 1, S_n/n will converge to E[X_i]; whereas the Central Limit Theorem states that S_n will have an asymptotic normal distribution as n → ∞.

* A proof of the Strong Law of Large Numbers is given in the Appendix to this chapter.

1.9 STOCHASTIC PROCESSES

A stochastic process X = {X(t), t ∈ T} is a collection of random variables. That is, for each t in the index set T, X(t) is a random variable. We often interpret t as time and call X(t) the state of the process at time t. If the index set T is a countable set, we call X a discrete-time stochastic process, and if T is a continuum, we call it a continuous-time process.

[Figure 1.9.1. A sample path of X(t) = number of events in [0, t].]

Any realization of X is called a sample path. For instance, if events are occurring randomly in time and X(t) represents the number of events that occur in [0, t], then Figure 1.9.1 gives a sample path of X which corresponds
to the initial event occurring at time 1, the next event at time 3 and the third at time 4, and no events anywhere else.

A continuous-time stochastic process {X(t), t ∈ T} is said to have independent increments if for all t_0 < t_1 < ··· < t_n, the random variables

X(t_1) − X(t_0), X(t_2) − X(t_1), ..., X(t_n) − X(t_{n−1})

are independent.

[From Section 2.2, Interarrival and Waiting Time Distributions.] It is easy to show, using moment generating functions, that Proposition 2.2.1 implies that S_n has a gamma distribution with parameters n and λ. That is, its probability density is

f(t) = λe^{−λt}(λt)^{n−1}/(n − 1)!, t ≥ 0.

The above could also have been derived by noting that the nth event occurs prior or at time t if, and only if, the number of events occurring by time t is at least n. That is,

N(t) ≥ n ⇔ S_n ≤ t.

Hence,

P{S_n ≤ t} = P{N(t) ≥ n} = Σ_{j=n}^∞ e^{−λt}(λt)^j/j!,

which upon differentiation yields that the density function of S_n is

f(t) = −Σ_{j=n}^∞ λe^{−λt}(λt)^j/j! + Σ_{j=n}^∞ λe^{−λt}(λt)^{j−1}/(j − 1)! = λe^{−λt}(λt)^{n−1}/(n − 1)!.

Remark. Another way of obtaining the density of S_n is to use the independent increment assumption as follows:

P{t < S_n < t + dt} = P{N(t) = n − 1, 1 event in (t, t + dt)} + o(dt)
= P{N(t) = n − 1}P{1 event in (t, t + dt)} + o(dt)
= e^{−λt}(λt)^{n−1}/(n − 1)! λ dt + o(dt),

which yields, upon dividing by dt and then letting it approach 0, that

f_{S_n}(t) = λe^{−λt}(λt)^{n−1}/(n − 1)!.

Proposition 2.2.1 also gives us another way of defining a Poisson process. For suppose that we start out with a sequence {X_n, n ≥ 1} of independent identically distributed exponential random variables each having mean 1/λ. Now let us define a counting process by saying that the nth event of this

J.(j-1). _(At)n-I - Ae(n- 1)! RemarkAnother way of obtaining the density of Snis to use the independent incrementassumptionasfollows P{t < Sn< t + dt}= P{N(t) =n- 1,1event in (t, t + dt)}+ o(dt) = P{N(t) = n- I}P{levent in (t, t + dt)}+ o (dt) -M(At)n-1 = eAdt+ o(dt) (n- I)! whichyields,upondividing byd(t)andthenletting itapproach 0,that Proposition2.2.1alsogivesusanother wayof definingaPoissonprocess. Forsupposethatwestartoutwithasequence{Xn,n1}ofindependent identicallydistributedexponentialrandomvariableseachhavingmean1/ A. Nowletusdefineacountingprocessbysayingthatthentheventofthis 66THEPOISSONPROCESS processoccursattimeSn,where Theresultantcounting process {N(t),t~o}willbePoissonwithrateA. 2.3CONDITIONALDISTRIBUTIONOFTHE ARRIVALTIMES SupposewearetoldthatexactlyoneeventofaPoissonprocesshastaken placebytimet,andweareaskedto determinethedistributionofthetime at which the event occurred. Since a Poisson process possesses stationary and independentincrements,itseemsreasonablethateachintervalin{O,t]of equallengthshouldhavethesameprobabilityofcontainingtheeventIn other words, the time of the event should be uniformly distributed over [0, t]. This iseasilychecked since,fors:=::;t, P{XY2,... ,y,,), CONDITIONAL DISTRIBUTIONOF THE ARRIVAL TIMES67 and(ii)the probability densitythat (Yh Yz, ... ,Yn) isequal to Yit,Y'2'... , Yiis /(y,)/(y, )... /(y, )=II; /(y,) when (y, , Y,, ..,Yi) isa permutation 12.12. of (Yl>Y2,.., Yn). If the Y" i= 1,... , n, are uniformly distributed over (0,t), then it follows fromthe abovethat the joint density functionof the order statistics Y(I),Y(2) . . . , YIn)is n! r' 0< Y'< Y2< ... < Yn< t. Wearenowreadyforthefollowingusefultheorem. THEOREM2.3.1 GiventhatN(t)= n,thenarrivaltimes 51,. , 5n havethesamedistributionasthe order statisticscorresponding tonindependent randomvariables uniformly distributed ontheinterval(0,t). 
ProofWeshallcomputetheconditionaldensityfunctionof 51,', 5n giventhat N(t)=nSolet0=2: E[e/(X,++XnlIN =n]e-A/{At)"lnl " ~ O ex>(2.5.1 ) =2: E[e/(X,++Xnl]e-At{At)"lnI n=O ex>(25.2)=2: E[e'X']"e-A/{At)"ln! n=O where (2.5.1) follows from the independence of {XI, X2, }and N, and (2.5.2) followsfromtheindependenceof theX,Hence,letting denotethe momentgenerating functionof theX"wehavefrom(2.5.2)that 0; E [e'W] = 2:[4>x{t)]"e-A/{At)"lnl n ~ O (2.5.3)=exp{At{4>x{t)- 1)] COMPOUNDPOISSONRANDOMVARIABLESANDPROCESSES 83. It iseasilyshown,eitherbydifferentiating(25.3)orbydirectlyusinga conditioningargument,that whereXhasdistributionF. E[W]=AE[X] Var(W) =AE [X2] EXAMPLE2.5(A)Asidefromthewayinwhichtheyaredefined, compoundPoissonrandomvariablesoftenariseinthefollowing mannerSupposethateventsareoccurringinaccordancewitha Poissonprocesshavingrate(say)a,andthatwheneveranevent occursa certaincontributionresultsSpecifically,supposethatan event occurringattime swill,independent of the past,resultina contnbutionwhosevalueisarandomvariablewithdistribution Fs.LetW denote the sum of the contributions up to time t-that is, whereN(t)isthenumber of events occurringbytimet,andXiis the contribution made when event; occurs. Then, even though the XIareneitherindependentnoridenticallydistributeditfollows thatW isacompoundPoissonrandomvariablewithparameters A = atand 1 II F(x)=- Fs(x)ds. t0 ThiscanbeshownbycalculatingthedistributionofWbyfirst conditioningonN(t),andthenusingtheresultthat,givenN(t), theunorderedsetofN(t)eventtimesareindependentuniform (0,t)randomvariables(seeSection 2.3). When F is a discrete probability distribution function there is an interesting representation of W asalinear combinationof independentPoissonrandom variables.SupposethattheXIarediscreterandom variables suchthat k P{XI=j}= p"j = 1, .. , k, L p,= 1. , ~ I If weletN,denotethenumber of theX/sthatareequalto j, j=1,... 
..., k, then we can express W as

(2.5.4) W = Σ_{j=1}^k j N_j,

where, using the results of Example 1.5(I), the N_j are independent Poisson random variables with respective means λp_j. As a check, let us use the representation (2.5.4) to compute the mean and variance of W:

E[W] = Σ_j j E[N_j] = Σ_j jλp_j = λE[X],
Var(W) = Σ_j j² Var(N_j) = Σ_j j²λp_j = λE[X²],

which check with our previous results.

2.5.1 A Compound Poisson Identity

As before, let W = Σ_{i=1}^N X_i be a compound Poisson random variable with N being Poisson with mean λ and the X_i having distribution F. We now present a useful identity concerning W.

PROPOSITION 2.5.2
Let X be a random variable having distribution F that is independent of W. Then, for any function h for which the expectations exist,

E[Wh(W)] = λE[Xh(W + X)].

COROLLARY 2.5.4
Suppose the X_i are positive integer valued, and let a_j = P{X_i = j}, j ≥ 1. Then P{W = 0} = e^{−λ} and, for n > 0, let

h(x) = 0 if x ≠ n; 1/n if x = n.

Since Wh(W) = I{W = n}, which is defined to equal 1 if W = n and 0 otherwise, we obtain upon applying Proposition 2.5.2 that

P{W = n} = λE[Xh(W + X)]
= λ Σ_j E[Xh(W + X) | X = j] a_j
= λ Σ_j j E[h(W + j)] a_j
= λ Σ_j (j/n) P{W + j = n} a_j
= (λ/n) Σ_{j=1}^n j a_j P{W = n − j}.

Remark. When the X_i are identically equal to 1, the preceding recursion reduces to the well-known identity for Poisson probabilities:

P{N = 0} = e^{−λ},
P{N = n} = (λ/n) P{N = n − 1}, n ≥ 1.
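The recursion just derived is a few lines of code. The sketch below implements it for integer-valued X_i and, as a sanity check, recomputes P{W = n} by direct conditioning on N (truncating the Poisson sum at 60 terms is an arbitrary numerical choice; the concrete parameters λ = 4 and X_i uniform on {1, 2, 3, 4} match the worked example that follows):

```python
from math import exp, factorial

def compound_pmf(lam, a, n_max):
    """P{W = n}, n = 0..n_max, via P{W=n} = (lam/n) sum_j j a_j P{W=n-j}."""
    p = [exp(-lam)] + [0.0] * n_max
    for n in range(1, n_max + 1):
        p[n] = (lam / n) * sum(j * aj * p[n - j] for j, aj in a.items() if j <= n)
    return p

def direct_p(lam, a, n, m_max=60):
    """P{W = n} by conditioning: sum_m P{N=m} P{X_1+...+X_m = n}."""
    total = exp(-lam) if n == 0 else 0.0
    conv = {0: 1.0}                      # distribution of X_1+...+X_m, m = 0
    for m in range(1, m_max + 1):
        new = {}
        for s, ps in conv.items():       # convolve with one more X
            for j, aj in a.items():
                new[s + j] = new.get(s + j, 0.0) + ps * aj
        conv = new
        total += exp(-lam) * lam**m / factorial(m) * conv.get(n, 0.0)
    return total

lam, a = 4.0, {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}
p = compound_pmf(lam, a, 5)
print(p[5], direct_p(lam, a, 5))  # the two values agree
```

The recursion costs O(n_max · k) operations, versus the much more expensive convolution, which is the practical point of the identity.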

ExAMPLE2.5(a)LetWbeacompoundPoissonrandomvariable withPoissonparameterA =4andwith p{X,=i}= 114,i= 1,2, 3,4. TodetermineP{W=5},weusetherecursionof Corollary2.5.4 asfollows: Po=e-A = e-4 p.=Aa.Po = e-4 P2 = + 2a2PO}=P3 = {a. P2 + 2a2Pl+ 3a3PO}= 1: e-4 P4 ={a. P3 + 2a2P2+ 3a3P.+ 4a4PO}= e-4 A501-4 Ps =5" {a. P4 + 2a2P3+ 3a3P2+ 4a4P.+ 5asPo} =120 e 2.5.2Compound Poisson Processes Astochasticprocess{X(t),tO}issaidtobeacompoundPoissonprocess if itcanberepresented,fort0,by N(,) X(t)=2: X, ,=. where{N(t),t>O}isaPoissonprocess,and{X"i=1,2,...} isafamilyof independent and identically distributedrandom variables that isindependent oftheprocess{N(t),tO}.Thus,if{X(t),tO}isacompoundPoisson processthenX(t)isa compoundPoissonrandomvariable. Asanexampleof acompoundPoissonprocess,supposethatcustomers arrive ata store ata Poisson rate A.Suppose,also, that the amounts of money spent byeach customer forma set of independent and identically distributed randomvariablesthatisindependentof thearrivalprocess.If X(t)denotes 88THE POISSONPROCESS thetotalamountspentinthestorebyallcustomersarrivingbytimet,then {X(t),t:>O}isa compoundPoissonprocess. 2.6CONDITIONALPOISSONPROCESSES LetA bea positiverandom variablehaving distributionGand let {N(t),t:> O}bea counting process suchthat, giventhat A =:::: A,{N(t), t:> O}isa Poisson processhavingrateA.Thus,forinstance, P{N(t + s)N(s)n} The process {N(t),t2!:O}is calleda conditional Poissonprocess since, condi-tionalontheeventthatA= A,itisaPoissonprocess withrateA.It should benoted,however,that{N(t),t:>O}isnotaPoissonprocess.Forinstance, whereasitdoeshavestationaryincrements,itdoesnothaveindependent ones(Why not?) Letuscomputetheconditional distributionof A giventhat N(t)=n.For dAsmall, P{AE(A,A + dA) I N(t)n} _P{N(t)n I A E(A,A + dA)}P{AE(A,A + dA)} e-Ar (At)ndG(A) n! = - - ~ - : - - - -J'"e-AI(At)ndG(A) on! 
P{N(t) = n} and sothe conditionaldistributionof A,giventhat N(t)= n,is givenby ExAMPLE2.6(A)Suppose that, depending on factorsnot at present understood,theaveragerateatwhichseismicshocksoccurina certain region over a givenseason iseitherAIor A2. Supposealso that it isAIfor100 ppercent of theseasons andA2theremaining time.Asimplemodelforsuchasituationwouldbetosuppose that{N(t),0O}isa PoissonprocesswithrateAI+ A2Also, show thattheprobability that thefirsteventofthecombinedprocesscomesfrom{N1 (t),t2:O}is AI/(AI+ A2).independently of thetimeof theevent. 2.6.Amachineneedstwotypesofcomponentsinordertofunction.We haveastockpileofntype-lcomponentsandmtype-2components. 90TH EPOISSONPROCESS Type-icomponentslastforanexponentialtimewithrateILlbefore failing.Computethemean lengthof timethemachineisoperativeifa failed component isreplaced by one of the same type from the stockpile; that is, computeXI!Y,)], where the X,(Y,) are exponential withrateILl (IL2). 2.7.Computethe joint distributionof 51,52,53. 2.8.Generating a Poisson Random Variable. LetVI!V2, be independent uniform(0,1)randomvariables. (a)If X,=(-logV,)/A,showthatX,isexponentiallydistnbutedwith rateA. (b)Use part (a) to show that N isPoisson distributed with mean A when Nisdefinedtoequalthat valueof nsuch that nn+1 II V,e-.1> II V" where V,1.ComparewithProblem1 21of Chapter1. 2.9.SupposethateventsoccuraccordingtoaPoissonprocesswithrateA. Each timeanevent occurswemust decide whether or not to stop, with ourobjectivebeingtostopatthelasteventtooccurpriortosome specified timeT.That is,ifanevent occursat timet, $t-n} 2.13.Supposethat shocksoccuraccordingtoaPoissonprocesswithrateA, andsupposethateachshock,independently.causesthesystemtofail with probability pLet Ndenotethenumber of shocksthat ittakesfor thesystemtofailandletTdenotethetimeoffailureFind P{Nnl T=t} 2.14.Consideranelevatorthatstartsinthebasementandtravelsupward. Let Ntdenotethenumber of peoplethat getintheelevatorat floori. 
Assume the N_i are independent and that N_i is Poisson with mean λ_i. Each person entering at i will, independent of everything else, get off at j with probability P_ij, Σ_{j>i} P_ij = 1. Let O_j = number of people getting off the elevator at floor j.
(a) Compute E[O_j].
(b) What is the distribution of O_j?
(c) What is the joint distribution of O_j and O_k?

2.15. Consider an r-sided coin and suppose that on each flip of the coin exactly one of the sides appears: side i with probability P_i, Σ_{i=1}^r P_i = 1. For given numbers n_1, ..., n_r, let N_i denote the number of flips required until side i has appeared for the n_i-th time, i = 1, ..., r, and let
N = min_{i=1,...,r} N_i.
Thus N is the number of flips required until side i has appeared n_i times for some i.
(a) What is the distribution of N_i?
(b) Are the N_i independent?
Now suppose that the flips are performed at random times generated by a Poisson process with rate λ = 1. Let T_i denote the time until side i has appeared for the n_i-th time, i = 1, ..., r, and let
T = min_{i=1,...,r} T_i.
(c) What is the distribution of T_i?
(d) Are the T_i independent?
(e) Derive an expression for E[T].
(f) Use (e) to derive an expression for E[N].

2.16. The number of trials to be performed is a Poisson random variable with mean λ. Each trial has n possible outcomes and, independent of everything else, results in outcome number i with probability P_i, Σ_{i=1}^n P_i = 1. Let X_j denote the number of outcomes that occur exactly j times, j = 0, 1, .... Compute E[X_j], Var(X_j).

2.17. Let X_1, X_2, ..., X_n be independent continuous random variables with common density function f. Let X_(i) denote the ith smallest of X_1, ..., X_n.
(a) Note that in order for X_(i) to equal x, exactly i − 1 of the X's must be less than x, one must equal x, and the other n − i must all be greater than x. Using this fact argue that the density function of X_(i) is given by
f_{X_(i)}(x) = [n!/((i − 1)!(n − i)!)] (F(x))^{i−1} (1 − F(x))^{n−i} f(x).
(b) X_(i) will be less than x if, and only if, how many of the X's are less than x?
(c) Use (b) to obtain an expression for P{X_(i) ≤ x}.
(d) Using (a) and (c) establish the identity, for 0 ≤ y ≤ 1,
Σ_{k=i}^n (n choose k) y^k (1 − y)^{n−k} = ∫_0^y [n!/((i − 1)!(n − i)!)] x^{i−1} (1 − x)^{n−i} dx.
(e) Let S_i denote the time of the ith event of the Poisson process {N(t),
t ≥ 0}. Find E[S_i | N(t) = n], treating the cases i ≤ n and i > n separately.

2.18. Let U_(1), ..., U_(n) denote the order statistics of a set of n uniform (0, 1) random variables. Show that given U_(n) = y, U_(1), ..., U_(n−1) are distributed as the order statistics of a set of n − 1 uniform (0, y) random variables.

2.19. Busloads of customers arrive at an infinite server queue at a Poisson rate λ. Let G denote the service distribution. A bus contains j customers with probability a_j, j = 1, .... Let X(t) denote the number of customers that have been served by time t.
(a) E[X(t)] = ?
(b) Is X(t) Poisson distributed?

2.20. Suppose that each event of a Poisson process with rate λ is classified as being either of type 1, 2, ..., k. If the event occurs at s, then, independently of all else, it is classified as type i with probability P_i(s), i = 1, ..., k, Σ_{i=1}^k P_i(s) = 1. Let N_i(t) denote the number of type i arrivals in [0, t]. Show that the N_i(t), i = 1, ..., k, are independent and N_i(t) is Poisson distributed with mean λ ∫_0^t P_i(s) ds.

2.21. Individuals enter the system in accordance with a Poisson process having rate λ. Each arrival independently makes its way through the states of the system. Let α_i(s) denote the probability that an individual is in state i a time s after it arrived. Let N_i(t) denote the number of individuals in state i at time t. Show that the N_i(t), i ≥ 1, are independent and N_i(t) is Poisson with mean equal to
λ E[amount of time an individual is in state i during its first t units in the system].

2.22. Suppose cars enter a one-way infinite highway at a Poisson rate λ. The ith car to enter chooses a velocity V_i and travels at this velocity. Assume that the V_i's are independent positive random variables having a common distribution F. Derive the distribution of the number of cars that are located in the interval (a, b) at time t. Assume that no time is lost when one car overtakes another car.

2.23. For the model of Example 2.3(C), find
(a) Var[D(t)].
(b) Cov[D(t), D(t + s)].

2.24. Suppose that cars enter a one-way highway of length L in accordance with a Poisson process with rate λ. Each car travels at a constant speed that is randomly determined, independently from car to car, from the distribution G. When a faster car encounters a slower one, it passes it with no loss of time. Suppose that a car enters the highway at time t. Show that as t → ∞ the speed of the car that minimizes the expected number of encounters with other cars, where we say an encounter occurs when a car is either passed by or passes another car, is the median of the distribution G.

2.25. Suppose that events occur in accordance with a Poisson process with rate λ, and that an event occurring at time s, independent of the past, contributes a random amount having distribution F_s, s ≥ 0. Show that W, the sum of all contributions by time t, is a compound Poisson random variable. That is, show that W has the same distribution as Σ_{i=1}^N X_i, where the X_i are independent and identically distributed random variables and are independent of N, a Poisson random variable. Identify the distribution of the X_i and the mean of N.

2.26. Compute the conditional distribution of S_1, S_2, ..., S_n given that S_n = t.

2.27. Compute the moment generating function of D(t) in Example 2.3(C).

2.28. Prove Lemma 2.3.3.

2.29. Complete the proof that for a nonhomogeneous Poisson process N(t + s) − N(t) is Poisson with mean m(t + s) − m(t).

2.30. Let T_1, T_2, ... denote the interarrival times of events of a nonhomogeneous Poisson process having intensity function λ(t).
(a) Are the T_i independent?
(b) Are the T_i identically distributed?
(c) Find the distribution of T_1.
(d) Find the distribution of T_2.

2.31. Consider a nonhomogeneous Poisson process {N(t), t ≥ 0}, where λ(t) > 0 for all t. Let
N*(t) = N(m^{−1}(t)).
Show that {N*(t), t ≥ 0} is a Poisson process with rate λ = 1.

2.32. (a) Let {N(t), t ≥ 0} be a nonhomogeneous Poisson process with mean value function m(t). Given N(t) = n, show that the unordered set of arrival times has the same distribution as n independent and identically distributed random variables having distribution function
F(x) = m(x)/m(t) for x ≤ t, and F(x) = 1 for x > t.
(b) Suppose that workers incur accidents in accordance with a nonhomogeneous Poisson process with mean value function m(t). Suppose further that each injured person is out of work for a random amount of time having distribution F. Let X(t) be the number of workers who are out of work at time t. Compute E[X(t)] and Var(X(t)).

2.33. A two-dimensional Poisson process is a process of events in the plane such that (i) for any region of area A, the number of events in A is Poisson distributed with mean λA, and (ii) the numbers of events in nonoverlapping regions are independent. Consider a fixed point, and let X denote the distance from that point to its nearest event, where distance is measured in the usual Euclidean manner. Show that
(a) P{X > t} = e^{−λπt²}.
(b) E[X] = 1/(2√λ).
Let R_i, i ≥ 1, denote the distance from an arbitrary point to the ith closest event to it. Show that, with R_0 = 0,
(c) πR_i² − πR_{i−1}², i ≥ 1, are independent exponential random variables, each with rate λ.

2.34. Repeat Problem 2.25 when the events occur according to a nonhomogeneous Poisson process with intensity λ(t), t ≥ 0.

2.35. Let {N(t), t ≥ 0} be a nonhomogeneous Poisson process with intensity function λ(t), t ≥ 0. However, suppose one starts observing the process at a random time T having distribution function F. Let N*(t) = N(T + t) − N(T) denote the number of events that occur in the first t time units of observation.
(a) Does the process {N*(t), t ≥ 0} possess independent increments?
(b) Repeat (a) when {N(t), t ≥ 0} is a Poisson process.

2.36. Let C denote the number of customers served in an M/G/1 busy period. Find
(a) E[C].
(b) Var(C).

2.37. Let {X(t), t ≥ 0} be a compound Poisson process with X(t) = Σ_{i=1}^{N(t)} X_i, and suppose that the X_i can only assume a finite set of possible values. Argue that, for t large, the distribution of X(t) is approximately normal.

2.38. Let {X(t), t ≥ 0} be a compound Poisson process with X(t) = Σ_{i=1}^{N(t)} X_i, and suppose that λ = 1 and P{X_1 = j} = j/10, j = 1, 2, 3, 4. Calculate P{X(4) = 20}.

2.39. Compute Cov(X(s), X(t)) for a compound Poisson process.

2.40. Give an example of a counting process {N(t), t ≥ 0} that is not a Poisson process but which has the property that conditional on N(t) = n the first n event times are distributed as the order statistics from a set of n independent uniform (0, t) random variables.

2.41. For a conditional Poisson process:
(a) Explain why a conditional Poisson process has stationary but not independent increments.
(b) Compute the conditional distribution of Λ given {N(s), 0 ≤ s ≤ t}.

…

From (3.2.1) we obtain
(3.2.2) P{N(t) = n} = P{N(t) ≥ n} − P{N(t) ≥ n + 1} = P{S_n ≤ t} − P{S_{n+1} ≤ t}.

…

With r_t = t/μ + yσ√(t/μ³),
P{N(t) < r_t} = P{S_{r_t} > t}  (by (3.3.7))
= P{(S_{r_t} − r_t μ)/(σ√r_t) > (t − r_t μ)/(σ√r_t)}
= P{(S_{r_t} − r_t μ)/(σ√r_t) > −y(1 + yσ/√(tμ))^{−1/2}}.

Now, by the central limit theorem, (S_{r_t} − r_t μ)/(σ√r_t) converges to a normal random variable having mean 0 and variance 1 as t (and thus r_t) approaches ∞. Also, since
−y(1 + yσ/√(tμ))^{−1/2} → −y,
we see that
P{N(t) < r_t} → ∫_{−y}^∞ (2π)^{−1/2} e^{−x²/2} dx,
and since
∫_{−y}^∞ e^{−x²/2} dx = ∫_{−∞}^y e^{−x²/2} dx,
the result follows.

Remarks
(i) There is a slight difficulty in the above argument since r_t should be an integer for us to use the relationship (3.3.7). It is not, however, too difficult to make the above argument rigorous.
(ii) Theorem 3.3.5 states that N(t) is asymptotically normal with mean t/μ and variance tσ²/μ³.

110 RENEWAL THEORY

3.4 THE KEY RENEWAL THEOREM AND APPLICATIONS

A nonnegative random variable X is said to be lattice if there exists d ≥ 0 such that Σ_{n=0}^∞ P{X = nd} = 1. That is, X is lattice if it only takes on integral multiples of some nonnegative number d. The largest d having this property is said to be the period of X. If X is lattice and F is the distribution function of X, then we say that F is lattice.

We shall state without proof the following theorem.

THEOREM 3.4.1 (Blackwell's Theorem).
(i) If F is not lattice, then
m(t + a) − m(t) → a/μ as t → ∞
for all a ≥ 0.
(ii) If F is lattice with period d, then
E[number of renewals at nd] → d/μ as n → ∞.

Thus Blackwell's theorem states that if F is not lattice, then the expected number of renewals in an interval of length a, far from the origin, is approximately a/μ. This is quite intuitive, for as we go further away from the origin it would seem that the initial effects wear away and thus
(3.4.1) g(a) ≡ lim_{t→∞} [m(t + a) − m(t)]
should exist. However, if the above limit does in fact exist, then, as a simple consequence of the elementary renewal theorem, it must equal a/μ. To see this note first that
g(a + h) = lim_{t→∞} [m(t + a + h) − m(t)]
= lim_{t→∞} [m(t + a + h) − m(t + a) + m(t + a) − m(t)]
= g(h) + g(a).
However, the only (increasing) solution of g(a + h) = g(a) + g(h) is
g(a) = ca, a > 0,
for some constant c. To show that c = 1/μ, define
X_1 = m(1) − m(0), X_2 = m(2) − m(1), ..., X_n = m(n) − m(n − 1).
Then
lim_{n→∞} (X_1 + ... + X_n)/n = c,
implying that
lim_{n→∞} m(n)/n = c.
Hence, by the elementary renewal theorem, c = 1/μ.

When F is lattice with period d, then the limit in (3.4.1) cannot exist. For now renewals can only occur at integral multiples of d and thus the expected number of renewals in an interval far from the origin would clearly depend not on the interval's length per se but rather on how many points of the form nd, n ≥ 0, it contains. Thus in the lattice case the relevant limit is that of the expected number of renewals at nd and, again, if lim_{n→∞} E[number of renewals at nd] exists, then by the elementary renewal theorem it must equal d/μ. If interarrivals are always positive, then part (ii) of Blackwell's theorem states that, in the lattice case,
lim_{n→∞} P{renewal at nd} = d/μ.

Let h be a function defined on [0, ∞). For any a > 0 let m̄_n(a) be the supremum, and m_n(a) the infimum, of h(t) over the interval (n − 1)a ≤ t ≤ na. We say that h is directly Riemann integrable if Σ_{n=1}^∞ a m̄_n(a) and Σ_{n=1}^∞ a m_n(a) are finite for all a > 0 and
lim_{a→0} a Σ_{n=1}^∞ m̄_n(a) = lim_{a→0} a Σ_{n=1}^∞ m_n(a).
A sufficient condition for h to be directly Riemann integrable is that:
" (i)h(t);;:0forallt;;:0, (ii)h(t) isnonincreasing, (iii)f; h(t)dt x} ~F{x) Thatis,forany xit ismorelikelythatthelengthoftheintervalcontaining thepointtisgreaterthanxthanit isthatanordinaryrenewalintervalis greater than x. This result, whichat first glance may seem surprising, isknown astheinspectionparadox. Wewillnowusealternating renewalprocesstheory to obtain the limiting distributionofX N(r)+1Againletanon-offcyclecorrespondtoarenewal interval,andsaythattheontimeinthecycleisthetotalcycletimeifthat time isgreater than xand is zero otherwise. That is,the system is either totally onduringacycle(iftherenewalintervalisgreaterthanx)ortotallyoff otherwise.Then P{XN(,)+I> x} = P{length of renewal interval containing t > x} =P{on at time t}. Thusby Theorem 3.4 4,providedF isnotlattice,weobtain IP{X} - E[on time in cycle] 1mN(I)+I> X- ----"------=------'-'-'"p. =E[XIX > xlF{x)/p. =r y dF{y)/p., 118RENEWAL THEORY or,equivalently, (3.4.2)limP{XN(r)+.:s;x}= JXy dF(y)/p.. ~ ~0 RemarkTobetterunderstandtheinspectionparadox,reasonasfollows: Since the line is covered by renewal intervals, isitnot more likely that a larger interval-as opposed to a shorter one-covers the point t?In fact,in the limit (as t--+00)it isexactly true that an interval of length yis ytimesmorelikely tocover tthan one of length1.For ifthiswerethecase, then thedensityof theinterval containing thepoint t,callit g,would be g(y)= ydF(y)/c (since dF(y)istheprobabilitythatanarbitraryintervalisof lengthyandylcthe conditionalprobability thatitcontains thepoint).Butby(3.4.2)weseethat thisisindeedthe limitingdensity. For another illustration of the variedusesof alternating renewalprocesses considerthe followingexample. 
EXAMPLE3.4(A)An InventoryExample.Supposethatcustomers arriveata store,which sellsa singletypeof commodity,inaccor-dance with a renewal process having nonlattice interamval distribu-tionF.Theamountsdesiredbythecustomersareassumedtobe independentwithacommondistributionG.Thestoreusesthe following(s,S)ordering policy' If the inventory levelafter serving acustomer isbelow s,then an order isplacedto bring itupto S OtherwisenoorderisplacedThusiftheinventorylevelafter servingacustomerisx,then theamountorderedis S - xif x < s, oifx;;?s. Theorder isassumedto beinstantaneously filled. LetX(t)denotetheinventorylevelattimet,andsupposewe desirelim( .... ~P{X(t)>x}.If X(O)=S,thenifwesaythatthe system is"on" whenever the inventory level isat least xand "off" otherwise, the aboveisjust an alternating renewal process. Hence, fromTheorem 3.4.4, lim P{X(t) ;;?x} =E [amount of time. the inventory;;? x in a cycle]. (.... ~E[tlme of a cycle] Now if weletY.,Y2,,denote the successive customer demands andlet (3.4.3)Nx =min{n:Y.+ ... + Yn > S - x}, THE KEYRENEWAL THEOREMANDAPPLICATIONS thenitistheNx customerinthecyclethatcausestheinventory levelto fallbelow x,and itistheNscustomerthat ends the cycle. Hence if x"i~1,denote the interarrival times of customers, then Nl amount of "on" time in cycle =Lx" 1=1 N, time of cycle =Lx,. 1=1 Assuming that the interarrival times are independent of the succes-sivedemands,wethushaveupontakingexpectations (344) [ N- ] 1 ELX, rP{X():>} =1=1= E[Nx] L ~tx[N,]E [Ns] ELX, 1= 1 However, as the Y" i:> 1, are independent and identically distrib-uted,itfollowsfrom(3.4.3)thatwecaninterpretNx - 1asthe numberofrenewalsbytimeS- xofarenewalprocesswith interarrivaltimeY;,i~1.Hence, E[Nx]=mG(S - x) + 1, E[N,]= mG(S - s)+ 1, whereGisthecustomerdemanddistnbutionand '" ma(t) = L Gn(t). n=1 Hence,from(3.4.4),wearnve at :>_1 + mG(S- x) 11mP{X(t) - x} - 1(S)' 1-+'"+ mG- s s ~ x ~ S . 
3.4.2Limiting Mean Excess and the Expansion of m(t) 119 Letusstartbycomputingthemeanexcessofanonlatticerenewalprocess. Conditioning on SN(/)yields(byLemma3.4.3) E[Y(t)]=E[Y(t)ISN(r)=O]F(t) + f ~E[Y(t)ISN(r)=y]F(t - y) dm(y). 120RENEWAL THEORY ---------x--------------x--------------------x----Now, y ---Y(t)---Figure 3.4.1.SN(I)=Y;X::renewal. E [Y(t)ISN(r)=0]=E[X - tlX > t], E[Y(t)ISN(t)=y]=E[X - (t - y)IX > t - y]. wheretheabovefollowssinceSNIt)=ymeansthatthereisarenewalaty andthenextinterarrivaltime-call itX-is greaterthant- Y(seeFigure 3.4.1).Hence, E[Y(t)]=E[X - tlX> t]P(t) + J: E[X - (t - y)IX> t - y]F(t - y) dm(y). Nowitcan be shown thatthefunctionh(t)=E [X - tlX >t] F(t)isdirectly Riemann integrable provided E[X2] t] F(t) dtl IL =J; J ~(x- t) dF(x) dtllL =J; J: (x- t) dt dF(x)11L =J; x2 dF(x )/21L =E[X2]/21L. Thus wehaveproven thefollowing. PROPosnION3.4.6 (by interchange of order of integration) If theinterarrivaldistributionisnonlatticeandE[X2]1 THEOREM3.4.8 If Xo=1,m>1,andFnot lattice,then whereaistheuniquenumber suchthat J'"1 e-a'dF(x)=-om 122RENEWAL THEORY ProalByconditioningonT ..thelifetimeof theinitialorganism,weobtain M(t)= t IX(t)ITI= s]dF(s) However, (345) ~ '{I IX(t) ITI= s]= mM(t - s) ifs> t ifs st Toseewhy(345)istrue,supposethatT]=s,sst, andsupposefurtherthatthe organismhasjoffspringThenthenumber of organismsaliveattmaybewnttenas Y]+.+Y"whereY,isthenumber of descendants(includinghimself)of theith offspring that are alive at tClearly,Y1,,Y,are independent with the same distribu-tionasX(t- s)Thus,(YI++Y,)=jM(t- s),and(345) followsbytaking theexpectation(withrespectto j) of jM(t- s) Thus,fromtheaboveweobtain (346)M(t)= F(t)+ m t M(t- s) dF(s). 
Now, let α denote the unique positive number such that
m ∫_0^∞ e^{−αy} dF(y) = 1,
and define the distribution G by
G(s) = m ∫_0^s e^{−αy} dF(y), 0 ≤ s < ∞.
Upon multiplying both sides of (3.4.6) by e^{−αt} and using the fact that dG(s) = m e^{−αs} dF(s), we obtain
(3.4.7) e^{−αt} M(t) = e^{−αt} F̄(t) + ∫_0^t e^{−α(t−s)} M(t − s) dG(s).

…

DELAYED RENEWAL PROCESSES

If a renewal does not occur at t, then the distribution of the time we must wait until the first observed renewal will not be the same as the remaining interarrival distributions. Formally, let {X_n, n = 1, 2, ...} be a sequence of independent nonnegative random variables with X_1 having distribution G, and X_n having distribution F, n > 1. Let S_0 = 0, S_n = Σ_{i=1}^n X_i, n ≥ 1, and define
N_D(t) = sup{n: S_n ≤ t}.

Definition
The stochastic process {N_D(t), t ≥ 0} is called a general or a delayed renewal process.

When G = F, we have, of course, an ordinary renewal process. As in the ordinary case, we have
P{N_D(t) = n} = P{S_n ≤ t} − P{S_{n+1} ≤ t}.

…

EXAMPLE 3.5(A) A system consists of n independent components, each of which acts like an exponential alternating renewal process. More specifically, component i, i = 1, ..., n, is up for an exponential time with mean λ_i and then goes down, in which state it remains for an exponential time with mean μ_i, before going back up and starting anew.

Suppose that the system is said to be functional at any time if at least one component is up at that time (such a system is called parallel). If we let N(t) denote the number of times the system becomes nonfunctional (that is, breaks down) in [0, t], then {N(t), t ≥ 0} is a delayed renewal process.

Suppose we want to compute the mean time between system breakdowns. To do so let us first look at the probability of a breakdown in (t, t + h) for large t and small h. Now one way for a breakdown to occur in (t, t + h) is to have exactly 1 component up at time t and all others down, and then have that component fail. Since all other possibilities taken together clearly have probability o(h), we see that
lim_{t→∞} P{breakdown in (t, t + h)} = Σ_{i=1}^n {[λ_i/(λ_i + μ_i)] Π_{j≠i} [μ_j/(λ_j + μ_j)]} (h/λ_i) + o(h).
But by Blackwell's theorem the above is just h times the reciprocal of the mean time between breakdowns,
and so upon letting h → 0 we obtain
E[time between breakdowns] = {Π_{i=1}^n [μ_i/(λ_i + μ_i)] Σ_{i=1}^n (1/μ_i)}^{−1}.
As the expected length of a breakdown period is (Σ_{i=1}^n 1/μ_i)^{−1}, we can compute the average length of an up (or functional) period from
E[length of up period] = E[time between breakdowns] − (Σ_{i=1}^n 1/μ_i)^{−1}.
As a check of the above, note that the system may be regarded as a delayed alternating renewal process whose limiting probability of being down is
lim_{t→∞} P{system is down at t} = Π_{i=1}^n μ_i/(λ_i + μ_i).
We can now verify that the above is indeed equal to the expected length of a down period divided by the expected time length of a cycle (or time between breakdowns).

EXAMPLE 3.5(C) Consider two coins, and suppose that each time coin i is flipped it lands on tails with some unknown probability p_i, i = 1, 2. Our objective is to continually flip among these coins so as to make the long-run proportion of tails equal to min(p_1, p_2). The following strategy, having a very small memory requirement, will accomplish this objective. Start by flipping coin 1 until a tail occurs, at which point switch to coin 2 and flip it until a tail occurs. Say that cycle 1 ends at this point. Now flip coin 1 until two tails in a row occur, and then switch and do the same with coin 2. Say that cycle 2 ends at this point. In general, when cycle n ends, return to coin 1 and flip it until n + 1 tails in a row occur and then flip coin 2 until this occurs, which ends cycle n + 1.

To show that the preceding policy meets our objective, let p̄ = max(p_1, p_2) and αp̄ = min(p_1, p_2), where 0 < α ≤ 1. … For any ε > 0,
P{B_m ≥ εG_m for infinitely many m} = 0.

Proof. We will show that
Σ_m P{B_m ≥ εG_m} < ∞ …

…

E[R_{N(t)} | S_{N(t)} = s] = E[R_n | X_n > t − s], and so (3.6.1) … Now note that since
E[|R_1|] = ∫_0^∞ E[|R_1| | X_1 = x] dF(x) …

… is a regenerative process that regenerates whenever X_n takes value 0. To obtain some of the properties of this regenerative process, we will first study the symmetric random walk {Z_n, n ≥ 0}.
Let u_n = P{Z_{2n} = 0} and note that
(3.7.1) u_n = [(2n − 1)/(2n)] u_{n−1}.
Now let us recall from the results of Example 1.5(E) of Chapter 1 (the ballot problem example) the expression for the probability that the first visit to 0 in the symmetric random walk occurs at time 2n. Namely,
(3.7.2) P{Z_1 ≠ 0, Z_2 ≠ 0, ..., Z_{2n−1} ≠ 0, Z_{2n} = 0} = (2n choose n)(1/2)^{2n} / (2n − 1) = u_n/(2n − 1).
We will need the following lemma, which states that u_n, the probability that the symmetric random walk is at 0 at time 2n, is also equal to the probability that the random walk does not hit 0 by time 2n.

Lemma 3.7.3
P{Z_1 ≠ 0, ..., Z_{2n} ≠ 0} = u_n.

Proof. From (3.7.2) we see that
P{Z_1 ≠ 0, ..., Z_{2n} ≠ 0} = 1 − Σ_{k=1}^n u_k/(2k − 1).
Hence we must show that
(3.7.3) 1 − Σ_{k=1}^n u_k/(2k − 1) = u_n,
which we will do by induction on n. When n = 1, the above identity holds since u_1 = 1/2. So assume (3.7.3) for n − 1. Now
1 − Σ_{k=1}^n u_k/(2k − 1) = 1 − Σ_{k=1}^{n−1} u_k/(2k − 1) − u_n/(2n − 1)
= u_{n−1} − u_n/(2n − 1)  (by the induction hypothesis)
= u_n  (by (3.7.1)).
Thus the proof is complete.

Since u_n = (2n choose n)(1/2)^{2n}, it follows upon using an approximation due to Stirling, which states that n! ~ n^{n+1/2} e^{−n} √(2π), that
u_n ~ (2n)^{2n+1/2} e^{−2n} √(2π) / [n^{2n+1} e^{−2n} (2π) 2^{2n}] = 1/√(πn),
and so u_n → 0 as n → ∞. Thus from Lemma 3.7.3 we see that, with probability 1, the symmetric random walk will return to the origin.

The next proposition gives the distribution of the time of the last visit to 0 up to and including time 2n.

PROPOSITION 3.7.4
For k = 0, 1, ..., n,
P{Z_{2k} = 0, Z_{2k+1} ≠ 0, ..., Z_{2n} ≠ 0} = u_k u_{n−k}.

Proof.
P{Z_{2k} = 0, Z_{2k+1} ≠ 0, ..., Z_{2n} ≠ 0}
= P{Z_{2k} = 0} P{Z_{2k+1} ≠ 0, ..., Z_{2n} ≠ 0 | Z_{2k} = 0}
= u_k u_{n−k},
where we have used Lemma 3.7.3 to evaluate the second term on the right in the above.

We are now ready for our major result, which is that if we plot the symmetric random walk (starting with Z_0 = 0) by connecting Z_k and Z_{k+1} by a straight line (see Figure 3.7.1), then the probability that up to time 2n the process has been positive for 2k time units and negative for 2n − 2k time units is the same as the probability given in Proposition 3.7.4. (For the sample path presented in Figure 3.7.1, of the first eight time units the random walk was positive for six and negative for two.)

Figure 3.7.1. A sample path for the random walk.
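Lemma 3.7.3 is easy to check computationally. The following sketch (Python, not from the text) computes u_n = (2n choose n)(1/2)^{2n} directly and compares it against the no-return probability obtained by enumerating all 2^{2n} equally likely sample paths:

```python
from itertools import product
from math import comb

def u(n):
    # u_n = P{Z_{2n} = 0} = C(2n, n) / 2^{2n} for the symmetric simple random walk
    return comb(2 * n, n) / 4 ** n

def prob_no_return(n):
    # P{Z_1 != 0, ..., Z_{2n} != 0}, by brute-force enumeration of all +-1 paths
    count = 0
    for steps in product((-1, 1), repeat=2 * n):
        z, hit_zero = 0, False
        for s in steps:
            z += s
            if z == 0:
                hit_zero = True
                break
        if not hit_zero:
            count += 1
    return count / 4 ** n

# Lemma 3.7.3: the two probabilities coincide
for n in range(1, 6):
    assert abs(u(n) - prob_no_return(n)) < 1e-12
```

The agreement is exact for every n checked, and u_n tends to 0 at the rate 1/sqrt(pi*n), consistent with the Stirling estimate used in the text.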
146 RENEWAL THEORY

THEOREM 3.7.5
Let E_{k,n} denote the event that by time 2n the symmetric random walk will be positive for 2k time units and negative for 2n − 2k time units, and let b_{k,n} = P(E_{k,n}). Then
(3.7.4) b_{k,n} = u_k u_{n−k}.

Proof. The proof is by induction on n. Since u_0 = 1, it follows that (3.7.4) is valid when n = 1. So assume that b_{k,m} = u_k u_{m−k} for all values of m such that m < n. Consider first the case k = n, and condition on T, the time of the first return to 0:
b_{n,n} = Σ_{r=1}^n P{E_{n,n} | T = 2r} P{T = 2r} + P{E_{n,n} | T > 2n} P{T > 2n}.
Now given that T = 2r, it is equally likely that the random walk has always been positive or always negative in (0, 2r), and it is at 0 at 2r. Hence,
P{E_{n,n} | T > 2n} = 1/2,
and so
b_{n,n} = (1/2) Σ_{r=1}^n b_{n−r,n−r} P{T = 2r} + (1/2) P{T > 2n}
= (1/2) Σ_{r=1}^n u_{n−r} P{T = 2r} + (1/2) P{T > 2n},
where the last equality, b_{n−r,n−r} = u_{n−r} u_0, follows from the induction hypothesis. Now,
Σ_{r=1}^n u_{n−r} P{T = 2r} = Σ_{r=1}^n P{Z_{2n−2r} = 0} P{T = 2r}
= Σ_{r=1}^n P{Z_{2n} = 0 | T = 2r} P{T = 2r}
= u_n,
and so (by Lemma 3.7.3)
b_{n,n} = (1/2) u_n + (1/2) u_n = u_n.
Hence (3.7.4) is valid for k = n and, in fact, by symmetry, also for k = 0. The proof that (3.7.4) is valid for 0 < k < n follows in a similar fashion. Again, conditioning on T yields
b_{k,n} = Σ_{r=1}^n P{E_{k,n} | T = 2r} P{T = 2r}.
Now given that T = 2r, it is equally likely that the random walk has either always been positive or always negative in (0, 2r). Hence in order for E_{k,n} to occur, the continuation from time 2r to 2n would need 2k − 2r positive units in the former case and 2k in the latter. Hence,
b_{k,n} = (1/2) Σ_{r=1}^n b_{k−r,n−r} P{T = 2r} + (1/2) Σ_{r=1}^n b_{k,n−r} P{T = 2r}
= (1/2) Σ_{r=1}^n u_{k−r} u_{n−k} P{T = 2r} + (1/2) Σ_{r=1}^n u_k u_{n−r−k} P{T = 2r},
where the last equality follows from the induction hypothesis. As
Σ_{r=1}^n u_{k−r} P{T = 2r} = u_k,  Σ_{r=1}^n u_{n−r−k} P{T = 2r} = u_{n−k},
we see that
b_{k,n} = u_k u_{n−k},
which completes the proof.

The probability distribution given by Theorem 3.7.5, namely,
P(E_{k,n}) = u_k u_{n−k}, k = 0, 1, ..., n,
is called the discrete arcsine distribution. We call it such since for large k and n we have, by Stirling's approximation, that
u_k u_{n−k} ≈ 1/(π √(k(n − k))).
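Theorem 3.7.5 can also be confirmed by direct enumeration. In the sketch below (Python, not from the text), a time unit (i − 1, i] is counted as positive when the straight-line segment joining Z_{i−1} and Z_i lies above the axis, which for a simple walk is exactly the condition Z_{i−1} + Z_i > 0:

```python
from itertools import product
from math import comb

def u(n):
    # u_n = P{Z_{2n} = 0} for the symmetric simple random walk
    return comb(2 * n, n) / 4 ** n

def b(k, n):
    # P{exactly 2k of the first 2n time units are spent above the axis},
    # by enumerating all 4^n equally likely paths
    count = 0
    for steps in product((-1, 1), repeat=2 * n):
        z, positive_units = 0, 0
        for s in steps:
            z_new = z + s
            if z + z_new > 0:  # the segment from z to z_new lies above the axis
                positive_units += 1
            z = z_new
        if positive_units == 2 * k:
            count += 1
    return count / 4 ** n

# Theorem 3.7.5: b_{k,n} = u_k * u_{n-k}
for n in range(1, 5):
    for k in range(n + 1):
        assert abs(b(k, n) - u(k) * u(n - k)) < 1e-12
```

For instance, b(0, 1) = b(1, 1) = 1/2 = u_0 u_1, matching the base case of the induction.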
150 Thus,letting a be suchthat [(a)> 0,wehave (382) [(a) be suchthat [(s)ls> A - 6.Now,foranytE(0, s) thereisaninteger nsuchthat ss nn - 1 From themono tonicity of let) and from(3.82), weobtain that for allt inthisinterval (383) let)[(sin)_n- 1 [(sin)n- 1 [(s) -=:::------=:::.----tsl(n - 11nsinns Hence, Since6isarbitrary,and sincen00ast 0,itfollowsthatIim,....o[(t)lt= A NowassumeA=00.Inthiscase,fixanylargeA>andchoosessuchthat [(s)ls> AThen,from(383), it fOIlOWSthatforalltE(0,s) let) =:::n- 1 [(s) > n- 1 A, tnsn whichimplies [(t)lt==00,and theproof iscomplete ExAMPLE3.8(A)For theequilibnum renewalprocess P{N(t) >O}= Fe(t) = F(y) dyllL Hence,usingL'hospital'srule, A =limP{N(t) > O}=limF(t)=1.. HOtHOILIL STATIONARYPOINTPROCESSES Thus,fortheequilibriumrenewalprocess,A istherateofthe renewalprocess. Forany stationary pointprocess{N(t),t~O},wehavethat E[N(t + s)]==E[N(t + s)- N(s)]+ E[N(s)] = E[N(t)] + E[N(s)] implyingthat,forsome constant c, E[N(t)]=ct 151 WhatistherelationshipbetweencandA?(Inthecaseof theequilibrium renewalprocess,itfollowsfromExample 3 8(A) and Theorem 3.5.2 thatA = c=111-')In generalwenotethat since c =i nP{N(t) = n} n= It ~iP{N(t)n} n=1t _P{N(t) > O} t it followsthat c ~A.In order to determine when c =A,weneed the following concept.Astationary pointprocessissaidtobe regular or orderlyif (38.4)P{N(t) ::>2} = o(t) It should be noted that,fora stationarypoint process,(3.8.4)impliesthat the probability that two or more events will occur simultaneously at any point isO.To seethis,dividetheinterval[0,1]intonequal parts.The probability of a simultaneous occurrence of eventsislessthantheprobability of twoor moreeventsinanyof theintervals j=0, 1,,n1, and thusthisprobability isbounded bynP{N(lIn) ~2},which by(3.8.4) goes to zero asn ~00.When c k}, B.;fortbe even+ C: 1) - NW"'2}. n-l Bn= U Bn!' 
}:O C fortbeevent {NC: 1) - NW'" 1.N(1)-NC: 1) = k} LetB> 0andapositiveinteger mbegivenFromtheassumedregularityof the process,itfollowsthat B P(Bn})< n(m + 1)' j=O,I,.,n-I forallsufficientlylargenHence, Therefore, (385) whereBnisthecomplement of BnHowever,alittlethoughtrevealsthat n-l A.Iin = U c,lin, }:O andhence n-1 P(A,Bn) SL P(Cn,}), }:O PROBLEMS whichtogether with(385)impliesthat (386) mn-Im L P(Ak )::; L L P( Cnk)+ 8 k=OrOk-O ==nP{N(lIn) ::: 1}+ 8 ::; ,\ + 28 for allnsufficientlylargeNowsince(386)istruefor allm,itfollowsthat 00 L P(Ad ::; ,\ + 28 k=O Hence, 0000 c ==[N(1)l= L P{N(1) > k} ==L P(Ak)::;,\ + 28 and theresultisobtained as8isarbitraryanditisalreadyknownthat c;::::,\ PROBLEMS 3.1.Isit truethat (a)N(t)< nif and onlyifSn>t? (b)N(t)t? (c)N(t)>nifand onlyifSnX2,,"withErN.] 0, the nth departure leaves behind Xn customers-of which one enters service and the other Xn- 1 wait inline.Hence, at the next departure the system will contain theXn- 1 customers tha twere in line in addition to any arrivals during the service time of the(n+ 1)stcustomerSinceasimilarargumentholdswhen Xn=0,weseethat ( 4.1.2) if Xn> 0 if Xn= O. Since Yn,n~1, represent the number of arrivals in nonoverla p-ping service intervals, it follows,the arrival process being aPoisson process,that theyareindependent and fa:(Ax) I (4.1.3)P{Yn= j} =e-AX -.-, dG(x), oJ. j=0,1, .. From(4.1.2)and(41.3)itfollowsthat {Xn'n=1,2,. '.}isa Markov chainwithtransitionprobabilities givenby POI=f'"e-AX ( A ~ ) IdG(x),j;::=:0, oJ feo(Ax)r 1+I P'I =0e-AX (j _ i + 1)' dG(x), P'I = 0otherwise INTRODUCfION AND EXAMPLES EXAMPLE4.1(B)TheG/M/l Queue.Supposethatcustomersar-rive at a single-server service center in accordance with an arbitrary renewal process having interarrival distribution G. Suppose further that the servicedistributionisexponential withrate p.. 
If weletXIIdenotethenumber of customersinthesystemas seen by the nth arrival, itiseasyto seethat the process {X"' n2:: I}isaMarkovchainTo computethetransitionprobabilitiesP" forthisMarkovchain,letusfirstnotethat,aslongasthereare customerstobeserved,thenumberofservicesinanylengthof time t is aPoisson random variable withmean p.tThis istrue since thetimebetweensuccessiveservicesisexponentialand,aswe know,thisimpliesthatthenumberof servicesthusconstitutes a Poissonprocess.Therefore, f'e-I'I (p::)' dG(t), oJ. j=O,I, ... ,i, which followssince if an arrival findsi in the system, then the next arrival willfind i + 1 minus the number served, and theprobability thatjwillbeservediseasilyseen(byconditioningonthetime betweenthesuccessivearrivals)toequaltheright-handsideof the above TheformulaforP,oislittledifferent(itistheprobabilitythat atleasti+1Poissoneventsoccurinarandomlengthoftime havingdistributionG) andthusisgivenby P=f'"~e-1'1 (p.t)k dG(t)i 2::O. 100L.,kl' k ~ ,+ I 165 RemarkThe reader should note that in the previous two examples wewere abletodiscoveranembedded Markovchainbylooking attheprocessonly at certain time points, and by choosing thesetimepoints so asto exploitthe lack of memory of the exponential distributionThis is often a fruitful approach forprocessesinwhichtheexponentialdistributionispresent EXAMPLE4.1(c)Sums ofIndependent, Identically Distributed Ran-domVariables.TheGeneral RandomWalk.LetX"i>1,be independent andidenticallydistributedwith P{X,=j}= a"J0,::tl, If welet " So0andS"2: X" ,-I then {SlItn>O}isaMarkovchainforwhich 166MARKOVCHAINS {Sn'n;>O}iscalledthegeneralrandomwalkandwillbe studied inChapter 7. ExAMPLE4.1(D)The Absolute Value of the Simple Random Walk. 
The random walk {S_n, n ≥ 1}, where S_n = Σ_{i=1}^n X_i, is said to be a simple random walk if for some p, 0 < p < 1,
P{X_i = 1} = p, P{X_i = −1} = 1 − p.

…

P^{m+n}_{ik} = Σ_r P^m_{ir} P^n_{rk} ≥ P^m_{ij} P^n_{jk} > 0.
Similarly, we may show there exists an s for which P^s_{ki} > 0.

CHAPMAN-KOLMOGOROV EQUATIONS AND CLASSIFICATION OF STATES

Two states that communicate are said to be in the same class, and by Proposition 4.2.1, any two classes are either disjoint or identical. We say that the Markov chain is irreducible if there is only one class, that is, if all states communicate with each other.

State i is said to have period d if P^n_{ii} = 0 whenever n is not divisible by d and d is the greatest integer with this property. (If P^n_{ii} = 0 for all n > 0, then define the period of i to be infinite.) A state with period 1 is said to be aperiodic. Let d(i) denote the period of i. We now show that periodicity is a class property.

PROPOSITION 4.2.2
If i ↔ j, then d(i) = d(j).

Proof. Let m and n be such that P^n_{ji} P^m_{ij} > 0, and suppose that P^s_{ii} > 0. Then
P^{n+m}_{jj} ≥ P^n_{ji} P^m_{ij} > 0,
P^{n+s+m}_{jj} ≥ P^n_{ji} P^s_{ii} P^m_{ij} > 0,
where the second inequality follows, for instance, since the left-hand side represents the probability that starting in j the chain will be back in j after n + s + m transitions, whereas the right-hand side is the probability of the same event subject to the further restriction that the chain is in i both after n and after n + s transitions. Hence, d(j) divides both n + m and n + s + m, thus n + s + m − (n + m) = s, whenever P^s_{ii} > 0. Therefore, d(j) divides d(i). A similar argument yields that d(i) divides d(j), thus d(i) = d(j).

For any states i and j define f^n_{ij} to be the probability that, starting in i, the first transition into j occurs at time n. Formally,
f^0_{ij} = 0,
f^n_{ij} = P{X_n = j, X_k ≠ j, k = 1, ..., n − 1 | X_0 = i}.
Let
f_{ij} = Σ_{n=1}^∞ f^n_{ij}.
Then f_{ij} denotes the probability of ever making a transition into state j, given that the process starts in i. (Note that for i ≠ j, f_{ij} is positive if, and only if, j is accessible from i.) State j is said to be recurrent if f_{jj} = 1, and transient otherwise.
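The Chapman-Kolmogorov relation P^{n+m}_{ij} = Σ_k P^n_{ik} P^m_{kj} used above amounts to saying that the n-step transition matrix is the nth power of the one-step matrix. A minimal sketch (Python; the 3-state chain is an invented example, not from the text):

```python
def mat_mult(A, B):
    # product of two square matrices given as lists of rows
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def n_step(P, n):
    # n-step transition probabilities P_{ij}^n via repeated use of
    # the Chapman-Kolmogorov equations (i.e., the matrix power P^n)
    size = len(P)
    R = [[float(i == j) for j in range(size)] for i in range(size)]
    for _ in range(n):
        R = mat_mult(R, P)
    return R

# a small 3-state chain (each row sums to 1); purely illustrative
P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]

P5 = n_step(P, 5)
CK = mat_mult(n_step(P, 3), n_step(P, 2))  # P^{3+2} = P^3 P^2
for i in range(3):
    for j in range(3):
        assert abs(P5[i][j] - CK[i][j]) < 1e-12
```

Each row of n_step(P, n) remains a probability distribution, and splitting the five steps as 3 + 2 (or any other way) gives the same matrix, which is exactly the Chapman-Kolmogorov identity.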
., '" pn= 00 L../I CHAINS ProofState]isrecurrent if,withprobability 1,aprocess starting at j willeventually returnHowever, by the Markovian property it follows that the process pr 0, p ~> 0Nowforanys~0 pm+n+r:>pm P' pn }/- JIII1/ andthus "pm+n+r 2: pm pn "pr= 00 L..J11 I''1L..JIt' ,r andtheresultfollowsfromProposition423 EXAMPLE4.2(A)TheSimpleRandomWalk.TheMarkovchain whose state space isthe set of all integers and hastransition proba-bilities P,,_I=P=1 - P,,-I'i =0,::t:1,.. , where 0 < P < 1, is called the simple random walkOne interpreta-tion of this process is that it represents the wanderings of a drunken man ashewalks along a straight line.Another isthat it represents thewinningsofagamblerwhooneachplayof thegameeither winsor losesonedollar. SinceallstatesclearlycommunicateitfollowsfromCorollary 4.2.4thattheyareeitheralltransientorallrecurrentSoletus consider state 0 and attempt to determine if ~ : = IPoo isfiniteor in-finite Sinceitisimpossibletobeeven(usingthegamblingmodel interpretation)afteran oddnumber of plays,wemust,of course, havethat P2n+I_0 00- ,n= 1,2, ... Ontheotherhand,thegamblerwouldbeevenafter2ntrials if,and only if,hewon nof theseand lostnof theseAs eachplay ofthegameresultsinawinwithprobabilitypandalosswith probability 1 - p, the desired probability is thus the binomial proba-bility n=1,2,3, .. Byusinganapproximation,dueto Stirling,whichassertsthat 172MARKOV CHAINS wherewesaythat a"- b"whenlim" .... "'(a"lb,,)= 1,weobtain P2"(4p(1- p))" 00-V;;; Nowitiseasytoverifythatifa"- b",thenL" a"0,j>i-l, P'j= 0,j < ; - 1 Let p =~ jjajSince p equals the mean number of arrivals during a serviceperiod, itfollows,upon conditioning on the length of that period,i,ha t p= '\[S], whereSisaservicetimehavingdistributionG Weshallnow show that theMarkovchainispositiverecurrent whenpO.Then, as iand jdonotcommunicate (since je R). 
P_{ji}^n = 0 for all n. Hence, if the process starts in state i, there is a positive probability of at least P_{ij} that the process will never return to i. This contradicts the fact that i is recurrent, and so P_{ij} = 0.

Let j be a given recurrent state and let T denote the set of all transient states. For i ∈ T, we are often interested in computing f_{ij}, the probability of ever entering j given that the process starts in i. The following proposition, by conditioning on the state after the initial transition, yields a set of equations satisfied by the f_{ij}.

PROPOSITION 4.4.2
If j is recurrent, then the set of probabilities {f_{ij}, i ∈ T} satisfies

f_{ij} = Σ_{k∈T} P_{ik} f_{kj} + Σ_{k∈R} P_{ik},  i ∈ T,

where R denotes the set of states communicating with j.

Proof.

f_{ij} = P{N_j(∞) > 0 | X_0 = i}
= Σ_k P{N_j(∞) > 0 | X_0 = i, X_1 = k} P{X_1 = k | X_0 = i},

where we have used Corollary 4.2.5 in asserting that f_{kj} = 1 for k ∈ R, and Proposition 4.4.1 in asserting that f_{kj} = 0 for k ∉ T, k ∉ R.

EXAMPLE 4.4(A) The Gambler's Ruin Problem. Consider a gambler who at each play of the game has probability p of winning 1 unit and probability q = 1 - p of losing 1 unit. Assuming successive plays of the game are independent, what is the probability that, starting with i units, the gambler's fortune will reach N before reaching 0?

If we let X_n denote the player's fortune at time n, then the process {X_n, n = 0, 1, 2, ...} is a Markov chain with transition probabilities

P_{00} = P_{NN} = 1,
P_{i,i+1} = p = 1 - P_{i,i-1},  i = 1, 2, ..., N - 1.

This Markov chain has three classes, namely, {0}, {1, 2, ..., N - 1}, and {N}, the first and third class being recurrent and the second transient. Since each transient state is only visited finitely often, it follows that, after some finite amount of time, the gambler will either attain her goal of N or go broke.

Let f_i ≡ f_{iN} denote the probability that, starting with i, 0 ≤ i ≤ N, the gambler's fortune will eventually reach N. Conditioning on the outcome of the initial play yields

f_i = p f_{i+1} + q f_{i-1},  i = 1, ..., N - 1,

with f_0 = 0, f_N = 1, and solving these equations gives

f_i = (1 - (q/p)^i) / (1 - (q/p)^N)  if p ≠ 1/2,
f_i = i/N  if p = 1/2.

...

For the branching process {X_n} with offspring distribution {P_j}, let π_0 denote the probability that, starting with a single individual, the population eventually dies out.

THEOREM
Suppose P_0 > 0 and P_0 + P_1 < 1. Then:
(i) π_0 is the smallest positive number satisfying

(4.5.1)  π_0 = Σ_{j=0}^∞ π_0^j P_j;

(ii) π_0 = 1 if, and only if, m ≤ 1, where m = Σ_j j P_j is the mean number of offspring.

Proof. To show that π_0 is the smallest solution of (4.5.1), let π ≥ 0 satisfy (4.5.1).
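As a numerical illustration of part (i), π_0 can be computed by iterating the offspring generating function φ(s) = Σ_j P_j s^j starting from s = 0, since φ^(n)(0) = P{X_n = 0} increases to the smallest nonnegative fixed point. A minimal sketch, with hypothetical offspring distributions chosen for illustration:

```python
# Fixed-point iteration for the extinction probability of a branching
# process.  phi(s) = sum_j P_j s^j is the offspring generating function;
# iterating from s = 0 gives P{X_n = 0}, which increases to the smallest
# nonnegative solution of phi(s) = s.  The offspring distributions below
# are hypothetical illustrations, not data from the text.

def extinction_probability(p, tol=1e-12, max_iter=10_000):
    """p[j] = probability an individual has j offspring."""
    phi = lambda s: sum(pj * s ** j for j, pj in enumerate(p))
    s = 0.0
    for _ in range(max_iter):
        s_next = phi(s)
        if abs(s_next - s) < tol:
            return s_next
        s = s_next
    return s

# Supercritical case: m = 0.25 + 2 * 0.5 = 1.25 > 1, so pi_0 < 1; the
# fixed points of 0.25 + 0.25 s + 0.5 s^2 = s are 0.5 and 1.
pi0_super = extinction_probability([0.25, 0.25, 0.5])
# Subcritical case: m = 0.75 <= 1, so pi_0 = 1 by part (ii).
pi0_sub = extinction_probability([0.5, 0.25, 0.25])
```

The iteration mirrors the induction used in the proof: each pass computes P{X_{n+1} = 0} from P{X_n = 0}.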
We'll first show by induction that π ≥ P{X_n = 0} for all n. Now

π = Σ_j π^j P_j ≥ π^0 P_0 = P_0 = P{X_1 = 0},

and assume that π ≥ P{X_n = 0}. Then

P{X_{n+1} = 0} = Σ_j P{X_{n+1} = 0 | X_1 = j} P_j
= Σ_j (P{X_n = 0})^j P_j
≤ Σ_j π^j P_j  (by the induction hypothesis)
= π.

Hence, π ≥ P{X_n = 0} for all n, and letting n → ∞,

π ≥ lim_n P{X_n = 0} = P{population dies out} = π_0.

To prove (ii), define the generating function

φ(s) = Σ_{j=0}^∞ s^j P_j.

(Figures 4.5.1 and 4.5.2: the generating function φ(s) plotted against the 45° line on (0, 1), with φ(0) = P_0 and both curves meeting at the point (1, 1).)

Since P_0 + P_1 < 1,

φ''(s) = Σ_{j≥0} j(j - 1) s^{j-2} P_j > 0

for all s ∈ (0, 1). Hence, φ(s) is a strictly convex function in the open interval (0, 1). We now distinguish two cases (Figures 4.5.1 and 4.5.2). In Figure 4.5.1, φ(s) > s for all s ∈ (0, 1), and in Figure 4.5.2, φ(s) = s for some s ∈ (0, 1). It is geometrically clear that Figure 4.5.1 represents the appropriate picture when φ'(1) ≤ 1, and Figure 4.5.2 is appropriate when φ'(1) > 1. Thus, since φ(π_0) = π_0, π_0 = 1 if, and only if, φ'(1) ≤ 1. The result follows, since φ'(1) = Σ_j j P_j = m.

4.6 APPLICATIONS OF MARKOV CHAINS

4.6.1 A Markov Chain Model of Algorithmic Efficiency

Certain algorithms in operations research and computer science act in the following manner: the objective is to determine the best of a set of N ordered elements. The algorithm starts with one of the elements and then successively moves to a better element until it reaches the best. (The most important example is probably the simplex algorithm of linear programming, which attempts to maximize a linear function subject to linear constraints and where an element corresponds to an extreme point of the feasibility region.) If one looks at the algorithm's efficiency from a "worst case" point of view, then examples can usually be constructed that require roughly N - 1 steps to reach the optimal element. In this section, we will present a simple probabilistic model for the number of necessary steps. Specifically, we consider a Markov chain that when transiting from any state is equally likely to enter any of the better ones.

Consider a Markov chain for which P_{11} = 1 and

P_{ij} = 1/(i - 1),  j = 1, ...
, i - 1, for i > 1,

and let T_i denote the number of transitions to go from state i to state 1. A recursive formula for E[T_i] can be obtained by conditioning on the initial transition:

(4.6.1)  E[T_i] = 1 + (1/(i - 1)) Σ_{j=1}^{i-1} E[T_j].

Starting with E[T_1] = 0, we successively see that

E[T_2] = 1,
E[T_3] = 1 + 1/2,
E[T_4] = 1 + (1/3)(1 + 1 + 1/2) = 1 + 1/2 + 1/3,

and it is not difficult to guess and then prove inductively that

E[T_i] = Σ_{j=1}^{i-1} 1/j.

However, to obtain a more complete description of T_N, we will use the representation

T_N = Σ_{j=1}^{N-1} I_j,

where

I_j = 1 if the process ever enters j; 0 otherwise.

The importance of the above representation stems from the following.

Lemma 4.6.1
I_1, ..., I_{N-1} are independent and

P{I_j = 1} = 1/j,  1 ≤ j ≤ N - 1.

Proof. Given I_{j+1}, ..., I_N, let n = min{i : i > j, I_i = 1} denote the smallest state greater than j that is visited. From n the next state is equally likely to be any of 1, ..., n - 1, and by the minimality of n it must in fact be one of 1, ..., j. Hence

P{I_j = 1 | I_{j+1}, ..., I_N} = (1/(n - 1)) / (j/(n - 1)) = 1/j.

PROPOSITION 4.6.2
(i) E[T_N] = Σ_{j=1}^{N-1} 1/j.
(ii) Var(T_N) = Σ_{j=1}^{N-1} (1/j)(1 - 1/j).
(iii) For N large, T_N has approximately a Poisson distribution with mean log N.

Proof. Parts (i) and (ii) follow from Lemma 4.6.1 and the representation T_N = Σ_{j=1}^{N-1} I_j. Since the sum of a large number of independent Bernoulli random variables, each having a small probability of being nonzero, is approximately Poisson distributed, part (iii) follows since

∫_1^N dx/x ≤ Σ_{j=1}^{N-1} 1/j ≤ 1 + ∫_1^{N-1} dx/x,

or

log N ≤ Σ_{j=1}^{N-1} 1/j ≤ 1 + log(N - 1),

and so log N ≈ Σ_{j=1}^{N-1} 1/j.

... P{run length ≥ m | initial value x} = (1 - x)^{m-1}/(m - 1)!,

since in order for the run length to be at least m, the next m - 1 values must all be greater than x and they must be in increasing order.

To obtain the unconditional distribution of the length of a given run, let I_n denote the initial value of the nth run. Now it is easy to see that {I_n, n ≥ 1} is a Markov chain having a continuous state space. To compute p(y|x), the probability density that the next run begins with the value y given that a run has just begun with initial value x, reason as follows:

P{I_{n+1} ∈ (y, y + dy) | I_n = x} = Σ_{m=1}^∞ P{I_{n+1} ∈ (y, y + dy), L_n = m | I_n = x},

where L_n is the length of the nth run. Now the run will be of length m and the next one will start with value y if:

(i) the next m - 1 values are in increasing order and are all greater than x;
(ii) the mth value must equal y;
(iii) the maximum of the first m - 1 values must exceed y.
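The formulas of Proposition 4.6.2 for the algorithmic-efficiency chain are easy to check by simulation: from state i the chain jumps to a uniformly chosen better state, and T_N counts the jumps needed to reach state 1. A sketch (the helper names are ours, not the text's):

```python
# Simulation check of Proposition 4.6.2: E[T_N] equals the harmonic sum
# over 1..N-1, which is close to log N (the Poisson-approximation mean).
import random
from math import log

def simulate_T(N, rng):
    """One realization of T_N for the chain with P_ij = 1/(i-1), j < i."""
    state, steps = N, 0
    while state > 1:
        state = rng.randrange(1, state)   # uniform on 1, ..., state - 1
        steps += 1
    return steps

N = 1000
rng = random.Random(1)
harmonic = sum(1.0 / j for j in range(1, N))                      # exact E[T_N]
variance = sum((1.0 / j) * (1 - 1.0 / j) for j in range(1, N))    # exact Var(T_N)
est = sum(simulate_T(N, rng) for _ in range(20_000)) / 20_000     # simulated mean
```

The simulated mean should agree with the harmonic sum to within sampling error, and the harmonic sum lies within 1 of log N, consistent with part (iii).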
Hence,

P{I_{n+1} ∈ (y, y + dy), L_n = m | I_n = x}
= ((1 - x)^{m-1}/(m - 1)!) dy P{max(X_1, ..., X_{m-1}) > y | X_i > x, i = 1, ..., m - 1}
= ((1 - x)^{m-1}/(m - 1)!) dy  if y < x,
= ((1 - x)^{m-1}/(m - 1)!) [1 - ((y - x)/(1 - x))^{m-1}] dy  if y > x.

Summing over m yields

p(y|x) = e^{1-x}  if y < x,
p(y|x) = e^{1-x} - e^{y-x}  if y > x.

That is, {I_n, n ≥ 1} is a Markov chain having the continuous state space (0, 1) and a transition probability density p(y|x) given by the above.

To obtain the limiting distribution of I_n we will first hazard a guess and then verify our guess by using the analog of Theorem 4.3.3. Now I_1, being the initial value of the first run, is uniformly distributed over (0, 1). However, the later runs begin whenever a value smaller than the previous one occurs. So it seems plausible that the long-run proportion of such values that are less than y would equal the probability that a uniform (0, 1) random variable is less than y given that it is smaller than a second, and independent, uniform (0, 1) random variable. Hence it seems plausible that π(y), the limiting density of I_n, is given by

π(y) = 2(1 - y),  0 < y < 1.

... with the notation P^∞ signifying that the above are limiting probabilities. Before writing down the steady-state equations, it may be worth noting the following:

(i) Any element moves toward the back of the list at most one position at a time.
(ii) If an element is in position i and neither it nor any of the elements in the following k(i) positions are requested, it will remain in position i.
(iii) Any element in one of the positions i, i + 1, ..., i + k(i) will be moved to a position ...

... Now define a Markov chain with states 0, 1, ..., n and transition probabilities

(4.6.6)  P_{i,i-1} = c_i,  P_{i,i+k(i)} = 1 - c_i,  i = 1, ..., n - 1.

Let f_i denote the probability that this Markov chain ever enters state 0 given that it starts in state i. Then f_i satisfies

f_i = c_i f_{i-1} + (1 - c_i) f_{i+k(i)},  i = 1, ..., n - 1,
f_0 = 1.

Hence, as it can be shown that the above set of equations has a unique solution, it follows from (4.6.4) that if we take c_i equal to a_i for all i, then f_i will equal the π_i of rule R, and from (4.6.5) that if we let c_i = b_i, then f_i equals π̄_i. Now it is intuitively clear (and we defer a formal proof until Chapter 8) that the probability that the Markov chain defined by (4.6.6) will ever enter 0 is an increasing function of the vector c = (c_1, ..., c_{n-1}). Hence, since a_i ≥ b_i, i = 1, ..., n, we see that π_i ≥ π̄_i for all i. When p ≤ 1/n, then a_i ≤ b_i, i = 1, ..., n - 1, and the above inequality is reversed.

THEOREM 4.6.4
Among the rules considered, the limiting expected position of the element requested is minimized by the transposition rule.

Proof. Letting X denote the position of e_1, we have upon conditioning on whether or not e_1 is requested that the expected position of the requested element can be expressed as

E[Position] = p E[X] + (1 - p) E[(1 + 2 + ... + n - X)/(n - 1)]
= (p - (1 - p)/(n - 1)) E[X] + (1 - p) n(n + 1)/(2(n - 1)).

Thus, if p ≥ 1/n, the expected position is minimized by minimizing E[X], and if p ≤ 1/n, by maximizing E[X]. Since E[X] = Σ_{i≥0} P{X > i}, the result follows from Proposition 4.6.3.

4.7 TIME-REVERSIBLE MARKOV CHAINS

An irreducible positive recurrent Markov chain is stationary if the initial state is chosen according to the stationary probabilities. (In the case of an ergodic chain this is equivalent to imagining that the process begins at time t = -∞.) We say that such a chain is in steady state.

Consider now a stationary Markov chain having transition probabilities P_{ij} and stationary probabilities π_i, and suppose that starting at some time we trace the sequence of states going backwards in time. That is,
starting at time n, consider the sequence of states X_n, X_{n-1}, .... It turns out that this sequence of states is itself a Markov chain with transition probabilities P*_{ij} defined by

P*_{ij} = P{X_m = j | X_{m+1} = i}
= P{X_{m+1} = i | X_m = j} P{X_m = j} / P{X_{m+1} = i}.

To prove that the reversed process is indeed a Markov chain we need to verify that

P{X_m = j | X_{m+1} = i, X_{m+2}, X_{m+3}, ...} = P{X_m = j | X_{m+1} = i}.

To see that the preceding is true, think of the present time as being time m + 1. Then, since {X_n, n ≥ 1} is a Markov chain it follows that given the present state X_{m+1}, the past state X_m and the future states X_{m+2}, X_{m+3}, ... are independent. But this is exactly what the preceding equation states.

Thus the reversed process is also a Markov chain with transition probabilities given by

P*_{ij} = π_j P_{ji} / π_i.

If P*_{ij} = P_{ij} for all i, j, then the Markov chain is said to be time reversible. The condition for time reversibility, namely, that

(4.7.1)  π_i P_{ij} = π_j P_{ji}  for all i, j,

can be interpreted as stating that, for all states i and j, the rate at which the process goes from i to j (namely, π_i P_{ij}) is equal to the rate at which it goes from j to i (namely, π_j P_{ji}). It should be noted that this is an obvious necessary condition for time reversibility since a transition from i to j going backward in time is equivalent to a transition from j to i going forward in time; that is, if X_m = i and X_{m-1} = j, then a transition from i to j is observed if we are looking backward in time and one from j to i if we are looking forward in time.

If we can find nonnegative numbers, summing to 1, which satisfy (4.7.1), then it follows that the Markov chain is time reversible and the numbers represent the stationary probabilities. This is so since if

(4.7.2)  x_i P_{ij} = x_j P_{ji}  for all i, j,  Σ_i x_i = 1,

then summing over i yields

Σ_i x_i P_{ij} = x_j Σ_i P_{ji} = x_j,  Σ_i x_i = 1.

Since the stationary probabilities π_i are the unique solution of the above, it follows that x_i = π_i for all i.

EXAMPLE 4.7(A) An Ergodic Random Walk. We can argue, without any need for computations, that an ergodic chain with P_{i,i+1} + P_{i,i-1} = 1 is time reversible.
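The reversed-chain construction can be checked numerically: compute π for a small chain, form P*_{ij} = π_j P_{ji} / π_i, and compare with P. A sketch using a hypothetical 4-state birth-death chain (nearest-neighbour transitions only, so it should come out time reversible):

```python
# Numerical check of time reversibility: the reversed chain
# P*_{ij} = pi_j P_{ji} / pi_i should equal P for a reversible chain.
# The 4-state transition matrix below is a hypothetical birth-death
# chain (only nearest-neighbour moves), hence reversible.

def stationary(P, iters=10_000):
    """Stationary distribution by repeated left-multiplication."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def reversed_chain(P, pi):
    n = len(P)
    return [[pi[j] * P[j][i] / pi[i] for j in range(n)] for i in range(n)]

P = [[0.5, 0.5, 0.0, 0.0],
     [0.3, 0.2, 0.5, 0.0],
     [0.0, 0.4, 0.3, 0.3],
     [0.0, 0.0, 0.6, 0.4]]
pi = stationary(P)
Pstar = reversed_chain(P, pi)
reversible = all(abs(P[i][j] - Pstar[i][j]) < 1e-8
                 for i in range(4) for j in range(4))
```

For a chain that is not a birth-death chain the same computation would typically give P* ≠ P, exhibiting a non-reversible chain.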
This follows by noting that the number of transitions from i to i + 1 must at all times be within 1 of the number from i + 1 to i. This is so since between any two transitions from i to i + 1 there must be one from i + 1 to i (and conversely) since the only way to re-enter i from a higher state is by way of state i + 1. Hence it follows that the rate of transitions from i to i + 1 equals the rate from i + 1 to i, and so the process is time reversible.

EXAMPLE 4.7(B) The Metropolis Algorithm. Let a_j, j = 1, ..., m be positive numbers, and let A = Σ_{j=1}^m a_j. Suppose that m is large and that A is difficult to compute, and suppose we ideally want to simulate the values of a sequence of independent random variables whose probabilities are p_j = a_j/A, j = 1, ..., m. One way of simulating a sequence of random variables whose distributions converge to {p_j, j = 1, ..., m} is to find a Markov chain that is both easy to simulate and whose limiting probabilities are the p_j. The Metropolis algorithm provides an approach for accomplishing this task.

Let Q be any irreducible transition probability matrix on the integers 1, ..., m such that q_{ij} = q_{ji} for all i and j. Now define a Markov chain {X_n, n ≥ 0} as follows. If X_n = i, then generate a random variable that is equal to j with probability q_{ij}, i, j = 1, ..., m. If this random variable takes on the value j, then set X_{n+1} equal to j with probability min{1, a_j/a_i}, and set it equal to i otherwise. That is, the transition probabilities of {X_n, n ≥ 0} are

P_{ij} = q_{ij} min(1, a_j/a_i)  if j ≠ i,
P_{ii} = q_{ii} + Σ_{j≠i} q_{ij} (1 - min(1, a_j/a_i)).

We will now show that the limiting probabilities of this Markov chain are precisely the p_j.

To prove that the p_j are the limiting probabilities, we will first show that the chain is time reversible with stationary probabilities p_j, j = 1, ..., m by showing that

p_i P_{ij} = p_j P_{ji}.

To verify the preceding we must show that

p_i q_{ij} min(1, a_j/a_i) = p_j q_{ji} min(1, a_i/a_j).

Now, q_{ij} = q_{ji} and a_j/a_i = p_j/p_i, and so we must verify that

p_i min(1, p_j/p_i) = p_j min(1, p_i/p_j).
However this is immediate since both sides of the equation are equal to min(p_i, p_j). That these stationary probabilities are also limiting probabilities follows from the fact that since Q is an irreducible transition probability matrix, {X_n} will also be irreducible, and as (except in the trivial case where p_i ≡ 1/m) P_{ii} > 0 for some i, it is also aperiodic.

By choosing a transition probability matrix Q that is easy to simulate (that is, for each i it is easy to generate the value of a random variable that is equal to j with probability q_{ij}, j = 1, ..., m), we can use the preceding to generate a Markov chain whose limiting probabilities are a_j/A, j = 1, ..., m. This can also be accomplished without computing A.

Consider a graph having a positive number w_{ij} associated with each edge (i, j), and suppose that a particle moves from vertex to vertex in the following manner: If the particle is presently at vertex i then it will next move to vertex j with probability

P_{ij} = w_{ij} / Σ_j w_{ij},

where w_{ij} is 0 if (i, j) is not an edge of the graph. The Markov chain describing the sequence of vertices visited by the particle is called a random walk on an edge weighted graph.

PROPOSITION 4.7.1
Consider a random walk on an edge weighted graph with a finite number of vertices. If this Markov chain is irreducible then it is, in steady state, time reversible with stationary probabilities given by

π_i = Σ_j w_{ij} / Σ_i Σ_j w_{ij}.

Proof. The time reversibility equations

π_i P_{ij} = π_j P_{ji}

reduce to

π_i w_{ij} / Σ_k w_{ik} = π_j w_{ji} / Σ_k w_{jk},

or, equivalently, since w_{ij} = w_{ji},

π_i / Σ_k w_{ik} = π_j / Σ_k w_{jk},

implying that

π_i = c Σ_k w_{ik},

which, since Σ_i π_i = 1, proves the result.

EXAMPLE 4.7(C) Consider a star graph consisting of r rays, with each ray consisting of n vertices. (See Example 1.9(C) for the definition of a star graph.) Let leaf i denote the leaf on ray i. Assume that a particle moves along the vertices of the graph in the following manner. Whenever it is at the central vertex 0, it is then equally likely to move to any of its neighbors. Whenever it is on an internal (nonleaf) vertex of a ray, then it moves towards the leaf of that ray with probability p and towards 0 with probability 1 - p. Whenever it is at a leaf, it moves to its neighbor vertex with probability 1.
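Proposition 4.7.1 is easy to verify numerically: with a symmetric weight matrix, the vector π_i = Σ_j w_{ij} / Σ_i Σ_j w_{ij} satisfies the stationarity equations exactly. A sketch with a hypothetical 4-vertex weighted graph (a 0 entry means "no edge"):

```python
# Proposition 4.7.1, numerically: for a random walk on an edge-weighted
# graph, the stationary probability of vertex i is proportional to the
# total weight on the edges at i.  The weight matrix is a hypothetical
# symmetric example.
w = [[0, 2, 1, 0],
     [2, 0, 3, 1],
     [1, 3, 0, 4],
     [0, 1, 4, 0]]
n = len(w)
row = [sum(w[i]) for i in range(n)]          # total weight at each vertex
total = sum(row)                             # sum_i sum_j w_ij (edges counted twice)
pi = [row[i] / total for i in range(n)]      # claimed stationary distribution
P = [[w[i][j] / row[i] for j in range(n)] for i in range(n)]
# stationarity: sum_i pi_i P_ij = pi_j for every j
is_stationary = all(
    abs(sum(pi[i] * P[i][j] for i in range(n)) - pi[j]) < 1e-12
    for j in range(n))
```

The check succeeds term by term because Σ_i π_i P_{ij} = Σ_i w_{ij}/total = π_j, exactly the cancellation used in the proof.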
Starting at vertex 0, we are interested in finding the expected number of transitions that it takes to visit all the vertices and then return to 0.

Figure 4.7.1. A star graph with weights w = p/(1 - p).

To begin, let us determine the expected number of transitions between returns to the central vertex 0. To evaluate this quantity, note the Markov chain of successive vertices visited is of the type considered in Proposition 4.7.1. To see this, attach a weight equal to 1 with each edge connected to 0, and a weight equal to w^i on an edge connecting the ith and (i + 1)st vertex (from 0) of a ray, where w = p/(1 - p) (see Figure 4.7.1). Then, with these edge weights, the probability that a particle at a vertex i steps from 0 moves towards its leaf is w^i/(w^i + w^{i-1}) = p.

Since the total of the sum of the weights on the edges out of each of the vertices is 2r Σ_{i=0}^{n-1} w^i, and the sum of the weights on the edges out of vertex 0 is r, we see from Proposition 4.7.1 that

π_0 = r / (2r Σ_{i=0}^{n-1} w^i) = 1 / (2 Σ_{i=0}^{n-1} w^i).

Therefore, μ_00, the expected number of steps between returns to vertex 0, is

μ_00 = 1/π_0 = 2 Σ_{i=0}^{n-1} w^i.

Now, say that a new cycle begins whenever the particle returns to vertex 0, and let X_j be the number of transitions in the jth cycle, j ≥ 1. Also, fix i and let N denote the number of cycles that it takes for the particle to visit leaf i and then return to 0. With these definitions, Σ_{j=1}^N X_j is equal to the number of steps it takes to visit leaf i and then return to 0. As N is clearly a stopping time for the X_j, we obtain from Wald's equation that

E[Σ_{j=1}^N X_j] = μ_00 E[N].

... Suppose now that the embedded Markov chain {X_n, n ≥ 0} is irreducible and positive recurrent, and let its stationary probabilities be π_j, j ≥ 0. That is, the π_j, j ≥ 0, are the unique solution of

π_j = Σ_i π_i P_{ij},  Σ_j π_j = 1,

and π_j has the interpretation of being the proportion of the X_n's that equal j. (If the Markov chain is aperiodic, then π_j is also equal to lim_{n→∞} P{X_n = j}.) Now as π_j equals the proportion of transitions that are into state j, and μ_j is the mean time spent in state j per transition, it seems intuitive that the limiting probabilities should be proportional to π_j μ_j. We now prove this.
THEOREM 4.8.3
Suppose the conditions of Proposition 4.8.1 hold, and suppose further that the embedded Markov chain {X_n, n ≥ 0} is positive recurrent. Then

P_i = π_i μ_i / Σ_j π_j μ_j.

Proof. Define the notation as follows:

Y_i(j) = amount of time spent in state i during the jth visit to that state, i, j ≥ 0;
N_i(m) = number of visits to state i in the first m transitions of the semi-Markov process.

In terms of the above notation we see that the proportion of time in i during the first m transitions, call it P_i^m, is as follows:

(4.8.1)  P_i^m = Σ_{j=1}^{N_i(m)} Y_i(j) / Σ_i Σ_{j=1}^{N_i(m)} Y_i(j)
= (N_i(m)/m) Σ_{j=1}^{N_i(m)} Y_i(j)/N_i(m) / Σ_i (N_i(m)/m) Σ_{j=1}^{N_i(m)} Y_i(j)/N_i(m).

Now since N_i(m) → ∞ as m → ∞, it follows from the strong law of large numbers that

Σ_{j=1}^{N_i(m)} Y_i(j) / N_i(m) → μ_i,

and, by the strong law for renewal processes, that

N_i(m)/m → (E[number of transitions between visits to i])^{-1} = π_i.

Hence, letting m → ∞ in (4.8.1) shows that

P_i = π_i μ_i / Σ_j π_j μ_j,

and the proof is complete.

From Theorem 4.8.3 it follows that the limiting probabilities depend only on the transition probabilities P_{ij} and the mean times μ_i, i, j ≥ 0.

EXAMPLE 4.8(A) Consider a machine that can be in one of three states: good condition, fair condition, or broken down. Suppose that a machine in good condition will remain this way for a mean time μ_1 and will then go to either the fair condition or the broken condition with respective probabilities 3/4 and 1/4. A machine in the fair condition will remain that way for a mean time μ_2 and will then break down. A broken machine will be repaired, which takes a mean time μ_3, and when repaired will be in the good condition with probability 2/3 and the fair condition with probability 1/3. What proportion of time is the machine in each state?
Solution. Letting the states be 1, 2, 3, we have that the π_i satisfy

π_1 + π_2 + π_3 = 1,
π_1 = (2/3) π_3,
π_2 = (3/4) π_1 + (1/3) π_3,
π_3 = (1/4) π_1 + π_2.

The solution is

π_1 = 4/15,  π_2 = 1/3,  π_3 = 2/5.

Hence, P_i, the proportion of time the machine is in state i, is given by

P_1 = 4μ_1 / (4μ_1 + 5μ_2 + 6μ_3),
P_2 = 5μ_2 / (4μ_1 + 5μ_2 + 6μ_3),
P_3 = 6μ_3 / (4μ_1 + 5μ_2 + 6μ_3).

The problem of determining the limiting distribution of a semi-Markov process is not completely solved by deriving the P_i. For we may ask for the limit, as t → ∞, of the probability of being in state i at time t, of making the next transition after time t + x, and of this next transition being into state j. To express this probability let

Y(t) = time from t until the next transition,
S(t) = state entered at the first transition after t.

To compute

lim_{t→∞} P{Z(t) = i, Y(t) > x, S(t) = j},

we again use the theory of alternating renewal processes.

THEOREM 4.8.4
If the semi-Markov process is irreducible and not lattice, then

(4.8.2)  lim_{t→∞} P{Z(t) = i, Y(t) > x, S(t) = j | Z(0) = k} = P_{ij} ∫_x^∞ F̄_{ij}(y) dy / μ_{ii}.

Proof. Say that a cycle begins each time the process enters state i, and say that it is "on" if the state is i and it will remain i for at least the next x time units and the next state is j. Say it is "off" otherwise. Thus we have an alternating renewal process. Conditioning on whether the state after i is j or not, we see that

E["on" time in a cycle] = P_{ij} E[(X_{ij} - x)^+],

where X_{ij} is a random variable having distribution F_{ij} and representing the time to make a transition from i to j, and y^+ = max(0, y). Hence

E["on" time in cycle] = P_{ij} ∫_0^∞ P{X_{ij} - x > a} da
= P_{ij} ∫_0^∞ F̄_{ij}(a + x) da
= P_{ij} ∫_x^∞ F̄_{ij}(y) dy.

As E[cycle time] = μ_{ii}, the result follows from alternating renewal processes.

By the same technique (or by summing (4.8.2) over j) we can prove the following.

Corollary 4.8.5
If the semi-Markov process is irreducible and not lattice, then

(4.8.3)  lim_{t→∞} P{Z(t) = i, Y(t) > x | Z(0) = k} = ∫_x^∞ H̄_i(y) dy / μ_{ii}.

Remarks
(1) Of course the limiting probabilities in Theorem 4.8.4 and Corollary 4.8.5 also have interpretations as long-run proportions. For instance, the long-run proportion of time that the semi-Markov process is in state i and will spend the next x time units without a transition and will then go to state j is given by Theorem 4.8.4.
(2) Multiplying and dividing (4.8.3) by μ_i, and using P_i = μ_i/μ_{ii}, gives

lim_{t→∞} P{Z(t) = i, Y(t) > x} = P_i H̄_{i,e}(x),

where H_{i,e} is the equilibrium distribution of H_i. Hence the limiting probability of being in state i is P_i, and, given that the state at t is i, the time until transition (as t approaches ∞) has the equilibrium distribution of H_i.

PROBLEMS

4.1. A store that stocks a certain commodity uses the following (s, S) ordering policy: if its supply at the beginning of a time period is x, then it orders

0 if x ≥ s,
S - x if x < s.

The order is immediately filled. The daily demands are independent and equal j with probability α_j. All demands that cannot be immediately met are lost. Let X_n denote the inventory level at the end of the nth time period. Argue that {X_n, n ≥ 1} is a Markov chain and compute its transition probabilities.

4.2. For a Markov chain prove that

P{X_n = j | X_{n_1} = i_1, ..., X_{n_k} = i_k} = P{X_n = j | X_{n_k} = i_k}

whenever n_1 < n_2 < ... < n_k < n.

...
(b) Is {X_n, n ≥ 0} a Markov chain? If so, give its transition probabilities.
(c) Is {Y_n, n ≥ 0} a Markov chain? If so, give its transition probabilities.
(d) Is {(X_n, Y_n), n ≥ 0} a Markov chain? If so, give its transition probabilities.

4.11. ...

4.44. Consider a time-reversible Markov chain with transition probabilities P_{ij} and limiting probabilities π_i, and now consider the same chain truncated to the states 0, 1, ..., M. That is, for the truncated chain its transition probabilities P̄_{ij} are

P̄_{ii} = P_{ii} + Σ_{k>M} P_{ik},  0 ≤ i ≤ M, j = i,
P̄_{ij} = P_{ij},  0 ≤ i ≠ j ≤ M.

Show that the truncated chain is also time reversible and has limiting probabilities π_i / Σ_{j=0}^M π_j.

4.45. Show that an ergodic Markov chain with P_{ij} > 0 for all i ≠ j is time reversible if, and only if, for all i, j, k,

P_{ij} P_{jk} P_{ki} = P_{ik} P_{kj} P_{ji}.

4.46. Let {X_n, n ≥ 1} denote an irreducible Markov chain having a countable state space. Now consider a new stochastic process {Y_n, n ≥ 0} that only accepts values of the Markov chain that are between 0 and N. That is, we define Y_n to be the nth value of the Markov chain that is between 0 and N. For instance, if N = 3 and X_1 = 1, X_2 = 3, X_3 = 5, X_4 = 6, X_5 = 2, then Y_1 = 1, Y_2 = 3, Y_3 = 2.
(a) Is {Y_n, n ≥ 0} a Markov chain? Explain briefly.
(b) Let π_j denote the proportion of time that {X_n, n ≥ 1} is in state j. If π_j > 0 for all j, what proportion of time is {Y_n, n ≥ 0} in each of the states 0, 1, ..., N?
(c) Suppose {X_n} is null recurrent and let π_i(N), i = 0, 1, ..., N denote the long-run proportions for {Y_n, n ≥ 0}. Show that

π_j(N) = π_i(N) E[time the X process spends in j between returns to i],  j ≠ i.

(d) Use (c) to argue that in a symmetric random walk the expected number of visits to state i before returning to the origin equals 1.
(e) If {X_n, n ≥ 0} is time reversible, show that {Y_n, n ≥ 0} is also.

4.47. M balls are initially distributed among m urns. At each stage one of the balls is selected at random, taken from whichever urn it is in, and placed, at random, in one of the other m - 1 urns. Consider the Markov chain whose state at any time is the vector (n_1, ..., n_m), where n_i denotes the number of balls in urn i. Guess at the limiting probabilities for this Markov chain and then verify your guess and show at the same time that the Markov chain is time reversible.

4.48. For an ergodic semi-Markov process:
(a) Compute the rate at which the process makes a transition from i into j.
(b) Show that Σ_i P_{ij}/μ_{ii} = 1/μ_{jj}.
(c) Show that the proportion of time that the process is in state i and headed for state j is P_{ij} η_{ij}/μ_{ii}, where η_{ij} = ∫_0^∞ F̄_{ij}(t) dt.
(d) Show that the proportion of time that the state is i and will next be j within a time x is

(P_{ij} η_{ij}/μ_{ii}) F^e_{ij}(x),

where F^e_{ij} is the equilibrium distribution of F_{ij}.

4.49. For an ergodic semi-Markov process derive an expression, as t → ∞, for the limiting conditional probability that the next state visited after t is state j, given X(t) = i.

4.50. A taxi alternates between three locations. When it reaches location 1 it is equally likely to go next to either 2 or 3. When it reaches 2 it will next go to 1 with probability 1/3 and to 3 with probability 2/3. From 3 it always goes to 1. The mean times between locations i and j are t_{12} = 20, t_{13} = 30, t_{23} = 30 (t_{ij} = t_{ji}).
(a) What is the (limiting) probability that the taxi's most recent stop was at location i, i = 1, 2, 3?
(b) What is the (limiting) probability that the taxi is heading for location 2?
(c) What fraction of time is the taxi traveling from 2 to 3? Note: Upon arrival at a location the taxi immediately departs.
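Answers to semi-Markov problems such as 4.48-4.50 can be sanity-checked with Theorem 4.8.3, P_i = π_i μ_i / Σ_j π_j μ_j, where π is the stationary vector of the embedded chain. A sketch using the embedded chain of the three-state machine example (good, fair, broken; the mean holding times μ below are illustrative values, not data from the text):

```python
# Semi-Markov limiting probabilities via Theorem 4.8.3:
# P_i = pi_i mu_i / sum_j pi_j mu_j.  P is the embedded chain of the
# machine example (fractions 3/4, 1/4 and 2/3, 1/3); mu is illustrative.

def embedded_stationary(P, iters=5_000):
    """Stationary distribution of the embedded chain, by iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

P = [[0.0, 0.75, 0.25],
     [0.0, 0.0, 1.0],
     [2/3, 1/3, 0.0]]
mu = [1.0, 2.0, 3.0]                     # hypothetical mean holding times
pi = embedded_stationary(P)              # converges to (4/15, 1/3, 2/5)
norm = sum(p * m for p, m in zip(pi, mu))
limits = [p * m / norm for p, m in zip(pi, mu)]
```

With these μ the weights 4μ_1, 5μ_2, 6μ_3 from the worked example become 4, 10, 18, so limits recovers (4/32, 10/32, 18/32).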
REFERENCES

References 1, 2, 3, 4, 6, 9, 10, 11, and 12 give alternate treatments of Markov chains. Reference 5 is a standard text in branching processes. References 7 and 8 have nice treatments of time reversibility.

1. N. Bhat, Elements of Applied Stochastic Processes, Wiley, New York, 1984.
2. C. L. Chiang, An Introduction to Stochastic Processes and Their Applications, Krieger, 1980.
3. E. Cinlar, Introduction to Stochastic Processes, Prentice-Hall, Englewood Cliffs, NJ, 1975.
4. D. R. Cox and H. D. Miller, The Theory of Stochastic Processes, Methuen, London, 1965.
5. T. Harris, The Theory of Branching Processes, Springer-Verlag, Berlin, 1963.
6. S. Karlin and H. Taylor, A First Course in Stochastic Processes, 2nd ed., Academic Press, Orlando, FL, 1975.
7. J. Keilson, Markov Chain Models: Rarity and Exponentiality, Springer-Verlag, Berlin, 1979.
8. F. Kelly, Reversibility and Stochastic Networks, Wiley, Chichester, England, 1979.
9. J. Kemeny, L. Snell, and A. Knapp, Denumerable Markov Chains, Van Nostrand, Princeton, NJ, 1966.
10. S. Resnick, Adventures in Stochastic Processes, Birkhauser, Boston, MA, 1992.
11. S. M. Ross, Introduction to Probability Models, 5th ed., Academic Press, Orlando, FL, 1993.
12. H. C. Tijms, Stochastic Models: An Algorithmic Approach, Wiley, Chichester, England, 1994.