8/12/2019 Ts Tutorial.nb
Time Series Tutorial
Hal Varian, Oct 08, 2003
Load routines
Load data
In[253]:=
SetDirectory["/home/hal/Home/Consulting/Google/Tutorials/TimeSeries"];
Data is from https://www/eng/loganalysis.html (the Google.com results page) and consists of dates and page counts, starting with {"1998/11/16", 378}.
In[254]:=
dat = ReadList["google.dat", Number, RecordLists -> True];
In[255]:=
{d0, g0} = Transpose[dat];
Plot the data
In[256]:=
ListPlot[g0, PlotLabel -> "Daily searches", AxesLabel -> {"day", "searches"}]
[plot: "Daily searches" — searches vs. day, y-axis up to 3×10^8]
Out[256]=
Graphics
Connect the dots
In[257]:=
ListPlot[g0, PlotJoined -> True, PlotLabel -> "Daily searches", AxesLabel -> {"day", "searches"}]
[plot: "Daily searches" — searches vs. day, points joined]
Out[257]=
Graphics
See the weekly pattern:
In[258]:=
ListPlot[g0, PlotJoined -> True, PlotLabel -> "Daily searches", AxesLabel -> {"day", "searches"}, PlotRange -> {{1500, 1600}, Automatic}]
[plot: "Daily searches", zoomed to days 1500–1600]
Out[258]=
Graphics
Smooth the data
In[259]:=
ListPlot[MovingAverage[g0, 7], PlotJoined -> True, PlotLabel -> "Daily searches 7-day MA", AxesLabel -> {"day", "searches"}]
[plot: "Daily searches 7-day MA" — smoothed searches vs. day]
Out[259]= Graphics
Note Xmas, summer drops. Are the seasonal effects getting larger? Looks like exponential growth. Is traffic still growing
extremely rapidly?
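Outside Mathematica, the same 7-day smoothing can be sketched in plain Python; the sample counts below are invented for illustration, not the google.dat series:

```python
def moving_average(xs, k):
    """k-term moving average: the mean of each window of k consecutive points."""
    return [sum(xs[i:i + k]) / k for i in range(len(xs) - k + 1)]

# Two weeks of made-up daily counts with a weekend dip:
daily = [100, 110, 120, 130, 125, 60, 55] * 2
ma7 = moving_average(daily, 7)  # each value averages over one full weekly cycle
```

Because each window spans exactly 7 days, the day-of-week pattern cancels out of the smoothed series, which is why the weekly sawtooth disappears in the plot above.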
Take logs
If data shows exponential growth, you probably want to take logs; absolute changes in the log then correspond to proportional changes in the level.
Explanation: if g[t] is Google traffic at time t, then the change in the log of Google traffic over a small change in time
(i.e., one day) is:
In[260]:=
D[Log[h[t]], t]
Out[260]=
h'[t]/h[t]
So the absolute changes in Log g[t] are equal to proportional changes in g[t].
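A quick numerical check of this identity in Python (the 5%-per-period series is invented for illustration):

```python
import math

g = [100.0, 105.0, 110.25, 115.7625]  # made-up series growing 5% per period
log_diffs = [math.log(b) - math.log(a) for a, b in zip(g, g[1:])]
prop_changes = [b / a - 1 for a, b in zip(g, g[1:])]
# Every log-difference equals log(1.05) ~ 0.0488, close to the 5% proportional change.
```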
In[261]:=
ListPlot[Log[MovingAverage[g0, 7]], PlotJoined -> True, PlotLabel -> "Log daily searches"]
[plot: "Log daily searches" — note outlier spikes dipping down toward 10]
Out[261]=
Graphics
Note the outliers! These are data errors. Can interpolate, or, in this case, drop them. To make life easy, let us also start on
20000101.
In[262]:=
d0[[411]]
Out[262]=
19991231
Clean data
In[263]:=
g = Drop[g0, 411]; d = Drop[d0, 411];
In[265]:=
Take[d, 1]
Out[265]=
20000101
In[266]:=
Take[d, -1]
Out[266]=
20030611
In[267]:=
ListPlot[Log[MovingAverage[g, 7]], PlotJoined -> True, PlotLabel -> "Log daily searches"]
[plot: "Log daily searches" on the cleaned data, range roughly 16–19]
Out[267]=
Graphics
Look at it
We see Xmas, Easter, and the summer slump easily in the seasonals. Note that Xmas is the same drop (proportionally) while the summer slump is
getting bigger. Also see a general slowing down in growth. (Constant exponential growth would be a straight line.)
General philosophy
Want to decompose the series into seasonal pattern, trend, residual. Seasonals should be highly predictable, trend we hope
we can smoothly extrapolate, residuals should be unpredictable.
We will switch from a moving average to total weekly searches.
In[268]:=
wg = N[Log[Apply[Plus, Partition[g, 7], 1]]];
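The partition-and-sum step can be sketched in Python (a hypothetical helper mirroring Log[Apply[Plus, Partition[g, 7]]]):

```python
import math

def weekly_log_totals(daily):
    """Sum daily counts into weeks of 7, dropping any incomplete final week,
    then take logs of the weekly totals."""
    full = len(daily) - len(daily) % 7
    weeks = [daily[i:i + 7] for i in range(0, full, 7)]
    return [math.log(sum(w)) for w in weeks]
```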
In[269]:=
Table[{i, Mod[i, 13]}, {i, 0, 175, 13}]
Out[269]=
{{0, 0}, {13, 0}, {26, 0}, {39, 0}, {52, 0}, {65, 0}, {78, 0}, {91, 0}, {104, 0}, {117, 0}, {130, 0}, {143, 0}, {156, 0}, {169, 0}}
In[270]:=
tickMarks = {{1, "Q1\n2000"}, {13, "Q2"}, {26, "Q3"}, {39, "Q4"}, {52, "Q1\n2001"}, {65, "Q2"}, {78, "Q3"}, {91, "Q4"}, {104, "Q1\n2002"}, {117, "Q2"}, {130, "Q3"}, {143, "Q4"}, {156, "Q1\n2003"}, {169, "Q2"}};
In[271]:=
wPlot = ListPlot[wg, PlotJoined -> True, PlotLabel -> "Log weekly searches", AxesLabel -> {"week", "log g"}, Ticks -> {tickMarks, Automatic}]
[plot: "Log weekly searches" — log g vs. week, quarterly ticks Q1 2000 through Q2 2003]
Out[271]=
Graphics
In[272]:=
decomp = Import["decomp.eps"]
Out[272]=
Graphics
Here's an example of a "mechanical decomposition" into trend, seasonal, and residual using the "R" statistical software
package. (An open-source version of "S".)
In[273]:=
Show[decomp]
[figure: R decomposition — stacked panels for data, seasonal, trend, and remainder vs. time]
Out[273]=
Graphics
We'll do something like this below.
Approach 1: ARIMA
We have some time series of data x[t] for t = 1, ..., T.
AR: Autoregressive: x[t] = a x[t-1] + e[t], where e[t] is a random error.
I: Integrated: forecast differences (rates), then add up to get levels: x[t] - x[t-1], etc.
MA: Moving average: x[t] = b e[t-1] + e[t], etc., where e[t-1] is the forecast error in period t-1.
In general an ARIMA model with p AR terms and q MA terms looks like this:
y[t] = a1 y[t-1] + a2 y[t-2] + ... + ap y[t-p] + b1 e[t-1] + b2 e[t-2] + ... + bq e[t-q] + e[t]
where
y[t] = x[t] - x[t-1], or y[t] = (x[t] - x[t-1]) - (x[t-1] - x[t-2]), etc.
You'll see expressions like ARIMA(p,d,q) where:
p = number of AR terms
q = number of MA terms
d = degree of differencing
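To make the AR idea concrete, here is a minimal Python sketch that fits the single coefficient of an AR(1) model x[t] = a x[t-1] + e[t] by least squares. This is an illustration only, not the estimator used in the notebook:

```python
def fit_ar1(xs):
    """Least-squares estimate of a in x[t] = a x[t-1] + e[t] (no intercept):
    a = sum(x[t] * x[t-1]) / sum(x[t-1]^2)."""
    num = sum(xs[t] * xs[t - 1] for t in range(1, len(xs)))
    den = sum(xs[t - 1] ** 2 for t in range(1, len(xs)))
    return num / den

# On a noiseless AR(1) path with a = 0.5 the estimate recovers 0.5.
path = [1.0]
for _ in range(20):
    path.append(0.5 * path[-1])
```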
Average growth rate is this:
In[276]:=
mu = Mean[Difference[wg, 1]]
Out[276]=
0.0252693
Fit the trend:
In[277]:=
Fit[Difference[wg, 1], {1, t}, t]
Out[277]=
0.0441875 - 0.000211378 t
In[278]:=
fig02 = Plot[0.04418754661530251 - 0.00021137752232568352 t, {t, 0, 175}, PlotStyle -> Hue[.8]]
[plot: fitted linear trend in weekly growth rates, declining from about 0.04]
Out[278]=
Graphics
In[279]:=
Show[fig01, fig02]
[plot: "MA Growth Rates" with the fitted trend line overlaid]
Out[279]=
Graphics
(We'll do this in a better way below...)
Detrend the data. I could subtract off .044 - .0002t, but in this case I'm going to be lazy and just subtract off the mean
growth rate:
In[280]:=
mu
Out[280]=
0.0252693
In[281]:=
y = Difference[wg, 1] - mu;
In[282]:=
ListPlot[y, PlotLabel -> "weekly growth rates - mean", PlotJoined -> True]
[plot: demeaned weekly growth rates, roughly in the range -0.1 to 0.1]
Out[282]=
Graphics
Observe the pattern: obviously Xmas is a big deal, but you can also see how growth rates fall in the summer.
Look at autocorrelations: this shows how observation t is correlated with observation t-k. The dashed lines indicate statistical significance of
the estimated autocorrelation.
In[283]:=
myplotcorr2[CorrelationFunction[y, 15], PlotLabel -> "Autocorrelation", AxesLabel -> {"k", "corr"}]
[plot: "Autocorrelation" — corr vs. lag k, k = 1..15]
Out[283]=
Graphics
In this case the autocorrelations drop off quickly to zero, indicating that this is an AR process with short lag (maybe 0...).
But note seasonal correlation at lag 52:
In[289]:=
myplotcorr2[CorrelationFunction[res, 55]]
[plot: autocorrelation of residuals out to lag 55, with a spike near lag 52]
Out[289]=
Graphics
Model says once you demean the series:
log g[t] - log g[t-1] = .80 (log g[t-52] - log g[t-53])
Or:
g[t]/g[t-1] = (g[t-52]/g[t-53])^.80
In other words: the weekly growth rate this year is the weekly growth rate last year raised to the power .80.
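That forecasting rule is easy to state as code. A Python sketch (the function name and the rounded 0.80 are ours, for illustration):

```python
def forecast_next(g, rho=0.80, season=52):
    """One-step forecast: g[t] = g[t-1] * (g[t-52] / g[t-53]) ** rho,
    i.e. this week's growth factor is last year's raised to the power rho."""
    return g[-1] * (g[-season] / g[-season - 1]) ** rho

# With steady 2% weekly growth, the forecast growth factor is 1.02 ** 0.80.
g = [1.02 ** t for t in range(60)]
```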
Let's look at some recent data to see how the estimated parameter changes.
In[290]:=
Length[wg]
Out[290]=
179
In[291]:=
estmodel = ConditionalMLEstimate[Difference[Take[y, -160], 1], model]
Out[291]=
SARIMAModel[{0, 0, 52}, {}, {0.746602}, {}, {}, 0.00351982]
In[292]:=
estmodel = ConditionalMLEstimate[Difference[Take[y, -140], 1], model]
Out[292]=
SARIMAModel[{0, 0, 52}, {}, {0.819953}, {}, {}, 0.00363497]
In[293]:=
estmodel = ConditionalMLEstimate[Difference[Take[y, -100], 1], model]
Out[293]=
SARIMAModel[{0, 0, 52}, {}, {0.717923}, {}, {}, 0.00403863]
In[302]:=
aPlot = ListPlot[Take[wg, {upTo, Length[wg]}], PlotJoined -> True]
[plot: actual log weekly searches over the last ~30 weeks, range about 21.1–21.5]
Out[302]=
Graphics
In[303]:=
Show[aPlot, fPlot]
[plot: actual series with SARIMA forecast overlaid]
Out[303]=
Graphics
Note error with Easter! (Which needs to be handled separately...)
Approach 2: VAR
VAR = vector autoregression (actually a scalar regression in this case...)
We had:
log g[t] - log g[t-1] = b (log g[t-52] - log g[t-53]) + mu
Which is a special case of:
log g[t] = a1 log g[t-1] + a52 log g[t-52] + a53 log g[t-53] + mu
with a1 = 1, a53 = -a52. Why not estimate the coefficients directly?
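Estimating the coefficients directly just means regressing log g[t] on a constant and the three lags. Building that lagged design matrix can be sketched in Python (a hypothetical helper mirroring the gt/gt1/gt52/gt53 tables below):

```python
def lagged_design(y, lags=(1, 52, 53)):
    """Rows [1, y[t-1], y[t-52], y[t-53]] paired with target y[t],
    for t running from the largest lag to the end of the series."""
    m = max(lags)
    X = [[1.0] + [y[t - L] for L in lags] for t in range(m, len(y))]
    target = [y[t] for t in range(m, len(y))]
    return X, target
```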
In[304]:=
gt = Table[wg[[t]], {t, 54, Length[wg]}];
gt1 = Table[wg[[t - 1]], {t, 54, Length[wg]}];
gt52 = Table[wg[[t - 52]], {t, 54, Length[wg]}];
gt53 = Table[wg[[t - 53]], {t, 54, Length[wg]}];
In[308]:=
Map[Length, {gt, gt1, gt52, gt53}]
Out[308]=
{126, 126, 126, 126}
In[309]:=
Regress[Transpose[{gt1, gt52, gt53, gt}], {LAG1, LAG52, LAG53}, {LAG1, LAG52, LAG53}]
Out[309]=
ParameterTable ->
         Estimate   SE          TStat     PValue
  1      2.79123    0.503679    5.54168   1.75874×10^-7
  LAG1   0.707748   0.0529117   13.376    0.
  LAG52  0.763323   0.0601906   12.6818   0.
  LAG53  -0.596255  0.0710146   -8.39622  9.73386×10^-14
,
RSquared -> 0.997159, AdjustedRSquared -> 0.99709, EstimatedVariance -> 0.00130016,
ANOVATable ->
         DF   SumOfSq   MeanSq       FRatio   PValue
  Model  3    55.6836   18.5612      14276.   0.
  Error  122  0.15862   0.00130016
  Total  125  55.8422
In[310]:=
Clear[gF]
gF[t_] := gF[t] = 2.7912 + 0.70774 gF[t - 1] + 0.76332 gF[t - 52] - 0.59625 gF[t - 53];
Table[gF[t] = wg[[t]], {t, 1, upTo}];
In[313]:=
wgF = Table[gF[t], {t, upTo, Length[wg]}];
In[314]:=
vPlot = ListPlot[wgF, PlotJoined -> True, PlotStyle -> Hue[.6]]
[plot: VAR forecast over the last ~30 weeks]
Out[314]=
Graphics
Seems to fit a little better...
In[315]:=
Show[vPlot, aPlot, fPlot]
[plot: VAR forecast, actual series, and SARIMA forecast together]
Out[315]=
Graphics
Advantage: it is quite easy to add other explanatory variables (e.g., Yahoo traffic, AOL traffic, etc.). On the other hand, it is often
difficult to guess the lag structure. The autocorrelation function and other ARIMA tools work better for that.
Approach 3: Basic Structural Model
Go back to the decomposition of the series into trend + seasonal + noise. We'll set up dummies (0/1 variables) for each week and
estimate the "week effect" and "trend" simultaneously. Since the seasonal cycle is yearly, we want 52 dummies, or 51
dummies + constant.
Example: suppose we were looking at daily data. We would want 6 dummies, one for each day of the week (with the
omitted dummy being the base day). In this case the sum of the dummies over the week is 1. However, if we choose
dummies a little more cleverly, we can make the sum of the dummies over the week equal to zero.
In[316]:=
seasonal[T_, s_] := Table[Which[Mod[t, s] == j, 1, Mod[t, s] == 0, -1, True, 0], {t, 1, T}, {j, 1, s - 1}];
In[317]:=
TableForm[seasonal[16, 7]]
Out[317]//TableForm=
 1  0  0  0  0  0
 0  1  0  0  0  0
 0  0  1  0  0  0
 0  0  0  1  0  0
 0  0  0  0  1  0
 0  0  0  0  0  1
-1 -1 -1 -1 -1 -1
 1  0  0  0  0  0
 0  1  0  0  0  0
 0  0  1  0  0  0
 0  0  0  1  0  0
 0  0  0  0  1  0
 0  0  0  0  0  1
-1 -1 -1 -1 -1 -1
 1  0  0  0  0  0
 0  1  0  0  0  0
Over the course of any week, the seasonal dummies sum up to zero so they measure the "pure" seasonal effect.
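The same zero-sum dummy construction can be sketched in Python (daily example with s = 7; the function name is ours):

```python
def seasonal_dummies(T, s):
    """For t = 1..T, column j is 1 when t mod s == j; the base period
    (t mod s == 0) gets -1 in every column, so each column sums to zero
    over any full cycle of s periods."""
    rows = []
    for t in range(1, T + 1):
        r = t % s
        if r == 0:
            rows.append([-1] * (s - 1))
        else:
            rows.append([1 if j == r else 0 for j in range(1, s)])
    return rows
```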
Then we want to estimate a regression like:
y[t] = b0 + b1 t + b2 t^2 + a1 d1 + a2 d2 + ... + a6 d6
The first part is a quadratic trend; the second part is the day-of-week effect, or in our case, the week-of-year effect. (Turns out that t to the 1.7 works a little better than quadratic.)
In[318]:=
f[x_] := {x, x^2};
BSM[data_, s_, g_, opts___] :=
  Module[{T, t, j, seasonal, temp, tnames, dat, vars},
    T = Length[data];
    seasonal = Table[Which[Mod[t, s] == j, 1,
      Mod[t, s] == 0, -1, True, 0], {t, 1, T}, {j, 1, s - 1}];
    temp = Table[g[i], {i, 1, Length[data]}];
    tnames = g[trend];
    dat = Transpose[Join[Transpose[seasonal], Transpose[temp], {data}]];
    vars = Join[Table[StringJoin["s", ToString[j]], {j, 1, s - 1}], tnames];
    Regress[dat, vars, vars, opts]]
In[320]:=
BSM[wg, 52, f]
Out[320]=
ParameterTable
Estimate SE TStat PValue
1. 16.9223 0.0138337 1223.26 0.
s1 0.100933 0.0289021 3.49224 0.000662803
s2 0.0689877 0.028897 2.38737 0.0184677
s3 0.0852454 0.0288924 2.95044 0.00378961
s4 0.0971057 0.0288883 3.36141 0.00102915
s5 0.0878672 0.0288848 3.04199 0.0028646
s6 0.0897671 0.0288817 3.10809 0.00233162
s7 0.070653 0.0288791 2.4465 0.0158157
s8 0.0787919 0.028877 2.72853 0.0072787
s9 0.0908617 0.0288754 3.14668 0.00206463
s10 0.0905012 0.0288743 3.13432 0.00214689
s11 0.0988849 0.0288736 3.42475 0.000832912
s12 0.0876539 0.0288733 3.03581 0.00291982
s13 0.0618928 0.0288736 2.14358 0.0340035
s14 0.0678918 0.0288743 2.35129 0.0202725
s15 0.0644817 0.0288754 2.2331 0.0273216
s16 0.0318148 0.028877 1.10173 0.272694
s17 0.0473982 0.0288791 1.64126 0.103257
s18 0.0387293 0.0288817 1.34096 0.182364
s19 0.0593591 0.0288848 2.05503 0.0419592
s20 0.0630918 0.0288883 2.18399 0.0308297
s21 0.0590778 0.0288924 2.04475 0.042979
s22 0.0204953 0.028897 0.709252 0.479489
s23 0.0335837 0.0289021 1.16198 0.247457
s24 0.0167598 0.0332324 0.504321 0.614924
s25 0.0480584 0.0332312 1.44619 0.150628
s26 0.0574113 0.0332301 1.72769 0.0865135
s27 0.100312 0.0332292 3.01879 0.00307693
s28 0.051014 0.0332284 1.53525 0.127249
s29 0.0623868 0.0332278 1.87755 0.0627734
s30 0.0712248 0.0332272 2.14357 0.0340043
s31 0.077899 0.0332268 2.34447 0.0206309
s32 0.0908669 0.0332264 2.73478 0.0071497
s33 0.098071 0.0332261 2.95163 0.00377609
s34 0.0687385 0.0332258 2.06883 0.0406236
s35 0.0473488 0.0332257 1.42507 0.15663
s36 0.0636908 0.0332255 1.91692 0.0575302
s37 0.0653859 0.0332255 1.96794 0.0512882
s38 0.0466455 0.0332254 1.40391 0.162825
s39 0.0497186 0.0332255 1.4964 0.13707
s40 0.0483864 0.0332255 1.4563 0.147817
s41 0.0402299 0.0332257 1.21081 0.228253
s42 0.0224446 0.0332258 0.675517 0.500595
s43 0.0138281 0.0332261 0.416181 0.677991
s44 0.00892977 0.0332264 0.268755 0.788561
s45 0.0106015 0.0332268 0.319065 0.750209
s46 0.0259868 0.0332272 0.782093 0.435639
s47 0.00482012 0.0332278 0.145063 0.884895
s48 0.0407816 0.0332284 1.22731 0.222013
s49 0.0376962 0.0332292 1.13443 0.258786
s50 0.014447 0.0332301 0.434757 0.664489
s51 0.0647843 0.0332312 1.9495 0.0534745
trend 0.0475137 0.000356546 133.261 0.
trend2 -0.000125449 1.92374×10^-6 -65.2112 2.18303×10^-98
,
RSquared -> 0.998651, AdjustedRSquared -> 0.998079, EstimatedVariance -> 0.00337758,
ANOVATable ->
         DF   SumOfSq    MeanSq       FRatio    PValue
  Model  53   312.557    5.89731      1746.02   0.
  Error  125  0.422197   0.00337758
  Total  178  312.98
Here are the BSM estimates:
In[321]:=
reg = BSM[wg, 52, f, RegressionReport -> PredictedResponse];
In[322]:=
pred = PredictedResponse /. reg;
In[323]:=
pPlot = ListPlot[pred, PlotStyle -> Hue[.8], PlotJoined -> True]
[plot: BSM predicted response, range about 18–21]
Out[323]=
Graphics
In[324]:=
Show[pPlot, wPlot, Ticks -> {tickMarks, Automatic}]
[plot: BSM prediction overlaid on actual log weekly searches, quarterly ticks Q1 2000 – Q2 2003]
Out[324]=
Graphics
Pull out the estimated parameters:
In[325]:=
reg = BSM[wg, 52, f, RegressionReport -> BestFitParameters];
In[326]:=
bf = BestFitParameters /. reg
Out[326]=
16.9223,
0.100933, 0.0689877, 0.0852454, 0.0971057, 0.0878672, 0.0897671,0.070653, 0.0787919, 0.0908617, 0.0905012, 0.0988849, 0.0876539, 0.0618928,0.0678918, 0.0644817, 0.0318148, 0.0473982, 0.0387293, 0.0593591, 0.0630918,0.0590778, 0.0204953, 0.0335837, 0.0167598, 0.0480584, 0.0574113, 0.100312,0.051014, 0.0623868, 0.0712248, 0.077899, 0.0908669, 0.098071, 0.0687385,0.0473488, 0.0636908, 0.0653859, 0.0466455, 0.0497186, 0.0483864,0.0402299, 0.0224446, 0.0138281, 0.00892977, 0.0106015, 0.0259868,0.00482012, 0.0407816, 0.0376962, 0.014447, 0.0647843, 0.0475137, 0.000125449
Pull out the coefficients on the weekly dummies:
In[327]:=
temp = Take[bf, {2, 52}];
The dummy for week 52 is minus the sum of the other dummies.
In[328]:=
seas = Append[temp, -Apply[Plus, temp]];
Here's the pattern of seasonal adjustment around the trend:
In[329]:=
sPlot = ListPlot[seas, PlotJoined -> True]
[plot: estimated seasonal pattern over the 52 weeks, roughly -0.3 to 0.1]
Out[329]=
Graphics
In[330]:=
bForecast = Take[pred, -(Length[wg] - upTo + 1)];
In[331]:=
bsmPlot = ListPlot[bForecast, PlotJoined -> True, PlotStyle -> Hue[.4]]
[plot: BSM forecast over the last ~30 weeks]
Out[331]=
Graphics
In[332]:=
Show[vPlot, aPlot, fPlot, bsmPlot]
[plot: VAR, actual, SARIMA, and BSM series together]
Out[332]=
Graphics
Note that it is underforecasting a bit. One reason is that the quadratic trend is too pessimistic. We can go back and estimate
that too, and it turns out that 1.7 is somewhat better.
Out of sample forecasts
You should also do out-of-sample forecasts if possible. The last day in this dataset was June 11, 2003. I loaded in some
other data up until Oct 5 and forecast that. Here's the result.
[plot: out-of-sample forecasts, weeks ~150–190, log searches about 20.6–22]
In this case, VAR performs pretty well. BSM is underforecasting (because the quadratic slowing is too extreme); AR is overforecasting, because there is no slowing factor in the trend.
Seasonal adjustment
Actual traffic:
In[333]:=
Show[wPlot, DisplayFunction -> $DisplayFunction]
[plot: "Log weekly searches" — actual traffic, quarterly ticks Q1 2000 – Q2 2003]
Out[333]= Graphics
Seasonal adjustment
In[334]:=
Mod[Length[wg], 52]
Out[334]=
23
In[335]:=
sAdj = Flatten[Join[seas, seas, seas, Take[seas, Mod[Length[wg], 52]]]];
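The tiling step — repeating the 52-week seasonal pattern to cover the whole sample — can be sketched generically in Python:

```python
def tile_seasonal(seas, n):
    """Repeat a seasonal pattern end to end, truncating so the result has
    exactly n entries (here: 3 full years plus a partial 23-week year)."""
    reps, rem = divmod(n, len(seas))
    return seas * reps + seas[:rem]
```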
In[336]:=
ListPlot[sAdj, PlotJoined -> True]
[plot: seasonal adjustment series tiled over the full sample]
Out[336]=
Graphics
Put them together to get seasonally adjusted data (trend + noise)
In[337]:=
trendPlusNoise = ListPlot[wg - sAdj, Ticks -> {tickMarks, Automatic}, PlotJoined -> True]
[plot: seasonally adjusted data (trend + noise), quarterly ticks Q1 2000 – Q2 2003]
Out[337]=
Graphics
Here's the quadratic trend from the BSM:
In[338]:=
pureTrend = Plot[16.92 + .0475 t - 0.000125 t^2, {t, 1, Length[wg]}, PlotStyle -> Hue[.8]]
[plot: the pure quadratic trend, range about 18–21]
Out[338]=
Graphics
Here is trend plus noise (= actual - seasonal adjustment) and pure trend:
In[339]:=
Show[trendPlusNoise, pureTrend]
[plot: trend + noise with the pure trend overlaid]
Out[339]=
Graphics
Long term forecasting
Look back at log of traffic. The most obvious "trend" is the slowing down of growth rates.
In[340]:=
Show[wPlot, DisplayFunction -> $DisplayFunction]
[plot: "Log weekly searches", quarterly ticks Q1 2000 – Q2 2003]
Out[340]=
Graphics
This is also apparent in looking at the growth rates
In[341]:=
Show[fig01, fig02]
[plot: weekly growth rates with the fitted trend line]
Out[341]=
Graphics
But a better way to visualize this is to look at the year-to-year growth rates.
In[342]:=
z = N[Apply[Plus, Partition[g, 7], 1]];
[plots: "Log g, Log q" vs. week for sample queries: cruises, digital cameras, golf clubs, hotels, summer jobs]