Analysis, Forecasting and Mining Information from Time Series using F-Transform and Fuzzy Natural Logic

Vilém Novák
University of Ostrava, Institute for Research and Applications of Fuzzy Modeling
NSC IT4Innovations, Ostrava, Czech Republic
[email protected]

Ho Chi Minh City, October 19, 2015




Introduction SC techniques Analysis of TS Forecasting LFL Forecaster Trend characterization Conclusions

Outline

1 Introduction

2 Special soft computing techniques

3 Analysis of time series

4 Forecasting

5 Demonstration of forecasting using LFL Forecaster

6 Linguistic characterization of trend

7 Conclusions


Time series

X : T × Ω → R

T = {0, 1, …, N}; T = [a, b]; T = R

Fixed ω ∈ Ω: realization

{X(t) | t ∈ T}


Standard approaches

Box-Jenkins methodology: e.g. ARMA(p,q)

X(t) = c + ε(t) + Σ_{i=1}^{p} φ_i X(t − i) + Σ_{i=1}^{q} θ_i ε(t − i)

φ_1, …, φ_p — autoregressive model parameters
θ_1, …, θ_q — moving average model parameters
ε — noise

Very powerful in forecasting

How to interpret the process? How to understand it?
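As a minimal illustration of the autoregressive part of the Box-Jenkins model above, the following sketch simulates an AR(1) process (the ARMA(1,0) special case) and recovers its parameters by least squares. The function names and parameter values are illustrative, not taken from the talk.

```python
import random

def simulate_ar1(phi, c, sigma, n, seed=0):
    """Simulate X(t) = c + phi * X(t-1) + eps(t), i.e. ARMA(1,0)."""
    rng = random.Random(seed)
    x = [c / (1 - phi)]  # start at the process mean
    for _ in range(n - 1):
        x.append(c + phi * x[-1] + rng.gauss(0, sigma))
    return x

def fit_ar1(x):
    """Least-squares estimates of phi and c from consecutive pairs."""
    n = len(x) - 1
    mx = sum(x[:-1]) / n
    my = sum(x[1:]) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x[:-1], x[1:]))
    var = sum((a - mx) ** 2 for a in x[:-1])
    phi_hat = cov / var
    return phi_hat, my - phi_hat * mx

series = simulate_ar1(phi=0.7, c=1.0, sigma=0.5, n=2000)
phi_hat, c_hat = fit_ar1(series)
```

With enough data the estimates come close to the true φ = 0.7 and c = 1.0 — powerful in forecasting, yet the fitted numbers alone say little about how to interpret the process.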


Standard approaches

Decomposition

X(t) = Tr(t) + C(t) + S(t) + R(t), t ∈ T,  where TC(t) = Tr(t) + C(t)

Tr(t) — trend
C(t) — cycle
S(t) — seasonal component
R(t) — random error
TC(t) — trend-cycle


Special soft computing techniques

(i) Fuzzy Natural Logic (FNL)
(ii) Fuzzy (F-)transform

Applied to:

• time series analysis
• time series forecasting
• mining information from time series


Fuzzy (F)-transform

Irina Perfilieva

Transform a continuous function f to a finite n-dimensional (real) vector of numbers (direct phase) and then approximate the original function (inverse phase) by f̂.

The inverse F-transform f̂ can be constructed to have required properties.
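A minimal sketch of the two phases, assuming a uniform triangular fuzzy partition; the function names and the test function sin(2πt) are mine, not from the slides.

```python
import math

def triangular_partition(a, b, n):
    """n triangular basic functions A_0..A_{n-1} forming a uniform
    fuzzy partition of [a, b] with nodes a, a+h, ..., b."""
    h = (b - a) / (n - 1)
    nodes = [a + k * h for k in range(n)]
    def A(k, t):
        return max(0.0, 1.0 - abs(t - nodes[k]) / h)
    return A

def direct_ftransform(f_vals, ts, A, n):
    """Direct phase: component F_k is the weighted average of f w.r.t. A_k."""
    comps = []
    for k in range(n):
        w = [A(k, t) for t in ts]
        comps.append(sum(wi * fi for wi, fi in zip(w, f_vals)) / sum(w))
    return comps

def inverse_ftransform(comps, A, t):
    """Inverse phase: f_hat(t) = sum_k F_k * A_k(t)."""
    return sum(Fk * A(k, t) for k, Fk in enumerate(comps))

n = 21
A = triangular_partition(0.0, 1.0, n)
ts = [i / 200 for i in range(201)]
f_vals = [math.sin(2 * math.pi * t) for t in ts]
comps = direct_ftransform(f_vals, ts, A, n)
f_hat = inverse_ftransform(comps, A, 0.25)  # approximates sin(pi/2) = 1
```

The 201 samples are compressed into 21 components, and the inverse phase reconstructs a close approximation of the smooth original.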


Fuzzy Natural Logic

Fuzzy natural logic is a mathematical theory modeling terms and rules that come with natural language and allow us to reason and argue in it.

Current constituents of FNL:
• Theory of evaluative linguistic expressions
• Theory of fuzzy/linguistic IF-THEN rules and logical inference (Perception-based Logical Deduction)
• Theory of generalized fuzzy and intermediate quantifiers; generalized Aristotle syllogisms


Evaluative linguistic expressions

Special expressions of natural language using which people evaluate encountered phenomena and processes:

(i) Simple evaluative expressions: small, medium, big; low, medium, high; cheap, medium expensive, expensive; very low, more or less high, very rough increase, extremely high profit

(ii) Fuzzy numbers: twenty thousand, roughly one thousand, about one million

(iii) Compound and negative evaluative expressions: roughly small or medium, high but not very (high)


Semantics of evaluative linguistic expressions

Context (possible world)

w = 〈vL, vS, vR〉,  vL, vS, vR ∈ R

vL — the smallest thinkable value
vS — typical medium value
vR — the largest thinkable value

Example (Context for population of a town)

Czech Republic: 〈3 000, 50 000, 1 000 000〉
USA: 〈30 000, 200 000, 10 000 000〉
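The same number can be evaluated differently in different contexts. The sketch below uses simple linear membership shapes and a Zadeh-style squaring for the hedge "very" — illustrative simplifications, not FNL's exact definitions.

```python
def small(x, w):
    """Degree to which x is 'small' in context w = (vL, vS, vR).
    Linear shape: 1 at vL, falling to 0 at vS (an illustrative
    simplification of FNL's evaluative expressions)."""
    vL, vS, vR = w
    if x <= vL:
        return 1.0
    if x >= vS:
        return 0.0
    return (vS - x) / (vS - vL)

def very(degree):
    """Hedge 'very' modeled as concentration (squaring) — an assumption."""
    return degree ** 2

czech = (3_000, 50_000, 1_000_000)
usa = (30_000, 200_000, 10_000_000)

d_cz = small(30_000, czech)  # only partially small in the Czech context
d_us = small(30_000, usa)    # fully small in the US context
```

A town of 30 000 inhabitants is fully "small" in the US context but only partially "small" in the Czech one.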


Extension of evaluative expressions

[Figure: extensions of Small, Medium, Big in the context 〈0, 4, 10〉, modified by the hedges Extremely, Very, and Roughly; panels (a)–(c) show, e.g., ExSm, VeSm, Small, and Big.]


Semantics of evaluative linguistic expressions

Intension: W −→ F(R)

various contexts ↦ extensions (fuzzy sets in the given context)


Fuzzy/linguistic IF-THEN rules and linguistic description

Fuzzy/linguistic IF-THEN rule

R := IF X is A THEN Y is B

X is A, Y is B — evaluative linguistic predications

A, B: small, very small, roughly medium, extremely big

Linguistic description R:

R1 : IF X is A1 THEN Y is B1
. . . . . . . . . . . . . . . . . . . . . . . .
Rm : IF X is Am THEN Y is Bm


Perception-based logical deduction

Consider a simple linguistic description:

R1 := IF X is small THEN Y is very big
R2 := IF X is very big THEN Y is small

Both rules provide us with certain knowledge. Though they are vague, we may distinguish between them.

Linguistic context of X, Y: 〈0, 0.4, 1〉

"small" ≈ 0.2 (and smaller); perception of X = 0.2
"very big" ≈ 0.8 (and bigger); perception of Y = 0.8

Given a value X = 0.2: we expect Y ≈ 0.8 due to R1 (X is evaluated as small).

Similarly, X = 0.8 leads to Y ≈ 0.2 due to R2.
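A toy rendering of the selection step of PbLD for the two rules above: fire the rule whose antecedent best matches the perception of the observed value, and answer with a typical value of its consequent. The membership shapes and typical values 0.8 / 0.2 are illustrative assumptions, not Novák's exact algorithm.

```python
def small(x):
    """Illustrative 'small' in context <0, 0.4, 1>."""
    return min(1.0, (0.4 - x) / 0.4) if x < 0.4 else 0.0

def very_big(x):
    """Illustrative 'very big' in context <0, 0.4, 1> (squared hedge)."""
    return ((x - 0.4) / 0.6) ** 2 if x > 0.4 else 0.0

RULES = [
    (small,    0.8),  # R1: IF X is small    THEN Y is very big (~0.8)
    (very_big, 0.2),  # R2: IF X is very big THEN Y is small    (~0.2)
]

def pbld(x):
    """Fire the rule whose antecedent best matches the perception of x
    and return a typical value of its consequent."""
    degree, output = max((ante(x), out) for ante, out in RULES)
    return output
```

For X = 0.2 rule R1 wins and Y ≈ 0.8; for X = 0.8 rule R2 wins and Y ≈ 0.2, mirroring the deduction described on the slide.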


Simple demonstration of PbLD

LFLC 2000 (University of Ostrava)


Learning of linguistic description

The linguistic description is learned from the data.
Set the linguistic context:

w = 〈vL, vS, vR〉

Learning procedure — start from numerical data:

X1        X2        · · ·  Xn        Y
999.999   999.999   · · ·  999.999   999.999
. . . . . . . . . . . . . . . . . . . . . . . .
999.999   999.999   · · ·  999.999   999.999


Each value is replaced by an evaluative predication:

X1          X2          · · ·  Xn          Y
X1 is A11   X2 is A21   · · ·  Xn is An1   Y is B1
. . . . . . . . . . . . . . . . . . . . . . . .
X1 is A1m   X2 is A2m   · · ·  Xn is Anm   Y is Bm


The predications are composed into fuzzy/linguistic IF-THEN rules:

IF X1 is A11 AND X2 is A21 AND · · · AND Xn is An1 THEN Y is B1
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
IF X1 is A1m AND X2 is A2m AND · · · AND Xn is Anm THEN Y is Bm
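The learning procedure can be caricatured in a few lines: assign each numeric value its best-fitting evaluative label, then collect the distinct labeled rows as rules. Choosing the label by nearest typical point is a crude stand-in, an assumption of this sketch rather than the actual FNL learning algorithm.

```python
LABELS = {"small": 0.0, "medium": 0.4, "big": 1.0}  # typical points, context <0, 0.4, 1>

def best_label(x):
    """Pick the evaluative label whose typical point lies nearest to x —
    a crude stand-in for choosing the best-fitting evaluative expression."""
    return min(LABELS, key=lambda name: abs(LABELS[name] - x))

def learn_description(rows):
    """rows: (x1, ..., xn, y) tuples -> set of linguistic IF-THEN rules."""
    rules = set()
    for *xs, y in rows:
        rules.add((tuple(best_label(x) for x in xs), best_label(y)))
    return rules

data = [(0.10, 0.90, 0.10), (0.05, 0.95, 0.15), (0.80, 0.10, 0.90)]
rules = learn_description(data)
```

The first two rows collapse into the same rule (small, big ⇒ small), so three data rows yield a two-rule linguistic description.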


Analysis of time series

Decomposition

X(t) = Tr(t) + C(t) + S(t) + R(t, ω), t ∈ T

• Trend component Tr(t)
• Cycle component C(t)
• Seasonal component S(t)
• Random noise (error) R(t, ω); µ ≈ 0, σ > 0
• Trend-cycle TC(t) = Tr(t) + C(t)

Well interpretable — appropriate for alternative methods


Trend and trend-cycle

General OECD specification:

Trend (tendency)
Component of a time series that represents variations of low frequency in a time series, the high and medium frequency fluctuations having been filtered out.

Trend-cycle
Component that represents variations of low frequency in a time series, the high frequency fluctuations having been filtered out — period longer than a chosen threshold (longer than 1 year = business cycle).


Predefined trend-cycle function?

Which function should be used to model the trend-cycle?

May we afford to forecast trend values just by prolongation ofthe chosen trend-cycle function?


Trend-cycle using inverse F-transform

Assumption

X(t) = TC(t) + Σ_{j=1}^{r} P_j e^{i(λ_j t + φ_j)} + R(t),  where the sum is the seasonal component S(t)

Frequencies: λ_i, i = 1, …, r
Periodicities: T_i = 2π/λ_i
Random noise is a stationary stochastic process R(t) = ξ e^{i(λt+φ)}, ξ ∈ R

Apply the F-transform to X:

F[X] = F[TC] + F[S] + F[R]


We consider the interval [0, b] with h-equidistant nodes

c_0 = 0, c_1 = dT, …, c_{k−1} = (k − 1)dT, c_k = kdT, c_{k+1} = (k + 1)dT, …, c_n = ndT = b

where h = dT and λ = 2πd/h.


Trend and trend-cycle using inverse F-transform

Theorem
If we set the distance h between nodes properly (h = dT), then with D → 0:
• X̂(t) = TC(t) + D (T — the longest seasonal periodicity)
• X̂(t) = Tr(t) + D (T — the longest cyclic periodicity)

The F-transform makes it possible to extract the trend-cycle as well as the trend with high fidelity.
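The cancellation the theorem states can be seen numerically: with triangular basic functions whose half-width equals the seasonal period, the weighted averages annihilate the sinusoidal component and return the trend. The series below (linear trend plus one seasonal sinusoid of period 20) is my own toy example.

```python
import math

def ft_trend(x, h):
    """Zeroth-degree F-transform components of the series x with triangular
    basic functions of half-width h centered at nodes 0, h, 2h, ...
    Choosing h equal to the longest seasonal period (nearly) cancels
    the seasonal component, as the theorem above states."""
    nodes, comps = [], []
    c = 0
    while c < len(x):
        num = den = 0.0
        for t in range(max(0, c - h), min(len(x), c + h + 1)):
            w = 1.0 - abs(t - c) / h
            if w > 0:
                num += w * x[t]
                den += w
        nodes.append(c)
        comps.append(num / den)
        c += h
    return nodes, comps

T = 20  # seasonal period (illustrative)
xs = [0.01 * t + math.sin(2 * math.pi * t / T) for t in range(201)]
nodes, comps = ft_trend(xs, T)
# interior components recover the linear trend 0.01 * t almost exactly
```

At interior nodes the sinusoid cancels by symmetry of the triangular weights, so the components sit on the line 0.01·t.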


Trend-cycle using inverse F-transform

Artificial time series

X(t) = TC(t) + 5 sin(0.63t + 1.5) + 5 sin(1.26t + 0.35) + 15 sin(2.7t + 1.12) + 7 sin(0.41t + 0.79) + R(t)

Trend-cycle: given by artificial data without clear periodicity.
Other periodicities: T_1 = 10, T_2 = 5, T_3 = 2.3, T_4 = 15.4.

T = T_4, d = 1, i.e. h = 15; width of basic functions: 2h = 30.

R(t) — random noise with average µ = 0.1


Trend-cycle using inverse F-transform

Artificial time series

[Figure: dashed — original trend-cycle; solid — inverse F-transform of X.]


Trend-cycle using inverse F-transform

Estimation of trend-cycle


Trend-cycle using inverse F-transform

Estimation of trend


Forecasting methodology

The data are split into in-samples/out-samples

[ Learning set | Validation set | Testing set ]

In-samples = learning set TL + validation set TV

Out-samples = testing set TT

T = TL ∪ TV ∪ TT
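The chronological split T = TL ∪ TV ∪ TT can be sketched directly; the 60/20/20 fractions below are illustrative defaults, not prescribed by the methodology.

```python
def split_series(x, val_frac=0.2, test_frac=0.2):
    """Chronological split: learning + validation = in-samples,
    testing = out-samples. The fractions are illustrative defaults."""
    n = len(x)
    n_test = int(n * test_frac)
    n_val = int(n * val_frac)
    learning = x[: n - n_val - n_test]
    validation = x[n - n_val - n_test : n - n_test]
    testing = x[n - n_test :]
    return learning, validation, testing

learning, validation, testing = split_series(list(range(100)))
```

The split is by position, never random: out-samples always lie after the in-samples in time.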


Forecasting methodology

Symmetric Mean Absolute Percent Error

SMAPE = (100/|T|) Σ_{t∈T} |X(t) − X̂(t)| / ((|X(t)| + |X̂(t)|)/2)

SMAPE ∈ [0, 200]

The best trend-cycle and seasonal predictor is chosen according to the SMAPE error on the validation set TV.
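A direct sketch of the error measure (scaled to percent so that values lie in [0, 200], consistent with the range stated above); the zero-denominator convention is my assumption.

```python
def smape(actual, forecast):
    """Symmetric mean absolute percentage error; values lie in [0, 200]."""
    total = 0.0
    for x, f in zip(actual, forecast):
        denom = (abs(x) + abs(f)) / 2
        total += abs(x - f) / denom if denom else 0.0  # 0/0 counted as 0
    return 100 * total / len(actual)

err = smape([100, 200, 300], [110, 190, 300])
```

A perfect forecast gives 0; a forecast of the wrong sign or of a zero value saturates at 200 for that point.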


Forecasting trend-cycle (autoregression)

• Compute the F-transform components F_1[X], …, F_n[X] and their differences ∆F_i[X], ∆²F_i[X]
• Learn a linguistic description consisting of rules such as the following:

IF ∆F_{i−1}[X] is A∆_{i−1} AND F_i[X] is A_i THEN ∆F_{i+1}[X] is B

The learned linguistic description is used for forecasting of the F-transform components

F̂[X] = (F̂_{n+1}[X], …, F̂_{n+ℓ}[X])

The forecast of X(t), t ∈ TV (or t ∈ TT) is computed from F̂[X].


1 Simple prediction: The next (difference of) component F_{i+1}[X] is predicted from the previous (differences of) components F_i[X], F_{i−1}[X], …, F_{i−k}[X] using one linguistic description

2 Independent models: Each following (difference of) component F_{n+1}[X], …, F_{n+ℓ}[X] is predicted from (differences of) components F_n[X], F_{n−1}[X], …, F_{n−k}[X] using different linguistic descriptions


Forecasting seasonal component (autoregression)

• Form θ vectors of the seasonal component:

S_j = (S((j − 1)p + 1), S((j − 1)p + 2), …, S(jp))

p — period of seasonality (e.g., p = 12), j ∈ {1, …, θ}

• Find an optimal solution of the system of equations

S_θ = Σ_{j=1}^{θ−1} d_j · S_{θ−j}

• Use the coefficients d_1, …, d_{θ−1} to determine future vectors S_{θ+1}, …

Other possibility: neural networks
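A deliberately reduced sketch: instead of solving the full system above, it fits the one-lag special case S_θ ≈ d·S_{θ−1} by least squares and extrapolates one period ahead. The synthetic seasonal series (a sinusoid growing 2 % per period) is my own example.

```python
import math

def lag1_coefficient(S_prev, S_curr):
    """Least-squares d with S_curr ~ d * S_prev — a one-lag special case of
    the system S_theta = sum_j d_j * S_{theta-j}."""
    num = sum(a * b for a, b in zip(S_prev, S_curr))
    den = sum(a * a for a in S_prev)
    return num / den

p = 12                                        # period of seasonality (monthly data)
s = [math.sin(2 * math.pi * t / p) * 1.02 ** (t // p) for t in range(3 * p)]
S = [s[j:j + p] for j in range(0, 3 * p, p)]  # theta = 3 seasonal vectors

d = lag1_coefficient(S[1], S[2])              # recovers the 2% growth per period
S_next = [d * v for v in S[2]]                # forecast of S_4
```

With more periods available, the same least-squares idea extends to several lags d_1, …, d_{θ−1}.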


Monthly gasoline demand in Ontario in 1960–1975

Measured by SMAPE; forecast horizon = 18 months.

Validation set error: 3.38% ; testing set error: 3.04%


Monthly gasoline demand in Ontario in 1960–1975
context = 〈86 800, 154 000, 256 000〉; partition period = 24 months

Generated linguistic description

Nr. | F_i[X] | ∆²F_i[X] | ⇒ ∆F_{i+1}[X]
 1  | ze     | -ml me   | ⇒ si sm
 2  | ve sm  | sm       | ⇒ sm
 3  | sm     | sm       | ⇒ ve sm
 4  | ml sm  | -si sm   | ⇒ vr sm
 5  | qr sm  | vr bi    | ⇒ sm
 6  | ml me  | -me      | ⇒ me
 7  | ml me  | qr bi    | ⇒ ro sm
 8  | vr bi  | -me      | ⇒ sm
 9  | qr bi  | -ml sm   | ⇒ ro bi
10  | ml bi  | ex bi    | ⇒ qr bi
11  | bi     | -ex sm   | ⇒ si bi


Monthly gasoline demand in Ontario in 1960–1975
context = 〈86 800, 154 000, 256 000〉; partition period = 24 months

Reading of rule 3 (sm, sm ⇒ ve sm):
If average gasoline demand is SMALL in the previous two years and the average acceleration is SMALL four years ago, then the average change in the upcoming two years will be VERY SMALL.


Mining information from time series

1 Reduction of dimensionality
2 Perceptionally important points
3 Linguistic evaluation of the direction of local and global trend; linguistic characterization of the future course of TC or Tr
4 Finding intervals of monotonous behavior
5 Linguistic summarization of information about time series and syllogistic reasoning


Reduction of dimensionality

Replace the original X by the reduced one

X [h] = β01 , . . . , β

0n−1.

Advantages:
1 The dimension of X[h] is significantly lower (it depends on h)
2 X[h] fits the shape of the original time series X well and keeps all its salient points
3 X[h] is free of all frequencies higher than λ_q
4 The variance of the noise of X[h] is lower than the variance of the original noise R(t, ω)
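The reduction itself can be sketched in a few lines. This is an illustrative discrete F⁰-transform with a uniform triangular partition; the node spacing equal to h and the toy series are assumptions, not the authors' implementation:

```python
# Illustrative discrete F0-transform with a uniform triangular fuzzy
# partition. Each component beta0_k is the A_k-weighted mean of the series
# around the node c_k, so the reduced series X[h] is much shorter than X.

def f0_transform(x, h):
    nodes = range(0, len(x), h)                  # uniform nodes c_k (assumed spacing h)
    betas = []
    for c in nodes:
        num = den = 0.0
        for t in range(max(0, c - h), min(len(x), c + h + 1)):
            a = max(0.0, 1.0 - abs(t - c) / h)   # triangular membership A_k(t)
            num += x[t] * a
            den += a
        betas.append(num / den)
    return betas

x = [float(t % 12) for t in range(48)]           # toy seasonal series
reduced = f0_transform(x, 6)
print(len(x), "->", len(reduced))                # 48 -> 8
```

Here a series of 48 values collapses to 8 components; the inverse F⁰-transform (not shown) would reconstruct a smoothed shape from them.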



Reduction of dimensionality

(Figure: the original time series of 186 values and its inverse F⁰-transform for h = 3; the reduced time series with components β⁰_1, . . . , β⁰_62.)



Estimation of local trend

β¹_k = ∫_{c_(k−1)}^{c_(k+1)} X(t)(t − c_k) A_k(t) dt  /  ∫_{c_(k−1)}^{c_(k+1)} (t − c_k)² A_k(t) dt

Estimation of the first derivative of X
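A discrete counterpart of this formula can be sketched as a weighted least-squares slope; the discretization and the window handling are assumptions:

```python
# Sketch of a discrete version of the formula above: beta1_k is a weighted
# least-squares slope of X around the node c_k, i.e. an estimate of the
# first derivative of X there.

def local_trend(x, c, h):
    num = den = 0.0
    for t in range(max(0, c - h), min(len(x), c + h + 1)):
        a = max(0.0, 1.0 - abs(t - c) / h)   # triangular A_k(t)
        num += x[t] * (t - c) * a
        den += (t - c) ** 2 * a
    return num / den

x = [2.0 * t + 1.0 for t in range(20)]       # exactly linear series
print(local_trend(x, 10, 4))                 # recovers the slope 2.0
```

On an exactly linear series the estimate recovers the true slope, which is a quick sanity check for the formula.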


Perceptionally important points

F-transform provides estimation of derivatives

Main idea of the algorithm

Find periods consisting of several components β¹_k, . . . with the same sign, followed by one or more components β¹_j that have the opposite sign and are sufficiently large.


Perceptionally important points

(i) Set the fuzzy partition: h = d·T_k for some d ∈ N and k < q. Position of the nodes: the sum ∑_{k=1}^{n−1} |β¹_k| is minimal.

(ii) Find the positions of potential perceptionally important points: the nodes c_k with the highest |β²_k|, where

β²_k = ∫_{c_(k−1)}^{c_(k+1)} X(t)((t − c_k)² − I_2) A_k(t) dt  /  ∫_{c_(k−1)}^{c_(k+1)} ((t − c_k)² − I_2)² A_k(t) dt,

I_2 = (1/h) ∫_{c_(k−1)}^{c_(k+1)} (t − c_k)² A_k(t) dt

(iii) PIP: several (at least two) components β¹_k, β¹_(k+1), . . . have the same sign, followed by one or more components β¹_j that have the opposite sign and/or are sufficiently large.
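Step (iii) can be sketched as a scan over the slope components; the thresholds min_run and min_jump are hypothetical parameters standing in for "sufficiently large":

```python
# Sketch of step (iii) with hypothetical thresholds: scan the slope
# components beta1 and report the node index where a run of at least
# min_run components of one sign is followed by a sufficiently large
# component of the opposite sign -- a candidate perceptionally
# important point.

def find_pips(beta1, min_run=2, min_jump=0.5):
    pips = []
    run = 1
    for k in range(1, len(beta1)):
        if beta1[k] * beta1[k - 1] > 0:      # same sign: the run continues
            run += 1
        else:                                # sign change: a possible turn
            if run >= min_run and abs(beta1[k]) >= min_jump:
                pips.append(k)
            run = 1
    return pips

slopes = [0.4, 0.6, 0.5, -0.9, -0.8, 0.2, 0.7, 0.6, -1.1]
print(find_pips(slopes))                     # [3, 8]
```

The small sign change at index 5 is ignored because its magnitude is below min_jump, while the two large turns are reported.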


Perceptionally important points

Example:


Linguistic evaluation of the direction of trend

Automatically generated linguistic expressions

negligibly decreasing, stagnating, slightly decreasing, somewhat decreasing

Trend of the (whole) time series is stagnating


Linguistic evaluation of the direction of trend

Combination of F1-transform and fuzzy natural logic

Local perception LPerc : R×W −→ EvExpr

Linguistic context w_tg = 〈0, vS, vR〉 (or (−1)·w_tg = 〈−vR, −vS, 0〉)

Ev := LPerc(β¹[X|T], w_tg) ↦ 〈special hedge〉 Inc/Dec

Linguistic evaluation of the direction of trend

Trend of X in period T is 〈special hedge〉 Inc/Dec.

Example

Trend of X in period T is slightly increasing
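A minimal sketch of such a mapping follows; the hedge boundaries below are illustrative assumptions, not the FNL definition of LPerc:

```python
# Minimal sketch of the local perception LPerc; the hedge boundaries are
# hypothetical, not the FNL definition. The slope beta1 is evaluated within
# the linguistic context w_tg = <0, vS, vR> for the tangent.

def lperc(beta1, vS, vR):
    direction = "increasing" if beta1 >= 0 else "decreasing"
    m = abs(beta1)
    if m <= 0.1 * vS:                  # negligible slope
        return "stagnating"
    if m <= vS:                        # small slope
        return "slightly " + direction
    if m <= 0.5 * (vS + vR):           # medium slope
        return "roughly " + direction
    return "sharply " + direction      # large slope

print(lperc(0.3, vS=1.0, vR=10.0))    # slightly increasing
print(lperc(-7.0, vS=1.0, vR=10.0))   # sharply decreasing
```

The context 〈0, vS, vR〉 fixes what counts as a small and an extreme slope, so the same numeric trend can be evaluated differently in different contexts.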


Linguistic evaluation of the direction of trend

Trend of the whole series is stagnating.

Trend of the time series:
in Slot 1 (time 41–63) is negligibly increasing
in Slot 2 (time 66–114) is slightly decreasing

Future trend of the time series is clearly decreasing.
(Verification of the real course: clear decrease.)


Finding intervals of monotonous behavior

Special algorithm combining F-transform and methods of FNL

interval  Ev[X|T_i]          interval  Ev[X|T_i]
T1        clear increase     T6        huge decrease
T2        somewhat increase  T7        clear increase
T3        huge increase      T8        clear decrease
T4        huge decrease      T9        clear decrease
T5        huge increase


Linguistic comments to time series

Example
• In the first (second, third, fourth) quarter of the year, the trend was slightly decreasing (increasing, stagnating).
• The longest period of a stagnating (increasing, slightly decreasing) trend lasted for T time moments (days, weeks, months, etc.), where T is the longest of the found intervals T1, . . . , Ts in which the trend of X is stagnating (sharply increasing, slightly decreasing, etc.).
• In the last quarter of the year, the trend was decreasing, but its forecast for the next quarter is a clear increase.


Summarization

Goal
Summarize knowledge about one or more time series. Application of the formal theory of intermediate quantifiers (part of FNL).

Examples of intermediate quantifiers: Most, Many, Almost all, Few, A large part of


Summarizing information about time series

One time series. Typical summarizing proposition:

Example: In most (a few, many) time intervals, the trend of the time series X was slightly increasing (stagnating, sharply decreasing).

Formalization: type 〈1〉 quantifier

(Q^∀_(Bi Ve) T)((SlInc X) wT)
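The truth degree of such a proposition can be sketched by modeling the quantifier as a fuzzy set over the proportion of time intervals in which the evaluation holds; the piecewise-linear shape of "most" below is an illustrative assumption, not the exact FNL formalization:

```python
# Sketch assuming a simple fuzzy-set model of the intermediate quantifier
# "most" over the proportion of intervals in which an evaluation holds; the
# piecewise-linear shape is an illustrative assumption, not the exact FNL
# formalization.

def most(r):
    """Truth degree of 'most' as a function of the proportion r."""
    if r <= 0.5:
        return 0.0
    if r >= 0.9:
        return 1.0
    return (r - 0.5) / 0.4

def truth_most(evaluations, target):
    r = sum(1 for e in evaluations if e == target) / len(evaluations)
    return most(r)

evals = ["slightly increasing"] * 7 + ["stagnating"] * 3
print(truth_most(evals, "slightly increasing"))   # proportion 0.7 -> about 0.5
```

Among a set of candidate summarizing formulas, one would then keep the one with the highest truth degree, as described below.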


Summarizing information about time series

Result: set of formulas formalizing propositions, for example

In most (many, few) cases, the time series was stagnating (sharply decreasing, roughly increasing)

Choose the formula that has the highest truth degree and the highest level of steepness

Alternative: Use the function of local perception LPerc


Linguistic summarization of information about time series

Many time series:

Example
• Most (many, few) analyzed time series stagnated recently, but their future trend is slightly increasing.
• There is evidence of a huge (slight, clear) decrease of the trend of almost all time series in the recent quarter of the year.


Summarization — syllogistic reasoning

Example (Valid generalized Aristotle’s syllogism)

In few cases the increase of time series is not small

In many cases the increase of time series is clear
In few cases the clear increase of time series is not small

The validity of over 105 generalized syllogisms with intermediate quantifiers was mathematically proved (Murinová, Novák).


Summarization — syllogistic reasoning

Example (Valid generalized Aristotle’s syllogisms)

All carefully inspected time series are financial
Most available time series are carefully inspected
Most available time series are financial

In no period there was an increasing TS from the car industry
In most periods there was an inspected TS from the car industry
In some periods, the inspected time series was not increasing

Most financial TS were sharply decreasing last month
All TS sharply decreasing last month are now stagnating
Some time series that are now stagnating are financial


Conclusions

Fuzzy Transform and Fuzzy Natural Logic applied in the elaboration of time series:
• Analysis of time series; estimation of the trend-cycle and trend
• Forecasting of time series from a learned linguistic description using the PbLD method
• Mining information from time series:
    • Reduction of dimensionality
    • Perceptionally important points
    • Finding intervals of monotonous behavior
    • Linguistic evaluation of the direction of the local and global trend; linguistic characterization of the future course of TC or Tr
    • Linguistic summarization of information about time series and syllogistic reasoning
