
Page 1: Theory of Computations III CS-6800 |SPRING -2014

Theory of Computations III
CS-6800 | Spring 2014

Page 2: Theory of Computations III CS-6800 |SPRING -2014

Markov Models

By

Sandeep Ravikanti

Page 3: Theory of Computations III CS-6800 |SPRING -2014

Contents

Introduction

Definition

Examples

Types of Markov Models

Applications

References

Page 4: Theory of Computations III CS-6800 |SPRING -2014

Introduction

Origin

Mid 20th cent.: named after Andrei A. Markov (1856–1922), Russian mathematician.

Markov model

It’s a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.

Page 5: Theory of Computations III CS-6800 |SPRING -2014

Formal Definition

A Markov model is a nondeterministic finite state machine (NDFSM).

At each step, the next state is chosen according to a probability distribution that depends only on the current state.

Steps usually correspond to time intervals, but they can correspond to any ordered, discrete sequence.

Formally, a Markov model is a triple

M = (K, π, A), where

K is a finite set of states,

π is a vector that contains the initial probability of each state, and

A is a matrix of transition probabilities:

A[p, q] = Pr(state q at time t | state p at time t - 1)


Page 6: Theory of Computations III CS-6800 |SPRING -2014

Examples of Markov Models

A simple Markov model for the weather: the weather on day T is determined by the weather on day T-1.

The transition matrix for the weather model is

M = [ 0.75  0.25 ]
    [ 0.70  0.30 ]

Each row of the transition matrix sums to 1.0, since it is a stochastic (transition) matrix.

[State diagram: Sunny and Rainy states; transitions Sunny→Sunny 0.75, Sunny→Rainy 0.25, Rainy→Sunny 0.7, Rainy→Rainy 0.3; initial probabilities π(Sunny) = 0.4, π(Rainy) = 0.6]

Page 7: Theory of Computations III CS-6800 |SPRING -2014

Markov Model for the Weather

Probability of being sunny for five days in a row:
Sunny, Sunny, Sunny, Sunny, Sunny
0.4 * 0.75^4 ≈ 0.1266

Probability of being rainy for five days in a row:
Rainy, Rainy, Rainy, Rainy, Rainy
0.6 * 0.3^4 ≈ 0.0049

Given that it is sunny today, the probability of staying sunny for four more days:
0.75^4 ≈ 0.316
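As a minimal Python sketch (using the initial probabilities and transition matrix from the previous slide; the function name run_probability is purely illustrative), the computations above can be reproduced as follows:

```python
# Weather Markov model from the slides: initial probabilities and transitions.
pi = {"Sunny": 0.4, "Rainy": 0.6}                      # initial probabilities
A = {"Sunny": {"Sunny": 0.75, "Rainy": 0.25},          # transition probabilities
     "Rainy": {"Sunny": 0.70, "Rainy": 0.30}}

def run_probability(state, days):
    """Probability of starting in `state` and staying there for `days` days in a row."""
    return pi[state] * A[state][state] ** (days - 1)

print(run_probability("Sunny", 5))   # 0.4 * 0.75**4 ≈ 0.1266
print(run_probability("Rainy", 5))   # 0.6 * 0.30**4 ≈ 0.0049
print(A["Sunny"]["Sunny"] ** 4)      # ≈ 0.316: staying sunny for four more days
```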

Page 8: Theory of Computations III CS-6800 |SPRING -2014

Modelling with Markov Models

Link structure of the World Wide Web.

Performance of a system.

Population genetics.

Types of Markov Models


Page 9: Theory of Computations III CS-6800 |SPRING -2014

Markov chain

• Defined as the sequence of outputs produced by a Markov model.

• The simplest type of Markov model.

• The system state at time t+1 depends only on the state at time t.

• A Markov chain describing a random process helps in answering the probability of a sequence of states s1, s2, s3, ..., sn: multiply the initial probability of s1 by the probability of each transition,

Pr(s1 s2 ... sn) = π[s1] * ∏ (i = 2 to n) A[s(i-1), s(i)],

and the result holds for arbitrarily long sequences of steps. A sketch follows below.
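A minimal sketch of this computation in Python (assuming states are strings and pi / A are dictionaries, as in the weather example earlier; sequence_probability is an illustrative name):

```python
def sequence_probability(seq, pi, A):
    """Pr(s1 s2 ... sn) = pi[s1] * A[s1, s2] * A[s2, s3] * ... * A[s(n-1), sn]."""
    p = pi[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        p *= A[prev][cur]
    return p

# Example: five sunny days in a row with the weather model defined earlier.
pi = {"Sunny": 0.4, "Rainy": 0.6}
A = {"Sunny": {"Sunny": 0.75, "Rainy": 0.25},
     "Rainy": {"Sunny": 0.70, "Rainy": 0.30}}
print(sequence_probability(["Sunny"] * 5, pi, A))   # ≈ 0.1266
```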


Page 10: Theory of Computations III CS-6800 |SPRING -2014

Markov Chain for Predicting the Weather

The matrix M represents the weather model; it is a transition matrix:

M = [ 0.3  0.7 ]
    [ 0.6  0.4 ]

The weather on day 0 is known to be sunny, so its vector representation is

X(0) = [1 0]

The weather on day 1 can be predicted by:

X(1) = X(0) M = [1 0] M = [0.3 0.7]

The weather on day 2 can be predicted by:

X(2) = X(1) M = X(0) M^2 = [0.51 0.49]

The general rule for day n:

X(n) = X(n-1) M = X(0) M^n

[State diagram: Sunny and Rainy states]
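A small Python sketch of this propagation (assuming NumPy; the transition matrix values are the ones reconstructed above from the slide's results):

```python
import numpy as np

M = np.array([[0.3, 0.7],     # transition matrix used on this slide
              [0.6, 0.4]])
x0 = np.array([1.0, 0.0])     # day 0: known to be sunny -> [P(Sunny), P(Rainy)] = [1, 0]

def weather_on_day(n):
    """X(n) = X(0) * M^n, the distribution over {Sunny, Rainy} on day n."""
    return x0 @ np.linalg.matrix_power(M, n)

print(weather_on_day(1))   # [0.3  0.7 ]
print(weather_on_day(2))   # [0.51 0.49]
```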

Page 11: Theory of Computations III CS-6800 |SPRING -2014

Properties of Markov Chains

Reducibility: accessible, essential (final), irreducible

Periodicity

Recurrence: transient, recurrent

Ergodicity

[Example state diagram with states A, B, C, D]

Page 12: Theory of Computations III CS-6800 |SPRING -2014

Hidden Markov Model

Defined as a nondeterministic finite-state transducer.

The states of the system are not directly observable.

Characterized by two key properties:

They are Markov models: their state at time t is a function solely of their state at time t-1.

The actual progression of the machine is hidden; only the output string is observed.
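As an illustrative sketch only (the emission probabilities B below are invented for the example and are not from the slides), a hidden Markov model pairs the weather chain with outputs, and only the outputs are observed:

```python
import random

A = {"Sunny": {"Sunny": 0.75, "Rainy": 0.25},        # hidden-state transitions
     "Rainy": {"Sunny": 0.70, "Rainy": 0.30}}
B = {"Sunny": {"walk": 0.8, "umbrella": 0.2},        # emission probabilities (made up)
     "Rainy": {"walk": 0.1, "umbrella": 0.9}}
pi = {"Sunny": 0.4, "Rainy": 0.6}

def sample_outputs(n):
    """Emit n observable symbols; the underlying state sequence stays hidden."""
    state = random.choices(list(pi), weights=list(pi.values()))[0]
    outputs = []
    for _ in range(n):
        outputs.append(random.choices(list(B[state]), weights=list(B[state].values()))[0])
        state = random.choices(list(A[state]), weights=list(A[state].values()))[0]
    return outputs

print(sample_outputs(5))   # e.g. ['walk', 'umbrella', 'walk', 'walk', 'umbrella']
```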

Page 13: Theory of Computations III CS-6800 |SPRING -2014

Markov Decision Process

• It is a Markov chain in which state transitions depend on the current state and on an action vector that is applied to the system.

• Used to compute a policy of actions that will maximize some measure of utility.

• Can be solved with value iteration and related methods; a sketch follows below.
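A minimal value-iteration sketch (the two-state MDP below, its actions, rewards, and discount factor are invented for illustration and are not from the slides):

```python
# P[s][a] = list of (next_state, probability); R[s][a] = immediate reward.
P = {"low":  {"wait":   [("low", 1.0)],
              "charge": [("high", 0.9), ("low", 0.1)]},
     "high": {"wait":   [("high", 0.8), ("low", 0.2)],
              "work":   [("high", 0.6), ("low", 0.4)]}}
R = {"low":  {"wait": 0.0, "charge": -1.0},
     "high": {"wait": 1.0, "work": 3.0}}
gamma = 0.9   # discount factor

def q(s, a, V):
    """Expected return of taking action a in state s and then following V."""
    return R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a])

# Value iteration: repeatedly back up V(s) = max_a Q(s, a) until it stabilizes.
V = {s: 0.0 for s in P}
for _ in range(200):
    V = {s: max(q(s, a, V) for a in P[s]) for s in P}

policy = {s: max(P[s], key=lambda a: q(s, a, V)) for s in P}
print(V, policy)
```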

Partially Observable Markov Decision Process

• It is a Markov decision process in which the state of the system is only partially observed.

• Solving POMDPs exactly is known to be computationally hard (NP-hard).

• Recent approximation techniques have made them useful for a variety of applications, such as controlling simple agents or robots.

Page 14: Theory of Computations III CS-6800 |SPRING -2014

Applications

Physics

Chemistry

Testing

Information sciences

Queueing theory

Internet applications

Statistics

Economics and finance

Social science

Mathematical biology

Games

Music

Page 15: Theory of Computations III CS-6800 |SPRING -2014

References

Elaine A. Rich, Automata, Computability and Complexity: Theory and Applications, 1st edition.

Wikipedia, "Markov model" [Online]. Available: http://en.wikipedia.org/wiki/Markov_chain [Accessed 28 Oct 2012].

P. Xinhui Zhang, "DT Markov Chains" [Online]. Available: http://www.wright.edu/~xinhui.zhang/ [Accessed 28 Oct 2012].

http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/chapter11.pdf

Page 16: Theory of Computations III CS-6800 |SPRING -2014

Questions…..?

Page 17: Theory of Computations III CS-6800 |SPRING -2014