Theory of Computation III, CS-6800 | Spring 2014
Markov Models
By
Sandeep Ravikanti
Contents
• Introduction
• Definition
• Examples
• Types of Markov Models
• Applications
• References
Introduction
Origin
Mid 20th cent.: named after Andrei A. Markov (1856–1922), Russian mathematician.
Markov model
It's a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.

Formal Definition
A Markov model is an NDFSM (nondeterministic finite-state machine).
At each step, the next state is chosen according to a probability distribution conditioned on the current state.
Steps usually correspond to time intervals, but they can also correspond to any ordered discrete sequence.
Formally, a Markov model is a triple

M = (K, π, A)

• K is a finite set of states.
• π is a vector containing the initial probability of each state.
• A is a matrix of transition probabilities:

A[p, q] = Pr(state q at time t | state p at time t-1)
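Written as data, the triple is tiny. A minimal sketch in Python, using the sunny/rainy numbers from the weather example that follows (the state ordering and the exact rainy row are assumptions):

```python
# The triple M = (K, pi, A) for a two-state weather model:
K  = ["Sunny", "Rainy"]                 # finite set of states
pi = [0.4, 0.6]                         # initial probability of each state
A  = [[0.75, 0.25],                     # A[p][q] = Pr(q at time t | p at time t-1)
      [0.3,  0.7]]                      # rainy row is an assumption

# pi and every row of A must be probability distributions.
assert abs(sum(pi) - 1.0) < 1e-9
for row in A:
    assert abs(sum(row) - 1.0) < 1e-9
print("well-formed Markov model over states:", K)
```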
Examples of Markov Models

A simple Markov model for weather: the weather on day T is determined by the weather on day T-1. The transition matrix for the weather model is

M = | 0.75 0.25 |
    | 0.3  0.7  |

with rows and columns ordered (Sunny, Rainy). Each row of the transition matrix sums to 1.0, since it is a transition matrix. The initial probabilities are π(Sunny) = 0.4 and π(Rainy) = 0.6.

[State diagram: Sunny stays sunny with probability 0.75 and turns rainy with 0.25; Rainy stays rainy with 0.7 and turns sunny with 0.3.]
Markov Model for Weather

Probability of being sunny for five days in a row (Sunny, Sunny, Sunny, Sunny, Sunny):
0.4 * 0.75^4 ≈ 0.1266

Probability of being rainy for five days in a row (Rainy, Rainy, Rainy, Rainy, Rainy):
0.6 * 0.7^4 ≈ 0.1441

Given it is sunny today, the probability of staying sunny for four more days:
0.75^4 ≈ 0.316
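These run probabilities are straightforward to check numerically. A quick sketch, assuming self-loop probabilities of 0.75 (sunny) and 0.7 (rainy) as in the transition matrix:

```python
# Probability of the same weather for `days` consecutive days:
# P(start in state) * P(stay in state)^(days - 1)
p_init = {"Sunny": 0.4, "Rainy": 0.6}
p_stay = {"Sunny": 0.75, "Rainy": 0.7}   # assumed self-loop probabilities

def run_probability(state, days):
    return p_init[state] * p_stay[state] ** (days - 1)

print(round(run_probability("Sunny", 5), 4))   # 0.4 * 0.75^4 = 0.1266
print(round(0.75 ** 4, 3))                     # sunny today, four more sunny days: 0.316
```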
Modelling with Markov Models
• Link structure of the World Wide Web.
• Performance of a system.
• Population genetics.
Types of Markov Models
Markov chain
• Defined as the sequence of outputs produced by a Markov model.
• The simplest Markov model.
• The system's state at time t+1 depends only on its state at time t.
• A Markov chain describing a random process lets us answer: what is the probability of a sequence of states s1, s2, s3, …, sn?
• Compute it by multiplying the probability of s1 by the probability of each transition:

Pr(s1 s2 … sn) = π(s1) * ∏ (i = 2 to n) A[s(i-1), s(i)]

• This gives the result for arbitrarily long sequences of steps.
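The product formula can be sketched directly in a few lines. A minimal sketch, using the sunny/rainy numbers from the earlier example as the model (the rainy row is an assumption):

```python
# Pr(s1 s2 ... sn) = pi(s1) * product over i = 2..n of A[s_{i-1}, s_i]
pi = {"Sunny": 0.4, "Rainy": 0.6}
A = {("Sunny", "Sunny"): 0.75, ("Sunny", "Rainy"): 0.25,
     ("Rainy", "Sunny"): 0.3,  ("Rainy", "Rainy"): 0.7}   # assumed rainy row

def sequence_probability(states):
    p = pi[states[0]]                        # probability of starting in s1
    for prev, cur in zip(states, states[1:]):
        p *= A[(prev, cur)]                  # one factor per transition
    return p

print(round(sequence_probability(["Sunny"] * 5), 4))   # 0.4 * 0.75^4 = 0.1266
```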
Markov Chain for Predicting the Weather

The matrix M represents the weather model; M is a transition matrix with rows and columns ordered (Sunny, Rainy):

M = | 0.3 0.7 |
    | 0.6 0.4 |

Weather on day "0" is known to be sunny. Vector representation of "sunny":
X(0) = [1 0]

Weather on day "1" can be predicted by:
X(1) = X(0) M = [1 0] M = [0.3 0.7]

Weather on day "2" can be predicted by:
X(2) = X(1) M = X(0) M^2 = [1 0] M^2 = [0.51 0.49]

General rule for day "n":
X(n) = X(n-1) M = X(0) M^n

[State diagram: Sunny and Rainy with the transition probabilities of M.]
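The day-by-day computation can be replayed in a few lines; here M is the two-state matrix whose rows follow from the results X(1) = [0.3, 0.7] and X(2) = [0.51, 0.49]:

```python
# Evolve the distribution x(n) = x(n-1) M, starting from x(0) = [1, 0] (sunny).
M = [[0.3, 0.7],
     [0.6, 0.4]]

def step(x):
    """Row vector times matrix: x_next[j] = sum_i x[i] * M[i][j]."""
    return [sum(x[i] * M[i][j] for i in range(2)) for j in range(2)]

x = [1.0, 0.0]
for day in (1, 2):
    x = step(x)
    print("day", day, [round(v, 2) for v in x])
# day 1 [0.3, 0.7]
# day 2 [0.51, 0.49]
```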
Properties of Markov Chains
• Reducibility: accessible, essential (final), irreducible
• Periodicity
• Recurrence: transient, recurrent
• Ergodicity

[Diagram: an example chain with states A, B, C, D.]
Hidden Markov Model
• Defined as a nondeterministic finite-state transducer.
• The states of the system are not directly observable.
• Characterized by two key properties:
They are Markov models: their state at time t is a function solely of their state at time t-1.
The actual progression of the machine is hidden; only the output string is observed.
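Since only the output string is observed, the probability of an observation sequence is computed by summing over all hidden state paths (the forward algorithm). A minimal sketch; the states, transition probabilities, and emission probabilities below are illustrative assumptions, not from the slides:

```python
# Forward algorithm: probability of an observed output string, summed over
# all hidden state paths. All numbers here are illustrative assumptions.
states = ["Hot", "Cold"]
pi = {"Hot": 0.5, "Cold": 0.5}                      # initial distribution
A = {("Hot", "Hot"): 0.7, ("Hot", "Cold"): 0.3,     # hidden-state transitions
     ("Cold", "Hot"): 0.4, ("Cold", "Cold"): 0.6}
B = {("Hot", "sun"): 0.8, ("Hot", "ice"): 0.2,      # emission probabilities
     ("Cold", "sun"): 0.3, ("Cold", "ice"): 0.7}

def forward(observations):
    # alpha[s] = P(observations so far, current hidden state = s)
    alpha = {s: pi[s] * B[(s, observations[0])] for s in states}
    for obs in observations[1:]:
        alpha = {s: B[(s, obs)] * sum(alpha[p] * A[(p, s)] for p in states)
                 for s in states}
    return sum(alpha.values())

print(round(forward(["sun", "sun", "ice"]), 4))     # 0.1267
```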
Markov Decision Process
• A Markov chain in which state transitions depend on the current state and an action vector applied to the system.
• Used to compute a policy of actions that will maximize some utility.
• Can be solved with value iteration and related methods.
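Value iteration repeatedly applies the Bellman optimality update until the values stabilize. A toy sketch for a two-state MDP; the states, actions, rewards, and transition probabilities below are all illustrative assumptions:

```python
# Value iteration for a tiny two-state MDP (all numbers are assumptions).
gamma = 0.9
states = ["s0", "s1"]
actions = ["stay", "move"]
# P[(s, a)]: list of (next_state, probability); R[(s, a)]: immediate reward
P = {("s0", "stay"): [("s0", 1.0)], ("s0", "move"): [("s1", 0.8), ("s0", 0.2)],
     ("s1", "stay"): [("s1", 1.0)], ("s1", "move"): [("s0", 0.8), ("s1", 0.2)]}
R = {("s0", "stay"): 0.0, ("s0", "move"): 0.0,
     ("s1", "stay"): 1.0, ("s1", "move"): 0.0}

def q(s, a, V):
    """Expected value of taking action a in state s, then following V."""
    return R[(s, a)] + gamma * sum(p * V[ns] for ns, p in P[(s, a)])

V = {s: 0.0 for s in states}
for _ in range(200):                       # Bellman optimality updates
    V = {s: max(q(s, a, V) for a in actions) for s in states}

policy = {s: max(actions, key=lambda a: q(s, a, V)) for s in states}
print(policy)    # {'s0': 'move', 's1': 'stay'}
```

The resulting policy moves toward, then stays in, the only rewarding state.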
Partially Observable Markov Decision Process
• A Markov decision process in which the state of the system is only partially observed.
• Solving POMDPs exactly is computationally intractable (even the finite-horizon problem is PSPACE-complete).
• Recent approximation techniques have made them useful for a variety of applications, such as controlling simple agents or robots.
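Because the state is only partially observed, a POMDP agent maintains a belief (a probability distribution over states) and updates it after each action and observation. A minimal sketch of that belief update with illustrative, assumed numbers:

```python
# Belief update for a POMDP: b'(s') is proportional to O(s', o) * sum_s T(s, s') * b(s).
# All numbers below are illustrative assumptions.
states = ["s0", "s1"]
T = {("s0", "s0"): 0.7, ("s0", "s1"): 0.3,       # transitions under one fixed action
     ("s1", "s0"): 0.2, ("s1", "s1"): 0.8}
O = {("s0", "beep"): 0.1, ("s1", "beep"): 0.9}   # P(observation | state)

def belief_update(b, obs):
    # predict: push the belief through the transition model
    predicted = {nxt: sum(b[cur] * T[(cur, nxt)] for cur in states) for nxt in states}
    # correct: weight by the observation likelihood and renormalize
    unnorm = {s: O[(s, obs)] * predicted[s] for s in states}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

b = belief_update({"s0": 0.5, "s1": 0.5}, "beep")
print({s: round(v, 3) for s, v in b.items()})    # belief shifts strongly toward s1
```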
Applications
• Physics
• Chemistry
• Testing
• Information sciences
• Queueing theory
• Internet applications
• Statistics
• Economics and finance
• Social science
• Mathematical biology
• Games
• Music
References
• Elaine A. Rich, Automata, Computability and Complexity: Theory and Applications, 1st edition.
• "Markov model," Wikipedia. [Online]. Available: http://en.wikipedia.org/wiki/Markov_chain. [Accessed 28 Oct. 2012].
• Xinhui Zhang, "DT Markov Chains." [Online]. Available: http://www.wright.edu/~xinhui.zhang/. [Accessed 28 Oct. 2012].
• Grinstead and Snell, Introduction to Probability, Chapter 11 (Markov Chains): http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/chapter11.pdf
Questions…..?