Modeling a Markov Chain

A Markov chain is defined by a set of states and transition probabilities. If you have a set of states and you know the probability of moving from one state to another, you can model a Markov chain that forecasts future states and the long-term (steady-state) probabilities. A Markov chain can be modeled using Rational Will or the standalone Markov Decision Process application. Let's say you have two weather conditions, "Sunny day" and "Rainy day", with the following transition matrix.

transition-matrix

The matrix T represents the weather model in which a sunny day is 90% likely to be followed by another sunny day, and a rainy day is 50% likely to be followed by another rainy day.
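If you would like to see the same model expressed in code, here is a minimal sketch using NumPy. This is only an illustration of the matrix described above, not something the tool produces; the row/column ordering [Sunny day, Rainy day] is an assumption for the example.

```python
import numpy as np

# Illustrative sketch of the weather model above.
# Rows are the current state, columns the next state,
# ordered as [Sunny day, Rainy day].
T = np.array([
    [0.9, 0.1],   # Sunny day -> Sunny day, Sunny day -> Rainy day
    [0.5, 0.5],   # Rainy day -> Sunny day, Rainy day -> Rainy day
])

# Each row is a probability distribution, so it must sum to 1.
assert np.allclose(T.sum(axis=1), 1.0)
```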

Start the Markov Decision Process tool from Rational Will or the standalone application. You will be asked to enter the states. Enter the two states "Sunny day" and "Rainy day" as shown below.

markov-chain-states

Click the "Proceed" button. In the next screen, you will be asked if there is any action that you can take that can affect the probabilities of the transition. Simply click "No".

markov-chain-question

Also, click "No" in the next question when you are asked if any action you can take on Rainy day. Then you will be asked to enter the transition probabilities from Sunny Day state as shown below.

transition-probabilities-1

Then click the "Proceed" button. Now, you will be asked to enter the transition probabilities from Rainy day state as shown below.

transition-probabilities-2

Then click the "Proceed" button. You will be shown the following screen. Click the Finish button.

wizard-finish

Now, click the "Finish" button. You will see the following view.

markov-chain-window

From the carousel, you can see the steady-state probabilities along with several other useful charts. For example, the forecast chart lets you predict the state distribution after a given number of iterations; a sketch of both computations follows the chart below.

forecast-chart
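For comparison, here is a rough sketch of how the same quantities could be computed by hand with NumPy. The tool does this for you, so the snippet is only an illustration of the underlying math: an n-step forecast is the starting distribution multiplied by T raised to the power n, and the steady state is the left eigenvector of T for eigenvalue 1, normalized to sum to 1. The starting distribution and the number of steps below are arbitrary choices for the example.

```python
import numpy as np

T = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Forecast: start on a sunny day and step the distribution forward n days.
start = np.array([1.0, 0.0])          # [P(Sunny), P(Rainy)] today
n = 5
forecast = start @ np.linalg.matrix_power(T, n)
print("Distribution after", n, "days:", forecast)

# Steady state: left eigenvector of T with eigenvalue 1,
# normalized so its entries sum to 1.
eigenvalues, eigenvectors = np.linalg.eig(T.T)
steady = np.real(eigenvectors[:, np.isclose(eigenvalues, 1.0)][:, 0])
steady = steady / steady.sum()
print("Steady state [Sunny, Rainy]:", steady)
```

For this particular matrix the steady state works out to roughly 83% sunny days and 17% rainy days, which is the kind of long-run distribution the steady-state chart in the carousel reports.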

Finally, from the Ribbon's View section, you can click the "Decision Graph" button to see the graph of the Markov chain.

decision-graph
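The decision graph is simply a directed graph whose nodes are the states and whose edges carry the transition probabilities. As a purely illustrative sketch (using the networkx library, which is not part of the tool), the same structure could be built and listed like this:

```python
import networkx as nx

# Build a directed graph of the chain: one node per state,
# one weighted edge per transition probability.
states = ["Sunny day", "Rainy day"]
T = [[0.9, 0.1],
     [0.5, 0.5]]

G = nx.DiGraph()
for i, src in enumerate(states):
    for j, dst in enumerate(states):
        G.add_edge(src, dst, probability=T[i][j])

# List every transition, mirroring the edges shown in the decision graph.
for src, dst, data in G.edges(data=True):
    print(f"{src} -> {dst}: {data['probability']:.0%}")
```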
