Getting Started

A Markov chain is defined by a set of states and the transition probabilities between them. It is a stochastic process with the Markov property: the next state depends only on the current state, not on the history before it.

If you know the states and the probabilities of moving from one state to another, you can model a Markov chain that forecasts future states as well as the long-term (steady-state) probabilities. In this tutorial, we present a Markov chain with two weather states: "Sunny day" and "Rainy day". Suppose a sunny day is 90% likely to be followed by another sunny day, and a rainy day is 50% likely to be followed by another rainy day. The transition matrix can then be expressed as

transition-matrix
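The same matrix can be written out directly; a minimal sketch in Python (the state order [Sunny, Rainy] is an assumption of this sketch, not something the calculator prescribes):

```python
# Transition matrix of the weather chain.
# Row = current state, column = next state; state order: [Sunny, Rainy].
P = [
    [0.9, 0.1],  # Sunny -> Sunny, Sunny -> Rainy
    [0.5, 0.5],  # Rainy -> Sunny, Rainy -> Rainy
]

# Each row must sum to 1: from any state, the next day is always
# exactly one of the two states.
assert all(abs(sum(row) - 1.0) < 1e-9 for row in P)
```

Every transition matrix has this row-sum property, which is why the calculator asks you to enter probabilities one state (one row) at a time.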

To model this simple Markov chain, start the Markov Chain Calculator. You will be asked to enter the states of your Markov chain. Enter the two states "Sunny day" and "Rainy day" as shown below.

markov-chain-states

After you have entered both states, click the "Proceed" button. You will then be asked to enter the transition probabilities for the first state, "Sunny day". Set the probabilities as discussed: Sunny day to Sunny day is 0.9, and Sunny day to Rainy day is 0.1.

transition-probabilities-1

Then click the "Proceed" button. You will now be asked to enter the transition probabilities from the "Rainy day" state, as shown below. Enter both probabilities as 0.5, as discussed.

transition-probabilities-2
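With both rows entered, the chain can already forecast future states. As a quick sanity check (a sketch independent of the calculator), the probability that it is sunny two days after a sunny day is obtained by summing over the intermediate day:

```python
# Two-step forecast: P(sunny in 2 days | sunny today).
# The chain's transition probabilities, keyed by state name.
P = {
    "Sunny": {"Sunny": 0.9, "Rainy": 0.1},
    "Rainy": {"Sunny": 0.5, "Rainy": 0.5},
}

# Sum over what tomorrow can be: sunny-then-sunny or rainy-then-sunny.
p = sum(P["Sunny"][mid] * P[mid]["Sunny"] for mid in P)
# 0.9 * 0.9 + 0.1 * 0.5 = 0.86
```

This is exactly the (Sunny, Sunny) entry of the squared transition matrix, which is how n-step forecasts are computed in general.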

Then click the "Proceed" button again. You will be shown the following screen. Click the "Finish" button.

wizard-finish

Once you click the Finish button, you will see various user interfaces for the generated Markov chain. You can change the transition probabilities, rename states, add new states, or delete and modify existing ones.

markov-chain-window

From the carousel, you can view the steady-state probabilities and several useful charts. You can pop out all the charts into a separate window, as shown below.

all-charts
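The steady state is the distribution that is unchanged by one more transition. For this two-state chain it can be checked by hand with a short power-iteration sketch (pure Python, not the calculator's own method):

```python
# Steady state of the weather chain via power iteration:
# repeatedly apply the transition matrix until the distribution stops changing.
# State order: [Sunny, Rainy].
P = [
    [0.9, 0.1],  # Sunny -> Sunny, Sunny -> Rainy
    [0.5, 0.5],  # Rainy -> Sunny, Rainy -> Rainy
]

def step(dist, matrix):
    """One transition: new_dist[j] = sum_i dist[i] * matrix[i][j]."""
    n = len(matrix)
    return [sum(dist[i] * matrix[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]      # start on a sunny day; any start works
for _ in range(100):   # iterate until convergence
    dist = step(dist, P)

print([round(p, 4) for p in dist])  # [0.8333, 0.1667], i.e. 5/6 and 1/6
```

So in the long run about 5 days in 6 are sunny, matching the steady-state probabilities the calculator displays. The same answer follows from solving pi = pi * P with pi_Sunny + pi_Rainy = 1.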

Finally, in the Ribbon's View section, click the "Decision Graph" button to see a graph of the Markov chain.

decision-graph

Last updated on 13 October 2018, Saturday, 11:28:42 PM
If you have any questions or concerns about this tutorial, please feel free to share your comments.