A Markov chain is defined by a set of states and transition probabilities. If you know the probability of moving from each state to every other state, you can model a Markov chain that forecasts future states as well as the long-term (steady-state) probabilities. Suppose you have two weather conditions, "Sunny day" and "Rainy day", with the following transition matrix:

T = | 0.9  0.1 |
    | 0.5  0.5 |

The matrix T represents a weather model in which a sunny day is 90% likely to be followed by another sunny day, and a rainy day is 50% likely to be followed by another rainy day.
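Before walking through the tool, it may help to see what the model computes. The following is a minimal sketch (plain Python, not part of Rational Will) of a one-step forecast: multiplying a state distribution by the transition matrix.

```python
# Transition matrix for the weather model.
# Rows = current state, columns = next state; order: [Sunny, Rainy].
T = [
    [0.9, 0.1],  # Sunny -> Sunny 90%, Sunny -> Rainy 10%
    [0.5, 0.5],  # Rainy -> Sunny 50%, Rainy -> Rainy 50%
]

def step(dist, T):
    """One forecast step: multiply the state distribution by T."""
    return [sum(dist[i] * T[i][j] for i in range(len(T)))
            for j in range(len(T[0]))]

# Starting from a sunny day (probability 1 of Sunny):
dist = [1.0, 0.0]
dist = step(dist, T)
print(dist)  # [0.9, 0.1] -> tomorrow is 90% sunny, 10% rainy
```

Repeated calls to `step` give the forecast after any number of days, which is what the tool's forecast chart plots.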
Start the Markov Decision Process tool from Rational Will or the standalone application. You will be asked to enter the states. Enter the two states, "Sunny day" and "Rainy day", as shown below.
Click the "Proceed" button. In the next screen, you will be asked if there is any action that you can take that can affect the probabilities of the transition. Simply click "No".
Also, click "No" in the next question when you are asked if any action you can take on a Rainy day. Then you will be asked to enter the transition probabilities from Sunny Day state as shown below.
Then click the "Proceed" button. Now, you will be asked to enter the transition probabilities from Rainy day state as shown below.
Then click the "Proceed" button. You will be shown the following screen. Click the Finish button.
Now, click the "Finish" button. You will see the following view.
From the carousel, you can see the steady-state probabilities along with a number of useful charts. For example, the forecast chart lets you predict the future state after a given number of iterations.
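The steady-state probabilities the tool reports can be approximated by hand with power iteration: repeatedly apply the transition matrix until the distribution stops changing. Here is an illustrative sketch in plain Python, using the same weather matrix (this reproduces the idea, not Rational Will's internal algorithm).

```python
# Same transition matrix as above: [Sunny, Rainy].
T = [
    [0.9, 0.1],
    [0.5, 0.5],
]

def step(dist, T):
    """Multiply the state distribution by the transition matrix."""
    return [sum(dist[i] * T[i][j] for i in range(len(T)))
            for j in range(len(T[0]))]

# Power iteration: starting distribution converges to the steady state.
dist = [1.0, 0.0]
for _ in range(100):
    dist = step(dist, T)

print(dist)  # approximately [0.8333, 0.1667], i.e. 5/6 sunny, 1/6 rainy
```

In the long run, the starting state no longer matters: about five out of every six days are sunny under this model, regardless of today's weather.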
Finally, from the Ribbon's View section, you can click the "Decision Graph" button to see the graph of the Markov chain.