What is meant by a state when working with Markov chains?
States are mutually exclusive events; the current state can change over time.
For example, the states could represent the weather conditions for a given day.
True or False?
A Markov chain is a model that describes a sequence of states over a period of time.
True.
A Markov chain is a model that describes a sequence of states over a period of time.
What is a transition probability in a Markov chain?
A transition probability is the probability of the next state being a particular state given its current state.
For example, the probability that it is sunny tomorrow given that it is raining today.
True or False?
In a Markov chain, transition probabilities change over time.
False.
In a Markov chain, transition probabilities do not change over time.
For example, if there's a 10% chance that tomorrow is sunny given that today is rainy, then if any day is rainy there will be a 10% chance that the next day is sunny.
In a Markov chain, the probabilities for the next state only depend on what?
In a Markov chain, the probabilities for the next state only depend on the current state.
They do not depend on the previous states.
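The Markov property above can be sketched in a few lines of Python. The weather states and the transition probabilities here are assumed values for illustration, not taken from the cards:

```python
import random

# Hypothetical weather chain: the next state depends only on the current state.
# Transition probabilities are assumed values for illustration.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.1), ("rainy", 0.9)],
}

def next_state(current):
    """Sample the next state using only the current state (Markov property)."""
    states, probs = zip(*transitions[current])
    return random.choices(states, weights=probs)[0]

random.seed(0)
chain = ["sunny"]
for _ in range(5):
    chain.append(next_state(chain[-1]))
print(chain)
```

Note that `next_state` receives only the current state, never the history, which is exactly the memoryless property the card describes.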
When is a Markov chain said to be regular?
A Markov chain is said to be regular if there is a value k such that in exactly k steps it is possible to reach every state from any initial state.
What is a transition state diagram?
A transition state diagram is a directed graph where the vertices are the states and the edges are labelled with the transition probabilities between the states.
True or False?
On a transition state diagram, the probabilities on the edges going into a vertex add up to 1.
False.
On a transition state diagram, the probabilities on the edges coming out of a vertex add up to 1.
Using a transition state diagram, how do you find the probability that the next state is B given that the current state is A?
The probability that the next state is B given that the current state is A is the probability labelling the edge from A to B on the diagram.
True or False?
There must be an edge between any pair of vertices on a transition diagram.
False.
There does not need to be an edge between any pair of vertices on a transition diagram. There must be a way to get between any pair of vertices but this can be a sequence of edges.
For example, to get from A to C you could go via B.
What is a transition matrix, T?
A transition matrix shows the transition probabilities between the current state and the next state.
Do the columns of a transition matrix represent the current states or the next states?
The columns of a transition matrix represent the current states.
The rows represent the next states.
True or False?
The entries in each row of a transition matrix add up to 1.
False.
The entries in each column of a transition matrix add up to 1.
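A minimal NumPy check of the column-sum rule, using an assumed two-state matrix with columns as current states and rows as next states:

```python
import numpy as np

# Assumed two-state transition matrix.
# Columns are current states, rows are next states.
T = np.array([[0.8, 0.1],   # P(next = A | current = A), P(next = A | current = B)
              [0.2, 0.9]])  # P(next = B | current = A), P(next = B | current = B)

# Each column sums to 1: from any current state, the next-state
# probabilities must cover all possibilities.
print(T.sum(axis=0))
```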
Give an interpretation of the entry tᵢⱼ (row i, column j) of a transition matrix T.
The entry tᵢⱼ of a transition matrix T is the probability that the next state is state i given that the current state is state j.
What is an initial state matrix, s₀?
An initial state matrix is a column vector which contains the probabilities of each state being chosen as the initial state.
Alternatively, it can contain the initial frequencies for each state.
What is a column state matrix, sₙ?
A column state matrix, sₙ, is a column vector which contains the probabilities of each state being chosen after n steps.
Alternatively, it can contain the frequencies for each state after n steps.
If you know the column state matrix sₙ, how do you find sₙ₊₁?
If you know the column state matrix sₙ, you can find sₙ₊₁ by pre-multiplying by the transition matrix, i.e. sₙ₊₁ = Tsₙ.
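The pre-multiplication step can be sketched with NumPy. The transition values and starting state here are assumed for illustration:

```python
import numpy as np

# Assumed two-state transition matrix (columns = current states).
T = np.array([[0.8, 0.1],
              [0.2, 0.9]])

# Assumed initial state: start in state A with certainty.
s0 = np.array([[1.0],
               [0.0]])

# Pre-multiply by T to advance one step at a time.
s1 = T @ s0  # s1 = [0.8, 0.2]
s2 = T @ s1  # s2 = [0.66, 0.34]
print(s1.ravel(), s2.ravel())
```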
True or False?
The entries in each column of any power of a transition matrix add up to 1.
True.
The entries in each column of any power of a transition matrix add up to 1.
Let T be a transition matrix. Give an interpretation of the entry tᵢⱼ of the matrix T³.
The entry tᵢⱼ of T³ is the probability that after 3 steps the state will be state i, given that the current state is state j.
Given the initial state matrix, s₀, and the transition matrix T, how can you find the column state matrix, sₙ?
Given the initial state matrix, s₀, and the transition matrix T, you can find the column state matrix, sₙ, by:
raising the transition matrix to the power n,
post-multiplying Tⁿ by the initial state matrix, i.e. sₙ = Tⁿs₀.
This formula is given in the exam formula booklet.
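The formula sₙ = Tⁿs₀ can be checked against repeated one-step multiplication. The matrix and starting distribution below are assumed for illustration:

```python
import numpy as np

# Assumed two-state transition matrix and initial distribution.
T = np.array([[0.8, 0.1],
              [0.2, 0.9]])
s0 = np.array([[0.5],
               [0.5]])

n = 3
# Raise T to the power n, then post-multiply by the initial state matrix.
s_n = np.linalg.matrix_power(T, n) @ s0

# Same result as applying T three times in succession.
check = T @ (T @ (T @ s0))
print(s_n.ravel())
```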
What is meant by a steady state vector, ?
A steady state vector, , is a vector that does not change when multiplied by the transition matrix, i.e. .
True or False?
Regular Markov chains have steady states.
True.
Regular Markov chains have steady states.
What is the definition of a regular Markov chain in terms of its transition matrix, T?
A Markov chain is said to be regular if there exists a positive integer k such that none of the entries of the matrix Tᵏ are equal to 0.
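This definition can be tested numerically by raising the matrix to successive powers and checking for zero entries. The matrix below is an assumed example that needs two steps before every entry is positive; the cut-off `max_power` is an arbitrary illustration choice:

```python
import numpy as np

def is_regular(T, max_power=20):
    """Return True if some power T^k (k <= max_power) has no zero entries."""
    P = np.eye(T.shape[0])
    for _ in range(max_power):
        P = P @ T
        if np.all(P > 0):
            return True
    return False

# Assumed chain: T itself has a zero entry, but T^2 has none, so it is regular.
T = np.array([[0.0, 0.5],
              [1.0, 0.5]])
print(is_regular(T))         # True
print(is_regular(np.eye(2))) # False: every power of I keeps its zeros
```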
If a Markov chain is regular, what do you know about the eigenvalues of its transition matrix?
If a Markov chain is regular, then its transition matrix has exactly one eigenvalue equal to 1, and all the other eigenvalues have absolute value less than 1.
True or False?
A steady state vector is an eigenvector of the transition matrix corresponding to the eigenvalue of 1.
True.
A steady state vector is an eigenvector of the transition matrix corresponding to the eigenvalue of 1.
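The steady state can therefore be found by extracting the eigenvector for the eigenvalue 1 and rescaling it so its entries sum to 1. The matrix below is an assumed example; its steady state works out to (1/3, 2/3):

```python
import numpy as np

# Assumed regular two-state transition matrix.
T = np.array([[0.8, 0.1],
              [0.2, 0.9]])

# The steady state is the eigenvector for eigenvalue 1,
# rescaled so its entries sum to 1.
eigvals, eigvecs = np.linalg.eig(T)
idx = np.argmin(np.abs(eigvals - 1))
s = eigvecs[:, idx].real
s = s / s.sum()

print(s)      # steady state vector
print(T @ s)  # unchanged by T, since Ts = s
```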