Transition Matrices & Markov Chains (DP IB Applications & Interpretation (AI))

Cards in this collection (25)

  • What is meant by a state when working with Markov chains?

    States refer to mutually exclusive events; the system is in exactly one state at a time, and the state can change over time.

    For example, the states could represent the weather conditions for a given day.

  • True or False?

    A Markov chain is a model that describes a sequence of states over a period of time.

    True.

    A Markov chain is a model that describes a sequence of states over a period of time.

  • What is a transition probability in a Markov chain?

    A transition probability is the probability that the next state is a particular state, given the current state.

    For example, the probability that it is sunny tomorrow given that it is raining today.

  • True or False?

    In a Markov chain, transition probabilities change over time.

    False.

    In a Markov chain, transition probabilities do not change over time.

    For example, if there is a 10% chance that tomorrow is sunny given that today is rainy, then on any rainy day there is a 10% chance that the next day is sunny.

  • In a Markov chain, the probabilities for the next state only depend on what?

    In a Markov chain, the probabilities for the next state only depend on the current state.

    They do not depend on the previous states.

  • When is a Markov chain said to be regular?

    A Markov chain is said to be regular if there is a positive integer k such that, in exactly k steps, it is possible to reach every state from any initial state.

  • What is a transition state diagram?

    A transition state diagram is a directed graph where the vertices are the states and the edges are labelled with the transition probabilities between the states.

    Diagram showing three states A, B, C with transition probabilities. Arrows indicate probability flow, such as A to B at 0.2, B to C at 0.5, and C to A at 0.1.

  • True or False?

    On a transition state diagram, the probabilities on the edges going into a vertex add up to 1.

    False.

    On a transition state diagram, the probabilities on the edges coming out of a vertex add up to 1.

  • Using the transition state diagram below, what is the probability that the next state is A given that the current state is L?

    A transition state diagram with three nodes A, L and O. The nodes are connected with directed edges with weights that represent the probability of going from one state to another. The arrow leading from L to A has a weight of 0.35 and the arrow leading from A to L has a weight of 0.5.

    The probability that the next state is A given that the current state is L is equal to 0.35.

  • True or False?

    There must be an edge between every pair of vertices on a transition diagram.

    False.

    There does not need to be an edge between every pair of vertices on a transition diagram. There must be a way to get between any pair of vertices, but this can be via a sequence of edges.

    For example, to get from A to C you could go via B.

  • What is a transition matrix, $\mathbf{T}$?

    A transition matrix $\mathbf{T}$ shows the transition probabilities between the current state and the next state.

  • Do the columns of a transition matrix represent the current states or the next states?

    The columns of a transition matrix represent the current states.

    The rows represent the next states.

  • True or False?

    The entries in each row of a transition matrix add up to 1.

    False.

    The entries in each column of a transition matrix add up to 1.

  • Give an interpretation of the value 0.5 in the transition matrix below.

    $$\begin{array}{cc} & \begin{array}{ccc} A & B & C \end{array} \\ \begin{array}{c} A \\ B \\ C \end{array} & \left(\begin{array}{ccc} 0.2 & 0.4 & 0 \\ 0.8 & 0.1 & 1 \\ 0 & 0.5 & 0 \end{array}\right) \end{array}$$

    In this transition matrix, 0.5 is the probability that the next state is C given that the current state is B.
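
    A quick numerical check of this card and the previous ones (a minimal sketch assuming Python with NumPy, which is not part of the flashcards themselves): it builds the matrix above with columns as current states and rows as next states, confirms each column sums to 1, and reads off the 0.5 entry.

    ```python
    import numpy as np

    # Transition matrix T from the card above.
    # Convention: columns = current state, rows = next state, in the order A, B, C.
    T = np.array([
        [0.2, 0.4, 0.0],
        [0.8, 0.1, 1.0],
        [0.0, 0.5, 0.0],
    ])

    # Each column sums to 1: from any current state, the next state must be A, B or C.
    print(T.sum(axis=0))   # [1. 1. 1.]

    # P(next = C | current = B) is row C (index 2), column B (index 1).
    print(T[2, 1])         # 0.5
    ```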

  • What is an initial state matrix $\mathbf{s}_0$?

    An initial state matrix $\mathbf{s}_0$ is a column vector which contains the probabilities of each state being chosen as the initial state.

    Alternatively, it can contain the initial frequencies for each state.

  • What is a column state matrix $\mathbf{s}_n$?

    A column state matrix $\mathbf{s}_n$ is a column vector which contains the probabilities of being in each state after n steps.

    Alternatively, it can contain the frequencies for each state after n steps.

  • If you know the column state matrix $\mathbf{s}_k$, how do you find $\mathbf{s}_{k+1}$?

    If you know the column state matrix $\mathbf{s}_k$, you can find $\mathbf{s}_{k+1}$ by pre-multiplying $\mathbf{s}_k$ by the transition matrix, i.e. $\mathbf{s}_{k+1} = \mathbf{T}\mathbf{s}_k$.
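
    As a rough illustration of this rule (assuming NumPy, with a made-up state vector $\mathbf{s}_k$; only the matrix comes from the cards above):

    ```python
    import numpy as np

    T = np.array([[0.2, 0.4, 0.0],
                  [0.8, 0.1, 1.0],
                  [0.0, 0.5, 0.0]])

    # Hypothetical current state vector s_k: P(A), P(B), P(C) as a column vector.
    s_k = np.array([[0.5],
                    [0.3],
                    [0.2]])

    # Pre-multiply by T to get the next state vector: s_{k+1} = T s_k.
    s_k_plus_1 = T @ s_k
    print(s_k_plus_1)         # probabilities of A, B, C one step later
    print(s_k_plus_1.sum())   # still 1 (up to rounding)
    ```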

  • True or False?

    The entries in each column of any power of a transition matrix add up to 1.

    True.

    The entries in each column of any power of a transition matrix add up to 1.
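
    This is easy to check numerically; a minimal sketch (assuming NumPy and the illustrative matrix from earlier):

    ```python
    import numpy as np
    from numpy.linalg import matrix_power

    T = np.array([[0.2, 0.4, 0.0],
                  [0.8, 0.1, 1.0],
                  [0.0, 0.5, 0.0]])

    # A power of a transition matrix is itself a transition matrix
    # (for that many steps), so its columns still sum to 1.
    for k in (2, 3, 4):
        print(k, matrix_power(T, k).sum(axis=0))   # [1. 1. 1.] each time
    ```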

  • The matrix below represents $\mathbf{T}^3$, where $\mathbf{T}$ is a transition matrix.

    Give an interpretation of the value 0.5 in the matrix below.

    $$\begin{array}{cc} & \begin{array}{ccc} A & B & C \end{array} \\ \begin{array}{c} A \\ B \\ C \end{array} & \left(\begin{array}{ccc} 0.2 & 0.4 & 0 \\ 0.8 & 0.1 & 1 \\ 0 & 0.5 & 0 \end{array}\right) \end{array}$$

    0.5 is the probability that after 3 steps the state will be C, given that the current state is B.

  • Given the initial state matrix, $\mathbf{s}_0$, and the transition matrix $\mathbf{T}$, how can you find the column state matrix, $\mathbf{s}_n$?

    Given the initial state matrix, $\mathbf{s}_0$, and the transition matrix $\mathbf{T}$, you can find the column state matrix, $\mathbf{s}_n$, by:

    • raising the transition matrix to the power n,

    • post-multiplying by the initial state matrix.

    $\mathbf{T}^n \mathbf{s}_0 = \mathbf{s}_n$

    This formula is given in the exam formula booklet.
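
    A minimal sketch of the formula $\mathbf{T}^n \mathbf{s}_0 = \mathbf{s}_n$ (assuming NumPy; the initial vector and the choice n = 5 are made up for illustration):

    ```python
    import numpy as np
    from numpy.linalg import matrix_power

    T = np.array([[0.2, 0.4, 0.0],
                  [0.8, 0.1, 1.0],
                  [0.0, 0.5, 0.0]])

    # Hypothetical initial state matrix s_0: the chain starts in state A.
    s0 = np.array([[1.0],
                   [0.0],
                   [0.0]])

    n = 5
    s_n = matrix_power(T, n) @ s0   # s_n = T^n s_0
    print(s_n)                      # probabilities of A, B, C after 5 steps
    ```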

  • What is meant by a steady state vector, $\mathbf{s}$?

    A steady state vector, $\mathbf{s}$, is a vector that does not change when multiplied by the transition matrix, i.e. $\mathbf{T}\mathbf{s} = \mathbf{s}$.

  • True or False?

    Regular Markov chains have steady states.

    True.

    Regular Markov chains have steady states; in fact, a regular Markov chain has a unique steady state probability vector.

  • What is the definition of a regular Markov chain in terms of its transition matrix, $\mathbf{T}$?

    A Markov chain is said to be regular if there exists a positive integer k such that none of the entries in the matrix $\mathbf{T}^k$ are equal to 0.
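
    One way to test this definition numerically is to look at successive powers of $\mathbf{T}$. The sketch below assumes NumPy; the helper `is_regular` and its cut-off of 50 powers are made up for illustration.

    ```python
    import numpy as np
    from numpy.linalg import matrix_power

    def is_regular(T, max_power=50):
        """True if some power T^k (k <= max_power) has no zero entries."""
        return any(np.all(matrix_power(T, k) > 0) for k in range(1, max_power + 1))

    T = np.array([[0.2, 0.4, 0.0],
                  [0.8, 0.1, 1.0],
                  [0.0, 0.5, 0.0]])

    print(is_regular(T))   # True: T^2 already has no zero entries
    ```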

  • If a Markov chain is regular, what do you know about the eigenvalues of its transition matrix?

    If a Markov chain is regular, then its transition matrix has exactly one eigenvalue equal to 1, and all of its other eigenvalues have absolute value less than 1.

  • True or False?

    A steady state vector is an eigenvector of the transition matrix corresponding to the eigenvalue of 1.

    True.

    A steady state vector is an eigenvector of the transition matrix corresponding to the eigenvalue of 1.
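
    As a final sketch (assuming NumPy and the illustrative matrix from earlier), the steady state vector can be found by taking an eigenvector of $\mathbf{T}$ for the eigenvalue 1 and scaling it so its entries sum to 1; solving $\mathbf{T}\mathbf{s} = \mathbf{s}$ directly works just as well.

    ```python
    import numpy as np

    T = np.array([[0.2, 0.4, 0.0],
                  [0.8, 0.1, 1.0],
                  [0.0, 0.5, 0.0]])

    # Eigen-decomposition of T (results may come back as complex numbers).
    eigenvalues, eigenvectors = np.linalg.eig(T)

    # Take the eigenvector whose eigenvalue is (numerically) 1 ...
    idx = np.argmin(np.abs(eigenvalues - 1))
    s = np.real(eigenvectors[:, idx])

    # ... and scale it so its entries sum to 1, giving a probability vector.
    s = s / s.sum()
    print(s)        # steady state probabilities of A, B, C
    print(T @ s)    # equals s (up to rounding), since T s = s
    ```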