## Puzzled by the definition of stationarity

### March 15, 2012

I’m reading Santosh Vempala’s “Geometric Random Walks: A Survey,” and I’m already puzzled by one of the very first definitions.

Define a Markov chain by a state space and sigma-algebra pair \((K, \mathcal{A})\), together with a transition probability measure \(P_u\) on \((K, \mathcal{A})\) for each \(u \in K.\)

A distribution \(Q\) on \((K, \mathcal{A})\) is called stationary if one step from it gives the same distribution, i.e., for any \(A \in \mathcal{A},\)

\[
\int_A P_u(A) \, dQ(u) = Q(A).
\]

This definition makes sense in words, but mathematically it doesn’t seem sound: unless \(P_u(A) = 1\) for \(Q\)-almost every \(u \in A,\) this equality can’t hold. I must be missing something…
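Spelling out the obstruction, using only the trivial bound \(P_u(A) \le 1\):

\[
\int_A P_u(A) \, dQ(u) \le \int_A 1 \, dQ(u) = Q(A),
\]

with equality precisely when \(P_u(A) = 1\) for \(Q\)-almost every \(u \in A,\) that is, when the chain almost surely stays in \(A\) once inside it.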

**Update:**

The definition should involve integration over the whole space:

A distribution \(Q\) on \((K, \mathcal{A})\) is called stationary if one step from it gives the same distribution, i.e., for any \(A \in \mathcal{A},\)

\[
\int_K P_u(A) \, dQ(u) = Q(A).
\]

That this is so can be seen from the discrete case: there, stationarity of a distribution \(\pi\) reads \(\sum_{u \in K} \pi(u) P(u, A) = \pi(A),\) with the sum running over the whole state space, not just over \(A.\) Just goes to show that you have to be careful even when reading peer-reviewed articles.
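A quick numerical sanity check of the discrete case (the 3-state transition matrix below is made up purely for illustration):

```python
import numpy as np

# A made-up 3-state transition matrix P; each row sums to 1.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

# The stationary distribution pi solves pi P = pi, i.e. it is the
# left eigenvector of P with eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()  # normalize to a probability vector

# Corrected definition: for A = {0}, summing pi(u) * P(u, A) over
# ALL states u recovers pi(A).
A = [0]
lhs_all = sum(pi[u] * P[u, A].sum() for u in range(3))
print(np.isclose(lhs_all, pi[A].sum()))  # True

# The mistaken version sums only over u in A and falls short,
# since P(0, {0}) < 1 here.
lhs_A = sum(pi[u] * P[u, A].sum() for u in A)
print(lhs_A < pi[A].sum())  # True
```

The first check succeeds because summing \(\pi(u) P(u, A)\) over all \(u\) is exactly the \(A\)-mass of \(\pi P = \pi\); restricting the sum to \(u \in A\) drops the probability of entering \(A\) from outside.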