M2R maths Grenoble
2016–2017
Tutorials
Continuous-time Markov chains
Exercise 1.1 From discrete to continuous time Markov chains.
Suppose that $P = (p(x,y))_{x,y \in S}$ is the transition matrix of a discrete-time Markov chain with
countable state-space $S$. The most natural way to make it into a continuous-time Markov chain
is to have it take steps at the event times of a Poisson process of intensity $1$, independent of the
discrete chain. In other words, the times between transitions are independent random variables
with a unit exponential distribution. The memorylessness (forgetfulness) property of the exponential
distribution is needed in order that the process have the Markov property.
1. Show that the transition function of the continuous-time chain described above is given by:
\[
p_t(x,y) = e^{-t} \sum_{k=0}^{+\infty} \frac{t^k}{k!}\, p_k(x,y) \qquad (1)
\]
where $p_k(x,y)$ are the $k$-step transition probabilities for the discrete-time chain.
2. Show that (1) satisfies the Chapman-Kolmogorov equations, i.e. for all $s,t \ge 0$ and $x,y \in S$:
\[
p_{s+t}(x,y) = \sum_{z \in S} p_s(x,z)\, p_t(z,y).
\]
3. Show that the $Q$-matrix is given by
\[
Q = P - I.
\]
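
A quick numerical check of (1) and of question 3 is to compare the truncated series with the matrix exponential $e^{t(P-I)}$ for a small transition matrix. The Python sketch below does this for an arbitrary 3-state matrix $P$, time $t$ and truncation level $K$, all chosen purely for illustration.

\begin{verbatim}
# Compare the series e^{-t} sum_k t^k/k! P^k with expm(t(P - I))
# for an arbitrary 3-state transition matrix P.
import numpy as np
from math import factorial
from scipy.linalg import expm

P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.4, 0.0, 0.6]])
t, K = 1.7, 60                      # K: truncation level of the series

series = sum(np.exp(-t) * t**k / factorial(k) * np.linalg.matrix_power(P, k)
             for k in range(K + 1))
Pt = expm(t * (P - np.eye(3)))      # e^{tQ} with Q = P - I (question 3)

print(np.max(np.abs(series - Pt)))  # should be of order 1e-15
\end{verbatim}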
Exercise 1.2 The two-state Markov chain
Suppose that $S = \{0,1\}$. Consider a $Q$-matrix given by
\[
Q = \begin{pmatrix} -\beta & \beta \\ \delta & -\delta \end{pmatrix}
\]
where $\beta, \delta > 0$. Show that the corresponding transition matrix is
\[
P_t = \begin{pmatrix}
\dfrac{\delta}{\beta+\delta} + \dfrac{\beta}{\beta+\delta}\, e^{-t(\beta+\delta)} &
\dfrac{\beta}{\beta+\delta}\,\bigl(1 - e^{-t(\beta+\delta)}\bigr) \\[2ex]
\dfrac{\delta}{\beta+\delta}\,\bigl(1 - e^{-t(\beta+\delta)}\bigr) &
\dfrac{\beta}{\beta+\delta} + \dfrac{\delta}{\beta+\delta}\, e^{-t(\beta+\delta)}
\end{pmatrix}.
\]
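
As a sanity check, the claimed formula for $P_t$ can be compared with the matrix exponential $e^{tQ}$ computed numerically; the sketch below does this in Python for arbitrary values of $\beta$, $\delta$ and $t$.

\begin{verbatim}
# Compare expm(tQ) with the claimed closed form for the two-state chain.
import numpy as np
from scipy.linalg import expm

beta, delta, t = 1.3, 0.7, 2.0           # arbitrary positive parameters
Q = np.array([[-beta, beta],
              [delta, -delta]])
s = beta + delta
claimed = np.array([
    [delta/s + beta/s*np.exp(-t*s),  beta/s*(1 - np.exp(-t*s))],
    [delta/s*(1 - np.exp(-t*s)),     beta/s + delta/s*np.exp(-t*s)],
])
print(np.max(np.abs(expm(t*Q) - claimed)))  # should be of order 1e-16
\end{verbatim}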
Exercise 1.3 Pure birth chain and explosion
Let $S = \mathbb{N}$ and
\[
q(i,j) = \begin{cases}
-\beta_i & \text{if } j = i, \\
\beta_i & \text{if } j = i+1, \\
0 & \text{otherwise,}
\end{cases}
\]
where $\beta_i > 0$ for each $i$. Let $(\tau_i)_{i \ge 0}$ be independent exponentially distributed random variables with
$\tau_i$ having parameter $\beta_i$. Define:
\[
T := \sum_{i=0}^{+\infty} \tau_i \in [0,+\infty].
\]
1. Show that if $\sum_{i=0}^{\infty} \frac{1}{\beta_i} < \infty$ then $T < \infty$ a.s., and if $\sum_{i=0}^{\infty} \frac{1}{\beta_i} = \infty$ then $T = \infty$ a.s.
2. What does this imply for the existence of a Markov chain with $Q$-matrix $Q$ as above?
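
To get a feeling for question 1, one can simulate the partial sums $\sum_{i<N}\tau_i$ for a summable and a non-summable choice of $(1/\beta_i)$; the Python sketch below uses $\beta_i = 2^i$ (so $\sum_i 1/\beta_i = 2$) and $\beta_i = 1$, with an arbitrary truncation $N$.

\begin{verbatim}
# Partial sums of independent Exp(beta_i) holding times: with beta_i = 2^i the
# sum stabilises (explosion in finite time), with beta_i = 1 it grows linearly.
import numpy as np

rng = np.random.default_rng(0)
N = 200                                   # number of holding times simulated

for label, betas in [("beta_i = 2^i", 2.0 ** np.arange(N)),
                     ("beta_i = 1  ", np.ones(N))]:
    taus = rng.exponential(1.0 / betas)   # tau_i ~ Exp(beta_i) has mean 1/beta_i
    print(label, " sum of the first", N, "holding times:", taus.sum())
\end{verbatim}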
Exercise 1.4 MCMC I: Metropolis
Let $\Psi$ be an irreducible $Q$-matrix on a finite state-space $S$ and $\pi$ be a probability distribution on $S$.
We want to modify the Markov chain given by $\Psi$ to obtain a new Markov chain whose stationary
distribution is $\pi$. We shall adopt the following method: when at state $x$, a candidate move is
generated with rates $\Psi(x,\cdot)$. If the proposed state is $y$, the move is accepted with probability
$a(x,y)$ for some weight $a(x,y)$, and rejected (i.e. the chain stays at $x$) with probability $1 - a(x,y)$.
1. Show that the $Q$-matrix of the new chain is given by:
\[
Q(x,y) = \begin{cases}
\Psi(x,y)\, a(x,y) & \text{if } y \neq x, \\
-\sum_{z : z \neq x} \Psi(x,z)\, a(x,z) & \text{if } y = x.
\end{cases}
\]
2. We want to find weights $a(x,y)$ such that $Q$ is reversible with respect to $\pi$ and we reject the
candidate moves as seldom as possible. Show that this leads to the choice:
\[
a(x,y) = \frac{\pi(y)\,\Psi(y,x)}{\pi(x)\,\Psi(x,y)} \wedge 1.
\]
This is called the (continuous-time) Metropolis chain associated to $\pi$ and $\Psi$.
3. Suppose that you know neither the vertex set nor the edge set of a finite connected graph,
but you are able to perform a random walk on it and you want to approximate the uniform
distribution on the vertex set by running a Metropolis chain for a long time. Make explicit
the Q-matrix of the chain.
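
For question 3, a natural choice (an assumption made here, not imposed by the exercise) is to propose a uniformly chosen neighbour at rate $\Psi(x,y) = 1/\deg(x)$, so that the Metropolis acceptance weight for the uniform target becomes $a(x,y) = \bigl(\deg(x)/\deg(y)\bigr) \wedge 1$. The Python sketch below simulates this chain on a small hypothetical graph and compares the empirical occupation times with the uniform distribution.

\begin{verbatim}
# Continuous-time Metropolis chain targeting the uniform distribution on the
# vertices of a small graph, with proposal rates Psi(x,y) = 1/deg(x).
import numpy as np

rng = np.random.default_rng(1)
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3, 4], 3: [1, 2], 4: [2]}  # example graph
deg = {v: len(nb) for v, nb in adj.items()}

x, t, T_max = 0, 0.0, 200_000.0
occupation = {v: 0.0 for v in adj}
while t < T_max:
    hold = rng.exponential(1.0)               # proposals arrive at total rate 1
    occupation[x] += hold
    t += hold
    y = adj[x][rng.integers(deg[x])]          # candidate: uniform neighbour of x
    if rng.random() < min(1.0, deg[x] / deg[y]):
        x = y                                 # accept; otherwise stay at x
print({v: round(occupation[v] / t, 3) for v in adj})  # should all be close to 1/5
\end{verbatim}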
Exercise 1.5 MCMC II: Glauber
Let $S_0$ and $V$ be finite sets and suppose that $S$ is a non-empty subset of $S_0^V$. We shall call elements
of $V$ the vertices and elements of $S$ the configurations. Let $\pi$ be a probability distribution on $S$. The
Glauber dynamics for $\pi$ is a reversible Markov chain on $S$ defined as follows. Vertices are equipped
with independent Poisson processes with rate $1$; let us name them "clocks". When at configuration
$\omega$, as soon as a clock rings at vertex $v \in V$, a new state for the configuration at $v$ is chosen according
to the measure $\pi$ conditioned on the configurations which agree with $\omega$ at all vertices different from
$v$.
1. For a configuration $\eta \in S$ and a vertex $v \in V$, let $S(\eta,v)$ denote the configurations agreeing
with $\eta$ everywhere except perhaps at $v$. Show that the $Q$-matrix of this chain is given by:
\[
Q(\eta,\eta') = \begin{cases}
\dfrac{\pi(\eta')}{\pi(S(\eta,v))} & \text{if } \eta' \in S(\eta,v) \text{ and } \eta' \neq \eta, \\[2ex]
0 & \text{if } \eta \text{ and } \eta' \text{ differ on at least two coordinates}, \\[1ex]
-|V| + \pi(\eta) \displaystyle\sum_{v \in V} \frac{1}{\pi(S(\eta,v))} & \text{if } \eta' = \eta.
\end{cases}
\]
2. Check that $Q$ is reversible with respect to $\pi$. Is it irreducible?
3. The Ising model on the graph $G = (V,E)$ is the family of probability distributions on $S = \{-1,1\}^V$ defined by:
\[
\pi_\beta(\eta) = \frac{1}{Z(\beta)}\, e^{-\beta H(\eta)},
\]
where $\beta \ge 0$ is a parameter, $H$ is the Hamiltonian defined by
\[
H(\eta) = -\frac{1}{2} \sum_{\substack{u,v \in V \\ \{u,v\} \in E}} \eta(u)\,\eta(v)
\]
and $Z(\beta)$ is the normalization constant, called the partition function:
\[
Z(\beta) = \sum_{\eta \in S} e^{-\beta H(\eta)}.
\]
Make explicit the $Q$-matrix of the Glauber dynamics for $\pi_\beta$. Is it irreducible?
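
The sketch below simulates the Glauber dynamics for $\pi_\beta$ on a small hypothetical graph (using the convention $H(\eta) = -\sum_{\{u,v\}\in E}\eta(u)\eta(v)$, i.e. the double sum above restricted to edges) and compares the time average of the observable $\eta(0)\eta(1)$ with its exact value obtained by enumerating all $2^{|V|}$ configurations. The graph, $\beta$, observable and run length are illustrative choices.

\begin{verbatim}
# Glauber dynamics for the Ising measure pi_beta on a small graph,
# compared with exact enumeration of all configurations.
import itertools
import numpy as np

rng = np.random.default_rng(2)
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
V, beta = 4, 0.5
nbrs = {v: [u for e in edges if v in e for u in e if u != v] for v in range(V)}

def weight(eta):                              # proportional to pi_beta(eta)
    return np.exp(beta * sum(eta[u] * eta[v] for u, v in edges))

# Exact E_pi[eta(0) eta(1)] by summing over the 2^V configurations.
Z = corr = 0.0
for eta in itertools.product([-1, 1], repeat=V):
    w = weight(eta)
    Z += w
    corr += eta[0] * eta[1] * w
print("exact   E[eta0*eta1] =", corr / Z)

# Clocks ring at total rate |V|; at each ring a uniform vertex v is chosen and
# eta(v) is resampled from pi_beta conditioned on the other spins.
eta = np.ones(V, dtype=int)
t, T_max, acc = 0.0, 50_000.0, 0.0
while t < T_max:
    hold = rng.exponential(1.0 / V)
    acc += eta[0] * eta[1] * hold
    t += hold
    v = rng.integers(V)
    m = sum(eta[u] for u in nbrs[v])          # local field at v
    p_plus = np.exp(beta * m) / (np.exp(beta * m) + np.exp(-beta * m))
    eta[v] = 1 if rng.random() < p_plus else -1
print("Glauber E[eta0*eta1] ~", acc / t)
\end{verbatim}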
Exercise 1.6 Coupling and mixing time
Let $Q$ be an irreducible $Q$-matrix on a finite state-space $S$ and denote by $\pi$ its invariant probability
measure. Recall the notion of total variation distance between two probability measures on $S$:
\[
d_{VT}(\mu,\nu) = \sup_{A \subset S} |\mu(A) - \nu(A)| = \frac{1}{2} \sum_{x \in S} |\mu(x) - \nu(x)|.
\]
1. Suppose that $(X,Y)$ is a random vector with values in $S^2$ such that $X$ (resp. $Y$) has distribution $\mu$ (resp. $\nu$). Show that:
\[
d_{VT}(\mu,\nu) \le \mathbb{P}(X \neq Y).
\]
2. One way to measure the convergence to equilibrium is in the sense of total variation. Define:
\[
d(t) := \max_{x \in S} d_{VT}\bigl(p_t(x,\cdot), \pi\bigr)
\]
and:
\[
\bar{d}(t) := \max_{x,y \in S} d_{VT}\bigl(p_t(x,\cdot), p_t(y,\cdot)\bigr).
\]
Show that $d$ is non-increasing, that
\[
d(t) \le \bar{d}(t) \le 2\, d(t),
\]
and
\[
d(t) = \max_{\mu} d_{VT}\bigl(\mathbb{P}_\mu(X_t = \cdot), \pi\bigr).
\]
For $\varepsilon < \frac{1}{2}$, one usually defines the mixing time $t_{\mathrm{mix}}(\varepsilon)$ as follows:
\[
t_{\mathrm{mix}}(\varepsilon) := \inf\{t : d(t) \le \varepsilon\}.
\]
3. Suppose that $(X_t, Y_t)_{t \ge 0}$ is a stochastic process with values in $S^2$ such that $(X_t)$ and $(Y_t)$ are
random walks with $Q$-matrix $Q$ (this is called a coupling of two Markov chains with $Q$-matrix $Q$).
Suppose also that $X$ and $Y$ stay together as soon as they meet:
\[
\text{if } X_s = Y_s \text{ then } X_t = Y_t \text{ for any } t \ge s,
\]
and define the coupling time $\tau$ of $X$ and $Y$ as:
\[
\tau := \inf\{t \ge 0 : X_t = Y_t\}.
\]
Let $\mu$ (resp. $\nu$) be the distribution of $X_0$ (resp. $Y_0$). Show that:
\[
d_{VT}\bigl(\mu e^{tQ}, \nu e^{tQ}\bigr) \le \mathbb{P}(\tau > t).
\]
4. One may define a coupling as follows, $X$ starting from $x$ and $Y$ starting from $y$: we let the
two chains evolve independently (each one according to $Q$) until they meet, after which we
let them evolve according to $Q$, but staying equal. Write down the $Q$-matrix corresponding
to this Markov chain on $S^2$.
5. Now, we suppose that $Q$ defines a simple random walk on the circle $\mathbb{Z}/n\mathbb{Z}$ (with jump rates
$1/2$ to the right and to the left). Using the coupling of question 4, and Markov's inequality
on the coupling time, show that:
\[
t_{\mathrm{mix}}(\varepsilon) \le \frac{n^2}{8\varepsilon}.
\]
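
The bound of question 5 can be checked empirically: the sketch below runs the independent-until-meeting coupling of question 4 for two walks started at antipodal points of $\mathbb{Z}/n\mathbb{Z}$ and compares the mean coupling time with $n^2/8$ (here $n = 20$ and the number of repetitions are arbitrary choices).

\begin{verbatim}
# Coupling of two rate-1 simple random walks on Z/nZ, run independently until
# they meet; the mean coupling time from antipodal starting points is ~ n^2/8.
import numpy as np

rng = np.random.default_rng(3)
n, reps = 20, 5000
taus = []
for _ in range(reps):
    x, y, t = 0, n // 2, 0.0
    while x != y:
        t += rng.exponential(1.0 / 2.0)   # the pair jumps at total rate 2
        step = 1 if rng.random() < 0.5 else -1
        if rng.random() < 0.5:            # one of the two walks is picked uniformly
            x = (x + step) % n
        else:
            y = (y + step) % n
    taus.append(t)
print("mean coupling time ~", np.mean(taus), " vs  n^2/8 =", n * n / 8)
\end{verbatim}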
Exercise 1.7 Spectral representation of a finite state-space reversible Markov chain
Let $\pi$ be a positive probability measure on $S = \{1,\dots,n\}$ and let $Q$ be an irreducible $n \times n$ $Q$-matrix
reversible with respect to $\pi$. Denote by $D$ the diagonal matrix whose diagonal elements are $\sqrt{\pi(i)}$.
1. Show that the matrix $S = DQD^{-1}$ is symmetric. Thus, there exists an orthogonal matrix
$U$ and a diagonal matrix $\Lambda$ such that:
\[
S = U \Lambda U^T
\]
and the diagonal elements of $\Lambda$ can be chosen to be non-increasing: $\lambda_1 \ge \lambda_2 \ge \dots \ge \lambda_n$.
2. Show that $\lambda_2 < \lambda_1 = 0$.
3. Show that the column vectors of the matrix $DU$ constitute a basis of eigenvectors for $Q^T$,
with the $k$-th column associated to the eigenvalue $\lambda_k$.
4. Show that the column vectors of the matrix $D^{-1}U$ constitute a basis of eigenvectors for $Q$,
with the $k$-th column associated to the eigenvalue $\lambda_k$.
5. Let $X_t$ be a continuous-time Markov chain with $Q$-matrix $Q$. Show the spectral representation
formula: for any $i,j$ and any $t \ge 0$,
\[
p_t(i,j) := \mathbb{P}_i(X_t = j) = \pi(i)^{-1/2}\, \pi(j)^{1/2} \sum_{k=1}^{n} e^{\lambda_k t}\, u_{ik}\, u_{jk}.
\]
6. When $n$ is fixed and $t$ goes to infinity, describe the rate of convergence of $p_t(i,j)$ towards $\pi(j)$.
7. Make explicit the rate of convergence of $\|p_t(i,\cdot) - \pi\|_2$ towards zero when $Q$ defines a simple
random walk on the circle $\mathbb{Z}/n\mathbb{Z}$. Give also lower and upper bounds on the mixing time
(defined in Exercise 1.6).
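
A numerical illustration of questions 1-5: build a $Q$-matrix reversible with respect to a random $\pi$ from symmetric conductances, diagonalise $S = DQD^{-1}$, and compare the spectral formula with $e^{tQ}$. The construction via conductances and all numerical values below are choices made for this sketch.

\begin{verbatim}
# Spectral representation check:
# p_t(i,j) = pi(i)^{-1/2} pi(j)^{1/2} sum_k e^{lambda_k t} u_{ik} u_{jk}.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(4)
n = 5
pi = rng.random(n); pi /= pi.sum()

# Reversible Q-matrix from symmetric conductances: Q(i,j) = c(i,j)/pi(i), i != j.
C = rng.random((n, n)); C = (C + C.T) / 2; np.fill_diagonal(C, 0.0)
Q = C / pi[:, None]
np.fill_diagonal(Q, -Q.sum(axis=1))

D = np.diag(np.sqrt(pi))
S = D @ Q @ np.linalg.inv(D)
lam, U = np.linalg.eigh(S)                    # S is symmetric, U is orthogonal

t = 0.8
spectral = (np.sqrt(pi)[None, :] / np.sqrt(pi)[:, None]) \
           * ((U * np.exp(lam * t)) @ U.T)
print(np.max(np.abs(expm(t * Q) - spectral)))  # should be of order 1e-14
\end{verbatim}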
Exercise 1.8 M/M/1 queue
Consider the following model for a queue. Customers arrive according to a Poisson process with
rate $\lambda > 0$ and form a queue: there is a single server (i.e. only one customer can be served at a given
time), and the service times are independent exponential random variables with rate $\mu > 0$. We
denote by $X_t$ the number of customers in the queue at time $t$, the number $X_0$ being independent
of the arrivals and service times at positive times.
1. Compute the Q-matrix of this chain and determine the invariant measures.
2. What happens, in the asymptotic sense, when $\lambda \ge \mu$? And when $\lambda < \mu$?
3. Suppose that $\lambda < \mu$. Show that the asymptotic proportion of the time that the server is busy
equals almost surely:
\[
\lim_{T \to +\infty} \frac{1}{T} \int_0^T \mathbf{1}_{X_t > 0}\, dt = \frac{\lambda}{\mu}.
\]
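
The long-run busy fraction of question 3 can be checked by direct simulation of the queue; the sketch below uses arbitrary rates with $\lambda < \mu$ and a finite time horizon.

\begin{verbatim}
# M/M/1 queue: the fraction of time with X_t > 0 should approach lambda/mu.
import numpy as np

rng = np.random.default_rng(5)
lam, mu = 1.0, 1.6                     # arrival and service rates, lam < mu
x, t, busy, T_max = 0, 0.0, 0.0, 100_000.0
while t < T_max:
    rate = lam + mu if x > 0 else lam  # total jump rate out of the current state
    hold = rng.exponential(1.0 / rate)
    if x > 0:
        busy += hold                   # the server is busy exactly when X_t > 0
    t += hold
    if x == 0 or rng.random() < lam / (lam + mu):
        x += 1                         # arrival
    else:
        x -= 1                         # service completion
print("busy fraction ~", busy / t, " vs  lambda/mu =", lam / mu)
\end{verbatim}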
Exercise 1.9 The graphical representation without graphics
Let $S$ be a topological space equipped with its Borel $\sigma$-field and let $\Omega$ be the set of càdlàg functions
from $\mathbb{R}_+$ to $S$, equipped with the $\sigma$-field generated by the maps $\omega \mapsto \omega(t)$ for $t \ge 0$. Let $(E, \mathcal{E}, \mu)$ be
a measure space with $\mu$ finite on each $E_n$ and $\bigcup_n E_n = E$. Consider $\tilde{N}$ a Poisson random measure on
$\mathbb{R} \times E$ with intensity $\lambda \otimes \mu$, $\lambda$ being the Lebesgue measure on $\mathbb{R}$. Let $\tilde{\Omega}$ denote the set of counting
measures on $\mathbb{R} \times E$ which are finite on each set of the form $]s,t] \times E_n$. Define, for any $\tilde{\omega}$ in $\tilde{\Omega}$:
\[
\theta_s \tilde{\omega}(A) = \tilde{\omega}\bigl(\{(t,x) \in \mathbb{R} \times E : (t-s,x) \in A\}\bigr).
\]
For any $t \ge 0$, let $\varphi_t$ be a measurable function from $\tilde{\Omega} \times S$ to $S$. Suppose that:
(i) for every $\tilde{\omega} \in \tilde{\Omega}$ and $x \in S$, $s \mapsto \varphi_s(\tilde{\omega}, x)$ is càdlàg;
(ii) for all $t \ge 0$ and $x \in S$, $\varphi_t(0, x) = x$.