Martingales and martingale convergence
Let \(X\) be a martingale. Use the tower property for conditional expectation to deduce that \[\operatorname{\mathbb{E}}\left[X_{n+k}|\mathcal{F}_{n}\right]= X_n\,, \quad k=0,1,2,\dots\,.\]
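The inductive step can be sketched in one line (assuming only the defining property \(\operatorname{\mathbb{E}}\left[X_{m+1}|\mathcal{F}_m\right]=X_m\)):

```latex
\[
\operatorname{\mathbb{E}}\left[X_{n+k+1}\,\middle|\,\mathcal{F}_n\right]
= \operatorname{\mathbb{E}}\left[\operatorname{\mathbb{E}}\left[X_{n+k+1}\,\middle|\,\mathcal{F}_{n+k}\right]\,\middle|\,\mathcal{F}_n\right]
= \operatorname{\mathbb{E}}\left[X_{n+k}\,\middle|\,\mathcal{F}_n\right]\,,
\]
```

so the claim for \(k\) implies the claim for \(k+1\), the case \(k=0\) being immediate.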
Recall Thackeray’s martingale: let \(Y_1,Y_2,\dots\) be a sequence of independent and identically distributed random variables, with \(\operatorname{\mathbb{P}}\left(Y_1 = 1\right) = \operatorname{\mathbb{P}}\left(Y_1 = -1\right) = 1/2\). Define the Markov chain \(M\) by \[ M_0 = 0; \qquad M_n = \begin{cases} 1-2^{n} & \quad\text{if $Y_1=Y_2 = \dots = Y_n = -1$,} \\ 1 &\quad\text{otherwise.}\end{cases}\]
- Compute \(\operatorname{\mathbb{E}}\left[M_n\right]\) from first principles.
- What should be the value of \(\operatorname{\mathbb{E}}\left[\widetilde{M}_n\right]\) if \(\widetilde{M}\) is defined as \(M\) is, but with play stopped once the process hits level \(1-2^N\) (that is, after \(N\) consecutive losses)?
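The identity \(\operatorname{\mathbb{E}}\left[M_n\right]=0\) can be checked by simulation; the following is a minimal sketch (the function name and parameters are illustrative, not from the text):

```python
import random

def thackeray_M(n, rng):
    """M_n: equals 1 - 2^n if the first n tosses are all -1, and 1 otherwise."""
    for _ in range(n):
        if rng.random() < 0.5:   # a +1 toss ends the losing streak, so M_n = 1
            return 1
    return 1 - 2 ** n            # all n tosses were -1

rng = random.Random(0)
n, trials = 8, 200_000
mean_Mn = sum(thackeray_M(n, rng) for _ in range(trials)) / trials
# sample mean should be near E[M_n] = 2^{-n}(1 - 2^n) + (1 - 2^{-n}) = 0
```

Note the rare, very negative outcome \(1-2^n\) exactly balances the common outcome \(1\), which is why the sample mean converges slowly.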
Consider a branching process \(Y\), where \(Y_0=1\) and \(Y_{n+1}\) is the sum \(Z_{n+1,1}+\ldots+Z_{n+1,Y_n}\) of \(Y_n\) independent copies of a non-negative integer-valued family-size r.v. \(Z\).
- Suppose \(\operatorname{\mathbb{E}}\left[Z\right]=\mu<\infty\). Show that \(X_n=Y_n/\mu^n\) is a martingale.
- Show that \(Y\) is itself a supermartingale if \(\mu < 1\) and a submartingale if \(\mu > 1\).
- Suppose \(\operatorname{\mathbb{E}}\left[s^{Z}\right]=G(s)\). Let \(\eta\) be the smallest non-negative root of the equation \(G(s)=s\). Show that \(\eta^{Y_n}\) defines a martingale.
- Let \(H_n=Y_0+\ldots+Y_n\) be the total of all populations up to time \(n\). Show that \(s^{H_n}/(G(s)^{H_{n-1}})\) is a martingale.
- How should these three expressions be altered if \(Y_0 = k \geq 1\)?
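The martingale property \(\operatorname{\mathbb{E}}\left[X_n\right]=\operatorname{\mathbb{E}}\left[Y_n/\mu^n\right]=1\) can be checked numerically; here is a sketch with a hypothetical family-size law (the specific distribution is an illustrative choice, not from the text):

```python
import random

rng = random.Random(1)
# Hypothetical family-size law: Z in {0, 1, 2} with probabilities 1/4, 1/4, 1/2.
vals, probs = (0, 1, 2), (0.25, 0.25, 0.5)
mu = sum(v * p for v, p in zip(vals, probs))   # mean family size, = 1.25

def next_generation(y):
    """Y_{n+1} given Y_n = y: a sum of y independent copies of Z."""
    return sum(rng.choices(vals, probs, k=y)) if y > 0 else 0

n, trials = 6, 50_000
total = 0.0
for _ in range(trials):
    y = 1                                      # Y_0 = 1
    for _ in range(n):
        y = next_generation(y)
    total += y / mu ** n                       # X_n = Y_n / mu^n
mean_Xn = total / trials                       # should be near 1
```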
Consider asymmetric simple random walk, stopped when it first returns to \(0\). Show that this is a supermartingale if jumps have non-positive expectation, a submartingale if jumps have non-negative expectation (and therefore a martingale if jumps have zero expectation).
Consider Thackeray’s martingale based on asymmetric random walk. Show that this is a supermartingale or submartingale depending on whether jumps have negative or positive expectation.
Show, using the conditional form of Jensen’s inequality, that if \(X\) is a martingale then \(|X|\) is a submartingale.
A shuffled pack of cards contains \(b\) black and \(r\) red cards. The pack is placed face down, and cards are turned over one at a time. Let \(B_n\) denote the number of black cards left just before the \(n^{th}\) card is turned over. Let \[ Y_n = \frac{B_n}{r+b-(n-1)}\,. \] (So \(Y_n\) equals the proportion of black cards left just before the \(n^{th}\) card is revealed.) Show that \(Y\) is a martingale.
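By the martingale property, \(\operatorname{\mathbb{E}}\left[Y_n\right] = Y_1 = b/(b+r)\) for every \(n\); a quick simulation check (pack sizes here are an arbitrary illustrative choice):

```python
import random

def Y_n(b, r, n, rng):
    """Proportion of black cards among those left just before the n-th reveal."""
    deck = ['B'] * b + ['R'] * r
    rng.shuffle(deck)
    black_left = b - deck[:n - 1].count('B')   # blacks remaining after n-1 reveals
    return black_left / (r + b - (n - 1))

rng = random.Random(2)
b, r, n, trials = 5, 7, 8, 100_000
mean_Yn = sum(Y_n(b, r, n, rng) for _ in range(trials)) / trials
# should be near b/(b+r) = 5/12 for every n
```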
Suppose \(N_1, N_2, \dots\) are independent identically distributed normal random variables of mean \(0\) and variance \(\sigma^2\), and put \(S_n=N_1+\ldots+N_n\).
- Show that \(S\) is a martingale.
- Show that \(Y_n= \exp\left(S_n - \tfrac{n}{2}\sigma^2\right)\) is a martingale.
- How should these expressions be altered if \(\operatorname{\mathbb{E}}\left[N_i\right] = \mu\neq 0\)?
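The exponential martingale property \(\operatorname{\mathbb{E}}\left[Y_n\right] = Y_0 = 1\) rests on \(\operatorname{\mathbb{E}}\left[e^{N_i}\right] = e^{\sigma^2/2}\), and can be checked by simulation (the values of \(\sigma\) and \(n\) below are arbitrary):

```python
import math
import random

rng = random.Random(3)
sigma, n, trials = 0.5, 5, 200_000
total = 0.0
for _ in range(trials):
    S = sum(rng.gauss(0.0, sigma) for _ in range(n))    # S_n = N_1 + ... + N_n
    total += math.exp(S - n * sigma ** 2 / 2)           # Y_n
mean_Yn = total / trials                                # should be near 1
```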
Let \(X\) be a discrete-time Markov chain on a countable state-space \(S\) with transition probabilities \(p_{x,y}\). Let \(f: S \to \mathbb{R}\) be a bounded function. Let \(\mathcal{F}_n\) be the \(\sigma\)-algebra generated by \(X_0, X_1, \ldots, X_n\). Show that \[M_n = f(X_n) - f(X_0) - \sum_{i=0}^{n-1} \sum_{y \in S} (f(y) - f(X_i)) p_{X_i,y}\] defines a martingale. (Hint: first note that \(\operatorname{\mathbb{E}}\left[f(X_{i+1}) - f(X_i) | X_i\right] = \sum_{y \in S} (f(y) - f(X_i)) p_{X_i,y}\). Using this and the Markov property of \(X\), check that \(\operatorname{\mathbb{E}}\left[M_{n+1} - M_n | \mathcal{F}_n\right] = 0\).)
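A sketch of the construction on a small chain, checking \(\operatorname{\mathbb{E}}\left[M_n\right] = M_0 = 0\) by simulation (the chain, \(f\), and all parameters below are hypothetical choices for illustration):

```python
import random

# A hypothetical 3-state chain and bounded f.
P = {0: [(0, 0.5), (1, 0.5)],
     1: [(0, 0.25), (1, 0.25), (2, 0.5)],
     2: [(1, 0.8), (2, 0.2)]}
f = {0: 1.0, 1: -2.0, 2: 3.0}

def drift(x):
    """The compensator increment sum_y (f(y) - f(x)) p_{x,y} at state x."""
    return sum(p * (f[y] - f[x]) for y, p in P[x])

def simulate_M(n, rng):
    """M_n = f(X_n) - f(X_0) - accumulated drift, along one sampled path."""
    x0 = x = 0
    comp = 0.0
    for _ in range(n):
        comp += drift(x)
        ys, ps = zip(*P[x])
        x = rng.choices(ys, ps)[0]
    return f[x] - f[x0] - comp

rng = random.Random(4)
n, trials = 10, 100_000
mean_Mn = sum(simulate_M(n, rng) for _ in range(trials)) / trials
# should be near 0
```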
Let \(Y\) be a discrete-time birth-death process absorbed at zero: \[ p_{k,k+1} = \frac{\lambda}{\lambda+\mu},\quad p_{k,k-1} = \frac{\mu}{\lambda+\mu}\,, \qquad \text{for $k>0$, with $0<\lambda<\mu$.} \]
- Show that \(Y\) is a supermartingale.
- Let \(T=\inf\{n:Y_n=0\}\) (so \(T<\infty\) a.s.), and define \[X_n = Y_{\min\{n, T\}} + \left(\frac{\mu-\lambda}{\mu+\lambda}\right)\min\{ n, T\} \,.\] Show that \(X\) is a non-negative supermartingale, converging to \[ Z=\left(\frac{\mu-\lambda}{\mu+\lambda}\right)T \,. \]
- Deduce that \[ \operatorname{\mathbb{E}}\left[T|Y_0 = y\right] \leq \left(\frac{\mu+\lambda}{\mu-\lambda}\right)y\,. \]
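A simulation sketch of the last bound, with hypothetical rates \(\lambda = 1\), \(\mu = 2\). For this chain the mean absorption time should in fact sit at the bound \(\left(\tfrac{\mu+\lambda}{\mu-\lambda}\right)y\), since away from \(0\) the process is exactly an asymmetric walk with downward drift \(\tfrac{\mu-\lambda}{\mu+\lambda}\) per step:

```python
import random

rng = random.Random(5)
lam, mu = 1.0, 2.0                    # hypothetical rates with 0 < lambda < mu
p_up = lam / (lam + mu)               # = 1/3

def absorption_time(y0):
    """Number of steps until the chain started at y0 > 0 first hits 0."""
    y, t = y0, 0
    while y > 0:
        y += 1 if rng.random() < p_up else -1
        t += 1
    return t

y0, trials = 3, 20_000
mean_T = sum(absorption_time(y0) for _ in range(trials)) / trials
bound = (mu + lam) / (mu - lam) * y0  # = 9.0; mean_T should be near this
```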
Let \(L(\theta; X_1, X_2, \ldots, X_n)\) be the likelihood of parameter \(\theta\) given a sample of independent and identically distributed random variables, \(X_1, X_2, \ldots, X_n\).
- Check that if the “true” value of \(\theta\) is \(\theta_0\) then the likelihood ratio \[M_n = \frac{L(\theta_1; X_1, X_2, \ldots, X_n)}{L(\theta_0; X_1, X_2, \ldots, X_n)}\] defines a martingale with \(\operatorname{\mathbb{E}}\left[M_n\right] = 1\) for all \(n \ge 1\).
- Using the strong law of large numbers and Jensen’s inequality, show that \[\frac{1}{n} \log M_n \to -c \quad \text{almost surely as $n \to \infty$,}\] for some constant \(c \geq 0\). (Identify \(c\) as the Kullback–Leibler divergence between the sampling distributions under \(\theta_0\) and \(\theta_1\).)
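In a concrete Gaussian case (a hypothetical choice, not from the text: \(X_i \sim N(\theta, 1)\)), the limit constant is the Kullback–Leibler divergence \(c = (\theta_1-\theta_0)^2/2\), and the convergence of \(\frac{1}{n}\log M_n\) to \(-c\) is easy to observe:

```python
import random

# Hypothetical concrete case: X_i ~ N(theta, 1); true theta_0 = 0, alternative theta_1 = 1.
rng = random.Random(6)
theta0, theta1, n = 0.0, 1.0, 2_000
xs = [rng.gauss(theta0, 1.0) for _ in range(n)]

# log-likelihood ratio of N(theta1, 1) against N(theta0, 1) densities
logM = sum((theta1 - theta0) * x - (theta1 ** 2 - theta0 ** 2) / 2 for x in xs)
c = (theta1 - theta0) ** 2 / 2   # KL divergence KL(N(theta0,1) || N(theta1,1)) = 1/2
rate = logM / n                  # should be near -c
```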
Let \(X\) be a simple symmetric random walk absorbed at boundaries \(a<b\).
- Show that \[ f(x)=\frac{x-a}{b-a} \qquad x\in[a,b]\] is a bounded harmonic function.
- Use the martingale convergence theorem and optional stopping theorem to show that \[f(x) = \operatorname{\mathbb{P}}\left(X \text{ hits }b\text{ before }a|X_0=x\right)\,.\]
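The gambler's-ruin identity \(f(x) = \operatorname{\mathbb{P}}\left(X \text{ hits } b \text{ before } a \,|\, X_0 = x\right)\) can be checked by simulation (boundaries and starting point below are an arbitrary illustrative choice):

```python
import random

rng = random.Random(7)
a, b = 0, 10                     # hypothetical absorbing boundaries

def hits_b_first(x):
    """Simple symmetric walk from x; True if it reaches b before a."""
    while a < x < b:
        x += 1 if rng.random() < 0.5 else -1
    return x == b

x0, trials = 3, 50_000
p_hat = sum(hits_b_first(x0) for _ in range(trials)) / trials
f_x0 = (x0 - a) / (b - a)        # = 0.3; p_hat should be near this
```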