4 Solutions for Chapter 4

Exercise 4.1 See HW solution.

Exercise 4.2 Write \(M_{n+1} = M_n R\) where \(R = \frac{\e^{\theta X_{n+1}}}{m(\theta)}\). The random variable \(R\) is independent of \(\mathcal{F}_n\), and \(M_n\) is \(\mathcal{F}_n\)-measurable. Also, \(\mathbb{E}[R] = 1\), since the law of \(X_{n+1}\) is the same as that of \(X_1\) and \(m(\theta) = \mathbb{E}[\e^{\theta X_1}]\). So, \[ \mathbb{E}[M_{n+1} \mid \mathcal{F}_n] = M_n \mathbb{E}[ R \mid \mathcal{F}_n] = M_n \mathbb{E}[R] = M_n \] (taking out what is known, then using independence).
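As a numerical sanity check of the key step \(\mathbb{E}[R]=1\), the Python sketch below makes the illustrative assumption that the \(X_i\) are standard normal, so that \(m(\theta) = \e^{\theta^2/2}\), and estimates \(\mathbb{E}[\e^{\theta X}/m(\theta)]\) by Monte Carlo (the function name is mine):

```python
import math
import random

def check_unit_mean(theta: float, n_samples: int = 200_000, seed: int = 0) -> float:
    """Monte Carlo estimate of E[e^{theta X} / m(theta)] when X ~ N(0,1),
    whose moment generating function is m(theta) = e^{theta^2 / 2}.
    (Illustrative choice of law; any distribution with finite m(theta) works.)"""
    rng = random.Random(seed)
    m_theta = math.exp(theta ** 2 / 2)  # MGF of a standard normal at theta
    total = sum(math.exp(theta * rng.gauss(0.0, 1.0)) / m_theta
                for _ in range(n_samples))
    return total / n_samples

est = check_unit_mean(0.5)  # should be close to 1
```

Any other distribution with a finite moment generating function at \(\theta\) would serve equally well; only the formula for \(m(\theta)\) changes.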

Exercise 4.3 We have

  1. It should be clear that \(X_t\) only depends on the first \(t\) coin tosses. Let \(\tau\) be the first time that a coin toss is a head, i.e., \(\tau := \min \{ k \in \N : \omega_k = \h \}\). Now, either \(\tau \leq t\), which means that by time \(t\) we have stopped betting and \(X_t = X_\tau = -\sum_{i=1}^{\tau-1} 2^{i-1} + 2^{\tau-1} = -(2^{\tau -1}-1) + 2^{\tau-1} = 1\) (because \(\omega_1=\dots= \omega_{\tau-1} = \T\) and \(\omega_\tau = \h\)); or \(\tau > t\), which means that \(\omega_1 = \dots = \omega_t = \T\) and \(X_t = -\sum_{i=1}^t 2^{i-1} = -(2^t -1) = 1-2^t\).

  2. We need to show that \(\mathbb{E}[X_t \mid \cF_{t-1}] = X_{t-1}\) for all \(t \geq 1\), assuming the coin tosses are all fair (heads and tails equally likely). For \(t=1\), it is easy to see that \(\mathbb{E}[X_1\mid \cF_0 ] = \mathbb{E}[X_1] = \frac12 X_1(\h) + \frac12 X_1(\T) = 0 = X_0\). For \(t>1\), the conditional expectation \(\mathbb{E}[X_t \mid \cF_{t-1}]\) is a random variable that depends on the first \(t-1\) coins, given by \[\begin{split} \mathbb{E}[X_t \mid \cF_{t-1}](\omega_1\dots\omega_{t-1}) &= \frac12 X_t(\omega_1\dots \omega_{t-1}\h) + \frac12 X_t(\omega_1\dots\omega_{t-1}\T)\\ &= \begin{cases} \frac12 \cdot 1 + \frac12 (1-2^t) = 1-2^{t-1} & \text{if $\omega_1 = \dots = \omega_{t-1} = \T$},\\ \frac12 \cdot 1 + \frac12 \cdot 1 = 1 & \text{otherwise}, \end{cases} \end{split} \] which is identical to \(X_{t-1}(\omega_1\dots\omega_{t-1})\).

(You might have seen this martingale before if you’ve taken Stochastic Processes, as an example of a martingale for which the Optional Stopping Theorem doesn’t apply: since \(X_\tau = 1\) with probability 1, and \(X_0 = 0\), we have \(\mathbb{E}[X_\tau] \neq \mathbb{E}[X_0]\).

Although this strategy appears to achieve guaranteed winnings of \(X_\tau = 1\), its success relies both on there being no upper limit on the size of the bets that can be made, and on the gambler having an arbitrary amount of money with which to continue betting until the first head occurs; typically neither of these is true in practice!

Indeed, it’s possible to show that the expected exposure when employing this strategy (i.e., the expected maximum amount owed by the gambler before the first head occurs) is infinite: since \(X_{\tau-1} = 1-2^{\tau-1}\) and \(\mathbb{P}(\tau = k) = 2^{-k}\), we have \(\mathbb{E}[ \displaystyle\min_{ t < \tau }X_t] = \mathbb{E}[X_{\tau-1}] = 1 - \sum_{k \geq 1} 2^{k-1} \cdot 2^{-k} = -\infty\).)
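The behaviour described above is easy to see in simulation. The Python sketch below (a toy illustration; the function name is mine) plays the doubling strategy repeatedly, confirming that the final wealth is always \(X_\tau = 1\) while the worst interim wealth \(1-2^{\tau-1}\) can be very large in magnitude:

```python
import random

def doubling_strategy(rng: random.Random) -> tuple[int, int]:
    """Bet 2^(k-1) on the k-th fair coin toss until the first head.
    Returns (final wealth, minimum wealth reached before stopping)."""
    wealth, bet, worst = 0, 1, 0
    while True:
        if rng.random() < 0.5:   # head: win the current bet and stop
            return wealth + bet, worst
        wealth -= bet            # tail: lose the bet, so X_t = 1 - 2^t
        worst = min(worst, wealth)
        bet *= 2                 # double the stake for the next toss

rng = random.Random(2024)
results = [doubling_strategy(rng) for _ in range(10_000)]
final_wealths = [w for w, _ in results]
worst_wealths = [m for _, m in results]
```

Every run ends with wealth exactly 1, but the sample minimum of the interim wealths keeps growing in magnitude as more runs are performed, reflecting \(\mathbb{E}[X_{\tau-1}] = -\infty\).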

Exercise 4.4 We have

  1. From the representation of \(V_t-V_0\), we find that \[V_t - V_{t-1} = (V_t - V_0) - (V_{t-1} - V_0) = x_t(B_t - B_{t-1}) + y_t(S_t - S_{t-1}).\] Since \(V_{t-1} = x_t B_{t-1} + y_t S_{t-1}\) by definition, we see that the previous equation becomes \(V_t - V_{t-1} = x_t B_t + y_t S_t - V_{t-1}\). So \(V_t = x_t B_t + y_t S_t\), as required.

  2. The condition that \(V_t/(1+r)^t\) is a martingale under \(\mathbb{Q}\) is that \[\mathbb{E}_{\mathbb{Q}} [ V_t \mid \mathcal{F}_{t-1}] = (1+r)V_{t-1}.\] This is the same as \(\mathbb{E}_{\mathbb{Q}} [ V_t -V_{t-1} \mid \mathcal{F}_{t-1}] = rV_{t-1}\) because \(V_{t-1}\) is measurable w.r.t. \(\mathcal{F}_{t-1}\) (taking out what is known). Recall that \(x_t\) and \(y_t\) are measurable w.r.t.~\(\mathcal{F}_{t-1}\). Also, \(\mathbb{E}_{\mathbb{Q}} [ S_t \mid \mathcal{F}_{t-1}] = (1+r) S_{t-1}\) by Theorem 4.1 in the lecture notes and \(B_t = (1+r)B_{t-1}\). From equation (1) we find that \(V_t - V_{t-1} = x_t (B_t-B_{t-1}) + y_t (S_t-S_{t-1})\). So, \[\begin{align*} \mathbb{E}_{\mathbb{Q}} [ V_t -V_{t-1} \mid \mathcal{F}_{t-1}] &= \mathbb{E}_{\mathbb{Q}} [ x_t (B_t-B_{t-1}) + y_t (S_t-S_{t-1}) \mid \mathcal{F}_{t-1}] \\ (\text{since}\; B_t = (1+r) B_{t-1}) &= r \mathbb{E}_{\mathbb{Q}}[x_t \mid \mathcal{F}_{t-1}] B_{t-1} + \mathbb{E}_{\mathbb{Q}} [ y_t (S_t - S_{t-1}) \mid \mathcal{F}_{t-1}] \\ (\text{taking out what is known}) &= r x_t B_{t-1} + y_t \left ( \mathbb{E}_{\mathbb{Q}}[S_{t} \mid \mathcal{F}_{t-1}] - S_{t-1}\right )\\ (\text{by Theorem 4.1}) & = r (x_t B_{t-1} + y_t S_{t-1}) \\ & = r V_{t-1}. \end{align*}\]
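The algebra above can be checked numerically in a one-period binomial model. In the Python sketch below, the parameters \(u, d, r\) and the holdings \(x, y\) are hypothetical choices of mine; under the risk-neutral up-probability \(q = ((1+r)-d)/(u-d)\), the wealth of any portfolio chosen at time 0 satisfies \(\mathbb{E}_{\mathbb{Q}}[V_1 \mid \mathcal{F}_0] = (1+r)V_0\):

```python
def discounted_wealth_is_martingale() -> tuple[float, float]:
    """One-period binomial check (hypothetical parameters): under the
    risk-neutral probability q = ((1+r) - d)/(u - d), any portfolio (x, y)
    chosen at time 0 satisfies E_Q[V_1] = (1+r) V_0."""
    u, d, r = 1.2, 0.9, 0.05
    s0, b0 = 100.0, 1.0
    q = ((1 + r) - d) / (u - d)        # risk-neutral up-probability
    x, y = 3.0, -2.0                    # arbitrary holdings fixed at time 0
    v0 = x * b0 + y * s0
    v1_up = x * b0 * (1 + r) + y * s0 * u   # wealth after an up-move
    v1_dn = x * b0 * (1 + r) + y * s0 * d   # wealth after a down-move
    ev1 = q * v1_up + (1 - q) * v1_dn       # E_Q[V_1 | F_0]
    return ev1, (1 + r) * v0

ev1, target = discounted_wealth_is_martingale()
```

The equality holds because \(q u + (1-q) d = 1+r\) by construction of \(q\), which is exactly the one-step content of \(\mathbb{E}_{\mathbb{Q}}[S_t \mid \mathcal{F}_{t-1}] = (1+r)S_{t-1}\).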

Exercise 4.5 Recall that \(M_t = V_t/(1+r)^t\) is a martingale under \(\mathbb{Q}\), and therefore \[ \mathbb{E}_{\mathbb{Q}}[M_n \mid \mathcal{F}_m] = M_m\] for all \(m < n\), where \(\mathcal{F}_n\) is the sigma-algebra generated by \(S_0, \ldots, S_n\). (This can be seen by applying the iterated conditioning property to the martingale condition.) Taking \(n = T\) and \(m=t\), and clearing denominators, the identity in (a) follows. For the identity in (b), take \(n=t\) and \(m=0\).
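As an illustration of identity (b), the Python sketch below prices a European claim in a \(T\)-step binomial model (all parameters are hypothetical choices of mine) by computing \(V_0 = \mathbb{E}_{\mathbb{Q}}[V_T]/(1+r)^T\) directly:

```python
from math import comb

def price_by_risk_neutral_expectation(payoff, s0, u, d, r, T):
    """Price a European claim in a T-step binomial model (hypothetical
    parameters) via V_0 = E_Q[V_T] / (1+r)^T, where under Q each up-move
    has probability q = ((1+r) - d)/(u - d) independently."""
    q = ((1 + r) - d) / (u - d)
    # Sum over the number of up-moves k; S_T = s0 * u^k * d^(T-k).
    expectation = sum(comb(T, k) * q**k * (1 - q)**(T - k)
                      * payoff(s0 * u**k * d**(T - k))
                      for k in range(T + 1))
    return expectation / (1 + r) ** T

# Example: a 2-step call with strike 100
call = price_by_risk_neutral_expectation(lambda s: max(s - 100.0, 0.0),
                                         s0=100.0, u=1.2, d=0.9, r=0.05, T=2)
```

With these parameters \(q = 1/2\), so the expectation is \(\frac14 \cdot 44 + \frac12 \cdot 8 + \frac14 \cdot 0 = 15\) and the price is \(15/1.05^2 \approx 13.61\).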