8  Questions for Chapter 8

8.1 Warm-up

Exercise 8.1 Suppose \(X\colon\Omega\to\{1,2,3\}\) is a discrete random variable with \(p(x)=x/6\) for \(x\in\{1, 2, 3\}\). Find \(\expec{X}\).

Exercise 8.2 Find \(\expec{X}\), when \(X\) is continuously distributed with \[f(x) = \begin{cases} 3x^2 & \text{if }x\in[0,1], \\ 0 & \text{elsewhere}. \end{cases}\]
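
If you want a numerical sanity check on your answer, one option is inverse-CDF sampling: here \(F(x)=x^3\) on \([0,1]\), so \(X = U^{1/3}\) with \(U\sim\text{U}(0,1)\) has the density above. A minimal sketch, assuming numpy is available:

```python
import numpy as np

# Monte Carlo check for Exercise 8.2 via inverse-CDF sampling:
# F(x) = x^3 on [0, 1], so X = U^(1/3) with U ~ U(0, 1) has density 3x^2.
rng = np.random.default_rng(seed=0)
x = rng.uniform(size=10**6) ** (1 / 3)
print(x.mean())  # compare with your E[X]
```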

Exercise 8.3 Let \(X\sim\text{Po}(\lambda)\). Show that \(\expec{X}=\lambda\).

Hint: Use the fact that \(\sum_{x=0}^\infty \lambda^x/x! = e^\lambda\).

Exercise 8.4 Let \(X\sim\mathcal{E}(\beta)\). Show that \(\expec{X}=1/\beta\).

Hint: Integrate by parts.
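
The results of Exercises 8.3 and 8.4 can be sanity-checked (not proved!) by simulation. A minimal sketch, assuming numpy; the values of \(\lambda\) and \(\beta\) are arbitrary illustrations, and note that numpy parametrises the exponential by the scale \(1/\beta\):

```python
import numpy as np

# Monte Carlo sanity check for Exercises 8.3 and 8.4 (not a proof!).
rng = np.random.default_rng(seed=1)
lam, beta, n = 3.0, 2.0, 10**6

x_po = rng.poisson(lam, size=n)            # X ~ Po(lam)
x_exp = rng.exponential(1 / beta, size=n)  # X ~ E(beta); numpy takes scale 1/beta

print(x_po.mean(), "should be close to", lam)        # E[X] = lambda
print(x_exp.mean(), "should be close to", 1 / beta)  # E[X] = 1/beta
```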

Exercise 8.5 Continuing from Exercise 8.1, find \(\expec{X^2}\) and \(\expec{1/X}\).

Exercise 8.6 Continuing from Exercise 8.2, find \(\expec{X^2}\) and \(\expec{1/X}\).

Exercise 8.7 Let \(X \sim \text{U}(0, 2\pi)\). Find \(\expec{\sin X}\).

Exercise 8.8 Let \(X \sim \text{U}(0, \pi)\). Find \(\expec{\sin X}\).
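
For Exercises 8.7 and 8.8, a quick Monte Carlo estimate is a useful check on the sign and size of your answers. A sketch, assuming numpy:

```python
import numpy as np

# Numerical check for Exercises 8.7 and 8.8 (illustrative only).
rng = np.random.default_rng(seed=2)
n = 10**6

u_full = rng.uniform(0, 2 * np.pi, size=n)  # X ~ U(0, 2*pi)
u_half = rng.uniform(0, np.pi, size=n)      # X ~ U(0, pi)

print(np.sin(u_full).mean())  # compare with E[sin X] for X ~ U(0, 2*pi)
print(np.sin(u_half).mean())  # compare with E[sin X] for X ~ U(0, pi)
```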

Exercise 8.9 Let \(X\sim\mathcal{E}(\beta)\). Show that \(\expec{X^2}=2/\beta^2\).

Hint: Integrate by parts.

Exercise 8.10 Let \(Z\sim\mathcal{N}(0,1)\). Show that \(\expec{Z^2}=1\).

Hint: You guessed it: integrate by parts.

Exercise 8.11 Assume \(X\) and \(Y\) have joint probability mass function as in Exercise 7.1:

\(p(x,y)\) \(y=-1\) \(y=0\) \(y=1\) \(y=2\)
\(x=0\) \(1/12\) \(1/4\) \(0\) \(0\)
\(x=1\) \(1/12\) \(1/12\) \(1/12\) \(1/12\)
\(x=3\) \(0\) \(0\) \(1/4\) \(1/12\)

Find \(\expec{X}\), \(\expec{Y}\), \(\expec{S}\), \(\expec{D}\), and \(\expec{M}\), where \(S = X + Y\), \(D = X - Y\), and \(M = \max(X, Y)\).
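
For finite joint pmf tables like this one, the pattern \(\expec{g(X,Y)} = \sum_{x,y} g(x,y)\, p(x,y)\) is easy to implement exactly, and you can use it to check your answers. A sketch in plain Python (the fractions module keeps the arithmetic exact); only \(\expec{M}\) is shown, since the other expectations work the same way:

```python
from fractions import Fraction as F

# Exact expectation of a function of (X, Y) over a finite joint pmf table,
# here the table from Exercise 8.11; use it to check your answers.
pmf = {
    (0, -1): F(1, 12), (0, 0): F(1, 4),  (0, 1): F(0),     (0, 2): F(0),
    (1, -1): F(1, 12), (1, 0): F(1, 12), (1, 1): F(1, 12), (1, 2): F(1, 12),
    (3, -1): F(0),     (3, 0): F(0),     (3, 1): F(1, 4),  (3, 2): F(1, 12),
}

def expec(g):
    """E[g(X, Y)] = sum over the table of g(x, y) * p(x, y)."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

assert sum(pmf.values()) == 1         # the pmf sums to 1
print(expec(lambda x, y: max(x, y)))  # E[M], M = max(X, Y)
```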

Exercise 8.12 Let \(X\sim\text{Bin}(1,q)\), i.e. \(p_X(1)=q\) and \(p_X(0)= 1 - q\), where \(0<q<1\). Show that \(\expec{X}=q\) and \(\var{X}=q(1-q)\).

Hint: use the “Variance and covariance of linear combinations” corollary.

Exercise 8.13 Prove the “Variance and covariance of linear combinations” corollary in Section 8.4 using the linearity of expectation, that is, you may use the results proven in the two “Linearity of expectation” theorems in Section 8.3.

Exercise 8.14 Continuing from Exercise 8.12, let \(Y\) be a discrete random variable taking two distinct values \(s, t\in\mathbb{R}\), where \(p_Y(t)=p\) and \(p_Y(s)= 1 - p\) for \(0<p<1\). Express \(Y\) as a linear function of \(X\sim\text{Bin}(1,p)\), and find \(\expec{Y}\) and \(\var{Y}\).

Exercise 8.15 Let \(X\sim\mathcal{E}(\beta)\). In Exercise 8.4 and Exercise 8.9, we showed that \(\expec{X}=1/\beta\) and \(\expec{X^2}=2/\beta^2\). Use this, along with linearity of expectation, to show that \(\var{X}=1/\beta^2\).

Exercise 8.16 Let \(X\) and \(Y\) be distributed as in Exercise 8.11. You already calculated \(\expec{X}\) and \(\expec{Y}\) in that exercise. Now find \(\expec{XY}\) and \(\cov{X,Y}\).

Exercise 8.17 Toss a fair coin twice, and let \(X\) be the number of heads on the first toss, and \(Y\) be the total number of heads on both tosses. Find \(\cov{X,Y}\).

Exercise 8.18 Suppose \(X \sim \text{U}(0,2)\) and \(Y= X^2\). Find \(\cov{X,Y}\).

Exercise 8.19 A fair coin is tossed three times. Suppose \(X\) is the number of heads in the first two tosses and \(Y\) is the number of tails in the last two tosses. Guess whether \(\cov{X,Y}\) is positive or negative and evaluate \(\cov{X,Y}\) to check.

Exercise 8.20 Consider real-valued random variables \(X\) and \(Y\) such that \(\sd{X} = \sd{Y} = 1\). Use the facts that \(\var{X + Y} \geq 0\) and \(\var{X - Y} \geq 0\) to show that, in this case: \[-1 \leq \cov{X, Y} \leq 1.\]

Exercise 8.21 Using the result established in Exercise 8.20, show that for any real-valued random variables \(X\) and \(Y\) (i.e. with arbitrary standard deviations): \[-\sd{X} \sd{Y} \leq \cov{X, Y} \leq \sd{X} \sd{Y}.\]

Hint: Use the “Variance and covariance of linear combinations” corollary, with \(\alpha=\gamma=0\), \(\beta=1/\sd{X}\), and \(\delta=1/\sd{Y}\).

Exercise 8.22 Suppose \(U\sim\text{U}(0,1)\) and let \(X = \cos 2\pi U\) and \(Y=\sin 2\pi U\). Calculate \(\expec{X}\), \(\expec{Y}\), \(\var{X}\), \(\var{Y}\), and \(\cov{X,Y}\). Are \(X\) and \(Y\) independent?
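
Simulation cannot settle the independence question, but it can estimate all five moments requested in Exercise 8.22 and hint at the relationship between \(X\) and \(Y\). A sketch, assuming numpy:

```python
import numpy as np

# Simulation for Exercise 8.22: estimate the moments of X = cos(2*pi*U),
# Y = sin(2*pi*U). Illustrative only; it cannot settle independence by itself.
rng = np.random.default_rng(seed=3)
u = rng.uniform(0, 1, size=10**6)
x, y = np.cos(2 * np.pi * u), np.sin(2 * np.pi * u)

print(x.mean(), y.mean())                    # estimates of E[X], E[Y]
print(x.var(), y.var())                      # estimates of Var X, Var Y
print(np.mean(x * y) - x.mean() * y.mean())  # estimate of Cov(X, Y)
print(np.allclose(x**2 + y**2, 1))           # X^2 + Y^2 = 1: a hint about dependence
```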

Exercise 8.23 Let \(X\) and \(Y\) be independent discrete random variables. Show that \(\cexpec{X}{Y = y} = \expec{X}\) for all \(y\in Y(\Omega)\) such that \(\pr{Y=y}>0\).

Exercise 8.24 The random variables \(X\) and \(Y\) have joint probability mass function given by:

\(p(x,y)\) \(y=-2\) \(y=0\) \(y=1\) \(y=3\)
\(x=0\) \(1/16\) \(1/8\) \(0\) \(0\)
\(x=2\) \(1/16\) \(1/8\) \(1/8\) \(1/8\)
\(x=5\) \(0\) \(0\) \(1/8\) \(1/4\)
  1. Find \(\cexpec{Y}{X = x}\) for \(x\in\{0,2,5\}\).

  2. Confirm that \(\expec{\expec{Y\mid X}} = \expec{Y}\).

Exercise 8.25 From previous experience, it is judged that student scores on a particular test have expected value 75 and variance \(25\). Let \(X\) denote the score of a randomly selected individual.

  1. Use Markov’s inequality to get an upper bound for \(\pr{X \ge 85}\).

  2. Use Chebyshev’s inequality to get a lower bound for \(\pr{65<X<85}\).

Let \(\bar{X}\) be the average of a sample of \(n\) independent student test scores.

  3. Find \(\expec{\bar{X}}\) and \(\var{\bar{X}}\).

  4. Find the smallest sample size \(n\) such that Chebyshev's inequality guarantees \(\pr{65<\bar{X}<85}\ge 0.99\).
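
Once you have \(\var{\bar{X}}\) from part 3, part 4 reduces to solving an inequality in \(n\). The following plain-Python sketch (the constants 25 and 10 come from the exercise, and it assumes you found \(\var{\bar{X}} = 25/n\)) searches for the smallest \(n\), as a check on your algebra:

```python
# A check (not a derivation) for part 4, assuming part 3 gives Var(Xbar) = 25/n.
def chebyshev_bound(n, var_single=25.0, eps=10.0):
    """Chebyshev upper bound on P(|Xbar - 75| >= eps) when Var(Xbar) = var_single/n."""
    return var_single / (n * eps**2)

n = 1
while 1 - chebyshev_bound(n) < 0.99:  # need P(65 < Xbar < 85) >= 0.99
    n += 1
print(n)  # smallest n for which the Chebyshev bound suffices
```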

8.2 Work-out

Exercise 8.26 Let \(X\sim\text{Geom}(q)\) for \(0<q \leq 1\). Show, by direct calculation, that \(\expec{X}=1/q\).

Hint: If \(0<r<1\) then \(\sum_{x=0}^{\infty} r^x=\frac{1}{1-r}\), and \(\sum_{x=1}^{\infty} x r^{x-1}=\frac{\mathrm{d}}{\mathrm{d} r} \sum_{x=0}^{\infty} r^x\).

Exercise 8.27 A card is drawn at random from a well-shuffled pack, and \(X\) denotes the denomination of the card, i.e. the number on the card (assume it takes the value 1 for an ace and the value 10 for a face card). Find \(\expec{X}\).

Exercise 8.28 You repeatedly play a game with three possible outcomes: win (W), lose (L), or draw (D). Suppose that on each play, \(\pr{W} = p_w\), \(\pr{L} = p_\ell\), \(\pr{D} = p_d\), where \(p_w + p_\ell + p_d = 1\). If a result of W occurs before a result of L, the return is \(Z = 1\) pound; otherwise, the return is \(Z= 0\).

  • Show that \(\pr{Z = 1} = \frac{p_w}{p_w + p_{\ell}}\).

    Hint: This can be done by summing a series, but the neatest method uses the partition theorem, where the partition is the outcome of the first play.

  • Let \(N\) be the number of plays up to and including the first decisive result (W or L). Find \(\expec{N}\).

  • Are the random variables \(N\) and \(Z\) independent? Explain.
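
A simulation is a good way to check your formulas for this exercise. The following sketch assumes numpy, with arbitrary illustrative values of \(p_w\), \(p_\ell\), and \(p_d\):

```python
import numpy as np

# Simulation for Exercise 8.28; p_w, p_l are arbitrary illustrative values.
rng = np.random.default_rng(seed=4)
p_w, p_l = 0.2, 0.3  # so p_d = 0.5
trials = 10**5

z_vals, n_vals = [], []
for _ in range(trials):
    n = 0
    while True:
        n += 1
        r = rng.random()
        if r < p_w:        # win: decisive, Z = 1
            z_vals.append(1)
            break
        if r < p_w + p_l:  # lose: decisive, Z = 0
            z_vals.append(0)
            break
        # draw: play again
    n_vals.append(n)

print(np.mean(z_vals), "vs", p_w / (p_w + p_l))  # P(Z = 1), as in the first part
print(np.mean(n_vals))                           # compare with your E[N]
```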

Exercise 8.29 Let \(t\in\mathbb{R}\) and \(Z\sim\mathcal{N}(0,1)\). Show that \(\expec {e^{tZ} } = e^{t^2/2}\).

Hint: Complete the square: \(tz-z^2/2=-\bigl((z - t)^2 -t^2\bigr)/2\).

Exercise 8.30 Continuing from Exercise 8.27, find \(\expec{|X-6|}\).

Exercise 8.31 Suppose \(X\sim\text{Bin}(4,1/2)\). Find \(\expec{X}\), \(\expec{X^2}\), and \(\expec{(X-2)^2}\).

Exercise 8.32 Let \(n\in\mathbb{N}\) and \(X\sim\text{Po}(\lambda)\). Show that \(\expec{X(X-1)(X-2) \cdots (X-n+1)}=\lambda^n\) (this is the \(n\)-th factorial moment of \(X\)).

Hint: remember that \(\sum_{x=0}^{\infty} \lambda^x/x! = e^\lambda\).

Exercise 8.33 Let \(X\sim\text{Geom}(q)\) for \(0 < q \leq 1\). Show that \(\expec{X(X-1)}=2(1-q)/q^2\).

Hint: If \(0<r<1\) then \(\sum_{x=0}^{\infty} r^x=\frac{1}{1-r}\), and \(\sum_{x=2}^{\infty} x(x-1)r^{x-2}=\frac{d^2}{dr^2} \sum_{x=0}^{\infty} r^x\).

Exercise 8.34 A house suffers storm damage costing \(X \sim \mathcal{E}(\beta)\). The insurance policy pays the cost in excess of a fixed amount \(a > 0\), i.e. it pays \(Y:=\max (X - a, 0)\). Find \(\expec{Y}\). You may use the fact that \(\expec{X}=1/\beta\) (as shown in Exercise 8.4).

Exercise 8.35 Prove the “Expectation of a function of a multivariate random variable” theorem (discrete case), along the same lines as the proof we saw for the univariate version.

Exercise 8.36 Let \(X\) be a real-valued random variable and \(a \in \mathbb{R}\) a constant. Consider the quantity \(d(a) := \expec{ (X -a )^2 }\). Prove that \(d(a)\) has a unique minimum \(d(a_0) = d_0\), where you should determine \(a_0\) and \(d_0\).

Exercise 8.37 Let \(X\sim\text{Po}(\lambda)\). In Exercise 8.3 and Exercise 8.32, we showed that \(\expec{X}=\lambda\) and \(\expec{X(X-1)}=\lambda^2\). Use this, along with linearity of expectation, to show that \(\var{X}=\lambda\).

Exercise 8.38 Let \(X\sim\text{Geom}(q)\) for \(0<q\leq 1\). In Exercise 8.26 and Exercise 8.33, we showed that \(\expec{X}=1/q\) and \(\expec{X(X-1)}=2(1-q)/q^2\). Use this, along with linearity of expectation, to show that \(\var{X}=(1 - q)/q^2\).

Exercise 8.39 A bag contains 4 counters: 2 are worth £1, 1 is worth £2, and 1 is worth £4.

  1. You randomly select a counter, giving gain £\(X\). Find \(\expec{X}\) and \(\var{X}\).

  2. You place the first counter in your pocket, without looking at it. You then draw a second counter from the bag at random. This counter gives gain £\(Y\). Without further calculations, write down (with explanation!) \(\expec{Y}\) and \(\var{Y}\).

  3. Let \(T\) be the total value of the two counters drawn. Find \(\expec{T}\).

  4. Tabulate \(p_{X,Y}\) and verify that \(p_X = p_Y\).

  5. Using that table, calculate \(\expec{(X-1)(Y-1)}\).

  6. Using linearity, write \(\expec{XY}\) as a function of \(\expec{X}\), \(\expec{Y}\), and \(\expec{(X-1)(Y-1)}\).

  7. Calculate \(\cov{X,Y}\). Hint: Use the previous result.
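
Several parts of Exercise 8.39 can be checked at once by simulating the two draws without replacement. A sketch, assuming numpy:

```python
import numpy as np

# Simulation for Exercise 8.39: draw two counters without replacement from
# {1, 1, 2, 4} and estimate the quantities in the exercise (a check only).
rng = np.random.default_rng(seed=5)
counters = np.array([1, 1, 2, 4])
trials = 10**5

draws = np.array([rng.permutation(counters)[:2] for _ in range(trials)])
x, y = draws[:, 0], draws[:, 1]

print(x.mean(), x.var())                     # estimates of E[X], Var X
print(y.mean(), y.var())                     # should match part 2
print((x + y).mean())                        # estimate of E[T]
print(np.mean(x * y) - x.mean() * y.mean())  # estimate of Cov(X, Y)
```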

Exercise 8.40 Suppose you have two fair 6-sided dice, one red and the other blue. Let \(X\) be the score on the red die and \(Y\) be the score on the blue die. Compute:

  • \(\cexpec{X}{X \text{ is even}}\);

  • \(\cexpec{X}{X \text{ is odd}}\);

  • \(\cexpec{X+Y}{X+Y \text{ is even}}\);

  • \(\cexpec{X+Y}{X+Y \text{ is odd}}\).
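
Because the sample space here has only 36 equally likely outcomes, the conditional expectations can be enumerated exactly, which makes a clean check on hand calculations. A plain-Python sketch using the fractions module:

```python
from fractions import Fraction as F
from itertools import product

# Exact conditional expectations for Exercise 8.40 by enumerating all 36
# equally likely outcomes (a check on your hand calculations).
outcomes = list(product(range(1, 7), repeat=2))  # (red, blue)

def cond_expec(value, event):
    """E[value | event] over equally likely outcomes."""
    hits = [w for w in outcomes if event(w)]
    return F(sum(value(w) for w in hits), len(hits))

print(cond_expec(lambda w: w[0], lambda w: w[0] % 2 == 0))      # E[X | X even]
print(cond_expec(lambda w: w[0], lambda w: w[0] % 2 == 1))      # E[X | X odd]
print(cond_expec(lambda w: sum(w), lambda w: sum(w) % 2 == 0))  # E[X+Y | X+Y even]
print(cond_expec(lambda w: sum(w), lambda w: sum(w) % 2 == 1))  # E[X+Y | X+Y odd]
```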

Exercise 8.41 Let \(X\sim\text{Geom}(q)\) for \(0<q\leq 1\). Recall that \(X\) can be interpreted as the number of trials until the first success in a sequence of independent trials, each with success probability \(q\). Use the partition theorem for expectations, with the partition generated by the outcome of the first trial, to write down an equation for \(\expec{X}\), and hence find \(\expec{X}\) as a function of \(q\). By a similar method, find \(\expec{X^2}\) and hence \(\var{X}\).

Exercise 8.42 A coin is biased with probability \(p\) of heads and \(1-p\) of tails. You toss the coin until the first head is obtained; each time the coin is tossed you also roll (independently) a standard six-sided die.

Let \(N\) be the total number of tosses and let \(X\) be the sum of the scores on all the dice rolls; so for example if the sequence of tosses was TTH and the sequence of dice rolls was 513 we would have \(N=3\) and \(X = 5+1+3=9\).

Find (a) \(\expec{N}\); and (b) \(\expec{X}\). What do you notice?

Hint: For both parts, the neatest solution goes via conditioning on the first toss; i.e., use the partition theorem for expectations, with the partition generated by the outcome of the first toss, to derive an equation for the quantity of interest.
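
The "what do you notice" step is easier to appreciate with numbers in hand. A simulation sketch, assuming numpy, with an arbitrary choice of \(p\):

```python
import numpy as np

# Simulation for Exercise 8.42 (illustrative; p is an arbitrary choice).
rng = np.random.default_rng(seed=6)
p, trials = 0.3, 10**5

n_vals, x_vals = [], []
for _ in range(trials):
    n, x = 0, 0
    while True:
        n += 1
        x += rng.integers(1, 7)  # one die roll per toss
        if rng.random() < p:     # heads: stop
            break
    n_vals.append(n)
    x_vals.append(x)

print(np.mean(n_vals))  # estimate of E[N]
print(np.mean(x_vals))  # estimate of E[X]; compare the two
```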

Exercise 8.43 Let \(X\) and \(Y\) be any random variables, and let \(g: X (\Omega) \to \mathbb{R}\) be any function.

Show that \(\cexpec{g(X)Y}{X} = g(X)\cexpec{Y}{X}\).

Exercise 8.44 The following experiment is performed: A fair coin is tossed twice, and the number \(N\) of heads is recorded; the coin is tossed \(N\) more times, so there are 2, 3 or 4 tosses in all. Let \(X\) be the total number of heads obtained, \(Y\) the total number of tails obtained and let \(Z:=\cexpec{X}{Y}\). Recall that \(\cexpec{X}{Y}\) is a random variable, i.e. a map from \(\Omega\) to \(\mathbb{R}\) such that \[\cexpec{X}{Y}(\omega) := \cexpec{X}{Y=y} \quad \text{when}~ Y(\omega)=y.\] Complete the following table.

\(\omega\) \(\pr{\omega}\) \(X(\omega)\) \(Y(\omega)\) \(Z(\omega)\)
HHHH \(1/16\) 4 0 4
HHHT
HHTH
HHTT
HTH
HTT
THH
THT
TT

Exercise 8.45 Roll a fair die repeatedly. Let \(X\) be the number of 2s before the first 6, and let \(N\) be the total number of rolls up to and including the first \(6\). So if the sequence of scores is \(42526\) then \(X=2\) and \(N=5\). By considering the conditional distribution of \(X\) given \(N=n\), write down \(\cexpec{X}{N}\), and hence compute \(\expec{X}\).

Exercise 8.46 Suppose that \(\pr{X > a} > 0\) and \(\pr{X < a} > 0\) for some constant \(a\in\mathbb{R}\). Show that \(\expec{X \mid X < a} \leq a \leq\expec{X \mid X > a}\).

8.3 Stretch

Exercise 8.47 You have won the lottery. A prize of 1 million pounds is to be shared equally by you and the other \(X\) winners that day, where \(X\sim\text{Po}(\lambda)\). Let \(Y\) be the amount you receive, in millions of pounds. Express \(Y\) as a function of \(X\) and find \(\expec{Y}\).

Hint: \(\sum_{x=0}^{\infty} \lambda^x/x! = e^\lambda\).

Exercise 8.48 A boxer has \(X\) fights during his career, where \(X\sim\text{Geom}(q)\) for \(0 < q \leq 1\). The boxer earns prize money \(K \alpha^x\) on his \(x\)-th fight, where \(K\) and \(\alpha>1\) are constants (thus earnings per fight increase as a geometric series as his career progresses). Let \(W\) be the amount he earns in his last fight.

Express \(W\) as a function of \(X\) and find \(\expec{W}\). What condition must \(\alpha\) satisfy for \(\expec{W}\) to be finite?

Hint: Remember that \[\sum_{x=0}^{\infty} r^x =\begin{cases} \frac{1}{1-r} & \text{if }0<r<1, \\ \infty & \text{if }r\ge 1. \end{cases}\]

Exercise 8.49 Let \(X\sim\text{Po}(\lambda)\).

  1. Show that for any function \(g : \mathbb{Z}_+ \to \mathbb{R}\), \(\expec{Xg(X)} = \lambda \expec{ g(X+1) }\) (this is the Chen–Stein equation for the Poisson distribution).

  2. Let \(n\) be a positive integer. Use part 1 to deduce a formula for \(\expec{X^n}\) in terms of \(\expec{X}, \expec{X^2}, \ldots, \expec{X^{n-1}}\) and \(\lambda\).

  3. Hence compute \(\expec{X^3}\).

Exercise 8.50 Let \(U_1, U_2 \sim\text{U}(0,1)\) be independent, and let \(V = | U_1 - U_2 |\). Find \(\expec{ V}\) and \(\var{ V }\). There are two ways you could do this: either by multiple integration, or by computing \(F_V(r)\) for \(r \in (0,1)\) using the partition theorem \[\pr { | U_1 - U_2 | > r } = \int_0^1 \pr { | U_1 - u | > r } \mathrm{d} u ,\] and then finding the density of \(V\).
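
Whichever route you take, a Monte Carlo estimate is a quick check on the final answers. A sketch, assuming numpy:

```python
import numpy as np

# Monte Carlo check for Exercise 8.50 (not a substitute for the exact calculation).
rng = np.random.default_rng(seed=7)
n = 10**6
u1, u2 = rng.uniform(size=n), rng.uniform(size=n)
v = np.abs(u1 - u2)

print(v.mean())  # compare with your E[V]
print(v.var())   # compare with your Var V
```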

Exercise 8.51 In Big Town, there are \(r\) people who support the ‘Really Great Party’ and \(d\) people who support the ‘Dead Good Party’. The ‘Totally Impartial Opinion Poll Company’ selects people at random and asks them who they support. (To keep things simple, we suppose that everybody supports one of the two parties and nobody tells lies.)

  1. Let \(X=1\) if the first person selected supports the Really Great Party, \(0\) if they support the Dead Good Party. Let \(Y=1\) if the second person selected supports the Really Great Party, 0 if they support the Dead Good Party. (The opinion pollsters never ask the same person twice in the same sample.) Show that \[\begin{gathered} \expec{X} = \expec{Y} = \frac{r}{r+d}, \qquad\qquad \var{X} = \var{Y} = \frac{rd}{(r+d)^2}, \\ \cov{X,Y} = -\frac{rd}{(r+d)^2(r+d-1)}. \end{gathered}\]

  2. Suppose that \(m\) different people are selected at random. Let \(Z\) be the number of supporters of the Really Great Party. Use the previous part to show that \[\begin{aligned} \expec{Z}&=\frac{mr}{r+d}, & \var{Z}&=\frac{mrd(r+d-m)}{(r+d)^2(r+d-1)}. \end{aligned}\]

  3. Let the proportion of supporters of the Really Great Party in the sample be \(S\) (i.e. \(S:= Z/m\)). Find \(\expec{S}\) and \(\var{S}\). Show that a sample of 250 people ensures that \(\var{S}\leq 1/1000\).
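
The formulas in part 2 can be checked by simulating sampling without replacement. The following sketch assumes numpy, with arbitrary illustrative values of \(r\), \(d\), and \(m\):

```python
import numpy as np

# Simulation for Exercise 8.51 with illustrative r and d: estimate E[Z] and
# Var Z for sampling without replacement, to check the formulas in part 2.
rng = np.random.default_rng(seed=8)
r, d, m, trials = 30, 70, 20, 10**5

population = np.array([1] * r + [0] * d)  # 1 = Really Great Party supporter
z = np.array([rng.choice(population, size=m, replace=False).sum()
              for _ in range(trials)])

print(z.mean(), "vs", m * r / (r + d))
print(z.var(), "vs", m * r * d * (r + d - m) / ((r + d)**2 * (r + d - 1)))
```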

Exercise 8.52 Consider the experiment described in Exercise 3.18. Let \(X\) be the total score on the red die and let \(Y\) be the total score on the blue die. Find \(\cov{X,Y}\). Comment on your answer.

Hint: You may wish to use the fact that if discrete random variables \(X\) and \(Y\) are conditionally independent given an event \(A\), then \(\cexpec{XY}{A} = \cexpec{X}{A} \cexpec{Y}{A}\).

Exercise 8.53 Suppose that \(\pr{X > a} > 0\) and \(\pr{X < a} > 0\) for some constant \(a\in\mathbb{R}\). Show that \(\expec{X \mid X < a} \leq\expec{X} \leq\expec{X \mid X > a}\).

Hint: Use the partition theorem, along with Exercise 8.46.