6  Questions for Chapter 6

6.1 Warm-up

Exercise 6.1 Let \(X\colon\Omega\to X(\Omega)\) be a random variable. For any \(B_1, B_2 \subseteq X(\Omega)\), show that:

  1. \(\{X\in B_1\}^\mathrm{c}=\{X\in B_1^\mathrm{c}\}\);

  2. \(\{X\in B_1\}\cup\{X\in B_2\}=\{X\in B_1\cup B_2\}\); and

  3. \(\{X\in B_1\}\cap\{X\in B_2\}=\{X\in B_1\cap B_2\}\).

Exercise 6.2 A bag contains 10 counters, numbered \(1\) to \(10\). The first 6 of them are worth £1 each, the next 2 are worth £5 each, the next one is worth £10 and the last one is worth £50. You draw a counter from the bag at random, its value being \(X\) pounds. Let \(\Omega=\{1,2,\dots,10\}\).

  1. Write down the function corresponding to random variable \(X\), i.e. for each \(\omega\in\Omega\), write down \(X(\omega)\).

  2. What subset of \(\Omega\) does the event \(\{ X\le 5 \}\) correspond to? Find \(\pr{X \leq 5}\).

  3. Write down the probability mass function \(p(\cdot)\) of \(X\), i.e. write down in a table all values \(X\) can take and their corresponding probabilities.

  4. Use the probability mass function to find \(\pr{X \geq 10}\).
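
If you would like to check your answers numerically, here is a minimal Python sketch of the experiment (the choice of Python, and of exact fractions, is ours and not part of the exercise):

```python
from fractions import Fraction

# Counter values: draws 1-6 are worth 1, draws 7-8 are worth 5,
# draw 9 is worth 10 and draw 10 is worth 50.
X = {w: (1 if w <= 6 else 5 if w <= 8 else 10 if w == 9 else 50)
     for w in range(1, 11)}

# Every outcome has probability 1/10, so P(X in B) = |{w : X(w) in B}| / 10.
def prob(event):
    return Fraction(sum(1 for x in X.values() if event(x)), 10)

print(prob(lambda x: x <= 5))            # part 2: P(X <= 5)
print(prob(lambda x: x >= 10))           # part 4: P(X >= 10)
print({v: prob(lambda x, v=v: x == v)    # part 3: the pmf as a table
       for v in sorted(set(X.values()))})
```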

Exercise 6.3 Suppose \(X\) is a discrete random variable with possible values 1, 2, and 3, and with probability mass function \(p(x)=c x^2\) for \(x\in\{1,2,3\}\). Calculate:

  1. the value of the constant \(c\),

  2. the value of \(\pr{X \geq 2}\), and

  3. the value of \(\pr{X \in \{1,3\}}\).
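
Once you have worked parts 1–3 by hand, a short sketch along the same lines lets you verify them (again a sketch of ours, not part of the exercise):

```python
from fractions import Fraction

# p(x) = c * x**2 must sum to 1 over x in {1, 2, 3}.
c = Fraction(1, sum(x**2 for x in (1, 2, 3)))
p = {x: c * x**2 for x in (1, 2, 3)}
print(c)            # part 1
print(p[2] + p[3])  # part 2: P(X >= 2)
print(p[1] + p[3])  # part 3: P(X in {1, 3})
```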

Exercise 6.4 Use the equality \[(a + b)^n=\sum_{x=0}^n \binom{n}{x} a^x b^{n-x}\] to show that \(\sum_{x=0}^n p(x) = 1\) when \(X \sim \mathrm{Bin}(n, p)\).
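
A numerical sanity check is no substitute for the proof, but it can catch algebra slips; the test values \(n=12\) and \(p=0.3\) below are arbitrary choices of ours:

```python
from math import comb

# With a = p and b = 1 - p, the binomial theorem forces the Bin(n, p)
# probabilities to sum to 1; verify this numerically for one (n, p).
n, p = 12, 0.3
print(sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)))  # ~1.0
```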

Exercise 6.5 Suppose \(X\sim\mathrm{Bin}(n,p)\). Calculate \(p(x)\) when

  1. \(n=7\), \(p=0.3\) for \(x=0, 1, 2\);

  2. \(n = 10\), \(p = 0.95\) for \(x = 9, 10\);

  3. \(n = 10\), \(p = 0.05\) for \(x = 1, 0\).

Use these to find \(\pr{X > 2}\) in part 1, \(\pr{X \leq 8}\) in part 2, and \(\pr{X \geq 2}\) in part 3.
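
After computing these by hand, you can compare against the following sketch (the helper function `binom_pmf` is our own, not from any library):

```python
from math import comb

def binom_pmf(x, n, p):
    # P(X = x) for X ~ Bin(n, p)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Part 1: n = 7, p = 0.3
print([binom_pmf(x, 7, 0.3) for x in (0, 1, 2)])
print(1 - sum(binom_pmf(x, 7, 0.3) for x in (0, 1, 2)))    # P(X > 2)

# Part 2: n = 10, p = 0.95
print(1 - sum(binom_pmf(x, 10, 0.95) for x in (9, 10)))    # P(X <= 8)

# Part 3: n = 10, p = 0.05
print(1 - sum(binom_pmf(x, 10, 0.05) for x in (0, 1)))     # P(X >= 2)
```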

Exercise 6.6 Let \(X\sim\mathrm{Bin}(n,p)\), and let \(Y=n-X\) (i.e. if \(X\) counts the number of successes in a binomial scenario, then \(Y\) counts the number of failures in that same scenario). Show that \(Y\sim\mathrm{Bin}(n,1-p)\). hint: Remember that \(\binom{n}{x} = \binom{n}{n-x}\).

Exercise 6.7 Use the equality \[\exp(\lambda)=\sum_{x=0}^{\infty} \frac{\lambda^x}{x!}\] to show that \(\sum_{x=0}^{\infty} p(x) = 1\) when \(X \sim \mathrm{Po}(\lambda)\).
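
As with Exercise 6.4, a numerical check is not a proof, but it is reassuring; \(\lambda=3.7\) below is an arbitrary test value:

```python
from math import exp, factorial

# Partial sums of the Po(lambda) pmf should approach 1.
lam = 3.7
print(sum(exp(-lam) * lam**x / factorial(x) for x in range(60)))  # ~1.0
```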

Exercise 6.8 Let \(X\sim\mathrm{Po}(\lambda)\). Consider \(f(\lambda):= e^{-\lambda} \lambda^x/x!\) as a function of \(\lambda>0\) (so \(x\) is fixed). Find the \(\lambda\) which maximises \(f\). (This value is of interest when trying to estimate \(\lambda\) from an observation \(X=x\).)

Exercise 6.9 Let \(X\sim\mathrm{U}(3,7)\).

  1. Sketch the probability density function \(f(\cdot)\) of \(X\).

  2. Find \(\pr{X\in[4,6]}\), and mark the corresponding area on your graph.

  3. Find \(\pr{X\in[1,5]}\), and mark the corresponding area on your graph.
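
To check parts 2 and 3 numerically, note that for \(X\sim\mathrm{U}(a,b)\) the probability of an interval is the length of its overlap with \([a,b]\) divided by \(b-a\); a minimal sketch:

```python
# P(X in [c, d]) = length of the overlap of [c, d] with [a, b], over b - a.
def unif_prob(c, d, a=3.0, b=7.0):
    return max(0.0, min(d, b) - max(c, a)) / (b - a)

print(unif_prob(4, 6))   # part 2
print(unif_prob(1, 5))   # part 3
```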

Exercise 6.10 Let \(X\sim\mathcal{E}(1)\).

  1. Sketch the probability density function \(f(\cdot)\) of \(X\).

  2. Find \(\pr{X\in [1,2]}\), and mark the corresponding area on your graph.

Exercise 6.11 In calculus, you showed that \(\int_{-\infty}^{\infty} e^{-z^2}\,\mathrm{d} z = \sqrt{\pi}\) by writing the square of the integral as a double integral and changing to polar coordinates. Use this equality to evaluate \[\int_{-\infty}^{\infty} e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}\,\mathrm{d} x.\] (Thereby, you will have shown that \(f(x)\) integrates to one when \(X\sim\mathcal{N}(\mu,\sigma^2)\).)
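
A crude Riemann sum can confirm the value you should obtain, namely \(\sigma\sqrt{2\pi}\); the parameter values and step size below are arbitrary choices of ours:

```python
from math import sqrt, pi, exp

# The integrand depends on x only through u = x - mu, so integrate over
# u in [-12*sigma, 12*sigma]; the tails beyond that are negligible.
mu, sigma, h = 1.5, 2.0, 1e-3
total = h * sum(exp(-0.5 * ((k * h - 12 * sigma) / sigma) ** 2)
                for k in range(int(24 * sigma / h)))
print(total, sigma * sqrt(2 * pi))   # the two values should agree closely
```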

Exercise 6.12 Let \(a<b\), and \(X\sim\mathrm{U}(a,b)\).

  1. Calculate, and sketch, the cumulative distribution function \(F\) of \(X\).

  2. With \([a,b]=[3,7]\), use \(F\) to calculate \(\pr{X\in[4,6]}\) and \(\pr{X\in[1,5]}\). (You may wish to check your answers against your solutions to Exercise 6.9.)
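
Once you have derived \(F\) in part 1, the sketch below lets you check part 2; it assumes the standard form \(F(x)=(x-a)/(b-a)\) on \([a,b]\), clamped to 0 below \(a\) and 1 above \(b\):

```python
# Cumulative distribution function of U(a, b).
def unif_cdf(x, a=3.0, b=7.0):
    return min(max((x - a) / (b - a), 0.0), 1.0)

print(unif_cdf(6) - unif_cdf(4))   # P(X in [4, 6])
print(unif_cdf(5) - unif_cdf(1))   # P(X in [1, 5])
```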

Exercise 6.13 Let \(\beta>0\), and \(X\sim\mathcal{E}(\beta)\).

  1. Calculate, and sketch, the cumulative distribution function \(F\) of \(X\).

  2. With \(\beta=1\), use \(F\) to calculate \(\pr{X\in[1,2]}\). (You may wish to check your answer against your solution to Exercise 6.10.)
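
Once you have derived \(F\), you can check part 2 numerically; the sketch below assumes the rate parametrisation, under which \(F(x)=1-e^{-\beta x}\) for \(x\ge 0\):

```python
from math import exp

# Cumulative distribution function of the exponential with rate beta.
def exp_cdf(x, beta=1.0):
    return 1.0 - exp(-beta * x) if x >= 0 else 0.0

print(exp_cdf(2) - exp_cdf(1))   # P(X in [1, 2]); compare Exercise 6.10
```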

Exercise 6.14 Let \(\beta>0\) and \(a>0\). Show that if \(X \sim \mathcal{E}(\beta)\) then \(aX \sim \mathcal{E}(\beta/a)\).

Exercise 6.15 Prove the “standardizing the Normal distribution” theorem, by showing that \(F_{(X-\mu)/\sigma}(z)=F_Z(z)\) and \(F_{\sigma Z+\mu}(x)=F_X(x)\) when \(X\sim\mathcal{N}(\mu,\sigma^2)\) and \(Z\sim\mathcal{N}(0,1)\).

Exercise 6.16 Let \(X\sim\mathcal{N}(\mu,\sigma^2)\). Use the table below (rounding \(z\) to the nearest tabulated value) to evaluate \(\pr{X\in[-1,2]}\) when:

  1. \(\mu = 0\), \(\sigma^2 = 1\);

  2. \(\mu = 2\), \(\sigma^2 = 4\);

  3. \(\mu = -1.2\), \(\sigma^2 = 1.62\).

| \(z\) | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0 |
|-------|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| \(\Phi(z)\) | 0.540 | 0.580 | 0.618 | 0.655 | 0.691 | 0.726 | 0.758 | 0.788 | 0.816 | 0.841 |

| \(z\) | 1.1 | 1.2 | 1.3 | 1.4 | 1.5 | 1.6 | 1.7 | 1.8 | 1.9 | 2.0 |
|-------|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| \(\Phi(z)\) | 0.864 | 0.885 | 0.903 | 0.919 | 0.933 | 0.945 | 0.955 | 0.964 | 0.971 | 0.977 |

| \(z\) | 2.1 | 2.2 | 2.3 | 2.4 | 2.5 | 2.6 | 2.7 | 2.8 | 2.9 | 3.0 |
|-------|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| \(\Phi(z)\) | 0.982 | 0.986 | 0.989 | 0.992 | 0.994 | 0.995 | 0.996 | 0.997 | 0.998 | 0.999 |
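
Having used the table, you can compare your three answers with exact values computed via the error function, using the identity \(\Phi(z)=\frac{1}{2}\bigl(1+\operatorname{erf}(z/\sqrt{2})\bigr)\):

```python
from math import erf, sqrt

def Phi(z):
    # Standard normal cdf, via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

def prob(mu, var):
    # P(X in [-1, 2]) for X ~ N(mu, var), after standardizing.
    sigma = sqrt(var)
    return Phi((2 - mu) / sigma) - Phi((-1 - mu) / sigma)

for mu, var in [(0, 1), (2, 4), (-1.2, 1.62)]:
    print(mu, var, prob(mu, var))
```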

6.2 Workout

Exercise 6.17 Let \(\Omega\) be a sample space, let \(X\colon\Omega\to X(\Omega)\) be a random variable on \(\Omega\), and let \(\pr{}\) be a probability on \(\Omega\), so \(\pr{}\) satisfies A1–A4. Show that the function \(\mathbb{P}_X(\cdot)\) defined by \[\mathbb{P}_X(B):= \pr{X\in B}\] for all \(B\subseteq X(\Omega)\) also satisfies A1–A4, with \(X(\Omega)\) instead of \(\Omega\). (This establishes that \(\mathbb{P}_X(\cdot)\) is indeed a probability, as mentioned in Section 6.1 of the notes.) hint: To prove A3/A4, use parts 2 and 3 of Exercise 6.1.

Exercise 6.18 A bag contains \(m\) red marbles and \(n\) blue marbles. You randomly take \(r\) marbles from the bag, without replacement, with \(r \leq m+n\). Let \(X\) denote the number of red marbles you end up with. Find the probability mass function \(p(x)\) for all \(x\in\mathbb{N}\).

Exercise 6.19 Let \(X\sim\mathrm{Bin}(n,p)\). By considering the ratios \(p(x+1)/p(x)\) for successive \(x\), or otherwise, find a formula in terms of \(n\) and \(p\) for the value(s) of \(x\) where \(p(x)\) is maximal over \(x\in\{0,1,\dots,n\}\). In other words, find the most probable observed value for \(X\).

Exercise 6.20 Let \(X\sim\mathrm{Bin}(n,p)\). Consider \(g(p):=\binom{n}{x} p^x (1-p)^{n-x}\) as a function of \(p \in (0, 1)\) (so \(n\) and \(x\) are fixed). Find the \(p\) which maximises \(g\). (This value is of interest when trying to estimate \(p\) from an observation \(X=x\).)
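
A crude grid search can confirm the maximiser you find by calculus; \(n=10\) and \(x=3\) below are arbitrary test values of ours:

```python
from math import comb

# Evaluate g on a fine grid of p values and report the largest.
n, x = 10, 3
best = max((k / 1000 for k in range(1, 1000)),
           key=lambda p: comb(n, x) * p**x * (1 - p)**(n - x))
print(best)   # compare with the formula you derived
```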

Exercise 6.21 Let \(X\sim\mathrm{Bin}(n,p)\) and let \(A_n\) denote the event that \(X\) is even: \(A_n = \{ \omega : X(\omega ) \text{ is even}\}\). By considering the partition formed by \(\{\text{first trial a success}\}\), \(\{\text{first trial a failure}\}\), show that \(\pr{A_n} = p + (1-2p) \pr{A_{n-1}}\) for \(n \in \mathbb{N}\). Hence prove by induction that \[\pr{A_n} = \frac{1+ (1-2p)^n}{2} \quad \text{for all } n \in \mathbb{Z}_+.\]
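
You can test the closed form against a direct summation of the pmf over even values; \(n=9\) and \(p=0.3\) below are arbitrary test values:

```python
from math import comb

# Sum the Bin(n, p) pmf over even x and compare with the closed form.
n, p = 9, 0.3
direct = sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(0, n + 1, 2))
print(direct, (1 + (1 - 2 * p)**n) / 2)   # should agree
```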

Exercise 6.22 A boxer, at the start of his career, decides that he will retire after his first defeat. In each fight he has probability \(q\) of being defeated, with successive fights being assumed independent. Let \(X\) be the total number of fights in the boxer’s career.

  1. What is the distribution of \(X\)?

  2. Show that \(\pr{X >x} = (1-q)^x\) for \(x\in\{0, 1, 2, 3, \ldots\}\). Interpret this result.

  3. Find a formula for the probability that \(X\) is even (simplify your formula as much as possible). Evaluate this probability when \(q=0.4\).
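
For part 3, a direct numerical summation (which assumes you identified the distribution of \(X\) in part 1, with \(\pr{X=x}=(1-q)^{x-1}q\) for \(x=1,2,\dots\)) gives a value to compare your simplified formula against:

```python
# Sum P(X = x) over even x; the truncation at x = 398 loses a
# negligible tail.
q = 0.4
print(sum((1 - q)**(x - 1) * q for x in range(2, 400, 2)))  # P(X even)
```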

Exercise 6.23 Let \(X\sim\mathrm{Po}(\lambda)\). Calculate \(p(x)\) when:

  1. \(\lambda = 2.1\) for \(x=0,1,2\);

  2. \(\lambda = 9.5\) for \(x=9,10\);

  3. \(\lambda = 0.5\) for \(x=1,0\).

Use these to find \(\pr{X > 2}\) in part 1 and \(\pr{X \geq 2}\) in part 3.

(Compare these values with those from Exercise 6.5.)
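
As with the binomial case, you can check your hand calculations with a short sketch (the helper `pois_pmf` is our own, not from any library):

```python
from math import exp, factorial

def pois_pmf(x, lam):
    # P(X = x) for X ~ Po(lambda)
    return exp(-lam) * lam**x / factorial(x)

print([pois_pmf(x, 2.1) for x in (0, 1, 2)])            # part 1
print(1 - sum(pois_pmf(x, 2.1) for x in (0, 1, 2)))     # P(X > 2)
print([pois_pmf(x, 9.5) for x in (9, 10)])              # part 2
print(1 - sum(pois_pmf(x, 0.5) for x in (0, 1)))        # part 3: P(X >= 2)
```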

Exercise 6.24 Let \(X\sim\mathrm{Po}(\lambda)\) where \(\lambda > 1\). Show that \(p(x)\), as a function of \(x\), increases monotonically, and then decreases. For what \(x\in\mathbb{Z}_+\) is \(p(x)\) maximal?

Exercise 6.25 Let \(X\sim\mathrm{Po}(\lambda)\) and \(Y \sim \mathrm{Po}(\mu)\) be independent. By computing the probability mass function of \(X+Y\), show that \(X+Y \sim \mathrm{Po}(\lambda+\mu)\). Here independence means that \(\pr{ X = x, Y = y } = \pr { X = x} \pr { Y = y }\) for all \(x, y\): you may want to look ahead to Chapter 7.
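
Before attempting the algebra, you may find it convincing to check the claim numerically: the convolution sum below should reproduce the \(\mathrm{Po}(\lambda+\mu)\) pmf (the test values 1.3 and 2.2 are arbitrary choices of ours):

```python
from math import exp, factorial

def pois_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

# P(X + Y = s) = sum over x of P(X = x) * P(Y = s - x), by independence.
lam, mu = 1.3, 2.2
for s in range(6):
    conv = sum(pois_pmf(x, lam) * pois_pmf(s - x, mu) for x in range(s + 1))
    print(conv, pois_pmf(s, lam + mu))   # the two columns should agree
```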

Exercise 6.26 Let \(\beta>0\), and \(X\sim\mathcal{E}(\beta)\). For any real numbers \(s>0\) and \(t > 0\), show that \[\cpr{X > s + t}{X > s} = \pr{X > t}.\] [This is called the memoryless property, and says that, given that \(X\) is bigger than \(s\), the chance that it is at least \(t\) bigger than \(s\) is the same no matter how big \(s\) is.]

Exercise 6.27 Let \(\beta>0\) be some constant parameter, and let \(X\) be a continuous random variable with probability density function \[f(x) = \frac{\beta}{2}e^{-\beta |x|}\text{ for } x \in \mathbb{R}.\] (This is called the two-sided exponential or Laplace distribution.) Calculate, and sketch, the cumulative distribution function \(F\) of \(X\).
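
Your \(F\) should satisfy \(F(x)\to 1\) as \(x\to\infty\); the Riemann-sum sketch below checks that the density integrates to 1 (the value \(\beta=1.7\) and the step size are arbitrary choices):

```python
from math import exp

# Integrate the Laplace density over [-30, 30]; the tails beyond are
# negligible for this beta.
beta, h = 1.7, 1e-3
print(h * sum(beta / 2 * exp(-beta * abs(-30 + k * h))
              for k in range(int(60 / h))))   # ~1.0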

6.3 Stretch

Exercise 6.28 A sequence of flips of a fair coin produces \(n\) heads and \(m\) tails. A run is a consecutive sequence of flips with the same outcome. Let \(R\) denote the number of runs (of both \(H\) and \(T\)). Find \(p_R(2k)\) and \(p_R(2k+1)\) for \(k\in\{1,\dots,\min(m,n)\}\). hint: This is a “sheep and fences” problem.

Exercise 6.29 A distributor of bean seeds determines from extensive tests that 1% of a large batch of seeds will not germinate. She sells the seeds in packets of 200 and guarantees that at least 98% of the seeds will germinate.

  1. Find the probability that any particular packet violates the guarantee:

    1. exactly, using a binomial distribution;

    2. using the Poisson approximation.

  2. A gardener buys 13 packets. What is the probability that at least one packet violates the guarantee? In your calculation, use the exact (binomial) probability from part 1, rather than the Poisson approximation.
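
A sketch to check all three numbers, assuming “violates the guarantee” means that five or more of the 200 seeds fail to germinate:

```python
from math import comb, exp, factorial

# Number of non-germinating seeds in a packet: Bin(200, 0.01).
n, p = 200, 0.01
exact = 1 - sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(5))
lam = n * p   # Poisson approximation with lambda = n * p
approx = 1 - sum(exp(-lam) * lam**x / factorial(x) for x in range(5))
print(exact, approx)          # parts 1(a) and 1(b)
print(1 - (1 - exact)**13)    # part 2: at least one of 13 packets
```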

Exercise 6.30 Suppose that the probability distribution of the number of eggs, \(N\), laid by an insect is Poisson with parameter \(\lambda\). Suppose also that the probability that an egg develops into an insect is \(q\) (independently for each egg).

  1. Show that the probability distribution of the number of insects produced, \(H\), is Poisson with parameter \(\lambda q\). In other words, show that \[\pr{ H=h } = e^{-\lambda q} \, \frac{(\lambda q)^h}{h!} \qquad\text{for }h\in\{0,1,2,\dots\}.\] hint: Use the partition theorem (P4) with the events \(\{N=n\}\), for \(n=0,1,2,\dots\).

  2. What is the probability distribution of \(N-H\)?

  3. Show that \(H\) and \(N-H\) are independent, i.e. for all \(j\), \(h\), \[\pr{\{H=h\} \cap \{N-H =j\}} = \pr{H=h} \pr{N-H =j}.\]

Note: you will need the exponential series several times.
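
A numerical check of part 1 (not a substitute for the partition-theorem argument; \(\lambda=3\) and \(q=0.4\) are arbitrary test values of ours):

```python
from math import comb, exp, factorial

def pois_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

# P(H = h) via the partition over {N = n}: each of the n eggs develops
# with probability q, so P(H = h | N = n) is a Bin(n, q) probability.
lam, q = 3.0, 0.4
for h in range(4):
    total = sum(pois_pmf(n, lam) * comb(n, h) * q**h * (1 - q)**(n - h)
                for n in range(h, 80))
    print(total, pois_pmf(h, lam * q))   # the two columns should agree
```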

Exercise 6.31 Let \(X\sim\mathrm{Bin}(1,p)\) be a Bernoulli random variable, let \(Y\sim\mathrm{U}(0,1)\) be a uniform random variable, and suppose that \(X\) and \(Y\) are independent. Let \(S = X+Y\) and \(M = \max\{X,Y\}\).

  1. What is the cumulative distribution function of \(S\)? Is \(S\) discrete, continuous, or neither?

  2. What is the cumulative distribution function of \(M\)? Is \(M\) discrete, continuous, or neither?

hint: Compute the cumulative distribution function. Use the partition theorem \(\pr{A} = \pr{X=0}\cpr{A}{X=0} + \pr{X=1}\cpr{A}{X=1}\) for suitable events \(A\); independence (which we look at in Chapter 7) means that conditioning on the value of \(X\) does not change the distribution of \(Y\).
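
If you want to build intuition before computing, a small Monte Carlo sketch (with the arbitrary choice \(p=0.3\)) estimates both cumulative distribution functions at a few points; look for a jump in \(F_M\) at 1:

```python
import random

# Simulate (X, Y) pairs and estimate F_S(t) and F_M(t) empirically.
random.seed(0)
p, n = 0.3, 10**5
xy = [(int(random.random() < p), random.random()) for _ in range(n)]

for t in (0.25, 0.75, 0.999, 1.0, 1.5):
    FS = sum(x + y <= t for x, y in xy) / n
    FM = sum(max(x, y) <= t for x, y in xy) / n
    print(t, FS, FM)
```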