7  Questions for Chapter 7

7.1 Warm-up

Exercise 7.1 The discrete random variables \(X\) and \(Y\) have the following joint probability mass function:

\(f(x,y)\)   \(y=-1\)   \(y=0\)    \(y=1\)    \(y=2\)
\(x=0\)      \(1/12\)   \(1/4\)    \(0\)      \(0\)
\(x=1\)      \(1/12\)   \(1/12\)   \(1/12\)   \(1/12\)
\(x=3\)      \(0\)      \(0\)      \(1/4\)    \(1/12\)

Find the marginal probability mass functions \(f_X(x)\) and \(f_Y(y)\).
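
If you want to check your answer afterwards, here is a minimal Python sketch (Python and NumPy are arbitrary choices here; the exercise itself expects a hand calculation) that sums the table along each axis, using exact `fractions.Fraction` arithmetic:

```python
import numpy as np
from fractions import Fraction as F

# Joint pmf from the table: rows are x in {0, 1, 3}, columns are y in {-1, 0, 1, 2}.
joint = np.array([
    [F(1, 12), F(1, 4),  F(0),     F(0)],
    [F(1, 12), F(1, 12), F(1, 12), F(1, 12)],
    [F(0),     F(0),     F(1, 4),  F(1, 12)],
], dtype=object)

assert joint.sum() == 1      # sanity check: total mass is 1
f_X = joint.sum(axis=1)      # marginal of X: sum over the columns (values of y)
f_Y = joint.sum(axis=0)      # marginal of Y: sum over the rows (values of x)
print("f_X:", dict(zip([0, 1, 3], f_X)))
print("f_Y:", dict(zip([-1, 0, 1, 2], f_Y)))
```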

Exercise 7.2 A fair coin is tossed three times. Suppose \(X\) denotes the number of heads in the first two tosses and \(Y\) denotes the number of heads in the last two tosses.

  1. Make a table of the joint probability distribution of \(X\) and \(Y\).

  2. Use your table for the joint probability mass function to confirm that the marginal probability mass functions of \(X\) and \(Y\) are both \(\text{Bin}(2,1/2)\).

  3. Compute \(\pr{ X = Y }\).
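
A brute-force check for all three parts, as a Python sketch (an arbitrary choice of language): enumerate the eight equally likely outcomes and tabulate \((X, Y)\).

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Enumerate the 8 equally likely outcomes of three fair tosses (1 = heads).
joint = Counter()
for toss in product([0, 1], repeat=3):
    x = toss[0] + toss[1]    # heads in the first two tosses
    y = toss[1] + toss[2]    # heads in the last two tosses
    joint[(x, y)] += Fraction(1, 8)

print(joint)                 # the joint pmf of (X, Y)
print("P(X = Y) =", sum(p for (x, y), p in joint.items() if x == y))
```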

Exercise 7.3 Continuing from Exercise 7.2, confirm that the conditional distribution of \(X\) given \(Y=2\) is not binomial.
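
Continuing the enumeration sketch above (and reusing its `joint` counter), conditioning on \(Y=2\) takes two lines:

```python
# Condition the joint pmf from the previous sketch on the event Y = 2.
p_y2 = sum(p for (x, y), p in joint.items() if y == 2)
cond = {x: p / p_y2 for (x, y), p in joint.items() if y == 2}
print(cond)  # the conditional pmf of X given Y = 2; compare its support with Bin(2, p)
```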

Exercise 7.4 Continuing from Exercise 7.1, write in a table the conditional probability mass function \(f_{Y\vert X}(y \vert x)\).
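
Numerically, and reusing `joint` and `f_X` from the sketch after Exercise 7.1, each conditional pmf is a row of the table divided by its row sum:

```python
# f_{Y|X}(y | x) = f(x, y) / f_X(x): divide each row by its marginal.
cond = joint / f_X[:, None]
print(cond)  # each row is the conditional pmf of Y given the corresponding x
```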

Exercise 7.5 Let \(Y\) be a random variable with \(\pr{Y=+1}=\pr{Y=-1}=1/2\). Let \(X\) be another random variable, taking values in \(\mathbb{Z}\), independent of \(Y\).

Show that the random variables \(Y\) and \(XY\) are independent if and only if the distribution of \(X\) is symmetric, i.e., \(\pr{X = k}=\pr{X=-k}\) for all \(k \in \mathbb{Z}\).
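
The exercise asks for a proof, but the "if" direction can be sanity-checked numerically for any one symmetric choice of \(X\); the sketch below takes \(X\) uniform on \(\{-1,0,1\}\) as an arbitrary example:

```python
from fractions import Fraction
from itertools import product

# X uniform on {-1, 0, 1} (a symmetric distribution), Y = +/-1 with prob 1/2.
pX = {-1: Fraction(1, 3), 0: Fraction(1, 3), 1: Fraction(1, 3)}
pY = {-1: Fraction(1, 2), 1: Fraction(1, 2)}

# Joint pmf of (Y, XY), using the independence of X and Y.
joint = {}
for (x, px), (y, py) in product(pX.items(), pY.items()):
    key = (y, x * y)
    joint[key] = joint.get(key, 0) + px * py

# Marginal of XY, then check the factorisation P(Y=y, XY=z) = P(Y=y) P(XY=z).
pXY = {}
for (y, z), p in joint.items():
    pXY[z] = pXY.get(z, 0) + p
assert all(joint.get((y, z), 0) == pY[y] * pXY[z] for y in pY for z in pXY)
print("Y and XY are independent for this symmetric X")
```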

Exercise 7.6 Suppose that \(X\) and \(Y\) have joint probability density function \[f(x,y) = \begin{cases} 6e^{-(2x+3y)} &\text{if } x \ge 0\text{ and }y \ge 0 \\ 0 & \text{otherwise}. \end{cases}\] By integrating the joint probability density function, calculate:

  1. \(\pr{X < 1/2, Y > 1/2}\), and

  2. \(\pr{X > Y}\).
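
Both integrals can be checked numerically; here is a sketch assuming SciPy is available (note that `dblquad` integrates over its first argument, here \(y\), on the inside):

```python
import numpy as np
from scipy.integrate import dblquad

f = lambda y, x: 6 * np.exp(-(2 * x + 3 * y))  # joint density, inner variable first

# Part 1: x in (0, 1/2) and y in (1/2, inf).
p1, _ = dblquad(f, 0, 0.5, 0.5, np.inf)

# Part 2: P(X > Y), i.e. y in (0, x) for each x in (0, inf).
p2, _ = dblquad(f, 0, np.inf, 0, lambda x: x)

print(p1, p2)  # compare with your hand calculations
```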

Exercise 7.7 Continuing from Exercise 7.6:

  1. Show that \(X\) and \(Y\) are independent, with \(X \sim \text{Exp}(2)\), and \(Y \sim \text{Exp}(3)\).

  2. Confirm that \(\pr{X < 1/2, Y > 1/2}\), which you calculated in Exercise 7.6, is equal to \(\pr{X < 1/2}\pr{Y > 1/2}\).
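
The product in part 2 can likewise be checked numerically; a sketch using the marginal densities from part 1 (again assuming SciPy):

```python
import numpy as np
from scipy.integrate import quad

# One-dimensional integrals of the Exp(2) and Exp(3) marginal densities.
pX, _ = quad(lambda x: 2 * np.exp(-2 * x), 0, 0.5)       # P(X < 1/2)
pY, _ = quad(lambda y: 3 * np.exp(-3 * y), 0.5, np.inf)  # P(Y > 1/2)
print(pX * pY)  # should agree with the joint integral from Exercise 7.6
```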

Exercise 7.8 Continuing from Exercise 7.1 and Exercise 7.4, find the probability mass functions of \(S = X + Y\), \(D = X - Y\), and \(M = \max(X, Y)\).
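
A numeric check for all three pmfs at once, as a self-contained Python sketch with exact fractions:

```python
from fractions import Fraction as F
from collections import defaultdict

# Joint pmf from Exercise 7.1, keyed by (x, y); zero entries omitted.
joint = {(0, -1): F(1, 12), (0, 0): F(1, 4),
         (1, -1): F(1, 12), (1, 0): F(1, 12), (1, 1): F(1, 12), (1, 2): F(1, 12),
         (3, 1): F(1, 4),   (3, 2): F(1, 12)}

pmfs = {"S": defaultdict(F), "D": defaultdict(F), "M": defaultdict(F)}
for (x, y), p in joint.items():
    pmfs["S"][x + y] += p       # S = X + Y
    pmfs["D"][x - y] += p       # D = X - Y
    pmfs["M"][max(x, y)] += p   # M = max(X, Y)

for name, pmf in pmfs.items():
    print(name, dict(sorted(pmf.items())))
```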

7.2 Work-out

Exercise 7.9 Let \(U_1, U_2 \sim\text{U}(0,1)\) be independent. By calculating the relevant cumulative distribution functions, show that \(\max ( U_1, U_2)\) has the same distribution as \(\sqrt{U_1}\).
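
The claim is easy to sanity-check by simulation before proving it; a sketch assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
u1, u2 = rng.random(n), rng.random(n)

# Compare the empirical CDFs of max(U1, U2) and sqrt(U1) at a few points.
for t in (0.2, 0.5, 0.8):
    print(t, np.mean(np.maximum(u1, u2) <= t), np.mean(np.sqrt(u1) <= t))
```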

Exercise 7.10 A fair die is thrown. The score is divided by two and rounded up, giving score \(X\). A fair coin is then tossed \(X\) times and the number of heads is \(Y\) (so if the die roll is 3, then \(X=2\) and the coin is tossed twice).

  1. Write down the marginal probability mass function \(p_X(x)\) for \(X\).

  2. What is the (conditional) distribution of \(Y\) given \(X=x\)? Using this conditional probability mass function and the marginal from part 1, compute the joint probability mass function \(p(x,y)\), and present it in a table.

  3. Calculate the marginal probability mass function \(p_Y(y)\) of \(Y\).
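
A sketch that builds the whole joint table by enumeration (Python, with exact fractions), against which parts 1 to 3 can be checked:

```python
from fractions import Fraction as F
from math import comb
from collections import defaultdict

joint = defaultdict(F)
for d in range(1, 7):              # the six equally likely die scores
    x = (d + 1) // 2               # X: score divided by two, rounded up
    for y in range(x + 1):         # Y | X = x is Bin(x, 1/2)
        joint[(x, y)] += F(1, 6) * comb(x, y) * F(1, 2) ** x

p_Y = defaultdict(F)
for (x, y), p in joint.items():
    p_Y[y] += p

print(dict(joint))
print("p_Y:", dict(sorted(p_Y.items())))
```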

Exercise 7.11 Continuing from Exercise 7.10, write in a table the conditional probability mass function \(p_{X \vert Y}(x \vert y)\).
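
Reusing `joint` and `p_Y` from the sketch above, the conditional pmf follows directly from the definition \(p_{X \vert Y}(x \vert y) = p(x,y)/p_Y(y)\):

```python
# Conditional pmf of X given Y, entry by entry.
cond = {(x, y): p / p_Y[y] for (x, y), p in joint.items()}
print(cond)
```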

Exercise 7.12 Consider the experiment described in Exercise 3.18. Let \(X\) be the total score on the red die and let \(Y\) be the total score on the blue die. The experimental set-up implies that \(X\) and \(Y\) are conditionally independent: give a definition of conditional independence given an event for discrete random variables. (We did not discuss this in lectures, but there is a natural definition that you should be able to find.) Are \(X\) and \(Y\) independent?

Exercise 7.13 Let \(\beta_1>0\) and \(\beta_2>0\). Suppose that \(X_1 \sim \text{Exp}(\beta_1)\), \(X_2 \sim \text{Exp}(\beta_2)\), and \(X_1\) and \(X_2\) are independent. Let \(M = \min (X_1, X_2)\). Show that \(M \sim \text{Exp}(\beta_1 + \beta_2)\).
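
Before proving this, the result can be checked by simulation; a sketch assuming NumPy, with arbitrary example rates (note that NumPy's exponential sampler takes the scale \(1/\beta\), not the rate):

```python
import numpy as np

rng = np.random.default_rng(0)
b1, b2, n = 2.0, 5.0, 10**6    # example rates, chosen arbitrarily

m = np.minimum(rng.exponential(1 / b1, n), rng.exponential(1 / b2, n))

# Compare the empirical survival function of M with that of Exp(b1 + b2).
for t in (0.1, 0.3, 0.5):
    print(t, np.mean(m > t), np.exp(-(b1 + b2) * t))
```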