ECE department, Cornell University, Fall 2013 Homework 2
Due September 20 at 5:00 p.m.
1. (Chebyshev Inequality) Let X_1, . . . , X_n be independent Geometric random variables with parameters p_1, . . . , p_n respectively (i.e., P(X_i = k) = p_i (1 − p_i)^(k−1), k = 1, 2, . . .). Let the random variable X be X = Σ_{i=1}^n X_i.
(a) Find µ_X = E[X].
(b) Apply the Chebyshev inequality to upper bound P(|X − µ_X| > a). Evaluate your upper bound for the case that p_1 = · · · = p_n = p. What happens as n → ∞?

2. (Minimum of Independent Exponential Random Variables) Assume that T_1, . . . , T_n are independent random variables and that each T_i is exponentially distributed with mean 1/µ_i, i = 1, . . . , n. Let T = min(T_1, . . . , T_n).
(a) Show that T is exponentially distributed. What is its mean?
(b) Let the random variable K indicate the index of the T_i that is the minimum. Show that
    P(K = k) = µ_k / Σ_{i=1}^n µ_i.

3. (Joint PDF and CDF) Two random variables X and Y have joint pdf
    f_{X,Y}(x, y) = c sin(x + y),  0 ≤ x ≤ π/2, 0 ≤ y ≤ π/2.
(a) Find the value of the constant c.
(b) Find the joint cdf of X and Y.
(c) Find the marginal pdfs of X and of Y.
(d) Find the means and variances of X and Y, and their covariance.

4. (Uncorrelated vs. Independent)
(a) We have seen that independence of two random variables X and Y implies that they are uncorrelated, but that the converse is not true. To see this, let Θ be uniformly distributed on [0, π] and let the random variables X and Y be defined by X = cos Θ and Y = sin Θ. Show that X and Y are uncorrelated but not independent.
(b) Now show that if we make the stronger assumption that for all functions g and h the random variables g(X) and h(Y) are uncorrelated, then X and Y must be independent. That is, uncorrelatedness of the random variables themselves is not enough, but uncorrelatedness of all functions of the random variables is enough to ensure independence.
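As a sanity check on the claims in Problem 2 (not a substitute for the proofs), the distribution of T = min(T_1, . . . , T_n) and of the minimizing index K can be probed by simulation. The sketch below is illustrative only: the rates mu = [1, 2, 3] and the trial count are arbitrary choices, and `random.expovariate(m)` draws an exponential with rate m, i.e., mean 1/m.

```python
import random

# Monte Carlo sketch for Problem 2 (illustrative; rates are arbitrary choices):
# if T_i ~ Exp(mu_i) are independent, T = min_i T_i should be Exp(mu_1 + ... + mu_n),
# and the minimizing index K should satisfy P(K = k) = mu_k / (mu_1 + ... + mu_n).
mu = [1.0, 2.0, 3.0]
trials = 200_000
random.seed(0)

total_T = 0.0
count_K = [0] * len(mu)
for _ in range(trials):
    samples = [random.expovariate(m) for m in mu]  # T_i has mean 1/mu_i
    t = min(samples)
    total_T += t
    count_K[samples.index(t)] += 1

rate_sum = sum(mu)
print("empirical E[T]:", total_T / trials)   # should be close to 1/rate_sum = 1/6
print("predicted E[T]:", 1 / rate_sum)
for k, m in enumerate(mu):
    print(f"P(K={k}): empirical {count_K[k] / trials:.4f}, predicted {m / rate_sum:.4f}")
```

With these rates the empirical mean of T should settle near 1/6 and the index frequencies near 1/6, 2/6, and 3/6, matching the formulas to be proven in parts (a) and (b).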