P(A∪B) = P(A) + P(B) – P(A∩B)
If A and B are mutually exclusive, then P(A∩B) = 0 and P(A∪B) = P(A) + P(B)
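A minimal Python check of the addition rule above; the probabilities 0.5, 0.3 and 0.15 are made-up illustrative values, not from this sheet.

```python
# Addition rule: P(A∪B) = P(A) + P(B) - P(A∩B), with illustrative numbers.
p_A, p_B, p_A_and_B = 0.5, 0.3, 0.15

p_A_or_B = p_A + p_B - p_A_and_B
print(p_A_or_B)              # 0.65

# Mutually exclusive case: P(A∩B) = 0, so the union is just the sum.
print(p_A + p_B - 0.0)       # 0.8
```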
Joint Probability: P_XY(x,y) = P(X=x ∩ Y=y)
Marginal Probability: P_X(x) = ∑_y P_XY(x,y) = ∑_y P(X=x ∩ Y=y)   (sum over all values of y)
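A quick sketch of marginalizing a joint pmf in Python; the joint table is a made-up example, not data from this sheet.

```python
# Marginal pmf: P_X(x) = sum over y of P_XY(x, y).
joint = {
    (0, 0): 0.125, (0, 1): 0.250,
    (1, 0): 0.125, (1, 1): 0.500,
}

def marginal_x(joint, x):
    """Sum the joint pmf over all y values for a fixed x."""
    return sum(p for (xi, _), p in joint.items() if xi == x)

print(marginal_x(joint, 0))  # 0.375 = 0.125 + 0.250
print(marginal_x(joint, 1))  # 0.625 = 0.125 + 0.500
```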
Quotient Rule: P(A|B) = P(A∩B) / P(B)
Multiplication Rule: P(A∩B) = P(A|B) × P(B) = P(B|A) × P(A)
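A small numeric check of the quotient and multiplication rules; P(B) = 0.4 and P(A∩B) = 0.1 are assumed illustrative values.

```python
# Quotient rule and multiplication rule with illustrative numbers.
p_B = 0.4
p_A_and_B = 0.1

p_A_given_B = p_A_and_B / p_B        # P(A|B) = P(A∩B) / P(B)
print(p_A_given_B)                   # 0.25

# Multiplication rule recovers the joint probability:
print(p_A_given_B * p_B)             # 0.1, back to P(A∩B)
```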
Two events are statistically independent if:
P(A|B) = P(A)
P(B|A) = P(B)
P(A∩B) = P(A)·P(B)
Law of Total Probability: P(A) = P(A|B)P(B) + P(A|B̄)P(B̄)
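A sketch of the law of total probability and the independence check, with assumed inputs P(A|B) = 0.7, P(A|B̄) = 0.2, P(B) = 0.3 (illustrative only).

```python
# Law of total probability: P(A) = P(A|B)P(B) + P(A|not B)P(not B).
p_A_given_B, p_A_given_notB, p_B = 0.7, 0.2, 0.3

p_A = p_A_given_B * p_B + p_A_given_notB * (1 - p_B)
print(p_A)                                  # ≈ 0.35 = 0.21 + 0.14

# Independence check: A and B are independent iff P(A∩B) = P(A)·P(B).
p_A_and_B = p_A_given_B * p_B               # multiplication rule
print(abs(p_A_and_B - p_A * p_B) < 1e-12)   # False here, so A and B are dependent
```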
Bayes Rule:
P(E|A) = P(A∩E) / P(A) = P(A|E)P(E) / [ P(A|E)P(E) + P(A|Ē)P(Ē) ]
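A worked Bayes' rule sketch with made-up screening numbers: prior P(E) = 0.01, true-positive rate P(A|E) = 0.95, false-positive rate P(A|Ē) = 0.05.

```python
# Bayes' rule: P(E|A) = P(A|E)P(E) / [P(A|E)P(E) + P(A|not E)P(not E)].
p_E, p_A_given_E, p_A_given_notE = 0.01, 0.95, 0.05

p_A = p_A_given_E * p_E + p_A_given_notE * (1 - p_E)   # total probability (denominator)
p_E_given_A = p_A_given_E * p_E / p_A                   # Bayes' rule
print(round(p_E_given_A, 4))                            # ≈ 0.161
```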
Discrete random variable:
E(X) = μ = ∑ x·p(x)
Var(X) = ∑ (x - μ)²·p(x)
E(aX+b) = aE(X) + b
Var(aX+b) = a²Var(X)   (the constant b disappears)
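A minimal sketch of the mean, variance, and linear-transform rules above, using a made-up pmf {1: 0.2, 2: 0.5, 3: 0.3}.

```python
# E(X) = Σ x·p(x), Var(X) = Σ (x-μ)²·p(x), plus E(aX+b) = aE(X)+b and Var(aX+b) = a²Var(X).
pmf = {1: 0.2, 2: 0.5, 3: 0.3}

mu = sum(x * p for x, p in pmf.items())
var = sum((x - mu) ** 2 * p for x, p in pmf.items())
print(mu, var)           # ≈ 2.1, ≈ 0.49

a, b = 3, 5
print(a * mu + b)        # E(aX+b)  ≈ 3·2.1 + 5 = 11.3
print(a ** 2 * var)      # Var(aX+b) ≈ 9·0.49 = 4.41 (b disappears)
```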
BINOMIAL DISTRIBUTIONS:
E(X) = np
Var(X) = np(1-p)
*See cumulative probability table

UNIFORM DISTRIBUTIONS:
E(X) = (a+b)/2
Var(X) = (b-a)² / 12
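A sketch checking the binomial mean/variance formulas against the pmf, plus the uniform formulas; n = 10, p = 0.3, a = 2, b = 8 are arbitrary illustrative values.

```python
from math import comb

# Binomial: check E(X) = n·p and Var(X) = n·p·(1-p) directly from the pmf.
n, p = 10, 0.3
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

mean = sum(k * pk for k, pk in pmf.items())
var = sum((k - mean) ** 2 * pk for k, pk in pmf.items())
print(mean, n * p)              # both ≈ 3.0
print(var, n * p * (1 - p))     # both ≈ 2.1

# Uniform on [a, b]: E(X) = (a+b)/2, Var(X) = (b-a)²/12.
a, b = 2, 8
print((a + b) / 2, (b - a) ** 2 / 12)   # 5.0, 3.0
```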
NORMAL DISTRIBUTIONS:
X ~ N(μ, σ²), Z = (X - μ)/σ.   Look up z values in the standard normal table.
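A sketch of standardizing and reading a cumulative probability in Python (in place of the z-table); μ = 100, σ = 15, x = 120 are assumed illustrative values.

```python
from statistics import NormalDist

# Standardize: Z = (X - μ)/σ, then read P(Z ≤ z) from the standard normal cdf.
mu, sigma, x = 100, 15, 120

z = (x - mu) / sigma
print(round(z, 3))                             # 1.333
print(round(NormalDist().cdf(z), 4))           # P(Z ≤ z) ≈ 0.9088
print(round(NormalDist(mu, sigma).cdf(x), 4))  # same probability without standardizing
```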
VARIANCE and STANDARD DEVIATION:
Var(X) = E((X - μ)²) = ∑(x - μ)²·P(x) = ∑x²·P(x) - μ² = E(X²) - [E(X)]²
SD(X) = √Var(X)
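A quick check that the shortcut form E(X²) - [E(X)]² matches the definitional form, on another made-up pmf.

```python
# Var(X) two ways: E((X-μ)²) versus E(X²) - [E(X)]², then SD(X) = √Var(X).
pmf = {0: 0.25, 1: 0.50, 2: 0.25}

mu = sum(x * p for x, p in pmf.items())
var_definition = sum((x - mu) ** 2 * p for x, p in pmf.items())   # E((X-μ)²)
var_shortcut = sum(x**2 * p for x, p in pmf.items()) - mu**2      # E(X²) - E(X)²
print(mu, var_definition, var_shortcut)   # 1.0, 0.5, 0.5
print(var_definition ** 0.5)              # SD(X) ≈ 0.7071
```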
COVARIANCE:
Cov(X,Y) = σ_XY = E(XY) - μ_X·μ_Y
Cov(X,Y) = ∑∑ (x - μ_X)(y - μ_Y)·P_XY(x,y) = ∑∑ x·y·P_XY(x,y) - μ_X·μ_Y,   where P_XY(x,y) = P(X=x ∩ Y=y)
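A sketch computing covariance from a joint pmf, reusing the same made-up joint table as in the marginal example above.

```python
# Cov(X,Y) = E(XY) - μ_X·μ_Y, computed from the joint pmf.
joint = {
    (0, 0): 0.125, (0, 1): 0.250,
    (1, 0): 0.125, (1, 1): 0.500,
}

mu_x = sum(x * p for (x, _), p in joint.items())   # E(X)
mu_y = sum(y * p for (_, y), p in joint.items())   # E(Y)
e_xy = sum(x * y * p for (x, y), p in joint.items())  # E(XY)

cov = e_xy - mu_x * mu_y
print(mu_x, mu_y, e_xy, cov)   # 0.625, 0.75, 0.5, 0.03125
```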
IF DEPENDENT:
Var(X+Y) = Var(X) + Var(Y) + 2Cov(X,Y)
Var(a₁X + a₂Y + b) = a₁²Var(X) + a₂²Var(Y) + 2a₁a₂Cov(X,Y)
                   = a₁²Var(X) + a₂²Var(Y) + 2a₁a₂Corr(X,Y)·SD(X)·SD(Y)

IF INDEPENDENT:
Var(X+Y) = Var(X) + Var(Y)

Corr(X,Y) = Cov(X,Y) / (SD(X)·SD(Y))
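A sketch of the variance-of-a-sum and correlation formulas, using assumed moments Var(X) = 4, Var(Y) = 9, Cov(X,Y) = 3 (illustrative only).

```python
# Var(X+Y), Var(a₁X + a₂Y + b), and Corr(X,Y) with assumed moments.
var_x, var_y, cov_xy = 4.0, 9.0, 3.0

sd_x, sd_y = var_x ** 0.5, var_y ** 0.5
corr = cov_xy / (sd_x * sd_y)                # Corr(X,Y) = Cov(X,Y) / (SD(X)·SD(Y))
print(corr)                                  # 0.5

print(var_x + var_y + 2 * cov_xy)            # Var(X+Y) if dependent: 19.0
print(var_x + var_y)                         # Var(X+Y) if independent: 13.0

a1, a2 = 2.0, -1.0
print(a1**2 * var_x + a2**2 * var_y + 2 * a1 * a2 * cov_xy)  # Var(2X - Y + b) = 13.0
```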
1. Did you take the √ for Standard Deviation?
2. If X and Y are independent, E(XY) = E(X)·E(Y)
3. Do you need to subtract by 1?