The Rules of Summation

$\sum_{i=1}^{n} x_i = x_1 + x_2 + \cdots + x_n$

$\sum_{i=1}^{n} a = na$

$\sum_{i=1}^{n} a x_i = a \sum_{i=1}^{n} x_i$

$\sum_{i=1}^{n} (x_i + y_i) = \sum_{i=1}^{n} x_i + \sum_{i=1}^{n} y_i$

$\sum_{i=1}^{n} (a x_i + b y_i) = a \sum_{i=1}^{n} x_i + b \sum_{i=1}^{n} y_i$

$\sum_{i=1}^{n} (a + b x_i) = na + b \sum_{i=1}^{n} x_i$

$\bar{x} = \dfrac{\sum_{i=1}^{n} x_i}{n} = \dfrac{x_1 + x_2 + \cdots + x_n}{n}$

$\sum_{i=1}^{n} (x_i - \bar{x}) = 0$

$\sum_{i=1}^{2} \sum_{j=1}^{3} f(x_i, y_j) = \sum_{i=1}^{2} \left[\, f(x_i, y_1) + f(x_i, y_2) + f(x_i, y_3) \right]$
$\qquad = f(x_1, y_1) + f(x_1, y_2) + f(x_1, y_3) + f(x_2, y_1) + f(x_2, y_2) + f(x_2, y_3)$
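As a quick numerical sanity check on these identities, here is a minimal Python sketch; the data vectors and the constants a and b are arbitrary values assumed for illustration only.

```python
# Numerical check of the summation rules above with illustrative numbers.
x = [2.0, 4.0, 6.0, 8.0]
y = [1.0, 3.0, 5.0, 7.0]
a, b = 3.0, 0.5
n = len(x)

# sum of a constant: n copies of a
assert sum(a for _ in x) == n * a

# a constant factor can be pulled out of the sum
assert sum(a * xi for xi in x) == a * sum(x)

# the sum of a linear combination is the linear combination of the sums
assert sum(a * xi + b * yi for xi, yi in zip(x, y)) == a * sum(x) + b * sum(y)
assert sum(a + b * xi for xi in x) == n * a + b * sum(x)

# deviations from the sample mean sum to zero (up to floating-point error)
x_bar = sum(x) / n
assert abs(sum(xi - x_bar for xi in x)) < 1e-12
```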
Expected Values & Variances

$E(X) = x_1 f(x_1) + x_2 f(x_2) + \cdots + x_n f(x_n) = \sum_{i=1}^{n} x_i f(x_i) = \sum_{x} x f(x)$

$E[g(X)] = \sum_{x} g(x) f(x)$

$E[g_1(X) + g_2(X)] = \sum_{x} [g_1(x) + g_2(x)] f(x) = \sum_{x} g_1(x) f(x) + \sum_{x} g_2(x) f(x) = E[g_1(X)] + E[g_2(X)]$

$E(c) = c$

$E(cX) = cE(X)$

$E(a + cX) = a + cE(X)$

$\operatorname{var}(X) = \sigma^2 = E[X - E(X)]^2 = E(X^2) - [E(X)]^2$

$\operatorname{var}(a + cX) = E[(a + cX) - E(a + cX)]^2 = c^2 \operatorname{var}(X)$

$\operatorname{cov}(X, Y) = E[(X - E[X])(Y - E[Y])] = \sum_{x} \sum_{y} [x - E(X)][y - E(Y)] f(x, y)$

$\rho = \dfrac{\operatorname{cov}(X, Y)}{\sqrt{\operatorname{var}(X)\operatorname{var}(Y)}}$

$E(c_1 X + c_2 Y) = c_1 E(X) + c_2 E(Y)$

$E(X + Y) = E(X) + E(Y)$

$\operatorname{var}(aX + bY + cZ) = a^2 \operatorname{var}(X) + b^2 \operatorname{var}(Y) + c^2 \operatorname{var}(Z) + 2ab \operatorname{cov}(X, Y) + 2ac \operatorname{cov}(X, Z) + 2bc \operatorname{cov}(Y, Z)$

If X, Y, and Z are independent, or uncorrelated, random variables, then the covariance terms are zero and:

$\operatorname{var}(aX + bY + cZ) = a^2 \operatorname{var}(X) + b^2 \operatorname{var}(Y) + c^2 \operatorname{var}(Z)$
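The following sketch applies these rules to a made-up discrete joint pmf f(x, y); the probabilities and the coefficients a and b are assumptions chosen only to illustrate the formulas, including the variance rule for a linear combination.

```python
# Expectation, variance, covariance, and correlation from an illustrative joint pmf.
from math import sqrt

f = {  # (x, y): f(x, y), probabilities sum to one
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

E_X = sum(x * p for (x, y), p in f.items())
E_Y = sum(y * p for (x, y), p in f.items())
var_X = sum((x - E_X) ** 2 * p for (x, y), p in f.items())
var_Y = sum((y - E_Y) ** 2 * p for (x, y), p in f.items())
cov_XY = sum((x - E_X) * (y - E_Y) * p for (x, y), p in f.items())
rho = cov_XY / sqrt(var_X * var_Y)

# Check var(aX + bY) = a^2 var(X) + b^2 var(Y) + 2ab cov(X, Y)
a, b = 2.0, -1.0
E_W = sum((a * x + b * y) * p for (x, y), p in f.items())
var_W = sum((a * x + b * y - E_W) ** 2 * p for (x, y), p in f.items())
assert abs(var_W - (a**2 * var_X + b**2 * var_Y + 2 * a * b * cov_XY)) < 1e-12

print(E_X, E_Y, var_X, var_Y, cov_XY, rho)
```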
Normal Probabilities

If $X \sim N(\mu, \sigma^2)$, then $Z = \dfrac{X - \mu}{\sigma} \sim N(0, 1)$.

If $X \sim N(\mu, \sigma^2)$ and $a$ is a constant, then

$P(X \ge a) = P\!\left(Z \ge \dfrac{a - \mu}{\sigma}\right)$

If $X \sim N(\mu, \sigma^2)$ and $a$ and $b$ are constants, then

$P(a \le X \le b) = P\!\left(\dfrac{a - \mu}{\sigma} \le Z \le \dfrac{b - \mu}{\sigma}\right)$
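A minimal sketch of the standardization step, assuming illustrative values for mu, sigma, a, and b; the standard normal cdf is built from the error function rather than a statistics library.

```python
# Normal probabilities by standardizing X ~ N(mu, sigma^2) to Z ~ N(0, 1).
from math import erf, sqrt

def std_normal_cdf(z):
    """P(Z <= z) for Z ~ N(0, 1), via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

mu, sigma = 10.0, 2.0   # illustrative mean and standard deviation
a, b = 9.0, 13.0        # illustrative constants

# P(X >= a) = P(Z >= (a - mu)/sigma)
p_tail = 1.0 - std_normal_cdf((a - mu) / sigma)

# P(a <= X <= b) = P((a - mu)/sigma <= Z <= (b - mu)/sigma)
p_interval = std_normal_cdf((b - mu) / sigma) - std_normal_cdf((a - mu) / sigma)

print(p_tail, p_interval)
```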
Marginal and Conditional Distributions

$f(x) = \sum_{y} f(x, y)$ for each value $X$ can take

$f(y) = \sum_{x} f(x, y)$ for each value $Y$ can take
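A short sketch of recovering the marginal pmfs by summing the joint pmf over the other variable; the joint table is the same made-up example used above.

```python
# Marginal distributions from an illustrative joint pmf f(x, y).
from collections import defaultdict

f = {  # (x, y): f(x, y)
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

f_x, f_y = defaultdict(float), defaultdict(float)
for (x, y), p in f.items():
    f_x[x] += p   # f(x) = sum over y of f(x, y)
    f_y[y] += p   # f(y) = sum over x of f(x, y)

print(dict(f_x))  # approximately {0: 0.3, 1: 0.7}
print(dict(f_y))  # approximately {0: 0.4, 1: 0.6}
```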
Assumptions of the Simple Linear Regression Model

SR1. The value of $y$, for each value of $x$, is $y = \beta_1 + \beta_2 x + e$.

SR2. The average value of the random error $e$ is $E(e) = 0$, since we assume that $E(y) = \beta_1 + \beta_2 x$.

SR3. The variance of the
SR4.
SR5.
SR6.