Proof of linearity of expectation when random variables are dependent

$\begingroup$

The proof of linearity of expectation when the random variables are independent is intuitive. What is the proof when they are dependent?

Formally,$$ E(X+Y)=E(X)+E(Y)$$where $X$ and $Y$ are dependent random variables.

The proof below assumes that $X$ and $Y$ are defined on the same sample space. That is, they map from the sample space to the real number line. Is that also a condition for linearity of expectation?

Proof:$$E\left(X+Y\right) =\sum\limits_{s}\left(X+Y\right)\left(s\right) P\left({s}\right) $$$$E\left(X+Y\right) =\sum\limits_{s}\left(X\left(s\right)+Y\left(s\right)\right) P\left({s}\right) $$$$E\left(X+Y\right) =\sum\limits_{s} X\left(s\right)P\left({s}\right) + \sum\limits_{s} Y\left(s\right)P\left({s}\right) $$$$E\left(X+Y\right) =E\left(X\right)+E\left(Y\right)$$Here $S$ is the sample space and $s$ is an outcome in it; the sums run over all outcomes $s \in S$.
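As a sanity check of the identity above, here is a small Python sketch on a made-up three-outcome sample space (the probabilities and values are hypothetical, chosen only for illustration). $Y = X^2$, so $X$ and $Y$ are strongly dependent, yet $E(X+Y) = E(X)+E(Y)$ still holds:

```python
# Sample space S = {"a", "b", "c"} with made-up probabilities for each outcome.
P = {"a": 0.2, "b": 0.5, "c": 0.3}

# X and Y are functions on the SAME sample space; Y = X^2, so they are dependent.
X = {"a": 1.0, "b": 2.0, "c": 3.0}
Y = {s: X[s] ** 2 for s in P}

def E(Z):
    """E(Z) = sum over outcomes s of Z(s) * P(s)."""
    return sum(Z[s] * P[s] for s in P)

# (X + Y)(s) := X(s) + Y(s), pointwise on the sample space.
X_plus_Y = {s: X[s] + Y[s] for s in P}

print(round(E(X_plus_Y), 10))    # 7.0
print(round(E(X) + E(Y), 10))    # 7.0 -- equal, despite the dependence
```

Nothing about independence is ever used: the equality falls out of summing over the same outcomes either way.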

Reference Lecture for proof.

Also, more reasoning for step 2 would be helpful. I don't understand it completely.

$\endgroup$

1 Answer

$\begingroup$

The proof below assumes that $X$ and $Y$ are defined on the same sample space. That is, they map from the sample space to the real number line. Is that also a condition for linearity of expectation?

No.   It's the definition of a random variable.

Basically, any random variable $X$ is a function that maps the sample space to the reals (or a subset thereof, called the support):   $$X: \Omega \to \Bbb R$$

If $X$ and $Y$ are both random variables on the same sample space, then so is their sum $X+Y$. (The sum is not defined if they are not on the same sample space.)

$$ X:\Omega\to\Bbb R~\wedge~ Y:\Omega\to \Bbb R ~~\implies~~ X+Y:\Omega\to\Bbb R\\\forall s\in\Omega,\quad(X+Y)(s) := X(s)+Y(s)$$

Linearity of Expectation then follows from its definition.

$\begin{align} \mathsf E(X+Y) =&~ \sum_{\omega\in\Omega} (X+Y)(\omega)~\mathsf P(\omega) \\[1ex] =&~ \sum_{\omega\in\Omega} \bigl(X(\omega)+Y(\omega)\bigr)~\mathsf P(\omega) \\[1ex] =&~ \sum_{\omega\in \Omega} X(\omega)~\mathsf P(\omega)+\sum_{\omega\in \Omega} Y(\omega)~\mathsf P(\omega) \\[1ex] =&~ \mathsf E(X)+\mathsf E(Y) \end{align}$

The second line is just the pointwise definition of $X+Y$ from above; the third splits the sum term by term, which is valid for any finite (or absolutely convergent) sum. Independence is never used anywhere.

Of course, this is for discrete random variables. For continuous random variables we integrate instead of summing, but the argument is entirely analogous, and by no coincidence.

$\begin{align} \mathsf E(X+Y) =&~ \int_{\Omega} (X+Y)(\omega)~\mathsf P(\mathrm d \omega) \\[1ex] =&~ \int_{\Omega} X(\omega)~\mathsf P(\mathrm d \omega)+\int_{\Omega} Y(\omega)~\mathsf P(\mathrm d \omega) \\[1ex] =&~ \mathsf E(X)+\mathsf E(Y) \end{align}$
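The continuous case can also be checked numerically. Here is a rough Monte Carlo sketch (my own illustrative setup, not from the lecture): take $X \sim \mathrm{Uniform}(0,1)$ and $Y = X^2$, a fully dependent pair with $\mathsf E(X) = 1/2$ and $\mathsf E(Y) = 1/3$, so $\mathsf E(X+Y)$ should come out near $5/6$:

```python
import random

random.seed(0)
n = 200_000

# X ~ Uniform(0, 1); Y = X**2 is a deterministic function of X,
# so X and Y are as dependent as possible.
xs = [random.random() for _ in range(n)]
ys = [x ** 2 for x in xs]

E_X = sum(xs) / n                              # should be close to 1/2
E_Y = sum(ys) / n                              # should be close to 1/3
E_sum = sum(x + y for x, y in zip(xs, ys)) / n

# E(X + Y) matches E(X) + E(Y) (both near 5/6), despite the dependence.
print(E_sum, E_X + E_Y)
```

Note that `E_sum` and `E_X + E_Y` agree almost exactly by construction, since both average the same samples; that is precisely the point of linearity.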

$\endgroup$
