$X \sim \mathcal{P}(\lambda)$ and $Y \sim \mathcal{P}(\mu)$, meaning that $X$ and $Y$ are Poisson-distributed random variables. What is the probability distribution of $X + Y$? I know it is $X+Y \sim \mathcal{P}(\lambda + \mu)$, but I don't understand how to derive it.
This only holds if $X$ and $Y$ are independent, so we suppose this from now on. We have for $k \ge 0$: \begin{align*} P(X+ Y =k) &= \sum_{i = 0}^k P(X+ Y = k, X = i)\\ &= \sum_{i=0}^k P(Y = k-i , X =i)\\ &= \sum_{i=0}^k P(Y = k-i)P(X=i)\\ &= \sum_{i=0}^k e^{-\mu}\frac{\mu^{k-i}}{(k-i)!}e^{-\lambda}\frac{\lambda^i}{i!}\\ &= e^{-(\mu + \lambda)}\frac 1{k!}\sum_{i=0}^k \frac{k!}{i!(k-i)!}\mu^{k-i}\lambda^i\\ &= e^{-(\mu + \lambda)}\frac 1{k!}\sum_{i=0}^k \binom ki\mu^{k-i}\lambda^i\\ &= \frac{(\mu + \lambda)^k}{k!} \cdot e^{-(\mu + \lambda)} \end{align*} Hence, $X+ Y \sim \mathcal P(\mu + \lambda)$.
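As a sanity check, the convolution above can be evaluated numerically and compared with the $\mathcal P(\lambda+\mu)$ pmf directly (the rates $\lambda = 2$ and $\mu = 3$ below are arbitrary illustrative choices):

```python
import math

def pois_pmf(k, lam):
    """Poisson pmf P(X = k) for rate lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

lam, mu = 2.0, 3.0
for k in range(10):
    # Convolution: P(X + Y = k) = sum_i P(X = i) P(Y = k - i)
    conv = sum(pois_pmf(i, lam) * pois_pmf(k - i, mu) for i in range(k + 1))
    direct = pois_pmf(k, lam + mu)
    assert abs(conv - direct) < 1e-12
```

Every term of the convolution matches the closed-form pmf to machine precision.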
Another approach is to use characteristic functions. If $X\sim \mathrm{po}(\lambda)$, then the characteristic function of $X$ is (if this is unknown, just calculate it) $$ \varphi_X(t)=E[e^{itX}]=e^{\lambda(e^{it}-1)},\quad t\in\mathbb{R}. $$ Now suppose that $X$ and $Y$ are independent Poisson distributed random variables with parameters $\lambda$ and $\mu$ respectively. Then due to the independence we have that $$ \varphi_{X+Y}(t)=\varphi_X(t)\varphi_Y(t)=e^{\lambda(e^{it}-1)}e^{\mu(e^{it}-1)}=e^{(\mu+\lambda)(e^{it}-1)},\quad t\in\mathbb{R}. $$ As the characteristic function completely determines the distribution, we conclude that $X+Y\sim\mathrm{po}(\lambda+\mu)$.
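The characteristic-function identity can also be verified numerically by summing the series $E[e^{itX}]$ term by term with complex arithmetic (the rates and the point $t = 0.7$ below are arbitrary choices; the series is truncated at 100 terms, which is far past any non-negligible mass):

```python
import cmath
import math

def phi_poisson(t, lam, terms=100):
    """E[e^{itX}] for X ~ Poisson(lam), computed from the pmf by a truncated series."""
    return sum(cmath.exp(1j * t * k) * math.exp(-lam) * lam**k / math.factorial(k)
               for k in range(terms))

lam, mu, t = 2.0, 3.0, 0.7
# Closed form: exp(lam * (e^{it} - 1))
closed_form = cmath.exp(lam * (cmath.exp(1j * t) - 1))
assert abs(phi_poisson(t, lam) - closed_form) < 1e-10
# Product of the two characteristic functions equals the cf of Poisson(lam + mu)
prod = phi_poisson(t, lam) * phi_poisson(t, mu)
assert abs(prod - cmath.exp((lam + mu) * (cmath.exp(1j * t) - 1))) < 1e-10
```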
You can use the probability generating function (P.G.F.). Since the Poisson distribution is discrete, the P.G.F. is a natural fit here. Let $X$ and $Y$ be independent random variables following $\mathrm{Po}(\lambda)$ and $\mathrm{Po}(\mu)$. The P.G.F. of $X$ is \begin{equation*} \begin{split} P_X[t] = E[t^X]&= \sum_{x=0}^{\infty}t^xe^{-\lambda}\frac{\lambda^x}{x!}\\ &=\sum_{x=0}^{\infty}e^{-\lambda}\frac{(\lambda t)^x}{x!}\\ &=e^{-\lambda}e^{\lambda t}\\ &=e^{-\lambda (1-t)}\\ \end{split} \end{equation*} The P.G.F. of $Y$ is \begin{equation*} \begin{split} P_Y[t] = E[t^Y]&= \sum_{y=0}^{\infty}t^ye^{-\mu}\frac{\mu^y}{y!}\\ &=\sum_{y=0}^{\infty}e^{-\mu}\frac{(\mu t)^y}{y!}\\ &=e^{-\mu}e^{\mu t}\\ &=e^{-\mu (1-t)}\\ \end{split} \end{equation*}
Now consider the P.G.F. of $U = X+Y$. As $X$ and $Y$ are independent, \begin{equation*} \begin{split} P_U(t)=P_{X+Y}(t)=E[t^{X+Y}]=E[t^X t^Y]&= E[t^X]E[t^Y] = P_X(t)P_Y(t)\\ &= e^{-\lambda (1-t)}e^{-\mu (1-t)}\\ &= e^{-(\lambda+\mu) (1-t)}\\ \end{split} \end{equation*}
This is the P.G.F. of the $\mathrm{Po}(\lambda + \mu)$ distribution. Therefore, we can conclude that $U=X+Y$ follows $\mathrm{Po}(\lambda+\mu)$.
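The two P.G.F. computations above can be sketched as a numerical check: compute $E[t^X]$ by a truncated series and compare it with the closed form $e^{-\lambda(1-t)}$, then confirm the product rule (the rates and the evaluation point $t = 0.8$ below are arbitrary):

```python
import math

def pgf_poisson(t, lam, terms=100):
    """E[t^X] for X ~ Poisson(lam); closed form is exp(-lam * (1 - t))."""
    return sum(t**x * math.exp(-lam) * lam**x / math.factorial(x)
               for x in range(terms))

lam, mu, t = 1.5, 2.5, 0.8
assert abs(pgf_poisson(t, lam) - math.exp(-lam * (1 - t))) < 1e-12
# Independence: the P.G.F. of the sum factors into the product of the P.G.F.s
assert abs(pgf_poisson(t, lam) * pgf_poisson(t, mu)
           - math.exp(-(lam + mu) * (1 - t))) < 1e-12
```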
In short, you can show this by using the fact that $$Pr(X+Y=k)=\sum_{i=0}^kPr(X+Y=k, X=i).$$
If $X$ and $Y$ are independent, this is equal to $$ Pr(X+Y=k)=\sum_{i=0}^kPr(Y=k-i)Pr(X=i) $$ which is $$ \begin{align} Pr(X+Y=k)&=\sum_{i=0}^k\frac{e^{-\lambda_y}\lambda_y^{k-i}}{(k-i)!}\frac{e^{-\lambda_x}\lambda_x^i}{i!}\\ &=e^{-\lambda_y}e^{-\lambda_x}\sum_{i=0}^k\frac{\lambda_y^{k-i}}{(k-i)!}\frac{\lambda_x^i}{i!}\\ &=\frac{e^{-(\lambda_y+\lambda_x)}}{k!}\sum_{i=0}^k\frac{k!}{i!(k-i)!}\lambda_y^{k-i}\lambda_x^i\\ &=\frac{e^{-(\lambda_y+\lambda_x)}}{k!}\sum_{i=0}^k{k\choose i}\lambda_y^{k-i}\lambda_x^i \end{align} $$ The sum part is just $$ \sum_{i=0}^k{k\choose i}\lambda_y^{k-i}\lambda_x^i=(\lambda_y+\lambda_x)^k $$ by the binomial theorem. So the end result is $$ \begin{align} Pr(X+Y=k)&=\frac{e^{-(\lambda_y+\lambda_x)}}{k!}(\lambda_y+\lambda_x)^k \end{align} $$ which is the pmf of $Po(\lambda_y+\lambda_x)$.
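The same conclusion can be checked by simulation. The sketch below draws Poisson samples with Knuth's classic inverse-product algorithm (standard-library only; the rates, seed, and sample size are arbitrary choices) and confirms that the sum has mean and variance $\lambda_x + \lambda_y$, as a Poisson variable must:

```python
import math
import random

def poisson_sample(lam, rng):
    """Knuth's algorithm: multiply uniforms until the product drops below e^{-lam}."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        p *= rng.random()
        k += 1
    return k - 1

rng = random.Random(0)
lam_x, lam_y, n = 2.0, 3.0, 100_000
samples = [poisson_sample(lam_x, rng) + poisson_sample(lam_y, rng) for _ in range(n)]
mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n
# For Poisson(lam_x + lam_y), both the mean and the variance equal 5
assert abs(mean - 5.0) < 0.05
assert abs(var - 5.0) < 0.15
```

Matching mean and variance is of course only a necessary condition, but it is a quick empirical complement to the exact convolution argument above.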
Using the moment generating function (MGF).
Let $X \sim \mathcal{P}(\lambda)$ and $Y \sim \mathcal{P}(\mu)$ be independent, and let $S=X+Y$.
The MGF of $\mathcal{P}(\lambda)$ is $e^{\lambda(e^t-1)}$ (see the end for a proof).
The MGF of $S$ is
$$\begin{align}
M_S(t)&=E[e^{tS}]\\&=E[e^{t(X+Y)}]\\&=E[e^{tX}e^{tY}]\\&=E[e^{tX}]E[e^{tY}]\quad \text{given }X,Y\text{ are independent}\\&=e^{\lambda(e^t-1)}e^{\mu(e^t-1)}\\&=e^{(\lambda+\mu)(e^t-1)}
\end{align}$$
Thus $S$ follows a Poisson distribution with parameter $\lambda+\mu$.
MGF of the Poisson distribution
If $X \sim \mathcal{P}(\lambda)$, then by definition its probability mass function is
$$\begin{align}
f_X(k)=\frac{\lambda^k}{k!}e^{-\lambda},\quad k = 0,1,2,\ldots
\end{align}$$
Its MGF is
$$\begin{align}
M_X(t)&=E[e^{tX}]\\&=\sum_{k=0}^{\infty}\frac{\lambda^k}{k!}e^{-\lambda}e^{tk}\\&=e^{-\lambda}\sum_{k=0}^{\infty}\frac{\lambda^ke^{tk}}{k!}\\&=e^{-\lambda}\sum_{k=0}^{\infty}\frac{(\lambda e^t)^k}{k!}\\&=e^{-\lambda}e^{\lambda e^t}\\&=e^{\lambda e^t-\lambda}\\&=e^{\lambda(e^t-1)}
\end{align}$$
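The MGF derivation above admits the same kind of numerical sanity check as the other generating-function arguments: sum the series $E[e^{tX}]$ directly and compare it to $e^{\lambda(e^t-1)}$ (rates and the point $t = 0.5$ below are arbitrary):

```python
import math

def mgf_poisson(t, lam, terms=100):
    """E[e^{tX}] for X ~ Poisson(lam); closed form is exp(lam * (e^t - 1))."""
    return sum(math.exp(t * k) * math.exp(-lam) * lam**k / math.factorial(k)
               for k in range(terms))

lam, mu, t = 2.0, 3.0, 0.5
assert abs(mgf_poisson(t, lam) - math.exp(lam * (math.exp(t) - 1))) < 1e-9
# Independence: the MGF of S = X + Y is the product of the individual MGFs
prod = mgf_poisson(t, lam) * mgf_poisson(t, mu)
assert abs(prod - math.exp((lam + mu) * (math.exp(t) - 1))) < 1e-9
```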
Hint: $P(X+Y=n)=\sum_{k=0}^{n} P(X = k)P(Y = n-k)$.
Here's a much cleaner solution:
Consider two Poisson processes occurring with rates $\lambda$ and $\mu$, where a Poisson process of rate $r$ is viewed as the limit of $n$ consecutive Bernoulli trials, each with success probability $\frac{r}{n}$, as $n\to\infty$.
Then $X$ counts the number of successes in the trials of rate $\lambda$ and $Y$ counts the number of successes in the trials of rate $\mu$. The total number of successes is therefore the same as if each trial succeeded with probability $\frac{\lambda + \mu}{n}$, where we take $n$ large enough that the event that the $i$th Bernoulli trial succeeds in both processes has negligible probability. Then we are done.
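This limiting argument can be illustrated numerically: for large $n$, the $\mathrm{Binomial}\!\left(n, \frac{\lambda+\mu}{n}\right)$ pmf is already very close to the $\mathcal P(\lambda+\mu)$ pmf (the rates and $n = 10^6$ below are arbitrary illustrative choices):

```python
import math

def binom_pmf(k, n, p):
    """Binomial pmf P(K = k) for n trials with success probability p."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def pois_pmf(k, lam):
    """Poisson pmf P(X = k) for rate lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

lam, mu = 2.0, 3.0
n = 1_000_000  # many trials, each succeeding with probability (lam + mu) / n
for k in range(10):
    approx = binom_pmf(k, n, (lam + mu) / n)
    assert abs(approx - pois_pmf(k, lam + mu)) < 1e-4
```

The pointwise error shrinks on the order of $(\lambda+\mu)^2/n$, which is why taking $n$ large enough makes the double-success events negligible.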