

Scaling Poisson variates

If we have a count value $ C_0$ that follows a Poisson distribution, we can immediately take $ C_0$ as the estimate of its average and $ \sqrt{C_0}$ as the estimate of its s.d. That is, repeated experiments would give values $ n$ distributed according to the normalized distribution

$\displaystyle P(n)=\frac{C_0^n\,\mathrm{e}^{-C_0}}{n!}
$

This obeys

$\displaystyle \sum_{n=0}^{+\infty} P(n)=1\ ;
$

$\displaystyle \langle n\rangle=\sum_{n=0}^{+\infty} nP(n)=C_0\ ;
$

$\displaystyle \langle n^2\rangle=\sum_{n=0}^{+\infty} n^2 P(n)=C_0^2+C_0\ ;
$

$\displaystyle \sigma_{C_0}=\sqrt{\langle n^2\rangle-\langle n\rangle^2}=\sqrt{C_0}
$
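As a quick numerical sanity check of these moments, here is a minimal Python sketch; the value $ C_0=25$ , the seed, and the sample size are arbitrary illustrative choices, not taken from the text:

    import numpy as np

    rng = np.random.default_rng(0)
    C0 = 25.0                            # arbitrary example count
    n = rng.poisson(C0, size=1_000_000)  # repeated "experiments"

    print(n.mean())   # ~ C0 = 25        (the average)
    print(n.var())    # ~ C0 = 25        (<n^2> - <n>^2)
    print(n.std())    # ~ sqrt(C0) = 5   (the s.d.)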

Suppose now that our observable is

$\displaystyle X_0=\eta_0 C_0
$

where $ \eta_0$ is a known error-free scaling factor. The distribution of $ X$ is

$\displaystyle P'(X)=P(X/\eta_0)=P(n)\qquad\text{with}\qquad \frac{X}{\eta_0}\equiv n\in\mathbb{Z}
$

and now,

$\displaystyle \sum_{n=0}^{+\infty} P(n)=1\ ;
$

$\displaystyle \langle X\rangle=\sum_{n=0}^{+\infty} \eta_0 nP(n)=\eta_0 C_0=X_0\ ;
$

$\displaystyle \langle X^2\rangle=\sum_{n=0}^{+\infty} \eta_0^2 n^2 P(n)=\eta_0^2(C_0^2+C_0)=X_0^2+\eta_0 X_0\ ;
$

$\displaystyle \sigma_X=\sqrt{\langle X^2\rangle-\langle X\rangle^2}=\sqrt{\eta_0 X_0}=\eta_0\sqrt{C_0}=\sqrt{\eta_0}\sqrt{X_0}
$

It is no longer valid that $ \sigma_X=\sqrt{\langle X\rangle}=\sqrt{X_0}$ ; instead,

$\displaystyle \sigma_X=\sqrt{\eta_0}\sqrt{X_0}=\eta_0\sqrt{C_0}=\eta_0\sigma_{C_0}
$

which is the scaling relationship characteristic of a normal variate.
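A minimal sketch of this scaling behaviour, again with arbitrary illustrative values for $ C_0$ and $ \eta_0$ : the s.d. of $ X=\eta_0 C$ comes out as $ \eta_0\sqrt{C_0}=\sqrt{\eta_0 X_0}$ , not $ \sqrt{X_0}$ .

    import numpy as np

    rng = np.random.default_rng(0)
    C0, eta0 = 25.0, 3.0             # arbitrary example values
    X = eta0 * rng.poisson(C0, size=1_000_000)

    print(X.mean())                  # ~ eta0*C0 = X0 = 75
    print(X.std())                   # ~ eta0*sqrt(C0) = 15
    print(np.sqrt(eta0 * X.mean()))  # sqrt(eta0*X0) = 15, matches
    print(np.sqrt(X.mean()))         # sqrt(X0) ~ 8.66, does NOT match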

Moreover, assume now that the scaling factor is not exactly known, but is instead itself a normal variate with average $ \eta_0$ , s.d. $ \sigma_{\eta_0}$ , and distribution

$\displaystyle \widehat{P}(\eta)=\frac{\exp\left[-\frac{1}{2}{\left(\frac{\eta-\eta_0}{\sigma_{\eta_0}}\right)}^2\right]}{\sigma_{\eta_0}\sqrt{2\pi}}
$

Then,

$\displaystyle \int_{-\infty}^{+\infty}\mathrm{d}\eta\,\sum_{n=0}^{+\infty} P(n)\,\widehat{P}(\eta)=1\ ;
$

$\displaystyle \langle X\rangle=\int_{-\infty}^{+\infty}\mathrm{d}\eta\,\sum_{n=0}^{+\infty} \eta\,n\,P(n)\,\widehat{P}(\eta)
=C_0\int_{-\infty}^{+\infty}\mathrm{d}\eta\,\widehat{P}(\eta)\,\eta
=\eta_0 C_0=X_0\ ;
$

$\displaystyle \langle X^2\rangle=\int_{-\infty}^{+\infty}\mathrm{d}\eta\,\widehat{P}(\eta)\,\eta^2\sum_{n=0}^{+\infty} n^2 P(n)
=(\eta_0^2+\sigma_{\eta_0}^2)(C_0^2+C_0)\ ;
$

$\displaystyle \frac{\sigma_X^2}{X_0^2}=\frac{\langle X^2\rangle-\langle X\rangle^2}{X_0^2}
=\frac{1}{C_0}+\frac{\sigma_{\eta_0}^2}{\eta_0^2}+\frac{\sigma_{\eta_0}^2}{\eta_0^2\,C_0}
\approx{\left(\frac{\sigma_{C_0}}{C_0}\right)}^2+{\left(\frac{\sigma_{\eta_0}}{\eta_0}\right)}^2
$

where in the last step we discard, as usual, the fourth-order term in the relative errors (recall that $ \sigma_{C_0}^2/C_0^2=1/C_0$ ). Both the exact and the approximate forms are identical to those obtained if both distributions were normal.
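The exact and approximate forms above can be checked by direct Monte Carlo sampling. In the following Python sketch, all parameter values are arbitrary illustrative choices, and the scale factor and the counts are drawn independently:

    import numpy as np

    rng = np.random.default_rng(0)
    C0, eta0, sig_eta = 25.0, 3.0, 0.15      # arbitrary example values
    N = 1_000_000

    eta = rng.normal(eta0, sig_eta, size=N)  # normal-variate scale factor
    n = rng.poisson(C0, size=N)              # independent Poisson counts
    X = eta * n

    rel_var = X.var() / X.mean()**2          # sampled sigma_X^2 / X0^2
    exact = 1/C0 + (sig_eta/eta0)**2 + (sig_eta/eta0)**2/C0
    approx = 1/C0 + (sig_eta/eta0)**2
    print(rel_var, exact, approx)            # all three agree closely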

