I have seen random errors being defined as those which average to 0 as the number of measurements goes to infinity, and that the error is equally likely to be positive or negative. This only requires a symmetric probability distribution about zero. However, typing this question into Google, I did not find a single source that suggested random errors could be anything other than Gaussian. Why must random errors be Gaussian?
Answer
Are random errors necessarily Gaussian?
Errors are very often Gaussian, but not always. Here are some physical systems where the random fluctuations (or "errors", if you're in a context where the thing that's varying constitutes an error) are not Gaussian; a short simulation sketch follows these examples:
- The distribution of times between clicks in a photodetector exposed to light is an exponential distribution.$^{[a]}$
- The number of times a photodetector clicks in a fixed period of time is a Poisson distribution.
- The position offset, due to uniformly distributed angle errors, of a light beam hitting a target some distance away is a Cauchy distribution.
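As a rough numerical illustration (my own addition, not part of the original answer), here is a minimal Python/NumPy sketch that simulates all three of these non-Gaussian error mechanisms; the click rate, counting window, and target distance are arbitrary values assumed just for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000          # number of simulated samples (arbitrary)
rate = 5.0           # photodetector click rate, clicks per second (assumed)
window = 1.0         # counting window in seconds (assumed)
distance = 10.0      # distance from source to target (assumed)

# 1) Times between photodetector clicks: exponential, not Gaussian.
wait_times = rng.exponential(scale=1.0 / rate, size=n)

# 2) Number of clicks in a fixed window: Poisson, not Gaussian.
counts = rng.poisson(lam=rate * window, size=n)

# 3) Position offset on a distant target from a uniformly distributed
#    angular error: distance * tan(angle) follows a Cauchy distribution.
angles = rng.uniform(-np.pi / 2, np.pi / 2, size=n)
offsets = distance * np.tan(angles)

print("mean wait time:", wait_times.mean())   # ~ 1/rate
print("mean count:", counts.mean())           # ~ rate * window
print("median offset:", np.median(offsets))   # ~ 0, but the mean is ill-behaved
```

A histogram of any of these three arrays looks visibly non-Gaussian: one-sided for the exponential, discrete and skewed for the Poisson, and heavy-tailed for the Cauchy.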
> I have seen random errors being defined as those which average to 0 as the number of measurements goes to infinity, and that the error is equally likely to be positive or negative. This only requires a symmetric probability distribution about zero.
There are distributions that have equal weight on the positive and negative side, but are not symmetric. Example: $$ P(x) = \left\{ \begin{array}{ll} 1/2 & x=1 \\ 1/4 & x=-1 \\ 1/4 & x=-2 \, . \end{array}\right.$$ Here the error is positive or negative with equal probability ($1/2$ on each side), yet the distribution is clearly not symmetric about zero.
> However, typing this question into Google, I did not find a single source that suggested random errors could be anything other than Gaussian. Why must random errors be Gaussian?
The fact that it's not easy to find references to non-Gaussian random errors does not mean that all random errors are Gaussian :-)
As mentioned in the other answers, many distributions in Nature are Gaussian because of the central limit theorem. The central limit theorem says that if a random variable $x$ is distributed according to a function $X(x)$ with finite second moment, then a second random variable $y$ defined as the average of $N$ independent instances of $x$, i.e. $$y \equiv \frac{1}{N} \sum_{i=1}^N x_i \, ,$$ has a distribution $Y(y)$ that approaches a Gaussian as $N$ becomes large.
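As a hedged numerical check (again my own sketch, not from the original answer), the snippet below averages $N$ samples from a strongly skewed non-Gaussian distribution (an exponential) and shows the skewness of the averages shrinking toward zero as $N$ grows, which is the central limit theorem at work; the sample sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def skewness(samples):
    """Sample skewness: roughly zero for a Gaussian-looking distribution."""
    centered = samples - samples.mean()
    return np.mean(centered**3) / np.std(samples)**3

trials = 20_000  # number of independent averages y computed (arbitrary)
for N in (1, 10, 100, 1000):
    # y = (1/N) * sum of N exponential samples; the exponential has skewness 2.
    y = rng.exponential(scale=1.0, size=(trials, N)).mean(axis=1)
    print(f"N = {N:5d}: skewness of y ~ {skewness(y):.3f}")
# The printed skewness falls roughly like 2/sqrt(N), so the distribution
# of y looks more and more Gaussian as N increases.
```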
The thing is, many physical processes are the sums of smaller processes. For example, the fluctuating voltage across a resistor is the sum of the voltage contributions from many individual electrons. Therefore, when you measure a voltage, you get the underlying "static" value plus some random error produced by the noisy electrons, which, because of the central limit theorem, is Gaussian distributed. In other words, Gaussian distributions are very common because so many of the random things in Nature come from a sum of many small contributions.
However,
- There are plenty of cases where the constituents of an underlying error mechanism have a distribution that does not have a finite second moment; the Cauchy distribution is the most common example.
- There are also plenty of cases where an error is simply not the sum of many small underlying contributions.
Either of these cases leads to non-Gaussian errors.
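To make the Cauchy case concrete (a sketch of my own, not from the original answer), the snippet below averages $N$ standard Cauchy samples and shows that the spread of the averages does not shrink with $N$: the average of $N$ independent Cauchy variables is itself Cauchy, so the central-limit mechanism described above never kicks in. The sample sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
trials = 20_000  # number of independent averages computed (arbitrary)

def iqr(samples):
    """Interquartile range: a spread measure that exists even when the variance doesn't."""
    q1, q3 = np.percentile(samples, [25, 75])
    return q3 - q1

for N in (1, 100, 1000):
    # Average N standard Cauchy samples; for a finite-variance distribution
    # this spread would shrink like 1/sqrt(N), but here it stays put.
    averages = rng.standard_cauchy(size=(trials, N)).mean(axis=1)
    print(f"N = {N:5d}: IQR of the averages ~ {iqr(averages):.2f}")
# All three printed IQRs come out near 2 (the IQR of a standard Cauchy),
# so averaging does not make this kind of error Gaussian, or even narrower.
```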
$[a]$: See this other Stack Exchange post.