Error Propagation

The purpose of this document is to explain how Mantid deals with error propagation and how it is applied in Mantid's algorithms.

Theory

In order to deal with error propagation, Mantid treats errors as Gaussian probability distributions (also known as the bell curve or normal distribution) and treats each observation as independent. This means that if X = 100 \pm 1, then a value of 102 can still occur, but it is less likely than 101 or 99, and a value of 105 is far less likely still than any of these values.
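To make these relative likelihoods concrete, the short Python sketch below (purely illustrative, not part of Mantid) evaluates the Gaussian probability density for the values mentioned above:

    import math

    def gaussian_pdf(x, mu=100.0, sigma=1.0):
        # Probability density of a normal distribution with mean mu and width sigma
        return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

    # Relative likelihood of observing each value when X = 100 +/- 1
    for x in (99, 101, 102, 105):
        print(x, gaussian_pdf(x))

Running this shows that 99 and 101 are equally likely, 102 is roughly 4.5 times less likely than 101, and 105 is vanishingly unlikely.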

Plus and Minus Algorithm

The Plus v1 algorithm adds two datasets together, propagating the uncertainties. Mantid calculates the result of X_1 + X_2 as

X = X_1 + X_2

with uncertainty

\sigma_X = \sqrt{ \left( \sigma_{X_1} \right)^2 + \left( \sigma_{X_2} \right)^2 }.

Consider the example where X_1 = 101 \pm 2 and X_2 = 99 \pm 2. Then for this algorithm:

X = X_1 + X_2 = 101 + 99 = 200

\sigma_X = \sqrt{ 2^2 + 2^2} = \sqrt{8} = 2.8284

Hence the result of Plus v1 can be summarised as X = 200 \pm \sqrt{8}.
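For reference, this example can be reproduced through Mantid's Python interface. The following is a minimal sketch, assuming a working Mantid installation; the single-bin workspaces exist only to carry the example values:

    from mantid.simpleapi import CreateWorkspace, Plus

    # Two single-value point-data workspaces: 101 +/- 2 and 99 +/- 2
    ws1 = CreateWorkspace(DataX=[0.0], DataY=[101.0], DataE=[2.0])
    ws2 = CreateWorkspace(DataX=[0.0], DataY=[99.0], DataE=[2.0])

    total = Plus(LHSWorkspace=ws1, RHSWorkspace=ws2)
    print(total.readY(0)[0], total.readE(0)[0])  # expect 200.0 and 2.8284...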

Mantid deals with the Minus v1 algorithm similarly: the result of X_1 - X_2 is

X = X_1 - X_2

with uncertainty

\sigma_X = \sqrt{ \left( \sigma_{X_1} \right)^2 + \left( \sigma_{X_2} \right)^2 }.
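Since the quadrature formula is identical for addition and subtraction, it can be checked with a few lines of plain Python (a sketch independent of Mantid):

    import math

    def add_sub_sigma(sigma1, sigma2):
        # Uncertainty of X1 + X2 or X1 - X2: quadrature sum of the input uncertainties
        return math.sqrt(sigma1 ** 2 + sigma2 ** 2)

    # Worked example from above: X1 = 101 +/- 2, X2 = 99 +/- 2
    print(101 + 99, add_sub_sigma(2, 2))  # 200  2.8284...
    print(101 - 99, add_sub_sigma(2, 2))  # 2    2.8284...

Note that Minus v1 gives X = 2 \pm 2.8284 here: subtracting two similar values leaves the absolute uncertainty unchanged, so the relative uncertainty of the difference becomes very large.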

Multiply and Divide Algorithm

The Multiply v1 and Divide v1 algorithms propagate the uncertainties according to:

\sigma_X = \left|X\right| \sqrt{ \left( \frac{\sigma_{X_1}}{X_1} \right)^2 + \left( \frac{\sigma_{X_2}}{X_2} \right)^2 },

where X is the result of the multiplication, X = X_1 \cdot X_2, or the division, X = X_1 / X_2.

Consider again the example above, where X_1 = 101 \pm 2 and X_2 = 99 \pm 2. Mantid calculates the result of X_1 / X_2 as X = 101 / 99 = 1.0202, with uncertainty \sigma_X = 1.0202 \sqrt{ \left(2/101\right)^2 + \left(2/99\right)^2} = 0.0289.

For Multiply v1, the result of X_1 \times X_2 is X = 101 \times 99 = 9999, with uncertainty \sigma_X = 9999 \sqrt{ \left(2/101\right)^2 + \left(2/99\right)^2} = 282.8568.
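Both results follow from the same relative-error formula, as the following plain-Python sketch (again independent of Mantid) confirms:

    import math

    def mul_div_sigma(x, x1, sigma1, x2, sigma2):
        # Uncertainty of X = X1 * X2 or X = X1 / X2:
        # quadrature sum of the relative errors, scaled by |X|
        return abs(x) * math.sqrt((sigma1 / x1) ** 2 + (sigma2 / x2) ** 2)

    x_div = 101 / 99
    print(x_div, mul_div_sigma(x_div, 101, 2, 99, 2))  # 1.0202  0.0289
    x_mul = 101 * 99
    print(x_mul, mul_div_sigma(x_mul, 101, 2, 99, 2))  # 9999    282.8568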

Category: Concepts