Practice Problem - Monte-Carlo Error Propagation

Part 1

You have likely encountered the concept of propagation of uncertainty before. The idea is that, given measurements with uncertainties, we can find the uncertainty on the final result of a calculation.

For example, let us consider the following equation:

$$F = \frac{G~M_1~M_2}{r^2}$$

which gives the gravitational force between two masses $M_1$ and $M_2$ separated by a distance $r$.

Let us now imagine that we have two masses:




separated by a distance:


where the uncertainties are the standard deviations of Gaussian distributions, which could represent, e.g., measurement errors.

We also know:

$$G = 6.67384\times10^{-11}~\rm{m}^3~\rm{kg}^{-1}~\rm{s}^{-2}$$

(exact value, no uncertainty)

Use the standard error propagation rules to determine the resulting force and its uncertainty in your script (you can derive the propagation equation by hand and implement it in a single line of code).
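As a sketch, the single-line implementation could look like the following. The mass and distance values below are placeholders, since the actual means and standard deviations should be taken from the problem statement above:

```python
import numpy as np

G = 6.67384e-11  # m^3 kg^-1 s^-2 (exact, no uncertainty)

# Placeholder measurements -- substitute the means and standard
# deviations given in the problem statement:
M1, sigma_M1 = 40e4, 0.05e4  # kg
M2, sigma_M2 = 30e4, 0.1e4   # kg
r, sigma_r = 3.2, 0.01       # m

F = G * M1 * M2 / r**2

# For products and quotients, relative uncertainties add in quadrature;
# the r**2 term contributes twice the relative uncertainty of r:
sigma_F = F * np.sqrt((sigma_M1 / M1)**2
                      + (sigma_M2 / M2)**2
                      + (2 * sigma_r / r)**2)
```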

Now, we can try using a Monte-Carlo technique instead. The idea behind Monte-Carlo techniques is to generate many possible outcomes using random numbers and use these to examine the overall distribution of results. In the above case, you can propagate uncertainties with a Monte-Carlo method by doing the following:

  • randomly sample 1,000,000 values each of $M_1$, $M_2$, and $r$, using the means and standard deviations given above

  • compute the gravitational force for each set of values

You should do this with NumPy arrays, and without any loops. You should then get an array of 1,000,000 different values for the force.
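The sampling step can be sketched as follows, again with placeholder means and standard deviations standing in for the values given in the problem statement:

```python
import numpy as np

N = 1_000_000
rng = np.random.default_rng()

G = 6.67384e-11  # m^3 kg^-1 s^-2

# Placeholder means/standard deviations -- use the values from the
# problem statement:
M1 = rng.normal(40e4, 0.05e4, N)  # kg
M2 = rng.normal(30e4, 0.1e4, N)   # kg
r = rng.normal(3.2, 0.01, N)      # m

# One vectorized expression gives all one million forces at once -- no loops:
F = G * M1 * M2 / r**2
```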

Make a plot of the normalized histogram of these values of the force, and then overplot a Gaussian function with the mean and standard deviation derived with the standard error propagation rules. Make sure that you pick the range of x values in the plot wisely, so that the two distributions can be seen. Make sure there are also a sensible number of bins in the histogram so that you can compare the shape of the histogram and the Gaussian function. The two distributions should agree pretty well.
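One possible way to build the comparison plot is sketched below, once more with placeholder measurements; the analytic mean and standard deviation come from the error-propagation formula, and the x range is centered on the mean so both distributions are visible:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; not needed in a notebook
import matplotlib.pyplot as plt

G = 6.67384e-11  # m^3 kg^-1 s^-2
N = 1_000_000
rng = np.random.default_rng()

# Placeholder measurements (substitute the problem's values):
F_mc = (G * rng.normal(40e4, 0.05e4, N) * rng.normal(30e4, 0.1e4, N)
        / rng.normal(3.2, 0.01, N)**2)

# Analytic mean and standard deviation from standard error propagation:
F_mean = G * 40e4 * 30e4 / 3.2**2
F_sigma = F_mean * np.sqrt((0.05 / 40)**2 + (0.1 / 30)**2 + (2 * 0.01 / 3.2)**2)

# Pick the x range to cover the bulk of both distributions:
x = np.linspace(F_mean - 5 * F_sigma, F_mean + 5 * F_sigma, 501)
gaussian = np.exp(-(x - F_mean)**2 / (2 * F_sigma**2)) / (F_sigma * np.sqrt(2 * np.pi))

# density=True normalizes the histogram to unit area, matching the Gaussian:
plt.hist(F_mc, bins=100, range=(x[0], x[-1]), density=True, label="Monte-Carlo")
plt.plot(x, gaussian, label="Analytic Gaussian")
plt.xlabel("Force (N)")
plt.ylabel("Probability density")
plt.legend()
plt.savefig("force_comparison.png")
```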

In [ ]:
# Your solution here

Part 2

Now repeat the experiment above with the following values:


and as above, produce a plot.

In this case, which method do you think is more accurate? Why? What do you think are the advantages of using a Monte-Carlo technique?


In [ ]:
# Your solution here