DISTANCE BETWEEN TWO POINTS

We have two points in R^2 whose distance we want to measure. The problem is that when we measure the coordinates of the points, each coordinate is perturbed by an independent measurement error with a normal distribution of mean 0 and standard deviation σ. We then compute the distance from the observed coordinates. Using theory (for example, reducing the problem by rescaling σ to the points (0,0) and (0,1)) and simulations, I would like to study the mean and the distribution of the observed distance as σ and the "true" distance vary.

How can I simulate this in R?
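A minimal simulation sketch, assuming the true points are (0,0) and (d,0) (by rotation and translation this loses no generality) and an independent N(0, σ²) error on each of the four observed coordinates. The function name `simulate_distance` and the choice n = 1e5 are just for illustration:

```r
# Sketch: observed distance between two noisily measured points in R^2.
# Assumed setup: true points (0, 0) and (d, 0); each observed coordinate
# gets an independent N(0, sigma^2) measurement error.
simulate_distance <- function(d, sigma, n = 1e5) {
  x1 <- rnorm(n, mean = 0, sd = sigma)
  y1 <- rnorm(n, mean = 0, sd = sigma)
  x2 <- rnorm(n, mean = d, sd = sigma)
  y2 <- rnorm(n, mean = 0, sd = sigma)
  sqrt((x2 - x1)^2 + (y2 - y1)^2)  # observed distances
}

set.seed(1)
obs <- simulate_distance(d = 1, sigma = 0.1)
mean(obs)  # close to the true distance 1, but slightly biased upward
sd(obs)
```

On the theory side: the vector of coordinate differences has mean (d, 0) and independent components with variance 2σ², so the observed distance should follow a Rice distribution with parameters ν = d and scale σ√2; comparing `hist(obs)` against that density is one way to check the simulation.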