While modern computers are fast, many practical problems still require even faster computation. It turns out that, on the fundamental level, one of the main factors limiting computation speed is that, according to modern physics, the speed of all processes is limited by the speed of light. The good news is that while the corresponding limitation is very severe in Euclidean geometry, it can be more relaxed in (at least some) non-Euclidean spaces -- and, according to modern physics, physical space is not Euclidean. The deviations from the Euclidean character are especially large on the micro-level, where quantum effects need to be taken into account. To analyze how we can speed up computations, it is desirable to reconstruct the actual distance values -- corresponding to all possible paths -- from the values that we actually measure, which correspond only to macro-paths and thus provide only an upper bound on the distance. In our previous papers -- including our joint paper with Victor Selivanov -- we provided an explicit formula for such a reconstruction. But for this formula to be useful, we need to analyze how algorithmic this reconstruction is. In this paper, we show that while in general no reconstruction algorithm is possible, an algorithm does become possible if we impose a lower bound on the distances between steps in a path. So, hopefully, this result can eventually help come up with faster computations.
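The flavor of such a reconstruction can be conveyed by a minimal sketch -- an illustration under simplifying assumptions, not the paper's actual formula. Here we assume a finite set of sample points with measured (macro-path) distances `d_meas[i][j]`, which serve only as upper bounds; the reconstructed distance between two points is then the infimum, over all chains of points, of the sum of measured step distances, which on a finite set reduces to a shortest-path computation (the function name `reconstruct_distances` is ours):

```python
def reconstruct_distances(d_meas):
    """Illustrative sketch: given measured macro-path distances
    d_meas[i][j] on a finite set of sample points (upper bounds for
    the true distances), compute for every pair the infimum over all
    chains of the sums of measured step distances.  On a finite point
    set this is a shortest-path computation (Floyd-Warshall)."""
    n = len(d_meas)
    # Start from the measured upper bounds.
    d = [row[:] for row in d_meas]
    # Try routing every pair (i, j) through every intermediate point k;
    # keep the chain whenever it yields a smaller total distance.
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

# Example: the direct measured value d_meas[0][1] = 10 is only an
# upper bound; the chain through point 2 gives the smaller value 3 + 4 = 7.
d = reconstruct_distances([[0, 10, 3],
                           [10, 0, 4],
                           [3, 4, 0]])
```

The finiteness of the point set plays the role of the lower bound on step distances discussed above: when each step is at least some eps > 0 long, a chain of total measured length at most D has at most D/eps steps, so only finitely many chains need to be examined.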