Historically, people have used many ways to represent natural numbers: from the original "unary" arithmetic, in which each number is represented as a sequence of, e.g., cuts (4 is IIII), to modern decimal and binary systems. However, despite all this variety, some seemingly reasonable ways of representing natural numbers were never used. For example, it may seem reasonable to represent each number as a product -- e.g., as a product of prime numbers -- yet such a representation was never used in history. So why were some theoretically possible representations of natural numbers historically used while others were not? In this paper, we propose an algorithm-based explanation for this difference: namely, the historically used representations have decidable theories -- i.e., for each such representation, there is an algorithm that, given a formula, decides whether this formula is true or false -- while for the unused representations, no such algorithm is possible.
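To make the product-based representation mentioned above concrete, here is a minimal sketch (in Python; the function names are ours, chosen for illustration) that encodes a natural number as its list of prime–exponent pairs and decodes such a list back into a number:

```python
def prime_factorization(n):
    """Represent n >= 1 as a list of (prime, exponent) pairs,
    e.g., 12 -> [(2, 2), (3, 1)]; the empty list represents 1."""
    factors = []
    p = 2
    while p * p <= n:
        if n % p == 0:
            e = 0
            while n % p == 0:
                n //= p
                e += 1
            factors.append((p, e))
        p += 1
    if n > 1:  # whatever remains is itself prime
        factors.append((n, 1))
    return factors

def from_factorization(factors):
    """Recover the number from its product representation."""
    n = 1
    for p, e in factors:
        n *= p ** e
    return n

# prime_factorization(12) -> [(2, 2), (3, 1)]
# from_factorization([(2, 2), (3, 1)]) -> 12
```

Note that in this representation multiplying two numbers is easy (add the exponents of matching primes), while addition has no comparably simple rule -- a first hint of why such a representation behaves differently from the decimal and binary ones.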