In many practical applications, it is useful to consider the Kolmogorov complexity K(s) of a given string s, i.e., the length of the shortest program that generates this string. Since Kolmogorov complexity is, in general, not computable, it is necessary to use computable approximations K~(s) to K(s). Usually, to obtain such an approximation, we take a compression algorithm and use the length of the compressed string as K~(s). This approximation, however, is not perfect: e.g., for most compression algorithms, adding a single bit to the string $s$ can drastically change the value K~(s), while the actual Kolmogorov complexity changes only slightly. To avoid this problem, V. Becher and P. A. Heiber proposed a new approximation called I-complexity. The formulas for this approximation depend on the choice of an appropriate function F(x). Empirically, the function F(x) = log(x) works best. In this paper, we show that this empirical fact can be explained if we take into account the corresponding symmetries.
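The compression-based approximation described above can be sketched as follows; this is a minimal illustration, assuming zlib as the compression algorithm (any compressor would do), not part of the paper itself:

```python
import zlib

def approx_K(s: bytes) -> int:
    # Compression-based approximation K~(s): the length of the
    # compressed string serves as an upper-bound estimate of K(s).
    return len(zlib.compress(s, 9))

# A highly repetitive string compresses well, so K~(s) << len(s).
s = b"abracadabra" * 20
print(len(s), approx_K(s))

# Appending a single symbol may shift the estimate by more than
# the O(1) change in the true Kolmogorov complexity.
print(approx_K(s + b"0"))
```

Such estimates are upper bounds only: the compressor's output length can exceed K(s) by an arbitrary amount for strings whose regularities the compressor does not detect.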