Decimal fractions

Finite decimal numbers are called decimal fractions because they are simply another way of writing fractions whose denominator is a power of ten. That is:

$$\frac{z}{n} = q_0 + \frac{q_1}{10} + \frac{q_2}{10^2} + \dots + \frac{q_k}{10^k}$$

with \(k \in \mathbb{N}\), \(q_0\) the integer part, and \(q_j\) (for \(1 \le j \le k\)) the \(j\)-th digit after the decimal point.
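For instance, for the decimal fraction \(3.141\) we have \(k = 3\) and

$$3.141 = 3 + \frac{1}{10} + \frac{4}{10^2} + \frac{1}{10^3},$$

so \(q_0 = 3\), \(q_1 = 1\), \(q_2 = 4\) and \(q_3 = 1\).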


Bringing everything over the common denominator \(10^k\), we have:

$$\frac{z}{n} = \frac{10^k \cdot q_0 + 10^{k-1} \cdot q_1 + \dots + q_k}{10^k} = \frac{10^k \cdot q_0 + 10^{k-1} \cdot q_1 + \dots + q_k}{2^k \cdot 5^k}$$
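With the example \(3.141\) from above, this reads

$$3.141 = \frac{10^3 \cdot 3 + 10^2 \cdot 1 + 10 \cdot 4 + 1}{10^3} = \frac{3141}{2^3 \cdot 5^3}.$$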

This means: if the denominator of a fraction in fully reduced form \(\frac{z}{n}\) can be expanded to \(2^k \cdot 5^k\), then \(\frac{z}{n}\) is a finite decimal fraction. If we consider the prime factorization of the denominator, \(n = p_1^{l_1} \cdot \, \dots \, \cdot p_j^{l_j}\), then by the fundamental theorem of arithmetic this factorization is unique, so \(n\) can be expanded to \(2^k \cdot 5^k\) with a factor \(f = 2^{k-m} \cdot 5^{k-r}\) only if \(n = 2^m \cdot 5^r\) (with \(k = \max(m, r)\)). Hence:

Only fractions whose fully reduced denominator contains no prime factors other than 2 and 5 yield a finite decimal fraction.
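For example, \(\frac{3}{8} = \frac{3}{2^3}\) can be expanded with \(f = 5^3\) to \(\frac{375}{1000} = 0.375\), while \(\frac{1}{3}\) has the prime factor 3 in its reduced denominator and therefore has no finite decimal representation.

This criterion is easy to check mechanically. The following Python sketch (the function name is_finite_decimal is chosen freely here for illustration) first reduces the fraction and then strips all factors 2 and 5 from the denominator:

```python
from math import gcd

def is_finite_decimal(z: int, n: int) -> bool:
    """Return True if the fraction z/n is a finite decimal fraction."""
    # Fully reduce the fraction by cancelling the greatest common divisor.
    n //= gcd(z, n)
    # Remove all prime factors 2 and 5 from the reduced denominator.
    for p in (2, 5):
        while n % p == 0:
            n //= p
    # If only 1 remains, the denominator had no prime factors other than 2 and 5.
    return n == 1

print(is_finite_decimal(3, 8))   # True:  3/8 = 0.375
print(is_finite_decimal(1, 3))   # False: 1/3 = 0.333...
```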
