Viewing source code
The following is the source code for post >>>/math/176:

In his work which introduced decimals to Europe, Stevin wrote (converting it to modern notation) that when you compute 0.4 ÷ 0.03, the division algorithm gives you infinitely many 3's. But he didn't call this infinite result an exact answer. Instead, he noted 13.33⅓ and 13.333⅓ as exact answers, while recommending truncating to 13.33, or to however many places the problem at hand requires. So the main idea of infinite decimals giving arbitrarily good approximations was clearly there. But at what point did people start saying things like 0.4 ÷ 0.03 = 13.333... exactly?
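(For reference, here is a quick check of the arithmetic in modern notation; this is my own working, not Stevin's presentation:)

0.4 ÷ 0.03 = 40/3 = 13 + 1/3 = 13.333...
13.33⅓ = 13 + 3/10 + (3⅓)/100 = 13 + 90/300 + 10/300 = 13 + 1/3 exactly
13.333⅓ = 13 + 333/1000 + (⅓)/1000 = 13 + 1000/3000 = 13 + 1/3 exactly

So tacking the fractional digit onto the last place recovers the exact quotient, while the pure infinite decimal is only described as an approximation process.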