Viewing source code

The following is the source code for post >>>/math/176

In the work that introduced decimal fractions to Europe (De Thiende, 1585), Stevin wrote (in modern notation) that when you divide 0.4 by 0.03, the algorithm gives you infinitely many 3's. But he didn't call this infinite result an exact answer. Instead he gave 13.33⅓ and 13.333⅓ as exact answers, while recommending truncation to 13.33, or to however many places the problem at hand requires. So the main idea, that an infinite decimal gives arbitrarily good approximations, was clearly already there. But at what point did people start writing things like 0.4 ÷ 0.03 = 13.333... as an exact equality?
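
For concreteness, here is the arithmetic behind that (my own check, not anything in Stevin's text): both of his "exact" forms are literally 40/3, while a bare truncation is not.

\[
0.4 \div 0.03 = \frac{40}{3} = 13 + \frac{1}{3},
\qquad
13.33\tfrac{1}{3} = 13 + \frac{33 + \frac{1}{3}}{100} = 13 + \frac{100/3}{100} = 13 + \frac{1}{3},
\]

and likewise \(13.333\tfrac{1}{3} = 13 + \frac{1000/3}{1000} = 13 + \frac{1}{3}\), whereas the truncation \(13.33\) misses \(\frac{40}{3}\) by \(\frac{1}{300}\), and \(13.333\) misses it by \(\frac{1}{3000}\). Truncating further shrinks the gap but never closes it, which is presumably why Stevin appended the terminal \(\tfrac{1}{3}\) when he wanted exactness.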