Set Decimals, Set Fixed, Set("Decimals"), Set("Fixed")
These two settings control how numbers involving decimals are calculated and displayed. SET DECIMALS determines the minimum number of decimal places used and displayed. SET FIXED determines whether the SET DECIMALS value is also the maximum number of decimal places displayed.
Usage
SET DECIMALS TO [ nDecimals ]
SET FIXED ON | OFF
nDecimalSetting = SET( "DECIMALS" )
cIsItFixed = SET( "FIXED" )
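As the Hungarian prefixes in the syntax diagram suggest, the two SET() functions return different types: SET("DECIMALS") hands back a number, while SET("FIXED") returns a character string. A quick sketch (the variable names here are just illustrative):

nDec = SET( "DECIMALS" )    && numeric - 2 by default
cFix = SET( "FIXED" )       && character - "ON" or "OFF", normally "OFF"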
X1 = 10/3               && with default DECIMALS setting of 2
? X1                    && 3.33, as expected
SET DECIMALS TO 5
X2 = 10/3
? X2                    && 3.33333 - so far, so good
DISPLAY MEMORY LIKE X*

Interesting: X1 shows up as 3.33, while X2 is 3.33333. But it gets stranger.

? X1*2                  && 6.67
? X2*2                  && 6.66667
SET DECIMALS TO 18
? X1*2                  && 6.67
? X2*2                  && 6.66667
? X1*3                  && 10.00
? X2*3                  && 10.00000 - so no precision was lost in either case

The variables remember how many decimal places they were created with, even though you can see in the memory listing that the internal representations are the same.

What does all this mean for you? That you should choose a decimals setting for your application and use it throughout.
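If you want to convince yourself that only the display differs, a simple comparison does the trick (a sketch, assuming X1 and X2 were created as above):

? X1 = X2               && .T. - the stored values are identical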
SET DECIMALS TO without a number resets you to the default of 2, even if you've set a different default through the Tools | Options dialog.
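Here's a minimal sketch of that reset behavior:

SET DECIMALS TO 5
? SET( "DECIMALS" )     && 5
SET DECIMALS TO         && no number supplied
? SET( "DECIMALS" )     && 2 - back to the default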
Example
* Using X1 and X2 created above
SET DECIMALS TO 3
SET FIXED ON
? X1                    && 3.333
? X2                    && 3.333
? X1*2                  && 6.667
yMoney = $37.5837
? yMoney                && 37.584
SET DECIMALS TO 7
? yMoney                && 37.5837
See Also