What is the difference between #.## and ##.## pattern in Decimal Format?
Solution 1
IMHO, DecimalFormat is used to format the entire representation of a number and to control the count of digits after the decimal point. So # and ## before the decimal separator (.) won't behave differently, but they certainly would if placed after it.
Assuming 12.234 as the input number: turning 12.234 into 12.23 (trimming fraction digits) would make sense, but turning 12.234 into 2.234 (dropping an integer digit) would not.
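The point above can be checked directly. A minimal sketch (the class and helper name are just for illustration; US symbols are pinned so the decimal separator is always '.'):

```java
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class FractionDemo {
    // Hypothetical helper: formats with the "#.##" pattern.
    static String fmt(double value) {
        DecimalFormat df = new DecimalFormat("#.##",
                DecimalFormatSymbols.getInstance(Locale.US));
        return df.format(value);
    }

    public static void main(String[] args) {
        // Fraction digits are limited to two; integer digits are never dropped.
        System.out.println(fmt(12.234)); // 12.23
    }
}
```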
So the emphasis is put on the format of the values after the decimal point, via methods such as:
setMinimumFractionDigits(0);
setMaximumFractionDigits(2);
setRoundingMode(...);
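Those setters can be combined with a pattern like this (a sketch; the class name and the choice of HALF_UP are my own, not from the answer):

```java
import java.math.RoundingMode;
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class FractionSettings {
    static String format(double value) {
        DecimalFormat df = new DecimalFormat("#.##",
                DecimalFormatSymbols.getInstance(Locale.US));
        df.setMinimumFractionDigits(0); // trailing fraction digits are optional
        df.setMaximumFractionDigits(2); // at most two fraction digits
        df.setRoundingMode(RoundingMode.HALF_UP);
        return df.format(value);
    }

    public static void main(String[] args) {
        System.out.println(format(12.234)); // 12.23
        System.out.println(format(12.0));   // 12 (no forced fraction digits)
    }
}
```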
Solution 2
According to the DecimalFormat JavaDoc, # means "Digit, zero shows as absent". So the extra # would show another digit unless that digit is 0, and thus the two patterns give the same results.
Edit: In fact, the extra # does nothing here, because it is equivalent to the default behaviour: a # that matches no digit is simply absent.
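This is easiest to see by contrasting # with the 0 placeholder, which does force a digit. A minimal sketch (class and helper names are illustrative):

```java
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class PlaceholderDemo {
    // Hypothetical helper: formats value with the given pattern, US symbols.
    static String with(String pattern, double value) {
        return new DecimalFormat(pattern,
                DecimalFormatSymbols.getInstance(Locale.US)).format(value);
    }

    public static void main(String[] args) {
        // '#' is optional -- a leading zero "shows as absent":
        System.out.println(with("#.##", 7.5));  // 7.5
        System.out.println(with("##.##", 7.5)); // 7.5 (identical)
        // '0' forces the digit, so this pattern WOULD change the output:
        System.out.println(with("00.##", 7.5)); // 07.5
    }
}
```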
Gokul Nath KP
Updated on June 08, 2022

Comments
- Gokul Nath KP, almost 2 years ago:
In my program I'm using the pattern #.## in DecimalFormat, as shown below:

DecimalFormat df = new DecimalFormat("#.##");

By mistake I added an extra #, as shown below:

DecimalFormat df = new DecimalFormat("##.##");
But this does not affect my output. I've tried different combinations of inputs, and there is no difference in the output. I tried Googling, but found no proper explanation.
So what is the exact difference between "#.##" and "##.##"?
If both are the same, why is it allowed in Java?
If both are different, why is the output the same in both cases?
EDIT:
Sample program:
import java.text.DecimalFormat;

public class Decimals {
    public static void main(String[] args) {
        double[] d1 = new double[] {100d, -1d, -0.111111d, 2.555666d, 55555555555d, 0d};
        DecimalFormat df1 = new DecimalFormat("#.##");
        DecimalFormat df2 = new DecimalFormat("##.##");
        for (double d : d1) {
            System.out.println(df1.format(d));
            System.out.println(df2.format(d));
        }
    }
}
Output:
100
100
-1
-1
-0.11
-0.11
2.56
2.56
55555555555
55555555555
0
0