This might seem like a stupid question, but I have
a variable declared as follows:
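A minimal sketch of what I'm doing (I'm assuming a double here, read with scanf and printed with printf's default %f, since that matches the output I see):

    #include <stdio.h>

    int main(void)
    {
        double a;               /* assuming a double; could just as well be a float */

        scanf("%lf", &a);       /* read e.g. 1.1234567 from standard input */
        printf("a = %f\n", a);  /* %f defaults to 6 digits after the decimal point */

        return 0;
    }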
If I enter “a” as 1.1234567, i.e. 7 digits after the decimal point, it loses
precision and prints
a = 1.123457, i.e. 6 digits after the decimal point.
How do I get it to print all 7 digits after the decimal point?
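Is specifying the precision explicitly the right approach, i.e. something like this (just a guess on my part):

    printf("a = %.7f\n", a);  /* %.7f requests 7 digits after the decimal point */

or is the precision already lost by the time the value is stored?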