I just discovered that:
round(3.5) = 4
while:
round(2.5) = 2
That is, when a number ends in .5 it sometimes rounds up (as it should) and sometimes rounds down.
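This is literally what I get in the interpreter (Python 3):

```python
# Both numbers end in .5, but they round in different directions
print(round(3.5))  # prints 4
print(round(2.5))  # prints 2, not 3
```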
I want to write a program that does some calculations and rounds the results properly; it's no good if 2.5 gets rounded down to 2.
How can I make it so that, for any number, values ending in .5 are always rounded up?
ceil doesn't work for me, because if the number is 2.4 it would round up to 3, which isn't correct either.
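For instance, this is what happens if I try math.ceil instead:

```python
import math

# ceil always rounds up, even when the decimal part is below .5
print(math.ceil(2.4))  # prints 3, but the correct rounding is 2
```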
I don't want to know why this happens; I want to know how to make code like this work as expected:
X = a * b, where a and b are numbers with two decimal places
print(round(X))
so that the user sees the rounded number. It seems simple, but if X is 14.5 it rounds to 14, and that is not how rounding is done in science.
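A minimal, concrete case of what I mean (the values of a and b here are just ones I made up for illustration; both are exactly representable in binary, so the product is exactly 14.5):

```python
a = 7.25
b = 2.00
X = a * b          # exactly 14.5
print(round(X))    # prints 14, but I expect 15
```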