In every programming language I have used, decimal numbers are written with a point, which contradicts several definitions, such as:
Decimal numbers are those that are represented by a comma and that have an integer part (to the left of the comma) and another decimal part (to the right of the comma).
Concept taken from https://www3.gobiernodecanarias.org/medusa/ecoblog/crodalf/numeros-decimales-concepto/
or
We can say that a decimal number is one that is made up of an integer part and a decimal part, separated by a comma, and represents quantities that are not integers.
Definition taken from https://www.mundoprimaria.com/recursos-matematicas/numeros-decimales
For example, if we perform a simple division in the JavaScript console:
179/4
--> 44.75
the result uses a point as the decimal separator, not a comma. Why does this happen?
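(Not part of the original question, but a short illustration of the distinction involved: the point in `44.75` is part of JavaScript's *syntax* for number literals and default string conversion, while comma vs. point is a *presentation* concern that locale-aware formatting such as `toLocaleString` can handle. A minimal sketch, assuming a runtime with Spanish locale data available:)

```javascript
// The numeric value itself stores no separator at all; the point is
// just how JavaScript source code and default conversion write it.
const result = 179 / 4;       // 44.75

console.log(String(result));  // default conversion: "44.75" (point)

// Locale-aware formatting can render the same value with a comma,
// e.g. using a Spanish locale (assumes the runtime ships locale data):
console.log(result.toLocaleString('es-ES'));
```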
Note: If this question does not belong on this site, please let me know.