The question is how to calculate the arithmetic mean of an array (containing a set of numeric values) whose length is FIXED while new values keep arriving; I explain everything in the answer.
This is not simply taking the arithmetic mean of a group of values in an array that keeps growing. What I mean is taking the arithmetic mean of an array that receives new values without changing its length (so the oldest values have to be removed). To explain it better:
We start with an array of length 10 filled with zeros, into which we will insert new values while discarding the oldest ones:
var array = [0,0,0,0,0,0,0,0,0,0];
If we insert a value (1, for example), it will now look like this:
var array = [1,0,0,0,0,0,0,0,0,0];
If we add a 2 now:
var array = [1,2,0,0,0,0,0,0,0,0];
If, for example, the array is full:
var array = [3,5,6,7,8,9,1,2,3,4];
and we now insert the value 45, it becomes:
var array = [45,5,6,7,8,9,1,2,3,4];
The arithmetic mean of this array is 9 (the 45 overwrote the oldest value, the 3). If we add 56 to this last array, it becomes:
var array = [45,56,6,7,8,9,1,2,3,4];
and its arithmetic mean is 14.1. I know that some of you will already know how to do this and it won't seem like anything special, but for people who are starting out or need to do this kind of calculation, I find it quite useful.
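The steps above can be sketched as a small ring buffer in plain JavaScript. This is a minimal sketch, assuming the behavior shown in the examples (each new value overwrites the oldest slot, wrapping around when the buffer is full); the `RollingMean` name and its methods are mine, not part of any library.

```javascript
// Fixed-length ring buffer that keeps the most recent `size` values
// and computes their arithmetic mean.
function RollingMean(size) {
  this.values = new Array(size).fill(0); // fixed-length buffer, starts at zeros
  this.index = 0;                        // position of the oldest slot (next write)
}

// Insert a new value, overwriting the oldest one.
RollingMean.prototype.add = function (value) {
  this.values[this.index] = value;
  this.index = (this.index + 1) % this.values.length; // wrap around
};

// Arithmetic mean over the whole fixed-length buffer.
RollingMean.prototype.mean = function () {
  var sum = this.values.reduce(function (a, b) { return a + b; }, 0);
  return sum / this.values.length;
};
```

Reproducing the example from above:

```javascript
var rm = new RollingMean(10);
[3, 5, 6, 7, 8, 9, 1, 2, 3, 4].forEach(function (v) { rm.add(v); });
rm.add(45);          // overwrites the oldest value (the 3 at index 0)
console.log(rm.mean()); // 9
rm.add(56);          // overwrites the next-oldest value (the 5)
console.log(rm.mean()); // 14.1
```

If performance matters for large buffers, the same structure can keep a running sum (subtract the overwritten value, add the new one) so the mean is O(1) per insertion instead of summing the whole array each time.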
Hope that helps!