# B4321. Compute the Variance of an Integer Sequence
Little Keke has recently learned the definition of variance. Given an integer sequence of length \(n\), compute its variance \(\sigma\), defined as follows:
\[ \sigma = \frac{(a_1 - \overline{a})^2 + (a_2 - \overline{a})^2 + \cdots + (a_n - \overline{a})^2}{n} \]
where \(\overline{a}\) is the average of the sequence, i.e., \[ \overline{a} = \frac{a_1 + a_2 + \cdots + a_n}{n}. \]
It is guaranteed that all intermediate and final results are integers.
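Because the mean and the final result are guaranteed to be integers, the whole computation can stay in integer arithmetic. Below is a minimal sketch of that computation; the function name `variance` and the choice of 64-bit integers are assumptions for illustration, not part of the problem statement.

```cpp
#include <cstdint>
#include <vector>

// Variance as defined above, using integer arithmetic only.
// Relies on the problem guarantee that the mean and the result are integers,
// so both divisions by n are exact.
int64_t variance(const std::vector<int64_t>& a) {
    int64_t n = static_cast<int64_t>(a.size());

    int64_t sum = 0;
    for (int64_t x : a) sum += x;
    int64_t mean = sum / n;  // exact by the problem guarantee

    int64_t sq = 0;
    for (int64_t x : a) sq += (x - mean) * (x - mean);
    return sq / n;           // exact by the problem guarantee
}
```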
## Input Format
The first line contains a single integer \(n\) representing the number of elements in the sequence.
The second line contains \(n\) space-separated integers \(a_1, a_2, \ldots, a_n\), representing the elements of the sequence.
## Output Format
Output one integer representing the variance \(\sigma\) of the sequence.
## Sample

Input:
5
1 2 3 4 5

Output:
2
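As a check against the definition above, the sample's mean is \(\overline{a} = (1+2+3+4+5)/5 = 3\), the squared deviations sum to \(4+1+0+1+4 = 10\), and \(10/5 = 2\), which matches the expected output.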