#K13471. Maximizing Variance in Item Distribution
You are given two integers n and k. Your task is to generate a sequence of n integers, each between 1 and k, where each integer represents the number of items in a chest. The objective is to maximize the variance of the sequence.
The variance of a sequence \(a_1, a_2, \dots, a_n\) is defined as
\[ \sigma^2 = \frac{1}{n} \sum_{i=1}^{n} (a_i - \mu)^2, \quad \text{where} \quad \mu = \frac{1}{n}\sum_{i=1}^{n} a_i. \]
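For instance, the sequence \([1, 1, 5, 5]\) has mean \(\mu = 3\) and variance
\[ \sigma^2 = \frac{1}{4}\left((1-3)^2 + (1-3)^2 + (5-3)^2 + (5-3)^2\right) = 4. \]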
To maximize the variance, you need to assign the minimum value of 1 to some chests and the maximum value of k to the remaining chests. In particular:
- If n is 1, the only chest gets k.
- If n is 2, the optimal distribution is [1, k].
- If n is greater than 2, assign \(\left\lfloor \frac{n}{2} \right\rfloor\) chests the value 1 and the remaining \(n - \left\lfloor \frac{n}{2} \right\rfloor\) chests the value k.
Note that in some cases there may be multiple sequences achieving the maximum variance, but your program must output the specific sequence defined above.
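The following is a minimal sketch in Python of the construction described above. It assumes the 1-valued chests are listed before the k-valued ones, which is consistent with the n = 2 case in the statement; the function name is chosen for illustration only.

```python
def max_variance_sequence(n: int, k: int) -> list[int]:
    # n = 1: the only chest gets k.
    if n == 1:
        return [k]
    # Otherwise, floor(n/2) chests get the value 1 and the
    # remaining n - floor(n/2) chests get the value k.
    # (For n = 2 this yields [1, k], matching the statement.)
    ones = n // 2
    return [1] * ones + [k] * (n - ones)


if __name__ == "__main__":
    n, k = map(int, input().split())
    print(" ".join(map(str, max_variance_sequence(n, k))))
```

For the sample input `1 5` this prints `5`.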
## inputFormat
The input consists of a single line containing two space-separated integers: n and k.
\(1 \leq n \leq 10^5\) and \(1 \leq k \leq 10^9\). You may assume that the inputs are valid.
## outputFormat
Output a single line containing n space-separated integers representing the number of items in each chest according to the optimal distribution.
## sample

Input: `1 5`

Output: `5`