# C5986. Load Balancer Simulation
This problem simulates a load-balancing algorithm: you are given \(n\) servers and \(m\) incoming requests, each with a positive integer value. Requests are processed in order, and each one is assigned to the server that currently has the smallest total load; in case of a tie, the server with the smallest index is chosen. After processing all requests, output the final load (counter value) of each server.
The load of a server is the sum of the request values assigned to it. Formally, let \(c_i\) be the counter for server \(i\), initially \(0\). For each request \(r_j\) in order, find the server index \(k = \min\{ i : c_i = \min_t c_t \}\) and update \(c_k \leftarrow c_k + r_j\). Output the final array of counters \([c_0, c_1, \dots, c_{n-1}]\).
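As a minimal sketch of this rule (the statement fixes no language, so Python is assumed here, and the function name `assign_requests` is illustrative, not part of the problem):

```python
def assign_requests(n: int, requests: list[int]) -> list[int]:
    """Simulate the assignment rule; illustrative helper, not the required I/O."""
    counters = [0] * n  # every server starts with load 0
    for r in requests:
        # min() scans left to right, so a tie between equal loads
        # resolves to the smallest server index automatically.
        k = counters.index(min(counters))
        counters[k] += r
    return counters
```

For instance, `assign_requests(3, [2, 3, 1, 5, 2])` returns `[4, 3, 6]`, matching the sample below.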
## Input Format
The first line contains two integers \(n\) and \(m\) separated by a space, where \(n\) is the number of servers and \(m\) is the number of requests.
The second line contains \(m\) integers separated by spaces, representing the request values.
Input is read from standard input (stdin).
## Output Format
Output a single line containing \(n\) integers separated by spaces, which represent the final load for each server after processing all requests.
Output is printed to standard output (stdout).
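One possible complete program, again a sketch in Python under the assumption that the whole input fits in memory: it keeps a heap of \((c_i, i)\) pairs, whose tuple ordering pops the smallest load first and breaks ties by the smallest index, matching the rule above.

```python
import heapq
import sys

def main() -> None:
    data = sys.stdin.read().split()
    n, m = int(data[0]), int(data[1])
    requests = list(map(int, data[2:2 + m]))

    # (load, index) pairs: tuple comparison yields the smallest load,
    # ties broken by the smallest server index.
    heap = [(0, i) for i in range(n)]
    heapq.heapify(heap)

    counters = [0] * n
    for r in requests:
        load, i = heapq.heappop(heap)
        counters[i] = load + r
        heapq.heappush(heap, (load + r, i))

    print(" ".join(map(str, counters)))

if __name__ == "__main__":
    main()
```

The heap makes each assignment \(O(\log n)\) instead of the \(O(n)\) linear scan; both approaches are equally correct for this problem.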
## Sample

Input:
3 5
2 3 1 5 2

Output:
4 3 6
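In this sample the counters start at \([0,0,0]\) and evolve as \([2,0,0] \to [2,3,0] \to [2,3,1] \to [2,3,6] \to [4,3,6]\); the first two requests exercise the tie-breaking rule (all three servers tie at load \(0\), then servers 1 and 2 tie), while the remaining requests each go to a unique minimum.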