#K58212. Server Allocation Optimization

    ID: 30593   Type: Default   Time limit: 1000 ms   Memory limit: 256 MiB

You are given a number of test cases. In each test case, a list of customer arrival times (in seconds) is provided. A customer arriving at time \( t \) is counted towards the hour \( \lfloor t/3600 \rfloor \). Each server can serve at most \( K \) customers per hour.

Your task is to determine the minimum number of servers required for each test case so that all customers are served. This is computed by finding the maximum number of customers arriving within any single hour of that test case and then computing \( \left\lceil \frac{\text{max\_customers}}{K} \right\rceil \). If there are no customers in a test case, the required number of servers is 0.
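For concreteness, a minimal Python sketch of this per-test-case computation could look as follows; the helper name `servers_needed` is purely illustrative and not part of the problem statement:

```python
from collections import Counter

def servers_needed(arrival_times, k):
    """Bucket arrivals by hour, take the busiest hour,
    and divide by the per-server capacity k, rounding up."""
    if not arrival_times:
        return 0  # no customers -> no servers needed
    per_hour = Counter(t // 3600 for t in arrival_times)
    busiest = max(per_hour.values())
    return (busiest + k - 1) // k  # integer ceiling of busiest / k
```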

Input Format: The first line contains two integers \( T \) and \( K \), where \( T \) is the number of test cases and \( K \) is the maximum number of customers that a server can handle in one hour. For each test case, the first line contains an integer \( N \) denoting the number of customers, followed by a line with \( N \) space-separated integers representing the arrival times (in seconds).

Output Format: For each test case, output a single integer: the minimum number of servers required. Each result should be printed on its own line.

## inputFormat

The input is read from stdin and has the following format:

  • The first line contains two space-separated integers: \( T \) (the number of test cases) and \( K \) (maximum customers per server per hour).
  • For each test case:
    • The first line contains an integer \( N \) (the number of customers).
    • The second line contains \( N \) space-separated integers representing the arrival times in seconds.

## outputFormat

For each test case, output one integer on a new line indicating the minimum number of servers required. The output should be written to stdout.
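The statement does not fix an implementation language; one possible full solution sketch in Python, reading all whitespace-separated tokens from stdin (which also handles a test case with \( N = 0 \) and no arrival line), could be:

```python
import sys
from collections import Counter

def main():
    data = sys.stdin.read().split()
    pos = 0
    t, k = int(data[pos]), int(data[pos + 1])
    pos += 2
    answers = []
    for _ in range(t):
        n = int(data[pos]); pos += 1
        times = [int(x) for x in data[pos:pos + n]]
        pos += n
        if n == 0:
            answers.append(0)                           # no customers in this test case
            continue
        per_hour = Counter(x // 3600 for x in times)    # customers per hour bucket
        busiest = max(per_hour.values())
        answers.append((busiest + k - 1) // k)          # ceil(busiest / k)
    sys.stdout.write("\n".join(map(str, answers)) + "\n")

if __name__ == "__main__":
    main()
```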

## sample
Input:
2 3
5
0 1 2 3 4
3
0 1800 3599

Output:
2
1
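In the sample, \( K = 3 \). In the first test case all five arrivals (0 through 4 seconds) fall in hour 0, so the busiest hour has 5 customers and \( \lceil 5/3 \rceil = 2 \) servers are required. In the second test case the arrivals 0, 1800, and 3599 also all fall in hour 0 (since \( \lfloor 3599/3600 \rfloor = 0 \)), giving \( \lceil 3/3 \rceil = 1 \).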