# C3381. Sum of Row Minimums
You are given a grid of integers with m rows and n columns. For each row, first sort the row in non-decreasing order and then take the minimum element (which becomes the first element after sorting). Your task is to compute the sum of these minimum elements across all rows.
In mathematical terms, if the grid is represented by \(a_{ij}\) for \(1 \leq i \leq m\) and \(1 \leq j \leq n\), then after sorting each row \(i\), let \(min_i\) be the minimum element. You need to calculate:
\[ S = \sum_{i=1}^{m} min_i \]
Note that although sorting the entire row may seem unnecessary if you just need the minimum, the problem explicitly requires sorting each row before selecting the minimum element.
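The row-by-row procedure can be sketched as follows (a minimal Python sketch; the function name `sum_of_row_minimums` is illustrative, not part of the problem):

```python
def sum_of_row_minimums(grid):
    """Sort each row in non-decreasing order, then add up the
    first (i.e. smallest) element of every sorted row."""
    total = 0
    for row in grid:
        sorted_row = sorted(row)   # non-decreasing order, as the problem requires
        total += sorted_row[0]     # the minimum is now the first element
    return total

# The 3x3 grid from the sample:
grid = [[3, 1, 2], [4, 5, 6], [7, 8, 9]]
print(sum_of_row_minimums(grid))   # row minimums are 1, 4, 7, so this prints 12
```

Sorting a full row to obtain its minimum costs O(n log n) per row instead of the O(n) a plain scan would take, but the problem statement mandates the sort.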
## Input Format
The first line of the input contains two space-separated integers m and n representing the number of rows and columns respectively.
This is followed by m lines, each containing n space-separated integers representing the elements of the grid.
## Output Format
Output a single integer, which is the sum of the minimum elements from each row after sorting each row in non-decreasing order.
## Sample

Input:
3 3
3 1 2
4 5 6
7 8 9

Output:
12
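A complete program following the input and output format above might look like this (a hedged sketch; the helper name `solve` is an assumption, not prescribed by the problem):

```python
import sys

def solve(data: str) -> str:
    # Read all whitespace-separated tokens: first m and n, then m*n grid values.
    tokens = data.split()
    m, n = int(tokens[0]), int(tokens[1])
    values = list(map(int, tokens[2:2 + m * n]))

    total = 0
    for i in range(m):
        row = values[i * n:(i + 1) * n]
        row.sort()          # sort the row in non-decreasing order
        total += row[0]     # the minimum is the first element after sorting
    return str(total)

if __name__ == "__main__":
    print(solve(sys.stdin.read()))
```

On the sample input the row minimums are 1, 4, and 7, so the program prints 12.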