#P5665. Minimizing Program Running Time in Data Segmentation
In the year 2048, during the 30th CSP Certification Exam, the contestant Xiaoming faces the first problem. There are \( n \) datasets numbered from 1 to \( n \), with sizes \( a_1, a_2, \dots, a_n \). Xiaoming's brute-force program runs on a dataset of size \( u \) in time \( u^2 \). However, once the program has run on a dataset of size \( u \), it produces an error when run on any dataset of size smaller than \( u \).
To avoid errors, Xiaoming decides to partition the datasets into several consecutive segments and merge each segment into one new dataset whose size is the sum of the sizes in that segment. He requires that the new dataset sizes be non-decreasing. More formally, if the partition indices \( 1 \le k_1 < k_2 < \cdots < k_p < n \) are chosen, then the resulting segment sums
\[ S_1 = \sum_{i=1}^{k_1} a_i, \quad S_2 = \sum_{i=k_1+1}^{k_2} a_i, \quad \dots, \quad S_{p+1} = \sum_{i=k_p+1}^{n} a_i \]
must satisfy
\[ S_1 \le S_2 \le \cdots \le S_{p+1}. \]
Note that it is allowed to have no partition point (i.e. \( p=0 \)), meaning all the data are merged into one segment.
The total running time of the program under a partitioning is the sum of squares of the segment sums, i.e.,
\[ T = S_1^2 + S_2^2 + \cdots + S_{p+1}^2. \]
Your task is to choose a valid segmentation such that the running time \( T \) is minimized, and output the minimum possible running time.
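For example, with sizes \( 1, 2, 3 \) and a single cut at \( k_1 = 2 \), the segment sums are \( S_1 = 1 + 2 = 3 \) and \( S_2 = 3 \), which are non-decreasing, and the running time is \( T = 3^2 + 3^2 = 18 \).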
Input Format
The first line contains a single integer \( n \) (the number of datasets). The second line contains \( n \) integers \( a_1, a_2, \dots, a_n \), representing the sizes of the datasets.
Output Format
Output a single integer, which is the minimum possible total running time.
Sample
Input:
3
1 2 3
Output:
14
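In the sample, cutting after every element gives segment sums \( 1, 2, 3 \) (already non-decreasing), so \( T = 1 + 4 + 9 = 14 \); no other valid segmentation does better. The statement prescribes neither constraints nor a solution method, but one natural approach, sketched below, is a greedy based on an exchange argument: among all valid segmentations, making the final segment sum as small as possible (and applying the same rule recursively to the prefix) also minimizes the sum of squares. The \( O(n^2) \) scan and the 64-bit arithmetic are assumptions for illustration; large inputs would need a monotone-queue speedup and wider integers.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    if (!(cin >> n)) return 0;
    vector<long long> prefix(n + 1, 0);
    for (int i = 1; i <= n; ++i) {
        long long a;
        cin >> a;
        prefix[i] = prefix[i - 1] + a;   // prefix sums of the sizes
    }

    // from[i]: index j such that the last segment of the chosen partition of
    // a[1..i] is (j, i]; last[i]: the sum of that segment.
    vector<int> from(n + 1, 0);
    vector<long long> last(n + 1, 0);
    for (int i = 1; i <= n; ++i) {
        // The largest valid j yields the smallest possible last segment.
        for (int j = i - 1; j >= 0; --j) {
            if (prefix[i] - prefix[j] >= last[j]) {   // segment (j, i] keeps sums non-decreasing
                from[i] = j;
                last[i] = prefix[i] - prefix[j];
                break;
            }
        }
    }

    // Walk the chosen cuts from the back and accumulate the sum of squares.
    long long T = 0;
    for (int i = n; i > 0; i = from[i]) {
        long long s = prefix[i] - prefix[from[i]];
        T += s * s;   // assumes the answer fits in 64 bits (no limits are given)
    }
    cout << T << "\n";
    return 0;
}
```

On the sample above this sketch picks the cuts after positions 1 and 2, recovering \( 1^2 + 2^2 + 3^2 = 14 \).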