#P8155. GCD Transformation Operations

    ID: 21337 · Type: Default · Time limit: 1000 ms · Memory limit: 256 MiB

You are given an integer \(m\) and a sequence of \(n\) integers \(a = [a_1, a_2, \ldots, a_n]\).

Define one GCD Transform operation as follows:

For every index \(i\) from \(1\) to \(n\), compute \[ b_i = a_i + \gcd(m, a_1, a_2, \ldots, a_i). \] After all \(b_i\) (for \(i = 1, 2, \ldots, n\)) have been computed, update the sequence by setting \(a_i \leftarrow b_i\) for every \(i\); note that every \(b_i\) is computed from the original values before any updates are applied.
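For concreteness, here is a minimal Python sketch of a single operation; the helper name `gcd_transform` is ours, not part of the statement. Since each \(b_i\) depends on the gcd of \(m\) and a prefix of the old sequence, one pass with a running prefix gcd suffices.

```python
from math import gcd

def gcd_transform(a, m):
    # One GCD Transform: every b_i is computed from the
    # pre-update sequence via a running prefix gcd.
    b = []
    g = m
    for x in a:
        g = gcd(g, x)    # g = gcd(m, a_1, ..., a_i)
        b.append(x + g)  # b_i = a_i + gcd(m, a_1, ..., a_i)
    return b             # the new sequence after one operation
```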

Your task is to determine the minimum number of GCD Transform operations needed so that \(m\) divides every element of the sequence, i.e., \(m \mid a_i\) for all \(i \in [1, n]\). It is guaranteed that a solution exists.
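The statement gives no bounds on \(n\), \(m\), or the \(a_i\), so a direct simulation is the natural baseline; whether it is fast enough depends on the hidden constraints. A sketch, reusing the `gcd_transform` helper above:

```python
def min_operations(a, m):
    # Apply GCD Transforms (gcd_transform from the sketch above)
    # until m divides every element; the statement guarantees
    # this loop terminates.
    ops = 0
    while any(x % m for x in a):
        a = gcd_transform(a, m)
        ops += 1
    return ops

if __name__ == "__main__":
    n, m = map(int, input().split())
    a = list(map(int, input().split()))
    print(min_operations(a, m))
```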

Input Format

The input consists of two lines:

  • The first line contains two integers \(n\) and \(m\).
  • The second line contains \(n\) integers \(a_1, a_2, \ldots, a_n\) separated by spaces.

Output Format

Output a single integer denoting the minimum number of operations required so that \(m\) divides every element \(a_i\).

Sample

Input:
1 10
3

Output:
4
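
To verify the sample: with \(m = 10\) and \(a = [3]\), the single element evolves as \(3 \to 4\) (gcd \(1\)), \(4 \to 6\) (gcd \(2\)), \(6 \to 8\) (gcd \(2\)), \(8 \to 10\) (gcd \(2\)); after the fourth operation \(10 \mid a_1\), so the answer is \(4\).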