You are given two positive integer arrays nums and target, of the same length.
In a single operation, you can select any subarray of nums and increment or decrement each element within that subarray by 1.
Return the minimum number of operations required to make nums equal to the array target.
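For concreteness, here is a small runnable illustration of the operation (the array values are my own, hypothetical choice, not taken from the problem's official examples):

```python
def apply_op(nums, lo, hi, delta):
    """Increment or decrement every element of nums[lo..hi] by 1 (delta is +1 or -1)."""
    for i in range(lo, hi + 1):
        nums[i] += delta

nums = [3, 5, 1, 2]       # hypothetical starting array; target = [4, 6, 2, 4]
apply_op(nums, 0, 3, +1)  # -> [4, 6, 2, 3]
apply_op(nums, 3, 3, +1)  # -> [4, 6, 2, 4]
print(nums)               # nums equals target after 2 operations
```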
Why does the greedy algorithm work for this problem? Can anyone prove it? I can't figure out how to prove that it is optimal.
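For reference, the greedy I am asking about is, as far as I understand it, the standard difference-array approach. A sketch of my understanding (assumed, not necessarily identical to any particular editorial's code):

```python
def min_operations(nums, target):
    # Work on the difference array d[i] = target[i] - nums[i].
    # Each subarray operation on nums shifts a contiguous block of d toward zero
    # (or away from it), so the task becomes: find the minimum number of
    # +/-1 subarray operations that turn d into the all-zero array.
    n = len(nums)
    d = [target[i] - nums[i] for i in range(n)]

    ops = abs(d[0])  # index 0 needs |d[0]| operations that start at (or before) it
    for i in range(1, n):
        if d[i] * d[i - 1] > 0:
            # Same sign: operations already covering index i-1 can be extended
            # to also cover index i for free, so only the excess magnitude
            # beyond |d[i-1]| costs additional operations.
            ops += max(0, abs(d[i]) - abs(d[i - 1]))
        else:
            # Sign change (or previous difference is zero): none of the open
            # operations can be reused, so index i pays its full magnitude.
            ops += abs(d[i])
    return ops

print(min_operations([3, 5, 1, 2], [4, 6, 2, 4]))  # prints 2 for my example above
```

The greedy step is the `max(0, abs(d[i]) - abs(d[i - 1]))` term: it assumes that extending an already-open operation one index to the right is never worse than starting a new one. It is exactly this "extend rather than restart" choice that I would like to see proved optimal.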