I’m trying to solve a problem on Geeksforgeeks that requires writing a C++ program that returns the sum of the first ‘n’ integers. My initial approach was to use the identity sum of the first n integers = n*(n+1)/2, so I wrote the following code:
class Solution {
public:
    long long seriesSum(int n) {
        return ((n) * (n + 1) / 2);
    }
};
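For what it’s worth, I sanity-checked the formula against a plain running sum for small values of n and they agree, so I don’t think the identity itself is the problem. (The main function below is just my local test harness, not part of the submission.)

#include <iostream>

// Same Solution class as above, repeated so this check compiles on its own.
class Solution {
public:
    long long seriesSum(int n) {
        return ((n) * (n + 1) / 2);
    }
};

int main() {
    Solution s;
    long long runningSum = 0;
    // Compare the formula against a plain running sum for small n.
    for (int n = 1; n <= 20; ++n) {
        runningSum += n;
        std::cout << "n=" << n
                  << " formula=" << s.seriesSum(n)
                  << " loop=" << runningSum << '\n';
    }
    return 0;
}

Both columns match for every n I tried in this range.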
It successfully passes about 15 test cases, but then it produces an unexpected output when 4407895 is passed as input:
For Input: 4407895
Your Output: -444654092
Expected Output: 9714771369460
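Checking the identity by hand, 4407895 * 4407896 / 2 = 9714771369460, which is exactly the expected output, so the math seems right. To rule out anything specific to the judge, I also ran the same input through a minimal standalone version locally (again, main and the printing are just my scaffolding, not part of the submission):

#include <iostream>

// Same Solution class as in the submission.
class Solution {
public:
    long long seriesSum(int n) {
        return ((n) * (n + 1) / 2);
    }
};

int main() {
    Solution s;
    // The input the judge reported as failing.
    std::cout << s.seriesSum(4407895) << '\n';  // prints the same -444654092 on my machine
    return 0;
}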
Why does it return a negative value when two positive values are multiplied?