What is the justification for the running time of this loop?
I don’t understand why the running time of the following loop is n+2, or why the running time of the statements inside the loop is n+1.
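The loop itself isn’t reproduced in the question, so purely as an assumption, here is a minimal C++ sketch of the textbook-style loop such counts usually refer to, instrumented so the n+2 and n+1 figures become visible. The extra unit in each count comes from the final, failing test of the loop condition.

```cpp
#include <iostream>

int main() {
    int n = 10;                 // example input size (assumed)
    long long cond_checks = 0;  // times the loop condition is evaluated
    long long body_runs = 0;    // times the statements inside the loop run

    // Assumed loop shape: the body executes for i = 0 .. n, i.e. n + 1 times,
    // and the condition is tested once more (the failing test), i.e. n + 2 times.
    for (int i = 0; ++cond_checks, i <= n; ++i) {
        ++body_runs;            // the "statements inside the loop"
    }

    std::cout << "condition evaluated " << cond_checks << " times\n";  // n + 2
    std::cout << "body executed " << body_runs << " times\n";          // n + 1
    return 0;
}
```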
If I replace N objects with N pointers, is my space complexity still O(N)?
Let’s say I get N objects as input, and I need to rearrange them into a different data structure. This means the space complexity of my algorithm will be O(N). But what if I replace the objects with pointers to the objects? I’ll still have N pointers, but is it still O(N) space complexity, despite the fact that clearly less memory will be used?
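As a rough illustration (not taken from the question), here is a C++ sketch contrasting the two layouts. Both containers hold N elements, so both are O(N) in N; big-O only hides the constant factor per element (sizeof the object vs. sizeof a pointer).

```cpp
#include <cstddef>
#include <vector>

// A hypothetical "large" object, used only for illustration.
struct BigObject {
    char payload[1024];  // 1 KiB per object
};

int main() {
    const std::size_t N = 1000;
    std::vector<BigObject>  objects(N);   // N copies:    N * sizeof(BigObject) bytes
    std::vector<BigObject*> pointers(N);  // N pointers:  N * sizeof(BigObject*) bytes

    // Both containers grow linearly with N, so both are O(N) space;
    // only the constant factor per element differs.
    for (std::size_t i = 0; i < N; ++i) {
        pointers[i] = &objects[i];        // pointers refer to the existing objects
    }
    return 0;
}
```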
Why does this algorithm work in O(n m)?
This is from a blog post on Codeforces. I couldn’t really understand why the editorialist goes on to claim that this code works in O(n m) time.
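The code from the blog post isn’t reproduced here, so the following is only a generic illustration of where an O(n m) bound usually comes from: n outer iterations whose inner work is bounded by m, giving at most n·m basic operations in total. It is not the Codeforces code itself.

```cpp
#include <cstddef>
#include <cstdint>

// Generic illustration (not the code from the blog post): total work is at most
// n * m because the inner loop runs at most m times for each of n outer iterations.
std::uint64_t count_operations(std::size_t n, std::size_t m) {
    std::uint64_t ops = 0;
    for (std::size_t i = 0; i < n; ++i) {
        for (std::size_t j = 0; j < m; ++j) {
            ++ops;  // one unit of "basic work"
        }
    }
    return ops;  // exactly n * m here; O(n * m) in general
}
```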
A fast algorithm for a simple multi-objective minimization?
I have a set S of n (arbitrary) integers which I want to partition into k subsets S_i, each of size n/k (you can assume that k divides n). Let A be the arithmetic mean of the elements of S. I am looking for the fastest algorithm that fills each S_i with elements of S such that the sum of the elements of each S_i is as close as possible to A. Essentially, this is a multi-objective minimization problem, and I am looking for Pareto-minimal solutions. The complexity of the brute-force algorithm is O(n!). I am wondering if there exists a faster algorithm.
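No algorithm is given in the question itself, so the following is only a sketch of a common greedy heuristic (sort descending, always place the next element into the lightest subset that still has room). It is not guaranteed to be Pareto-minimal; it is a fast baseline running in roughly O(n log n + n log k) time.

```cpp
#include <algorithm>
#include <functional>
#include <queue>
#include <utility>
#include <vector>

// Greedy heuristic sketch (assumed, not from the question): sort the numbers in
// descending order and repeatedly put the next number into the subset with the
// smallest current sum among those that still have fewer than n/k elements.
// Assumes k divides n and k <= n.
std::vector<std::vector<long long>> greedy_partition(std::vector<long long> s, int k) {
    const std::size_t cap = s.size() / k;            // each subset gets n/k elements
    std::sort(s.begin(), s.end(), std::greater<>());

    std::vector<std::vector<long long>> subsets(k);
    std::vector<long long> sums(k, 0);

    // Min-heap of (current sum, subset index) over subsets that still have room.
    using Entry = std::pair<long long, int>;
    std::priority_queue<Entry, std::vector<Entry>, std::greater<Entry>> heap;
    for (int i = 0; i < k; ++i) heap.push({0, i});

    for (long long x : s) {
        auto [sum, idx] = heap.top();
        heap.pop();
        subsets[idx].push_back(x);
        sums[idx] = sum + x;
        if (subsets[idx].size() < cap) heap.push({sums[idx], idx});  // re-insert only if room left
    }
    return subsets;
}
```

For exact Pareto-minimal solutions, an exhaustive or branch-and-bound style search is generally still needed; the sketch above only aims to get each subset sum reasonably close to the target quickly.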
Complexity analysis: Finding common members of unsorted arrays
I’ve been going over previous tech interviews I’ve had (got another one coming up).
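The interview question is only named here, so as an assumption, this is the standard hash-set formulation it usually refers to: insert one array into an unordered set and probe with the other, giving O(n + m) expected time instead of the O(n·m) nested-loop comparison.

```cpp
#include <unordered_set>
#include <vector>

// Hash-set sketch (assumed formulation): return the values that appear in both
// unsorted arrays. Building the set is O(n) expected, probing is O(m) expected.
std::vector<int> common_members(const std::vector<int>& a, const std::vector<int>& b) {
    std::unordered_set<int> seen(a.begin(), a.end());
    std::vector<int> common;
    for (int x : b) {
        if (seen.erase(x)) {       // erase returns 1 the first time x is found,
            common.push_back(x);   // so duplicates in b are reported only once
        }
    }
    return common;
}
```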