I am confused about the time complexity of algorithms. The running time of an algorithm is represented by a function of n, f(n).
For example: f(n) = 3n^2 + 5n + 1 = Θ(n^2)
But what if an algorithm has multiple possible f(n)? For example:
if statement 1 then
    for i = 1 to n do
        statement 2
statement 3
...
If statement 1 is True, then f(n) = n * constant + constant = Θ(n)
If statement 1 is False, then f(n) = constant = Θ(1)
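To make the two cases concrete, here is one possible Python version of the pseudocode above (the function and parameter names are my own, and the "statements" are stood in for by an operation counter):

```python
def algorithm(condition, n):
    """Count basic operations so the two cases are visible."""
    ops = 0
    if condition:              # statement 1
        for i in range(n):     # loop body runs n times
            ops += 1           # statement 2: one unit of work per iteration
    ops += 1                   # statement 3: constant work, runs either way
    return ops

# condition True:  ops = n + 1, which is Theta(n)
# condition False: ops = 1, which is Theta(1)
```

Running it with n = 100 gives 101 operations when the condition holds and 1 operation when it does not, matching the two f(n) above.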
How do I determine the running time of an algorithm with multiple possible f(n)? I don't know how to analyze it, since big-oh, big-omega, and big-theta are each defined in terms of a single function f(n).