4

What is the complexity of a loop like the following?

for (int i = 0; i < n; i++)
{
    for (int j = 0; j < log(i); j++)
    {
        // Do something
    }
}

According to me, the inner loop will be running log(1)+log(2)+log(3)+...+log(n) times, so how do I calculate its complexity?

2
  • 3
    "According to me" Always found this phrase amusing. We typically use "according to" to announce a third-party authority we used to arrive at a conclusion. You can't use yourself as a third-party authority. Commented Aug 9, 2016 at 11:46
  • @LightnessRacesinOrbit According to me you can. Commented Aug 9, 2016 at 13:00

3 Answers

8

So, you have a sum log(1) + log(2) + log(3) + ... + log(n) = log(n!). By using Stirling's approximation and the fact that ln(x) = log(x) / log(e) one can get

log(n!) = log(e) * ln(n!) = log(e) (n ln(n) - n + O(ln(n)))

which gives the same complexity O(n ln(n)) as in the other answer (with slightly better understanding of the constants involved).


1 Comment

Nice qualitative analysis -- \sum_{i=1}^{n} log(i) = log(n!), which allows us to use Stirling's approximation. Good catch!
8

Without doing this in a formal manner, such complexity calculations can be "guessed" using integrals. Each outer iteration i runs the loop body, assumed to be O(1), about log(i) times, and the sum over i can be approximated by the integral of log(x) dx from 1 to N, which is N log(N) - N + 1. So the inner work totals O(N log N) across the whole outer loop. That puts it between linear and quadratic.

Note: this assumes that "do something" is O(1) with respect to N (its constant cost could of course be very large).

3 Comments

Multiply that by the complexity of "do something".
How can the complexity be worse than that of a basic two-dimensional loop over n^2? How can it be worse than n*log n for that matter? Honestly asking - it's been a while since I've studied this formally. But "between quadratic and cubic" for this intuitively seems wrong to me?
@LightnessRacesinOrbit duh, you are right, I messed up, fixed now.
0

Let’s start with log(1)+log(2)+log(3)+...+log(n).

Roughly half of the elements of this sum are greater than or equal to log(n/2) = log(n) - log(2). Hence a lower bound on the sum is n/2 * (log(n) - log(2)) = Omega(n log(n)). To get an upper bound, simply multiply n by the largest element, which is log(n); hence O(n log(n)).
