How do we get the order of an algorithm?

Hi All,

The efficiency of an algorithm is defined in terms of "O" notation. I am clear about it to some extent, but I have some doubts too.

If, for a particular algorithm, I use a "for" loop that loops once over all "n" numbers, the order is O(n); if we loop over them twice it becomes O(2n); and if one "for" loop is nested inside another "for" loop it becomes O(n^2).
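
For instance, I picture it roughly like this (just a rough Python sketch, the names are mine):

# one loop over all n numbers -> O(n)
def sum_all(nums):
    total = 0
    for x in nums:          # runs n times
        total += x
    return total

# a "for" loop inside another "for" loop -> O(n^2)
def count_pairs(nums):
    count = 0
    for x in nums:          # runs n times
        for y in nums:      # runs n times for every x -> n * n steps
            if x < y:
                count += 1
    return count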

My doubt is: how do they get O(n log n) or O(n^2 log n)? That is, I am not clear how this "log" chips in.

For example, the "Merge sort" algorithm is O(n log n). How do we get this "log"?

Thanks in advance for helping 😀

Regards

Replies

  • skipper
    Order notation is an approximation of the amount of time it will take for an algorithm to process some input. It is approximate because it doesn't guarantee the algorithm will halt for all inputs.

    You need the recurrence for the algorithm, T(n): T(m), T(m+1), ..., T(n-1), T(n); m = 0, 1, 2, ...

    The reason a sort is more efficient than simply iterating over the set each time is that even a binary search over sorted data takes about log2(n) steps, better than the 2n or n^2 worst case.
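
    To see the difference concretely, here is a rough Python sketch (the function names are made up): a linear scan may touch all n items, while a binary search on a sorted list halves the remaining range each step, so it needs only about log2(n) steps.

    def linear_search(sorted_nums, target):
        # worst case looks at all n items -> O(n)
        for i, x in enumerate(sorted_nums):
            if x == target:
                return i
        return -1

    def binary_search(sorted_nums, target):
        # halves the remaining range each step -> about log2(n) steps
        lo, hi = 0, len(sorted_nums) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_nums[mid] == target:
                return mid
            elif sorted_nums[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1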
  • mech_guy
    Hi skipper,

    Thanks for replying, but I guess you didn't get my question.

    I want to know how the order of an algorithm comes to be expressed in terms of "log". Where does this "log" come from?

    Hope it's clear now 😀

    Thanks
  • tashirosgt
    Example: I've picked a number x between 1 and 64. Guess which number it is.
    Use the algorithm that asks a series of questions like:
    Is x >32?
    No
    Is x > 16?
    Yes
    Is x > 24?
    No
    Is x > 20?
    ...
    etc.


    After about 6 questions you know the answer for x.

    2^6 = 64.

    6 = log base 2 of 64.

    For N things, you need about log base 2 of N questions.
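
    If you want to see it in code, here is a little Python sketch of that guessing game (the function name is just made up): it halves the range 1..64 with each question and counts how many questions it asks.

    def guess(secret, low=1, high=64):
        # narrow the range [low, high] until only one value is left,
        # counting how many "Is x > mid?" questions we ask
        questions = 0
        while low < high:
            mid = (low + high) // 2
            questions += 1
            if secret > mid:   # answer is "Yes"
                low = mid + 1
            else:              # answer is "No"
                high = mid
        return low, questions

    print(guess(21))  # (21, 6): found in 6 questions, since 2^6 = 64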
  • sarveshgupta
    In Merge sort the running time splits into the time to divide, the time to sort the two halves, and the time to combine (merge) them.

    Its recurrence is T(n) = 2T(n/2) + theta(n) + theta(1)

    2T(n/2) = time to sort the two halves

    theta(n) = combine (merge) time

    theta(1) = divide time

    Note that when we use the Master Method to solve this recurrence we get O(n log n).
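
    Here is a bare-bones merge sort in Python (just a sketch to show where the recurrence comes from, not an optimised implementation): each call sorts the two halves recursively (the 2T(n/2) part) and then merges them in linear time (the theta(n) part).

    def merge_sort(nums):
        # base case / divide: theta(1) work
        if len(nums) <= 1:
            return nums
        mid = len(nums) // 2
        # sort the two halves: the 2T(n/2) part
        left = merge_sort(nums[:mid])
        right = merge_sort(nums[mid:])
        # combine (merge): theta(n) work
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]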

    In any sorting algorithm, or any other algorithm where the number of terms grows (as in a tree) or shrinks by halving (as in sorting), we can generalise the case like this:

    1st pass: n comparisons

    2nd pass: n/2 comparisons

    ... and so on, until only 1 comparison is left.

    this means n/(2^i) = 1

    n = 2^i

    Taking log on both sides: i = log of n to base 2.

    This is how the log factor comes in.
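
    A quick way to check that halving argument numerically (again, just a sketch):

    import math

    def halvings_until_one(n):
        # count how many times n can be halved before it reaches 1
        i = 0
        while n > 1:
            n //= 2
            i += 1
        return i

    for n in [8, 64, 1024]:
        print(n, halvings_until_one(n), math.log2(n))
    # prints: 8 3 3.0, then 64 6 6.0, then 1024 10 10.0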
  • mech_guy
    Aha!! Cool mates 😀

    Got it.

    Thanks a lot sarvesh and tashirosgt 😀
