Your archetypal Θ(n log n) algorithm is a divide-and-conquer algorithm that divides (and recombines) the work in linear time and recurses over the pieces. Merge sort works that way: spend O(n) time splitting the input into two roughly equal pieces, recursively sort each piece, and spend Θ(n) time combining the two sorted halves.
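For concreteness, here is a minimal merge-sort sketch in Python (the names `merge_sort` and `merge` are mine for illustration, not from any particular library): the split is linear, each half is sorted recursively, and the merge is Θ(n).

```python
def merge_sort(xs):
    """Sort a list by splitting, recursing, and merging (Theta(n log n))."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2                 # linear-time split into two halves
    left = merge_sort(xs[:mid])        # recursively sort each piece
    right = merge_sort(xs[mid:])
    return merge(left, right)          # Theta(n) recombination

def merge(left, right):
    """Merge two sorted lists into one sorted list in linear time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out
```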
Intuitively, continuing the divide-and-conquer idea, each division stage takes linear time in total: since the time taken to divide a piece is linear in its size, the increase in the number of pieces exactly offsets the decrease in their size. The total running time is then the cost of one division stage multiplied by the number of stages. Since the size of the pieces is halved at each stage, there are log₂(n) division stages, so the total running time is n⋅log(n). (Up to a multiplicative constant, the base of the logarithm is irrelevant.)
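One way to make the "linear work per stage" claim concrete is to tally how much dividing-and-recombining work happens at each recursion depth. This is an illustrative sketch (the cost model of one unit per element per level is an assumption): every depth accounts for roughly n units of work, and there are about log₂(n) depths.

```python
from collections import defaultdict

def count_work(n, depth=0, work=None):
    """Tally the dividing/recombining work done at each recursion depth."""
    if work is None:
        work = defaultdict(int)
    if n <= 1:
        return work
    work[depth] += n                       # splitting + merging a piece of size n costs ~n
    count_work(n // 2, depth + 1, work)
    count_work(n - n // 2, depth + 1, work)
    return work

work = count_work(1024)
print(dict(work))   # every depth sums to 1024, and there are log2(1024) = 10 depths
```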
Putting this in equations, one way to estimate the running time T(n) of such an algorithm is to express it recursively: T(n) = 2⋅T(n/2) + Θ(n). It's clear that this algorithm takes more than linear time, and we can see how much more by dividing by n:
T(n)/n = T(n/2)/(n/2) + Θ(1)

When n doubles, T(n)/n increases by a constant amount, so T(n)/n grows logarithmically; in other words, T(n) = Θ(n log n).
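A quick numerical check of this argument, assuming the concrete recurrence T(n) = 2⋅T(n/2) + n with T(1) = 1 (i.e. fixing the hidden constants to 1): each doubling of n increases T(n)/n by exactly 1.

```python
def T(n):
    """Evaluate T(n) = 2*T(n/2) + n exactly, for n a power of two."""
    return 1 if n == 1 else 2 * T(n // 2) + n

for k in range(1, 11):
    n = 2 ** k
    # T(n)/n grows by exactly 1 each time n doubles: logarithmic growth of T(n)/n.
    print(n, T(n) / n, T(n) / n - T(n // 2) / (n // 2))
```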
This is an instance of a more general pattern: the master theorem. For any recursive algorithm that divides its input of size n into a pieces of size n/b and takes time f(n) to perform the division and recombination, the running time satisfies T(n) = a⋅T(n/b) + f(n). This leads to a closed form that depends on the values of a and b and the shape of f. If a = b and f(n) = Θ(n) (merge sort is the case a = b = 2), the master theorem states that T(n) = Θ(n log n).
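As a sanity check of the a = b case with a branching factor other than 2, here is a sketch using the hypothetical recurrence T(n) = 3⋅T(n/3) + n with T(1) = 1: the ratio T(n)/(n⋅log₃ n) approaches 1, consistent with T(n) = Θ(n log n).

```python
import math

def T3(n):
    """Evaluate T(n) = 3*T(n/3) + n exactly, for n a power of three."""
    return 1 if n == 1 else 3 * T3(n // 3) + n

for k in range(1, 9):
    n = 3 ** k
    print(n, T3(n) / (n * math.log(n, 3)))   # ratio tends to 1, i.e. Theta(n log n)
```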