When computing the running time for some set of operations, how do you know whether something is log base 3, 2, or 10? I will refer to log base x as log_x from here on out.
A set of operations is said to take log_x n steps if each step decreases the remaining work by a multiplicative factor of x. Take merge sort, for example: we divide the list in half at each level, so it takes log_2 n levels (n being the size of the list). So if you're dividing the work by a constant factor each time, you end up with a logarithmic number of steps.
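To make that concrete, here is a minimal Python sketch (my own illustration, not part of the original discussion) that counts how many times n can be divided by a constant factor before the work shrinks to 1; the count comes out to roughly log_x n:

def count_steps(n, factor):
    # Count how many times n can be divided by `factor` before it reaches 1.
    steps = 0
    while n > 1:
        n //= factor  # shrink the remaining work by a constant factor
        steps += 1
    return steps

print(count_steps(128, 2))  # 7, and log_2(128) = 7
print(count_steps(81, 3))   # 4, and log_3(81) = 4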
Also, all logs can be reduced to log_2. Why and how, you might be wondering, since you've probably noticed that algorithm running times are almost always written with log_2.
That is because you can convert between logs of any base: they are all the same to within a constant factor.
Equation:
log_b (n) = log_x (n) / log_x (b)
Example of converting from base 2 to base 10:
log_2 (n) = log_10 (n) / log_10 (2)
log_2 (n) = log_10 (n) / constant
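Here is a quick Python check of the change-of-base rule (my own sketch, not from the post):

import math

n = 1024
direct = math.log2(n)                      # log_2(n) computed directly
converted = math.log10(n) / math.log10(2)  # log_10(n) / log_10(2)
print(direct, converted)  # both are 10.0, up to floating-point rounding
print(math.log10(2))      # ~0.30103, the constant factor between the two bases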
Real example of why a function's running time is logarithmic:
Take a list with 128 elements and apply merge sort to it. With merge sort, you keep dividing the list in half:
128
64 64
32 32 32 32
16 16 16 16 16 16 16 16
...
2 2 ... 2 2
1 1 ... 1 1
At each level you process all n numbers, and it takes 7 halvings to get from the full list of 128 down to single elements, which is where log_2 128 = 7 comes from.
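As a rough Python sketch (my own instrumentation, not a canonical implementation), you can have merge sort report how deep the halving goes; for 128 elements it bottoms out after 7 levels:

def merge_sort(lst, depth=0):
    # Sort lst and also report the deepest recursion level reached.
    if len(lst) <= 1:
        return lst, depth
    mid = len(lst) // 2
    left, d_left = merge_sort(lst[:mid], depth + 1)
    right, d_right = merge_sort(lst[mid:], depth + 1)
    # Merge the two sorted halves; this pass touches every element once.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, max(d_left, d_right)

sorted_list, levels = merge_sort(list(range(128)))
print(levels)  # 7 -> it takes log_2(128) = 7 halvings to reach single elements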
What the running time looks like when plotted on an x, y axis:
The logarithmic function measures a rate of growth. It is the inverse of the exponential function, so it is a slow-growth function (the opposite of an exponential function).
An exponential function's graph looks like the right half of a U: it starts flat near the origin (crossing the y-axis at (0, 1) for 2^x) and then shoots upward. We care only about the +x, +y quadrant. Since the logarithm is the inverse, its graph is that curve flipped across the line y = x: a U tipped onto its right side, with half in the +x, +y quadrant and half in the +x, -y quadrant, crossing the x-axis at (1, 0). You can tell that for very large input, growth will be slow.
Inverse relationship example:
log_2 128 = 7
2^7 = 128
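A trivial check of that inverse relationship in Python (my own addition):

import math

print(math.log2(128))  # 7.0
print(2 ** 7)          # 128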