Time Complexity Overview
Suppose we have two algorithms, "X" and "Y." How do we compare them? We could compare their structure and steps, but in general we compare their complexities to judge their efficiency. There are two kinds of complexity: time complexity and space complexity. Today, we will talk about time complexity only.
What is Time complexity?
Definition: Time complexity describes the efficiency of your code in terms of time, that is, how the number of operations required to execute the algorithm grows as the data size increases. More operations take more time.
Example: Suppose we have an unsorted array of size n, and we have to detect whether element x is present in the array or not. If we go through all the elements of the array, we are bound to find the answer (present or not). If the element is at the very end, or not present at all, we need at most n operations. Therefore, the complexity is O(n) [Big O of n].
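The linear scan described above can be sketched in Python (a minimal illustration; the function name is our own):

```python
def linear_search(arr, x):
    """Check every element in turn; the worst case touches all n elements, so O(n)."""
    for i, value in enumerate(arr):
        if value == x:
            return i  # found: return its index
    return -1  # scanned everything, x is not present

# Worst case: x is the last element (or absent), so all n elements are checked.
print(linear_search([4, 2, 7, 1], 7))  # -> 2
print(linear_search([4, 2, 7, 1], 9))  # -> -1
```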
There are several common terms used to describe an algorithm's time complexity:
- O(1)
- O(log(n))
- O(n)
- O(n log(n))
- O(n²)
- O(n³)
...and so on.
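To get a feel for how differently these rates grow, a small Python sketch can print rough operation counts for a few input sizes (the numbers are illustrative, not exact costs):

```python
import math

# Approximate operation counts for each growth rate as n increases.
for n in (10, 100, 1000):
    print(f"n={n:>5}: O(1)=1  O(log n)={math.log2(n):.0f}  "
          f"O(n)={n}  O(n log n)={n * math.log2(n):.0f}  O(n^2)={n ** 2}")
```

Notice how O(n²) explodes while O(log n) barely moves: at n = 1000, the quadratic algorithm does about a million operations while the logarithmic one does about ten.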
O(1) is pronounced as "Big O of 1," while O(n) is pronounced as "Big O of n."
Let’s briefly explore the meaning of these terms:
- O(1) – Constant Time Complexity: This indicates that the number of operations your algorithm performs is constant, regardless of the input size. It's not necessarily "1"—it could be 2, 3, or even 1000—but the key point is that it doesn't grow with the number of elements.
- O(n) – Linear Time Complexity: This indicates that the number of operations grows proportionally with the input size; in the worst case, the algorithm takes on the order of n operations.
What About Ω (Omega) and Θ (Theta)?
Do we always use Big O for time complexity? Mostly, yes—but not always.
- Ω (Omega): This represents the best case or the lower bound of an algorithm. It indicates the minimum number of operations an algorithm can take. For example, a linear search algorithm might find an element in just one step in the best case, giving it an Ω(1) complexity. Similarly, binary search has an Ω(1) complexity because it may find the target on the first check.
- Θ (Theta): This represents the exact (tight) bound of an algorithm: the running time grows at the same rate in both the best and worst case. For instance, if the best-case and worst-case time complexities are the same (like counting the exact number of students in a class, which always requires n operations), we say the algorithm has Θ(n) complexity. Informally, Θ is also often used when describing the average case.
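Binary search, mentioned above, illustrates both bounds in one algorithm. Here is a minimal Python sketch (assuming the input array is already sorted):

```python
def binary_search(sorted_arr, x):
    """Halve the search range each step: worst case O(log n); best case Omega(1),
    when the middle element happens to be the target on the very first probe."""
    lo, hi = 0, len(sorted_arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_arr[mid] == x:
            return mid  # found
        if sorted_arr[mid] < x:
            lo = mid + 1  # target must be in the right half
        else:
            hi = mid - 1  # target must be in the left half
    return -1  # not present

print(binary_search([1, 3, 5, 7, 9], 5))  # -> 2 (first probe hits: the Omega(1) best case)
print(binary_search([1, 3, 5, 7, 9], 9))  # -> 4 (several halvings: closer to the O(log n) worst case)
```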
In Conclusion
- Big O: Upper Bound
- Omega: Lower Bound
- Theta: Tight (Exact) Bound