Data Structures And Algorithms

Big O, Big Theta (Θ), Big Omega (Ω)

Big O represents the worst-case time complexity (upper bound) of an algorithm as the input size increases. Say you have a function that finds the minimum or maximum value in an array. That array could be of any size: it might have 2 elements, or it might have 1 million. Traversing an array of 1 million elements obviously takes much more time and uses more of the computer's resources.

As the input size increases, the time and resources required increase proportionally, so we say this function has O(n) time complexity. In the worst case the algorithm must traverse the entire array, so the worst-case (Big O) time complexity is O(n), where n is the size of the array.
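A minimal Python sketch of such a function (the name find_min is illustrative): the loop visits every element exactly once, so the running time is linear in the array's size.

```python
def find_min(arr):
    """Return the smallest value in arr by checking every element: O(n)."""
    if not arr:
        raise ValueError("array must not be empty")
    smallest = arr[0]
    for value in arr[1:]:  # always visits every remaining element -> linear time
        if value < smallest:
            smallest = value
    return smallest

print(find_min([7, 3, 9, 1, 5]))  # -> 1
```

Doubling the array's length doubles the number of comparisons, which is exactly what O(n) growth means.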

In the notation O(g(n)), g(n) is the mathematical function that describes the growth rate, for example g(n) = n for linear growth or g(n) = n² for quadratic growth.

Upper bound and worst case are used interchangeably; they mean the maximum amount of time the algorithm will take to run. The algorithm's running time will never exceed this bound.

Ω(g(n)) (Big Omega) is the best-case (lower bound) time complexity of an algorithm. The algorithm being described is going to grow AT LEAST as fast as g(n); it cannot grow any slower than Ω(g(n)).

Ω(1) is the best possible lower bound: no matter how large the input is, in the best case the function takes constant time.

Ω(n) means that even in the best case, an algorithm's running time grows linearly as the input size increases. When the upper and lower bounds match, we write Θ(g(n)) (Big Theta): the algorithm grows exactly at the rate of g(n).

Examples of Ω(1) algorithms:
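A classic example is linear search, sketched below in Python (the name linear_search is illustrative). If the target happens to be the first element, the loop stops after one comparison, so the best case is Ω(1), even though the worst case is O(n).

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if it is absent.

    Best case Ω(1): target is at index 0 and the loop stops immediately.
    Worst case O(n): target is last in the array or not present at all.
    """
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1

print(linear_search([4, 8, 15, 16], 4))   # -> 0 (best case: one comparison)
print(linear_search([4, 8, 15, 16], 23))  # -> -1 (worst case: checked all)
```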
Examples of Ω(n) algorithms:
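Summing an array is a simple example, sketched here in Python (the name array_sum is illustrative). There is no shortcut: every element must be read before the total is known, so even the best case is linear.

```python
def array_sum(arr):
    """Sum all elements of arr.

    Every element must be read even in the best case, so the running
    time is Ω(n); it is also O(n), which makes it Θ(n) overall.
    """
    total = 0
    for value in arr:  # no input lets us skip any element
        total += value
    return total

print(array_sum([2, 4, 6]))  # -> 12
```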