Jul 30, 2024
This lecture focuses on different asymptotic notations used to describe the time complexity of algorithms. The discussion includes Big O (O), Omega (Ω), and Theta (Θ) notations, with examples for various functions and their complexities.
Example: 2n^2 + 3n + 4 is O(n^2), since 2n^2 + 3n + 4 <= 9n^2 for large values of n.
In general, F(n) = O(G(n)) means F(n) <= c * G(n) for some constant c and sufficiently large n.
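The inequality above can be sanity-checked numerically. A minimal sketch (the constant c = 9 is taken from the example; the function and helper names are illustrative):

```python
# Check that 2n^2 + 3n + 4 <= 9 * n^2 for n >= 1,
# witnessing f(n) = O(n^2) with constant c = 9.
def f(n):
    return 2 * n**2 + 3 * n + 4

def is_bounded_above(n, c=9):
    return f(n) <= c * n**2

print(all(is_bounded_above(n) for n in range(1, 1000)))  # prints True
```

Note that c = 9 works even at n = 1 (f(1) = 9 = 9 * 1^2); the definition only requires the bound to hold from some point onward.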
Example: 2n^2 + 3n + 4 is Ω(n^2), since 2n^2 + 3n + 4 >= 1 * n^2 for large values of n.
In general, F(n) = Ω(G(n)) means F(n) >= c * G(n) for some constant c and sufficiently large n.
Example: 2n^2 + 3n + 4 is Θ(n^2), since it lies between 1 * n^2 and 9 * n^2 for large n.
In general, F(n) = Θ(G(n)) means a * G(n) <= F(n) <= b * G(n) for some constants a and b and sufficiently large n.
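The two-sided Theta bound can be checked the same way. A minimal sketch (the constants a = 1 and b = 9 come from the example above; the names are illustrative):

```python
# Check the two-sided bound 1 * n^2 <= 2n^2 + 3n + 4 <= 9 * n^2 for n >= 1,
# witnessing f(n) = Θ(n^2) with constants a = 1 and b = 9.
def f(n):
    return 2 * n**2 + 3 * n + 4

def theta_holds(n, a=1, b=9):
    return a * n**2 <= f(n) <= b * n**2

print(all(theta_holds(n) for n in range(1, 1000)))  # prints True
```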
Example: n^2 log n + n is O(n^2 log n), since it is less than 10 * n^2 log n for large n; it is Ω(n^2 log n), since it is greater than 1 * n^2 log n for large n; therefore it is Θ(n^2 log n), a tight bound. Note that n^2 log n lies between n^2 and n^3 in the growth hierarchy.
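These orderings can also be verified numerically. A minimal sketch, assuming base-2 logarithms and the constant 10 from the example (both choices are illustrative; the asymptotic claims do not depend on the base):

```python
import math

# Check for n >= 2 that:
#   (1) n^2 <= n^2 * log2(n) <= n^3   (n^2 log n sits between n^2 and n^3)
#   (2) n^2 * log2(n) + n <= 10 * n^2 * log2(n)   (the O(n^2 log n) bound)
def g(n):
    return n**2 * math.log2(n) + n

ok = all(
    n**2 <= n**2 * math.log2(n) <= n**3
    and g(n) <= 10 * n**2 * math.log2(n)
    for n in range(2, 1000)
)
print(ok)  # prints True
```

The check starts at n = 2 because log2(1) = 0, and asymptotic bounds only need to hold for sufficiently large n.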
Example: n! (factorial) is O(n^n) (upper bound, since n! <= n^n) and Ω(1) (lower bound, since n! >= 1). n! has no simple tight (Θ) bound of this kind: its known upper and lower bounds, O(n^n) and Ω(1), are far apart.
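The two bounds on n! can be confirmed for small n. A minimal sketch (the helper name is illustrative):

```python
import math

# Check 1 <= n! <= n^n for n >= 1: the Ω(1) lower bound and
# the O(n^n) upper bound on the factorial.
def within_bounds(n):
    return 1 <= math.factorial(n) <= n**n

print(all(within_bounds(n) for n in range(1, 60)))  # prints True
```

The upper bound holds because n! is a product of n factors, each at most n, so n! <= n * n * ... * n = n^n.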
Example: log(n!) is O(n log n) (upper bound) and Ω(1) (lower bound).
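The O(n log n) bound on log(n!) follows because log(n!) is a sum of n terms, each at most log(n). A minimal sketch (natural logarithms and the helper name are illustrative choices):

```python
import math

# log(n!) = log(1) + log(2) + ... + log(n); each term is <= log(n),
# so log(n!) <= n * log(n), witnessing log(n!) = O(n log n) with c = 1.
def log_factorial(n):
    return sum(math.log(i) for i in range(1, n + 1))

print(all(log_factorial(n) <= n * math.log(n) for n in range(2, 500)))  # prints True
```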
.Advice: Prefer using Theta (Θ) for exact complexity when possible.
Think of these notations like estimating the price of a mobile phone: O is like saying the price is at most some amount (an upper bound), Ω is like saying it is at least some amount (a lower bound), and Θ is like quoting the price exactly (a tight bound).
Understanding and applying these notations correctly can greatly assist in analyzing and optimizing algorithm performance in computer science.
Next: Properties of asymptotic notations (covered in the next video).