Big O Notation

Big O notation is a way of describing how the compute time (or memory) a function needs grows as the number of data points it has to process increases.

How code slows as data grows.

https://en.wikipedia.org/wiki/Big_O_notation

From ChatGPT

Big O notation is a mathematical notation that describes the limiting behavior of a function when its argument approaches infinity. In the context of computer science and algorithm analysis, Big O notation is used to characterize the performance or complexity of an algorithm with respect to its input size. It provides an upper bound on the growth rate of the algorithm's time or space complexity.
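
For example, an algorithm that takes 3n^2 + 5n + 2 steps on an input of size n is O(n^2): constant factors and lower-order terms are dropped, and only the dominant growth term remains.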

Figure: input size vs. time plots for the different O notations.

Types of complexity

O(1)

Constant time complexity.

The algorithm takes the same amount of time regardless of the size of the input.

  • accessing an array element by index
  • hash table lookups and inserts (average case)
  • queue / stack push and pop operations
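
A rough Python sketch of these operations (the variable names are just illustrative): each one takes the same time whether the container holds four elements or four million.

    from collections import deque

    values = [10, 20, 30, 40]
    ages = {"alice": 30, "bob": 25}

    third = values[2]          # O(1): direct index access
    age = ages["alice"]        # O(1) on average: hash table lookup

    queue = deque(values)
    first = queue.popleft()    # O(1): dequeue from the front
    print(third, age, first)   # -> 30 30 10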

O(log n)

Logarithmic time complexity

As the input grows, the rate at which the running time increases slows down; typically the remaining work is halved at each step.

Examples include

  • binary search
  • binary heap operations (insert, extract-min)
  • search in a balanced binary search tree
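
A minimal binary search sketch in Python (the name binary_search is just illustrative): each step halves the remaining range, so about log2(n) steps are needed.

    def binary_search(sorted_items, target):
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid            # found: return its index
            elif sorted_items[mid] < target:
                lo = mid + 1          # discard the lower half
            else:
                hi = mid - 1          # discard the upper half
        return -1                     # target not present

    print(binary_search([1, 3, 5, 7, 9, 11], 7))  # -> 3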

O(n)

Linear time complexity

As the input increases, the time taken increases at the same rate: doubling the data roughly doubles the time.
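
A typical linear-time pattern is a single pass over the input, such as finding the maximum. A rough Python sketch (find_max is just an illustrative name):

    def find_max(items):
        best = items[0]
        for value in items:       # visits every element exactly once
            if value > best:
                best = value
        return best

    print(find_max([4, 8, 15, 16, 23, 42]))  # -> 42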

O(n log n)

Linearithmic time complexity. Grows slightly faster than linear; examples include merge sort, heapsort, and efficient comparison-based sorts in general.
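
A rough merge sort sketch in Python (merge_sort is just an illustrative name): the list is halved about log2(n) times and each level does O(n) merging work, giving O(n log n) overall.

    def merge_sort(items):
        if len(items) <= 1:
            return items
        mid = len(items) // 2
        left = merge_sort(items[:mid])
        right = merge_sort(items[mid:])
        # merge the two sorted halves
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    print(merge_sort([5, 2, 9, 1, 7]))  # -> [1, 2, 5, 7, 9]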

O(n^2), O(n^3), ...

Polynomial time complexity

  • nested loops over the input, e.g. comparing every pair of elements (see the sketch below)
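
A rough Python sketch of a quadratic nested-loop pattern (has_duplicate is just an illustrative name): every pair of elements may be compared, so the work grows with n^2.

    def has_duplicate(items):
        for i in range(len(items)):
            for j in range(i + 1, len(items)):   # inner loop over the remaining items
                if items[i] == items[j]:
                    return True
        return False

    print(has_duplicate([3, 1, 4, 1, 5]))  # -> True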

O(2^n)

Exponential time complexity

  • brute force algorithms that try every combination, e.g. every subset of the input (see the sketch below)
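
A rough Python sketch of a brute-force pattern (subset_sums is just an illustrative name): each element is either in or out of a subset, so there are 2^n combinations to try.

    def subset_sums(items):
        sums = []
        for mask in range(2 ** len(items)):     # one bitmask per subset
            total = sum(items[i] for i in range(len(items)) if mask & (1 << i))
            sums.append(total)
        return sums

    print(subset_sums([1, 2, 3]))   # 2^3 = 8 subset sums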

O(n!)

Factorial time complexity. Typical of combinatorial algorithms that enumerate every ordering (permutation) of the input, such as a brute-force solution to the traveling salesman problem.
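
A rough sketch: enumerating every ordering of the input (as a brute-force traveling-salesman search would) visits n! permutations.

    from itertools import permutations

    cities = ["A", "B", "C", "D"]
    routes = list(permutations(cities))   # 4! = 24 possible orderings
    print(len(routes))                    # -> 24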