# Big O notation

Big O notation is a mathematical notation used in complexity theory to describe the
asymptotic behavior of functions. In short, it tells you how fast a function grows or declines.

Suppose M is an algorithm and n is the size of its input. As n increases, the complexity f(n) of M also increases. The rate of growth of f(n) is usually examined by comparing f(n) with standard functions such as $\log n$, $n \log n$, $n^2$, $2^n$, etc.
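To get a feel for how differently these standard functions grow, here is a small sketch that tabulates each of them for a few input sizes (the choice of sizes is arbitrary):

```python
import math

# Evaluate the standard comparison functions log n, n log n, n^2 and 2^n
# for a few sample input sizes, to see how quickly each one grows.
rows = [(n, math.log2(n), n * math.log2(n), n**2, 2**n) for n in (2, 8, 32)]

for n, log_n, n_log_n, n_sq, two_n in rows:
    print(f"n={n:>3}  log n={log_n:5.1f}  n log n={n_log_n:7.1f}  "
          f"n^2={n_sq:6}  2^n={two_n}")
```

Already at n = 32, $2^n$ exceeds four billion while $n^2$ is only 1024, which is why exponential-time algorithms are considered impractical for all but tiny inputs.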

Big O is defined as follows: suppose f(n) and g(n) are functions defined on the positive integers. We write

$$f(n)=O(g(n))$$

if and only if there exist a positive integer $n_0$ and a positive constant c such that

$$f(n) \leq c \cdot g(n) \quad \text{for every } n \geq n_0.$$

Big O gives an upper bound on an algorithm's growth. To find the Big O notation of an algorithm: first, express the number of operations the algorithm performs as a function of n; then drop all but the fastest-growing term; finally, remove any constant factors.
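As a concrete illustration of the first step (expressing the operation count in terms of n), here is a hypothetical example algorithm instrumented to count its own steps. The function `count_pairs` and its step counter are my own choices, not something from the text:

```python
def count_pairs(items):
    """Count ordered pairs (i, j) with items[i] > items[j].

    A hypothetical example algorithm: the `steps` counter tracks how many
    times the inner loop body runs, so we can express the work in terms of n.
    """
    n = len(items)
    steps = 0
    count = 0
    for i in range(n):        # outer loop runs n times
        for j in range(n):    # inner loop runs n times for each i
            steps += 1        # so the body executes n * n = n^2 times
            if items[i] > items[j]:
                count += 1
    return count, steps

count, steps = count_pairs([3, 1, 2])
print(count, steps)  # steps == 3 * 3 == 9
```

Here the operation count is $f(n) = n^2$ plus some constant loop overhead; dropping the constants, the algorithm is $O(n^2)$.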

For example, suppose that when analyzing some algorithm, the time (or number of steps) it takes to complete a problem of size n is given by $C(n) = 6n^2 + 5n + 9$.
If we ignore the slower-growing terms and the constants, we can say "C(n) grows on the order of $n^2$" and write $C(n) = O(n^2)$.
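We can check this claim directly against the definition by exhibiting a witness pair. The values c = 7 and n₀ = 7 below are my own choice (any sufficiently large pair works), since $6n^2 + 5n + 9 \leq 7n^2$ exactly when $n^2 \geq 5n + 9$:

```python
def C(n):
    # The example cost function from the text: C(n) = 6n^2 + 5n + 9.
    return 6 * n**2 + 5 * n + 9

# Witness pair (an arbitrary valid choice): c = 7, n0 = 7.
c, n0 = 7, 7

# The bound C(n) <= c * n^2 holds for every n >= n0 we sample...
assert all(C(n) <= c * n**2 for n in range(n0, 10_000))

# ...but fails just below n0 with this particular c, showing n0 matters.
assert C(6) > c * 6**2
```

The assertions only sample a finite range, of course; the inequality $n^2 \geq 5n + 9$ for all $n \geq 7$ is what guarantees the bound holds beyond it.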