Algorithm complexity and time-space trade-off

An algorithm is a well-defined list of steps for solving a problem. Each step of an algorithm has a clear meaning and can be performed with a finite amount of effort in a finite length of time. Every algorithm must satisfy the following criteria:

  • Inputs: Zero or more quantities which are externally supplied to the algorithm. 
  • Output: At least one quantity is produced as output.
  • Definiteness: Each step must be clear and unambiguous. 
  • Finiteness: All steps for all cases of an algorithm will terminate after a finite amount of time. 
  • Effectiveness: Each step must be basic enough to be carried out exactly, using a finite amount of effort.

The efficiency of an algorithm is measured in terms of the time and space it uses. The complexity of an algorithm is expressed as a function of the input size that gives the running time and/or space.


Suppose M is an algorithm and n is the size of its input data. The efficiency of M is measured in terms of the time and space the algorithm uses. Time is measured by counting the number of operations performed, and space by the maximum amount of memory consumed by M.

The complexity of M is the function f(n) that gives the running time and/or space in terms of n. In complexity theory, we find the complexity f(n) for three major cases:

  • Worst case: f(n) is the maximum value over all possible inputs of size n.
  • Average case: f(n) is the expected value over all possible inputs of size n.
  • Best case: f(n) is the minimum possible value.

These ideas are illustrated by taking the example of the Linear search algorithm.

Suppose LIST is a linear array with n elements, and we want to find the location of a specific ITEM of information. The linear search algorithm compares ITEM with each element in the array until it finds a location INDEX where LIST[INDEX] = ITEM. If ITEM does not appear in LIST, it signals this with a sentinel value such as INDEX = -1.
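The algorithm described above can be sketched as follows (the function and variable names are chosen for illustration; they mirror the LIST/ITEM/INDEX notation used here):

```python
def linear_search(lst, item):
    """Return the first index where lst[index] == item, or -1 if absent."""
    for index, value in enumerate(lst):
        if value == item:  # one comparison per element examined
            return index
    return -1  # ITEM does not appear in LIST
```

For example, `linear_search([5, 3, 8], 8)` returns 2, while searching for a missing value returns -1.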

The complexity of the search algorithm is given by the number of comparisons between ITEM and LIST[K].

It is clear that the worst case occurs when ITEM is not in LIST or is the last element of LIST. In both cases, n comparisons are made. That is, the worst-case complexity of the linear search algorithm is C(n) = n.

For the average case, assume ITEM is equally likely to occur at any of the n positions in LIST. Then the number of comparisons can be 1, 2, 3, …, n, and each value occurs with probability p = 1/n. Then

C(n) = 1\cdot\frac{1}{n} + 2\cdot\frac{1}{n} + \cdots + n\cdot\frac{1}{n}
C(n) = (1 + 2 + \cdots + n)\cdot\frac{1}{n}
C(n) = \frac{n(n+1)}{2}\cdot\frac{1}{n} = \frac{n+1}{2}

That is, the average number of comparisons needed to find the location of ITEM is approximately half the number of elements.
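The worst-case and average-case figures above can be checked empirically by counting comparisons directly. This is an illustrative sketch, not part of the algorithm itself:

```python
def comparisons_to_find(lst, item):
    """Count how many comparisons linear search makes for this item."""
    for count, value in enumerate(lst, start=1):
        if value == item:
            return count
    return len(lst)  # item absent: every element was compared

n = 100
lst = list(range(n))

# Worst case: searching for the last element takes n comparisons.
worst = comparisons_to_find(lst, n - 1)      # 100

# Average over all n equally likely positions: (n + 1) / 2.
average = sum(comparisons_to_find(lst, k) for k in lst) / n  # 50.5
```

Here `worst` is exactly n = 100 and `average` is (n + 1)/2 = 50.5, matching the derivation.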

Space-time trade-off

Sometimes the choice of a data structure involves a space-time trade-off: by increasing the amount of space used to store the data, we may be able to reduce the time needed to process it, or vice versa.
