Complexity Analysis of a Program | Overview

Why should we even care about the complexity of a program?

Suppose you and your friend each wrote an algorithm (a finite sequence of instructions) that solves the same problem. How do you know which algorithm is more efficient, and how do you choose the most efficient one? This is where complexity analysis plays a major role.

So how do we compare the two algorithms?

We can compare algorithms on the basis of their space complexity (the amount of memory consumed) and time complexity (the number of operations performed).

What is Time Complexity?

The time complexity is the computational complexity that represents the amount of time it takes to run an algorithm. Time complexity is commonly estimated by counting the number of operations performed by an algorithm, supposing that each operation takes a fixed amount of time to perform. Such operations include single additions, multiplications, assignments, etc. We may leave some operations uncounted and concentrate on those that are performed the largest number of times.

The complexity is described in Big-O notation.

Note: when calculating the complexity we can ignore constant factors, i.e. regardless of whether the loop is executed 20n times or n/5 times, we still have a complexity of O(n), even though the actual running time of the program may vary.
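As a small illustration (an example of my own, not from the article), both functions below are O(n), even though the second performs roughly twenty times as many operations:

def single_pass(n):
    # The loop body runs n times: O(n).
    total = 0
    for i in range(n):
        total += i
    return total

def twenty_passes(n):
    # The loop body runs 20 * n times: still O(n),
    # because the constant factor 20 is ignored in Big-O.
    total = 0
    for _ in range(20):
        for i in range(n):
            total += i
    return total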

Some examples of time complexity:

Example 1: Constant time — O(1).

def func(n):
    result = n * n
    return result

In this example, there is always a fixed, constant number of operations, so the time complexity of this program is Big-O(c) or Big-O(1) (where c is a constant).

Example 2: Logarithmic time — O(log n).

def func(n):
    result = 0
    while n > 1:
        n //= 2
        result += 1
    return result

In this example, the value of n is halved on each iteration of the loop. If n = 2^x, the loop runs x times; since log₂ n = x, the time complexity is O(log n).
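For a quick sanity check (an illustrative snippet using the function above, not part of the original example), the number of iterations matches log₂ n for powers of two:

# Assuming func is the logarithmic example above:
print(func(16))    # 16 -> 8 -> 4 -> 2 -> 1: four iterations, and log2(16) = 4
print(func(1024))  # ten iterations, and log2(1024) = 10

Now, how long would the program below take to execute, depending on the input data?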

Example 3: Linear time — O(n).

def func(n, A):
    for i in range(n):
        if A[i] == 0:
            return i
    return -1

Note that if the first value of array A is 0, the program ends immediately. But remember, when analyzing time complexity we should consider the worst case: if 0 appears only at the last position (or not at all), the comparison is performed for every element of the array, so we perform n operations and the time complexity is Big-O(n).
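To make the best and worst cases concrete, here is a quick illustration using the function above (the sample arrays are made up for demonstration):

# Best case: 0 is the first element, so only one comparison is made.
print(func(5, [0, 7, 3, 9, 4]))   # prints 0

# Worst case: 0 is the last element, so all n elements are compared.
print(func(5, [7, 3, 9, 4, 0]))   # prints 4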

Time Limit

Nowadays, an average computer can perform 10⁷ to 10⁸ operations in one second. Sometimes a problem comes with an explicit time limit, and sometimes it does not.

Many programmers complain that their solutions to competitive programming problems end up with TLE (Time Limit Exceeded). The time limit set for online tests is usually from 1 to 5 seconds at most. During contests, we are often given a limit on the size of the input data, from which we can guess the time complexity within which the task should be solved.
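As a rough back-of-the-envelope check (my own sketch; the helper fits_in_limit is made up, and it uses the 10⁷ to 10⁸ operations-per-second estimate from above), the input size tells us which complexities can fit in the limit:

import math

OPS_PER_SECOND = 10**8   # optimistic estimate from the paragraph above

def fits_in_limit(operations, limit_seconds=1):
    # True if the estimated number of operations fits in the time limit.
    return operations <= OPS_PER_SECOND * limit_seconds

n = 10**5
print(fits_in_limit(n * n))                   # O(n^2): 10^10 operations, too slow
print(fits_in_limit(int(n * math.log2(n))))   # O(n log n): about 1.7 * 10^6, fits easily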

How to overcome Time Limit Errors:

  1. Reduce the bounds of loops where possible
  2. Optimize your algorithm for larger inputs
  3. Choose faster input and output methods (see the sketch below)
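For the third point, a common trick in Python (shown here as a sketch; the helper names are made up) is to read the whole input at once through sys.stdin instead of calling input() repeatedly:

import sys

def read_ints_slow(n):
    # Calls input() once per line; simple, but slow for large inputs.
    return [int(input()) for _ in range(n)]

def read_ints_fast(n):
    # Reads all remaining input at once and splits it; usually much faster.
    data = sys.stdin.read().split()
    return [int(x) for x in data[:n]]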

Space Complexity

Memory limits provide information about the expected space complexity. You can estimate the number of variables that you can declare in your programs.

More specifically, space complexity is the amount of memory needed to perform the computation. It includes all variables, both global and local, dynamic pointer data structures and, in the case of recursion, the contents of the stack. Depending on the convention, input data may also be included. Space complexity is trickier to calculate than time complexity because not all of these variables and data structures may be needed at the same time. Global variables exist and occupy memory all the time; local variables (and additional information kept on the stack) exist only during the invocation of the function.

Space complexity is the amount of memory used by the algorithm (including the input values to the algorithm) to execute and produce the result.

Space Complexity = Auxiliary Space + Input Space
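As a small illustration of auxiliary space (my own example, not taken from the article), both functions below compute the same sum, but the first keeps only a running total (O(1) auxiliary space) while the second builds an extra prefix list (O(n) auxiliary space):

def sum_constant_space(A):
    # One accumulator, regardless of the size of A: O(1) auxiliary space.
    total = 0
    for x in A:
        total += x
    return total

def sum_linear_space(A):
    # Stores a prefix sum for every element: O(n) auxiliary space.
    prefix = []
    running = 0
    for x in A:
        running += x
        prefix.append(running)
    return prefix[-1] if prefix else 0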

Memory Usage while Execution

  • Instruction Space: The amount of memory used to store the compiled version of instructions.
  • Stack Memory: Sometimes our main function calls a function, and inside this function we call another function. In this situation, the current function's variables are pushed onto the stack before the call to the inner function is made.

For example: if a function A() calls a function B() inside it, then all the variables of A() are stored on the system stack temporarily while B() is called and executed.
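A recursive function makes this easy to see (an illustrative sketch, not from the original text): every pending call stays on the stack until the inner calls return, so the recursive version below needs O(n) stack space, while the iterative one needs only O(1) extra space. (Note that Python's default recursion limit caps such a recursion at roughly a thousand frames.)

def sum_recursive(n):
    # Each call waits for sum_recursive(n - 1), so up to n frames
    # live on the call stack at the same time: O(n) stack space.
    if n == 0:
        return 0
    return n + sum_recursive(n - 1)

def sum_iterative(n):
    # A single loop with one accumulator: O(1) extra space.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total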

Some important algorithms with their worst-case time complexities:

+----------------+-----------------------+
| Algorithm      | Worst time complexity |
+----------------+-----------------------+
| Selection Sort | O(n^2)                |
| Bubble Sort    | O(n^2)                |
| Insertion Sort | O(n^2)                |
| Merge Sort     | O(n log n)            |
| Quick Sort     | O(n^2)                |
+----------------+-----------------------+

Written by Let's Have Coffee & Code it
