Welcome back to yet another blog post. Hope you are doing well. In this post I have written down the time and space complexity of different data structures and algorithms for reference.

log_{2}n < n < n log_{2}n < n^{2} < 2^{n} < n!

**log_{2}n** is one of the best while **n!** is one of the worst.

## Table of Contents

- Time and Space complexity
- Growth rate
- Common growth rates
- Compare growth rates
- Determine time complexity

## Time and Space complexity

Complexities below are ranked from best to worst:

BEST | BETTER | OK | BAD | WORST

### Common Data Structures

**Worst-case** complexity

| Data Structure | Insert (Time) | Delete (Time) | Search (Time) | Access (Time) | Space |
|---|---|---|---|---|---|
| Array | O(n) | O(n) | O(n) | O(1) | O(n) |
| Stack | O(1) | O(1) | O(n) | O(n) | O(n) |
| Queue | O(1) | O(1) | O(n) | O(n) | O(n) |
| Linked List | O(1) | O(1) | O(n) | O(n) | O(n) |
| Doubly Linked List | O(1) | O(1) | O(n) | O(n) | O(n) |
| Binary Search Tree | O(n) | O(n) | O(n) | O(n) | O(n) |
| AVL Tree | O(log n) | O(log n) | O(log n) | O(log n) | O(n) |
| B Tree | O(log n) | O(log n) | O(log n) | O(log n) | O(n) |
| Red Black Tree | O(log n) | O(log n) | O(log n) | O(log n) | O(n) |
| Hash Table | O(n) | O(n) | O(n) | N/A | O(n) |
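As a quick illustration of the stack row above, here is a minimal sketch (the `Stack` class and its method names are illustrative, not from any particular library) showing why push and pop are O(1) while search is O(n):

```javascript
// Minimal stack sketch backed by a JS array:
// push/pop touch only the top element (amortized O(1)),
// while search must scan the whole array (O(n)).
class Stack {
  constructor() { this.items = []; }
  push(x) { this.items.push(x); }             // O(1) amortized
  pop() { return this.items.pop(); }          // O(1)
  search(x) { return this.items.indexOf(x); } // O(n) linear scan
}

const s = new Stack();
s.push(1); s.push(2); s.push(3);
console.log(s.pop());     // 3 (last in, first out)
console.log(s.search(1)); // 0 (found after scanning)
```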

### Array Sorting Algorithms

| Algorithm | Best (Time) | Average (Time) | Worst (Time) | Worst (Space) |
|---|---|---|---|---|
| Bubble sort | Ω(n) | Θ(n^{2}) | O(n^{2}) | O(1) |
| Selection sort | Ω(n^{2}) | Θ(n^{2}) | O(n^{2}) | O(1) |
| Insertion sort | Ω(n) | Θ(n^{2}) | O(n^{2}) | O(1) |
| Merge sort | Ω(n log n) | Θ(n log n) | O(n log n) | O(n) |
| Heap sort | Ω(n log n) | Θ(n log n) | O(n log n) | O(1) |
| Quick sort | Ω(n log n) | Θ(n log n) | O(n^{2}) | O(log n) |
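To tie the table to code, here is a minimal merge sort sketch: halving the array gives the log n levels, and each level merges n elements, which is where O(n log n) time and the O(n) auxiliary space come from.

```javascript
// Merge sort: split in half (log n levels), merge n elements per level.
function mergeSort(arr) {
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  return merge(mergeSort(arr.slice(0, mid)), mergeSort(arr.slice(mid)));
}

// Merge two sorted arrays into one sorted array in O(n) time.
function merge(left, right) {
  const out = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    out.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return out.concat(left.slice(i)).concat(right.slice(j));
}

console.log(mergeSort([5, 2, 9, 1, 5, 6])); // [1, 2, 5, 5, 6, 9]
```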

## Growth rate

It is the rate at which the cost of the algorithm grows as the size of its input increases.

For example, consider the following code which prints the characters of a string.

```javascript
const str = "Hello World!";
for (let i = 0; i < str.length; i++) {
  console.log(str[i]);
}
```

There are 12 characters in total in the above string `str`.

Now, let us look at the operations being performed in the above code snippet.

- The variable `str` is assigned the string value.
- The loop variable `i` is initialized with the value `0`.
- The `for` loop condition is checked.
- If the condition holds, the character is printed.
- After each iteration the variable `i` is incremented by 1.

| Operation | Count | Note |
|---|---|---|
| `str = "Hello World!"` | 1 | |
| `i = 0` | 1 | |
| `i < str.length` | 13 | On the last, i.e., 13th, check the loop exits. |
| `console.log(str[i])` | 12 | |
| `i++` | 12 | |

The total number of operations is 1+1+13+12+12, i.e., 39.

If there are **n** characters in the string `str`, then the growth rate of the above code is the following.

| Operation | Count | Note |
|---|---|---|
| `str` = string with n characters | 1 | |
| `i = 0` | 1 | |
| `i < str.length` | n+1 | On the last, i.e., (n+1)th, check the loop exits. |
| `console.log(str[i])` | n | |
| `i++` | n | |

So the growth rate of the above code is 1+1+(n+1)+n+n, i.e., **3n+3**.
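The counting above can be checked empirically. The sketch below uses a hypothetical `countOperations` helper that tallies every assignment, comparison, print, and increment the loop performs:

```javascript
// Tally every operation the loop performs, mirroring the table above:
// 1 + 1 + (n+1) + n + n = 3n + 3.
function countOperations(str) {
  let ops = 0;
  ops++;            // str is assigned
  let i = 0; ops++; // i = 0
  while (true) {
    ops++;          // condition check: i < str.length
    if (!(i < str.length)) break;
    ops++;          // the character would be printed here
    i++; ops++;     // i++
  }
  return ops;
}

console.log(countOperations("Hello World!")); // 39 = 3*12 + 3
```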

## Common growth rates

| log_{2}n | n | n log_{2}n | n^{2} | 2^{n} | n! |
|---|---|---|---|---|---|
| 0.00 | 1 | 0.00 | 1 | 2 | 1 |
| 1.00 | 2 | 2.00 | 4 | 4 | 2 |
| 1.58 | 3 | 4.75 | 9 | 8 | 6 |
| 2.00 | 4 | 8.00 | 16 | 16 | 24 |
| 2.32 | 5 | 11.61 | 25 | 32 | 120 |
| 3.32 | 10 | 33.22 | 100 | 1024 | 3628800 |
| 4.32 | 20 | 86.44 | 400 | 1048576 | 2.4329x10^{18} |
| 4.91 | 30 | 147.21 | 900 | 1073741824 | 2.65253x10^{32} |

From the above growth rate table we can safely say that **log_{2}n** is the best while **n!** is the worst.

## Compare growth rates

In this section we will compare growth rates and check which of the given functions grows faster.

To compare growth rates we can follow these steps.

- Remove common factors from both functions.
- Take the **log** of both functions.
- Replace the input **n** with a very large value.
- Compare the values.

### Exercise 1

| f(n) = n 3^{n} | g(n) = n 2^{n} | Note |
|---|---|---|
| n 3^{n} | n 2^{n} | |
| 3^{n} | 2^{n} | removed common n |
| log_{2}3^{n} | log_{2}2^{n} | log of both functions |
| n log_{2}3 | n log_{2}2 | |
| log_{2}3 | log_{2}2 | removed common n |
| 1.58 | 1 | |

So, f(n) > g(n)

### Exercise 2

| f(n) = n^{2} | g(n) = 2^{n} | Note |
|---|---|---|
| n^{2} | 2^{n} | |
| log_{2}n^{2} | log_{2}2^{n} | log of both functions |
| 2 log_{2}n | n log_{2}2 | |
| 2 log_{2}n | n | |
| 2 log_{2}2^{10} | 2^{10} | replacing n with 2^{10} |
| 2x10 log_{2}2 | 2^{10} | |
| 20 | 1024 | |

So, f(n) < g(n)
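A quick numeric spot-check of the table above, evaluating both log-transformed sides at n = 2^{10}:

```javascript
// Compare log2(n^2) = 2*log2(n) against log2(2^n) = n, at n = 2^10.
const n = 2 ** 10;
const fSide = 2 * Math.log2(n); // 2 * 10 = 20
const gSide = n;                // 1024
console.log(fSide, gSide, fSide < gSide); // 20 1024 true
```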

### Exercise 3

| f(n) = n^{2} | g(n) = (log_{2}n)^{24} | Note |
|---|---|---|
| n^{2} | (log_{2}n)^{24} | |
| log_{2}n^{2} | log_{2}(log_{2}n)^{24} | log of both functions |
| 2 log_{2}n | 24 log_{2}(log_{2}n) | |
| 2 log_{2}2^{16} | 24 log_{2}(log_{2}2^{16}) | replacing n with 2^{16} |
| 2x16 log_{2}2 | 24 log_{2}(16 log_{2}2) | |
| 2x16 | 24 log_{2}2^{4} | |
| 2x16 | 24x4 log_{2}2 | |
| 32 | 96 | |

So, for a smaller value of **n** we get f(n) < g(n).

Now, let us try a higher value of **n**.

| f(n) = n^{2} | g(n) = (log_{2}n)^{24} | Note |
|---|---|---|
| n^{2} | (log_{2}n)^{24} | |
| log_{2}n^{2} | log_{2}(log_{2}n)^{24} | log of both functions |
| 2 log_{2}n | 24 log_{2}(log_{2}n) | |
| 2 log_{2}2^{1024} | 24 log_{2}(log_{2}2^{1024}) | replacing n with 2^{1024} |
| 2x1024 log_{2}2 | 24 log_{2}(1024 log_{2}2) | |
| 2x1024 | 24 log_{2}2^{10} | |
| 2x1024 | 24x10 log_{2}2 | |
| 2048 | 240 | |

So, for larger values of **n** we get f(n) > g(n).
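Both crossover checks can be reproduced numerically. With n = 2^{k}, the log-transformed sides become 2k and 24 log_{2}k, matching the tables above:

```javascript
// For n = 2^k: log2(f(n)) = 2k, and log2(g(n)) = 24 * log2(k).
function compareAt(k) {
  return [2 * k, 24 * Math.log2(k)];
}

console.log(compareAt(16));   // [32, 96]    -> f(n) < g(n) for small n
console.log(compareAt(1024)); // [2048, 240] -> f(n) > g(n) for large n
```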

## Determine time complexity

In this section we will find the time complexity of some of the code snippets.

### Example 1: loop

```javascript
for (let i = 0; i < n; i++) {
  console.log("Hello");
}
```

This code will execute **n** times so its time complexity is **O(n)**.
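This can be verified by counting the iterations for a sample n:

```javascript
// Count how many times the loop body runs for a given n.
function countIterations(n) {
  let count = 0;
  for (let i = 0; i < n; i++) count++;
  return count;
}

console.log(countIterations(10)); // 10, i.e., exactly n
```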

### Example 2: nested loop

```javascript
for (let i = 0; i < n; i++) {
  for (let j = 0; j < n; j++) {
    console.log("Hello");
  }
}
```

The outer loop executes **n** times, and for each outer-loop iteration the inner loop executes **n** times. Therefore the total number of executions is **n^{2}** and the time complexity is **O(n^{2})**.

### Example 3: nested loop

```javascript
for (let i = 0; i < n; i++) {
  for (let j = 0; j < i; j++) {
    console.log("Hello");
  }
}
```

The outer loop will execute **n** times.

During the first iteration, when i=0, the inner loop does not execute. After that, for every value of i, the inner loop executes i times.

So, the total number of executions is 0+1+...+(n-1), i.e., **n(n-1)/2** or **(n^{2}-n)/2**.

Therefore, the time complexity is **O(n ^{2})**.
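Counting the inner-loop executions empirically confirms the quadratic growth:

```javascript
// Count inner-loop executions: 0 + 1 + ... + (n-1) = n(n-1)/2.
function countInner(n) {
  let count = 0;
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < i; j++) count++;
  }
  return count;
}

console.log(countInner(10)); // 45 = 10*9/2
```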

### Example 4: logarithmic

```javascript
for (let i = 1; i < n; i *= 2) {
  console.log("Hello");
}
```

In the above code, the value of **i** increases as follows.

i = 1, 2, 4, 8, 16, ..., n

Or,

i = 2^{0}, 2^{1}, 2^{2}, ..., 2^{k}

The loop will end when **2 ^{k} = n**.

Taking base 2 log of both sides we get **log _{2}2^{k} = log_{2}n**

Or,

**k = log _{2}n**

The `console.log` statement will be executed log_{2}n times.

So, time complexity of the above code snippet is **O(log _{2}n)**
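Counting the iterations for a power-of-two n confirms the logarithmic bound:

```javascript
// i doubles each pass: 1, 2, 4, ..., n/2, so the body runs log2(n) times.
function countDoubling(n) {
  let count = 0;
  for (let i = 1; i < n; i *= 2) count++;
  return count;
}

console.log(countDoubling(1024)); // 10 = log2(1024)
```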