
Basic Time and Space Complexity

A Complete Guide for Learners


1. Why Time and Space Complexity Matter

When you write a program, it may:

  • Run fast or slow

  • Use little or a lot of memory

Two programs can solve the same problem, but one might:

  • Finish in 1 second, while another takes 1 hour

  • Use a few kilobytes of memory, while another uses gigabytes

To measure and compare efficiency, computer scientists use:

  • Time Complexity → How fast an algorithm runs

  • Space Complexity → How much memory an algorithm uses

These concepts help us:

  • Choose better algorithms

  • Write scalable programs

  • Predict performance before running code


2. What Is Time Complexity?

2.1 Definition

Time Complexity describes how the running time of an algorithm grows as the input size grows.

It does not measure:

  • Actual seconds or milliseconds

  • Speed of a specific computer

Instead, it measures:

How the number of operations increases with input size


2.2 Input Size (n)

We usually represent input size with n:

  • Number of elements in an array

  • Length of a string

  • Size of a list

  • Number of nodes in a tree

Example:

  • Searching in an array of 10 elements → n = 10

  • Searching in an array of 1,000,000 elements → n = 1,000,000


3. Big-O Notation (Core Idea)

3.1 What Is Big-O?

Big-O notation describes the worst-case growth rate of an algorithm.

It answers:

“How does performance scale as input gets very large?”

Examples:

  • O(1)

  • O(n)

  • O(n²)

  • O(log n)


3.2 Why Worst Case?

We analyze the worst case because:

  • It guarantees performance

  • It avoids surprises

  • It helps in real-world systems


4. Common Time Complexities (From Best to Worst)

4.1 O(1) – Constant Time

The algorithm takes the same amount of time, no matter the input size.

Example:

Accessing an array element:

value = arr[5]

✅ Very fast
✅ Best possible time complexity


4.2 O(n) – Linear Time

Time grows proportionally with input size.

Example:

Finding the maximum value in an array:

max_val = arr[0]
for x in arr:
    if x > max_val:
        max_val = x

  • If n doubles, time doubles


4.3 O(n²) – Quadratic Time

Time grows with the square of input size.

Example:

Nested loops:

for i in range(n):
    for j in range(n):
        print(i, j)

  • n = 10 → 100 operations

  • n = 100 β†’ 10,000 operations

❌ Becomes slow quickly


4.4 O(log n) – Logarithmic Time

The remaining work (the search space) is cut in half at each step.

Example:

Binary Search

search space → half → half → half

  • n = 1,000,000 → ~20 steps

✅ Extremely efficient
✅ Used in searching and trees
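
Binary search can be sketched as follows (the function name and sample list are illustrative):

```python
def binary_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2      # middle of the current search space
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1          # discard the left half
        else:
            hi = mid - 1          # discard the right half
    return -1                     # target not present

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # → 3
```

Each iteration halves the search space, so at most about log₂(n) iterations are needed.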


4.5 O(n log n) – Linearithmic Time

Common in efficient sorting algorithms.

Examples:

  • Merge Sort

  • Quick Sort (average case)

This is often the best achievable for comparison-based sorting.
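
A minimal merge sort sketch illustrates where the n log n comes from: log n levels of splitting, each with O(n) merging work.

```python
def merge_sort(arr):
    """Sort a list with merge sort: O(n log n) time, O(n) extra space."""
    if len(arr) <= 1:                 # a list of 0 or 1 items is already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])      # splitting gives ~log n levels...
    right = merge_sort(arr[mid:])
    merged = []                       # ...and merging does O(n) work per level
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))   # → [1, 2, 5, 7, 9]
```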


4.6 O(2ⁿ) – Exponential Time

Time roughly doubles with each additional input element.

Example:

  • Recursive Fibonacci (naive)

❌ Extremely slow
❌ Not scalable
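
The naive version looks like this; each call spawns two more calls, so the call count grows exponentially:

```python
def fib(n):
    """Naive recursive Fibonacci: roughly O(2^n) calls."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)   # two recursive calls per step → exponential blow-up

print(fib(10))   # → 55; fib(40) already takes noticeably long
```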


5. Rules for Calculating Time Complexity

5.1 Ignore Constants

for i in range(1000):
    print(i)

Still O(1) because 1000 is constant.


5.2 Drop Lower-Order Terms

O(n² + n) → O(n²)
O(n + log n) → O(n)

Only the fastest-growing term matters.


5.3 Loops

  • One loop → O(n)

  • Nested loops → multiply

for i in range(n):
    for j in range(n):
        pass  # body runs n × n = n² times → O(n²)

5.4 Sequential Code

loop 1 → O(n)
loop 2 → O(n)

Total → O(2n) → O(n)
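
A small sketch of two sequential loops (the function name is illustrative):

```python
def sum_and_max(arr):
    """Two sequential O(n) passes: O(n) + O(n) = O(2n) → O(n)."""
    total = 0
    for x in arr:          # loop 1: O(n)
        total += x
    largest = arr[0]
    for x in arr:          # loop 2: O(n)
        if x > largest:
            largest = x
    return total, largest

print(sum_and_max([3, 1, 4, 1, 5]))   # → (14, 5)
```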


6. What Is Space Complexity?

6.1 Definition

Space Complexity measures how much extra memory an algorithm uses as input size grows.

It includes:

  • Variables

  • Data structures

  • Recursive call stack


6.2 Input Space vs Auxiliary Space

  • Input space → memory for the input data

  • Auxiliary space → extra memory used by the algorithm

We usually analyze auxiliary space.
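
As an illustration of the distinction, reversing a list can use O(n) or O(1) auxiliary space depending on the approach (both functions are sketches):

```python
def reverse_copy(arr):
    """Builds a new list the same size as the input: O(n) auxiliary space."""
    return arr[::-1]

def reverse_in_place(arr):
    """Swaps elements within the input list itself: O(1) auxiliary space."""
    i, j = 0, len(arr) - 1
    while i < j:
        arr[i], arr[j] = arr[j], arr[i]
        i, j = i + 1, j - 1
    return arr

print(reverse_copy([1, 2, 3]))       # → [3, 2, 1]
print(reverse_in_place([1, 2, 3]))   # → [3, 2, 1]
```

Both are O(n) in time; only the auxiliary space differs.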


7. Common Space Complexities

7.1 O(1) – Constant Space

Uses a fixed amount of memory.

sum = 0
max_val = 0

7.2 O(n) – Linear Space

Memory grows with input size.

new_array = []
for x in arr:
    new_array.append(x)

7.3 O(n²) – Quadratic Space

Example:

  • 2D matrix

  • Table of size n × n
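
For example, allocating an n × n table (here for a hypothetical pairwise-distance use case):

```python
n = 4
# An n × n table of zeros: n rows of n entries each → O(n²) space
table = [[0] * n for _ in range(n)]
print(len(table), len(table[0]))   # → 4 4
```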


7.4 Recursive Space

Each recursive call uses stack memory.

def recurse(n):
    if n == 0:        # base case stops the recursion
        return
    recurse(n - 1)

Space complexity → O(n) (call stack depth)


8. Time vs Space Trade-Off

Often, you can:

  • Use more memory to run faster

  • Use less memory but run slower

Example:

  • Storing results (memoization)

  • Avoiding recomputation

This is called a time–space trade-off.
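
Memoized Fibonacci is a classic example of this trade-off: caching results costs O(n) memory but cuts the time from exponential to linear. One common way to sketch it in Python is with `functools.lru_cache`:

```python
from functools import lru_cache

@lru_cache(maxsize=None)   # cache every result: O(n) extra memory
def fib(n):
    """Memoized Fibonacci: O(n) time instead of O(2^n)."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)   # each subproblem is computed only once

print(fib(100))   # returns instantly; the naive version would never finish
```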


9. Comparing Algorithms Using Complexity

Example: Searching

Algorithm        Time Complexity
Linear Search    O(n)
Binary Search    O(log n)

Binary search is faster, but:

  • Requires sorted data

  • May use extra space
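
For comparison, a linear search sketch: it needs no sorted data, but may have to check all n elements.

```python
def linear_search(arr, target):
    """Check elements one by one: O(n) time, works on unsorted data."""
    for i, x in enumerate(arr):
        if x == target:
            return i
    return -1   # target not present

print(linear_search([4, 2, 7], 7))   # → 2
```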


10. Common Beginner Mistakes

❌ Thinking Big-O Measures Exact Time

Big-O measures growth, not seconds.

❌ Ignoring Worst Case

Worst case is what matters most.

❌ Over-optimizing Too Early

Clarity matters before performance.


11. How to Build Intuition

Think Like This:

  • “What happens if input becomes 10× larger?”

  • “How many times does this loop run?”

  • “Does memory grow with input?”


12. Summary

Time Complexity:

  • Measures speed

  • Focuses on growth

  • Uses Big-O notation

Space Complexity:

  • Measures memory usage

  • Includes variables and recursion

  • Important for large inputs