Basic Time and Space Complexity
A Complete Guide for Learners
1. Why Time and Space Complexity Matter
When you write a program, it may:
- Run fast or slow
- Use little or a lot of memory

Two programs can solve the same problem, but one might:

- Finish in 1 second, while another takes 1 hour
- Use a few kilobytes of memory, while another uses gigabytes

To measure and compare efficiency, computer scientists use:

- Time Complexity → how fast an algorithm runs
- Space Complexity → how much memory an algorithm uses

These concepts help us:

- Choose better algorithms
- Write scalable programs
- Predict performance before running code
2. What Is Time Complexity?
2.1 Definition
Time Complexity describes how the running time of an algorithm grows as the input size grows.
It does not measure:
- Actual seconds or milliseconds
- The speed of a specific computer

Instead, it measures how the number of operations grows with input size.
2.2 Input Size (n)
We usually represent input size with n:
- Number of elements in an array
- Length of a string
- Size of a list
- Number of nodes in a tree

Example:

- Searching in an array of 10 elements → n = 10
- Searching in an array of 1,000,000 elements → n = 1,000,000
3. Big-O Notation (Core Idea)
3.1 What Is Big-O?
Big-O notation describes the worst-case growth rate of an algorithm.
It answers:
"How does performance scale as the input gets very large?"

Examples:

- O(1)
- O(n)
- O(n²)
- O(log n)
3.2 Why Worst Case?
We analyze the worst case because:
- It guarantees an upper bound on performance
- It avoids surprises
- It helps when designing real-world systems
4. Common Time Complexities (From Best to Worst)
4.1 O(1) – Constant Time
The algorithm takes the same amount of time, no matter the input size.
Example: accessing an array element by index.

✔ Very fast
✔ Best possible time complexity
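As a minimal Python sketch (the function name is illustrative):

```python
def first_element(arr):
    # A single indexing operation, regardless of len(arr): O(1)
    return arr[0]

# Takes the same one step whether the list has 3 or a million elements
first_element([7, 8, 9])
first_element(list(range(1_000_000)))
```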
4.2 O(n) – Linear Time
Time grows proportionally with input size.
Example: finding the maximum value in an array.

- If n doubles, the time doubles
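A simple Python sketch of the maximum-finding example:

```python
def find_max(arr):
    # Visits each of the n elements exactly once: O(n) time
    largest = arr[0]
    for value in arr:
        if value > largest:
            largest = value
    return largest
```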
4.3 O(n²) – Quadratic Time
Time grows with the square of the input size.
Example: nested loops.

- n = 10 → 100 operations
- n = 100 → 10,000 operations

⚠ Becomes slow quickly
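A typical nested-loop pattern, sketched in Python: comparing every element with every other element.

```python
def has_duplicate(arr):
    # For each element, scan the rest of the array:
    # roughly n * n comparisons in the worst case -> O(n^2)
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):
            if arr[i] == arr[j]:
                return True
    return False
```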
4.4 O(log n) – Logarithmic Time
The input is cut in half at each step.
Example: binary search.

- n = 1,000,000 → ~20 steps

✔ Extremely efficient
✔ Used in searching and trees
4.5 O(n log n) – Linearithmic Time
Common in efficient sorting algorithms.
Examples:

- Merge Sort
- Quick Sort (average case)

This is often the best achievable for comparison-based sorting.
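A compact merge sort sketch in Python, showing where the n log n comes from: the array is split log n times, and each level of splitting does O(n) merge work.

```python
def merge_sort(arr):
    # Base case: a list of 0 or 1 elements is already sorted
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # sort each half recursively
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves in O(n)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```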
4.6 O(2ⁿ) – Exponential Time
Time roughly doubles with each additional input element.
Example:

- Recursive Fibonacci (naive)

❌ Extremely slow
❌ Not scalable
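The naive recursive Fibonacci, sketched in Python: each call spawns two more calls, so the number of calls grows roughly like 2ⁿ.

```python
def fib(n):
    # Two recursive calls per invocation -> O(2^n) calls for input n
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```

Even fib(40) already takes noticeable time with this approach.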
5. Rules for Calculating Time Complexity
5.1 Ignore Constants
A loop that always runs 1000 times is still O(1), because 1000 is a constant that does not grow with n. Likewise, O(3n) simplifies to O(n).
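A quick Python sketch of a constant-time loop (the function is illustrative):

```python
def constant_work(arr):
    # Runs 1000 times no matter how big arr is: still O(1)
    total = 0
    for _ in range(1000):
        total += 1
    return total
```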
5.2 Drop Lower-Order Terms
O(n² + n) → O(n²)
O(n + log n) → O(n)
Only the fastest-growing term matters.
5.3 Loops
- One loop → O(n)
- Nested loops → multiply their bounds (two nested loops over n → O(n²))
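Counting the iterations directly, as a small Python sketch, makes the multiplication visible:

```python
def single_loop_ops(n):
    # One loop over n: O(n) iterations
    ops = 0
    for _ in range(n):
        ops += 1
    return ops

def nested_loop_ops(n):
    # A loop inside a loop: n * n = O(n^2) iterations
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops
```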
5.4 Sequential Code
Loops that run one after another add up rather than multiply: O(n) + O(n) = O(2n) → O(n).
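A Python sketch of two sequential passes (the function name is illustrative):

```python
def sum_and_max(arr):
    # First pass: O(n)
    total = 0
    for v in arr:
        total += v
    # Second pass: another O(n); together O(2n) -> O(n)
    largest = arr[0]
    for v in arr:
        largest = max(largest, v)
    return total, largest
```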
6. What Is Space Complexity?
6.1 Definition
Space Complexity measures how much extra memory an algorithm uses as input size grows.
It includes:
- Variables
- Data structures
- The recursive call stack
6.2 Input Space vs Auxiliary Space
- Input space → memory for the input data
- Auxiliary space → extra memory used by the algorithm
We usually analyze auxiliary space.
7. Common Space Complexities
7.1 O(1) – Constant Space
Uses a fixed amount of memory regardless of input size.
7.2 O(n) – Linear Space
Memory grows with input size.
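For example, building a new list the same size as the input uses O(n) auxiliary space (a Python sketch):

```python
def squares(arr):
    # result grows to len(arr) elements: O(n) auxiliary space
    result = []
    for v in arr:
        result.append(v * v)
    return result
```

By contrast, find-the-maximum keeps only one variable, so it is O(1) space.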
7.3 O(n²) – Quadratic Space
Example:

- A 2D matrix
- A table of size n × n
7.4 Recursive Space
Each recursive call uses stack memory.
A recursion that goes n levels deep has space complexity O(n), because of the call stack.
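A Python sketch: summing a list recursively creates one stack frame per element.

```python
def recursive_sum(arr, i=0):
    # One stack frame per element: the call stack reaches depth n -> O(n) space
    if i == len(arr):
        return 0
    return arr[i] + recursive_sum(arr, i + 1)
```

An iterative loop computes the same sum with O(1) extra space.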
8. Time vs Space Trade-Off
Often, you can:
- Use more memory to run faster
- Use less memory but run slower

Example:

- Storing intermediate results (memoization)
- Avoiding recomputation

This is called a time–space trade-off.
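Memoized Fibonacci, sketched in Python, is a classic instance of this trade-off: spending O(n) extra memory on a cache cuts the time from O(2ⁿ) down to O(n).

```python
def fib_memo(n, cache=None):
    # cache stores already-computed results: O(n) extra space, O(n) time
    if cache is None:
        cache = {}
    if n < 2:
        return n
    if n not in cache:
        cache[n] = fib_memo(n - 1, cache) + fib_memo(n - 2, cache)
    return cache[n]
```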
9. Comparing Algorithms Using Complexity
Example: Searching
| Algorithm | Time Complexity |
|---|---|
| Linear Search | O(n) |
| Binary Search | O(log n) |
Binary search is faster, but:
- Requires sorted data
- May use extra space
10. Common Beginner Mistakes
❌ Thinking Big-O Measures Exact Time
Big-O measures growth, not seconds.
❌ Ignoring the Worst Case
The worst case is what matters most.
❌ Over-optimizing Too Early
Clarity matters before performance.
11. How to Build Intuition
Think Like This:
- "What happens if the input becomes 10× larger?"
- "How many times does this loop run?"
- "Does memory grow with the input?"
12. Summary
Time Complexity:

- Measures speed
- Focuses on growth
- Uses Big-O notation

Space Complexity:

- Measures memory usage
- Includes variables and recursion
- Important for large inputs