Understanding Time Complexity Through a Simple Experiment (Big O & Θ Made Intuitive)
March 27, 2026 at 9:00 am #6284
If you’ve ever wondered why we create lists and then sum them when studying algorithms, you’re not alone. This learning post walks you step-by-step through the intuition behind time complexity, using one of the simplest examples possible.
🔹 The Goal
Understand how an algorithm’s runtime grows as input size increases.
Not just:
- “Is it fast?”
But:
- “What happens when input becomes 10x, 100x, 1000x larger?”
🔹 Step 1: Define an Algorithm
We start with a simple function:
```python
def sum_of(L):
    total = 0
    for x in L:
        total = total + x
    return total
```
👉 What this does:
- Takes a list L
- Iterates through every element
- Adds them together
🔹 Step 2: Create Inputs of Different Sizes
```python
L_N = [1]
for i in range(6):
    L_N.append(L_N[-1] * 10)
```
This generates:
```python
[1, 10, 100, 1000, 10000, ...]
```
Then:
```python
for N in L_N:
    L = list(range(N))
    sum_of(L)
```
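To actually watch the runtime grow, one option is to time each call. This is a minimal sketch, assuming the `sum_of` and `L_N` definitions above; it uses the standard-library `time.perf_counter`, and the exact numbers printed will vary by machine:

```python
import time

def sum_of(L):
    total = 0
    for x in L:
        total = total + x
    return total

# Input sizes: 1, 10, 100, ..., 1000000
L_N = [1]
for i in range(6):
    L_N.append(L_N[-1] * 10)

# Time sum_of on each input size
for N in L_N:
    L = list(range(N))
    start = time.perf_counter()
    sum_of(L)
    elapsed = time.perf_counter() - start
    print(f"N = {N:>8}: {elapsed:.6f} s")
```

On a typical run, the times for small N are dominated by noise, but from N = 10000 upward each 10× jump in N makes the time roughly 10× longer.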
🔹 The Big Question 🤔
Why do we:
- Create lists of size 10, 100, 1000…
- Then compute their sum?
Why not just use the numbers directly?
🔹 Key Insight: Numbers vs Work
Numbers (10, 100, 1000) define size—but they don’t create work.
Example:
```python
for N in [10, 100, 1000]:
    print(N)
```
👉 This runs almost instantly
👉 Work does NOT grow with N
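One way to make this concrete is to count loop iterations directly. The counter variables below are my addition for illustration, not part of the original experiment:

```python
# Looping over the *numbers* themselves: always 3 iterations,
# no matter how large the numbers are.
iterations_over_numbers = 0
for N in [10, 100, 1000]:
    iterations_over_numbers += 1

# Looping over a *list of size N*: N iterations.
iterations_over_list = 0
for x in list(range(1000)):
    iterations_over_list += 1

print(iterations_over_numbers)  # 3    -- work did not grow with N
print(iterations_over_list)     # 1000 -- work grew with N
```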
🔹 Where does the “work” come from?
From this loop:
```python
for x in L:
    total = total + x
```
👉 This loop runs:
- 10 times (if N = 10)
- 100 times (if N = 100)
- 1000 times (if N = 1000)
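This can be verified by instrumenting the loop with a step counter. `sum_of_counted` is a hypothetical variant I'm introducing for illustration:

```python
def sum_of_counted(L):
    # Same loop as sum_of, but also counts how many
    # additions the loop performs.
    total = 0
    steps = 0
    for x in L:
        total = total + x
        steps += 1
    return total, steps

for N in [10, 100, 1000]:
    _, steps = sum_of_counted(list(range(N)))
    print(N, steps)  # steps equals N every time
```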
🔹 Connecting Everything 🔗
| Component | Role |
| --- | --- |
| `N` | Defines input size |
| `L = list(range(N))` | Creates data of size N |
| `sum_of(L)` | Processes each element |
🔹 Flow of the Experiment
N → creates list L → algorithm processes L → work measured
🔹 What We Observe
| N (size) | Number of steps |
| --- | --- |
| 10 | 10 |
| 100 | 100 |
| 1000 | 1000 |

👉 Work grows proportionally with N
🔹 Final Result
✅ Time Complexity:
- Big O: O(n)
- Theta: Θ(n)
💡 Meaning:
If input doubles → work doubles
If input becomes 10× → work becomes ~10×
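The doubling claim can be checked by counting operations rather than wall-clock time (counts are exact, while timings are noisy). This is a small sketch; `steps_for` is my helper name, not from the post:

```python
def steps_for(N):
    # Number of loop iterations sum_of performs on a list of size N.
    steps = 0
    for x in list(range(N)):
        steps += 1
    return steps

assert steps_for(2000) == 2 * steps_for(1000)    # double input -> double work
assert steps_for(10000) == 10 * steps_for(1000)  # 10x input -> 10x work
```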
🔹 Why sum_of was chosen
It’s just a teaching tool:
- Simple to understand
- Clearly shows “one operation per element”
- Easy to measure
👉 Any similar loop would work
🔹 What if we didn’t use sum_of?
Then:
- No repeated operations
- No scaling behavior
- No way to observe complexity
❌ No learning
🔹 Real-World Analogy
Imagine measuring how long it takes to:
👉 Pack boxes
| Boxes | Work |
| --- | --- |
| 10 | Pack 10 boxes |
| 100 | Pack 100 boxes |
| 1000 | Pack 1000 boxes |

👉 You must do the work, not just count boxes
🔹 Key Takeaways
- Input size alone doesn’t create complexity
- Algorithms create work based on input
- Complexity is about how work grows with size
🔹 One-Line Summary
We vary input size to see how much work the algorithm does—and that reveals its time complexity.
🔹 What to Learn Next 🚀
To deepen your understanding, explore:
- Difference between O(n), O(log n), O(n²)
- Nested loops vs single loops
- Real-world examples like search and sorting
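As a preview of that next step, here is a hedged sketch contrasting a single loop (O(n)) with a nested loop (O(n²)) using the same step-counting idea; the function names are mine:

```python
def linear_steps(N):
    # One operation per element: O(n)
    steps = 0
    for i in range(N):
        steps += 1
    return steps

def quadratic_steps(N):
    # One operation per *pair* of elements: O(n^2)
    steps = 0
    for i in range(N):
        for j in range(N):
            steps += 1
    return steps

for N in [10, 100]:
    print(N, linear_steps(N), quadratic_steps(N))
```

When N grows 10× (from 10 to 100), the linear work grows 10× (10 → 100) but the quadratic work grows 100× (100 → 10000), which is why nested loops become expensive so quickly.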
