
Recursion — The Complete Guide

Master recursion from first principles. Learn to think recursively, trust the recursive leap, identify base cases instantly, and confidently solve interview questions that require breaking problems into smaller subproblems.

45 min read · 10 sections
01

Intuition from Scratch

Why does this pattern exist?

Some problems have a beautiful property: the solution to the big problem is built from solutions to smaller versions of the exact same problem. Computing factorial(5)? That's just 5 × factorial(4). Searching a binary tree? Check the root, then search the left subtree and the right subtree — each of which is itself a smaller tree. This self-similar structure is everywhere in computer science, and recursion is the natural way to express it.

A recursive function calls itself with a smaller input, trusting that the smaller call will return the correct answer. It keeps breaking the problem down until it hits a base case — a version so simple it can be answered directly without any further calls. Then the answers bubble back up, combining at each level until the original problem is solved.
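That break-down-then-build-up loop is easiest to see in code. A minimal factorial sketch:

```typescript
// factorial(n) = n * factorial(n - 1), with factorial(0) = 1
function factorial(n: number): number {
  if (n <= 1) return 1;        // base case: answered directly, no further calls
  return n * factorial(n - 1); // recursive case: trust the smaller call
}
```

Calling `factorial(5)` breaks down to `factorial(1)`, then the answers bubble back up: 1, 2, 6, 24, 120.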

Recursion isn't just a technique — it's a way of thinking. Once you learn to think recursively, you can tackle trees, graphs, backtracking, divide and conquer, and dynamic programming. All of these are built on the recursive foundation. Master recursion, and you unlock the entire upper tier of DSA.

Analogies to Build Intuition

🪆

Russian Nesting Dolls (Matryoshka)

Open the biggest doll — there's a smaller one inside. Open that — there's an even smaller one. Keep going until you reach the tiniest doll that doesn't open (the base case). Then you put them back together, each doll wrapping the previous one. That's recursion: break down until trivial, then build back up.

🪞

Two Mirrors Facing Each Other

Stand between two mirrors and you see infinite reflections — each one a smaller copy of the same scene. Recursion is like that: the function sees a smaller version of itself. But unlike mirrors, recursion has a base case that stops the infinite reflection.

📋

The Delegation Manager

A manager gets a task: 'Count all employees in the company.' Instead of counting everyone, they ask each direct report: 'Count all employees under you.' Each report does the same — delegates to their reports. The person with no reports (base case) says '1 — just me.' Answers flow back up and get summed.

What kind of problems does it solve?

Recursion is your go-to when:

  • The problem has self-similar substructure — the big problem is made of smaller copies of itself
  • You're working with trees or graphs — naturally recursive data structures
  • You need to explore all possibilities — permutations, combinations, subsets (backtracking)
  • You can divide the problem in half — merge sort, binary search, quicksort
  • The problem has a mathematical recurrence — Fibonacci, power, factorial
  • You need to reverse or undo something — reverse a linked list, reverse a string

The core insight

Recursion is about trust. You solve the smallest case directly (base case), then assume the recursive call works correctly for a smaller input (the "recursive leap of faith"). Your only job is to combine the result of the smaller call with the current element to produce the answer for the current level. Don't trace every call in your head — trust the recursion.

02

Pattern Recognition

Recognizing when recursion is the right approach is the most important skill. Here are the signals to look for and the traps to avoid.

Keywords & Signals in the Problem Statement

Use Recursion when you see...

  • "Generate all permutations / combinations / subsets"
  • "Traverse a tree" — inorder, preorder, postorder
  • "Depth-first search" on a tree or graph
  • "Divide and conquer" — merge sort, quicksort
  • "Check if a tree is balanced / symmetric / valid BST"
  • "Flatten a nested structure" — nested lists, trees to lists
  • "Compute power(x, n)" or "compute factorial"
  • "Reverse a linked list" or "reverse a string"
  • The data structure is inherently recursive (tree, linked list, graph)
  • The problem can be defined as f(n) in terms of f(n-1) or f(n/2)

Don't use Recursion when...

  • A simple loop does the job — don't over-engineer with recursion
  • The recursion depth could be very large (n > 10⁴) — risk stack overflow
  • The problem has overlapping subproblems — use DP with memoization instead of plain recursion
  • You need O(1) space — recursion uses O(depth) stack space
  • The problem is purely iterative (sliding window, two pointers)
  • Tail recursion isn't optimized in your language — convert to iteration

Recursion vs. Similar Patterns

Pattern | When to Use | Key Difference
Recursion | Problem has self-similar substructure; trees, graphs, generate all possibilities | Function calls itself; uses call stack; natural for tree/graph traversal
Iteration | Simple loops, sliding window, two pointers, linear scans | Uses explicit loops; O(1) stack space; often faster in practice
Dynamic Programming | Recursion with overlapping subproblems (same subproblem solved multiple times) | Recursion + memoization or bottom-up table; avoids redundant computation
Backtracking | Generate all valid configurations; constraint satisfaction | Recursion + choose/explore/unchoose; prunes invalid branches early
Divide & Conquer | Split problem in half, solve halves, combine results | A specific form of recursion; merge sort, quicksort, binary search

The recursion question to always ask

When you see a problem, ask: "Can I define the answer for input of size n in terms of the answer for a smaller input?" If yes, you have a recurrence relation, and recursion is the natural approach. Then ask: "What's the smallest input I can solve directly?" That's your base case.

03

Brute Force → Optimized Thinking

Let's walk through the thinking evolution using a classic problem: Power(x, n) — compute x raised to the power n.

1

Brute Force — Multiply n Times

O(n) time · O(1) space

Multiply x by itself n times in a loop. Simple and correct, but painfully slow when n is huge (e.g., 2³¹). We're doing n multiplications when we could do far fewer.

Brute Force (TypeScript)
function powBrute(x: number, n: number): number {
  if (n < 0) { x = 1 / x; n = -n; }
  let result = 1;
  for (let i = 0; i < n; i++) {
    result *= x;
  }
  return result;
}
// Problem: O(n) multiplications.
// For n = 2^31, that's ~2 billion operations. Way too slow.
2

Optimal — Recursive Fast Power (Exponentiation by Squaring)

O(log n) time · O(log n) space

The key insight: x^n = (x^(n/2))² when n is even, and x × (x^(n/2))² when n is odd. Each recursive call halves n, so we only need log(n) multiplications instead of n.

Fast Power — Recursive (TypeScript)
function myPow(x: number, n: number): number {
  if (n < 0) return 1 / myPow(x, -n);
  if (n === 0) return 1;           // base case
  if (n === 1) return x;           // base case

  const half = myPow(x, Math.floor(n / 2));

  if (n % 2 === 0) {
    return half * half;            // x^n = (x^(n/2))²
  } else {
    return half * half * x;        // x^n = (x^(n/2))² × x
  }
}
// Each call halves n → O(log n) calls → O(log n) multiplications.
// Stack depth: O(log n). For n = 2^31, that's only ~31 calls!

Why Does the Optimization Work?

1

Halving the problem at each step

Instead of reducing n by 1 each time (n → n-1 → n-2 → ...), we halve it (n → n/2 → n/4 → ...). Halving reaches 0 in log(n) steps instead of n steps. This is the divide-and-conquer principle.

2

Reusing the half result

We compute x^(n/2) once and square it. Without this, we'd compute x^(n/2) twice (left half and right half), leading to O(n) total work. Storing the half result is the key — it's the same idea behind memoization.
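For contrast, here is a sketch of the trap (`powSlow` is an illustrative name, not part of the solution above): recursing twice instead of storing the half throws away the entire speedup.

```typescript
// ANTI-PATTERN: the same recursive call is made twice per level.
// Recurrence: T(n) = 2T(n/2) + O(1) → O(n) multiplications,
// no better than the brute-force loop. (Non-negative n only.)
function powSlow(x: number, n: number): number {
  if (n === 0) return 1;
  if (n % 2 === 0) {
    return powSlow(x, n / 2) * powSlow(x, n / 2);              // duplicated work!
  }
  return powSlow(x, Math.floor(n / 2)) * powSlow(x, Math.floor(n / 2)) * x;
}
```

It still returns the right answer; the bug is purely in the running time, which is what makes it so easy to miss.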

3

The general principle

Whenever a recursive problem can be split in half and the halves are identical, compute the half once and combine. This turns O(n) into O(log n). The same principle powers merge sort, binary search, and many tree algorithms.

The thinking process matters more than the code

In interviews, walk through this progression: "The brute force does n multiplications — O(n). But I notice x^n = (x^(n/2))², so I can halve the problem each time. That gives O(log n) multiplications. I just need to handle the odd case by multiplying an extra x." This shows the interviewer you understand the recursive structure.

04

Core Templates

Recursive problems follow a few recurring templates. Memorize these structures — most problems are variations of one of them.

Template 1: Linear Recursion (Reduce by One)

The simplest form. Reduce the problem size by 1 at each step. Process the current element, then recurse on the rest. Used for linked lists, arrays processed element-by-element, and mathematical recurrences.

Linear Recursion Template (TypeScript)
function solve(input, index): Result {
  // 1. Base case — smallest input you can answer directly
  if (index >= input.length) return baseValue;

  // 2. Recursive case — solve for the rest
  const subResult = solve(input, index + 1);

  // 3. Combine — merge current element with sub-result
  return combine(input[index], subResult);
}

// Examples:
// Factorial: base = 1, combine = n * factorial(n-1)
// Sum array: base = 0, combine = arr[i] + sum(arr, i+1)
// Reverse list: base = null, combine = reverse rest, then append head

// When to use: factorial, fibonacci, reverse linked list,
// sum of array, string reversal, palindrome check
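One concrete instantiation of this template, summing an array (`sumFrom` is an illustrative name): base = 0, combine = current element + sum of the rest.

```typescript
// Template 1 instantiated: base case = empty suffix → 0,
// combine = arr[index] + (sum of everything after index)
function sumFrom(arr: number[], index: number = 0): number {
  if (index >= arr.length) return 0;           // base case
  return arr[index] + sumFrom(arr, index + 1); // combine current + sub-result
}
```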

Template 2: Divide and Conquer (Split in Half)

Split the problem into two (or more) halves, solve each half recursively, then combine the results. The halves are independent — this is what makes it different from DP.

Divide and Conquer Template (TypeScript)
function solve(input, left, right): Result {
  // 1. Base case — single element or empty
  if (left >= right) return baseValue;

  // 2. Divide — find the midpoint
  const mid = Math.floor((left + right) / 2);

  // 3. Conquer — solve both halves recursively
  const leftResult = solve(input, left, mid);
  const rightResult = solve(input, mid + 1, right);

  // 4. Combine — merge the two results
  return merge(leftResult, rightResult);
}

// Examples:
// Merge sort: split, sort halves, merge sorted halves
// Maximum subarray (divide & conquer version): max of left, right, or crossing
// Count inversions: merge sort + count during merge step

// When to use: merge sort, quicksort, binary search,
// closest pair of points, power(x, n)
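Binary search, listed above, is the variant where only one half needs conquering and there is no combine step. A sketch:

```typescript
// Divide & conquer that recurses into only ONE half per call.
// Returns the index of target in a sorted array, or -1 if absent.
function binarySearch(
  arr: number[],
  target: number,
  left: number = 0,
  right: number = arr.length - 1
): number {
  if (left > right) return -1;                 // base case: empty range
  const mid = Math.floor((left + right) / 2);  // divide
  if (arr[mid] === target) return mid;
  if (arr[mid] < target) {
    return binarySearch(arr, target, mid + 1, right); // conquer right half
  }
  return binarySearch(arr, target, left, mid - 1);    // conquer left half
}
```

Because only one half is explored and nothing is merged, the time drops from O(n log n) to O(log n).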

Template 3: Tree Recursion (DFS on Trees)

The most common recursion template in interviews. Process a tree node, then recurse on left and right children. The base case is a null node. Almost every tree problem uses this template.

Tree Recursion Template (TypeScript)
function solve(node: TreeNode | null): Result {
  // 1. Base case — null node
  if (node === null) return baseValue;

  // 2. Recurse on children
  const leftResult = solve(node.left);
  const rightResult = solve(node.right);

  // 3. Combine — use current node + children's results
  return combine(node.val, leftResult, rightResult);
}

// Examples:
// Max depth: base = 0, combine = 1 + max(left, right)
// Is balanced: base = true/0, combine = check |leftH - rightH| ≤ 1
// Diameter: base = 0, combine = update global max with leftH + rightH
// Invert tree: base = null, combine = swap left and right

// When to use: max depth, diameter, balanced check, LCA,
// path sum, invert tree, serialize/deserialize
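Max depth, the first example above, instantiates this template directly (assuming the usual `TreeNode` shape):

```typescript
interface TreeNode {
  val: number;
  left: TreeNode | null;
  right: TreeNode | null;
}

// Template 3 instantiated: base = 0 for null, combine = 1 + max(children)
function maxDepth(node: TreeNode | null): number {
  if (node === null) return 0; // base case: empty tree has depth 0
  return 1 + Math.max(maxDepth(node.left), maxDepth(node.right));
}
```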

Template 4: Backtracking (Choose → Explore → Unchoose)

Generate all valid configurations by making a choice, recursing to explore that path, then undoing the choice to try the next option. Used for permutations, combinations, subsets, and constraint satisfaction.

Backtracking Template (TypeScript)
function backtrack(state, choices, result): void {
  // 1. Base case — valid complete solution
  if (isComplete(state)) {
    result.push([...state]); // save a copy
    return;
  }

  for (const choice of choices) {
    // 2. Prune — skip invalid choices early
    if (!isValid(choice, state)) continue;

    // 3. Choose — add to current state
    state.push(choice);

    // 4. Explore — recurse with updated state
    backtrack(state, remainingChoices, result);

    // 5. Unchoose — remove to try next option
    state.pop();
  }
}

// Examples:
// Subsets: choose to include or exclude each element
// Permutations: choose each unused element for the next position
// N-Queens: choose a column for each row, prune conflicts
// Combination Sum: choose a number, recurse with remaining target

// When to use: generate all permutations/combinations/subsets,
// N-Queens, Sudoku solver, word search, palindrome partitioning

Which template to pick?

Ask yourself: (1) Am I reducing the problem by one element at a time? → Template 1. (2) Can I split the problem in half? → Template 2. (3) Am I traversing a tree? → Template 3. (4) Do I need to generate all valid configurations? → Template 4 (backtracking).

05

Variations & Sub-Patterns

Recursion isn't a single trick — it's the foundation for an entire family of techniques. Here are the most common variations and how the approach changes for each.

Variation | Recursive Strategy | Classic Example
Linear Recursion | Reduce by 1: f(n) depends on f(n-1) | Factorial, Fibonacci, Reverse Linked List
Divide & Conquer | Split in half: f(n) depends on f(n/2) | Merge Sort, Power(x,n), Count Inversions
Tree DFS | Recurse on left and right children | Max Depth, Diameter, Validate BST, LCA
Backtracking | Choose → explore → unchoose; prune invalid paths | Subsets, Permutations, N-Queens, Sudoku
Recursion + Memoization | Cache results of subproblems to avoid recomputation | Fibonacci, Climbing Stairs, Coin Change
Mutual Recursion | Two functions call each other alternately | Is Even / Is Odd, Parser for nested expressions
Tail Recursion | Recursive call is the last operation; can be optimized to a loop | Factorial (tail form), GCD (Euclidean algorithm)
Generate & Filter | Generate all possibilities recursively, filter valid ones | Generate Parentheses, Letter Combinations of Phone Number
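The tail-recursion row can be sketched with the Euclidean algorithm, where the recursive call is the very last operation and therefore maps mechanically to a loop:

```typescript
// Tail recursion: the recursive call IS the entire return expression.
// Euclid: gcd(a, b) = gcd(b, a mod b), with gcd(a, 0) = a.
function gcd(a: number, b: number): number {
  if (b === 0) return a; // base case
  return gcd(b, a % b);  // tail call: nothing left to do after it returns
}

// The equivalent loop (what tail-call optimization would produce):
function gcdIter(a: number, b: number): number {
  while (b !== 0) [a, b] = [b, a % b];
  return a;
}
```

Since most JavaScript engines do not optimize tail calls, the loop form is the safe choice when inputs can be large.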


Deep Dive: Subsets — The Include/Exclude Pattern

Generating all subsets is the purest form of recursion. For each element, you make a binary choice: include it or exclude it. This creates a binary tree of decisions with 2^n leaves — one for each subset.

Subsets — Include/Exclude (TypeScript)
function subsets(nums: number[]): number[][] {
  const result: number[][] = [];

  function backtrack(index: number, current: number[]) {
    // Base case: processed all elements
    if (index === nums.length) {
      result.push([...current]); // save a copy
      return;
    }

    // Choice 1: EXCLUDE nums[index]
    backtrack(index + 1, current);

    // Choice 2: INCLUDE nums[index]
    current.push(nums[index]);
    backtrack(index + 1, current);
    current.pop(); // unchoose (backtrack)
  }

  backtrack(0, []);
  return result;
}
// Time: O(2^n × n) — 2^n subsets, each takes O(n) to copy
// Space: O(n) recursion depth + O(2^n × n) for output
// Key: each element has exactly 2 choices → 2^n total subsets

Deep Dive: Merge Sort — Divide and Conquer

Merge sort is the textbook divide-and-conquer algorithm. Split the array in half, recursively sort each half, then merge the two sorted halves. The merge step is where the real work happens.

Merge Sort — Divide and Conquer (TypeScript)
function mergeSort(arr: number[]): number[] {
  // Base case: single element is already sorted
  if (arr.length <= 1) return arr;

  // Divide
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));

  // Conquer (merge two sorted arrays)
  return merge(left, right);
}

function merge(a: number[], b: number[]): number[] {
  const result: number[] = [];
  let i = 0, j = 0;

  while (i < a.length && j < b.length) {
    if (a[i] <= b[j]) result.push(a[i++]);
    else result.push(b[j++]);
  }

  return [...result, ...a.slice(i), ...b.slice(j)];
}
// Time: O(n log n) — log n levels, O(n) work per level
// Space: O(n) for the merge step
// Key: splitting gives log(n) depth; merging does O(n) per level

Deep Dive: Generate Parentheses — Constrained Recursion

Generate all valid combinations of n pairs of parentheses. This is recursion with constraints: you can add '(' if you haven't used all n, and you can add ')' only if open count > close count. The constraints prune invalid branches.

Generate Parentheses — Constrained Backtracking (TypeScript)
function generateParenthesis(n: number): string[] {
  const result: string[] = [];

  function backtrack(current: string, open: number, close: number) {
    // Base case: used all parentheses
    if (current.length === 2 * n) {
      result.push(current);
      return;
    }

    // Choice 1: add '(' if we haven't used all n
    if (open < n) {
      backtrack(current + "(", open + 1, close);
    }

    // Choice 2: add ')' if it won't create invalid sequence
    if (close < open) {
      backtrack(current + ")", open, close + 1);
    }
  }

  backtrack("", 0, 0);
  return result;
}
// Time: O(4^n / √n) — the n-th Catalan number
// Space: O(n) recursion depth
// Key: the constraint (close < open) ensures we never have
// more closing parens than opening ones at any point.
06

Visual Walkthrough

Let's trace through three classic recursion problems step by step to see how the call stack builds up and unwinds.

Factorial — The Call Stack in Action

Factorial — Call Stack Trace
factorial(5)
├── 5 * factorial(4)
│   ├── 4 * factorial(3)
│   │   ├── 3 * factorial(2)
│   │   │   ├── 2 * factorial(1)
│   │   │   │   └── return 1          ← base case
│   │   │   └── return 2 * 1 = 2      ← unwinding
│   │   └── return 3 * 2 = 6          ← unwinding
│   └── return 4 * 6 = 24             ← unwinding
└── return 5 * 24 = 120               ← unwinding

Answer: 120

Key observations:
- The call stack grows 5 levels deep (one per recursive call)
- Base case (n=1) returns directly, with no further calls
- Results "bubble up" as each call returns
- Each level does O(1) work → total O(n) time, O(n) stack space

Power(2, 10) — Halving the Problem

Power(2, 10) — Trace
myPow(2, 10)
├── half = myPow(2, 5)
│   ├── half = myPow(2, 2)
│   │   ├── half = myPow(2, 1)
│   │   │   └── return 2              ← base case (n=1)
│   │   └── return 2 * 2 = 4          ← even: half * half
│   └── return 4 * 4 * 2 = 32         ← odd: half * half * x
└── return 32 * 32 = 1024             ← even: half * half

Answer: 1024

Key observations:
- Only 4 recursive calls for n=10 (vs 10 multiplications in brute force)
- Each call halves n: 10 → 5 → 2 → 1
- Odd n: multiply an extra x. Even n: just square the half.
- Stack depth: O(log n) → just 4 levels for n = 10

Subsets of [1, 2, 3] — The Decision Tree

Subsets — Decision Tree Trace
subsets([1, 2, 3])

                          []
                   /              \
              exclude 1          include 1
                []                  [1]
              /      \           /       \
          excl 2    incl 2    excl 2    incl 2
           []        [2]       [1]      [1,2]
          / \       / \      / \      / \
        e3  i3    e3  i3   e3  i3   e3  i3
        []  [3]  [2] [2,3] [1] [1,3] [1,2] [1,2,3]

Leaves (all subsets): [], [3], [2], [2,3], [1], [1,3], [1,2], [1,2,3]

Answer: [[], [3], [2], [2,3], [1], [1,3], [1,2], [1,2,3]]

Key observations:
- Each level makes a binary choice: include or exclude one element
- 3 elements → 3 levels → 2³ = 8 leaves = 8 subsets
- The tree has depth n (number of elements)
- Backtracking: after exploring "include", we pop the element and explore "exclude"
07

Practice Problems

These 7 problems are carefully ordered from easy to hard. Each one teaches a different aspect of recursive thinking. Solve them in order.

1

LC 206. Reverse Linked List

Easy

Why this pattern:

Linear recursion on a linked list. Recurse to the end of the list, then rewire pointers on the way back. The base case is a null or single node. This builds intuition for how recursion 'unwinds' and does work on the return path.

Key idea: Base case: if head is null or head.next is null, return head. Recursive case: reverse the rest of the list, then set head.next.next = head and head.next = null. The reversed rest becomes the new head.
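A sketch of that key idea, assuming the usual `ListNode` shape:

```typescript
interface ListNode {
  val: number;
  next: ListNode | null;
}

function reverseList(head: ListNode | null): ListNode | null {
  if (head === null || head.next === null) return head; // base case
  const newHead = reverseList(head.next); // reverse everything after head
  head.next.next = head;                  // rewire on the way back up
  head.next = null;                       // head becomes the new tail
  return newHead;                         // the old tail is the new head
}
```

Note that all the pointer work happens after the recursive call returns: the list is rewired as the stack unwinds.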

2

LC 104. Maximum Depth of Binary Tree

Easy

Why this pattern:

Tree recursion (Template 3). The depth of a tree is 1 + max(depth of left subtree, depth of right subtree). Base case: null node has depth 0. This is the simplest tree recursion problem — solve it first to build confidence.

Key idea: If node is null, return 0. Otherwise, return 1 + Math.max(maxDepth(node.left), maxDepth(node.right)). Trust the recursion: assume maxDepth works correctly on subtrees.

3

LC 50. Pow(x, n)

Medium

Why this pattern:

Divide and conquer (Template 2). Halve the exponent at each step: x^n = (x^(n/2))². Handle odd n by multiplying an extra x. Handle negative n by computing 1/x^(-n). This teaches the 'halving' optimization.

Key idea: Compute half = myPow(x, n/2) once. If n is even, return half * half. If odd, return half * half * x. Handle n < 0 by inverting. Store half in a variable — computing it twice gives O(n) instead of O(log n).

4

LC 78. Subsets

Medium

Why this pattern:

Backtracking with include/exclude choices (Template 4). For each element, make a binary decision: include it or skip it. This generates all 2^n subsets. The purest form of recursive enumeration.

Key idea: At each index, branch into two paths: one that includes nums[index] and one that doesn't. When index reaches the end, save the current subset. Remember to pop after including (backtrack) to explore the exclude path.

5

LC 46. Permutations

Medium

Why this pattern:

Backtracking with a 'used' set (Template 4). For each position, try every unused element. Mark it as used, recurse for the next position, then unmark. This generates all n! permutations.

Key idea: Maintain a 'used' boolean array. For each position in the permutation, iterate all elements. If unused, mark it, add to current permutation, recurse, then unmark and remove. Base case: permutation length equals n.
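That key idea as a sketch:

```typescript
function permute(nums: number[]): number[][] {
  const result: number[][] = [];
  const used: boolean[] = new Array(nums.length).fill(false);
  const current: number[] = [];

  function backtrack(): void {
    if (current.length === nums.length) { // base case: permutation complete
      result.push([...current]);          // save a copy, not a reference
      return;
    }
    for (let i = 0; i < nums.length; i++) {
      if (used[i]) continue;              // prune: element already placed
      used[i] = true;                     // choose
      current.push(nums[i]);
      backtrack();                        // explore
      current.pop();                      // unchoose
      used[i] = false;
    }
  }

  backtrack();
  return result;
}
```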

6

LC 22. Generate Parentheses

Medium

Why this pattern:

Constrained backtracking. At each step, choose '(' or ')' — but only if the choice keeps the sequence valid. Constraint: open < n to add '(', close < open to add ')'. This teaches how constraints prune the recursion tree.

Key idea: Track open and close counts. Add '(' if open < n. Add ')' if close < open. Base case: string length equals 2n. The constraints ensure every generated string is valid — no filtering needed.

7

LC 51. N-Queens

Hard

Why this pattern:

Backtracking with constraint checking (Template 4). Place queens row by row. For each row, try every column. Check if the placement is safe (no conflicts on column, diagonal, anti-diagonal). If safe, recurse to the next row. If not, try the next column.

Key idea: Use three sets to track occupied columns, diagonals (row - col), and anti-diagonals (row + col). For each row, try each column. If all three sets allow it, place the queen, add to sets, recurse, then remove (backtrack).
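A sketch of the three-set idea (counting solutions rather than building boards, to keep it short):

```typescript
// Counts valid N-Queens placements. Building the actual boards
// is the same recursion with a board array carried along.
function totalNQueens(n: number): number {
  const cols = new Set<number>(); // occupied columns
  const diag = new Set<number>(); // occupied diagonals (row - col)
  const anti = new Set<number>(); // occupied anti-diagonals (row + col)
  let count = 0;

  function place(row: number): void {
    if (row === n) { count++; return; } // base case: every row has a queen
    for (let col = 0; col < n; col++) {
      if (cols.has(col) || diag.has(row - col) || anti.has(row + col)) continue;
      cols.add(col); diag.add(row - col); anti.add(row + col); // choose
      place(row + 1);                                          // explore
      cols.delete(col); diag.delete(row - col); anti.delete(row + col); // unchoose
    }
  }

  place(0);
  return count;
}
```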

Practice strategy

For each problem: (1) Identify the base case before writing any code. (2) Write the recurrence in plain English: "The answer for n is [something] combined with the answer for n-1." (3) After solving, draw the recursion tree for a small input. This builds the visual intuition that makes recursion click.

08

Common Mistakes

These are the traps that catch people on recursion problems. Learn them here so you don't learn them in an interview.

🔁

Missing or wrong base case

You forget the base case entirely, or your base case doesn't cover all termination conditions. The recursion never stops — infinite loop and stack overflow.

Always write the base case FIRST. Ask: 'What's the smallest input? What should I return for it?' Common base cases: null node, empty array, index out of bounds, n === 0 or n === 1. Test your base case with the smallest possible input.

📉

Not making the problem smaller

Your recursive call passes the same input (or a larger one) instead of a strictly smaller one. The problem never shrinks toward the base case.

Every recursive call MUST move closer to the base case. If your base case is n === 0, each call should reduce n. If your base case is an empty array, each call should process fewer elements. Verify: 'Is the input to my recursive call strictly smaller?'

🌳

Computing the same subproblem twice (no memoization)

In problems like Fibonacci, naive recursion computes fib(3) multiple times because both fib(5) and fib(4) call fib(3). This turns O(n) into O(2^n).

If you see overlapping subproblems (the same call with the same arguments happens multiple times), add memoization: cache results in a map. Check the cache before computing. This is the bridge from recursion to dynamic programming.
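A minimal memoized Fibonacci sketch showing the cache-check-first pattern:

```typescript
// Without the memo map this recursion is O(2^n); with it, each
// distinct n is computed once, so the total work is O(n).
function fib(n: number, memo: Map<number, number> = new Map()): number {
  if (n <= 1) return n;                 // base cases: fib(0)=0, fib(1)=1
  const cached = memo.get(n);
  if (cached !== undefined) return cached; // check the cache BEFORE computing
  const result = fib(n - 1, memo) + fib(n - 2, memo);
  memo.set(n, result);                  // cache before returning
  return result;
}
```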

📋

Forgetting to copy the state in backtracking

You push the current state (array/path) into the result, but later mutations change it because you pushed a reference, not a copy. All results end up as empty arrays.

When saving a result in backtracking, always push a COPY: result.push([...current]) or result.push(current.slice()). The spread operator [...arr] creates a shallow copy. Without this, all entries in result point to the same mutated array.

🔙

Forgetting to unchoose (backtrack)

You add an element to the current state but forget to remove it after the recursive call returns. The state accumulates elements from all branches, producing wrong results.

Every 'choose' must have a matching 'unchoose'. If you push, you must pop. If you set used[i] = true, you must set used[i] = false. The pattern is always: choose → recurse → unchoose.

💥

Stack overflow on deep recursion

Your recursion depth is O(n) where n can be 10⁵ or more. The call stack overflows because most languages limit stack depth to ~10⁴ frames.

If recursion depth could exceed ~10⁴, convert to iteration using an explicit stack. Or use tail recursion if your language optimizes it (JavaScript does NOT optimize tail calls in most engines). For tree problems, iterative DFS with a stack is the standard alternative.
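A sketch of that iterative alternative (preorder shown, assuming the usual `TreeNode` shape): an explicit stack replaces the call stack, so depth is bounded by heap memory rather than the engine's frame limit.

```typescript
interface TreeNode {
  val: number;
  left: TreeNode | null;
  right: TreeNode | null;
}

// Iterative preorder DFS with an explicit stack.
function preorder(root: TreeNode | null): number[] {
  const result: number[] = [];
  const stack: TreeNode[] = [];
  if (root) stack.push(root);
  while (stack.length > 0) {
    const node = stack.pop()!;
    result.push(node.val);
    // Push right first so the LEFT child is popped (visited) first.
    if (node.right) stack.push(node.right);
    if (node.left) stack.push(node.left);
  }
  return result;
}
```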

09

Interview Strategy

Knowing how to recurse is half the battle. Here's how to present recursive solutions in an interview to maximize your score.

The 5-Step Interview Flow

1

Identify the Recurrence

'I can define the answer for n in terms of smaller subproblems. For example, the depth of a tree is 1 + max(depth of left, depth of right).' — State the recurrence in plain English before writing code.

2

Define the Base Case

'The base case is when the node is null — the depth is 0.' — Always state the base case explicitly. This shows the interviewer you've thought about termination.

3

Trust the Recursive Leap

'I'll assume my recursive call correctly returns the depth of the subtree. My job is just to add 1 and take the max.' — Don't trace through every call. Show you trust the recursion.

4

Code It Clean

Write the base case first, then the recursive case. Use clear variable names (leftDepth, rightDepth, not a, b). Keep the function short — recursive solutions should be concise.

5

Analyze Complexity

'Time: O(n) because I visit each node once. Space: O(h) for the call stack where h is the tree height — O(log n) for balanced, O(n) for skewed.' — Always mention stack space for recursive solutions.

Common Follow-Up Questions

Follow-Up | How to Handle
"Can you do it iteratively?" | Yes — use an explicit stack to simulate the call stack. For tree DFS, push nodes onto a stack instead of recursing. Mention that iterative avoids stack overflow for deep recursion.
"What's the space complexity?" | Recursive solutions use O(depth) stack space. For trees: O(log n) balanced, O(n) worst case. For backtracking: O(n) where n is the decision depth. Always mention this — interviewers expect it.
"What if there are overlapping subproblems?" | Add memoization — cache results in a hash map keyed by the function arguments. This converts exponential recursion into polynomial DP. Mention the transition: 'This is where recursion becomes DP.'
"Can you optimize the backtracking?" | Add pruning — skip branches that can't lead to valid solutions. For N-Queens: check column/diagonal conflicts before recursing. For combination sum: skip if remaining target < 0. Pruning can dramatically reduce the search space.
"What about tail recursion?" | Tail recursion is when the recursive call is the last operation. Some languages optimize it to a loop (O(1) stack space). JavaScript does NOT reliably optimize tail calls. If asked, convert to an iterative loop manually.
"How would you handle very deep recursion?" | Convert to iteration with an explicit stack. Or increase the stack size limit (language-dependent). For Python: sys.setrecursionlimit(). For JS: iterative is the only reliable option.

The meta-skill

Interviewers test recursion because it reveals how you decompose problems. The ability to say "the answer for this problem is [something] combined with the answer for a smaller version" is the core skill. Practice stating the recurrence in one sentence before coding — it's the most impressive thing you can do in a recursion interview.

10

Summary & Cheat Sheet

Pin this section. Review it before mock interviews and contests.

Quick Revision Cheat Sheet

Core idea: A function that calls itself with a smaller input, building the answer from base cases up

Three ingredients: 1. Base case (when to stop) 2. Recursive case (how to shrink) 3. Combine (how to use the sub-result)

Linear recursion: f(n) = combine(n, f(n-1)). Reduce by one each step. Factorial, reverse list, sum array

Divide & conquer: f(n) = merge(f(n/2), f(n/2)). Halve the problem. Merge sort, power(x,n), binary search

Tree recursion: f(node) = combine(node.val, f(left), f(right)). Base: null → default. Max depth, diameter, validate BST

Backtracking: Choose → explore → unchoose. Generate all valid configurations. Subsets, permutations, N-Queens

Base case first: Always write the base case before the recursive case. It's the foundation everything builds on

Recursive leap of faith: Assume the recursive call works correctly. Your job: combine its result with the current element

Copy state: In backtracking, push [...current] not current. References mutate — copies don't

Stack space: Recursion uses O(depth) stack space. Trees: O(log n) balanced, O(n) skewed. Watch for stack overflow

Memoization bridge: If subproblems overlap (same call repeated), add a cache → recursion becomes DP

Mental trigger: "Self-similar structure" or "tree/graph" or "generate all" or "f(n) in terms of f(smaller)" → Recursion

Decision Flowchart

When to Use Recursion — Decision Tree

Is the data structure a tree or graph?
├── YES → Tree/Graph DFS recursion (Template 3)
│         Base case: null node. Recurse on children. Combine results.
└── NO → Can you define f(n) in terms of f(smaller)?
          ├── YES → Can you split the problem in HALF?
          │         ├── YES → Divide & Conquer (Template 2)
          │         │         Merge sort, power(x,n), closest pair
          │         └── NO → Linear Recursion (Template 1)
          │                   Factorial, reverse list, fibonacci
          └── NO → Do you need to generate ALL valid configurations?
                    ├── YES → Backtracking (Template 4)
                    │         Subsets, permutations, N-Queens, sudoku
                    └── NO → Are there overlapping subproblems?
                              ├── YES → Recursion + Memoization (→ DP)
                              │         Fibonacci, coin change, climbing stairs
                              └── NO → Recursion probably isn't needed
                                        Consider iteration, greedy, or hashing

Recursion Complexity Quick Reference

Pattern | Time | Space (stack) | Example
Linear (reduce by 1) | O(n) | O(n) | Factorial, reverse list
Divide & conquer (halve) | O(log n) to O(n log n), depending on combine cost | O(log n) | Power(x,n): O(log n); merge sort: O(n log n)
Binary tree DFS | O(n) | O(h) where h = height | Max depth, diameter
Subsets (include/exclude) | O(2ⁿ) | O(n) | Subsets, combination sum
Permutations | O(n!) | O(n) | Permutations, N-Queens
Fibonacci (no memo) | O(2ⁿ) | O(n) | Naive fibonacci — DON'T do this
Fibonacci (with memo) | O(n) | O(n) | Memoized fibonacci — DO this

One last thing

Recursion is the gateway to the hardest and most rewarding parts of DSA: trees, graphs, backtracking, divide and conquer, and dynamic programming. Every one of these builds on recursive thinking. The key skill isn't tracing call stacks — it's learning to trust the recursive leap of faith: define the base case, assume the recursive call works, and focus only on combining the result. Once that clicks, you'll see recursion everywhere.