⚙️ Algorithm · Intermediate

Complete Search and Backtracking

Key Points

  • Complete search enumerates every candidate solution, while backtracking prunes branches that cannot possibly lead to a valid or better solution.
  • The search space is often a tree where each level represents a decision and each path represents a partial solution.
  • Typical complexities are exponential, such as O(2^n) for subsets and O(n!) for permutations, but pruning can dramatically reduce actual runtime.
  • Bitmasks compactly represent subsets and constraints, enabling fast operations like adding, removing, and checking membership with bitwise operators.
  • Permutation generation can be done via recursion with a used array or via std::next_permutation, and duplicate elements require careful skipping.
  • N-Queens is a classic backtracking example where columns and diagonals encode constraints; bitmasks enable O(1) checks.
  • Good heuristics (ordering choices, bounding functions) and early feasibility checks are key to effective pruning.
  • Avoid common pitfalls: forgetting to undo choices, exploring duplicate states, and using overflow-prone expressions like (1 << n) on 32-bit types.

Prerequisites

  • Recursion and call stack: Backtracking is naturally implemented with recursive functions that build and undo choices.
  • Big-O complexity: Understanding exponential growth (2^n, n!) motivates pruning and input-size limits.
  • Bitwise operations: Bitmasks rely on shifts and AND/OR/XOR for O(1) set operations and constraint checks.
  • Combinatorics basics: Counts like 2^n subsets and n! permutations describe the size of the search space.
  • Vectors, arrays, and iteration in C++: Data structures to hold candidates and partial solutions are essential.
  • Sorting and STL utilities: Duplicate-skipping and heuristic ordering commonly require sorting; std::next_permutation is often used.

Detailed Explanation


01 Overview

Complete search, also called brute-force enumeration, systematically tries all possible candidates to solve a problem. You can think of it as walking every path in a decision tree: at each step, you choose an option, and eventually you reach a leaf that represents a full solution. This guarantees correctness because nothing is skipped. However, the number of possibilities can grow extremely fast—often exponentially with the input size—so naive complete search can be too slow.

Backtracking improves complete search by cutting off branches early when they can no longer produce a valid or better solution. It builds a solution incrementally and uses quick tests to reject partial solutions that already violate constraints (feasibility checks) or cannot beat the current best (bounding). This approach preserves correctness while potentially exploring a tiny fraction of the full search space in practice.

In programming contests and many real problems, small input sizes or strong constraints make complete search with backtracking surprisingly effective. Bitmasks are a common representation to compactly encode subsets and constraints, allowing O(1) add/remove/check operations via bitwise operators. Core applications include subset enumeration, permutation generation, assignment/partition problems, graph searches for small n, and classics like N-Queens. Although worst-case complexity remains exponential, careful pruning and a good order of choices often make solutions fast enough for moderate inputs.

02 Intuition & Analogies

Imagine trying to open a lock by guessing the combination. Pure complete search means trying every possible sequence until one works—guaranteed success, but potentially a lot of tries. Backtracking is like noticing early that certain partial sequences can’t possibly lead to the right combination (maybe the lock gives you a warning tone after a wrong prefix). You stop exploring those and move to a different branch, saving time.

Another analogy is cooking with limited ingredients. If you’re assembling a multi-course meal from options, complete search would list every menu combination. Backtracking adds common sense: if you already picked a very spicy appetizer and the rules say you must balance flavors, then you skip all continuations that include a spicy main. You prune branches that violate constraints before you waste effort on them.

Think of the search process as a tree. Each level corresponds to a decision (which item to pick next, which position to place a queen, which digit to choose). A root-to-leaf path is a complete solution. Feasibility checks are like guardrails that block impossible branches right away. Bounding functions are like budgets: if even in the best case you cannot surpass your current best score, don’t go down that road. Ordering your decisions cleverly—e.g., try the most constrained choice first—can shrink the tree. Bitmasks provide a compact checklist (what is already used, which columns/diagonals are blocked) so you can test and update constraints with just a few bit operations. In practice, these simple tricks often turn an astronomical search into a manageable one.

03 Formal Definition

Let S be a set of decisions, and define a state space where each state encodes a partial assignment of decisions to values. The state space can be viewed as a rooted tree: the root is the empty assignment, internal nodes add one more decision, and leaves represent complete assignments. A complete search performs a traversal (typically depth-first search) that visits every leaf, checking for validity or optimality.

Backtracking augments this traversal with two predicates on a partial state x: (1) a feasibility predicate F(x), which returns false if x cannot be extended to any valid full solution (so that subtree is pruned); (2) a bounding predicate B(x, best), which returns false if the best achievable value from any extension of x is no better than the current incumbent best (for optimization problems). The algorithm explores a node only if both predicates hold. The search maintains reversibility: each recursive call adds a decision, then later undoes it to restore the prior state.

Bitmask representation maps sets to integers: a subset A ⊆ {0, ..., n-1} is encoded by an integer M where bit i is 1 iff i ∈ A. Basic operations—membership, insertion, deletion—become O(1) bitwise operations (AND, OR, XOR, and bit shifts). For problems like N-Queens, additional masks record blocked columns and diagonals; for permutations/subsets, masks record which elements are picked or still available.

04 When to Use

Use complete search with backtracking when the input size is small to moderate, or when constraints are strong enough to prune most branches. Classic use cases include:

  • Feasibility under constraints: place items without conflicts (N-Queens, graph coloring for small n, seating with incompatibilities).
  • Optimization over combinatorial choices: choose a subset or ordering that maximizes/minimizes a score (knapsack variants for small n, assignment with extra rules, maximum compatible sets).
  • Enumeration: generate all subsets, combinations, or permutations (possibly with uniqueness constraints) for counting or checking.
  • Constraint satisfaction problems (CSP) with tight local checks: Sudoku variants, word placement, or scheduling with many hard constraints.

Prefer recursion/DFS when state changes are incremental and easily undone, and when quick feasibility checks exist. Bitmask enumeration is especially attractive when n ≤ 25–30 (depending on operations), or when you need to iterate over all submasks or quickly test membership/availability. If the problem size is large and pruning is weak, consider different paradigms (dynamic programming, greedy, or approximation) instead.

⚠️Common Mistakes

  1. Forgetting to undo choices (no backtrack). If you mark an element as used or update masks, you must restore them after the recursive call. Otherwise, states leak into other branches and correctness fails.
  2. Missing or wrong base cases. Always define precisely when a solution is complete (e.g., placed n queens, picked n elements). Off-by-one errors in stopping conditions are common.
  3. Duplicate states. Generating the same partial or full solution multiple times wastes time. For permutations with repeated values, you must skip duplicates (e.g., by sorting and checking used[i-1]). For subsets, avoid symmetric choices when possible.
  4. Weak or late pruning. If you check constraints only at leaves, you lose the benefit of backtracking. Validate feasibility as soon as a constraint can be violated. In optimization, use a bounding function (e.g., currentSum + remainingUpperBound ≤ best) to cut hopeless branches early.
  5. Overflow and bit mistakes. Using (1 << n) with n ≥ 31 can overflow 32-bit ints; prefer 1ULL << n. Keep track of mask widths and signedness. For diagonals in N-Queens, ensure correct indexing or shifting when using bitmasks.
  6. Poor ordering. Trying easy choices first may explode the search tree. Heuristics like “most constrained first” often help drastically.
  7. Ignoring recursion limits or stack usage. Very deep recursion may cause stack overflow; in C++, depths up to a few tens of thousands are usually fine, but prefer iterative DFS for very deep trees.

Key Formulas

Number of Subsets

2^n

Explanation: A set with n elements has 2^n subsets because each element is either included or excluded. This is the basic size of the subset search space.

Binomial Sum

C(n,0) + C(n,1) + ... + C(n,n) = 2^n

Explanation: Summing the number of k-element subsets over all k equals 2^n. It confirms the total count of subsets and often underlies subset-enumeration complexity.

Number of Permutations

n! = n · (n-1) · ... · 2 · 1

Explanation: There are n! ways to order n distinct items. This is the size of the permutation search space in the worst case.

Search Tree Complexity

1 + b + b^2 + ... + b^d = (b^(d+1) - 1) / (b - 1) = O(b^d)

Explanation: If each node branches into at most b children and depth is d, the number of nodes is exponential in d. Backtracking aims to reduce the effective branching factor.

Subset Recurrence

T(n) = 2 · T(n-1), T(0) = 1  ⇒  T(n) = 2^n

Explanation: A simple recursion that either takes or skips each element yields 2^n leaves. This models brute force subset enumeration.

N-Queens Diagonal Indexing

d1 = r - c + (n - 1),  d2 = r + c  (both in [0, 2n - 2])

Explanation: Left-to-right and right-to-left diagonal indices can be computed from row r and column c. These formulas help map diagonals to arrays or bits for O(1) conflict checks.

All-Submasks-of-All-Masks

Σ over all masks m of 2^popcount(m) = 3^n

Explanation: Enumerating all submasks of all masks costs 3^n operations overall. This guides performance expectations for nested mask–submask loops.

Branch-and-Bound Pruning Rule

prune the branch if V + UpperBound(remaining) ≤ best

Explanation: If the current value V plus an admissible upper bound on what remains cannot beat the incumbent best, the branch is safely discarded.

Complexity Analysis

Complete search explores a state space that typically grows exponentially with input size. For subset problems, you often face O(2^n) leaves because each element is either chosen or skipped; time becomes O(n · 2^n) if each leaf requires O(n) work (e.g., summing). Permutations incur O(n!) leaves, and generating/outputting each permutation costs O(n), leading to O(n · n!) time. Space for pure enumeration is usually O(n) for the recursion stack plus storage of the current partial solution.

Backtracking reduces the number of visited nodes through feasibility and bounding. In the best cases, pruning shrinks the effective branching factor dramatically, transforming a theoretical exponential into a much smaller practical runtime. However, worst-case time remains exponential: constraints might be too weak to prune. Memory remains modest—O(n) to track choices and O(1) per check—unless you store many solutions or use auxiliary tables.

Bitmask operations are O(1) and enable fast subset iteration and constraint checks. Iterating all subsets of an n-element set costs O(2^n); iterating all submasks of a given mask m costs O(2^popcount(m)); iterating all submasks for all masks totals O(3^n). N-Queens with bitmasks uses constant-time conflict checks per placement, but still explores exponentially many partial boards. In optimization backtracking with branch-and-bound, the cost hinges on the tightness of the upper bound; tighter bounds yield more pruning and fewer explored nodes.

Code Examples

Bitmask enumeration: maximum subset sum ≤ W (complete search)
#include <bits/stdc++.h>
using namespace std;

// Enumerate all subsets via bitmask to find the maximum sum <= W.
// Demonstrates 2^n enumeration and bit operations.

int main() {
    ios::sync_with_stdio(false);
    cin.tie(nullptr);

    vector<int> a = {3, 34, 4, 12, 5, 2};
    int n = (int)a.size();
    int W = 10; // capacity/limit

    long long best = 0; // best feasible sum so far

    // There are 2^n masks; each mask picks elements whose bits are 1.
    // NOTE: use 1ULL << n if n can be >= 31 to avoid overflow.
    for (unsigned long long mask = 0; mask < (1ULL << n); ++mask) {
        long long sum = 0;
        // Accumulate sum of chosen elements.
        for (int i = 0; i < n; ++i) {
            if (mask & (1ULL << i)) sum += a[i];
        }
        if (sum <= W) best = max(best, sum);
    }

    cout << "Best sum <= W is: " << best << "\n";
    return 0;
}

This program iterates over all 2^n subsets using a bitmask from 0 to (1<<n)-1. For each subset, it adds selected elements and keeps the best sum that does not exceed W. It illustrates pure complete search and the convenience of bit operations for subset membership.

Time: O(n · 2^n)
Space: O(1) extra (ignoring input); no recursion, since the loop is iterative
Recursive backtracking with pruning for subset sum optimization
#include <bits/stdc++.h>
using namespace std;

// Choose a subset with maximum sum <= W using backtracking + pruning.
// We sort descending and use suffix sums as an admissible upper bound.

long long best;
long long Wglob;
vector<int> vals;
vector<long long> suff; // suff[i] = sum of vals[i..n-1]

void dfs(int i, long long cur) {
    int n = (int)vals.size();

    // Bound: even if we add everything remaining, can we beat best?
    long long maxPossible = cur + (i < n ? suff[i] : 0LL);
    if (maxPossible <= best) return; // prune by upper bound

    // Feasibility: current partial sum must not exceed W
    if (cur > Wglob) return; // prune infeasible branch

    // If all elements considered, update best
    if (i == n) {
        best = max(best, cur);
        return;
    }

    // Heuristic: try including vals[i] first (often reaches good solutions earlier)
    dfs(i + 1, cur + vals[i]); // take
    dfs(i + 1, cur);           // skip
}

int main() {
    ios::sync_with_stdio(false);
    cin.tie(nullptr);

    vector<int> a = {3, 34, 4, 12, 5, 2};
    Wglob = 10;

    // Sort descending so suff[i] is a tighter bound (bigger early values)
    vals = a;
    sort(vals.begin(), vals.end(), greater<int>());

    int n = (int)vals.size();
    suff.assign(n + 1, 0);
    for (int i = n - 1; i >= 0; --i) suff[i] = suff[i + 1] + vals[i];

    best = 0;
    dfs(0, 0);

    cout << "Best sum <= W with pruning: " << best << "\n";
    return 0;
}

This recursive solution explores include/skip choices but prunes aggressively. A suffix-sum upper bound guarantees we cut branches that cannot surpass the current best. Sorting descending tends to find good solutions early, tightening pruning further. It exemplifies backtracking with both feasibility and bounding checks.

Time: O(2^n) worst-case, but much smaller on typical inputs due to pruning
Space: O(n) for recursion stack and suffix array
Unique permutation generation with duplicate-skipping (backtracking)
#include <bits/stdc++.h>
using namespace std;

// Generate all unique permutations of a multiset (string with duplicates).
// Key idea: sort and skip duplicates when the previous identical element
// has not been used yet at the current depth.

string s;
vector<int> used;
string cur;
vector<string> out;

void backtrack() {
    if ((int)cur.size() == (int)s.size()) {
        out.push_back(cur);
        return;
    }
    for (int i = 0; i < (int)s.size(); ++i) {
        if (used[i]) continue;
        // Skip duplicates: if s[i] == s[i-1] and s[i-1] not used,
        // then placing s[i] now would duplicate the permutation where s[i-1] was chosen here.
        if (i > 0 && s[i] == s[i - 1] && !used[i - 1]) continue;

        used[i] = 1;
        cur.push_back(s[i]);
        backtrack();
        cur.pop_back();
        used[i] = 0; // undo choice
    }
}

int main() {
    ios::sync_with_stdio(false);
    cin.tie(nullptr);

    s = "aabc"; // multiset of characters
    sort(s.begin(), s.end()); // required for duplicate-skipping rule
    used.assign(s.size(), 0);
    backtrack();

    cout << "Count: " << out.size() << "\n";
    // Print first few permutations as a demo
    for (int i = 0; i < (int)min<size_t>(10, out.size()); ++i) cout << out[i] << '\n';
    return 0;
}

This backtracking enumerates permutations but avoids duplicates by a standard rule: after sorting, you cannot choose s[i] if it equals s[i-1] and s[i-1] has not been used yet. Each recursive call places one character and backtracks by undoing the choice.

Time: O(n · U) where U is the number of unique permutations (≤ n!)
Space: O(n) recursion stack + O(U · n) if storing all results
N-Queens with bitmasks (fast conflict checks and pruning)
#include <bits/stdc++.h>
using namespace std;

// Solve N-Queens using three bitmasks: columns, diag1, diag2.
// Masks are passed by value, so the "undo" happens automatically on return.
// We also record one board to display a solution, while counting all solutions.

struct Solver {
    int n;
    uint64_t full;        // mask with n lowest bits set to 1
    long long ways = 0;   // number of solutions
    vector<int> pos;      // pos[r] = column of queen in row r (for one solution)
    bool saved = false;

    // cols: occupied columns; d1/d2: attacked diagonals, shifted row by row
    void dfs(int r, uint64_t cols, uint64_t d1, uint64_t d2) {
        if (r == n) {
            ++ways;
            saved = true; // pos[] now holds the first complete solution
            return;
        }
        // available positions for row r: free columns and diagonals
        uint64_t available = full & ~(cols | d1 | d2);
        while (available) {
            uint64_t p = available & -available; // least significant 1-bit
            available -= p;
            int c = __builtin_ctzll(p);          // column index

            if (!saved) pos[r] = c;              // record one solution path
            // place queen: for the next row, diagonal attacks shift by one
            dfs(r + 1, cols | p, (d1 | p) << 1, (d2 | p) >> 1);
        }
    }

    void run(int N) {
        n = N;
        full = (n == 64) ? ~0ULL : ((1ULL << n) - 1ULL);
        pos.assign(n, -1);
        dfs(0, 0, 0, 0);

        cout << "Number of solutions for N=" << n << ": " << ways << "\n";
        if (saved) {
            cout << "One solution:\n";
            for (int r = 0; r < n; ++r) {
                for (int c = 0; c < n; ++c) cout << (pos[r] == c ? 'Q' : '.');
                cout << '\n';
            }
        }
    }
};

int main() {
    ios::sync_with_stdio(false);
    cin.tie(nullptr);

    Solver s;
    s.run(8); // classic 8-Queens
    return 0;
}

The solver maintains three bitmasks to mark attacked columns and diagonals. For each row, available positions are computed in O(1) using bitwise operations. Picking the least significant bit iterates through candidates efficiently. The algorithm counts all solutions and prints one board. It demonstrates powerful pruning and constant-time feasibility checks per move.

Time: O(1) bit operations per candidate placement; overall exponential in n (bounded by O(n!))
Space: O(n) for recursion and position storage