Two Pointers
Key Points
- Two pointers is a pattern where two indices move through a sequence in a coordinated, usually monotonic way to avoid unnecessary work.
- Classic patterns include opposite-end pointers that meet in the middle, sliding windows that expand and contract, and linear merging of sorted arrays.
- The power of two pointers comes from monotonic movement: each pointer only moves forward (or inward), ensuring linear-time passes.
- Many O(n^2) brute-force checks can be reduced to O(n) or O(n + m) by maintaining invariants and moving the correct pointer at each step.
- Sliding window works reliably when the window’s property changes monotonically, typically with non-negative elements or monotone constraints.
- Opposite-end pointers exploit sorted order or structure to discard large portions of the search space at each step.
- Correctness depends on well-defined invariants, such as the current window property or what has been safely excluded.
- Memory usage is often O(1) extra space, making this technique very cache-friendly and practical for large inputs.
Prerequisites
- Arrays and Indexing — Two pointers operate over arrays or strings; understanding indexing and bounds is fundamental.
- Sorting — Many two-pointer solutions rely on sorted input for correctness and efficiency.
- Asymptotic Analysis (Big-O) — To reason about linear-time scans, amortized steps, and space trade-offs.
- Prefix Sums — Helps with range sums and understanding when sliding windows are applicable.
- Loop Invariants — Correctness of pointer movements depends on maintaining clear invariants.
Detailed Explanation
Overview
The two pointers technique is a family of linear-time strategies for array- and string-based problems where we can compare or maintain properties between two moving positions. Instead of nested loops that recheck overlapping work, two pointers coordinate their movement so that each element is visited only a small number of times. Common forms include: (1) opposite-end pointers that start at the extremes and move toward each other to find a target sum, maximize or minimize a function, or validate a condition; (2) sliding windows in which a left and a right boundary maintain a contiguous subarray satisfying a property (e.g., sum within bounds, no duplicates), expanding and contracting as needed; and (3) merging two sorted sequences by advancing the smaller front element, producing a single sorted result in linear time. The main ingredient is a monotonic relationship: when you move a pointer, you permanently rule out certain possibilities without losing the optimal solution. This relies on well-chosen invariants and problem structure, such as sortedness or non-negativity. The technique yields time complexities like O(n) or O(n + m) with minimal additional space. It is widely used in competitive programming and systems code because it is simple, fast, and memory efficient once you understand the invariants.
Intuition & Analogies
Imagine two people searching for each other on a long, straight street. If they start at opposite ends and walk toward each other, they will meet without retracing steps. This is like opposite-end pointers: each move eliminates parts of the street they no longer need to consider. Now think of a bus with doors at the front and back. Passengers get on at the front (right pointer) and off at the back (left pointer). The bus represents a sliding window: as new passengers board, the bus gets crowded (sum grows); if it gets too crowded (constraint violated), people exit from the back until it’s comfortable again. The window size adapts dynamically. Finally, merging two sorted lines is like two checkout queues that are each in order; you always pick the shorter next person from the front of either queue to form a single orderly line—you never look back. In all these stories, the key is that movement is one-directional and justified: when you step forward, you are certain you won’t need to go back to reconsider old ground. That certainty comes from structure: sorted order ensures that moving the smaller pointer can reveal new, potentially better combinations; non-negative numbers ensure that shrinking a window always decreases the sum; and careful invariants ensure that whatever you skip cannot contain a better answer than what you can still reach. Once you feel this one-way movement, two pointers becomes a natural reflex.
Formal Definition
A two-pointer algorithm maintains two indices l and r over a sequence together with an invariant I(l, r), for example "the window a[l..r] satisfies the constraint" or "every pair already skipped cannot contain a solution." At every step exactly one pointer moves in a fixed direction, and the move is chosen so that the invariant is preserved and no candidate answer is lost. Since each pointer can advance at most n times, the loop terminates after O(n) steps.
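The sketch below is not part of the original lesson; it is a minimal illustration of an invariant-driven sliding window, assuming non-negative input. It finds the longest window whose sum stays at or below a limit; the function name longestWindowAtMost and the sample data are chosen for illustration only.

#include <bits/stdc++.h>
using namespace std;

// Longest contiguous window whose sum stays <= limit (assumes non-negative elements).
// With limit >= 0, after the inner while: sum == a[l] + ... + a[r] and sum <= limit.
int longestWindowAtMost(const vector<int>& a, long long limit) {
    long long sum = 0;
    int best = 0, l = 0;
    for (int r = 0; r < (int)a.size(); ++r) {
        sum += a[r];                       // expand: may break the invariant
        while (l <= r && sum > limit) {    // restore it by shrinking from the left
            sum -= a[l];
            ++l;
        }
        if (sum <= limit) best = max(best, r - l + 1);  // window a[l..r] is valid here
    }
    return best;
}

int main() {
    vector<int> a = {3, 1, 2, 7, 4, 2, 1, 1, 5};
    cout << longestWindowAtMost(a, 8) << "\n"; // prints 4 (window {4, 2, 1, 1})
    return 0;
}

Every time best is updated, the window a[l..r] satisfies the constraint, and both pointers only ever move right, so the whole scan is linear.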
When to Use
Use two pointers when you see: (1) a sorted array (or two sorted arrays) and you need to find pairs or compute a function depending on two positions—opposite-end pointers often reduce O(n^2) to O(n); (2) a need to maintain a contiguous subarray property—sliding windows are ideal when the property changes monotonically with window expansion or contraction, such as sums with non-negative numbers, or character frequencies bounded by a limit; (3) merging or linear-time joining of pre-sorted streams—classic in merge sort, k-way merges (extended with a heap), or external sorting; (4) problems that require maximizing or minimizing a function of distance or endpoints, such as the container-with-most-water problem; (5) two sequences to be scanned exactly once, like deduplication in sorted arrays, two-sum on sorted input, or intersecting sets. Be cautious when key assumptions do not hold: sliding window with negative numbers may break monotonicity, and opposite-end pointers without sortedness or structural order may miss solutions. In such cases, consider alternative tools like hashing (for unsorted two-sum), prefix sums with data structures (for subarray sums with negatives), or binary search on answer combined with a monotone check function.
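One of the use cases above, deduplication of a sorted array, is not covered by the examples later on this page, so here is a small sketch using two same-direction pointers; the helper name dedupSorted and the test values are ours, not part of the original material.

#include <bits/stdc++.h>
using namespace std;

// Remove duplicates from a sorted vector in place and return the new length.
// 'write' marks the end of the deduplicated prefix; 'read' scans every element once.
int dedupSorted(vector<int>& a) {
    if (a.empty()) return 0;
    int write = 1;                       // a[0..write-1] holds the unique prefix
    for (int read = 1; read < (int)a.size(); ++read) {
        if (a[read] != a[write - 1]) {   // new value: keep it
            a[write++] = a[read];
        }
    }
    a.resize(write);
    return write;
}

int main() {
    vector<int> a = {1, 1, 2, 2, 2, 3, 5, 5};
    int n = dedupSorted(a);
    cout << n << " unique values:";
    for (int x : a) cout << ' ' << x;
    cout << "\n"; // 4 unique values: 1 2 3 5
    return 0;
}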
⚠️ Common Mistakes
Common pitfalls include: (1) Misapplying sliding window to arrays with negative numbers and assuming the sum behaves monotonically. With negatives, expanding the window may decrease the sum, breaking the logic; use prefix sums with data structures or different techniques. (2) Using opposite-end pointers on unsorted data. Without sortedness or a comparable structural guarantee, moving a pointer can skip valid pairs. Either sort first (changing indices) or use a hash map to preserve indices. (3) Off-by-one errors in window boundaries, such as forgetting that r is inclusive in A[l..r] or incorrectly updating l before recording an answer while maintaining the invariant. (4) Not updating the best answer at the right time—e.g., taking the answer before shrinking the window to minimal size when seeking a minimum-length subarray. (5) Overflow when computing products of distances and values (e.g., container area) if inputs can be large; use wider integer types like 64-bit. (6) Forgetting to progress pointers in all branches, causing infinite loops. (7) Failing to prove or articulate the invariant, leading to ad hoc pointer moves that seem plausible but break correctness on edge cases. To avoid these, always state the invariant, verify that each move preserves it and reduces the search space, and test edge cases like empty arrays, single elements, all equal values, or very large numbers.
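As a sketch of the alternative named in pitfall (2), the hash-map version of two-sum works on unsorted input and preserves original indices; the function name twoSumUnsorted and the sample values are assumptions for illustration.

#include <bits/stdc++.h>
using namespace std;

// Two-sum on unsorted input: a hash map keeps original indices, so no sorting is needed.
pair<int,int> twoSumUnsorted(const vector<int>& a, long long target) {
    unordered_map<long long, int> seen;  // value -> index where it was first seen
    for (int i = 0; i < (int)a.size(); ++i) {
        long long need = target - a[i];
        auto it = seen.find(need);
        if (it != seen.end()) return {it->second, i};
        seen[a[i]] = i;
    }
    return {-1, -1};
}

int main() {
    vector<int> a = {4, -2, 10, 1, 0};
    auto ans = twoSumUnsorted(a, 8);
    if (ans.first != -1)
        cout << "Indices " << ans.first << " and " << ans.second << "\n"; // 1 and 2 (-2 + 10)
    else
        cout << "No pair\n";
    return 0;
}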
Key Formulas
Prefix Sum Definition
prefix[0] = 0; prefix[i] = a[0] + a[1] + ... + a[i-1] for i >= 1
Explanation: The prefix sum at i equals the sum of the first i elements. It lets you compute any subarray sum quickly as prefix[r] - prefix[l].
Range Sum via Prefix
sum(a[l..r]) = prefix[r+1] - prefix[l]
Explanation: The sum of a contiguous window from l to r can be obtained from two prefix sums. This underpins many sliding-window and range queries; a short code sketch appears at the end of this section.
Linear-Time Bounds
T(n) = O(n) for a single scan; merging arrays of lengths n and m takes O(n + m)
Explanation: Two pointers typically visit each element a constant number of times, yielding linear time in the sizes of the inputs. Merging two arrays takes time proportional to their combined length.
Amortized Pointer Movement
moves(l) + moves(r) <= 2n, so total pointer updates are O(n)
Explanation: Each pointer advances monotonically and cannot pass the array end, so the total number of pointer updates is at most linear. This proves O(n) runtime for many patterns.
Container Area
area(l, r) = min(h[l], h[r]) * (r - l)
Explanation: The area between two lines at positions l and r is the shorter height times the horizontal distance. Moving the shorter side can potentially increase the minimum height.
Two-Sum Decision Rule (Sorted)
if a[l] + a[r] < target: l = l + 1; if a[l] + a[r] > target: r = r - 1; otherwise report (l, r)
Explanation: In a sorted array, increasing l raises the sum and decreasing r lowers it. This rule preserves all potential solutions while discarding impossible ranges.
Sliding Window Monotonicity
For a[i] >= 0: sum(a[l..r+1]) >= sum(a[l..r]) and sum(a[l+1..r]) <= sum(a[l..r])
Explanation: With non-negative elements, expanding the right end never decreases the sum and shrinking the left end never increases it. This ensures a simple expand/shrink strategy works.
Counting Visits
total work = sum over all indices of (visits to that index) <= c * n for a constant c, hence O(n)
Explanation: Each index is processed a constant number of times in two-pointer scans, so the total work is linear. This basic counting argument formalizes the intuition.
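To connect the prefix-sum identities above to code, here is a brief sketch; the half-open prefix convention is used and the sample array is arbitrary.

#include <bits/stdc++.h>
using namespace std;

int main() {
    vector<int> a = {2, -1, 3, 4, -2};

    // prefix[i] = a[0] + ... + a[i-1], with prefix[0] = 0
    vector<long long> prefix(a.size() + 1, 0);
    for (size_t i = 0; i < a.size(); ++i) prefix[i + 1] = prefix[i] + a[i];

    // Sum of the window a[l..r] (inclusive) via two prefix values
    int l = 1, r = 3;
    long long windowSum = prefix[r + 1] - prefix[l];
    cout << windowSum << "\n"; // -1 + 3 + 4 = 6
    return 0;
}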
Complexity Analysis
All of the patterns above run in linear time: opposite-end scans and sliding windows visit each index a constant number of times, giving O(n), and merging two sorted arrays of lengths n and m takes O(n + m). Extra space is O(1) beyond the output, since only a few indices and running aggregates need to be stored.
Code Examples
#include <bits/stdc++.h>
using namespace std;

// Find indices (0-based) of two numbers that sum to target in a sorted array.
// Returns {-1, -1} if no such pair exists.
pair<int,int> twoSumSorted(const vector<int>& a, int target) {
    int l = 0, r = (int)a.size() - 1;
    while (l < r) {
        long long sum = (long long)a[l] + a[r]; // use 64-bit to avoid overflow on large ints
        if (sum == target) return {l, r};
        if (sum < target) {
            // Sum too small: only moving l right can increase the sum in a sorted array
            ++l;
        } else {
            // Sum too large: only moving r left can decrease the sum
            --r;
        }
    }
    return {-1, -1};
}

int main() {
    ios::sync_with_stdio(false);
    cin.tie(nullptr);

    vector<int> a = {-5, -2, 0, 1, 2, 4, 10};
    int target = 8;
    auto ans = twoSumSorted(a, target);
    if (ans.first != -1) {
        cout << "Found at indices: " << ans.first << ", " << ans.second << " (values: "
             << a[ans.first] << ", " << a[ans.second] << ")\n";
    } else {
        cout << "No pair sums to target\n";
    }
    return 0;
}
We maintain two pointers at the extremes. If the sum is too small, we move the left pointer right to increase it; if too large, we move the right pointer left to decrease it. Sortedness guarantees we never skip a valid pair.
#include <bits/stdc++.h>
using namespace std;

// Returns the length of the shortest contiguous subarray with sum >= S in a non-negative array.
// If no such subarray exists, returns 0.
int minLenAtLeastS(const vector<int>& a, long long S) {
    int n = (int)a.size();
    int best = INT_MAX;
    long long sum = 0; // running window sum
    int l = 0;         // left boundary of the window (inclusive)

    for (int r = 0; r < n; ++r) {
        sum += a[r]; // expand window to the right
        // While feasible, shrink from the left to make it as tight as possible
        while (l <= r && sum - a[l] >= S) {
            sum -= a[l];
            ++l;
        }
        if (sum >= S) {
            best = min(best, r - l + 1);
        }
    }
    return best == INT_MAX ? 0 : best;
}

int main() {
    ios::sync_with_stdio(false);
    cin.tie(nullptr);

    vector<int> a = {2, 3, 1, 2, 4, 3};
    long long S = 7;
    int ans = minLenAtLeastS(a, S);
    cout << "Minimum length with sum >= " << S << " is: " << ans << "\n";
    return 0;
}
We keep a window [l..r] and a running sum. Each time the sum is at least S, we try to contract from the left to get the shortest feasible window. Non-negativity ensures that expanding never decreases the sum and contracting never increases it, which makes the logic correct.
#include <bits/stdc++.h>
using namespace std;

vector<int> mergeSorted(const vector<int>& a, const vector<int>& b) {
    vector<int> c;
    c.reserve(a.size() + b.size());
    size_t i = 0, j = 0;
    while (i < a.size() && j < b.size()) {
        // For stability, take from 'a' first when equal
        if (a[i] <= b[j]) c.push_back(a[i++]);
        else c.push_back(b[j++]);
    }
    while (i < a.size()) c.push_back(a[i++]);
    while (j < b.size()) c.push_back(b[j++]);
    return c;
}

int main() {
    ios::sync_with_stdio(false);
    cin.tie(nullptr);

    vector<int> a = {1, 3, 5, 7};
    vector<int> b = {2, 2, 6, 8, 9};
    vector<int> c = mergeSorted(a, b);
    for (int x : c) cout << x << ' ';
    cout << "\n";
    return 0;
}
Two pointers i and j scan the sorted arrays. We repeatedly append the smaller of a[i] and b[j], advancing only that pointer. Each element is consumed exactly once, producing a linear-time merge.
#include <bits/stdc++.h>
using namespace std;

long long maxArea(const vector<int>& h) {
    int l = 0, r = (int)h.size() - 1;
    long long best = 0;
    while (l < r) {
        long long height = min(h[l], h[r]);
        long long width = r - l;
        best = max(best, height * width);
        // Move the pointer at the shorter line to try to find a taller one
        if (h[l] < h[r]) ++l; else --r;
    }
    return best;
}

int main() {
    ios::sync_with_stdio(false);
    cin.tie(nullptr);

    vector<int> h = {1,8,6,2,5,4,8,3,7};
    cout << maxArea(h) << "\n"; // Expected 49
    return 0;
}
We start with the widest container and compute area using the shorter height. Moving the taller side cannot increase the minimum height, so we always move the shorter side inward. This prunes the search while preserving optimality.