⚙️ Algorithm · Intermediate

Tree DP - Rerooting Technique

Key Points

  • Rerooting (a.k.a. 换根 DP, literally "change-root DP") computes a per-node answer as if each node were the root, in O(n) total time on trees.
  • It works by separating each child’s contribution so we can remove one child and add a parent-side contribution efficiently.
  • The pattern uses two traversals: a down-DFS to compute subtree DP and an up-DFS to feed parent-side information to children.
  • A clean way to reason is with a monoid: an associative merge, an identity, and a lift that transforms a child’s DP across an edge.
  • Prefix and suffix merges let us compute the answer for a child without its own contribution in O(1) per child after O(deg) preprocessing.
  • Classic applications include sum of distances for every node, node eccentricities, and per-root heights or products.
  • Care is needed to pick the right identity and to apply the edge transformation (e.g., +1 for depth) exactly once.
  • The technique extends to weighted edges by changing the lift function to add the edge weight.

Prerequisites

  • Tree basics (nodes, edges, degree, leaves): understanding the structure being processed and properties like the number of edges being n − 1.
  • Depth-first search (DFS): both passes are DFS traversals over the tree.
  • Dynamic programming on trees: rerooting builds on combining child results to get parent results.
  • Associativity and identity (monoid): correctness and efficiency rely on an associative merge and a proper identity element.
  • Prefix/suffix aggregates: used to exclude a single child's contribution efficiently.
  • Big-O complexity: to reason about O(n) total time across degrees and edges.
  • Integer overflow and 64-bit arithmetic: needed to prevent errors when sums can be Θ(n^2).

Detailed Explanation


01 Overview

Rerooting (Tree DP rerooting) is an algorithmic technique to compute, for every vertex v of a tree, the value of some function that is typically defined on a rooted tree when v is considered as the root. A naive approach would recompute the DP from scratch for each root in O(n^2), which is too slow. Rerooting reduces the total cost to O(n) by cleverly reusing partial results. The process has two phases. First, a bottom-up DFS (down DP) computes a DP value for each node from its children, assuming a fixed root (often 0 or 1). Second, a top-down DFS (up DP) propagates the contribution from the parent side to each child. The key trick is to decompose the answer at a node into independent contributions of its neighbors. Using prefix and suffix merges, we can recompute a child’s view that excludes the child’s own contribution and includes everything else, then pass that information to the child as if it were coming from its parent. This method is widely applicable whenever the problem’s combination of child contributions is associative and has an identity element, allowing efficient removal and addition of contributions. Examples include computing the height for each possible root, the sum of distances from each node to all others, or the maximum distance (eccentricity) per node.
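
For contrast, here is a minimal sketch of the naive O(n^2) baseline that rerooting replaces (my own illustration, assuming the same 0-indexed edge-list input used by the code examples later in this article): it recomputes the sum of distances with a separate BFS from every root.

#include <bits/stdc++.h>
using namespace std;

// Naive per-root computation: one BFS per root, O(n^2) total.
int main() {
    int n;
    if (!(cin >> n)) return 0;
    vector<vector<int>> g(n);
    for (int i = 0; i < n - 1; ++i) {
        int u, v; cin >> u >> v;          // 0-indexed edges
        g[u].push_back(v);
        g[v].push_back(u);
    }
    for (int root = 0; root < n; ++root) {
        vector<int> dist(n, -1);
        queue<int> q;
        dist[root] = 0; q.push(root);
        long long sum = 0;                 // sum of distances from this root
        while (!q.empty()) {
            int v = q.front(); q.pop();
            sum += dist[v];
            for (int u : g[v]) if (dist[u] == -1) {
                dist[u] = dist[v] + 1;
                q.push(u);
            }
        }
        cout << sum << (root + 1 == n ? '\n' : ' ');
    }
    return 0;
}

Rerooting produces the same n answers with only two traversals instead of n.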

02 Intuition & Analogies

Imagine you and a group of friends standing at the nodes of a tree-shaped network of paths. For some tasks, like asking “how far is everyone from the leader?” the answer depends on who the leader (root) is. If you change the leader, answers change. Recomputing everything from zero for each possible leader would be like having everyone rewalk the entire network each time—slow and repetitive. Rerooting is like preparing a shared memo. First, each person gathers information from people beneath them if we pick an arbitrary leader (say, person 0). That’s the down-DFS: each node collects and merges the reports from its children. Next, to let a child see the world with itself as leader, the parent takes all the information it has (including from other children and from above) and hands the child a “parent-side summary” that excludes the child’s own impact. With this summary, the child now has everything it needs to become the leader in its own perspective, without anyone repeating work. The secret sauce is that each neighbor’s contribution can be viewed independently and then merged with others. If the way we combine contributions is associative (like sum, max, product) and there’s an identity (like 0 for sum, 1 for product), then we can peel off one neighbor’s contribution, replace it by the parent-side contribution, and pass it down efficiently. Prefix/suffix arrays are just a neat way to “remove a single ingredient from a mixed batter” without remaking the batter from scratch every time.
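
The "remove a single ingredient" idea can be seen in isolation before applying it to trees. Below is a small self-contained sketch (the helper name and sample array are my own, not from the article): for each position, combine everything except that position using prefix and suffix aggregates.

#include <bits/stdc++.h>
using namespace std;

// For each index i, compute the max of all elements except a[i], using
// prefix/suffix aggregates -- the same trick rerooting uses to exclude
// one child's contribution.
vector<int> max_excluding_each(const vector<int>& a) {
    int k = (int)a.size();
    const int ID = INT_MIN;                  // identity for max (elements may be negative here)
    vector<int> pref(k + 1, ID), suff(k + 1, ID), res(k);
    for (int i = 0; i < k; ++i) pref[i + 1] = max(pref[i], a[i]);
    for (int i = k - 1; i >= 0; --i) suff[i] = max(a[i], suff[i + 1]);
    for (int i = 0; i < k; ++i) res[i] = max(pref[i], suff[i + 1]);
    return res;
}

int main() {
    vector<int> a = {3, 7, 2, 5};
    for (int x : max_excluding_each(a)) cout << x << ' ';  // prints: 7 5 7 7
    cout << '\n';
    return 0;
}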

03 Formal Definition

Let G = (V, E) be a tree and fix an arbitrary root r. Suppose each node v has children ch(v) in the rooted tree. Many tree-DP tasks admit a form \[ dp[v] = \operatorname{add\_root}\Big( \bigoplus_{u \in ch(v)} \operatorname{lift}(dp[u]) \Big), \] where \(\oplus\) is an associative binary operator with identity element \(\mathbf{e}\), lift applies an edge-dependent transformation (e.g., +1 for distance, +w for weighted edges), and add_root optionally adjusts the merged value at v. In the second traversal, for each edge (v, u), we compute a value from[u] that represents all contributions to u coming from outside u’s subtree when rooted at r. Using prefix/suffix merges over the multiset \(\{\operatorname{lift}(dp[w]) : w \in ch(v),\ w \ne u\}\) and the parent’s contribution to v, we get \[ \operatorname{from}[u] = \operatorname{lift}\Big( \operatorname{from}[v] \oplus \bigoplus_{w \in ch(v),\ w \ne u} \operatorname{lift}(dp[w]) \Big), \] where each term is the contribution of a neighbor w of v (child side) or of the parent side, lifted through edge (v, u). Finally, the answer at each v is \[ ans[v] = \operatorname{add\_root}\Big( \operatorname{from}[v] \oplus \bigoplus_{u \in ch(v)} \operatorname{lift}(dp[u]) \Big). \] These equations imply an O(n) solution since each edge contributes O(1) merges amortized when prefix/suffix arrays are used per node.
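
For concreteness, here is the instantiation used by the height-per-root example later in this article (a worked special case of the scheme above):
\[
T = \mathbb{Z}_{\ge 0}, \qquad x \oplus y = \max(x, y), \qquad \mathbf{e} = 0, \qquad \operatorname{lift}(x) = x + 1, \qquad \operatorname{add\_root}(x) = x,
\]
which gives \( dp[v] = \max_{u \in ch(v)} \big( dp[u] + 1 \big) \) (0 for a leaf) and \( ans[v] \) equal to the height of the tree when rooted at v.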

04 When to Use

Use rerooting when you need a per-node answer that is naturally defined on a rooted tree, but the designated root varies over all nodes. Typical signals that rerooting fits: the DP combines children with an associative operator (sum, max, min, product modulo M), has a clear identity, and transforms values across edges in a uniform way (e.g., +1 depth). Classic use cases include: (1) computing for every node v the sum of distances to all other nodes; (2) computing the height (maximum depth) of the tree if v is the root; (3) computing each node’s eccentricity (largest distance to any node); (4) counting structures where each child’s contribution multiplies (e.g., ways to assign labels subject to local constraints) and you want the count for every choice of root; (5) weighted variants where distances or costs accumulate along edges. Avoid rerooting if the combination is non-associative, requires global constraints that cannot be decomposed per neighbor, or if the state per edge is too large to pass efficiently. In those cases, centroid decomposition, heavy-light decomposition, or offline tricks may be more suitable.
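
To make use case (5) concrete, here is a hedged sketch of a weighted variant (my own adaptation of the eccentricity example shown later, not code from the original): the only structural change is that crossing an edge adds its weight w instead of 1.

#include <bits/stdc++.h>
using namespace std;

// Sketch: per-root maximum weighted distance. The lift becomes "+w per edge".
int main() {
    int n;
    if (!(cin >> n)) return 0;
    vector<vector<pair<int,long long>>> g(n);        // (neighbor, weight)
    for (int i = 0; i < n - 1; ++i) {
        int u, v; long long w; cin >> u >> v >> w;   // 0-indexed, weighted edges
        g[u].push_back({v, w});
        g[v].push_back({u, w});
    }
    vector<long long> down(n, 0), up(n, 0);
    function<void(int,int)> dfs1 = [&](int v, int p) {
        for (auto [u, w] : g[v]) if (u != p) {
            dfs1(u, v);
            down[v] = max(down[v], down[u] + w);     // lift adds the edge weight
        }
    };
    function<void(int,int)> dfs2 = [&](int v, int p) {
        long long best1 = -1, best2 = -1;            // top two child contributions at v
        for (auto [u, w] : g[v]) if (u != p) {
            long long cand = down[u] + w;
            if (cand > best1) { best2 = best1; best1 = cand; }
            else if (cand > best2) best2 = cand;
        }
        for (auto [u, w] : g[v]) if (u != p) {
            long long sib = (down[u] + w == best1 ? best2 : best1);  // best sibling path via v
            up[u] = max(up[v] + w, (sib < 0 ? 0 : sib) + w);
            dfs2(u, v);
        }
    };
    dfs1(0, -1);
    dfs2(0, -1);
    for (int v = 0; v < n; ++v)
        cout << max(down[v], up[v]) << (v + 1 == n ? '\n' : ' ');
    return 0;
}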

⚠️ Common Mistakes

  1. Wrong identity element: 0 is fine for max over nonnegative depths, but for minima you need +∞ and for products you need 1. Pick \(\mathbf{e}\) consistent with \(\oplus\).
  2. Double or missing edge lifting: For depth-like problems, each edge should contribute +1 exactly once. Applying +1 both during down and up passes can inflate answers; skipping it can undercount.
  3. Forgetting to exclude the child: When computing the parent-side contribution for a child, you must remove the child’s own contribution (hence prefix/suffix). Otherwise, you count that child twice.
  4. Overflow: Sums like total distances are \(\Theta(n^2)\). Use 64-bit integers (long long) and modulo arithmetic when needed.
  5. Recursion depth: Trees can be chains of length n. DFS recursion can stack overflow. Use iterative DFS or increase the stack size for very deep trees; see the iterative down-pass sketch after this list.
  6. Building expensive temporaries: If the per-node DP object is large or non-trivially copyable, excessive copying in prefix/suffix loops can TLE. Prefer simple types or references, or reserve capacity and reuse buffers.
  7. Mixing 0-based/1-based indices: Be consistent in input parsing and printing to avoid off-by-one mistakes.
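
Regarding mistake 5, here is a minimal sketch of an iterative down pass (subtree sizes and distance sums, mirroring the second code example below) that avoids deep recursion; the stack-based ordering is my own choice, not from the original.

#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    if (!(cin >> n)) return 0;
    vector<vector<int>> g(n);
    for (int i = 0; i < n - 1; ++i) {
        int u, v; cin >> u >> v;                // 0-indexed edges
        g[u].push_back(v);
        g[v].push_back(u);
    }
    vector<int> parent(n, -1), order;           // order = vertices in DFS preorder
    vector<char> seen(n, 0);
    vector<int> st = {0};
    seen[0] = 1;
    while (!st.empty()) {
        int v = st.back(); st.pop_back();
        order.push_back(v);
        for (int u : g[v]) if (!seen[u]) {
            seen[u] = 1;
            parent[u] = v;
            st.push_back(u);
        }
    }
    vector<long long> sz(n, 1), dp(n, 0);
    for (int i = n - 1; i >= 0; --i) {          // reverse preorder: children before parents
        int v = order[i], p = parent[v];
        if (p >= 0) {
            sz[p] += sz[v];
            dp[p] += dp[v] + sz[v];             // same recurrence as the recursive down pass
        }
    }
    cout << dp[0] << '\n';                      // sum of distances from vertex 0
    return 0;
}

The up pass can be run iteratively in the same spirit by walking `order` forward (parents before children).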

Key Formulas

Generic subtree DP

\( dp[v] = \operatorname{add\_root}\big( \bigoplus_{u \in ch(v)} \operatorname{lift}(dp[u]) \big) \)

Explanation: Parent v’s DP is the merge of lifted child DPs, possibly with an extra adjustment at the root. This covers sums, maxima, products, etc.

Answer from neighbor contributions

\( ans[v] = \operatorname{add\_root}\big( \bigoplus_{u \in N(v)} c_{v,u} \big) \)

Explanation: Each neighbor u contributes independently to v; merging them all yields the final answer at v. Here \(c_{v,u}\) is the lifted child-side contribution \(\operatorname{lift}(dp[u])\) or the parent-side contribution \(\operatorname{from}[v]\).

Parent-to-child transfer

\( \operatorname{from}[u] = \operatorname{lift}\big( \operatorname{from}[v] \oplus \bigoplus_{w \in ch(v),\, w \ne u} \operatorname{lift}(dp[w]) \big) \)

Explanation: To send information from v to its child u, remove u’s own part and merge everything else, then transform across edge (v,u).

Subtree size

\( sz[v] = 1 + \sum_{u \in ch(v)} sz[u] \)

Explanation: The size of v’s subtree equals one for v plus the sizes of all child subtrees. This is a standard downward DP.

Sum of distances (down pass)

\( dp[v] = \sum_{u \in ch(v)} \big( dp[u] + sz[u] \big) \)

Explanation: Each child subtree contributes its internal sum plus one extra per node in that subtree (distance increases by 1).

Reroot transition for distance sums

\( ans[u] = ans[v] - sz[u] + (n - sz[u]) \)

Explanation: Moving the root from v to child u makes sz[u] nodes one step closer and the other n − sz[u] nodes one step farther, updating the total distance sum.

Eccentricity transitions

\( down[v] = \max_{u \in ch(v)} \big( down[u] + 1 \big), \qquad up[u] = \max\big( up[v] + 1,\ \max_{w \in ch(v),\, w \ne u} ( down[w] + 2 ) \big), \qquad ecc[v] = \max(down[v], up[v]) \)

Explanation: down[v] is the deepest depth in v’s subtree (0 for a leaf); up[u] accumulates the best path going through v via the parent or via a sibling of u (up is 0 at the root). The eccentricity is max(down[v], up[v]).

Linear time over trees

\( \sum_{v \in V} \deg(v) = 2|E| = 2(n - 1) = O(n) \)

Explanation: Processing each node’s neighbors once (plus prefix/suffix work of size deg(v)) yields total linear time since the sum of degrees in a tree is 2(n−1).

Complexity Analysis

Rerooting on an unweighted tree runs in linear time O(n). The first DFS (down DP) visits each node once and processes each edge in both directions, leading to O(n). The second DFS (up DP) again touches each edge once and, at each node v, builds prefix and suffix merges over its neighbors, costing O(deg(v)). Summed over all nodes, this is O(∑ deg(v)) = O(n). For problems with constant-time merge and lift operations, the overall time remains O(n). If merge or lift are more expensive (e.g., small fixed-size structs), the constant factor grows correspondingly. Space complexity is O(n) for adjacency lists plus O(n) for DP arrays (down values, up values or answers). Temporary prefix/suffix arrays take O(deg(v)) per node and O(n) in aggregate; in a recursive up-DFS the arrays alive along the current root-to-node path can also total O(n) in the worst case, so auxiliary memory stays linear overall. The recursion stack in recursive DFS has depth equal to the tree height; in the worst-case chain this is O(n), which may require iterative DFS or stack size adjustments in practice. For weighted variants, the complexity remains O(n) provided lift and merge are O(1). Careful attention to integer width is required: some answers (e.g., sum of distances) can be Θ(n^2), so 64-bit integers are necessary to avoid overflow. Overall, rerooting offers an asymptotically optimal one-pass-per-edge approach for per-root answers on trees.

Code Examples

Generic rerooting template (monoid) to compute height for each root
#include <bits/stdc++.h>
using namespace std;

// Monoid for height (max depth) when rooted at each node
// Operation: merge = max, identity = 0, lift(x) = x + 1 across an edge
struct MonoidHeight {
    using T = int;
    static T id() { return 0; }                                   // identity for max over nonnegative depths
    static T merge(const T &a, const T &b) { return max(a, b); }
    static T lift(const T &child) { return child + 1; }           // cross one edge
    static T finalize(const T &acc) { return acc; }               // no extra adjustment at root
};

template<class M>
struct Reroot {
    using T = typename M::T;
    int n;
    vector<vector<int>> g;
    vector<T> dp, ans; // dp[v]: down value; ans[v]: final answer considering all neighbors

    Reroot(int n) : n(n), g(n), dp(n, M::id()), ans(n, M::id()) {}

    void add_edge(int u, int v) {
        g[u].push_back(v);
        g[v].push_back(u);
    }

    void dfs1(int v, int p) {
        T acc = M::id();
        for (int u : g[v]) if (u != p) {
            dfs1(u, v);
            acc = M::merge(acc, M::lift(dp[u]));
        }
        dp[v] = acc;
    }

    void dfs2(int v, int p, T from_parent) {
        // Collect child contributions (lifted) to v
        vector<int> kids; kids.reserve(g[v].size());
        vector<T> contribs; contribs.reserve(g[v].size());
        for (int u : g[v]) if (u != p) {
            kids.push_back(u);
            contribs.push_back(M::lift(dp[u]));
        }

        // Answer at v merges all child contributions and the parent-side contribution
        T total = from_parent;
        for (const T &c : contribs) total = M::merge(total, c);
        ans[v] = M::finalize(total);

        int k = (int)contribs.size();
        vector<T> pref(k + 1, M::id()), suff(k + 1, M::id());
        for (int i = 0; i < k; ++i) pref[i + 1] = M::merge(pref[i], contribs[i]);
        for (int i = k - 1; i >= 0; --i) suff[i] = M::merge(contribs[i], suff[i + 1]);

        for (int i = 0; i < k; ++i) {
            // Exclude i-th child and include from_parent, then lift across edge (v -> child)
            T without_i = M::merge(pref[i], suff[i + 1]);
            T pass = M::lift(M::merge(without_i, from_parent));
            dfs2(kids[i], v, pass);
        }
    }

    vector<T> solve(int root = 0) {
        dfs1(root, -1);
        dfs2(root, -1, M::id());
        return ans;
    }
};

int main() {
    ios::sync_with_stdio(false);
    cin.tie(nullptr);

    int n;
    if (!(cin >> n)) return 0;
    Reroot<MonoidHeight> rr(n);
    for (int i = 0; i < n - 1; ++i) {
        int u, v; cin >> u >> v; // 0-indexed input assumed; adjust if 1-indexed
        rr.add_edge(u, v);
    }
    vector<int> height = rr.solve(0);

    // Output: height[v] = maximum depth when v is root
    for (int v = 0; v < n; ++v) {
        cout << height[v] << (v + 1 == n ? '\n' : ' ');
    }
    return 0;
}

This is a reusable rerooting template specialized with a max monoid to compute, for every node v, the height (max depth to a leaf) when v is considered the root. The first DFS computes dp[v] by taking the max of lifted child results (depth + 1). The second DFS uses prefix/suffix to form the parent-side contribution for each child in O(1) per child and sets ans[v] as the merge of all neighbor contributions. Changing the monoid can solve many other rerooting problems.

Time: O(n) · Space: O(n)
Sum of distances from every node to all others (classic rerooting)
#include <bits/stdc++.h>
using namespace std;

// Compute, for each node v, S[v] = sum of distances from v to all other nodes.
// Two-pass rerooting: down pass computes subtree sizes and partial sums; up pass reroots.

int main() {
    ios::sync_with_stdio(false);
    cin.tie(nullptr);

    int n;
    if (!(cin >> n)) return 0;
    vector<vector<int>> g(n);
    for (int i = 0; i < n - 1; ++i) {
        int u, v; cin >> u >> v; // 0-indexed assumed
        g[u].push_back(v);
        g[v].push_back(u);
    }

    vector<int> sz(n, 0);
    vector<long long> dp(n, 0);  // dp[v]: sum of distances from v to nodes in its subtree (rooted at 0)
    vector<long long> ans(n, 0); // final answer for each v

    // Downward DFS: compute sz[] and dp[] w.r.t. root 0
    function<void(int,int)> dfs1 = [&](int v, int p) {
        sz[v] = 1;
        dp[v] = 0;
        for (int u : g[v]) if (u != p) {
            dfs1(u, v);
            sz[v] += sz[u];
            dp[v] += dp[u] + sz[u]; // each node in u's subtree is 1 farther from v than from u
        }
    };

    // Upward DFS: propagate answers using the reroot formula
    function<void(int,int)> dfs2 = [&](int v, int p) {
        for (int u : g[v]) if (u != p) {
            // Move root from v to u:
            //   Nodes in u's subtree get 1 closer: -sz[u]
            //   All other nodes (n - sz[u]) get 1 farther: +(n - sz[u])
            ans[u] = ans[v] - sz[u] + (n - sz[u]);
            dfs2(u, v);
        }
    };

    // Initialize from root 0
    dfs1(0, -1);
    ans[0] = dp[0];
    dfs2(0, -1);

    for (int v = 0; v < n; ++v) {
        cout << ans[v] << (v + 1 == n ? '\n' : ' ');
    }
    return 0;
}

First DFS computes subtree sizes sz[v] and dp[v] = sum of distances from v to nodes in its subtree when rooted at 0. For the root, ans[0] = dp[0] is the sum to all nodes. In the second DFS, the reroot transition ans[u] = ans[v] − sz[u] + (n − sz[u]) adjusts the total when moving the root from v to child u: sz[u] nodes become one edge closer, and the remaining nodes become one edge farther. This yields all answers in linear time.
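
As a quick sanity check (an illustrative example of mine, not from the original), take the path 0–1–2 rooted at 0:
\[
sz = [3, 2, 1], \qquad dp[2] = 0,\quad dp[1] = dp[2] + sz[2] = 1,\quad dp[0] = dp[1] + sz[1] = 3,
\]
\[
ans[0] = 3, \qquad ans[1] = 3 - 2 + (3 - 2) = 2, \qquad ans[2] = 2 - 1 + (3 - 1) = 3.
\]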

Time: O(n) · Space: O(n)
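The same answers can also be produced by the generic template from the first example by swapping in a pair monoid. The sketch below is my own instantiation (it assumes the Reroot template and headers above are in scope, so it is not a standalone program):

// Sum of distances as a monoid for the generic Reroot template above.
// T bundles (sum of distances, number of nodes) contributed by one side of an edge.
struct MonoidDistSum {
    using T = pair<long long, long long>;          // {sum, count}
    static T id() { return {0, 0}; }
    static T merge(const T &a, const T &b) { return {a.first + b.first, a.second + b.second}; }
    // Crossing one edge: every counted node moves 1 farther, and the near endpoint joins the count.
    static T lift(const T &c) { return {c.first + c.second + 1, c.second + 1}; }
    static T finalize(const T &acc) { return acc; } // the answer for v is acc.first
};
// Usage sketch: Reroot<MonoidDistSum> rr(n); ... ; auto res = rr.solve(0); res[v].first == sum of distances from v.

The count component is what lets lift add "one more edge per node on that side", which is exactly the sz[u] bookkeeping done explicitly in the dedicated implementation above.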
Eccentricity of every node via down/up depths (two-pass rerooting)
#include <bits/stdc++.h>
using namespace std;

// Eccentricity[v] = max distance from v to any node.
// Compute down[v] = max depth within v's subtree (rooted at 0),
// then up[v] = best path going through parent/siblings; eccentricity[v] = max(down[v], up[v]).

int main() {
    ios::sync_with_stdio(false);
    cin.tie(nullptr);

    int n;
    if (!(cin >> n)) return 0;
    vector<vector<int>> g(n);
    for (int i = 0; i < n - 1; ++i) {
        int u, v; cin >> u >> v; // 0-indexed assumed
        g[u].push_back(v);
        g[v].push_back(u);
    }

    vector<int> down(n, 0), up(n, 0), ecc(n, 0);

    function<void(int,int)> dfs1 = [&](int v, int p) {
        down[v] = 0; // leaf => depth 0
        for (int u : g[v]) if (u != p) {
            dfs1(u, v);
            down[v] = max(down[v], down[u] + 1);
        }
    };

    function<void(int,int)> dfs2 = [&](int v, int p) {
        // Collect top two best downward depths from children
        int best1 = -1, best2 = -1;
        for (int u : g[v]) if (u != p) {
            int cand = down[u] + 1;
            if (cand > best1) { best2 = best1; best1 = cand; }
            else if (cand > best2) { best2 = cand; }
        }
        for (int u : g[v]) if (u != p) {
            int use = (down[u] + 1 == best1 ? best2 : best1); // best sibling path length via v
            up[u] = max(up[v] + 1, use == -1 ? 1 : use + 1);
            dfs2(u, v);
        }
    };

    dfs1(0, -1);
    up[0] = 0;
    dfs2(0, -1);

    for (int v = 0; v < n; ++v) {
        ecc[v] = max(down[v], up[v]);
        cout << ecc[v] << (v + 1 == n ? '\n' : ' ');
    }
    return 0;
}

down[v] is the longest path going down from v. To compute up[u], we consider two options: come from the parent side (up[v] + 1) or go through v to the best sibling subtree (bestSibling + 2). Tracking the top two downward depths at v lets us exclude the child’s own contribution in O(1). Finally, eccentricity[v] is the max of down[v] and up[v].
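
On the same path 0–1–2 (rooted at 0), the arrays come out as follows (my own check, not from the original):
\[
down = [2, 1, 0], \qquad up = [0, 1, 2], \qquad ecc = [\max(2,0),\ \max(1,1),\ \max(0,2)] = [2, 1, 2].
\]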

Time: O(n) · Space: O(n)