My question is perhaps a bit underspecified, but what I would like to do is find an algorithm for efficiently decomposing a tree (by cutting edges) into a set of subtrees such that the maximum depth among the subtrees is small. Here is one version of the problem:
Problem 1: Given a tree and a number n, cut n edges of the tree so that the maximum depth among the resulting subtrees is as small as possible (over all choices of cuts).
For example, for n=1 in Graph 1, cut at A; for n=2 in Graph 2, cut at B and C.
For n=1, with a couple of passes through the tree we can store, for each node, its depth, its height, and the depth of the rest of the tree if that node's subtree were removed (this last value can be filled in with a top-down pass that uses each ancestor's depth and the heights of its other children). Then for each edge (from a parent to a child), the depths of the two subtrees created by cutting it are (i) the depth of the tree with the child's subtree removed and (ii) the height of the child. Take the max of these two values for each edge, and choose the edge whose max is smallest.
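For concreteness, here is a rough Python sketch of that n=1 computation. The input format, the choice of root, and names like `best_single_cut` and `exclude` are my own assumptions for illustration: a bottom-up pass computes heights, and a top-down pass computes the depth of the tree with each node's subtree removed.

```python
from collections import defaultdict

def best_single_cut(edges, root):
    """Sketch: pick the one edge whose removal minimizes the larger depth
    of the two resulting components (depths counted in edges).

    `edges` is a list of (u, v) pairs of an undirected tree rooted at
    `root`.  Returns (best_depth, (parent, child)), or None if there is
    no edge to cut.  Names and input format are assumptions.
    """
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    # Root the tree iteratively: parent pointers plus a preorder (top-down) order.
    parent = {root: None}
    order, stack = [], [root]
    while stack:
        u = stack.pop()
        order.append(u)
        for v in adj[u]:
            if v != parent[u]:
                parent[v] = u
                stack.append(v)
    if len(order) < 2:
        return None  # a single node has no edge to cut

    children = defaultdict(list)
    for v in order[1:]:
        children[parent[v]].append(v)

    # Bottom-up pass: height[u] = depth of the subtree rooted at u.
    height = {u: 0 for u in order}
    for u in reversed(order[1:]):
        height[parent[u]] = max(height[parent[u]], height[u] + 1)

    # Top-down pass: depth[u], and exclude[u] = depth of the tree with u's
    # subtree removed (-1 sentinel for the root, where nothing would remain).
    depth = {root: 0}
    exclude = {root: -1}
    for u in order:
        kids = children[u]
        # Keep the two largest child heights so "max over the *other*
        # children" costs O(1) per child.
        top2 = sorted((height[c] for c in kids), reverse=True)[:2]
        for c in kids:
            depth[c] = depth[u] + 1
            other = top2[1] if (top2[0] == height[c] and len(top2) > 1) else top2[0]
            exclude[c] = max(exclude[u], depth[u],
                             (depth[u] + 1 + other) if len(kids) > 1 else -1)

    # Cutting the edge (parent[c], c) leaves components of depth
    # exclude[c] and height[c]; minimize the larger of the two.
    return min(((max(height[c], exclude[c]), (parent[c], c)) for c in order[1:]),
               key=lambda t: t[0])
```

The reason for propagating `exclude` down from the root, rather than computing it from the parent alone, is that the deepest remaining leaf after a cut may sit in a branch far above the cut edge.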
For n=2, we can consider each edge in turn, compute (i) and (ii), and then run the n=1 case on whichever of the two subtrees is deeper. We can save some work because if both subtrees are already deeper than our current best, we can skip that edge entirely. However, this kind of recurrence seems like it would scale poorly for larger n.
Another version of the problem might be:
Problem 2: Given a tree and a depth d, what is the smallest set of cuts that will leave every subtree with depth at most d?
If anyone is aware of algorithms for similar problems, I'd also be interested in hearing about them. This is really for a practical use case, so I'd also consider an optimization algorithm that uses a good heuristic for choosing which cuts to make. Any input would be appreciated, thanks.
For problem 2, a greedy, bottom-up pass works:

1. Root the tree and process nodes from the deepest up (children before parents), tracking the height of each node within its remaining component.
2. Whenever a node's remaining component reaches height d (and the node is not the root), cut the edge to its parent and remove that node's component from further consideration.
3. Continue until the root has been processed; the cuts made form a minimum-size set.
Total run time: O(N), where N is the number of nodes. The middle step is amortized linear across the whole algorithm; each time we hit it, the work is linear in the number of nodes being removed from consideration, but each node is removed only once.
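Here is a minimal sketch of a greedy with that shape (the function name, input format, and exact bookkeeping are assumptions on my part):

```python
from collections import defaultdict

def min_cuts_for_depth(edges, root, d):
    """Sketch for problem 2: greedily cut edges so that every resulting
    component (rooted where it was detached) has depth <= d.

    Processes nodes bottom-up; whenever a node's remaining component
    already has height d, keeping the edge to its parent would create a
    depth of d + 1, so that edge is cut.  Returns (number_of_cuts, cuts).
    Names and input format are assumptions.
    """
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    # Root the tree: parent pointers plus a preorder (top-down) order.
    parent = {root: None}
    order, stack = [], [root]
    while stack:
        u = stack.pop()
        order.append(u)
        for v in adj[u]:
            if v != parent[u]:
                parent[v] = u
                stack.append(v)

    height = {u: 0 for u in order}  # height within the remaining component
    cuts = []
    for u in reversed(order[1:]):   # children before parents, root excluded
        if height[u] == d:
            # Detach u's component here; it no longer contributes to the
            # height of anything above it.
            cuts.append((parent[u], u))
        else:
            height[parent[u]] = max(height[parent[u]], height[u] + 1)
    return len(cuts), cuts
```

For example, on the path 0–1–2–3 rooted at 0 with d = 1, this makes a single cut at the edge (1, 2), leaving two components of depth 1.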
For problem 1, we can binary-search on d, using the approach outlined above for problem 2 as the feasibility test: the answer is the smallest d for which at most n cuts are needed. This makes problem 1 O(N log N).
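A sketch of that reduction, reusing `min_cuts_for_depth` from the previous sketch (the bounds and names are again my assumptions):

```python
def min_depth_with_n_cuts(edges, root, n_cuts):
    """Sketch for problem 1: binary-search the smallest depth bound d that
    the problem-2 greedy can meet with at most n_cuts cuts.

    Assumes min_cuts_for_depth from the sketch above.  Each check is O(N)
    and there are O(log N) checks, so the total is O(N log N).
    """
    lo, hi = 0, len(edges)  # a tree's depth never exceeds its edge count
    while lo < hi:
        mid = (lo + hi) // 2
        if min_cuts_for_depth(edges, root, mid)[0] <= n_cuts:
            hi = mid         # d = mid is achievable; try something smaller
        else:
            lo = mid + 1     # even the greedy needs more cuts; d is too small
    return lo
```

The binary search is valid because the minimum number of cuts needed is non-increasing in d.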