Tags: math, numerical-methods, polynomial-math

Efficiently determining if a polynomial has a root in the interval [0,T]


I have polynomials of nontrivial degree (4+) and need to determine, robustly and efficiently, whether they have a root in the interval [0,T]. Neither the precise locations of the roots nor their number concerns me; I just need to know if there is at least one.

Right now I'm using interval arithmetic as a quick check to see if I can prove that no roots can exist. If I can't, I'm using Jenkins-Traub to solve for all of the polynomial's roots. This is obviously inefficient, since it finds every real root and its exact position, information I don't end up needing.
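
For concreteness, here is a minimal sketch of what the interval-arithmetic pre-check can look like. The function names are hypothetical, and the coefficient ordering (highest degree first, as in numpy.polyval) is an assumption:

```python
def interval_add(a, b):
    """Sum of two intervals given as (lo, hi) pairs."""
    return (a[0] + b[0], a[1] + b[1])

def interval_mul(a, b):
    """Product of two intervals given as (lo, hi) pairs."""
    p = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(p), max(p))

def poly_enclosure(coeffs, lo, hi):
    """Enclose the range of the polynomial over [lo, hi] via interval Horner."""
    t = (lo, hi)
    acc = (coeffs[0], coeffs[0])
    for c in coeffs[1:]:
        acc = interval_add(interval_mul(acc, t), (c, c))
    return acc

def definitely_no_root(coeffs, T):
    """True only when the enclosure excludes zero, i.e. no root can exist."""
    lo, hi = poly_enclosure(coeffs, 0.0, T)
    return lo > 0.0 or hi < 0.0
```

Interval Horner over-approximates the true range, so a False result here is inconclusive, which is why the fallback solve is still needed.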

Is there a standard algorithm I should be using? If not, are there any other efficient checks I could do before doing a full Jenkins-Traub solve for all roots?

For example, one optimization I could do is to check whether my polynomial f(t) has the same sign at 0 and T. If not, there is obviously a root in the interval. If so, I can solve for the roots of f'(t) and evaluate f at every root of f' that lies in [0,T]: f(t) has no root in that interval if and only if all of these evaluations have the same sign as f(0) and f(T). This reduces the degree of the polynomial I have to root-find by one. Not a huge optimization, but perhaps better than nothing.
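
A sketch of that optimization, using numpy's polyval, polyder, and roots (the tolerance for treating a computed root as real is an assumption you would tune):

```python
import numpy as np

IMAG_TOL = 1e-12  # tolerance for treating a computed root as real (an assumption)

def has_root_via_critical_points(coeffs, T):
    """Sign check at the endpoints, then at the critical points of f in (0, T).

    coeffs are ordered highest degree first, as numpy.polyval expects.
    """
    f0 = np.polyval(coeffs, 0.0)
    fT = np.polyval(coeffs, T)
    # A root exactly at an endpoint, or a sign change across [0, T],
    # immediately answers the question.
    if f0 == 0.0 or fT == 0.0 or np.sign(f0) != np.sign(fT):
        return True
    # Same nonzero sign at both endpoints: f can only reach zero at an
    # interior extremum, i.e. at a real root of f' inside (0, T).
    for r in np.roots(np.polyder(coeffs)):
        if abs(r.imag) < IMAG_TOL and 0.0 < r.real < T:
            # sign() of an exact zero is 0, so a double root that merely
            # touches the axis is also caught by this comparison.
            if np.sign(np.polyval(coeffs, r.real)) != np.sign(f0):
                return True
    return False
```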


Solution

  • Sturm's theorem lets you calculate the number of real roots in the range (a, b). Given the number of roots, you know if there is at least one; a code sketch of the computation follows the theorem statement below. From the bottom half of page 4 of this paper:

    Let f(x) be a real polynomial. Denote it by f0(x) and its derivative f′(x) by f1(x). Proceed as in Euclid's algorithm to find

    f0(x) = q1(x) · f1(x) − f2(x),
    f1(x) = q2(x) · f2(x) − f3(x),
        ⋮
    fk−2(x) = qk−1(x) · fk−1(x) − fk,
    

    where fk is a constant, and for 1 ≤ i ≤ k, fi(x) is of degree lower than that of fi−1(x). The signs of the remainders are negated relative to those in Euclid's algorithm.

    Note that the last non-vanishing remainder fk (or fk−1 when fk = 0) is a greatest common divisor of f(x) and f′(x). The sequence f0, f1, ..., fk (or ending at fk−1 when fk = 0) is called a Sturm sequence for the polynomial f.

    Theorem 1 (Sturm's Theorem) The number of distinct real zeros of a polynomial f(x) with real coefficients in (a, b) is equal to the excess of the number of changes of sign in the sequence f0(a), ..., fk−1(a), fk over the number of changes of sign in the sequence f0(b), ..., fk−1(b), fk.
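
Translating the theorem into code, a sketch might look like the following. It relies on numpy's polydiv, polyder, and polyval; the zero tolerance is an assumption, needed because floating-point remainders are rarely exactly zero:

```python
import numpy as np

TOL = 1e-12  # floating-point zero tolerance (an assumption; scale to your coefficients)

def sturm_sequence(coeffs):
    """Build f0, f1, ..., fk with f_{i+1} = -(f_{i-1} mod f_i)."""
    f0 = np.trim_zeros(np.atleast_1d(np.asarray(coeffs, dtype=float)), 'f')
    seq = [f0, np.atleast_1d(np.polyder(f0))]
    while len(seq[-1]) > 1:            # stop once the last entry is a constant
        _, rem = np.polydiv(seq[-2], seq[-1])
        rem = np.atleast_1d(rem)
        rem[np.abs(rem) < TOL] = 0.0   # suppress floating-point noise
        rem = np.trim_zeros(rem, 'f')
        if rem.size == 0:              # fk vanished: stop at f_{k-1}, the gcd
            break
        seq.append(-rem)               # the sign flip relative to Euclid's algorithm
    return seq

def sign_changes(seq, x):
    """Count sign changes in f0(x), f1(x), ..., skipping exact zeros."""
    signs = [np.sign(np.polyval(p, x)) for p in seq]
    signs = [s for s in signs if s != 0]
    return sum(1 for s, t in zip(signs, signs[1:]) if s != t)

def count_real_roots(coeffs, a, b):
    """Distinct real roots of f in (a, b), per Sturm's theorem."""
    seq = sturm_sequence(coeffs)
    return sign_changes(seq, a) - sign_changes(seq, b)
```

For example, count_real_roots([1.0, 0.0, -2.0], 0.0, 2.0) returns 1 for x² − 2, whose only root in (0, 2) is √2. Since the theorem counts roots in the open interval, a test over [0, T] should additionally check f(0) == 0 and f(T) == 0; the answer to the original question is then count_real_roots(coeffs, 0, T) > 0 combined with those endpoint checks.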