I have continuously incoming data represented by an array of integers x = [x1, ..., xn], n < 1,000,000. Each pair of consecutive elements satisfies the condition x[i] < x[i+1].

I need to detect, as fast as possible, the breakpoint where the linear trend of the data ends and turns into a quadratic trend. The data always starts with a linear trend...
I tried to compute

k = (x[i+1] - x[i]) / (x[i] - x[i-1])

but this test is not very reliable... Maybe there is a simpler and more efficient statistical test... Computing a regression line is too slow in this case...
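A minimal sketch of that ratio test could look like the following; the tolerance tol and the confirmation count confirm are assumed tuning parameters, not something given in the question:

```python
def detect_breakpoint_ratio(x, tol=0.05, confirm=3):
    """Return the index where the ratio of consecutive differences
    starts to drift away from 1 (linear -> quadratic), or None.
    tol and confirm are hypothetical tuning parameters."""
    hits = 0
    for i in range(1, len(x) - 1):
        d_prev = x[i] - x[i - 1]
        d_next = x[i + 1] - x[i]
        if d_prev == 0:            # guard against division by zero
            continue
        k = d_next / d_prev        # ~1 on the linear part, >1 once quadratic growth starts
        if abs(k - 1.0) > tol:
            hits += 1
            if hits >= confirm:    # require a few consecutive confirmations
                return i - confirm + 1
        else:
            hits = 0
    return None
```

Requiring several consecutive hits dampens single-sample noise a little, but the ratio still tends back towards 1 for large i, which is probably why the test feels unreliable.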
Actually, you are calculating a derivative of the function. Possibly you should use more points to calculate it, e.g. 5; see the five-point stencil.
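Something along these lines, for example; the unit spacing h = 1 and the drift threshold rel_tol are assumptions for the sketch:

```python
def five_point_derivative(x, i, h=1):
    """Five-point stencil estimate of the first derivative at index i
    (assumes unit spacing between samples)."""
    return (-x[i + 2] + 8 * x[i + 1] - 8 * x[i - 1] + x[i - 2]) / (12 * h)

def detect_breakpoint_stencil(x, rel_tol=0.05):
    """Return the first index where the smoothed derivative drifts away
    from its initial (linear-trend) value by more than rel_tol, or None.
    rel_tol is a hypothetical tuning parameter."""
    if len(x) < 5:
        return None
    base = five_point_derivative(x, 2)            # slope of the initial linear part
    for i in range(3, len(x) - 2):
        d = five_point_derivative(x, i)
        if abs(d - base) > rel_tol * abs(base):   # derivative no longer constant
            return i
    return None
```

On the linear part the estimated derivative stays roughly constant; once the quadratic part starts, it grows, so the first index where it drifts past the threshold is a candidate breakpoint.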