Tags: language-agnostic, function, machine-learning, neural-network, approximation

Algorithm for online approximation of a slowly-changing, real valued function


I'm tackling an interesting machine learning problem and would love to hear if anyone knows a good algorithm to deal with the following:

  • The algorithm must learn to approximate a function of N inputs and M outputs
  • N is quite large, e.g. 1,000-10,000
  • M is quite small, e.g. 5-10
  • All inputs and outputs are floating-point values; they can be positive or negative and are likely to be relatively small in absolute value, but there are no hard guarantees on bounds
  • Each time period I get the N inputs and must predict the M outputs; at the end of the time period the actual values of the M outputs are revealed (i.e. this is a supervised learning situation where learning needs to take place online)
  • The underlying function is non-linear, but not too nasty (e.g. I expect it will be smooth and continuous over most of the input space)
  • There will be a small amount of noise in the function, but the signal-to-noise ratio is likely to be good - I expect the N inputs will explain 95%+ of the output values
  • The underlying function is slowly changing over time - it is unlikely to change drastically in a single time period, but is likely to shift slightly over thousands of time periods
  • There is no hidden state to worry about (other than the changing function), i.e. all the information required is in the N inputs

I'm currently thinking some kind of back-propagation neural network with lots of hidden nodes might work - but is that really the best approach for this situation and will it handle the changing function?


Solution

  • With your numbers of inputs and outputs, I'd also go for a neural network; it should give a good approximation. The slow change is actually well suited to a back-propagation approach: if you keep training online, the network tracks the drift incrementally and never has to 'de-learn' much at once.
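To make the online back-propagation idea concrete, here is a minimal sketch. Everything in it is illustrative, not from the question: the sizes are tiny stand-ins for the real N and M, the drifting target function is synthetic, and the constant learning rate is a deliberate choice so the network stays plastic and gradually forgets stale data. One hidden layer with tanh units and a linear output, updated with a single SGD/backprop step per time period:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny stand-ins for the question's sizes (N=1,000-10,000, M=5-10).
N, M, H = 50, 5, 32   # inputs, outputs, hidden units
lr = 0.01             # constant (non-decaying) rate: keeps tracking the drift

# One-hidden-layer network: tanh hidden units, linear output.
W1 = rng.normal(0, 1 / np.sqrt(N), (H, N)); b1 = np.zeros(H)
W2 = rng.normal(0, 1 / np.sqrt(H), (M, H)); b2 = np.zeros(M)

def predict(x):
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2, h

# Hypothetical slowly drifting ground truth: smooth, nonlinear,
# with parameters that shift a little every time period.
A = rng.normal(0, 1 / np.sqrt(N), (M, N))

errors = []
for t in range(20000):
    A += rng.normal(0, 1e-4, A.shape)   # slow drift of the true function
    x = rng.normal(size=N)              # this period's N inputs
    y_pred, h = predict(x)              # predict M outputs...
    y_true = np.tanh(A @ x)             # ...then the actuals are revealed
    err = y_pred - y_true
    errors.append(float(np.mean(err ** 2)))

    # One backprop/SGD step on this single example (online learning).
    gW2 = np.outer(err, h)
    gh = (W2.T @ err) * (1 - h ** 2)    # tanh'(z) = 1 - tanh(z)^2
    gW1 = np.outer(gh, x)
    W2 -= lr * gW2; b2 -= lr * err
    W1 -= lr * gW1; b1 -= lr * gh

early = np.mean(errors[:500])
late = np.mean(errors[-500:])
print(f"mean squared error, first 500 periods: {early:.4f}, last 500: {late:.4f}")
```

The late-period error should come out well below the early-period error even though the target keeps moving, which is the point: the constant learning rate trades a little steady-state noise for the ability to follow the drift indefinitely.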