Tags: machine-learning, linear-regression, gradient-descent

What does the learning algorithm output in linear regression?


Reading the course notes of Andrew Ng's machine learning course, it states for linear regression:

Take a training set and pass it into a learning algorithm. The algorithm outputs a function h (the hypothesis). h takes an input x and tries to output the estimated value y.

It then goes on to say:

present h as: h_theta(x) = theta0 + theta1 * x

Does this not mean the hypothesis was not output by the learning algorithm? Instead, we just defined it as h_theta(x) = theta0 + theta1 * x.

Instead of "Take a training set and pass it into a learning algorithm. The algorithm outputs a function h (the hypothesis)," should the statement be "Take a training set and pass it into a learning algorithm. The algorithm outputs value(s) which make the hypothesis as accurate as possible"?


Solution

  • In principle you are right here. A true learning algorithm, as defined in learning theory, is an algorithm that takes labelled instances and a whole class of possible hypotheses as input, and then chooses one hypothesis as output.

    So strictly speaking, an algorithm that outputs predictions is not a learning algorithm. But such an algorithm can be split into two parts: a learning algorithm, the part that actually learns the parameters (here, the thetas), and a prediction algorithm that transforms input instances into predictions, which are then returned to the caller.
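
    This split can be sketched in code. Below is a minimal illustration (not the course's actual implementation): gradient descent plays the role of the learning algorithm and outputs the thetas, while a separate function applies the fixed hypothesis form h_theta(x) = theta0 + theta1 * x using those learned values. The learning rate, epoch count, and toy data are assumptions chosen for illustration.

    ```python
    import numpy as np

    def learn(X, y, lr=0.05, epochs=5000):
        """Learning algorithm: gradient descent on mean squared error.
        Input: labelled training instances (X, y).
        Output: the parameters theta0, theta1 that make the hypothesis fit."""
        theta0, theta1 = 0.0, 0.0
        for _ in range(epochs):
            err = (theta0 + theta1 * X) - y   # residuals on the training set
            theta0 -= lr * err.mean()          # gradient w.r.t. theta0
            theta1 -= lr * (err * X).mean()    # gradient w.r.t. theta1
        return theta0, theta1

    def h(theta0, theta1, x):
        """Prediction algorithm: applies the fixed hypothesis form
        h_theta(x) = theta0 + theta1 * x with the learned parameters."""
        return theta0 + theta1 * x

    # Toy training data generated from y = 2x + 1 (assumed for illustration)
    X = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = 2 * X + 1

    t0, t1 = learn(X, y)       # the learner outputs values, not a new formula
    print(h(t0, t1, 5.0))      # prediction for x = 5, close to 11
    ```

    Note that `learn` never invents the formula; the functional form of h is fixed in advance, and the learner only outputs the parameter values, which matches the distinction the question draws.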