
Different results in different epochs


Why do we get different results (weights) when using multiple epochs, even though we use the same data in each epoch? What happens in each epoch? Does each epoch use the previous weights?


Solution

  • Yes, each epoch continues training from the weights produced by the previous epoch; an epoch is simply one full pass over the training data. Even when the same data is used in every epoch, the final weights can still differ between runs. This is not a bug but a feature of stochastic machine learning models, i.e. models whose training procedure depends on one or more random variables. Adding this randomness often leads to better solutions. Because of the random initialization of the weights and the randomness in the training procedure (e.g. shuffling the data between epochs), your loss function may converge to different local minima on different runs.

    However, if the loss function is convex (as is the case with algorithms like linear regression), this randomness does not affect the final result, because the loss function always converges to the global minimum.
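A minimal sketch of both points, using a hypothetical toy linear-regression problem with plain NumPy SGD (the data, learning rate, and `train` helper are illustrative assumptions, not from the question): the weights carry over from one epoch to the next, and because squared-error loss is convex, two runs with different random seeds still end up at (nearly) the same global minimum.

```python
import numpy as np

# Hypothetical toy data: y = 2x + 1 plus a little noise
data_rng = np.random.default_rng(0)
X = data_rng.uniform(-1.0, 1.0, size=100)
y = 2.0 * X + 1.0 + data_rng.normal(0.0, 0.01, size=100)

def train(seed, epochs=50, lr=0.1):
    rng = np.random.default_rng(seed)
    w, b = rng.normal(), rng.normal()      # random initialization (source of run-to-run variation)
    for _ in range(epochs):
        order = rng.permutation(len(X))    # shuffle each epoch (another source of randomness)
        for i in order:                    # SGD: update on one sample at a time
            err = (w * X[i] + b) - y[i]
            w -= lr * err * X[i]           # w and b carry over into the next epoch
            b -= lr * err
    return w, b

# Two runs with different seeds: different initial weights and shuffles,
# but the convex squared-error loss drives both to the same global minimum.
w1, b1 = train(seed=1)
w2, b2 = train(seed=2)
print(w1, b1)
print(w2, b2)
```

With a non-convex model (e.g. a neural network), the same two seeds could instead land in different local minima, which is the run-to-run variation described above.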