
Perceptron Learning - Update Weights


I am studying perceptron learning, and have a question which leaves me a bit confused. As I am self-teaching, I have looked through a variety of papers, tutorials, slide decks, etc., and at times they seem to use different algorithms to adjust the weights of the network.

For example, some include a learning rate, some update weights using each individual weight/input product, while others use only the sum of all weight/input products.

So, am I right in assuming that there are multiple algorithms which all lead to the same final weight matrix/vector?


Solution

  • Nope, not the same.

    You are right that there are many algorithms, but they may lead to different final weights. It's like sorting algorithms - there are many, each of them does the same job, but some are stable and some are not, some use additional memory, and some sort in place.
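    To make the comparison concrete, here is a minimal sketch of one common variant: the classic perceptron learning rule with a learning rate, where each misclassified sample nudges every weight by `lr * (target - prediction) * input`. All names here (`predict`, `train`, `lr`) are illustrative, not from any particular paper.

    ```python
    def predict(weights, bias, x):
        """Classic perceptron: fire (1) if the weighted input sum plus bias is positive."""
        s = sum(w * xi for w, xi in zip(weights, x))
        return 1 if s + bias > 0 else 0

    def train(samples, lr=0.1, epochs=50):
        """Perceptron learning rule: w <- w + lr * (target - prediction) * x."""
        n = len(samples[0][0])
        weights = [0.0] * n
        bias = 0.0
        for _ in range(epochs):
            for x, target in samples:
                error = target - predict(weights, bias, x)
                # Only misclassified samples (error != 0) change the weights.
                weights = [w + lr * error * xi for w, xi in zip(weights, x)]
                bias += lr * error
        return weights, bias

    # Learn logical AND, which is linearly separable.
    data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    w, b = train(data)
    print([predict(w, b, x) for x, _ in data])  # reproduces the targets [0, 0, 0, 1]
    ```

    A variant without a learning rate is simply this with `lr=1`; on linearly separable data both converge to *a* separating hyperplane, but not necessarily the same one, which is exactly why different texts can show different update rules and different final weight vectors.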