machine-learning · neural-network · perceptron

Are the weights different for each training example in a perceptron?


I am new to neural networks. I have a training dataset of 1,000 examples, and each example contains 5 features.

Initially, I assigned some values to the weights.

So, are 1,000 weight vectors stored, one for each example, or do the weight values remain the same across all 1,000 examples?

For example:

example1 => [f1,f2,f3,f4,f5] -> [w1e1,w2e1,w3e1,w4e1,w5e1]
example2 => [f1,f2,f3,f4,f5] -> [w1e2,w2e2,w3e2,w4e2,w5e2]

Here w1 means the first weight, and e1, e2 denote different examples.

or: example1, example2, ... -> [gw1,gw2,gw3,gw4,gw5]

Here g means global, w1 means the weight for feature one, and so on.


Solution

  • Start with a single node in the neural network. Its output is the sigmoid function applied to a linear combination of the inputs, as shown below.

    Single node (perceptron) in a Neural Network

    So for 5 features you will have 5 weights + 1 bias for each node of the neural network. During training, a batch of inputs is fed in, the output at the end of the network is computed, the error is calculated with respect to the actual outputs, and gradients are backpropagated based on that error. In simple words, the weights are adjusted based on the error.

    So each node has 6 weights, and depending on the number of nodes (which depends on the number of layers and the size of each layer) you can calculate the total number of weights. All the weights are updated once per batch (since you are doing batch training). In other words, your second option is the right one: the same global weight vector is shared by every example — there is no separate set of weights per training example.
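To make this concrete, here is a minimal sketch of a single sigmoid node trained with batch gradient descent on made-up data matching the question's shapes (1,000 examples, 5 features). The data, labels, learning rate, and epoch count are all hypothetical; the point is that one weight vector of length 5 plus one bias is shared by all 1,000 examples and updated once per batch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 1,000 examples x 5 features (shapes from the question).
X = rng.normal(size=(1000, 5))
y = (X.sum(axis=1) > 0).astype(float)  # made-up binary labels

# One GLOBAL weight vector (5 weights) + 1 bias, shared by ALL examples.
w = np.zeros(5)
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for epoch in range(100):
    # Forward pass over the whole batch: every example uses the SAME w and b.
    p = sigmoid(X @ w + b)        # shape (1000,)
    # Gradient of the mean cross-entropy loss w.r.t. w and b.
    err = p - y                   # shape (1000,)
    grad_w = X.T @ err / len(X)   # shape (5,)
    grad_b = err.mean()
    # One update per batch: all 6 parameters adjusted together.
    w -= lr * grad_w
    b -= lr * grad_b

print(w.shape)  # (5,) -- only 5 weights in total, not 5 per example
```

Note that `w.shape` stays `(5,)` no matter how many examples you train on; adding more data changes how the weights are updated, not how many weights exist.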