I am new to neural networks in machine learning. When computing Boolean functions like AND/OR/NOT, no hidden layer is needed, but Boolean functions such as XOR/XNOR do require a hidden layer. Why is that? (I searched Google but couldn't find a clear explanation.) Also, does the number of neurons in the hidden layer depend on the number of input neurons?
To put it simply, a hidden layer applies an additional transformation to the inputs that is not easily achievable with a single-layer network (one way to achieve it is to introduce some kind of non-linearity). A second layer adds further transformations and can fit more complicated tasks. AND/OR/NOT are linearly separable tasks: look at the picture showing the values of AND (the three dots are the false cases). The true values can be separated from the false values with a single line, and a neural network without a hidden layer can do that. But for XOR/XNOR you need two lines, and two lines can only be produced by a network with two layers and a non-linear activation function. The yellow line shows the separation such a network can perform.
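To make this concrete, here is a minimal sketch (assuming NumPy and a step activation; the weights are hand-picked for illustration, not learned) showing that a single neuron can compute AND, while XOR needs a hidden layer of two neurons:

```python
import numpy as np

def step(z):
    # threshold (step) activation: 1 if z > 0, else 0
    return (z > 0).astype(int)

# all four input combinations for two Boolean inputs
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

# --- AND: one neuron, no hidden layer ---
# the line x1 + x2 = 1.5 separates (1,1) from the other three points
w_and = np.array([1.0, 1.0])
b_and = -1.5
print("AND:", step(X @ w_and + b_and))        # -> [0 0 0 1]

# --- XOR: hidden layer with two neurons ---
# hidden neuron 1 fires for "x1 OR x2", hidden neuron 2 for "x1 AND x2"
W_h = np.array([[1.0, 1.0],
                [1.0, 1.0]])
b_h = np.array([-0.5, -1.5])
H = step(X @ W_h + b_h)

# output neuron: OR and not AND, i.e. XOR
w_out = np.array([1.0, -1.0])
b_out = -0.5
print("XOR:", step(H @ w_out + b_out))        # -> [0 1 1 0]
```

The two hidden neurons play the role of the two separating lines mentioned above: one draws the OR boundary, the other the AND boundary, and the output neuron combines them into the XOR region.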