Author 
Topic: MLP question 

fitziano 
Posted: 05Nov10 09:25 



Hello, I have a question about multilayer perceptron networks. Say I have an input layer, a hidden layer, and an output layer, and I use backpropagation for training. Should I multiply the input layer by weights, or do the values enter the network without the influence of any weight? I'm a little confused. Thank you in advance.


Nikola 
Posted: 31Jan11 02:39 



There are no "input layer weights". Weights are assigned to the inputs of neurons, so your inputs should only be multiplied by weights when they enter an actual neuron.
Normally, the input layer has no neurons; it is just said to have processing units for convenience during backpropagation. Thus we say that an input layer consists of N neurons, where N is the number of inputs, each with one input, a weight of 1 on that input, and a linear activation function ( f(x) = x ). This is not actually so, but it makes indexing the neurons and their weights easier, and it makes no difference in the model.
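To make this concrete, here is a minimal sketch of a forward pass (layer sizes and values are just made up for illustration). Note that the input vector x is passed through unchanged; the first time any weight touches the data is at the hidden layer's neurons:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 3, 4, 2
# Weights live on the INPUTS of the hidden and output neurons,
# not on the input layer itself.
W_hidden = rng.standard_normal((n_hidden, n_in))
b_hidden = np.zeros(n_hidden)
W_out = rng.standard_normal((n_out, n_hidden))
b_out = np.zeros(n_out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.0, 2.0])        # "input layer": just the raw values
h = sigmoid(W_hidden @ x + b_hidden)  # first multiplication by weights
y = sigmoid(W_out @ h + b_out)        # second multiplication by weights
print(y)
```

Equivalently, you could describe the input layer as N identity neurons with weight 1, as above; the computation would be exactly the same.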

