MAKHFI.COM
Fascinating World of Neural Nets  
Neural Network Forums

Author Topic: MLP question
fitziano Posted: 05-Nov-10 09:25
Hello, I have a question about multilayer perceptron networks.
Suppose I have an input layer, a hidden layer, and an output layer, and I use backpropagation for training. Should I multiply the input layer's values by weights, or do the values enter the network without the influence of any weight? I'm a little confused.
Thank you in advance
 
Nikola Posted: 31-Jan-11 02:39
There are no "input layer weights". Weights belong to the inputs of neurons, so your input values are only multiplied by weights when they enter an actual neuron — in your case, at the hidden layer.

Normally the input layer contains no real neurons; it is only said to have processing units for convenience during backpropagation. By this convention, the input layer consists of N neurons, where N is the number of inputs, each with a single input, a weight of 1 on that input, and a linear activation function ( f(x) = x ). This is not literally the case, but it makes indexing the neurons and their weights easier and changes nothing in the model.
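To illustrate (this is my own sketch, not anything specific to your network — the sigmoid activation and the variable names are just assumptions for the example), here is a minimal forward pass showing that the input layer passes values through unchanged and that weights first appear at the hidden layer:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, hidden_weights, hidden_bias, output_weights, output_bias):
    # "Input layer": no weights, linear activation f(x) = x,
    # so the values just pass through unchanged.
    layer0 = inputs

    # Hidden layer: this is the first place the inputs are multiplied
    # by weights — each neuron has one weight per input, plus a bias.
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, layer0)) + b)
              for ws, b in zip(hidden_weights, hidden_bias)]

    # Output layer: same pattern, weights applied to the hidden outputs.
    output = [sigmoid(sum(w * h for w, h in zip(ws, hidden)) + b)
              for ws, b in zip(output_weights, output_bias)]
    return output
```

So during training, backpropagation adjusts the hidden-layer and output-layer weights; there is nothing to adjust between the raw inputs and the input layer itself.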
 

 

Copyright © 2001-2003 Pejman Makhfi. All rights Reserved.