Backpropagation is a supervised learning technique used for training artificial neural networks.

Essentially (I stole this from Wikipedia):

Summary of the technique (a rough code sketch follows the list):
  1. Present a training sample to the neural network.
  2. Compare the network's output to the desired output from that sample. Calculate the error in each output neuron.
  3. For each neuron, calculate what the output should have been, and a scaling factor showing how much higher or lower the output must be adjusted to match the desired output. This is the local error.
  4. Adjust the weights of each neuron to lower the local error.
  5. Assign "blame" for the local error to neurons at the previous level, giving greater responsibility to neurons connected by stronger weights.
  6. Repeat the steps above on the neurons at the previous level, using each one's "blame" as its error.
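To make those steps concrete, here is a minimal sketch in Python/NumPy of the same six steps on a tiny network. The network shape (2 inputs, 4 hidden sigmoid units, 1 output), the XOR training data, and the learning rate are my own illustrative choices, not anything from the summary above.

Code:

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative training set: XOR, a classic problem that needs a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)
lr = 0.5                       # learning rate (assumed value)

for epoch in range(10000):
    # Step 1: present the training samples to the network (forward pass).
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Step 2: compare the output to the desired output; error per output neuron.
    err = out - y

    # Step 3: local error (delta) at the output, scaled by the sigmoid slope.
    delta_out = err * out * (1 - out)

    # Steps 5 and 6: assign "blame" to the previous layer's neurons, in
    # proportion to the strength of the weights connecting them.
    delta_h = (delta_out @ W2.T) * h * (1 - h)

    # Step 4: adjust the weights of each neuron to lower its local error.
    W2 -= lr * h.T @ delta_out
    b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * X.T @ delta_h
    b1 -= lr * delta_h.sum(axis=0)

# Should print roughly [0, 1, 1, 0] if training has converged.
print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))

Once trained, you just run the forward pass on new inputs; there is no guarantee of convergence on every run, which is why the seed and hidden layer size matter in practice.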
The reason I have chosen it is that it is good for complex problems where the network needs to generalise. All this means is that it can handle inputs similar to ones it has seen before, which may or may not lead to the same result.

I guess you could say it is similar to a back rating, but much more complex, as it learns the rating itself.

Good Luck.