Word:

back-propagation

back-propagation - (Or "backpropagation") A learning algorithm for training a feed-forward neural network by minimising a continuous "error function" or "objective function". Back-propagation is a "gradient descent" method of training: it uses gradient information to modify the network weights so as to decrease the value of the error function on subsequent presentations of the inputs. Other gradient-based methods from numerical analysis can be used to train networks more efficiently.
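The gradient-descent step described above can be sketched in a few lines. This is a minimal illustration with a made-up one-dimensional error function, not part of the dictionary entry: the weight is repeatedly moved opposite the gradient, so the error shrinks on each subsequent step.

```python
# Minimal gradient-descent sketch on a hypothetical error function
# E(w) = (w - 5)**2, whose derivative is dE/dw = 2 * (w - 5).
learning_rate = 0.1

def grad_E(w):
    return 2 * (w - 5)  # gradient of the error function

w = 2.0  # a single network weight, for illustration
for _ in range(100):
    w -= learning_rate * grad_E(w)  # step against the gradient
# w converges toward 5, the minimum of E
```

With a suitably small learning rate, each update strictly decreases E, which is exactly the property the definition attributes to gradient-descent training.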

Back-propagation makes use of a mathematical trick (an application of the chain rule) when the network is simulated on a digital computer: just two traversals of the network (once forward, and once back) yield both the difference between the desired and actual output and the derivatives of this error with respect to the connection weights.
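The two-traversal trick can be sketched as follows. This is a hedged illustration, not a canonical implementation: the network sizes, sigmoid activations, squared-error objective, and training data are all hypothetical choices. The forward pass caches the activations; the backward pass reuses them to obtain the derivative of the error with respect to every weight in one sweep.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, W2):
    """Forward traversal: compute and cache the intermediate activations."""
    h = sigmoid(W1 @ x)   # hidden-layer activations
    y = sigmoid(W2 @ h)   # network output
    return h, y

def backward(x, t, W1, W2):
    """One forward and one backward traversal yield both the error and
    the derivatives of the error w.r.t. every connection weight."""
    h, y = forward(x, W1, W2)
    err = 0.5 * np.sum((y - t) ** 2)        # error function E
    delta2 = (y - t) * y * (1 - y)          # dE/d(output pre-activation)
    grad_W2 = np.outer(delta2, h)           # dE/dW2
    delta1 = (W2.T @ delta2) * h * (1 - h)  # error propagated backwards
    grad_W1 = np.outer(delta1, x)           # dE/dW1
    return err, grad_W1, grad_W2

# Gradient-descent training on a single (hypothetical) input/target pair.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(3, 2))
W2 = rng.normal(scale=0.5, size=(1, 3))
x, t = np.array([0.5, -0.2]), np.array([0.7])
lr = 0.5
errors = []
for _ in range(200):
    err, gW1, gW2 = backward(x, t, W1, W2)
    errors.append(err)
    W1 -= lr * gW1  # modify weights against the gradient
    W2 -= lr * gW2
```

Note that `backward` returns the gradients for both layers from a single backward sweep over the cached activations; the error decreases over the training loop, and the analytic gradients can be checked against a finite-difference approximation.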