Title: Minimisation methods for training feed-forward networks
Creator: Smagt, P. van der
Subject: Neural Nets
Date: 1994
Type: Journal (Paginated); Peer Reviewed
Format: application/postscript
Identifier: http://cogprints.org/497/2/chapter2.ps
Relation: http://cogprints.org/497/
Citation: Smagt, P. van der (1994) Minimisation methods for training feed-forward networks. [Journal (Paginated)]

Abstract: Minimisation methods for training feed-forward networks with back-propagation are compared. Feed-forward neural network training is a special case of function minimisation, where no explicit model of the data is assumed. Therefore, and due to the high dimensionality of the data, linearisation of the training problem through the use of orthogonal basis functions is not desirable. The focus is on function minimisation on any basis. Quasi-Newton and conjugate gradient methods are reviewed, and the latter are shown to be a special case of error back-propagation with a momentum term. Three feed-forward learning problems are tested with five methods. It is shown that, due to its fixed stepsize, standard error back-propagation performs well in avoiding local minima. However, using not only the local gradient but also the second derivative of the error function yields a much shorter training time. Conjugate gradient with Powell restarts proves to be the superior method.
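As a rough illustration of the method the abstract singles out, here is a minimal sketch of Polak-Ribiere conjugate gradient with Powell restarts. The Rosenbrock test function stands in for a network error surface (an assumption; the paper itself tests feed-forward learning problems), and the backtracking line search is a simple stand-in for whatever line minimisation the paper uses. The beta coefficient plays the role of the momentum term mentioned in the abstract.

    import numpy as np

    def rosenbrock(x):
        # Stand-in objective; not the paper's network error function.
        return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

    def rosenbrock_grad(x):
        return np.array([
            -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
            200.0 * (x[1] - x[0]**2),
        ])

    def backtracking(f, x, d, slope, alpha=1.0, shrink=0.5, c=1e-4):
        # Armijo backtracking line search along direction d.
        fx = f(x)
        for _ in range(60):
            if f(x + alpha * d) <= fx + c * alpha * slope:
                break
            alpha *= shrink
        return alpha

    def cg_powell(f, grad, x0, tol=1e-8, max_iter=2000):
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            slope = g @ d
            if slope >= 0.0:            # not a descent direction: restart
                d, slope = -g, -(g @ g)
            x = x + backtracking(f, x, d, slope) * d
            g_new = grad(x)
            if np.linalg.norm(g_new) < tol:
                break
            # Polak-Ribiere coefficient: the "momentum" weight on the
            # previous search direction.
            beta = g_new @ (g_new - g) / (g @ g)
            # Powell restart: fall back to steepest descent when
            # successive gradients are far from orthogonal.
            if abs(g_new @ g) >= 0.2 * (g_new @ g_new):
                beta = 0.0
            d = -g_new + max(beta, 0.0) * d
            g = g_new
        return x

    print(cg_powell(rosenbrock, rosenbrock_grad, [-1.2, 1.0]))  # approaches [1, 1]

The 0.2 threshold in the restart test is the value commonly attributed to Powell; whether the paper uses exactly this criterion is not stated in the abstract.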