title: Why feed-forward networks are in a bad shape
creator: Smagt, P. van der
creator: Hirzinger, G.
subject: Neural Nets
subject: Statistical Models
description: It has often been noted that the learning problem in feed-forward neural networks is very badly conditioned. Although the special form of the transfer function is usually taken to be the cause of this ill-conditioning, we show that it is instead caused by the manner in which the neurons are connected. By analyzing the expected values of the Hessian of a feed-forward network, we show that even when the learning samples are well chosen and the transfer function is not in its saturated state, the system is poorly conditioned. We subsequently propose a change to the feed-forward network structure which alleviates this problem, and finally demonstrate the positive influence of this approach.
publisher: Springer Verlag, Berlin
date: 1998
type: Conference Paper
type: NonPeerReviewed
format: application/postscript
identifier: http://cogprints.org/495/2/SmaHir98b.ps
identifier: Smagt, P. van der and Hirzinger, G. (1998) Why feed-forward networks are in a bad shape. [Conference Paper] (In Press)
relation: http://cogprints.org/495/
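
The abstract's central claim is that the Hessian of the learning problem in an ordinary feed-forward network is poorly conditioned even under benign conditions. The following is a minimal numerical sketch, not taken from the paper, that illustrates how one might inspect this: it estimates the Hessian of a squared-error loss for a tiny one-hidden-layer sigmoid network by finite differences and reports its eigenvalue spread. All network sizes, data, and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumptions, not from the paper).
n_in, n_hid, n_out, n_samples = 2, 3, 1, 20
X = rng.standard_normal((n_samples, n_in))
T = rng.standard_normal((n_samples, n_out))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(w):
    """Split the flat parameter vector into the two weight matrices."""
    W1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    W2 = w[n_in * n_hid:].reshape(n_hid, n_out)
    return W1, W2

def loss(w):
    """Mean squared error of a one-hidden-layer sigmoid network."""
    W1, W2 = unpack(w)
    H = sigmoid(X @ W1)   # hidden activations
    Y = H @ W2            # linear output layer
    return 0.5 * np.mean((Y - T) ** 2)

def hessian(f, w, eps=1e-4):
    """Central-difference estimate of the Hessian of f at w."""
    n = w.size
    Hm = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = eps
            e_j = np.zeros(n); e_j[j] = eps
            Hm[i, j] = (f(w + e_i + e_j) - f(w + e_i - e_j)
                        - f(w - e_i + e_j) + f(w - e_i - e_j)) / (4 * eps**2)
    return Hm

# Small initial weights keep the sigmoids away from saturation, matching
# the abstract's "transfer function is not in its saturated state" setting.
w0 = rng.standard_normal(n_in * n_hid + n_hid * n_out) * 0.5
Hm = hessian(loss, w0)
eigvals = np.linalg.eigvalsh(0.5 * (Hm + Hm.T))  # symmetrize numerical noise

print("smallest / largest eigenvalue:", eigvals.min(), eigvals.max())
print("condition estimate:", abs(eigvals).max() / max(abs(eigvals).min(), 1e-12))
```

Even in such a small, unsaturated network the ratio between the largest and smallest Hessian eigenvalues is typically large, which is the kind of ill-conditioning the paper attributes to the connection structure rather than to the transfer function.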