Refereed full papers (journals, book chapters, international conferences)

1992

Patrice Simard, Bernard Victorri, Yann Le Cun and John Denker, Tangent Prop - A formalism for specifying selected invariances in an adaptive network, Advances in Neural Information Processing Systems, 4, pp. 895-903, 1992.

In many machine learning applications, one has access not only to training data, but also to some high-level a priori knowledge about the desired behaviour of the system. For example, it is known in advance that the output of a character recognizer should be invariant with respect to small spatial distortions of the input images (translations, rotations, scale changes, etcetera). We have implemented a scheme that allows a network to learn the derivative of its outputs with respect to distortion operators of our choosing. This not only reduces the learning time and the amount of training data, but also provides a powerful language for specifying what generalizations we wish the network to perform.
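To illustrate the idea described in the abstract, the following is a minimal sketch (not the authors' original implementation) of a Tangent Prop style training objective in JAX. It assumes a generic network function apply_fn(params, x), a squared-error data loss as a placeholder, and a tangent vector supplied by the caller (e.g. a finite difference between a slightly rotated image and the original); the names apply_fn, lam, and tangent are illustrative, not from the paper.

import jax
import jax.numpy as jnp

def tangent_prop_loss(params, apply_fn, x, y, tangent, lam=0.1):
    """Data-fitting loss plus a Tangent Prop style invariance penalty.

    `tangent` approximates the derivative of the distorted input with
    respect to the distortion parameter (e.g. rotation angle) at the
    undistorted image.
    """
    # jax.jvp returns the network output and the Jacobian-vector product
    # of the output along the tangent direction in a single forward pass.
    preds, directional_deriv = jax.jvp(
        lambda inp: apply_fn(params, inp), (x,), (tangent,)
    )

    data_loss = jnp.mean((preds - y) ** 2)
    # Penalise the output's sensitivity to the chosen distortion,
    # driving the directional derivative toward zero (invariance).
    invariance_penalty = jnp.mean(directional_deriv ** 2)
    return data_loss + lam * invariance_penalty

The penalty term is what lets the network "learn the derivative of its outputs with respect to distortion operators": minimising it pushes the output's directional derivative along each specified distortion toward zero, which is the invariance the abstract refers to.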