
Analogue Imprecision in MLP Training

Part of the Progress in Neural Processing series

Hardware inaccuracy and imprecision are important considerations when implementing neural algorithms.

This book presents a study of synaptic weight noise as a typical fault model for analogue VLSI realisations of MLP neural networks and examines the implications for learning and network performance.

The aim of the book is to show how incorporating an imprecision model into the learning scheme, as a "fault tolerance hint", can aid understanding of the accuracy and precision requirements of a particular implementation.

In addition, the study shows how such a scheme can give rise to significant performance enhancements.
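The description points at a concrete technique: perturbing the synaptic weights during training so the learned solution tolerates the imprecision of an analogue substrate. As a rough, hypothetical sketch of that idea (not the book's exact formulation; the multiplicative noise model, network size, and hyperparameters are illustrative assumptions), weight-noise injection in a small MLP trained by backpropagation might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_with_weight_noise(X, y, hidden=8, noise_std=0.1, lr=0.5, epochs=2000):
    """Train a one-hidden-layer MLP while injecting multiplicative
    Gaussian weight noise at every forward pass (a crude stand-in for
    analogue weight imprecision). Gradients flow through the noisy
    weights, but updates are applied to the underlying clean weights."""
    n_in, n_out = X.shape[1], y.shape[1]
    W1 = rng.normal(0.0, 0.5, (n_in, hidden))
    W2 = rng.normal(0.0, 0.5, (hidden, n_out))
    for _ in range(epochs):
        # Assumed fault model: w -> w * (1 + eps), eps ~ N(0, noise_std^2)
        N1 = W1 * (1.0 + rng.normal(0.0, noise_std, W1.shape))
        N2 = W2 * (1.0 + rng.normal(0.0, noise_std, W2.shape))
        # Forward pass through the perturbed weights
        h = sigmoid(X @ N1)
        out = sigmoid(h @ N2)
        # Squared-error backprop, also through the perturbed weights
        d_out = (out - y) * out * (1.0 - out)
        d_h = (d_out @ N2.T) * h * (1.0 - h)
        # Updates land on the clean weights
        W2 -= lr * h.T @ d_out
        W1 -= lr * X.T @ d_h
    return W1, W2

# Toy usage: learn XOR, then probe behaviour under the same noise model
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train_with_weight_noise(X, y)
```

A network trained this way is, in effect, optimised for the average-case noisy weights, which is why injected imprecision can act both as a fault-tolerance hint and as a regulariser.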

Special order line: only available to educational & business accounts.
Price: £136.00
Product Details
Publisher: World Scientific Publishing
ISBN: 9812830014 / 9789812830012
Format: eBook (Adobe PDF)
Published: 08/01/1996
Country of publication: Singapore
Language: English
Pages: 173
Permissions: copy 20%; print 20%