Analogue Imprecision in MLP Training (Progress in Neural Processing, Vol. 4)

Peter Edwards

Format: Print Book

ISBN: 9789810227395

Price: Rp 1.460.984,71 (tax included)


Hardware inaccuracy and imprecision are important considerations when implementing neural algorithms. This book presents a study of synaptic weight noise as a typical fault model for analogue VLSI realisations of MLP neural networks and examines the implications for learning and network performance. The aim of the book is to show how including an imprecision model in the learning scheme, as a "fault tolerance hint", can aid understanding of the accuracy and precision requirements of a particular implementation. In addition, the study shows how such a scheme can give rise to significant performance enhancement.
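To illustrate the general idea behind training with a weight-noise fault model, the sketch below (not taken from the book) injects Gaussian noise into the synaptic weights of a small MLP at every forward/backward pass while applying the resulting updates to the clean weights. The network size, noise level (sigma), learning rate, and toy regression task are all illustrative assumptions.

```python
# Minimal sketch: MLP training with synaptic weight noise injection.
# All hyperparameters and the task are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x)
X = rng.uniform(-np.pi, np.pi, (256, 1))
y = np.sin(X)

# One hidden layer MLP
W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

sigma = 0.05   # assumed weight-noise standard deviation
lr = 0.05

for epoch in range(2000):
    # Inject synaptic weight noise: noisy copies are used for the forward
    # and backward passes, but updates are applied to the clean weights.
    W1n = W1 + rng.normal(0, sigma, W1.shape)
    W2n = W2 + rng.normal(0, sigma, W2.shape)

    # Forward pass with noisy weights
    h = np.tanh(X @ W1n + b1)
    pred = h @ W2n + b2
    err = pred - y                      # MSE gradient term

    # Backward pass (gradients taken through the noisy weights)
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2n.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)

    # Update the underlying (noise-free) weights
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("final MSE:", float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))
```

Training in this way exposes the network to the kind of perturbations an analogue VLSI implementation would introduce, which is the sense in which the noise acts as a "fault tolerance hint".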

Format: Hardcover
Number of pages: 192
Imprint: World Scientific
Publication date: 1 August 1996
Series: Progress In Neural Processing