Analogue Imprecision in MLP Training, Progress in Neural Processing, Vol. 4

Ebook in the Progress in Neural Processing series

By Peter Edwards

Hardware inaccuracy and imprecision are important considerations when implementing neural algorithms. This book presents a study of synaptic weight noise as a typical fault model for analogue VLSI realisations of MLP neural networks, and examines the implications for learning and network performance. The aim of the book is to show how including an imprecision model in the learning scheme, as a "fault tolerance hint", can aid understanding of the accuracy and precision requirements of a particular implementation. In addition, the study shows how such a scheme can give rise to significant performance enhancement.
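
The technique the blurb describes can be illustrated in a few lines. Below is a minimal sketch, not taken from the book, of training an MLP with Gaussian synaptic weight noise injected on every forward pass, so that gradients computed at the noisy weights act as a "fault tolerance hint" for the clean weights. The network size, noise level (noise_sd), learning rate, and the XOR task are all illustrative assumptions.

import numpy as np

# Minimal sketch, assuming a small sigmoid MLP on XOR; all hyperparameters
# below are illustrative choices, not taken from the book.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 0.5, (2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(0.0, 0.5, (4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, noise_sd = 1.0, 0.05  # noise_sd models analogue weight imprecision

for epoch in range(10000):
    # Draw fresh weight noise each pass; forward/backward run on the noisy
    # weights, but the updates are applied to the clean weights.
    W1n = W1 + rng.normal(0.0, noise_sd, W1.shape)
    W2n = W2 + rng.normal(0.0, noise_sd, W2.shape)

    h = sigmoid(X @ W1n + b1)            # forward pass with noisy weights
    out = sigmoid(h @ W2n + b2)

    d_out = (out - y) * out * (1 - out)  # backprop of squared-error loss
    d_h = (d_out @ W2n.T) * h * (1 - h)

    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0, keepdims=True)

# Evaluate under a fresh noise draw, mimicking one imprecise analogue realisation.
W1t = W1 + rng.normal(0.0, noise_sd, W1.shape)
W2t = W2 + rng.normal(0.0, noise_sd, W2.shape)
pred = sigmoid(sigmoid(X @ W1t + b1) @ W2t + b2)
print(np.round(pred, 2).ravel())  # should approximate [0, 1, 1, 0]

Because the weights are perturbed anew on every pass, gradient descent is pushed toward solutions that remain accurate under weight perturbation, which is the performance-enhancement effect the study examines.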