Path: utzoo!utgpu!water!watmath!clyde!att!osu-cis!tut.cis.ohio-state.edu!accelerator.eng.ohio-state.edu!lepton.eng.ohio-state.edu!borgstrm
From: borgstrm@lepton.eng.ohio-state.edu (Tom Borgstrom)
Newsgroups: comp.ai.neural-nets
Subject: Analog Vs. Digital Weights
Keywords: Neural Nets, synaptic weights
Message-ID: <646@accelerator.eng.ohio-state.edu>
Date: 25 Sep 88 21:13:25 GMT
Sender: news@accelerator.eng.ohio-state.edu
Reply-To: borgstrm@icarus.eng.ohio-state.edu (Tom Borgstrom)
Organization: The Ohio State University Dept of Electrical Engineering
Lines: 20

I am interested in finding performance/capacity comparisons between neural
networks that use discrete synaptic weights and those that use
continuous-valued weights.

I have one reference: R. J. McEliece, E. C. Posner, et al., "The Capacity of
the Hopfield Associative Memory," IEEE Transactions on Information Theory,
vol. IT-33, no. 4, July 1987.  The authors claim to "only lose 19 percent of
capacity by ... three level quantization."  Is this true?  Has anyone else
done hardware/software simulations to verify this?
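For anyone wondering what kind of comparison I have in mind: my reading of
"three level quantization" is that the usual sum-of-outer-products (Hebbian)
weights are simply clipped to their sign, i.e. to {-1, 0, +1}.  Here is a
rough sketch of the simulation I would like to see results from (the network
size, number of patterns, and noise level below are arbitrary choices of
mine, not numbers from the paper):

    # Sketch: recall performance of a Hopfield net with full outer-product
    # weights vs. the same weights clipped to three levels {-1, 0, +1}.
    import numpy as np

    rng = np.random.default_rng(0)
    N, M = 100, 8                         # neurons, stored patterns (arbitrary)
    patterns = rng.choice([-1, 1], size=(M, N))

    # Hebbian (sum-of-outer-products) weights, zero diagonal
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0)

    # Three-level quantization: keep only the sign of each weight
    Wq = np.sign(W)

    def recall(W, x, steps=20):
        """Synchronous Hopfield updates until a fixed point or step limit."""
        for _ in range(steps):
            x_new = np.sign(W @ x)
            x_new[x_new == 0] = 1         # break ties arbitrarily
            if np.array_equal(x_new, x):
                break
            x = x_new
        return x

    def fraction_recovered(W):
        """Fraction of stored patterns recovered from a 10%-noisy probe."""
        ok = 0
        for p in patterns:
            probe = p.copy()
            flip = rng.choice(N, size=N // 10, replace=False)
            probe[flip] *= -1             # flip 10% of the bits
            ok += np.array_equal(recall(W, probe), p)
        return ok / M

    print("full weights:    ", fraction_recovered(W))
    print("clipped weights: ", fraction_recovered(Wq))

If my reading of the quantization scheme is wrong, I would appreciate a
correction along with any simulation results.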

Please reply by e-mail; I will post a summary if there is a large enough
response. 



-=-
Tom Borgstrom            |borgstrm@icarus.eng.ohio-state.edu
The Ohio State University|...!osu-cis!tut!icarus.eng.ohio-state.edu!borgstrm
2015 Neil Avenue         |
Columbus, Ohio  43210    |