Statements in which this resource occurs as the subject.
Predicate                Object
rdf:type
lifeskim:mentions
pubmed:issue             4
pubmed:dateCreated       1996-12-03
pubmed:abstractText      This paper investigates fault tolerance in feedforward neural networks, for a realistic fault model based on analog hardware. In our previous work with synaptic weight noise we showed significant fault tolerance enhancement over standard training algorithms. We proposed that when introduced into training, weight noise distributes the network computation more evenly across the weights and thus enhances fault tolerance. Here we compare those results with an approximation to the mechanisms induced by stochastic weight noise, incorporated into training deterministically via penalty terms. The penalty terms are an approximation to weight saliency and therefore, in addition, we assess a number of other weight saliency measures and perform comparison experiments. The results show that the first term approximation is an incomplete model of weight noise in terms of fault tolerance. Also the error Hessian is shown to be the most accurate measure of weight saliency.
pubmed:language          eng
pubmed:journal
pubmed:citationSubset    IM
pubmed:status            MEDLINE
pubmed:month             Dec
pubmed:issn              0129-0657
pubmed:author
pubmed:issnType          Print
pubmed:volume            6
pubmed:owner             NLM
pubmed:authorsComplete   Y
pubmed:pagination        401-16
pubmed:dateRevised       2006-11-15
pubmed:meshHeading
pubmed:year              1995
pubmed:articleTitle      Can deterministic penalty terms model the effects of synaptic weight noise on network fault-tolerance?
pubmed:affiliation       Department of Electrical Engineering, University of Edinburgh, Scotland, UK.
pubmed:publicationType   Journal Article, Review, Research Support, Non-U.S. Gov't
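The abstract describes training feedforward networks with stochastic synaptic weight noise: at each training step the weights are perturbed, gradients are computed through the perturbed weights, and the updates are applied to the nominal weights. The following is a minimal illustrative sketch of that idea only, not the paper's experimental setup; the task (XOR), network size, learning rate, and noise level `sigma` are all assumed values chosen for a small runnable example.

```python
import numpy as np

# Illustrative sketch of weight-noise training (assumed task and hyperparameters).
rng = np.random.default_rng(0)

# XOR task: a toy stand-in for the paper's benchmarks.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One-hidden-layer feedforward network (2 -> 8 -> 1, sigmoid units).
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0      # learning rate (assumed)
sigma = 0.05  # std dev of the injected Gaussian weight noise (assumed)

for _ in range(3000):
    # Draw a fresh weight perturbation for this step only.
    W1n = W1 + rng.normal(0, sigma, W1.shape)
    W2n = W2 + rng.normal(0, sigma, W2.shape)

    # Forward pass through the *noisy* weights.
    h = sigmoid(X @ W1n + b1)
    out = sigmoid(h @ W2n + b2)

    # Backprop (squared-error loss) through the noisy weights,
    # but apply the updates to the nominal weights.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2n.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

# Evaluate the nominal (noise-free) network after training.
pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
```

Because every step sees a different perturbation, no single weight can be relied on too heavily, which is the mechanism the abstract credits for spreading the computation across the weights and improving tolerance to hardware faults.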