Title: Synthesis of multiple-valued logic functions by neural networks.
Authors: Ngom, Alioune.
Date: 1998
Abstract: The issue we address in this thesis is that of implementing multiple-valued logic systems by neural networks. In particular, we discuss models of multiple-valued logic neurons (and neural networks) and how such models might be used to learn or compute multiple-valued logic functions. Analog computers are inherently inaccurate due to imperfections in fabrication and fluctuations in operating temperature. The classical solution to this problem uses extra hardware to enforce discrete behavior. However, the brain appears to compute reliably with inaccurate components without necessarily resorting to discrete techniques. The continuous neural network is a computational model based upon certain observed features of the brain. Experimental evidence has shown continuous neural networks to be extremely fault-tolerant; in particular, their performance does not appear to be significantly impaired when precision is limited. It has been shown in the literature that analog neurons of limited precision are essentially discrete multiple-valued neurons. Our research focuses on the synthesis of multiple-valued functions by some models of multiple-valued neural networks. We introduce a model of multiple-valued neuron, the multiple-valued multiple-threshold perceptron, which extends and generalizes the well-known binary perceptron and other previously studied models. Besides investigating the computational and learning abilities of this main model, we discuss methods for optimizing identified resources of multiple-valued neural networks. Our main contributions in the fields of neural networks and multiple-valued logic are summarized as follows. We obtain bounds on the computing capacity of multiple-valued multiple-threshold perceptrons. We obtain a high-capacity learning algorithm for the multiple-valued multiple-threshold perceptron with guaranteed convergence properties.
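The core model named in the abstract, the multiple-valued multiple-threshold perceptron, can be pictured as a neuron whose weighted sum is compared against an ordered list of thresholds, with the interval it falls into selecting one of k output values. The following is a minimal illustrative sketch under that reading (the function name, signature, and example values are my own, not taken from the thesis):

```python
def mv_perceptron(x, w, thresholds, outputs):
    """Sketch of a multiple-valued multiple-threshold perceptron.

    The weighted sum w.x is compared against an ascending list of
    thresholds; the index of the interval it lands in selects one of
    len(thresholds) + 1 output values. With a single threshold and
    outputs (0, 1), this reduces to the classical binary perceptron.
    (Illustrative assumption, not the thesis's exact formulation.)
    """
    s = sum(wi * xi for wi, xi in zip(w, x))
    # count how many thresholds the weighted sum has reached
    idx = sum(1 for t in thresholds if s >= t)
    return outputs[idx]

# Example: a 3-valued neuron with two thresholds maps the weighted
# sum onto a "staircase" of output values in {0, 1, 2}.
w = [1.0, 1.0]
thresholds = [0.5, 1.5]   # must be sorted ascending
outputs = [0, 1, 2]       # len(outputs) == len(thresholds) + 1
```

With these example parameters the neuron computes a 3-valued staircase of the input sum, which is the sense in which the model generalizes the single-threshold binary perceptron.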
We present two techniques for constructing minimal multiple-valued neural networks for given but arbitrary functions. We propose a method for minimizing the number of thresholds of multiple-valued perceptrons. We introduce a neuro-genetic approach to the minimization of multiple-valued logic expressions.
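One elementary way a threshold count can be reduced, consistent with the model above though not necessarily the thesis's own minimization method, is that a threshold separating two adjacent intervals mapped to the same output value is redundant and can be dropped without changing the computed function. A hedged sketch (all names are illustrative):

```python
def merge_redundant_thresholds(thresholds, outputs):
    """Drop any threshold whose two adjacent intervals produce the
    same output value; the resulting perceptron computes the same
    function with fewer thresholds. (Generic simplification shown
    for illustration, not the minimization algorithm of the thesis.)
    """
    new_t, new_o = [], [outputs[0]]
    for t, o in zip(thresholds, outputs[1:]):
        if o != new_o[-1]:          # output changes: threshold needed
            new_t.append(t)
            new_o.append(o)
    return new_t, new_o

# Example: outputs [0, 1, 1, 2] over thresholds [1, 2, 3] -- the
# middle threshold separates two intervals with the same output,
# so it can be removed.
```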
Collection: Thèses, 1910 - 2010 // Theses, 1910 - 2010
File: NQ36787.PDF (6 MB, Adobe PDF)