- neural network
1. any group of neurons that conduct impulses in a coordinated manner, as the assemblages of brain cells that record a visual stimulus.
2. Also called neural net. a computer model designed to simulate the behavior of biological neural networks, as in pattern recognition, language processing, and problem solving, with the goal of self-directed information processing. [1985-90]
* * *
Type of parallel computation in which computing elements are modeled on the network of neurons that constitute animal nervous systems. This model, intended to simulate the way the brain processes information, enables the computer to "learn" to a certain degree. A neural network typically consists of a number of interconnected processors, or nodes. Each handles a designated sphere of knowledge and has several inputs and one output to the network. Based on the inputs it gets, a node can "learn" about the relationships between sets of data, sometimes using the principles of fuzzy logic. For example, a backgammon program can store and grade results from moves in a game; in the next game, it can play a move based on its stored result and can regrade the stored result if the move is unsuccessful. Neural networks have been used in pattern recognition, speech analysis, oil exploration, weather prediction, and the modeling of thinking and consciousness.
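The node described above can be sketched in a few lines of code. The sketch below is illustrative and not drawn from the entry itself: it assumes a single hard-threshold unit whose weights are adjusted by a perceptron-style rule, and it uses the logical AND relationship as stand-in data.

    # Minimal sketch of a single "node": several weighted inputs, one output, and a
    # simple error-driven weight adjustment (a perceptron-style rule). The rule and
    # the AND example are illustrative assumptions, not taken from the entry.

    def node_output(weights, bias, inputs):
        # Weighted sum of the inputs, passed through a hard threshold.
        total = sum(w * x for w, x in zip(weights, inputs))
        return 1 if total + bias > 0 else 0

    def train_node(samples, rate=0.1, epochs=25):
        # Nudge each weight whenever the node's output misses the desired output.
        n_inputs = len(samples[0][0])
        weights, bias = [0.0] * n_inputs, 0.0
        for _ in range(epochs):
            for inputs, desired in samples:
                error = desired - node_output(weights, bias, inputs)
                weights = [w + rate * error * x for w, x in zip(weights, inputs)]
                bias += rate * error
        return weights, bias

    # The node "learns" the relationship between two inputs and their logical AND.
    samples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    weights, bias = train_node(samples)
    print([node_output(weights, bias, x) for x, _ in samples])  # expected: [0, 0, 0, 1]

After training, the stored weights encode the relationship between the inputs and the desired output, in the spirit of the backgammon program's graded, regraded results.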
* * *

a computer program that operates in a manner analogous to the natural neural network in the brain. The theoretical basis of neural networks was developed in 1943 by the neurophysiologist Warren McCulloch of the University of Illinois and the mathematician Walter Pitts of the University of Chicago. In 1954 Belmont Farley and Wesley Clark of the Massachusetts Institute of Technology succeeded in running the first simple neural network. The primary appeal of neural networks is their ability to emulate the brain's pattern-recognition skills. Among commercial applications of this ability, neural networks have been used to make investment decisions, recognize handwriting, and even detect bombs.

A distinguishing feature of neural networks is that knowledge is distributed throughout the network itself rather than being explicitly written into the program. The network then learns through exposure to various situations. Neural networks are able to accomplish this because they are built of processing elements (artificial neurons) grouped into layers, as in a simple feedforward network. The input layer of artificial neurons receives information from the environment, and the output layer communicates the response; between these layers may be one or more “hidden” layers (with no direct contact with the environment), where most of the information processing takes place. The output of a neural network depends on the “weights” of the connections between neurons in different layers. Each weight indicates the relative importance of a particular connection. If the total of all the weighted inputs received by a particular neuron surpasses a certain threshold value, the neuron will send a signal to each neuron to which it is connected in the next layer. Neural networks may be used, for example, to process loan applications, in which the inputs may represent loan application data and the output whether or not to grant a loan.

Two modifications of this simple feedforward neural network account for the growth of commercial applications. First, a network can be equipped with a feedback mechanism, known as a back-propagation algorithm, that enables it to adjust the connection weights back through the network, training it in response to representative examples. Second, recurrent neural networks can be developed, involving signals that proceed in both directions as well as within and between layers, and these networks are capable of vastly more complicated patterns of association. (In fact, for large networks it can be extremely difficult to follow exactly how an output was determined.)

Training neural networks typically involves supervised learning, where each training example contains the values of both the input data and the desired output. As soon as the network is able to perform sufficiently well on additional test cases, it can be used to classify new cases. For example, researchers at the University of British Columbia have trained a feedforward neural network with temperature and pressure data from the tropical Pacific Ocean and from North America to predict future global weather patterns.

In contrast, certain neural networks are trained through unsupervised learning, in which a network is presented with a collection of input data and given the goal of discovering patterns, without being told what specifically to look for. Such a neural network might be used, for example, to discover clusters of customers in a marketing database during a process known as data mining.

Vladimir Zwass
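The mechanics described above, weighted connections between layers, an error signal propagated backward through those connections, and training on examples that pair inputs with desired outputs, can be illustrated with a short program. The sketch below is not taken from the article: it assumes sigmoid units rather than hard thresholds, a single hidden layer, and the XOR pattern as stand-in training data; the layer sizes, learning rate, and epoch count are arbitrary illustrative choices.

    import math
    import random

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    class FeedforwardNet:
        """One hidden layer, trained with the standard back-propagation rule."""

        def __init__(self, n_in, n_hidden, seed=1):
            rng = random.Random(seed)
            # One weight per connection plus a bias ("threshold") per neuron.
            self.w_hid = [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
            self.b_hid = [0.0] * n_hidden
            self.w_out = [rng.uniform(-1, 1) for _ in range(n_hidden)]
            self.b_out = 0.0

        def forward(self, inputs):
            hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
                      for ws, b in zip(self.w_hid, self.b_hid)]
            output = sigmoid(sum(w * h for w, h in zip(self.w_out, hidden)) + self.b_out)
            return hidden, output

        def train(self, samples, rate=0.5, epochs=10000):
            # Supervised learning: each sample pairs input data with the desired output.
            for _ in range(epochs):
                for inputs, target in samples:
                    hidden, output = self.forward(inputs)
                    # Error at the output, propagated back through the connection weights.
                    d_out = (target - output) * output * (1 - output)
                    d_hid = [d_out * w * h * (1 - h) for w, h in zip(self.w_out, hidden)]
                    self.w_out = [w + rate * d_out * h for w, h in zip(self.w_out, hidden)]
                    self.b_out += rate * d_out
                    for j, d in enumerate(d_hid):
                        self.w_hid[j] = [w + rate * d * x for w, x in zip(self.w_hid[j], inputs)]
                        self.b_hid[j] += rate * d

    # Stand-in task: XOR, a pattern no single threshold unit can represent.
    samples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    net = FeedforwardNet(n_in=2, n_hidden=3)
    net.train(samples)
    print([round(net.forward(x)[1], 2) for x, _ in samples])  # should approach [0, 1, 1, 0]

Here the knowledge is held entirely in the adjusted weights, not in any explicit rule, which is the distributed representation the article emphasizes; an unsupervised variant would instead omit the target values and group the inputs by similarity.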
* * *

Universalium. 2010.