- Brief History of Neural Networks
- Definition of a Neural Network
- Learning in a Neural Network
- The Pattern Associator
- The Hebb Rule
- The Delta Rule
- The Generalized Delta Rule

This module on Neural Networks was written by Ingrid Russell of the University of Hartford. It is printed with permission from Collegiate Microcomputer Journal.

If you have any comments or suggestions, please send email to irussell@mail.hartford.edu

The earliest work in neural computing goes back to the 1940s, when McCulloch and Pitts introduced the first neural network computing model. In the 1950s, Rosenblatt's work resulted in a two-layer network, the perceptron, which was capable of learning certain classifications by adjusting connection weights. Although the perceptron was successful in classifying certain patterns, it had a number of limitations. In particular, the perceptron could not solve the classic XOR (exclusive-or) problem, because the XOR function is not linearly separable. Such limitations led to the decline of the field of neural networks. However, the perceptron laid the foundations for later work in neural computing.
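The perceptron's limitation can be seen directly. The following is a minimal sketch (the function names, learning rate, and zero initialization are illustrative choices, not part of the original module) of a single-layer perceptron with a threshold output, trained with the classic error-correction rule. It learns AND, which is linearly separable, but can never classify all four XOR patterns correctly:

```python
# Illustrative sketch of a single-layer perceptron (threshold unit).
# Weight updates follow the classic rule: w <- w + lr * (target - output) * x.

def train_perceptron(samples, epochs=100, lr=0.1):
    """Train weights [w1, w2] and a bias on (inputs, target) pairs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def accuracy(samples, w, b):
    """Fraction of patterns the trained unit classifies correctly."""
    correct = 0
    for (x1, x2), target in samples:
        out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        correct += (out == target)
    return correct / len(samples)

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w, b = train_perceptron(AND)
print(accuracy(AND, w, b))   # 1.0 -- AND is linearly separable
w, b = train_perceptron(XOR)
print(accuracy(XOR, w, b))   # always below 1.0 -- no line separates XOR
```

Geometrically, the unit computes a single line in the input plane; AND's positive pattern can be cut off by one line, but XOR's two positive patterns (0,1) and (1,0) cannot be separated from (0,0) and (1,1) by any single line, so the weights oscillate rather than converge.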

In the early 1980s, researchers showed renewed interest in neural networks. Recent work includes Boltzmann machines, Hopfield nets, competitive learning models, multilayer networks, and adaptive resonance theory models.


Copyright 1996 by Ingrid Russell.