Neural Networks Module

The module below can be covered in a three-week period in the Introduction to AI course.

This module on Neural Networks was written by Ingrid Russell of the University of Hartford. It is printed with permission from Collegiate Microcomputer Journal.

If you have any comments or suggestions, please send email to

The Pattern Associator

A pattern associator learns associations between input patterns and output patterns. One of the most appealing characteristics of such a network is that it can generalize what it learns about one pattern to other similar input patterns. Pattern associators have been widely used in distributed memory modeling.

The pattern associator is one of the more basic two-layer networks. Its architecture consists of two sets of units, the input units and the output units. Each input unit connects to each output unit via a weighted connection. Connections are only allowed from input units to output units. The effect of a unit ui in the input layer on a unit uj in the output layer is determined by the product of the activation ai of ui and the weight wij of the connection from ui to uj. The activation of a unit uj in the output layer is given by: aj = SUMi(wij * ai), where the sum runs over all input units.
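Computing every output activation aj = SUMi(wij * ai) at once is simply a matrix-vector product. The sketch below illustrates this for a small, hypothetical network; the sizes, weights, and input pattern are illustrative assumptions, not taken from the module.

```python
import numpy as np

# Illustrative sizes for a small pattern associator (assumed values).
n_inputs, n_outputs = 4, 3

# weights[j, i] is the weight wij of the connection from input unit
# ui to output unit uj. Random values stand in for learned weights.
rng = np.random.default_rng(0)
weights = rng.normal(size=(n_outputs, n_inputs))

# Activations ai of the input units (an arbitrary example pattern).
a_in = np.array([1.0, 0.0, -1.0, 0.5])

# Each output activation aj = SUM_i(wij * ai): a matrix-vector product.
a_out = weights @ a_in

print(a_out.shape)  # one activation per output unit: (3,)
```

Stacking the weights into one matrix this way makes the two-layer architecture explicit: row j holds the incoming connections of output unit uj.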

A pattern associator can be trained to respond with a certain output pattern when presented with an input pattern. The connection weights can be adjusted in order to change the input/output behavior. However, one of the most interesting properties of these models is their ability to self-modify and learn. The learning rule specifies how a network changes its weights for a given input/output association. The most commonly used learning rules with pattern associators are the Hebb rule and the Delta rule.
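The two learning rules named above can be sketched as weight-update functions. This is a minimal illustration, assuming the standard textbook forms of the rules (Hebb: delta_wij = lr * tj * ai; Delta: delta_wij = lr * (tj - aj) * ai); the patterns and learning rate are made up for the example.

```python
import numpy as np

def hebb_update(weights, a_in, target, lr=0.1):
    # Hebb rule: strengthen wij in proportion to the product of the
    # input activation ai and the desired output activation tj.
    return weights + lr * np.outer(target, a_in)

def delta_update(weights, a_in, target, lr=0.1):
    # Delta rule: adjust wij in proportion to the error (tj - aj),
    # where aj is the output the network currently produces.
    a_out = weights @ a_in
    return weights + lr * np.outer(target - a_out, a_in)

# Train one association with the Delta rule (illustrative patterns).
a_in = np.array([1.0, -1.0, 1.0])
target = np.array([1.0, 1.0])
weights = np.zeros((2, 3))
for _ in range(100):
    weights = delta_update(weights, a_in, target, lr=0.1)

# The output converges toward the target pattern.
print(np.round(weights @ a_in, 3))
```

Note the key difference: the Hebb rule ignores what the network currently outputs, while the Delta rule drives the error toward zero, which is why repeated Delta updates converge on the target here.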

Copyright 1996 by Ingrid Russell.