Bidirectional associative memory

Bidirectional associative memory (BAM) is a type of recurrent neural network, introduced by Bart Kosko in 1988. There are two types of associative memory: auto-associative and hetero-associative. BAM is hetero-associative, meaning that given a pattern it can return another pattern, potentially of a different size. It is similar to the Hopfield network in that both are forms of associative memory; however, Hopfield networks return patterns of the same size.

It is said to be bidirectional because it can respond to inputs presented at either the input or the output layer.


Topology

A BAM contains two layers of neurons, which we shall denote X and Y. Layers X and Y are fully connected to each other. Once the weights have been established, presenting an input to layer X recalls the associated pattern in layer Y, and vice versa.

The layers are connected in both directions (hence bidirectional): the weight matrix for signals sent from layer X to layer Y is W, and the weight matrix for signals sent from layer Y to layer X is its transpose, Wᵀ. Thus a single weight matrix serves both directions.

Procedure

Learning

Imagine we wish to store two associations, A1:B1 and A2:B2.

  • A1 = (1, 0, 1, 0, 1, 0), B1 = (1, 1, 0, 0)
  • A2 = (1, 1, 1, 0, 0, 0), B2 = (1, 0, 1, 0)

These are then transformed into the bipolar forms:

  • X1 = (1, -1, 1, -1, 1, -1), Y1 = (1, 1, -1, -1)
  • X2 = (1, 1, 1, -1, -1, -1), Y2 = (1, -1, 1, -1)

From there, we calculate M = ∑ᵢ XᵢᵀYᵢ, the sum of the outer products of each pair, where ᵀ denotes the transpose. So,

M = [  2   0   0  -2 ]
    [  0  -2   2   0 ]
    [  2   0   0  -2 ]
    [ -2   0   0   2 ]
    [  0   2  -2   0 ]
    [ -2   0   0   2 ]
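The learning step above can be sketched in NumPy (a sketch of the worked example, not code from the original article; the variable names are mine):

```python
import numpy as np

# The two associations from the example above, in binary form.
A1, B1 = [1, 0, 1, 0, 1, 0], [1, 1, 0, 0]
A2, B2 = [1, 1, 1, 0, 0, 0], [1, 0, 1, 0]

def bipolar(v):
    """Map binary {0, 1} components to bipolar {-1, +1}."""
    return 2 * np.array(v) - 1

X1, Y1 = bipolar(A1), bipolar(B1)
X2, Y2 = bipolar(A2), bipolar(B2)

# Hebbian-style learning: M is the sum of the outer products X_i^T Y_i.
M = np.outer(X1, Y1) + np.outer(X2, Y2)
print(M)  # the 6x4 matrix shown above
```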

Recall

To retrieve the association for A1, we multiply A1 by M to get (4, 2, -2, -4), which, when run through a threshold, yields the binary vector (1, 1, 0, 0), which is B1. To find the reverse association, multiply this result by the transpose of M.
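This recall step can be sketched as follows (a sketch assuming NumPy; mapping non-positive activations to 0 is a convention I adopt here, and no zero activation occurs in this example):

```python
import numpy as np

# Weight matrix M from the worked example.
M = np.array([[ 2,  0,  0, -2],
              [ 0, -2,  2,  0],
              [ 2,  0,  0, -2],
              [-2,  0,  0,  2],
              [ 0,  2, -2,  0],
              [-2,  0,  0,  2]])

def threshold(v):
    """Binary threshold: positive activations become 1, the rest 0."""
    return (np.asarray(v) > 0).astype(int)

A1 = np.array([1, 0, 1, 0, 1, 0])
print(A1 @ M)              # activations (4, 2, -2, -4)
B = threshold(A1 @ M)      # recovers B1 = (1, 1, 0, 0)

# Reverse direction: multiply by the transpose of M.
A = threshold(B @ M.T)     # recovers A1 = (1, 0, 1, 0, 1, 0)
```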

Capacity

The memory or storage capacity of BAM may be given as min(m, n), where n is the number of units in the X layer and m is the number of units in the Y layer.

The internal matrix has n × p independent degrees of freedom, where n is the dimension of the first vector (6 in this example) and p is the dimension of the second (4). This allows the BAM to reliably store and recall up to min(n, p) independent vector pairs, or min(6, 4) = 4 in this example. The capacity can be pushed above this bound by sacrificing reliability (accepting incorrect bits in the output).

Stability

A pair ( A , B ) {\displaystyle (A,B)} defines the state of a BAM. To store a pattern, the energy function value for that pattern has to occupy a minimum point in the energy landscape.

The stability analysis of a BAM is based on the definition of a Lyapunov function (energy function) E, which assigns an energy to each state (A, B). When a paired pattern (A, B) is presented to the BAM, the neurons change states until a bidirectionally stable state (A_f, B_f) is reached, which Kosko proved corresponds to a local minimum of the energy function. The discrete BAM is thus proved to converge to a stable state.
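The settling process, in which the two layers update each other until neither changes, can be sketched as follows (a sketch under my own naming, reusing the weight matrix from the worked example; letting a zero activation keep the previous state is a common convention, not taken from this article):

```python
import numpy as np

# Weight matrix M from the worked example.
M = np.array([[ 2,  0,  0, -2],
              [ 0, -2,  2,  0],
              [ 2,  0,  0, -2],
              [-2,  0,  0,  2],
              [ 0,  2, -2,  0],
              [-2,  0,  0,  2]])

def sgn(v, prev):
    """Bipolar sign function; a zero activation keeps the previous state."""
    v = np.asarray(v)
    return np.where(v > 0, 1, np.where(v < 0, -1, prev))

def settle(A, M, max_iters=100):
    """Ping-pong between layers until (A, B) is bidirectionally stable."""
    A = np.asarray(A)
    B = sgn(A @ M, np.ones(M.shape[1], dtype=int))
    for _ in range(max_iters):
        A_new = sgn(B @ M.T, A)
        B_new = sgn(A_new @ M, B)
        if np.array_equal(A_new, A) and np.array_equal(B_new, B):
            return A_new, B_new
        A, B = A_new, B_new
    return A, B

# A noisy version of X1 (last bit flipped) still settles to (X1, Y1).
A_f, B_f = settle([1, -1, 1, -1, 1, 1], M)
```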

The energy function proposed by Kosko is E(A, B) = -A M Bᵀ for the bidirectional case, which in the particular case A = B reduces to Hopfield's auto-associative energy function, E(A) = -A M Aᵀ.
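A quick numerical check of this energy function on the worked example (a sketch; the perturbed state, with the first bit of Y1 flipped, is my own choice to illustrate that a stored pair sits at a lower energy):

```python
import numpy as np

# Weight matrix M from the worked example.
M = np.array([[ 2,  0,  0, -2],
              [ 0, -2,  2,  0],
              [ 2,  0,  0, -2],
              [-2,  0,  0,  2],
              [ 0,  2, -2,  0],
              [-2,  0,  0,  2]])

def energy(A, B, M):
    """Kosko's energy function E(A, B) = -A M B^T."""
    return -np.asarray(A) @ M @ np.asarray(B)

# Bipolar form of the stored pair (X1, Y1).
X1 = np.array([1, -1, 1, -1, 1, -1])
Y1 = np.array([1, 1, -1, -1])

e_stored = energy(X1, Y1, M)                  # -24
e_perturbed = energy(X1, [-1, 1, -1, -1], M)  # -8: higher energy
```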

References

  1. ^ Kosko, B. (1988). "Bidirectional Associative Memories" (PDF). IEEE Transactions on Systems, Man, and Cybernetics. 18 (1): 49–60. doi:10.1109/21.87054.
  2. ^ "Principles of Soft Computing, 3ed". www.wileyindia.com. Retrieved 2020-08-15.
  3. ^ Rajasekaran, S.; Pai, G. A. Vijayalakshmi (2003). Neural Networks, Fuzzy Logic and Genetic Algorithm: Synthesis and Applications. PHI Learning Pvt. Ltd. ISBN 978-81-203-2186-1.
