
NEURAL NETWORK

PROJECT # 5

Binary Hopfield Network

1.     Introduction

Binary networks use the activation values +1 and -1. In this project, the network is trained to recognize patterns within a 12x10 grid. Patterns are formed on the grid by setting the neurons' activation levels from their net inputs: if a neuron's net input is less than 0, the neuron sets its activation to -1; if the net input is greater than 0, the neuron sets its activation to +1; and if the net input is exactly 0, the activation stays the same.
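The update rule above can be sketched in Java as follows. This is an illustrative sketch, not the original Binary_network.java; the class and method names are assumptions.

```java
// Sketch of the binary Hopfield update rule described above.
// Each neuron takes the weighted sum of the other neurons'
// activations and thresholds it at zero, keeping its old
// activation when the net input is exactly 0.
public class HopfieldUpdate {

    // New activation for one neuron, given its net input.
    static int nextActivation(double netInput, int current) {
        if (netInput < 0) return -1;  // negative net input -> -1
        if (netInput > 0) return +1;  // positive net input -> +1
        return current;               // net input of 0: unchanged
    }

    // One asynchronous update of neuron i over weight matrix w.
    static int updateNeuron(double[][] w, int[] state, int i) {
        double net = 0.0;
        for (int j = 0; j < state.length; j++) {
            if (j != i) net += w[i][j] * state[j];
        }
        return nextActivation(net, state[i]);
    }
}
```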

2.     Source Code

 

Binary_network.java (pattern recognition algorithm)

Bi_Hop_Disp.java (GUI user interface)

 

Pattern Recognition Simulation

 

3.     Simulation Results

        Test each pattern to see if it is stable. That is, if you initialize the network with a perfect version of the pattern (no noise), and then update the neurons, does the network remain in the pattern? Report the results for each pattern. Is it stable? If not, then show the final state of the network after it converges. Describe how close the final state is to the initial state.

 

Answer

When the network was initialized with a perfect version of each pattern (no noise), I found that the results were stable. The final state was very close to the initial state.
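The stability test described in the question can be sketched as follows, assuming the standard Hebbian training rule (sum of outer products with a zero diagonal). The class and method names here are illustrative, not taken from the original source.

```java
// A minimal stability check for a binary Hopfield network.
// A stored pattern is stable if one full sweep of neuron
// updates leaves every activation unchanged.
public class StabilityCheck {

    // Hebbian training: w[i][j] accumulates p[i]*p[j] over
    // all patterns; the diagonal is left at zero.
    static double[][] train(int[][] patterns) {
        int n = patterns[0].length;
        double[][] w = new double[n][n];
        for (int[] p : patterns)
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    if (i != j) w[i][j] += p[i] * p[j];
        return w;
    }

    // True if no neuron would change when updated in state p.
    static boolean isStable(double[][] w, int[] p) {
        for (int i = 0; i < p.length; i++) {
            double net = 0.0;
            for (int j = 0; j < p.length; j++) net += w[i][j] * p[j];
            int next = net < 0 ? -1 : net > 0 ? 1 : p[i];
            if (next != p[i]) return false;  // neuron would flip
        }
        return true;
    }
}
```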

 

        If you have any instability in your choice of patterns for the 0, 1, 2, 3 digits, go ahead and change the patterns until you get them to be stable. To do this you have to study the patterns and look for the activations that make one pattern distinct from the others. For example, the distinction between 0 and 1 should be clear, since they share very few activation values. There is no fixed recipe for this, but you should change the patterns so that they are "more different". That is, you want to minimize the number of times the training process increments and then decrements a connection strength. One way around this is to increase the resolution of the grid (you might go to 50x50, but keep in mind that the number of connection strengths has then increased dramatically).
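One way to quantify how "different" two patterns are is their overlap (dot product), sketched below. For +1/-1 patterns the overlap counts agreements minus disagreements, so patterns with overlap near zero cause the fewest cancelling increment/decrement steps during Hebbian training. The helper names are assumptions for illustration.

```java
// Measures of pattern similarity for +/-1 activation vectors.
public class PatternOverlap {

    // Dot product: agreements minus disagreements.
    static int overlap(int[] a, int[] b) {
        int s = 0;
        for (int i = 0; i < a.length; i++) s += a[i] * b[i];
        return s;
    }

    // Hamming distance (number of differing bits),
    // derived from the overlap.
    static int hamming(int[] a, int[] b) {
        return (a.length - overlap(a, b)) / 2;
    }
}
```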

 

Answer

 

Without any noise on the network, I did not see any instability in the results for any patterns.

 

        Test the network with noisy versions of the patterns. If the patterns themselves are stable (result of step 2), then we know that the network will remain in that pattern if initialized that way. But what if we add a small amount of noise? For example, if the initial state of the network is exactly the first pattern, but one bit is flipped, will the network updates restore the pattern? If they do, then we say that the network "recalled" the correct pattern, or memory, based on a noisy input. We might also say that the network "completed the pattern" correctly. So, let's test things out by starting with noise levels of 5%, 10%, and 20% and see which patterns are recalled correctly. Keep statistics on how the network performed.
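The noise step described above can be sketched as follows: flip each bit of a +1/-1 pattern independently with the given probability (0.05, 0.1, or 0.2). This is an illustrative sketch; the seeded Random and the method name are assumptions, not the original code.

```java
import java.util.Random;

// Adds noise to a +/-1 pattern by flipping each bit
// independently with probability `level`.
public class NoiseTest {
    static int[] addNoise(int[] pattern, double level, Random rng) {
        int[] noisy = pattern.clone();
        for (int i = 0; i < noisy.length; i++)
            if (rng.nextDouble() < level)
                noisy[i] = -noisy[i];  // flip this bit
        return noisy;
    }
}
```

The noisy pattern is then used as the network's initial state, and the update loop runs until the state stops changing (converged) or an iteration limit is reached.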


Answer

 

Table of Network Statistics

Pattern   Noise Level   Iterations   Energy     Converged   Comments
0         0             1            -9496.0    Yes         No noise
0         0.05          10000        -11496     No          Unstable, no convergence
0         0.1           2            -9496.0    Yes         Stable, completed pattern correctly
0         0.2           2            -9496.0    Yes         Stable, completed pattern correctly
1         0             1            -7384      Yes         No noise
1         0.05          2            -7384      Yes         Stable, completed pattern correctly
1         0.1           2            -7384      Yes         Stable, completed pattern correctly
1         0.2           2            -7384      Yes         Stable, completed pattern correctly
2         0             1            -10424     Yes         No noise
2         0.05          10000        -11496     No          Unstable, no convergence
2         0.1           2            -10424     Yes         Stable, completed pattern correctly
2         0.2           10000        -11496     No          Unstable, no convergence
3         0             1            -11016     Yes         No noise
3         0.05          2            -11016     Yes         Stable, completed pattern correctly
3         0.1           2            -11016     Yes         Stable, completed pattern correctly
3         0.2           2            -11016     Yes         Stable, completed pattern correctly

 

[Grid screenshots not reproduced: final 12x10 network states for patterns 0, 1, 2, and 3 at noise levels 0, 0.05, 0.1, and 0.2.]

References: Previous students' work, particularly that of Susan Zabaronick.

 

Wolfe, W. Neural Networks Course Materials