
Associative Memory by Recurrent Neural Networks with Delay Elements

Seiji MIYOSHI, Kobe City College of Technology, Japan
Hiro-Fumi YANAI, Ibaraki University, Japan
Masato OKADA, RIKEN BSI / ERATO KDB, Japan

miyoshi@kobe-kosen.ac.jp

www.kobe-kosen.ac.jp/~miyoshi/

Background
• Synapses of real neural systems seem to have delays.
• It is therefore very important to analyze associative memory models with delayed synapses.
• Computer simulation is a powerful method. However, there is a limit on the number of neurons.
• A theoretical and analytical approach is indispensable for research on delayed networks.
• Yanai-Kim theory, based on statistical neurodynamics:
  → Good agreement with computer simulation
  → Computational complexity is O(L^4 t)
  → Treating a network with a large number of delay steps is therefore realistically impossible.

Objective

• To derive macroscopic steady state equations by using discrete Fourier transformation

• To discuss the storage capacity quantitatively, even in the large L limit (L: length of delay)

Recurrent Neural Network with Delay Elements

[Figure: network structure. Each neuron j (1, ..., N) is connected to neuron i directly with weight J_ij^0 and through a chain of L-1 serial delay elements with weights J_ij^1, ..., J_ij^{L-1}; the reverse connections J_ji^l are analogous.]

Model

• Overlap

Model

• Discrete Synchronous Updating Rule

• Correlation Learning for Sequence Processing (a sketch of these model components follows below)
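The slides name the model components but do not reproduce the equations, so the following is a minimal NumPy sketch of one standard formulation consistent with these bullets: correlation learning for sequence processing over the delayed connections, a discrete synchronous sign update, and the overlap as the order parameter. The network size, loading rate, weight normalization, and cyclic storage of the sequence are illustrative assumptions, not values taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500        # number of neurons (illustrative choice, not from the slides)
L = 3          # length of delay
alpha = 0.5    # loading rate, assumed here to mean alpha = P / N
P = int(alpha * N)

# A cyclic sequence of P random +/-1 patterns xi^0 -> xi^1 -> ... -> xi^{P-1} -> xi^0.
xi = rng.choice([-1, 1], size=(P, N)).astype(float)

# Correlation learning for sequence processing (assumed 1/N normalization):
# the l-th delayed connection associates the pattern presented l steps ago
# with the next pattern in the sequence,
#   J^l_{ij} = (1/N) * sum_mu xi^{mu+1}_i * xi^{mu-l}_j
J = np.empty((L, N, N))
for l in range(L):
    J[l] = np.roll(xi, -1, axis=0).T @ np.roll(xi, l, axis=0) / N

def overlap(x, mu):
    """Overlap (order parameter): m^mu = (1/N) * sum_i xi^mu_i * x_i."""
    return float(xi[mu] @ x) / N

def step(history):
    """Discrete synchronous updating rule:
    x_i(t+1) = sgn( sum_{l=0}^{L-1} sum_j J^l_ij * x_j(t-l) ),
    where history[l] holds x(t-l)."""
    h = sum(J[l] @ history[l] for l in range(L))
    return np.where(h >= 0.0, 1.0, -1.0)

# Optimum initial condition: the neurons and all delay elements hold the stored
# sequence itself, history[l] = xi^{L-1-l}.
history = [xi[(L - 1 - l) % P].copy() for l in range(L)]

for t in range(20):
    x = step(history)
    history = [x] + history[:-1]
    # the pattern that should be recalled at this step is xi^{L+t}
    print(t, round(overlap(x, (L + t) % P), 3))
```

Run under these assumptions, the printed overlap should stay close to 1, qualitatively matching the successful recall shown in the later slides for α = 0.5 and L = 3.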

Macrodynamical Equations by Statistical Neurodynamics (Yanai & Kim, 1995; Miyoshi, Yanai & Okada, 2002)

Initial Condition of the Network

One Step Set Initial Condition
• Only the states of the neurons are set explicitly.
• The states of the delay elements are set to zero.

All Steps Set Initial Condition
• The states of all neurons and all delay elements are set close to the stored pattern sequences.
• If they are set to the stored pattern sequences themselves, this is the Optimum Initial Condition (both schemes are sketched in code below).

[Figure: One Step Set Initial Condition. Network diagram as above; only the neuron states are set explicitly, and all delay elements are set to zero.]

[Figure: All Steps Set Initial Condition. Network diagram as above; the states of all neurons and all delay elements are set.]
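To make the two initialization schemes concrete, here is a small companion sketch to the model code above (it reuses `xi` and `L` from there); the `noise` parameter and the helper names are illustrative assumptions, not from the slides.

```python
import numpy as np

def flip(pattern, noise, rng):
    """Return a copy of a +/-1 pattern with each bit flipped with probability `noise`."""
    mask = rng.random(pattern.shape) < noise
    return np.where(mask, -pattern, pattern)

def one_step_set(xi, L, noise=0.0, rng=None):
    """One Step Set Initial Condition: only the neuron states are set explicitly
    (here to a possibly noisy copy of xi^{L-1}); all delay elements start at zero."""
    rng = rng or np.random.default_rng()
    x0 = flip(xi[L - 1], noise, rng)
    return [x0] + [np.zeros_like(x0) for _ in range(L - 1)]   # history[l] = x(t-l)

def all_steps_set(xi, L, noise=0.0, rng=None):
    """All Steps Set Initial Condition: the neurons and every delay element are set
    close to the stored sequence; with noise = 0 this is the Optimum Initial Condition."""
    rng = rng or np.random.default_rng()
    return [flip(xi[L - 1 - l], noise, rng) for l in range(L)]
```

Replacing the `history = ...` line in the model sketch with `history = one_step_set(xi, L)` or `history = all_steps_set(xi, L)` switches between the two conditions.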

Dynamical Behaviors of Recall Process

[Figure: overlap m (0 to 1) vs. time step t (0 to 30); All Steps Set Initial Condition, loading rate α = 0.5, length of delay L = 3; simulation (N = 2000) and theory.]

[Figure: overlap m (0 to 1) vs. time step t (0 to 30); All Steps Set Initial Condition, loading rate α = 0.5, length of delay L = 2; simulation (N = 2000) and theory.]

Loading Rate α - Steady State Overlap m

[Figure: steady state overlap m (0 to 1) vs. loading rate α (0 to 2); simulation (N = 500) and theory; curves for L = 1, 3, 10 under the One Step Set and Optimum initial conditions.]

Length of Delay L - Critical Loading Rate αC

[Figure: critical loading rate αC (0.2 to 2.2) vs. length of delay L (1 to 10), for the Optimum and One Step Set initial conditions.]

Macrodynamical Equations by Statistical Neurodynamics (Yanai & Kim, 1995; Miyoshi, Yanai & Okada, 2002)

• Good Agreement with Computer Simulation

• Computational complexity is O(L^4 t)

Macroscopic Steady State Equations
• Taking the steady state into account
• Parallel symmetry in terms of time steps
• Discrete Fourier transformation (illustrated by the sketch below)
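The slides do not reproduce the steady state equations themselves, so the following is only an illustration of the generic mechanism the bullets name, not the paper's actual equations: once the macroscopic quantities are assumed to be in a steady state and to depend only on differences of time steps (the "parallel symmetry"), the temporal convolutions in the macrodynamical equations are diagonalized by the discrete Fourier transformation, and each frequency can be treated independently, which is why the resulting equations can be evaluated at a cost that does not formally depend on L.

```python
import numpy as np

# Assumption-level sketch of the DFT step only (not the paper's equations):
# for quantities that depend only on the difference of time steps, a circular
# convolution over time,
#     c(t) = sum_{t'} a(t - t') * b(t'),
# becomes a per-frequency product after the discrete Fourier transformation,
#     c~(k) = a~(k) * b~(k).
rng = np.random.default_rng(1)
T = 64                                  # illustrative period
a = rng.normal(size=T)                  # stand-ins for time-difference-dependent kernels
b = rng.normal(size=T)

# direct evaluation of the circular convolution: O(T^2) operations
c_direct = np.array([sum(a[(t - s) % T] * b[s] for s in range(T)) for t in range(T)])

# evaluation via the DFT: the frequencies decouple
c_fourier = np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real

print(np.allclose(c_direct, c_fourier))   # True
```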

Loading Rate α - Steady State Overlap m

[Figure: steady state overlap m (0 to 1) vs. loading rate α (log scale, 0.1 to 10000); curves added slide by slide for L = 1, 2, 3, 10, 100, 1000, 10000, and 100000.]

Loading Rate α - Steady State Overlap m

[Figure: repeated from above: simulation (N = 500) and theory, steady state overlap m vs. loading rate α (0 to 2), L = 1, 3, 10, One Step Set and Optimum initial conditions.]

Loading Rate α - Steady State Overlap

[Figure: repeated from above: steady state overlap m vs. loading rate α (log scale, 0.1 to 10000), L = 1 to 100000.]

Storage Capacity of Delayed Network

[Figure: storage capacity αC vs. number of delays L on a log-log plot (L = 1 to 100000).]

Storage Capacity = 0.195 L
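In other words, if the loading rate is taken to be α = P/N (P: number of stored patterns, N: number of neurons), which is the usual convention for this family of models but is an assumption here since the slides do not restate the definition, the result means the number of patterns recallable as a sequence grows linearly with the length of delay, roughly P_C ≈ 0.195 L N.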

Conclusions
• The Yanai-Kim theory (macrodynamical equations for the delayed network) is re-derived.
  → Its computational complexity is O(L^4 t).
  → It is therefore intractable for discussing macroscopic properties in the large L limit.
• Steady state equations are derived by using the discrete Fourier transformation.
  → Their computational complexity does not formally depend on L.
  → The phase transition points agree with those under the optimum initial conditions, that is, with the storage capacities.
• The storage capacity is 0.195 L in the large L limit.
