
Unsupervised Learning Networks

Lecturer: 虞台文

Content

Introduction

Important Unsupervised Learning NNs

– Hamming Networks
– Kohonen’s Self-Organizing Feature Maps
– Grossberg’s ART Networks
– Counterpropagation Networks
– Adaptive BAM
– Neocognitron

Conclusion

Unsupervised Learning Networks

Introduction

What is Unsupervised Learning?

Learning without a teacher.

No feedback to indicate the desired outputs.

The network must by itself discover the

relationship of interest from the input data.

– E.g., patterns, features, regularities, correlations, or

categories.

Translate the discovered relationship into

output.

A Strange World

Supervised Learning

[Figure: Height vs. IQ scatter plot with samples labeled as classes A, B, and C; try classification.]

The Probabilities of Populations

[Figure: the same Height vs. IQ data with the probability densities of the three populations A, B, and C.]

The Centroids of Clusters

[Figure: the same Height vs. IQ data with the centroid of each class marked; try classification.]

Unsupervised Learning

[Figure: the same Height vs. IQ scatter plot, but with no class labels on the samples.]

Clustering Analysis

[Figure: Height vs. IQ scatter plot of unlabeled samples.]

Categorize the input patterns into several classes based on the similarity among patterns.

How many classes may we have? The same data can plausibly be grouped into 2, 3, or 4 clusters.
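The grouping step described above can be sketched as a simple k-means-style procedure. This is a minimal illustration only (the slides do not name a specific algorithm, and the function names here are made up):

```python
import random

def kmeans(points, k, iters=100):
    """Naive k-means: repeatedly assign each point to its nearest
    centroid, then move each centroid to the mean of its cluster."""
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        # Move each centroid to the mean of its cluster (keep it if empty).
        centroids = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl
                     else centroids[c]
                     for c, cl in enumerate(clusters)]
    return centroids, clusters
```

The number of clusters k is an input here, which mirrors the question on the slide: the data itself does not say whether 2, 3, or 4 clusters is "right".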

Unsupervised Learning Networks

The Hamming Networks

The Nearest Neighbor Classifier

Suppose that we have p prototypes centered at x(1), x(2), …, x(p). Given a pattern x, it is assigned to the class label of the i-th prototype if

$i = \arg\min_k \, dist(\mathbf{x}, \mathbf{x}^{(k)})$

Examples of distance measures include the Hamming distance and the Euclidean distance.
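The rule above can be sketched in a few lines (the helper names are illustrative, not from the slides):

```python
def nearest_prototype(x, prototypes, dist):
    """Return i = argmin_k dist(x, prototypes[k])."""
    return min(range(len(prototypes)), key=lambda k: dist(x, prototypes[k]))

def euclidean2(a, b):
    """Squared Euclidean distance (same argmin as Euclidean)."""
    return sum((u - v) ** 2 for u, v in zip(a, b))

def hamming(a, b):
    """Hamming distance: number of positions where a and b differ."""
    return sum(u != v for u, v in zip(a, b))
```

Any distance function with this signature can be plugged in, which is why the slide lists both Hamming and Euclidean distance as options.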

The Nearest Neighbor Classifier

[Figure: four stored prototypes x(1), x(2), x(3), x(4), each defining one of the class regions 1–4; a query pattern is assigned the class of its nearest prototype.]

The Hamming Networks

Stores a set of classes represented by a set of binary prototypes.

Given an incomplete binary input, finds the class to which it belongs.

Uses the Hamming distance as the distance measure.

Distance vs. similarity.

The Hamming Net

[Figure: two-layer architecture; the inputs x1, x2, …, xm feed a similarity-measurement layer, whose outputs feed a MAXNET winner-take-all layer.]

The Hamming Distance

y = 1 1 1 1 1 1 1

x = 1 1 1 1 1 1 1

Hamming Distance = ?Hamming Distance = ?

y = 1 1 1 1 1 1 1

x = 1 1 1 1 1 1 1

The Hamming Distance

Hamming Distance = 3Hamming Distance = 3

y = 1 1 1 1 1 1 1

The Hamming Distance

1 1 1 1 1 1 1

Sum=1

12( , ) (7 1) 3HD x y

x = 1 1 1 1 1 1 1

The Hamming Distance

Let $\mathbf{x} = (x_1, x_2, \ldots, x_m)^T$, $x_i \in \{1, -1\}$, and $\mathbf{y} = (y_1, y_2, \ldots, y_m)^T$, $y_i \in \{1, -1\}$.

$HD(\mathbf{x}, \mathbf{y}) = ?$  $Similarity(\mathbf{x}, \mathbf{y}) = ?$

$HD(\mathbf{x}, \mathbf{y}) = \tfrac{1}{2}(m - \mathbf{x}^T\mathbf{y})$

$Similarity(\mathbf{x}, \mathbf{y}) = m - HD(\mathbf{x}, \mathbf{y}) = \tfrac{1}{2}\mathbf{x}^T\mathbf{y} + \tfrac{1}{2}m$
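The two identities above are easy to check numerically for bipolar vectors (a small sketch; the function names are made up):

```python
def hd(x, y):
    """HD(x, y) = (m - x.y) / 2 for bipolar (+1/-1) vectors."""
    dot = sum(a * b for a, b in zip(x, y))
    return (len(x) - dot) // 2

def similarity(x, y):
    """Similarity(x, y) = m - HD(x, y) = x.y/2 + m/2."""
    return len(x) - hd(x, y)
```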

The Hamming Net

[Figure: inputs x1, x2, …, x(m−1), xm feed a similarity-measurement layer of n units with weight matrix WS; its outputs y1, y2, …, y(n−1), yn feed a MAXNET winner-take-all layer with weight matrix WM. WS = ? WM = ?]

The Stored Patterns

[Figure: the same architecture, with the k-th similarity unit outputting Similarity(x, s^k).]

Stored patterns: $\mathbf{s}^k = (s_1^k, s_2^k, \ldots, s_m^k)^T$ with $s_i^k \in \{1, -1\}$, $k = 1, 2, \ldots, n$.

$Similarity(\mathbf{x}, \mathbf{s}^k) = \tfrac{1}{2}\mathbf{x}^T\mathbf{s}^k + \tfrac{1}{2}m = \tfrac{1}{2}\sum_{i=1}^m x_i s_i^k + \tfrac{1}{2}m$

The Stored Patterns

[Figure: similarity unit k receives the inputs x1, x2, …, xm through the weights ½s1^k, ½s2^k, …, ½sm^k plus a bias of m/2, so its output is Similarity(x, s^k).]

$Similarity(\mathbf{x}, \mathbf{s}^k) = \tfrac{1}{2}\sum_{i=1}^m x_i s_i^k + \tfrac{1}{2}m$

Weights for Stored Patterns

[Figure: the similarity-measurement layer with inputs x1, x2, …, xm and weight matrix WS.]

Stored patterns: $\mathbf{s}^k = (s_1^k, s_2^k, \ldots, s_m^k)^T$ with $s_i^k \in \{1, -1\}$, $k = 1, 2, \ldots, n$.

$\mathbf{W}_S = \frac{1}{2}\begin{pmatrix} s_1^1 & s_2^1 & \cdots & s_m^1 \\ s_1^2 & s_2^2 & \cdots & s_m^2 \\ \vdots & \vdots & & \vdots \\ s_1^n & s_2^n & \cdots & s_m^n \end{pmatrix}$

$\mathbf{W}_S \mathbf{x} = \frac{1}{2}\begin{pmatrix} \mathbf{s}^{1T}\mathbf{x} \\ \mathbf{s}^{2T}\mathbf{x} \\ \vdots \\ \mathbf{s}^{nT}\mathbf{x} \end{pmatrix}$

Weights for Stored Patterns

[Figure: each of the n similarity units also carries a bias of m/2.]

With the biases $\theta_i = m/2$ for all i, unit k outputs $\tfrac{1}{2}\mathbf{s}^{kT}\mathbf{x} + \tfrac{m}{2} = Similarity(\mathbf{x}, \mathbf{s}^k)$.
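The whole similarity layer is then just the product $\mathbf{W}_S\mathbf{x}$ plus the $m/2$ biases. A minimal sketch (function name is illustrative):

```python
def similarity_layer(x, stored):
    """Unit k outputs 0.5 * dot(s^k, x) + m/2, which equals
    m - HD(x, s^k), the similarity to the k-th stored pattern."""
    m = len(x)
    return [0.5 * sum(si * xi for si, xi in zip(s, x)) + m / 2
            for s in stored]
```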

The MAXNET

[Figure: the MAXNET winner-take-all layer on top of the similarity-measurement layer; its units y1, y2, …, y(n−1), yn are fully interconnected.]

Weights of MAXNET

[Figure: each MAXNET unit has self-weight 1 and lateral weight −ε to every other unit.]

$\mathbf{W}_M = \begin{pmatrix} 1 & -\varepsilon & \cdots & -\varepsilon \\ -\varepsilon & 1 & \cdots & -\varepsilon \\ \vdots & \vdots & \ddots & \vdots \\ -\varepsilon & -\varepsilon & \cdots & 1 \end{pmatrix}, \quad 0 < \varepsilon < \frac{1}{n}$

Updating Rule

[Figure: the MAXNET is initialized with the similarity values s1, s2, s3, …, sn from the lower layer; 0 < ε < 1/n.]

$y_i^0 = s_i \ge 0$

$\mathbf{y}^{t+1} = a(\mathbf{W}_M \mathbf{y}^t), \quad \mathbf{y}^t = (y_1^t, y_2^t, \ldots, y_n^t)^T$

$y_i^{t+1} = a\!\left(\sum_{j=1}^n w_{ij} y_j^t\right) = a\!\left(y_i^t - \varepsilon \sum_{j \ne i} y_j^t\right) = a\!\left((1+\varepsilon)\,y_i^t - \varepsilon \sum_{j=1}^n y_j^t\right)$

Analysis of the Updating Rule

$y_i^{t+1} = a\!\left(y_i^t - \varepsilon \sum_{j \ne i} y_j^t\right)$

Let $a(net) = \begin{cases} net & net \ge 0 \\ 0 & net < 0 \end{cases}$

If now $y_i^t = \begin{cases} v > 0 & i = k \\ 0 & i \ne k \end{cases}$, then $y_i^{t+1} = \begin{cases} v & i = k \\ 0 & i \ne k \end{cases}$; a state with a single active unit is a fixed point.

If now $y_k^t = \max_i y_i^t$, then $y_k^{t+1} = \max_i y_i^{t+1}$: the maximum unit stays the maximum, while every $y_i^{t+1}$ with $y_i^t < y_k^t$ shrinks toward 0.
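The competition can be sketched directly from the componentwise update (a minimal illustration; it assumes 0 < ε < 1/n, and the function name is made up):

```python
def maxnet(y, eps, max_iters=1000):
    """Iterate y_i <- a(y_i - eps * sum_{j != i} y_j), where
    a(net) = max(0, net), until at most one unit stays positive."""
    y = list(y)
    for _ in range(max_iters):
        total = sum(y)
        new = [max(0.0, yi - eps * (total - yi)) for yi in y]
        if new == y or sum(v > 0 for v in new) <= 1:
            return new
        y = new
    return y
```

Because the losers are clipped to 0 and never recover, the unit with the largest initial similarity is the only one left active.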

Example

[Figure: bar charts of ten MAXNET activations (values between 0 and 1) before and after the competition; after iterating, only the unit with the largest initial activation remains active.]

Unsupervised Learning Networks

The Self-Organizing Feature Map

Feature Mapping

Map high-dimensional input signals onto a lower-dimensional (usually 1-D or 2-D) structure.

Similarity relations present in the original data are still present after the mapping.

Dimensionality Reduction

Topology-Preserving Map

Somatotopic Map Illustration: The “Homunculus”

The relationship between body surfaces and the regions of the brain that control them.

Another Depiction of the Homunculus

Phonotopic Maps

[Figure: a phonotopic map, with the trajectory traced for the Finnish word “humppila”.]

Self-Organizing Feature Map

Developed by Professor Kohonen. One of the most popular neural network models. Unsupervised learning; a competitive learning network.

The Structure of SOM

Example

Local Excitation, Distal Inhibition

Topological Neighborhood

[Figure: square and hexagonal neighborhood lattices.]

Size Shrinkage

[Figure: the neighborhood of the winning unit i* shrinks over time: $N_{i^*}(t_1) \supset N_{i^*}(t_2) \supset N_{i^*}(t_3)$ for $t_1 < t_2 < t_3$.]

Learning Rule

Similarity Matching: find the winner $i^*$ such that

$\|\mathbf{x} - \hat{\mathbf{w}}_{i^*}\| = \min_j \|\mathbf{x} - \hat{\mathbf{w}}_j\|$

Updating:

$\mathbf{w}_j^{(t+1)} = \begin{cases} \mathbf{w}_j^{(t)} + \alpha(t)\,[\mathbf{x} - \mathbf{w}_j^{(t)}] & j \in N_{i^*}(t) \\ \mathbf{w}_j^{(t)} & \text{otherwise} \end{cases}$
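One training step of the rule above can be sketched for a 1-D map as follows (a minimal illustration; the function name and the interval-style neighborhood are assumptions, not from the slides):

```python
def som_step(weights, x, alpha, radius):
    """Find the winner i* = argmin_j ||x - w_j||, then move every
    unit within `radius` of i* on a 1-D map a fraction alpha toward x."""
    def d2(w):
        return sum((xi - wi) ** 2 for xi, wi in zip(x, w))
    i_star = min(range(len(weights)), key=lambda j: d2(weights[j]))
    for j in range(max(0, i_star - radius),
                   min(len(weights), i_star + radius + 1)):
        weights[j] = [wj + alpha * (xi - wj)
                      for wj, xi in zip(weights[j], x)]
    return i_star
```

In a full training loop, both alpha(t) and the neighborhood radius would shrink over time, as the size-shrinkage slides show.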

Example

[Figures: SOM training examples.]
