MULTILAYER PERCEPTRON


Nurochman, Teknik Informatika UIN Sunan Kalijaga Yogyakarta

Review of the Single-Layer Perceptron (SLP)

[Diagram: single-layer perceptron. Inputs X1, X2, X3 are multiplied by weights w1, w2, wi, summed (Σ xi·wi), and passed through an activation function f(y) to produce the output.]

Activation Functions

- Binary step function (hard limit)
- Binary step function with threshold
- Bipolar function
- Bipolar function with threshold
- Linear (identity) function
- Binary sigmoid function
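The activation functions listed above can be sketched in a few lines. This is a minimal illustration; the function names follow the slide's list, and the threshold θ is left as a free parameter.

```python
import math

def binary_step(v, theta=0.0):
    """Binary step (hard limit): 1 if v >= theta, else 0."""
    return 1 if v >= theta else 0

def bipolar_step(v, theta=0.0):
    """Bipolar step: +1 if v >= theta, else -1."""
    return 1 if v >= theta else -1

def linear(v):
    """Linear (identity) function: output equals input."""
    return v

def binary_sigmoid(v):
    """Binary sigmoid: squashes v smoothly into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-v))
```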

Learning Algorithm

1. Initialize the learning rate (α), the threshold (θ), and the weights and bias.
2. Compute net = Σ xi·wi + b.
3. Compute the output y = f(net).
4. If y ≠ target, update the weights and bias:
   Wi_new = Wi_old + α·t·Xi
   b_new = b_old + α·t
5. Repeat from step 2 until the weights no longer change.

Problem "OR" (weights w1 = 1, w2 = 1; Y = 1 if net ≥ 1, Y = 0 if net < 1):

X1  X2  net              Y
1   1   1·1 + 1·1 = 2    1
1   0   1·1 + 0·1 = 1    1
0   1   0·1 + 1·1 = 1    1
0   0   0·1 + 0·1 = 0    0

The perceptron SUCCEEDS in recognizing the pattern.

Problem "AND" (weights w1 = 1, w2 = 1; Y = 1 if net ≥ 2, Y = 0 if net < 2):

X1  X2  net              Y
1   1   1·1 + 1·1 = 2    1
1   0   1·1 + 0·1 = 1    0
0   1   0·1 + 1·1 = 1    0
0   0   0·1 + 0·1 = 0    0

The perceptron SUCCEEDS in recognizing the pattern.

Problem "X1 AND NOT(X2)" (weights w1 = 2, w2 = -1; Y = 1 if net ≥ 2, Y = 0 if net < 2):

X1  X2  net                  Y
1   1   1·2 + 1·(-1) = 1     0
1   0   1·2 + 0·(-1) = 2     1
0   1   0·2 + 1·(-1) = -1    0
0   0   0·2 + 0·(-1) = 0     0

The perceptron SUCCEEDS in recognizing the pattern.
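The three single-layer perceptrons above share the same structure and differ only in weights and threshold. A minimal sketch, using the weights and thresholds from the slides (the helper name `slp` is my own):

```python
def slp(x1, x2, w1, w2, theta):
    """Single-layer perceptron with hard-limit activation:
    Y = 1 if net >= theta else 0, where net = x1*w1 + x2*w2."""
    net = x1 * w1 + x2 * w2
    return 1 if net >= theta else 0

# Weights and thresholds taken from the slides:
def or_gate(x1, x2):      return slp(x1, x2, 1, 1, theta=1)
def and_gate(x1, x2):     return slp(x1, x2, 1, 1, theta=2)
def and_not_gate(x1, x2): return slp(x1, x2, 2, -1, theta=2)
```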

HOW ABOUT XOR?

Problem "XOR":

X1  X2  Y
1   1   0
1   0   1
0   1   1
0   0   0

FAILED! No single set of weights can satisfy F(1,1) = 0, F(1,0) = 1, F(0,1) = 1, F(0,0) = 0 at once.

Solution: XOR = (x1 AND NOT x2) OR (NOT x1 AND x2). A hidden layer is needed.

[Diagram: X1 and X2 feed hidden units Z1 (weights 2, -1, threshold 2) and Z2 (weights -1, 2, threshold 2); Z1 and Z2 feed the output Y (weights 1, 1, threshold 1).]
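The decomposition above composes the three gates already solved. A minimal sketch using the slide's weights and thresholds:

```python
def step(net, theta):
    """Hard-limit activation: 1 if net >= theta, else 0."""
    return 1 if net >= theta else 0

def xor(x1, x2):
    """XOR via one hidden layer, following the slide's decomposition
    XOR = (x1 AND NOT x2) OR (NOT x1 AND x2):
      Z1 = x1 AND NOT x2  (weights 2, -1, threshold 2)
      Z2 = NOT x1 AND x2  (weights -1, 2, threshold 2)
      Y  = Z1 OR Z2       (weights 1, 1, threshold 1)"""
    z1 = step(2 * x1 - 1 * x2, theta=2)
    z2 = step(-1 * x1 + 2 * x2, theta=2)
    return step(1 * z1 + 1 * z2, theta=1)
```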

Multi-Layer Perceptron

An MLP is a feedforward neural network with at least one hidden layer (Li Min Fu). A single-layer perceptron is limited: it cannot recognize nonlinear patterns, as the XOR problem shows.

Solution for the XOR Problem (bipolar encoding)

X1  X2  X1 XOR X2
-1  -1  -1
-1   1   1
 1  -1   1
 1   1  -1

[Diagram: inputs x1 and x2 connected to two hidden units with weights +1 and -1.]

Solution for the XOR Problem (network)

[Diagram: x1 and x2 connect to hidden units z1 (weights +1, -1) and z2 (weights -1, +1), each with bias -1; z1 and z2 connect to the output with weights +1, +1 and bias 0.1.]

The activation is the sign function:

φ(v) = +1 if v > 0, -1 if v ≤ 0

Input to Hidden layer

x1  x2  Net1                        f1  Net2                        f2
-1  -1  (-1·1 + -1·(-1)) + -1 = -1  -1  (-1·(-1) + -1·1) + -1 = -1  -1
-1   1  (-1·1 +  1·(-1)) + -1 = -3  -1  (-1·(-1) +  1·1) + -1 =  1   1
 1  -1  ( 1·1 + -1·(-1)) + -1 =  1   1  ( 1·(-1) + -1·1) + -1 = -3  -1
 1   1  ( 1·1 +  1·(-1)) + -1 = -1  -1  ( 1·(-1) +  1·1) + -1 = -1  -1

Hidden to Output layer (including the output bias 0.1)

Z1  Z2  Net                         Y
-1  -1  (-1·1 + -1·1) + 0.1 = -1.9  -1
-1   1  (-1·1 +  1·1) + 0.1 =  0.1   1
 1  -1  ( 1·1 + -1·1) + 0.1 =  0.1   1
-1  -1  (-1·1 + -1·1) + 0.1 = -1.9  -1
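The two tables above trace one forward pass through the bipolar XOR network. A minimal sketch using the slide's weights (hidden weights ±1 with bias -1, output weights +1, +1 with bias 0.1) and its sign activation:

```python
def sign(v):
    """phi(v) = +1 if v > 0, else -1 (the slide's sign function)."""
    return 1 if v > 0 else -1

def xor_bipolar(x1, x2):
    """Bipolar XOR network from the slides: hidden unit z1 has
    weights (+1, -1), z2 has weights (-1, +1), both with bias -1;
    the output unit has weights (+1, +1) and bias 0.1."""
    z1 = sign(x1 * 1 + x2 * (-1) - 1)   # input-to-hidden, unit 1
    z2 = sign(x1 * (-1) + x2 * 1 - 1)   # input-to-hidden, unit 2
    return sign(z1 * 1 + z2 * 1 + 0.1)  # hidden-to-output
```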

Learning Algorithm: Backpropagation

Backpropagation adjusts the weights of the neural network in order to minimize the average squared error. It has two phases:

- Forward pass: computes the 'function signal', the feedforward propagation of the input pattern through the network.
- Backward pass: computes the 'error signal' and propagates the error backwards through the network, starting at the output units (where the error is the difference between actual and desired output values).

Activation Function: Sigmoidal Function

φ(v_j) = 1 / (1 + exp(-a·v_j))

where a > 0 controls the slope (increasing a steepens the curve) and the induced local field is

v_j = Σ_{i=0..m} w_{ji} · y_i

[Plot: sigmoid curves over v_j from -10 to 10, rising from 0 to 1, steeper for larger a.]
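The two backpropagation phases can be sketched with this sigmoid on the XOR task. This is a minimal illustration, not the slide's exact algorithm: network size, learning rate, epoch count, and variable names are my own choices; the loss is the average squared error mentioned above.

```python
import math, random

random.seed(0)  # reproducible initial weights

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# XOR training set (binary encoding)
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

# One hidden layer of 2 units; each weight vector's last entry is a bias.
w_hidden = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(3)]

def forward(x):
    """Forward pass: propagate the function signal."""
    z = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hidden]
    y = sigmoid(w_out[0] * z[0] + w_out[1] * z[1] + w_out[2])
    return z, y

def avg_squared_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

lr = 0.5
loss_before = avg_squared_error()
for _ in range(5000):
    for x, t in data:
        z, y = forward(x)
        # Backward pass: error signal starts at the output unit,
        # where the error is (actual - desired).
        delta_out = (y - t) * y * (1 - y)
        for j in range(2):
            # propagate the error back through the output weight
            delta_h = delta_out * w_out[j] * z[j] * (1 - z[j])
            w_hidden[j][0] -= lr * delta_h * x[0]
            w_hidden[j][1] -= lr * delta_h * x[1]
            w_hidden[j][2] -= lr * delta_h
        w_out[0] -= lr * delta_out * z[0]
        w_out[1] -= lr * delta_out * z[1]
        w_out[2] -= lr * delta_out
loss_after = avg_squared_error()
```

After training, the average squared error has decreased, which is all backpropagation guarantees here; whether XOR is fully solved depends on the random initialization.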
