
Page 1: Applying back propagation to shape recognition

Applying back propagation to shape recognition

Presented by Amir Mahdi Azadi, Farzane Salami, Osame Ghavidel, Ahmad Lahuti

Page 2: Applying back propagation to shape recognition

Outline

• BP algorithm
• IDS (Intrusion Detection Systems)
• Shape recognition with MATLAB code
• Face recognition with MATLAB code

Page 3: Applying back propagation to shape recognition

Introduction

• Werbos (1974) described the error back-propagation (Back Propagation) training method in his doctoral dissertation; it was a multilayer perceptron network, but with more powerful training rules. The method was given this name because it propagates the error backward through the network.

• The BP algorithm was further developed in 1985 by Rumelhart, Hinton and Williams.

Page 4: Applying back propagation to shape recognition

Introduction…

• The algorithm uses the gradient descent method to tune the network parameters so that they fit the training examples. Neural network learning is robust to errors in the training data, and it has been applied to face recognition (Cottrell 1990), speech recognition (Lang 1990) and recognition of handwritten characters (LeCun 1989).

Page 5: Applying back propagation to shape recognition

Back Propagation Algorithm

• The BP algorithm finds the weights required for a multilayer network with a fixed network structure. The algorithm uses gradient descent to minimize the error.

• The BP algorithm is one of the supervised learning methods.

Page 6: Applying back propagation to shape recognition

Back Propagation Algorithm…

• Training a multilayer perceptron is the same as training a perceptron; the only difference is that now the output is a nonlinear function of the input, thanks to the nonlinear basis functions in the hidden units. Considering the hidden units as inputs, the second layer is a perceptron and we already know how to update its parameters, the second-layer weights. For the first-layer weights w_hj, we use the chain rule to calculate the gradients; for a single output y and hidden-unit values z_h:

$$\frac{\partial E}{\partial w_{hj}} = \frac{\partial E}{\partial y}\,\frac{\partial y}{\partial z_h}\,\frac{\partial z_h}{\partial w_{hj}}$$

Page 7: Applying back propagation to shape recognition

Back Propagation Algorithm…

• The learning problem faced by back propagation is to search a large hypothesis space defined by all possible weight values for all the units in the network. Gradient descent can be used to find a hypothesis that minimizes E.

• BACKPROPAGATION(training_examples, η, n_in, n_out, n_hidden)

• Each training example is a pair of the form ⟨x, t⟩, where x is the vector of network input values and t is the vector of target network output values.

Page 8: Applying back propagation to shape recognition

Back Propagation Algorithm…

• η is the learning rate (e.g., 0.05); n_in is the number of network inputs, n_hidden the number of units in the hidden layer, and n_out the number of output units.

• The input from unit i into unit j is denoted x_ji, and the weight from unit i to unit j is denoted w_ji.

Page 9: Applying back propagation to shape recognition

Error Back Propagation Learning

• The back propagation algorithm learns the weights for a multilayer network, given a network with a fixed set of units and interconnections. It employs gradient descent to minimize the squared error between the network output values and the target values for these outputs:

$$E(\vec{w}) = \frac{1}{2}\sum_{d \in D}\,\sum_{k \in outputs}\left(t_{kd} - o_{kd}\right)^2$$

• outputs is the set of output units in the network.

• t_kd and o_kd are the target and output values associated with the kth output unit and training example d.

Page 10: Applying back propagation to shape recognition

Hypothesis Space

• The two axes w0 and w1 represent possible values for the two weights of a simple linear unit. The third axis, E, indicates the error associated with a particular set of training examples for a given choice of weights.

• The error surface shown in the figure therefore summarizes the desirability of every weight vector in the hypothesis space. Given the definition of E, this error surface is always parabolic and has a single global minimum.

Page 11: Applying back propagation to shape recognition

Back Propagation Rules

Create a feed-forward network with n_in inputs, n_hidden hidden units, and n_out output units.

Initialize all network weights to small random numbers.

Until the termination condition is met, Do

o For each ⟨x, t⟩ in training_examples, Do

• Propagate the input forward through the network:

1. Input the instance x to the network and compute the output o_u of every unit u in the network.

Page 12: Applying back propagation to shape recognition

Update Weights

• Propagate the error backward through the network:

2. For each network output unit k, calculate its error term δ_k:

$$\delta_k \leftarrow o_k\,(1 - o_k)\,(t_k - o_k)$$

3. For each hidden unit h, calculate its error term δ_h:

$$\delta_h \leftarrow o_h\,(1 - o_h)\sum_{k \in outputs} w_{kh}\,\delta_k$$

4. Update each network weight w_ji:

$$w_{ji} \leftarrow w_{ji} + \Delta w_{ji}, \qquad \Delta w_{ji} = \eta\,\delta_j\,x_{ji}$$
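For concreteness, here is a minimal MATLAB sketch of one such stochastic update for a single-hidden-layer network of sigmoid units; all names (W1, W2, eta) and the random training example are illustrative, not part of the original slides.

sigmoid = @(z) 1 ./ (1 + exp(-z));

n_in = 4; n_hidden = 3; n_out = 2; eta = 0.05;
W1 = 0.1*randn(n_hidden, n_in + 1);    % hidden-layer weights (last column = bias)
W2 = 0.1*randn(n_out, n_hidden + 1);   % output-layer weights (last column = bias)

x = rand(n_in, 1); t = [1; 0];         % one illustrative training example (x, t)

% Propagate the input forward
h = sigmoid(W1 * [x; 1]);              % hidden-unit outputs
o = sigmoid(W2 * [h; 1]);              % network outputs

% Propagate the error backward (rules 2 and 3 above)
delta_k = o .* (1 - o) .* (t - o);                         % output error terms
delta_h = h .* (1 - h) .* (W2(:, 1:n_hidden)' * delta_k);  % hidden error terms

% Update the weights (rule 4): w_ji <- w_ji + eta * delta_j * x_ji
W2 = W2 + eta * delta_k * [h; 1]';
W1 = W1 + eta * delta_h * [x; 1]';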

Page 13: Applying back propagation to shape recognition

BP Neural Network

[Figure: architecture of the multilayer network. Layer 1 receives the inputs p_1, p_2, …, p_R through weights w^1_{i,j}; a typical layer m has S^m units connected to the S^{m-1} units of layer m-1 through weights w^m_{i,j}; the output layer M produces the outputs a^M_1, …, a^M_{S^M}.]

Page 14: Applying back propagation to shape recognition

Structuring the network

• In some applications, we may believe that the input has a local structure. For example, in vision we know that nearby pixels are correlated and there are local features like edges and corners; any object, for example a handwritten digit, may be defined as a combination of such primitives.

• When designing the MLP, hidden units are not connected to all input units, because not all inputs are correlated. Instead, we define hidden units that are connected to only a small, local subset of the inputs (Le Cun 1989).

Page 15: Applying back propagation to shape recognition

Structuring the network…

• A structured MLP. Each unit is connected to a local group of units below it and checks for a particular feature.

Page 16: Applying back propagation to shape recognition

Applications

• NETtalk: neural networks that learn to pronounce English text (Sejnowski and Rosenberg, 1987)
• Speech recognition (Cohen et al., 1993; Renals et al., 1992)
• Optical character recognition (Sackinger et al., 1992; LeCun et al., 1990)
• On-line handwritten character recognition (Guyon, 1990)
• Combining visual and acoustic speech signals for improved intelligibility (Sejnowski et al., 1990)
• System identification (Narendra and Parthasarathy, 1990)
• Steering of an autonomous vehicle (Pomerleau, 1992)

Page 17: Applying back propagation to shape recognition

Example: Speech Recognition

• Speech recognition: distinguishing the vowel sounds occurring between the two consonants h and d, in ten different cases.

• The inputs are two numeric parameters obtained from analysis of the sound signal.

• The network's prediction is the vowel whose output unit has the largest value.

Page 18: Applying back propagation to shape recognition

Steering of an autonomous vehicle

• A steering control system for driving at moderate speed on highways was designed by Pomerleau (1993). The input to this neural network is a 30 x 32 image taken from a forward-pointing camera mounted inside the vehicle. The output of the neural network is the direction in which the steering wheel should turn.

• The network is trained for about 5 minutes to imitate a human driver's steering. It has succeeded in driving the vehicle at speeds up to 70 miles per hour, and for a distance of 90 miles, on a highway.

Page 19: Applying back propagation to shape recognition

Steering of an autonomous vehicle

Page 20: Applying back propagation to shape recognition

Handwritten Digit recognition

• Recognizing digits is one of the applications of BP networks.
• Minimal preprocessing of the data is required.
• The network input is normalized images of segmented digits.
• The method's error rate is 1%, with a rejection rate of 9%.

Page 21: Applying back propagation to shape recognition

ZIP code recognition

• The input consists of black-and-white pixels, and the digits are easily separated from the background.

• There are 10 output units, one for each digit.

• The digits come in a variety of sizes and styles.

• First-layer neurons approximate the brightness of the pixels, and last-layer neurons determine the shapes of the digits.

Page 22: Applying back propagation to shape recognition

ZIP code recognition

• First, 9298 digits, segmented under various conditions, were collected; these digits were written by many different people.

• This set was supplemented with 3349 printed digits in 35 different fonts.

• One feature of this data set is that both the training data and the test data contain examples that are ambiguous and unclassifiable.

Page 23: Applying back propagation to shape recognition

Preprocessing

• The size of a digit varies, but it is typically around 40 x 60 pixels. Since the input of the BP network has a fixed size, the characters must be size-normalized. This is done with a linear transformation that places the characters in a 16 x 16 image.

• Because of the linear transformation, the output image is not binary and has several grey levels. These grey levels are scaled and shifted to the range -1 to +1.
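A possible MATLAB sketch of this preprocessing step, assuming the Image Processing Toolbox function imresize; the file name digit.bmp is hypothetical:

digit = double(imread('digit.bmp'));           % segmented digit, roughly 40 x 60
small = imresize(digit, [16 16], 'bilinear');  % linear size normalization; the
                                               % result has grey levels, not 0/1
mn = min(small(:)); mx = max(small(:));
normalized = 2*(small - mn)/(mx - mn) - 1;     % scale grey levels to [-1, +1]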

Page 24: Applying back propagation to shape recognition

The Network

• The input is therefore a normalized 16 x 16 pixel image, and the output consists of 10 units.

• When a pattern belongs to class i, the desired output is +1 for output unit i and -1 for the other units.

Page 25: Applying back propagation to shape recognition

The Network

• Input image (left), weight vector (center), and resulting feature map (right). The feature map is obtained by scanning the input image with a single neuron that has a local receptive field. White represents -1, black represents +1.
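A toy MATLAB sketch of computing such a feature map by scanning the input image with one neuron that has a local receptive field; the 5 x 5 weights and the input used here are random, purely for illustration:

img = 2*double(rand(16,16) > 0.5) - 1;  % toy input image with values in {-1, +1}
w = randn(5,5); b = 0;                  % the neuron's shared weights and bias
[H, W] = size(img); k = 5;              % receptive-field size
fmap = zeros(H-k+1, W-k+1);
for r = 1:H-k+1
    for c = 1:W-k+1
        patch = img(r:r+k-1, c:c+k-1);               % local receptive field
        fmap(r,c) = tanh(sum(sum(w .* patch)) + b);  % neuron output at (r,c)
    end
end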


Page 27: Applying back propagation to shape recognition

Preprocessing

• Notice that most of the errors are cases that people find quite easy. The human error rate is probably 20 to 30 errors.

Page 28: Applying back propagation to shape recognition

Learning Weights

The image is presented to the network, and the weights of the active pixels are gradually increased, while the weights of the ineffective pixels gradually decrease.

Page 29: Applying back propagation to shape recognition

Learning Weights (cont.)

[Pages 30-34 repeat this slide with successive figures showing the weights evolving over training; they contain no further text.]

Page 35: Applying back propagation to shape recognition

Function Approximation

• Example: a 1-2-1 network with transfer functions $f^1(n) = \dfrac{1}{1+e^{-n}}$ (log-sigmoid) in the hidden layer and $f^2(n) = n$ (linear) in the output layer.

• Nominal parameter values: $w^1_{1,1} = 10,\; w^1_{2,1} = 10,\; b^1_1 = -10,\; b^1_2 = 10$ and $w^2_{1,1} = 1,\; w^2_{1,2} = 1,\; b^2 = 0$.

[Figure: the resulting network response a plotted against the input p over the interval -2 ≤ p ≤ 2.]

3

Page 36: Applying back propagation to shape recognition

Effect of Parameter Changes

[Figure: network response over -2 ≤ p ≤ 2 as the weight $w^2_{1,1}$ is varied over the values -1.0, -0.5, 0.0, 0.5, 1.0.]

Page 37: Applying back propagation to shape recognition

Effect of Parameter Changes…

[Figure: network response over -2 ≤ p ≤ 2 as the bias $b^1_2$ is varied over the values 0, 5, 10, 15, 20.]

Page 38: Applying back propagation to shape recognition

Effect of Parameter Changes…

[Figure: network response over -2 ≤ p ≤ 2 as a second weight, labeled $w_{2,1}$ in the original figure, is varied over the values -1.0, -0.5, 0.0, 0.5, 1.0.]

Page 39: Applying back propagation to shape recognition

Function Approximation

• Two-layer networks, with sigmoid transfer functions in the hidden layer and linear transfer functions in the output layer, can approximate virtually any function of interest to any degree of accuracy, provided sufficiently many hidden units are available.

Page 40: Applying back propagation to shape recognition

Ex: Function Approximation…

• Target function to approximate:

$$g(p) = 1 + \sin\!\left(\frac{\pi}{4}\,p\right)$$

• A 1-2-1 network is trained to approximate it: the input p is fed to the network, the network output a is compared with the target t = g(p), and the error e = t - a drives the training.

[Figure: block diagram of the 1-2-1 network with input p, target t, output a, and error e.]
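A small MATLAB sketch of this experiment, reusing the same toolbox calls (newff, train, sim) that appear in the shape-recognition code later in this presentation; the training settings here are illustrative:

p = -2:0.1:2;                        % inputs
t = 1 + sin(pi/4 * p);               % targets g(p)
net = newff(p, t, 2, {}, 'traingd'); % 2 hidden units; default tansig/purelin
net.trainParam.epochs = 2000;
net.trainParam.goal = 1e-4;
net = train(net, p, t);
a = sim(net, p);                     % network response
plot(p, t, p, a);                    % compare target and approximation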

Page 41: Applying back propagation to shape recognition

Steepest Descent Algorithm

• The steepest descent algorithm for the approximate mean square error:

$$w^m_{i,j}(k+1) = w^m_{i,j}(k) - \alpha\,\frac{\partial \hat{F}}{\partial w^m_{i,j}}, \qquad b^m_i(k+1) = b^m_i(k) - \alpha\,\frac{\partial \hat{F}}{\partial b^m_i},$$

where, defining the sensitivity $s^m_i \equiv \partial \hat{F} / \partial n^m_i$, the gradients are $\dfrac{\partial \hat{F}}{\partial w^m_{i,j}} = s^m_i\, a^{m-1}_j$ and $\dfrac{\partial \hat{F}}{\partial b^m_i} = s^m_i$.

• Matrix form:

$$\mathbf{W}^m(k+1) = \mathbf{W}^m(k) - \alpha\, \mathbf{s}^m \left(\mathbf{a}^{m-1}\right)^T, \qquad \mathbf{b}^m(k+1) = \mathbf{b}^m(k) - \alpha\, \mathbf{s}^m,$$

$$\mathbf{s}^m \equiv \frac{\partial \hat{F}}{\partial \mathbf{n}^m} = \begin{bmatrix} \dfrac{\partial \hat{F}}{\partial n^m_1} \\ \dfrac{\partial \hat{F}}{\partial n^m_2} \\ \vdots \\ \dfrac{\partial \hat{F}}{\partial n^m_{S^m}} \end{bmatrix}.$$

Page 42: Applying back propagation to shape recognition

Backpropagating the Sensitivity

• Backpropagation: a recurrence relationship in which the sensitivity at layer m is computed from the sensitivity at layer m + 1.

• Jacobian matrix:

$$\frac{\partial \mathbf{n}^{m+1}}{\partial \mathbf{n}^m} = \begin{bmatrix} \dfrac{\partial n^{m+1}_1}{\partial n^m_1} & \dfrac{\partial n^{m+1}_1}{\partial n^m_2} & \cdots & \dfrac{\partial n^{m+1}_1}{\partial n^m_{S^m}} \\ \dfrac{\partial n^{m+1}_2}{\partial n^m_1} & \dfrac{\partial n^{m+1}_2}{\partial n^m_2} & \cdots & \dfrac{\partial n^{m+1}_2}{\partial n^m_{S^m}} \\ \vdots & \vdots & & \vdots \\ \dfrac{\partial n^{m+1}_{S^{m+1}}}{\partial n^m_1} & \dfrac{\partial n^{m+1}_{S^{m+1}}}{\partial n^m_2} & \cdots & \dfrac{\partial n^{m+1}_{S^{m+1}}}{\partial n^m_{S^m}} \end{bmatrix}$$

Page 43: Applying back propagation to shape recognition

Matrix Representation

• The i,j element of the Jacobian matrix:

$$\frac{\partial n^{m+1}_i}{\partial n^m_j} = \frac{\partial \left(\sum_l w^{m+1}_{i,l}\, a^m_l + b^{m+1}_i\right)}{\partial n^m_j} = w^{m+1}_{i,j}\,\frac{\partial a^m_j}{\partial n^m_j} = w^{m+1}_{i,j}\,\dot{f}^m(n^m_j), \qquad \dot{f}^m(n^m_j) = \frac{\partial f^m(n^m_j)}{\partial n^m_j}.$$

• Hence the Jacobian matrix can be written

$$\frac{\partial \mathbf{n}^{m+1}}{\partial \mathbf{n}^m} = \mathbf{W}^{m+1}\,\dot{\mathbf{F}}^m(\mathbf{n}^m), \qquad \dot{\mathbf{F}}^m(\mathbf{n}^m) = \begin{bmatrix} \dot{f}^m(n^m_1) & 0 & \cdots & 0 \\ 0 & \dot{f}^m(n^m_2) & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \dot{f}^m(n^m_{S^m}) \end{bmatrix}.$$

Page 44: Applying back propagation to shape recognition

Recurrence Relation

• The recurrence relation for the sensitivity:

$$\mathbf{s}^m = \frac{\partial \hat{F}}{\partial \mathbf{n}^m} = \left(\frac{\partial \mathbf{n}^{m+1}}{\partial \mathbf{n}^m}\right)^{\!T}\frac{\partial \hat{F}}{\partial \mathbf{n}^{m+1}} = \dot{\mathbf{F}}^m(\mathbf{n}^m)\,\left(\mathbf{W}^{m+1}\right)^T \mathbf{s}^{m+1}.$$

• The sensitivities are propagated backward through the network from the last layer to the first layer:

$$\mathbf{s}^M \rightarrow \mathbf{s}^{M-1} \rightarrow \cdots \rightarrow \mathbf{s}^2 \rightarrow \mathbf{s}^1.$$

Page 45: Applying back propagation to shape recognition

Intrusion Detection System (IDS)

• Intrusion detection systems are responsible for identifying and detecting any unauthorized use of a system, misuse of it, or damage to it, by both internal and external users. Intrusion detection and prevention is today regarded as one of the main mechanisms for providing security for networks and computer systems, and such systems are generally deployed alongside firewalls, as a security complement to them.

Page 46: Applying back propagation to shape recognition

Intrusion Detection System (IDS)….

• Intrusion detection systems are implemented as software or as hardware, and each kind has its own advantages and disadvantages. Speed and accuracy are advantages of hardware systems, and the fact that their security cannot be defeated by intruders is another strength of such systems. On the other hand, the ease of using software and its adaptability to varying software environments and different operating systems give software systems wider applicability, and they are generally the more suitable choice.

Page 47: Applying back propagation to shape recognition

Intrusion Detection Methods

• Misuse detection: matches the activities occurring on an information system to the signatures of known intrusions.

• Anomaly detection: compares the activities on the information system to the normal behaviour.

Page 48: Applying back propagation to shape recognition

Motivation for using AI for Intrusion Detection

Disadvantages of the traditional methods:

• false alarms
• continual updating of the database with new signatures

Advantages of AI-based techniques:

• flexibility
• adaptability
• pattern recognition, and the ability to recognize new patterns
• the ability to learn

Page 49: Applying back propagation to shape recognition

AI techniques used for Intrusion Detection

• Support Vector Machines (SVMs)

• Artificial Neural Networks (ANNs)

• Expert Systems

• Multivariate Adaptive Regression Splines (MARS)

Page 50: Applying back propagation to shape recognition

Traditional Neural Network Based IDS

Typically consist of a single neural network based on either misuse detection or anomaly detection.

Neural networks with good pattern classification abilities, such as the Multilayer Perceptron and Radial Basis Function networks, are typically used for misuse detection.

Neural networks with good clustering abilities, such as Self-Organizing Maps (SOM) and competitive learning neural networks, are typically used for anomaly detection.

Page 51: Applying back propagation to shape recognition

Hybrid Neural Network Approach

Combination of misuse detection based and anomaly detection based systems: clustering results in dimensionality reduction, while classification attains attack identification.

Advantages: improved accuracy and enhanced flexibility.

Examples:

• SOM and MLP using back propagation
• SOM and RBF
• SOM and CNN, etc.

Page 52: Applying back propagation to shape recognition

Hybrid Neural Network Approach (Using SOM and MLP)

A SOM employing unsupervised learning is used for clustering.

An MLP employing the back propagation algorithm is used for classification.

The output from the SOM is given as input to the MLP.
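A minimal sketch of this data flow in MATLAB, assuming the Neural Network Toolbox functions selforgmap, ind2vec, newff, train and sim; the feature matrix X and the labels are made up:

X = rand(10, 200);                 % 200 records, 10 features each (made up)
labels = randi(2, 1, 200);         % made-up attack/normal labels
T = full(ind2vec(labels));         % 2 x 200 one-hot target matrix

som = selforgmap([5 5]);           % 5 x 5 map, unsupervised clustering
som = train(som, X);
C = full(sim(som, X));             % 25 x 200 cluster activations: the
                                   % dimensionality-reduced view of the data

mlp = newff(C, T, 20, {}, 'traingd');  % MLP trained with back propagation
mlp = train(mlp, C, T);
Y = sim(mlp, C);                   % classification: attack identification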

Page 53: Applying back propagation to shape recognition

Self Organizing Maps

• In a self-organizing network, a competitive learning method is used for training. The processing units (nodes) of a self-organizing network are placed on a one-dimensional, two-dimensional, or higher-dimensional lattice, and they become ordered with respect to the input patterns through a competitive learning process.

• The competitive learning employed in such networks works as follows: at each learning step the units compete with one another to become active; at the end of each round of competition only one unit wins, and its weights are changed differently from the weights of the other units. This kind of learning is called unsupervised learning.
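One winner-take-all learning step might be sketched in MATLAB as follows; all data here are made up:

X = rand(2, 50);                    % made-up 2-D input patterns
W = rand(2, 9);                     % weight vectors of 9 units on a 3 x 3 grid
lr = 0.1;                           % learning rate

x = X(:, 1);                                    % present one input pattern
[~, win] = min(sum((W - repmat(x, 1, 9)).^2));  % the nearest unit wins the competition
W(:, win) = W(:, win) + lr*(x - W(:, win));     % only the winner's weights move toward x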

Page 54: Applying back propagation to shape recognition

Proposed hybrid SOM_BPN Neural Network

Page 55: Applying back propagation to shape recognition

Shape recognition by backpropagation in matlab

• In this section we design a neural network capable of identifying some conventional shapes; for reasons of simplicity we have chosen the following shapes: triangle, rectangle and circle.

• The network is an MLP (MultiLayer Perceptron).

• First we prepare the samples (images), then proceed to learning, and then to shape recognition.

• The work is done in MATLAB, exploiting functions of the Neural Network Toolbox.

Page 56: Applying back propagation to shape recognition

Preparation of training data and simulation

• The data used are 4 images for each type of shape (12 in total): 3 of each shape (9 in total) are used for training, and one of each shape (3 in total) is used for the test.

Page 57: Applying back propagation to shape recognition

Preparation of training data and simulation…

Page 58: Applying back propagation to shape recognition

Preparation of training data and simulation…

• The images are binary bitmap images with a resolution of 192 x 160. One of the key steps in preparing the data is to bring the images to unified dimensions; for simplicity, all images here are already the same size, and the shapes are of roughly the same size relative to the image resolution.

• To fill the input matrix, each image matrix is divided into batches of 16 rows; we traverse each batch, and whenever we encounter a black pixel we accumulate its column index into a matrix P.

• Since our image has 160 rows, we obtain 10 (160/16) input entries.

Page 59: Applying back propagation to shape recognition

Preparation of training data and simulation…

• Thus the matrix P contains 10 rows and 9 columns (the number of samples in the input data).

• Similarly, it is necessary to prepare a target matrix T, which tells the neural network during learning whether the sample is a triangular, rectangular, or circular shape.

• The matrix T is a 3 x 9 matrix, such that each column gives the target values for the corresponding sample.

Page 60: Applying back propagation to shape recognition

The associated code

Num_Inputs = 10;            % number of row batches: 160/16
P = zeros(Num_Inputs, 9);   % input matrix
T = zeros(3, 9);            % target matrix

Page 61: Applying back propagation to shape recognition

The associated code…

for h = 1:9                          % for each of the 9 training images
    switch h
        case 1
            Img = imread('t1.bmp');  % triangle
            Target = [1; -1; -1];
        case 4
            Img = imread('r1.bmp');  % rectangle
            Target = [-1; 1; -1];
        case 7
            Img = imread('c1.bmp');  % circle
            Target = [-1; -1; 1];
        % the remaining cases (2, 3, 5, 6, 8, 9) load the other training
        % images analogously; they are omitted on the slide
    end
    T(:,h) = Target;
    % the loop body continues with the pixel-accumulation code below

Page 62: Applying back propagation to shape recognition

The associated code…

• imread: returns 0 for black pixels and 1 for white ones.

Page 63: Applying back propagation to shape recognition

The associated code…

    [Num_Row, Num_column] = size(Img);

    for i = 1:Num_Inputs                 % 10 row batches
        for j = (((Num_Row/Num_Inputs)*(i-1))+1) : ((Num_Row/Num_Inputs)*i)   % rows, 160 in total
            for k = 1:Num_column         % columns, 192 in total
                if Img(j,k) == 0         % black pixel
                    P(i,h) = P(i,h) + k; % accumulate its column index
                end
            end
        end
    end
end                                      % closes the for h loop

Page 64: Applying back propagation to shape recognition

Input matrix

Page 65: Applying back propagation to shape recognition

Target matrix

Page 66: Applying back propagation to shape recognition

Test values and associated code

The same code is then run for the test objects, filling a matrix S:

S = zeros(Num_Inputs, 3);
for h = 1:3                          % 3 test images
    switch h
        case 1
            Img = imread('t4.bmp');
        % cases 2 and 3 load the other two test images analogously
    end

Page 67: Applying back propagation to shape recognition

Test values and associated code…

    [Num_Row, Num_column] = size(Img);

    for i = 1:Num_Inputs
        for j = (((Num_Row/Num_Inputs)*(i-1))+1) : ((Num_Row/Num_Inputs)*i)
            for k = 1:Num_column
                if Img(j,k) == 0
                    S(i,h) = S(i,h) + k;
                end
            end
        end
    end
end

Page 68: Applying back propagation to shape recognition

Test matrix

Page 69: Applying back propagation to shape recognition

Normalizing the values of the matrices P and S between -1 and 1

A = [P, S];
maxi = max(max(A));
mini = min(min(A));
[a, b] = size(A);
for i = 1:a
    for j = 1:b
        % note: this maps onto [-1, 1] exactly only when mini = 0;
        % the general form would be 2*(A(i,j)-mini)/(maxi-mini) - 1
        AN(i,j) = 2*(A(i,j)/(maxi-mini)) - 1;
    end
end

Page 70: Applying back propagation to shape recognition

Result of normalizing P

Page 71: Applying back propagation to shape recognition

Result of normalizing S

Page 72: Applying back propagation to shape recognition

Learning

• The neural network is an MLP whose hidden layer contains 20 neurons with the tansig activation function, and whose output layer contains 3 neurons with the purelin activation function. The code for creating and training the network is as follows:

Page 73: Applying back propagation to shape recognition

Learning …

Num_Neuron_Hidden = 20;
net = newff(P, T, Num_Neuron_Hidden, {}, 'traingd');

• newff: creates a feed-forward backpropagation network.
• traingd: the backpropagation (gradient descent) network training function.
• {}: transfer function of each layer; the default is 'tansig' for hidden layers and 'purelin' for the output layer.

Page 74: Applying back propagation to shape recognition

Learning …

net = init(net);                 % reinitialize weights and biases
net.trainParam.epochs = 500;     % maximum number of iterations
net.trainParam.goal = 0.0001;    % error goal
net = train(net, P, T);          % start learning

Page 75: Applying back propagation to shape recognition
Page 76: Applying back propagation to shape recognition

Error plot

Page 77: Applying back propagation to shape recognition

Simulation

• y = sim(net,S);

• and the output:

• As you see, for the test shapes:

• For the first shape: (0.3700 > -0.8391 > -1.4397), so the shape is a triangle.
• For the second shape: (1.5965 > 0.4745 > -0.4391), so the shape is a rectangle.
• For the third shape: (0.2083 > -1.3437 > -1.4397), so the shape is a circle.

[Figure: the three test images, recognized as triangle, rectangle, and circle.]
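The winner-take-all decision applied above can be made explicit with a few extra lines (a sketch, not part of the original code):

shapes = {'triangle', 'rectangle', 'circle'};
[~, idx] = max(y);            % column-wise maximum over the 3 output units
recognized = shapes(idx)      % one recognized shape per test column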

Page 78: Applying back propagation to shape recognition

Face recognition

• The learning task here involves classifying camera images of faces of various people in various poses.

• Images of 20 different people were collected, including approximately 32 images per person, varying the person's expression (happy, sad, angry, neutral), the direction in which they were looking (left, right, straight ahead, up), and whether or not they were wearing sunglasses.

Page 79: Applying back propagation to shape recognition

Face recognition…

• As can be seen from the example images, there is also variation in the background behind the person, the clothing worn by the person, and the position of the person's face within the image.

• In total, 624 greyscale images were collected, each with a resolution of 120 x 128.

Page 80: Applying back propagation to shape recognition

Face recognition…

• A variety of target functions can be learned from this image data.

• For example, given an image as input we could train an ANN to output the

o the identity of the person,
o the direction in which the person is facing,
o the gender of the person,
o whether or not they are wearing sunglasses, etc.

Page 81: Applying back propagation to shape recognition

Face recognition…

• In the remainder of this section we consider one particular task:

• learning the direction in which the person is facing (to their left, right, straight ahead, or upward).

Page 82: Applying back propagation to shape recognition
Page 83: Applying back propagation to shape recognition

Typical input images

Page 84: Applying back propagation to shape recognition

Face recognition…

• Training an artificial neural network to recognize face pose: here a 960 x 3 x 4 network is trained on grey-level images of faces to predict whether a person is looking to their left, right, ahead, or up.

• After training on 260 such images, the network achieves an accuracy of 90% over a separate test set.

• The learned network weights are shown after one weight-tuning iteration through the training examples and after 100 iterations.

Page 85: Applying back propagation to shape recognition

Learned weights

• Network weights after 1 iteration through each training example

Page 86: Applying back propagation to shape recognition

Learned weights…

• Network weights after 100 iterations through each training example

Page 87: Applying back propagation to shape recognition

Face recognition…

• The leftmost block corresponds to the weight w0, which determines the unit threshold, and the three blocks to the right correspond to weights on inputs from the three hidden units.

• After training on a set of 260 images, classification accuracy over a separate test set is 90%. In contrast, the default accuracy achieved by randomly guessing one of the four possible face directions is 25%.

Page 88: Applying back propagation to shape recognition

Design Choices

• In applying BACKPROPAGATION to any given task, a number of design choices must be made.

• For example, we could preprocess the image to extract edges, regions of uniform intensity, or other local image features, then input these features to the network.

Page 89: Applying back propagation to shape recognition

Difficulty with design choice

• One difficulty with this design option is that it would lead to a variable number of features (e.g., edges) per image, whereas the ANN has a fixed number of input units.

• The design option chosen in this case was instead to encode the image as a fixed set of 30 x 32 pixel intensity values, with one network input per pixel.

• The pixel intensity values ranging from 0 to 255 were linearly scaled to range from 0 to 1.

Page 90: Applying back propagation to shape recognition

Design choice difficulty's solution

• The 30 x 32 pixel image is, in fact, a coarse resolution summary of the original 120 x 128 captured image, with each coarse pixel intensity calculated as the mean of the corresponding high-resolution pixel intensities.

• Using this coarse-resolution image reduces the number of inputs and network weights to a much more manageable size, thereby reducing computational demands, while maintaining sufficient resolution to correctly classify the images
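A possible MATLAB sketch of this block-mean reduction; the file name face.pgm and the variable names are hypothetical:

img = double(imread('face.pgm')) / 255;   % 120 x 128 image, scaled to [0, 1]
coarse = zeros(30, 32);
for r = 1:30
    for c = 1:32
        block = img(4*r-3:4*r, 4*c-3:4*c);  % the corresponding 4 x 4 block
        coarse(r, c) = mean(block(:));      % coarse pixel = mean intensity
    end
end
inputs = coarse(:);                         % 960 network inputs, one per pixel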

Page 91: Applying back propagation to shape recognition

Differences with ALVINN

• One interesting difference is that in ALVINN, each coarse resolution pixel intensity is obtained by selecting the intensity of a single pixel at random from the appropriate region within the high-resolution image, rather than taking the mean of all pixel intensities within this region

• The motivation for this in ALVINN is that it significantly reduces the computation required to produce the coarse-resolution image from the available high-resolution image. This efficiency is especially important when the network must be used to process many images per second while autonomously driving the vehicle.

Page 92: Applying back propagation to shape recognition

Output encoding

• The ANN must output one of four values indicating the direction in which the person is looking (left, right, up, or straight)

• We could encode this four-way classification using a single output unit, assigning outputs of, say, 0.2, 0.4, 0.6, and 0.8 to encode these four possible values.

• Instead, we use four distinct output units, each representing one of the four possible face directions, with the highest-valued output taken as the network prediction

Page 93: Applying back propagation to shape recognition

Output encoding…

• This is often called a 1-of-n output encoding. There are two motivations for choosing the 1-of-n output encoding over the single unit option.

• First, it provides more degrees of freedom to the network for representing the target function (i.e., there are n times as many weights available in the output layer of units)

• Second, in the 1-of-n encoding the difference between the highest-valued output and the second-highest can be used as a measure of the confidence in the network prediction
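Both points can be sketched in a few lines of MATLAB; the output vector below is made up:

directions = {'left', 'straight', 'right', 'up'};
o = [0.82; 0.15; 0.09; 0.31];           % example outputs of the four units
[sorted, order] = sort(o, 'descend');
prediction = directions{order(1)};      % highest-valued output wins: 'left'
confidence = sorted(1) - sorted(2);     % gap to the second-highest output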

Page 94: Applying back propagation to shape recognition

Target values

• One obvious choice would be to use the four target values (1,0,0,0) to encode a face looking to the left, (0,1,0,0) to encode a face looking straight, etc.

• Instead of 0 and 1 values, we use values of 0.1 and 0.9, so that (0.9, 0.1, 0.1, 0.1) is the target output vector for a face looking to the left.

• The reason for avoiding target values of 0 and 1 is that sigmoid units cannot produce these output values given finite weights.

• If we attempt to train the network to fit target values of exactly 0 and 1, gradient descent will force the weights to grow without bound.
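Building such a target vector takes two lines (a sketch):

k = 1;                       % e.g., class 1 = looking to the left
target = 0.1 * ones(4, 1);
target(k) = 0.9;             % gives (0.9, 0.1, 0.1, 0.1)'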

Page 95: Applying back propagation to shape recognition

Network structure

• Another design choice we face is how many units to include in the network and how to interconnect them.

• The most common network structure is a layered network with feed forward connections from every unit in one layer to every unit in the next.

• In the current design we chose this standard structure, using two layers of sigmoid units (one hidden layer and one output layer).

• It is not common to use more layers than this because training times become long.

Page 96: Applying back propagation to shape recognition

Network structure…

• How many hidden units should we include?

• In the results reported before, only three hidden units were used, yielding a test set accuracy of 90%.

• In other experiments 30 hidden units were used, yielding a test set accuracy one to two percent higher.

• Although the generalization accuracy varied only a small amount between these two experiments, the second experiment required significantly more training time.

Page 97: Applying back propagation to shape recognition

Other learning algorithm parameters

• In these learning experiments the learning rate was set to 0.3, and the momentum was set to 0.3.

• Lower values for both parameters yield longer training times.

• If these values are set too high, training fails to converge to a network with acceptable error over the training set.

Page 98: Applying back propagation to shape recognition

Learned Hidden Representations

• consider first the four rectangular blocks just below the face images in the figure. Each of these rectangles depicts the weights for one of the four output units in the network (encoding left, straight, right, and up).

• The four squares within each rectangle indicate the four weights associated with this output unit: the weight w0, which determines the unit threshold (on the left), followed by the three weights connecting the three hidden units to this output.

Page 99: Applying back propagation to shape recognition

Learned weights…

• Network weights after 100 iterations through each training example

Page 100: Applying back propagation to shape recognition

Learned Hidden Representations…

• The brightness of the square indicates the weight value: bright white indicates a large positive weight, dark black indicates a large negative weight, and intermediate shades of grey indicate intermediate weight values.

• For example, the output unit labeled "up" has a near zero w0 threshold weight, a large positive weight from the first hidden unit, and a large negative weight from the second hidden unit.

Page 101: Applying back propagation to shape recognition

PCA

• Artificial neural networks, for example the multilayer perceptron (MLP) model, have attracted attention as a comprehensive and powerful method for approximating any arbitrary linear or nonlinear system without any prior assumptions. However, the MLP model still faces certain obstacles when dealing with high-dimensional systems in the input data space. Principal component analysis has attracted attention as the most important feature extraction method, and it can remove redundant variables from the original input data.

Page 102: Applying back propagation to shape recognition

PCA

• The PCA method first extracts information to find the directions in the (n-dimensional) input space, spanned by the input variables (Xi, i = 1, …, n), that carry the greatest variability.

• Then the eigenvalues and the corresponding normalized eigenvectors of the covariance matrix are computed, and the inputs are projected onto a smaller subspace to carry out the dimensionality reduction. In classical statistical theory, this means that the i-th principal component of X is found by solving for the i-th eigenvector of the covariance matrix.
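A compact MATLAB sketch of this procedure on a made-up data matrix X, where rows are samples and columns are the n input variables:

X = randn(100, 8);                            % made-up data: 100 samples, n = 8
Xc = X - repmat(mean(X, 1), size(X, 1), 1);   % center each variable
C = cov(Xc);                                  % n x n covariance matrix
[V, D] = eig(C);                              % eigenvectors and eigenvalues
[lambda, order] = sort(diag(D), 'descend');   % largest variability first
V = V(:, order);
k = 3;                                        % keep the k leading components
Z = Xc * V(:, 1:k);                           % project onto the smaller subspace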