PR-lecture 6-7


They said: "Glory be to You! We have no knowledge except what You have taught us. Indeed, You are the All-Knowing, the All-Wise." Almighty God has spoken the truth.

Prayers and peace be upon the noblest of God's creation, our Prophet and master Muhammad, may God's prayers and peace be upon him.

Glory and praise be to You, O God. I bear witness that there is no god but You. I seek Your forgiveness and repent to You.

Lecture 6, lecture 7 Professor Sayed Fadel Bahgat Pattern Recognition

Prof. Dr. Sayed Fadel Bahgat

Sayed_fadel2006@yahoo.com

Chapter two


• Homogeneity (HOM), also called "Inverse Difference Moment"


• Homogeneity weights values by the inverse of the Contrast weight, with weights decreasing exponentially away from the diagonal:

Homogeneity equation: Homogeneity = Σi,j Pi,j / (1 + (i-j)²)


• Exercise: Calculate the homogeneity value for the horizontal GLCM and compare it with the Dissimilarity value.

• Homogeneity calculation: Homogeneity weights × horizontal GLCM


• Homogeneity weights × horizontal GLCM, term by term: (1 + (i-j)²)⁻¹ · Pi,j

Homogeneity weight matrix, (1 + (i-j)²)⁻¹:

1     0.5   0.2   0.1
0.5   1     0.5   0.2
0.2   0.5   1     0.5
0.1   0.2   0.5   1

Normalized symmetrical horizontal GLCM matrix, Pi,j:

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.250  0.042
0      0      0.042  0.083

Element-wise product of the two matrices:

0.166  0.042  0.008  0
0.042  0.166  0      0
0.008  0      0.250  0.021
0      0      0.021  0.083

• Sum of the multiplication-results matrix = 0.807. In non-matrix form:

• 0.166(1) + 0.083(0.5) + 0.042(0.2) + 0(0.1)
• + 0.083(0.5) + 0.166(1) + 0(0.5) + 0(0.2)
• + 0.042(0.2) + 0(0.5) + 0.250(1) + 0.042(0.5)
• + 0(0.1) + 0(0.2) + 0.042(0.5) + 0.083(1)

• = 0.166 + 0.042 + 0.008 + 0 + 0.042 + 0.166 + 0 + 0 + 0.008 + 0 + 0.250 + 0.021 + 0 + 0 + 0.021 + 0.083
• = 0.807
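The worked calculation above can be checked in a few lines of plain Python. This is a minimal sketch (the function name `homogeneity` is ours), using the rounded GLCM values printed on the slide:

```python
# Normalized symmetrical horizontal GLCM, as printed on the slide (rounded values).
P = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]

def homogeneity(glcm):
    """Sum of P[i][j] / (1 + (i - j)^2) over every cell of the GLCM."""
    n = len(glcm)
    return sum(glcm[i][j] / (1 + (i - j) ** 2)
               for i in range(n) for j in range(n))

print(round(homogeneity(P), 3))  # 0.807
```

Cells on the diagonal keep full weight 1, and the weight falls off quickly (0.5, 0.2, 0.1) away from it, which is why orderly windows score high.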

Simple Analysis of Texture


• Homogeneity is the most commonly used of these measures; its value increases as contrast in the window decreases.

• However, it would be easy to use the above model to construct a first-degree "similarity" measure. Write the equation and perform the calculation for "similarity" using the horizontal GLCM.

• Similarity equation: Similarity = Σi,j Pi,j / (1 + |i-j|)

• Exercise: Calculate the similarity value for the horizontal GLCM.

• Similarity calculation: Similarity weights × horizontal GLCM = multiplication results

Similarity weight matrix, (1 + |i-j|)⁻¹:

1      0.5    0.333  0.25
0.5    1      0.5    0.333
0.333  0.5    1      0.5
0.25   0.333  0.5    1

Normalized symmetrical horizontal GLCM matrix, Pi,j:

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.250  0.042
0      0      0.042  0.083

Element-wise product of the two matrices:

0.166  0.042  0.014  0
0.042  0.166  0      0
0.014  0      0.250  0.021
0      0      0.021  0.083

• Sum of the multiplication-results matrix ≈ 0.818. In non-matrix form:

• 0.166(1) + 0.083(0.5) + 0.042(0.333) + 0(0.25)
• + 0.083(0.5) + 0.166(1) + 0(0.5) + 0(0.333)
• + 0.042(0.333) + 0(0.5) + 0.250(1) + 0.042(0.5)
• + 0(0.25) + 0(0.333) + 0.042(0.5) + 0.083(1)

• = 0.166 + 0.042 + 0.014 + 0 + 0.042 + 0.166 + 0 + 0 + 0.014 + 0 + 0.250 + 0.021 + 0 + 0 + 0.021 + 0.083

• ≈ 0.818
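The same check works for the similarity measure. A minimal sketch (the name `similarity` is ours; the slides only describe the measure), again with the rounded slide GLCM:

```python
# Same rounded horizontal GLCM as in the homogeneity exercise.
P = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]

def similarity(glcm):
    """First-degree analogue of homogeneity: weights fall off as 1/(1 + |i-j|)."""
    n = len(glcm)
    return sum(glcm[i][j] / (1 + abs(i - j))
               for i in range(n) for j in range(n))

print(round(similarity(P), 3))  # 0.818
```

Because the 1/(1+|i-j|) weights fall off more slowly than the 1/(1+(i-j)²) homogeneity weights, similarity comes out slightly higher than homogeneity for the same GLCM.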


• Angular Second Moment (ASM) and Energy (also called Uniformity)

• ASM and Energy use each Pij as a weight for itself. High values of ASM or Energy occur when the window is very orderly.


ASM equation: ASM = Σi,j (Pi,j)²

The square root of the ASM is sometimes used as a texture measure, and is called Energy.


Normalized symmetrical horizontal GLCM matrix, Pi,j:

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.250  0.042
0      0      0.042  0.083

• Exercise: Perform the ASM calculation for the horizontal GLCM.

• Answer: the matrix of (Pi,j)² is:

0.028  0.007  0.002  0
0.007  0.028  0      0
0.002  0      0.0625 0.002
0      0      0.002  0.007

• Summed: ASM ≈ 0.145, so Energy = √0.145 ≈ 0.381.
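The ASM and Energy values can be verified the same way. A short sketch (`asm` and `energy` are our helper names), using the rounded slide GLCM:

```python
import math

# Rounded horizontal GLCM from the slides.
P = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]

def asm(glcm):
    """Angular Second Moment: each P[i][j] is used as its own weight."""
    return sum(p * p for row in glcm for p in row)

def energy(glcm):
    """Energy (Uniformity) is the square root of ASM."""
    return math.sqrt(asm(glcm))

print(round(asm(P), 3))     # 0.145
print(round(energy(P), 3))  # 0.381
```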


• Entropy

Entropy equation: Entropy = Σi,j Pi,j · (-ln Pi,j), with 0·ln 0 taken as 0.

• Entropy calculation: ln(Pi,j) × horizontal GLCM × (-1) = multiplication results

Normalized symmetrical horizontal GLCM matrix, Pi,j:

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.250  0.042
0      0      0.042  0.083

ln(Pi,j), taking the logarithm only of the nonzero entries:

-1.7957  -2.4889  -3.1700   0
-2.4889  -1.7957   0        0
-3.1700   0       -1.3863  -3.1700
 0        0       -3.1700  -2.4889

-Pi,j · ln(Pi,j):

0.2980  0.2065  0.1331  0
0.2065  0.2980  0       0
0.1331  0       0.3465  0.1331
0       0       0.1331  0.2065

Entropy = sum of the multiplication-results matrix = 2.0951
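The entropy sum can be reproduced directly, provided zero cells are skipped (the 0·ln 0 = 0 convention). A minimal sketch with the rounded slide GLCM:

```python
import math

# Rounded horizontal GLCM from the slides.
P = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]

def entropy(glcm):
    """-sum of P[i][j] * ln(P[i][j]); cells with P = 0 contribute 0."""
    return -sum(p * math.log(p) for row in glcm for p in row if p > 0)

print(round(entropy(P), 4))  # 2.0951
```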

GLCM Mean

• GLCM Mean equations: μi = Σi,j i · Pi,j and μj = Σi,j j · Pi,j

• The left-hand equation calculates the mean based on the reference pixels, μi. It is also possible to calculate the mean using the neighbor pixels, μj, as in the right-hand equation.

• For the symmetrical GLCM, the two values μi and μj are identical.

• Exercise: For the test image, calculate the mean of the symmetrical horizontal and vertical GLCMs.

Normalized symmetrical horizontal GLCM matrix, Pi,j:

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.250  0.042
0      0      0.042  0.083

• For the test image, calculate the Mean of the symmetrical horizontal GLCM:

• μi = 0·(0.166 + 0.083 + 0.042 + 0) + 1·(0.083 + 0.166 + 0 + 0) + 2·(0.042 + 0 + 0.250 + 0.042) + 3·(0 + 0 + 0.042 + 0.083)

• = 0.249 + 2(0.334) + 3(0.125) = 0.249 + 0.668 + 0.375

• = 1.292 = μj

Normalized symmetrical vertical GLCM matrix, Pi,j:

0.250  0      0.083  0
0      0.167  0.083  0
0.083  0.083  0.083  0.083
0      0      0.083  0

• For the test image, calculate the Mean of the symmetrical vertical GLCM:

• μi = 0·(0.250 + 0 + 0.083 + 0) + 1·(0 + 0.167 + 0.083 + 0) + 2·(0.083 + 0.083 + 0.083 + 0.083) + 3·(0 + 0 + 0.083 + 0)

• ≈ 1.162 = μj

• The mean of the original values in the window (not the GLCM mean) is 1.25.

Test image window:

0 0 1 1
0 0 1 1
0 2 2 2
2 2 3 3

• This clearly demonstrates that the GLCM Mean and the "ordinary" mean are not the same measure.
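Both GLCM means can be computed with one small function. A sketch (`glcm_mean` is our name), using the rounded horizontal and vertical GLCMs from the slides:

```python
# Rounded horizontal (H) and vertical (V) GLCMs from the slides.
H = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]
V = [
    [0.250, 0.0,   0.083, 0.0],
    [0.0,   0.167, 0.083, 0.0],
    [0.083, 0.083, 0.083, 0.083],
    [0.0,   0.0,   0.083, 0.0],
]

def glcm_mean(glcm):
    """mu_i = sum of i * P[i][j]: the reference-pixel mean.
    For a symmetrical GLCM the neighbour-pixel mean mu_j is identical."""
    n = len(glcm)
    return sum(i * glcm[i][j] for i in range(n) for j in range(n))

print(round(glcm_mean(H), 3))  # 1.292
print(round(glcm_mean(V), 3))  # 1.163 (the slides' 1.162 reflects rounding)
```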

• The "ordinary" mean would be a first-order "texture" measure, and it is difficult to see how it could be called texture in any practical sense.

• The first-order standard deviation, however, is commonly used as a texture measure.

• Remember: all GLCM texture is second-order (it concerns the relationship between two pixels).


• GLCM Variance (GLCM Standard Deviation)

Variance equation: σi² = Σi,j Pi,j · (i - μi)²

• Standard deviation equation: σi = √(σi²)

• Properties of Variance

• Variance is a measure of the dispersion of the values around the mean.

• It is similar to entropy. It answers the question "What is the dispersion of the difference between the reference and the neighbor pixels in this window?"

• Exercise: Calculate the Variance texture for both the horizontal and vertical GLCM of the test image.

• μi = 1.292

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.250  0.042
0      0      0.042  0.083

• Variance (horizontal) =
• 0.166(0-1.292)² + 0.083(0-1.292)² + 0.042(0-1.292)² + 0
• + 0.083(1-1.292)² + 0.166(1-1.292)² + 0 + 0
• + 0.042(2-1.292)² + 0 + 0.250(2-1.292)² + 0.042(2-1.292)²
• + 0 + 0 + 0.042(3-1.292)² + 0.083(3-1.292)²

• = 1.039067

• μi = 1.162

0.250  0      0.083  0
0      0.167  0.083  0
0.083  0.083  0.083  0.083
0      0      0.083  0

• Variance (vertical) =
• 0.250(0-1.162)² + 0 + 0.083(0-1.162)² + 0
• + 0 + 0.167(1-1.162)² + 0.083(1-1.162)² + 0
• + 0.083(2-1.162)² + 0.083(2-1.162)² + 0.083(2-1.162)² + 0.083(2-1.162)²
• + 0 + 0 + 0.083(3-1.162)² + 0

• = 0.969705

• For comparison, the variance of the original image values (rather than the GLCM) is 1.0625, giving a standard deviation of 1.030776.
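Both variance calculations follow the same pattern, so they are easy to check in code. A sketch (`glcm_variance` is our name) that recomputes the mean internally from the rounded slide GLCMs:

```python
# Rounded horizontal (H) and vertical (V) GLCMs from the slides.
H = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]
V = [
    [0.250, 0.0,   0.083, 0.0],
    [0.0,   0.167, 0.083, 0.0],
    [0.083, 0.083, 0.083, 0.083],
    [0.0,   0.0,   0.083, 0.0],
]

def glcm_variance(glcm):
    """sigma_i^2 = sum of P[i][j] * (i - mu_i)^2, with mu_i the GLCM mean."""
    n = len(glcm)
    mu = sum(i * glcm[i][j] for i in range(n) for j in range(n))
    return sum(glcm[i][j] * (i - mu) ** 2
               for i in range(n) for j in range(n))

print(round(glcm_variance(H), 3))  # 1.039
print(round(glcm_variance(V), 3))  # 0.970
```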

• GLCM Correlation

• The Correlation texture measures the linear dependency of grey levels on those of neighboring pixels.

• GLCM Correlation equation: Correlation = Σi,j Pi,j · (i - μi)(j - μj) / √(σi² · σj²)


• Exercise: Calculate the Correlation measure for the horizontal test image.

• This is easier if you do the GLCM Mean and Variance exercises first and use their results.

• Correlation (horizontal GLCM):

• Let A = 1 / [(1.039067)(1.039067)]^0.5 = 1 / 1.039067

• 0.166(0-1.292)(0-1.292)A + 0.083(0-1.292)(1-1.292)A + 0.042(0-1.292)(2-1.292)A + 0
• + 0.083(1-1.292)(0-1.292)A + 0.166(1-1.292)(1-1.292)A + 0 + 0
• + 0.042(2-1.292)(0-1.292)A + 0 + 0.250(2-1.292)(2-1.292)A + 0.042(2-1.292)(3-1.292)A
• + 0 + 0 + 0.042(3-1.292)(2-1.292)A + 0.083(3-1.292)(3-1.292)A

• ≈ 0.718
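Since the symmetrical GLCM has μi = μj and σi² = σj², the normalizing factor √(σi²·σj²) reduces to the variance itself. A sketch (`glcm_correlation` is our name) that reuses the mean and variance internally:

```python
# Rounded horizontal GLCM from the slides.
H = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]

def glcm_correlation(glcm):
    """sum of P[i][j]*(i-mu)(j-mu) / variance; for a symmetrical GLCM
    mu_i = mu_j and sigma_i^2 = sigma_j^2, so sqrt(var_i*var_j) = variance."""
    n = len(glcm)
    mu = sum(i * glcm[i][j] for i in range(n) for j in range(n))
    var = sum(glcm[i][j] * (i - mu) ** 2 for i in range(n) for j in range(n))
    cov = sum(glcm[i][j] * (i - mu) * (j - mu) for i in range(n) for j in range(n))
    return cov / var

print(round(glcm_correlation(H), 3))  # 0.718
```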

• Creating a texture image

• The result of a texture calculation is a single number representing the entire window.

• This number is put in the place of the centre pixel of the window; then the window is moved one pixel and the process is repeated, calculating a new GLCM and a new texture measure. In this way an entire image of texture values is built up.

• Example: for a 5×5 window, the centre is the third pixel in each direction, so the two rows and columns along each image edge can never be window centres and receive no texture value of their own.
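The sliding-window procedure can be sketched as follows. The helper names (`glcm_symmetric_normalized`, `texture_image`) are ours, homogeneity stands in for whichever texture measure is wanted, and edge pixels that can never be window centres are simply left as None:

```python
def glcm_symmetric_normalized(window, levels, dx=1, dy=0):
    """Horizontal (dx=1, dy=0) symmetrical GLCM of one window, normalized to sum 1."""
    g = [[0.0] * levels for _ in range(levels)]
    rows, cols = len(window), len(window[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                a, b = window[r][c], window[r2][c2]
                g[a][b] += 1          # reference -> neighbour count
                g[b][a] += 1          # add the transpose: symmetrical GLCM
    total = sum(sum(row) for row in g)
    return [[v / total for v in row] for row in g]

def homogeneity(glcm):
    n = len(glcm)
    return sum(glcm[i][j] / (1 + (i - j) ** 2)
               for i in range(n) for j in range(n))

def texture_image(image, levels, win=5):
    """Slide a win x win window one pixel at a time; store each window's
    texture value at its centre pixel.  Pixels closer than win//2 to the
    border are never window centres and stay None."""
    rows, cols = len(image), len(image[0])
    half = win // 2
    out = [[None] * cols for _ in range(rows)]
    for r in range(half, rows - half):
        for c in range(half, cols - half):
            window = [row[c - half:c + half + 1]
                      for row in image[r - half:r + half + 1]]
            out[r][c] = homogeneity(glcm_symmetric_normalized(window, levels))
    return out

# Sanity check against the worked example: the whole 4x4 test image as one window.
img = [[0, 0, 1, 1], [0, 0, 1, 1], [0, 2, 2, 2], [2, 2, 3, 3]]
print(round(homogeneity(glcm_symmetric_normalized(img, 4)), 3))  # 0.808 (0.807 with the slides' rounded GLCM)
```

Building the GLCM from raw counts reproduces the slides' normalized matrix (4/24 ≈ 0.166, 6/24 = 0.250, and so on); the small difference from 0.807 is only the slides' rounding.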


Peace be upon you, and the mercy and blessings of God.