Swarm Intelligence (noimosini sminous)


[Greek front matter (title page, 2008; dedication; Greek abstract with per-chapter summaries): the Greek text was lost in extraction. The colophon partly survives: the thesis was typeset in LaTeX (teTeX) with Kile on Ubuntu Linux; the code was written in C++, FORTRAN and R; the figures were produced with Gnuplot, R, Xfig and Gimp.]

Synopsis

The present thesis deals with the study and development of classification models based on Probabilistic Neural Networks (PNN). The proposed models were developed by incorporating statistical methods, as well as methods from several fields of Computational Intelligence (CI), into PNNs. The subjects and results of the dissertation are organized as follows:

In Chapter 1 the required theoretical elements of statistical decision theory for classification tasks are presented. Moreover, a summary of the most common decision rules and discriminant functions is provided.

Chapter 2 is devoted to the presentation of the concepts that constitute CI. Special attention is given to the optimization methods of CI, especially Particle Swarm Optimization (PSO) and Differential Evolution Algorithms (DEA). Furthermore, Artificial Neural Networks are briefly presented, and a thorough presentation of PNNs is provided regarding their structure, operation, usefulness and various applications. Several known variants of PNNs are also exhibited.

Chapter 3 provides a brief description of the typical resampling methods that are necessary for machine-learning classification problems. Moreover, the methodology required for the statistical comparison of classification algorithms on one or several application tasks is presented.

In Chapter 4 a novel class of classification models comprising variants of PNNs is proposed. In particular, evolutionary optimization algorithms are incorporated into PNNs in pursuit of promising values for the spread parameters of their kernel functions.
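The PNN decision rule summarized above, together with the spread parameter that Chapter 4 tunes with PSO and DEA, can be sketched as follows. This is a minimal illustration assuming Gaussian kernels, uniform class priors, and a single shared spread `sigma`; the function name and toy data are illustrative assumptions, not the thesis's implementation:

```python
import numpy as np

def pnn_predict(X_train, y_train, x, sigma=0.5):
    """Classify x with a basic Probabilistic Neural Network:
    one Gaussian (Parzen) kernel per training pattern, averaged per class
    in the summation layer; the decision layer picks the class with the
    largest density estimate (uniform priors assumed)."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]          # pattern layer: kernels of class c
        d2 = np.sum((Xc - x) ** 2, axis=1)  # squared distances to the query point
        # Gaussian kernel with spread sigma -- the parameter EPNNs optimize
        scores.append(np.mean(np.exp(-d2 / (2.0 * sigma ** 2))))
    return classes[int(np.argmax(scores))]  # decision layer

# Illustrative toy data (not from the thesis): two well-separated classes.
X = np.array([[0., 0.], [0., 1.], [5., 5.], [5., 6.]])
y = np.array([0, 0, 1, 1])
print(pnn_predict(X, y, np.array([0.2, 0.5])))  # -> 0 (nearer to class 0)
```

Because the class-conditional densities depend directly on `sigma`, a poorly chosen spread under- or over-smooths the kernel estimate; this sensitivity is what motivates searching for good spread values with PSO or DEA.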
For this purpose, PSO and DEA are employed, and the new models are named Evolutionary PNNs (EPNN).

In the next chapter, a list of improvements for EPNNs is proposed regarding their performance and required training time. Using unsupervised clustering methods, a new Improved EPNN (IEPNN) is constructed that requires much shorter training time. For further improvement of EPNN performance, the bagging technique is also employed. Moreover, a different spread-parameter matrix of the PNN kernels is used for every class of the available data.

In Chapter 6 a brief summary of the fundamental concepts of Bayesian Analysis is provided. Afterwards, a Bayesian model is proposed for the estimation of the PNN spread parameters, where the estimation is achieved by the Gibbs sampler. The aforementioned model is incorporated into PNNs and EPNNs, yielding a new class of models named Bayesian PNNs (BPNN). Moreover, we study the use of Epanechnikov's kernel function besides the normal kernel.

In the first part of Chapter 7 a short review of the theory of Fuzzy Sets is provided. A Fuzzy Membership Function is employed for the further improvement of EPNN performance in binary classification tasks, and the proposed model is named Fuzzy EPNN (FEPNN). Furthermore, we propose a new decomposition algorithm that converts multiclass classification problems into multiple binary classification ones. Utilizing this algorithm, FEPNNs can also be applied to multiclass classification problems.

This dissertation is completed with Chapter 8 and Appendix A. In the last chapter, a comparison between all the novel models takes place. Moreover, the proposed models are compared to the model that has achieved the greatest performance ever reported for each classification problem. In Appendix A, we provide a short description of all the classification problems that were used in this thesis for the evaluation of the proposed models.

[Greek acknowledgements, dated 2008, and the table of contents follow; the Greek text was lost in extraction.]
[Table of contents: the Greek chapter and section headings were lost in extraction. Entries recoverable from the surviving Latin text: 1.2.3 Outliers; 3.2.1 M-fold Cross-Validation; 3.2.2 Bootstrap; 3.3.1 McNemar test; 3.3.2 t-test; 3.3.3 t-test with MCV; 3.3.4 Wilcoxon test; 5.2 Bagging.]