
Page 1:

A Technique for Advanced Dynamic Integration of Multiple Classifiers

Alexey Tsymbal*, Seppo Puuronen**, Vagan Terziyan*

*Department of Artificial Intelligence and Information Systems, Kharkov State Technical University of Radioelectronics, UKRAINE

e-mail: [email protected], [email protected]

**Department of Computer Science and Information Systems, University of Jyvaskyla, FINLAND, e-mail: [email protected]

STeP’98 - Finnish AI Conference, 7-9 September, 1998

Page 2:

Finland and Ukraine

University of Jyväskylä, Finland

State Technical University of Radioelectronics, Kharkov, Ukraine

Page 3:

Metaintelligence Laboratory: Research Topics

• Knowledge and metaknowledge engineering;

• Multiple experts;

• Context in Artificial Intelligence;

• Data Mining and Knowledge Discovery;

• Temporal Reasoning;

• Metamathematics;

• Semantic Balance and Medical Applications;

• Distance Education and Virtual Universities.

Page 4:

Contents

• What is Knowledge Discovery ?

• The Multiple Classifiers Problem

• A Sample (Training) Set

• A Sliding Exam of Classifiers as a Learning Technique

• A Locality Principle

• Nearest Neighbours and Distance Measure

• Weighting Neighbours, Predicting Errors and Selecting Classifiers

• Data Preprocessing

• Some Examples

Page 5:

What is Knowledge Discovery ?

• Knowledge discovery in databases (KDD) is a combination of data warehousing, decision support, and data mining, and it is an innovative approach to information management.

• KDD is an emerging area that considers the process of finding previously unknown and potentially interesting patterns and relations in large databases*.

* Fayyad, U., Piatetsky-Shapiro, G., Smyth, P., Uthurusamy, R., Advances in Knowledge Discovery and Data Mining, AAAI/MIT Press, 1996.

Page 6:

The Research Problem

During the past several years, in a variety of application domains, researchers in machine learning, computational learning theory, pattern recognition, and statistics have combined efforts to learn how to create and combine an ensemble of classifiers.

The primary goal of combining several classifiers is to obtain a more accurate prediction than can be obtained from any single classifier alone.

Page 7:

Approaches to Integrate Multiple Classifiers

Integrating Multiple Classifiers:

• Selection
  – Global (Static)
  – Local (Dynamic)

• Combination
  – Global (Voting-Type)
  – Local (“Virtual” Classifier)
  – Decontextualization

Page 8:

Classification Problem

Given: n training pairs $(x_i, y_i)$ with $x_i \in \mathbb{R}^p$ and $y_i \in \{1, \dots, J\}$ denoting class membership.

Goal: given a new $x_0$, select a classifier for $x_0$ and predict its class $y_0$.

(J classes, n training observations, p object features)

[Figure: the training set of classified vectors is used to build the classifiers, which map a new vector to its class membership]

Page 9:

A Sample (Training) Set

[Figure: training points $P_i$ with coordinates $(x_{1i}, x_{2i})$ plotted in the (X1, X2) plane, each labelled with its class $C_i$]

$P_1: (x_{11}, x_{21}) \Rightarrow C_1;$
$P_2: (x_{12}, x_{22}) \Rightarrow C_2;$
$\dots$
$P_n: (x_{1n}, x_{2n}) \Rightarrow C_n.$

Page 10:

Classifiers Used in Example

• Classifier 1: LDA – Linear Discriminant Analysis;

• Classifier 2: k-NN – Nearest Neighbour Classification;

• Classifier 3: DANN – Discriminant Adaptive Nearest Neighbour Classification.
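For concreteness, a minimal sketch of such an ensemble, assuming scikit-learn; DANN (the discriminant adaptive nearest neighbour method of Hastie and Tibshirani) has no stock scikit-learn implementation, so a distance-weighted k-NN stands in for it here:

# A minimal sketch of the three-classifier ensemble, assuming scikit-learn.
# DANN has no stock scikit-learn class; a distance-weighted k-NN stands in.
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

classifiers = [
    LinearDiscriminantAnalysis(),                              # Classifier 1: LDA
    KNeighborsClassifier(n_neighbors=5),                       # Classifier 2: k-NN
    KNeighborsClassifier(n_neighbors=5, weights="distance"),   # stand-in for DANN
]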

Page 11:

A Sliding Exam of Classifiers (Jackknife Method)

We apply all the classifiers to the training set points and check the correctness of classification; a code sketch of the exam follows the figure below.

[Figure: training points in the (X1, X2) plane, each annotated with an error vector. For example, (1;1;0) means: LDA – incorrect classification, k-NN – incorrect classification, DANN – correct classification.]
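A minimal sketch of this sliding (leave-one-out) exam, assuming NumPy arrays and classifiers with scikit-learn-style fit/predict methods:

import numpy as np

def sliding_exam(classifiers, X, y):
    # For every training point: retrain each classifier on the remaining
    # points and record 1 if the held-out point is misclassified.
    n = len(X)
    errors = np.zeros((n, len(classifiers)), dtype=int)
    for i in range(n):
        mask = np.arange(n) != i                      # hold out point i
        for j, clf in enumerate(classifiers):
            clf.fit(X[mask], y[mask])
            pred = clf.predict(X[i:i + 1])[0]
            errors[i, j] = int(pred != y[i])
    return errors   # row i is the error vector of point i, e.g. (1, 1, 0)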

Page 12:

A Locality Principle

[Figure: the same annotated training points in the (X1, X2) plane, with a neighbourhood drawn around one point]

We assume that in the neighbourhood of a point we may also expect the same classification result: LDA – incorrect classification, k-NN – incorrect classification, DANN – correct classification.

Page 13:

Selecting the Number of Nearest Neighbours

• A suitable number l of nearest neighbours should be selected for each training set point; these neighbours are then used to classify a new case related to that point.

• In the example we used l = max(3, n div 50) for all training set points, where n is the number of cases in the training set; a code sketch of this heuristic follows below.

• Open question: should an appropriate l value be selected locally?
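The heuristic from the second bullet as a one-line Python function:

def neighbourhood_size(n):
    # l = max(3, n div 50): at least 3 neighbours, about 2% of the set.
    return max(3, n // 50)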

Page 14:

Brief Review of Distance Functions According to D. Wilson and T. Martinez (1997)
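As one example from that review, here is a minimal sketch of the Heterogeneous Euclidean-Overlap Metric (HEOM); the integer coding of nominal attributes and the ranges vector (max minus min per numeric attribute, set to 1 at nominal positions) are assumptions of this sketch:

import numpy as np

def heom(a, b, is_nominal, ranges):
    # Nominal attributes contribute a 0/1 overlap distance; numeric
    # attributes contribute a range-normalised absolute difference.
    # `ranges` must be nonzero (use 1 at nominal positions).
    per_attr = np.where(is_nominal,
                        (a != b).astype(float),
                        np.abs(a - b) / ranges)
    return float(np.sqrt(np.sum(per_attr ** 2)))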

Page 15:

Weighting Neighbours

[Figure: a new point $P_i$ with its nearest neighbours NN1, NN2, NN3 at distances $d_1$, $d_2$, $d_3$, and the maximum distance $d_{\max}$]

The values of the distance measure are used to derive the weight $w_k$ for each of the selected neighbours, $k = 1, \dots, l$, using, for example, a cubic function:

$w_k = \left(1 - (d_k / d_{\max})^3\right)^3$

Page 16:

Nearest Neighbours’ Weights in the Example

[Figure: the same neighbourhood, now showing the computed weights]

k = 3; d1 = 2.1; d2 = 3.2; d3 = 4.3; dmax = 6
w1 = 0.88; w2 = 0.61; w3 = 0.25
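A quick Python check that the cubic function reproduces these weights:

def weight(d, d_max):
    # Cubic weighting from the slides: w = (1 - (d / d_max)^3)^3.
    return (1 - (d / d_max) ** 3) ** 3

print([round(weight(d, 6.0), 2) for d in (2.1, 3.2, 4.3)])
# -> [0.88, 0.61, 0.25]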

Page 17:

Selection of a Classifier

[Figure: the neighbourhood of the new point, now annotated with the predicted error vector (0.3; 0.6; 0)]

Predicted classification errors:

$q_j^* = \frac{1}{k} \sum_{i=1}^{k} w_i q_{ij}, \quad j = 1, \dots, m.$

q* = (0.3; 0.6; 0). DANN should be selected.
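A minimal sketch of this selection step; the 0/1 neighbour error vectors below are an assumption chosen to be consistent with the quoted q* = (0.3; 0.6; 0):

import numpy as np

def select_classifier(weights, nn_errors):
    # q*_j = (sum_i w_i * q_ij) / k over the k nearest neighbours;
    # the classifier with the lowest predicted error is selected.
    w = np.asarray(weights, dtype=float)
    q = np.asarray(nn_errors, dtype=float)       # shape (k, m)
    q_star = w @ q / len(w)
    return q_star, int(np.argmin(q_star))

q_star, best = select_classifier(
    [0.88, 0.61, 0.25],                          # weights from the example
    [[1, 1, 0], [0, 1, 0], [0, 1, 0]])           # assumed (LDA; k-NN; DANN) errors
print(np.round(q_star, 1), best)                 # [0.3 0.6 0. ] 2 -> DANN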

Page 18:

Competence Map of Classifiers

[Figure: the (X1, X2) plane partitioned into regions of competence, each labelled with the classifier (LDA, k-NN, or DANN) that is locally most accurate]
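How such a map could be derived is sketched below, combining the sliding-exam error matrix with the cubic weighting; the neighbourhood radius d_max = 2 × the largest neighbour distance is an assumption of this sketch, not a choice from the slides:

import numpy as np

def competence_map(X, errors, l=3):
    # Label each training point with the classifier whose locally
    # predicted error (weighted over l nearest neighbours) is lowest.
    n, m = errors.shape
    best = np.empty(n, dtype=int)
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                            # exclude the point itself
        nn = np.argsort(d)[:l]                   # l nearest neighbours
        d_max = 2.0 * d[nn].max()                # assumed neighbourhood radius
        w = (1 - (d[nn] / d_max) ** 3) ** 3      # cubic weighting
        q_star = w @ errors[nn] / l              # predicted error per classifier
        best[i] = int(np.argmin(q_star))
    return best                                  # e.g. 0=LDA, 1=k-NN, 2=DANN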

Page 19:

Data Preprocessing: Selecting Set of Features

Six feature-selection methods are compared: PCM, AFS+LDA, AFS+s-by-sDA, AFS+LDA by optimal scoring, AFS+FDA, and AFS+PDA. Each method i produces a subsystem of features p'_i, which is evaluated by its count of classification errors F_i. The best subsystem p* is the one achieving

$F^* = \min_{1 \le i \le 6} F_i,$

and the corresponding method is concluded to be the best.
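A minimal sketch of this selection rule; the error counts are hypothetical:

def best_subsystem(subsystems, errors):
    # F* = min_i F_i: pick the feature subsystem with the fewest errors.
    i_best = min(range(len(errors)), key=errors.__getitem__)
    return subsystems[i_best], errors[i_best]

methods = ["PCM", "AFS+LDA", "AFS+s-by-sDA",
           "AFS+LDA by optimal scoring", "AFS+FDA", "AFS+PDA"]
p_star, f_star = best_subsystem(methods, [12, 9, 15, 8, 11, 10])  # hypothetical F_i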

Page 20:

Features Used in Dystonia Diagnostics

• AF (x1) - attack frequency;

• AM0 (x2) - the mode, the index of sympathetic tone;

• dX (x3) - the index of parasympathetic tone;

• IVR (x4) - the index of autonomous reactance;

• V (x5) - the velocity of brain blood circulation;

• GPVR (x6) - the general peripheral blood-vessels’ resistance;

• RP (x7) - the index of brain vessels’ resistance.

Page 21:

Training Set for Dystonia Diagnostics

Page 22:

Visualizing Training Set for the Dystonia Example

Page 23:

Evaluation of Classifiers

Page 24:

Diagnostics of the Test Vector

Page 25:

Experiments with Heart Disease Database

• The database contains 270 instances. Each instance has 13 attributes, extracted from a larger set of 75 attributes.

The average cross-validation errors for the three classifiers and the integrated approach were the following:

DANN 0.196,

k-NN 0.352,

LDA 0.156,

Dynamic Classifier Selection Method 0.08

Page 26:

Experiments with Liver Disorders Database

• The database contains 345 instances, each with 7 numerical attributes.

The average cross-validation errors for the three classifiers and the integrated approach were the following:

DANN 0.333,

k-NN 0.365,

LDA 0.351,

Dynamic Classifier Selection Method 0.134

Page 27:

Experimental Comparison of Three Integration Techniques

[Figure: two learning-curve plots. “Liver learning curves”: accuracy 0.5–0.7 versus training set size 50–250. “Heart learning curves”: accuracy 0.77–0.85 versus training set size 100–200. Each plot shows curves for Voting, CVM, and DCS.]

Local (Dynamic) Classifier Selection (DCS) is compared with Voting and static Cross-Validation Majority (CVM).

Page 28:

Conclusion and Future Work

• Classifiers can be effectively selected or integrated thanks to the locality principle.

• The same principle can be used when preprocessing data.

• The number of nearest neighbours and the choice of distance measure are best decided separately for each case.

• The differences between classification results obtained in different contexts can be used to improve classification by exploiting possible trends.