
Lab 2. Decision (Classification) Tree (web.pdx.edu/~nauna/Lab2.pdf)

• Represented as a set of hierarchically-arranged decision rules (i.e., tree-branch-leaf)

• Can be generated by knowledge engineering, neural networks, or statistical methods.

• S-Plus Tree Models: built by successively splitting the data to form homogeneous subsets (a sketch of the idea follows).
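
The lab does not list the actual S-Plus commands, so the following is only a minimal sketch of the same idea in Python (scikit-learn's DecisionTreeClassifier and its bundled iris table are stand-ins, not part of the lab): a classification tree is grown by successively splitting the data into more homogeneous subsets, then printed as hierarchically-arranged rules (root, branches, leaves).

    # Minimal sketch, not the S-Plus code from the lab: grow a classification
    # tree by successive splits into more homogeneous subsets, then print it
    # as a hierarchy of decision rules.
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, export_text

    iris = load_iris()                               # any labelled table would do
    clf = DecisionTreeClassifier(max_depth=3, random_state=0)
    clf.fit(iris.data, iris.target)                  # each split increases subset purity

    # Tree printed as root -> branch -> leaf decision rules.
    print(export_text(clf, feature_names=list(iris.feature_names)))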


Classification Example


• The expert classification software provides a rules-based approach to multispectral image classification, post-classification refinement, and GIS modeling. In essence, an expert classification system is a hierarchy of rules, or a decision tree, that describes the conditions under which a set of low-level constituent information is abstracted into a set of high-level informational classes. The constituent information consists of user-defined variables and includes raster imagery, vector layers, spatial models, external programs, and simple scalars.

• A rule is a conditional statement, or list of conditional statements, about a variable's data values and/or attributes that determines an informational component or hypothesis. Multiple rules and hypotheses can be linked together into a hierarchy that ultimately describes a final set of target informational classes, or terminal hypotheses. Confidence values associated with each condition are also combined to provide a confidence image corresponding to the final output classified image (a sketch of this structure follows).
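
As a rough illustration of this structure (not ERDAS IMAGINE's actual knowledge-base format or API, and with made-up variable names and thresholds), a hierarchy of hypotheses, rules, and conditions with confidences could be represented like this:

    # Illustrative sketch only: a hypothesis is supported by rules, each rule
    # is a list of conditions on named variables, and each condition carries
    # an expert-assigned confidence value.
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Condition:
        variable: str                      # raster layer, vector attribute, scalar, ...
        test: Callable[[float], bool]      # predicate over the variable's value
        confidence: float                  # confidence placed in this condition

    @dataclass
    class Rule:
        conditions: List[Condition]        # all conditions must hold for the rule to fire

    @dataclass
    class Hypothesis:
        name: str                          # informational class (terminal hypothesis)
        rules: List[Rule]                  # any firing rule supports the hypothesis

    # Hypothetical example: "irrigated cropland" supported by one two-condition rule.
    cropland = Hypothesis(
        name="irrigated cropland",
        rules=[Rule(conditions=[
            Condition("slope", lambda v: v < 5.0, confidence=0.9),
            Condition("ndvi",  lambda v: v > 0.4, confidence=0.8),
        ])],
    )
    print(cropland.name, "has", len(cropland.rules[0].conditions), "conditions")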


• Hypotheses are evaluated using rules: if one or more rules are true, then that hypothesis may be true at that particular location.

• To determine whether a rule is true, it is evaluated against the input variables. For instance, a rule could be that slopes must be gentle (less than 5 degrees).

• A variable determining the slope at every location is required to evaluate this. It could be in the form of an existing image specifying slope angles, it could come from a spatial model calculating slope on the fly from an input DEM, or it could even be an external program.

• Variables can also be defined from vectors and scalars.

• If the variable's value satisfies the rule, this (combined with other satisfied rules) indicates that the hypothesis (class allocation) is true.

• The ability to handle uncertainty is of vital importance to the knowledge base. The expert places a confidence in each rule, and as multiple rules are triggered within a tree, the Knowledge Classifier combines the confidences (a sketch of this evaluation follows these bullets).

• Several rules could be true at a particular location; the one with the highest confidence is the most likely class for that pixel.
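
A minimal per-pixel sketch of the evaluation described in these bullets is given below. The two small rasters, the thresholds, and the choice to multiply condition confidences are all assumptions for illustration; the lab does not state how the Knowledge Classifier actually combines confidences.

    # Per-pixel sketch: evaluate each hypothesis's rules on raster variables
    # and keep the class whose satisfied rule has the highest combined
    # confidence (multiplying confidences is an assumption).
    import numpy as np

    slope = np.array([[2.0, 8.0], [4.0, 1.0]])   # e.g. slope derived from a DEM
    ndvi  = np.array([[0.6, 0.5], [0.2, 0.7]])

    # hypothesis -> list of rules; each rule -> list of (condition mask, confidence)
    hypotheses = {
        "cropland": [[(slope < 5.0, 0.9), (ndvi > 0.4, 0.8)]],
        "bare":     [[(ndvi < 0.3, 0.7)]],
    }

    best_conf  = np.zeros(slope.shape)
    best_class = np.full(slope.shape, "unclassified", dtype=object)

    for name, rules in hypotheses.items():
        for rule in rules:
            fires = np.logical_and.reduce([mask for mask, _ in rule])  # all conditions true
            conf  = float(np.prod([c for _, c in rule]))               # combined confidence
            take  = fires & (conf > best_conf)
            best_conf[take]  = conf
            best_class[take] = name

    print(best_class)   # highest-confidence class per pixel
    print(best_conf)    # corresponding confidence "image"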


KEY FEATURES:

• Graphical drag-and-drop tool for building the knowledge tree

• Confidence value definition and propagation (handling of uncertainty)

• Ability to use variables from various sources including: images, vectors, scalars, graphical models, and even user-defined programs

• Ability to include prompts for particular data files and variables, enabling the creation of portable knowledge bases

• Use of spatial operators (as opposed to traditional per-pixel classifiers) via Model Maker (see the sketch after this list)

• Multiple AND-ing or OR-ing of rules through the construction of the tree branches horizontally or vertically

• Pathway cursor for quick feedback on classification results to aid in developing and fine-tuning a knowledge base

• Access to existing ERDAS IMAGINE tools, such as Model Maker for defining spectral/spatial operators, shortens the learning curve
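
The "spatial operators" item above can be illustrated with a small sketch: deriving a slope image on the fly from a DEM, the kind of variable a Model Maker graphical model could supply to a rule, rather than a value read per pixel from an input band. The DEM values and the 30 m cell size below are made up.

    # Sketch of a spatial operator: compute slope (degrees) from a DEM so it
    # can feed a rule such as "slopes must be gentle (< 5 degrees)".
    import numpy as np

    dem = np.array([[100., 102., 105.],
                    [101., 104., 108.],
                    [103., 107., 112.]])      # elevations in metres (made up)
    cell = 30.0                               # assumed cell size in metres

    dz_dy, dz_dx = np.gradient(dem, cell)     # finite-difference gradients
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

    print(slope_deg)          # slope image usable as a rule variable
    print(slope_deg < 5.0)    # mask for the "gentle slope" condition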


LIMITATIONS:

• While the Expert Classification approach does enable ancillary data layers to be taken into consideration, it is still not truly an object-based means of image classification (rules are still evaluated on a pixel-by-pixel basis).

• Additionally, building the models is extremely user-intensive: an expert in the morphology of the features to be extracted is required, and that expertise must then be turned into graphical models and programs that feed complex rules, all of which must be built up from the available components.

• Even once a knowledge base has been constructed, it may not be easily transportable to other images (different locations, dates, etc.).


KNOWLEDGE CLASSIFIER:

With a previously created expert knowledge base, a less experienced user may use the Knowledge Classifier to apply the knowledge base to data and perform a classification. This program is designed with a simple, user-friendly wizard interface.

FEATURES:

• Wizard interface allows non-experts to apply the knowledge base to their own data

• Evaluation of all possible classification classes, or evaluation customized by a subset of rules

• Identification of missing files and automatic prompting to find them (see the sketch after this list)

• Options to output fuzzy sets and confidence layers, as well as a classification
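
As a rough sketch of how a portable knowledge base might be applied to a user's own data, the snippet below stores the rules as JSON and binds variable names to the user's rasters at run time, reporting any missing inputs in the spirit of the wizard's prompting. The JSON layout, variable names, and values are stand-ins; the real ERDAS IMAGINE knowledge-base format is not shown here.

    # Illustrative sketch only: apply a JSON "knowledge base" to user data,
    # reporting missing variables instead of prompting for them.
    import json
    import numpy as np

    knowledge_base = json.loads("""
    {
      "gentle terrain": [
        {"variable": "slope", "op": "<", "value": 5.0, "confidence": 0.9}
      ],
      "dense vegetation": [
        {"variable": "ndvi", "op": ">", "value": 0.6, "confidence": 0.8}
      ]
    }
    """)

    # The non-expert supplies their own data for each named variable.
    user_variables = {"slope": np.array([[2.0, 8.0], [4.0, 1.0]])}
    ops = {"<": np.less, ">": np.greater}

    for class_name, conditions in knowledge_base.items():
        missing = [c["variable"] for c in conditions if c["variable"] not in user_variables]
        if missing:
            print(f"{class_name}: please supply {missing}")   # wizard would prompt here
            continue
        mask = np.logical_and.reduce(
            [ops[c["op"]](user_variables[c["variable"]], c["value"]) for c in conditions]
        )
        conf = float(np.prod([c["confidence"] for c in conditions]))
        print(class_name, "mask:\n", mask, "\nconfidence:", conf)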


• In the Knowledge Classifier, the parameters are entered in attribute boxes and the rule is told to perform the intersection. Most notably, it is all done at once (a sketch of this intersection step follows these bullets).

• Any number of parameters, attributes, and rules can be used, depending on the complexity of the model. For the purposes of this example, this became the base data needed to continue the work of identifying native and non-native vegetation.

• Through this process, the scrub/shrub and evergreen forest classes are redefined into the following 36 descriptive classes.

• The descriptive classes represent the data that are often referred to in Hawaii and allow us to integrate other sources of information into our investigations.
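
As an invented illustration of that intersection step (the ancillary attributes, thresholds, and resulting class names below are made up and are not the actual 36 Hawaii classes), a broad land-cover class can be intersected with attribute conditions to yield more descriptive classes:

    # Invented example: intersect a broad class with ancillary attributes
    # (elevation, rainfall) to derive more descriptive classes.
    import numpy as np

    landcover = np.array(["evergreen forest", "scrub/shrub", "evergreen forest"])
    elevation = np.array([1200.0, 300.0, 250.0])     # metres (made up)
    rainfall  = np.array([3000.0, 600.0, 2500.0])    # mm/year (made up)

    # Each descriptive class is the intersection (AND) of parameter conditions.
    descriptive = np.where(
        (landcover == "evergreen forest") & (elevation > 1000) & (rainfall > 2000),
        "montane wet forest",
        np.where(
            (landcover == "evergreen forest") & (elevation <= 1000),
            "lowland forest",
            np.where(
                (landcover == "scrub/shrub") & (rainfall < 1000),
                "dry scrub",
                "other",
            ),
        ),
    )
    print(descriptive)   # ['montane wet forest' 'dry scrub' 'lowland forest']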


Examples of some of the new classes that were formed: