
An Object Oriented Algorithm for Extracting Geographic Information from Remotely Sensed Data

Zachary J. Bortolot

Assistant Professor of Geography

Department of Integrated Science and Technology


Texture-based vs. Object-based image processing

Basis of analysis
  Texture-based: statistical calculations performed on pixels in a moving window
  Object-based: objects (connected pixels representing features) identified in the image

Integration with ground data
  Texture-based: easy
  Object-based: difficult

Performance when objects are difficult to recognize
  Texture-based: variable
  Object-based: poor

Suitability for extracting attribute data
  Texture-based: good
  Object-based: often poor (except for location)

Utilization of known properties of features of interest
  Texture-based: poor
  Object-based: good

Objective:

To create an object-oriented algorithm that incorporates the strengths of the texture-based approach.

To meet this objective, the Blacklight algorithm was created.

Three versions exist:

- Three-band passive optical
- Panchromatic passive optical
- Three-band passive optical plus LiDAR

The project setup window for the version that uses LiDAR and passive optical imagery

The spreadsheet contains measurements of the phenomenon to be mapped, collected on the ground with a GPS unit or through image interpretation.

In this case the data are trees per hectare for a forest.

Next, the user adjusts sliders to identify objects he or she thinks may be related to the attribute of interest.

The sliders work by creating a linear equation based on a series of images created using simple image processing operations.

This equation should maximize the response to the object and minimize the response to the background.

If the equation value is greater than 0, a pixel is considered to be part of the object.

In this case the equation is:

Threshold = 1  if 0.1·DN_red + 2.0·DN_albedo + 0.1·DN_contrast_r + 2.0·DN_height − 1.17 > 0
            0  otherwise
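As a rough illustration, the thresholding step can be sketched as follows (a minimal NumPy sketch; the band names, function name, and data layout are assumptions, not Blacklight's actual code):

```python
import numpy as np

def object_mask(bands, weights, intercept):
    """Evaluate a slider-defined linear equation over derived images.

    bands:   dict of same-shaped 2D arrays produced by simple image
             processing operations (e.g. red DN, albedo, contrast, height).
    weights: dict mapping the same keys to slider coefficients.
    Pixels where the equation value exceeds 0 are flagged as object pixels.
    """
    score = np.full(next(iter(bands.values())).shape, float(intercept))
    for name, image in bands.items():
        score += weights[name] * image
    return score > 0  # boolean object mask

# Hypothetical usage mirroring the equation above:
# mask = object_mask(
#     {"red": dn_red, "albedo": dn_albedo,
#      "contrast_r": dn_contrast_r, "height": dn_height},
#     {"red": 0.1, "albedo": 2.0, "contrast_r": 0.1, "height": 2.0},
#     intercept=-1.17)
```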

Once the objects have been initially identified, metrics can be calculated based on the object.

For example:

The percentage of the plot taken up by objects.

The percentage of the object pixels that are core pixels.

These metrics are used in a regression equation to predict the measured attribute.

[Figure: an example object; the shaded interior cells are core pixels. Percent core = 12.5%]
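The two example metrics can be computed from a boolean object mask roughly as follows (a sketch that assumes a "core pixel" is an object pixel whose eight neighbours are also object pixels; the slides do not define the term precisely):

```python
import numpy as np
from scipy.ndimage import binary_erosion

def object_metrics(mask):
    """Compute the two example metrics from a boolean object mask."""
    n_object = mask.sum()
    percent_object = 100.0 * n_object / mask.size
    # A 3x3 erosion keeps only pixels whose full neighbourhood is object,
    # which is one plausible reading of "core pixels".
    core = binary_erosion(mask, structure=np.ones((3, 3), dtype=bool))
    percent_core = 100.0 * core.sum() / n_object if n_object else 0.0
    return percent_object, percent_core
```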

To improve the prediction accuracy, an optimization procedure is run which adjusts the sliders.
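The slides do not name the optimizer, but one way to implement such a procedure is a derivative-free search over the slider coefficients that minimizes the regression RMSE against the field plots, reusing the two sketches above (all helper names are hypothetical):

```python
import numpy as np
from scipy.optimize import minimize

def rmse_for_coeffs(coeffs, plot_bands, measured):
    """Regression RMSE of the object metrics against the field data.

    plot_bands: one band dictionary per field plot (see object_mask above);
    measured:   1D array of ground measurements (e.g. trees per hectare).
    """
    weights = {"red": coeffs[0], "albedo": coeffs[1],
               "contrast_r": coeffs[2], "height": coeffs[3]}
    rows = [object_metrics(object_mask(b, weights, coeffs[4]))
            for b in plot_bands]
    X = np.column_stack([np.ones(len(rows)), np.asarray(rows)])
    beta, *_ = np.linalg.lstsq(X, measured, rcond=None)
    return float(np.sqrt(np.mean((X @ beta - measured) ** 2)))

# Nelder-Mead tolerates the non-smooth objective the threshold creates:
# fit = minimize(rmse_for_coeffs, x0=[0.1, 2.0, 0.1, 2.0, -1.17],
#                args=(plot_bands, measured), method="Nelder-Mead")
```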

The values image: in this case, the number of trees per hectare in the area under the crosshairs is 1810.47.

A map showing the phenomenon over the whole area of interest. Clicking on a pixel brings up the estimated value at that location.

Tests

Test 1: Mapping forests

Test 2: Mapping urban features

Test 3: Mapping population density

Test 4: Mapping vehicle density

Test 1: Mapping forests

Remotely sensed data: 0.5m color infrared orthophotograph

Normalized DSM with a 1m resolution, obtained from DATIS II LiDAR data with a 1m point spacing.

Reference data: 10 circular plots with a 15 m radius placed in 11- to 16-year-old, non-intensively managed loblolly pine plantations at the Appomattox-Buckingham State Forest in central Virginia. The following values were measured:

Trees per hectare

Biomass

Plot data

Attribute           Minimum   Maximum   Mean   Standard deviation
Trees per hectare   1118      1966      1566   359
Biomass (Mg/ha)     61        136       90     22

Predicted attribute   R²     RMSE (% of mean)
Trees per hectare     0.79   164 (10%)
Biomass               0.64   13 (14%)

Results

Trees per hectare

This map was produced by averaging the predicted values in each stand (a per-stand zonal mean; see the sketch at the end of these results).

This map was produced by segmenting the predicted trees-per-hectare output from Blacklight using the SARSEG module in PCI Geomatica.

Biomass

This map was produced by averaging the predicted values in each stand.

This map was produced by segmenting the output from Blacklight using the SARSEG module in PCI Geomatica.
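For reference, the per-stand averaging used for the maps above can be expressed as a simple zonal mean (a sketch assuming a hypothetical raster of stand IDs aligned with the prediction raster):

```python
import numpy as np

def stand_means(predicted, stand_ids):
    """Zonal mean: average predicted pixel values within each stand.

    predicted: 2D array of per-pixel Blacklight predictions.
    stand_ids: 2D integer array labelling pixels by stand (0 = no stand).
    Both rasters are hypothetical inputs with identical shapes.
    """
    out = np.full_like(predicted, np.nan, dtype=float)
    for sid in np.unique(stand_ids):
        if sid == 0:
            continue  # skip background pixels outside any stand
        sel = stand_ids == sid
        out[sel] = predicted[sel].mean()
    return out
```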

Test 2: Mapping the urban environment

Imagery: 1m normal color USDA NAIP data of Morehead, Kentucky from 2004.

Reference data: 25 randomly selected 100 × 100 m plots in which the following were calculated based on photointerpretation:

Percent impervious

Percent tree cover

Percent grass

Photointerpreted data

Attribute            Minimum   Maximum   Mean   Standard deviation
Percent impervious   0.0       99.1      32.6   34.4
Percent forest       0.0       100.0     54.0   41.7
Percent grass        0.0       51.5      7.4    12.1

Predicted attribute   R²     RMSE
Percent impervious    0.94   8.13
Percent forest        0.94   10.52
Percent grass         0.86   4.48

Results

Percent impervious

The values image: in this case, 91.7% of the cell is estimated to contain impervious surfaces.

Percent forest

Percent grass

Test 3: Population density

Imagery: 1m normal color USDA NAIP data of Harrisonburg, VA from 2003.

Reference data: US Census data from 2000. 20 census blocks were randomly selected, and 50 × 50 m areas at the center of each block were used for processing.

Mapping population density in this way would be useful in developing countries that lack recent, reliable census data.

Reference data

Attribute      Minimum   Maximum   Mean   Standard deviation
People / km²   0         3699      1091   1204

Predicted attribute   R²     RMSE (% of mean)
People / km²          0.66   707 (65%)

Results

Population density

This map was produced by averaging the predicted values in each census tract.

This map was produced by segmenting the output from Blacklight using the SARSEG module in PCI Geomatica.

Test 4: Vehicle density

Imagery: 6-inch (≈15 cm) normal color Virginia Base Map Program data of Harrisonburg, VA from 2006.

Reference data: Photointerpreted vehicles per acre

Photointerpreted data

Attribute         Minimum   Maximum   Mean   Standard deviation
Vehicles / acre   0         44        8      13

Predicted attribute   R²     RMSE (% of mean)
Vehicles / acre       0.55   8.9 (111%)

Results

Vehicle density

Conclusions

The algorithm shows promise in multiple types of analysis

Planned improvements:

Additional image processing functions

Better LiDAR integration

Additional object metrics

Ability to select metrics based on a stepwise approach

Would you like to test Blacklight?

If so, I would like to hear from you!

Zachary J. Bortolot

bortolzj@jmu.edu