
Read everything before doing anything with the software.

Accurate elevation and detailed image data are among the most important geospatial data layers needed to support economic development and to make better decisions about flood protection, emergency services, and managing Pennsylvania’s resources. Pennsylvania needs to rebuild “PAMAP” (the Commonwealth’s official digital map) for two reasons: (1) many natural changes and human modifications have occurred across the state’s terrain since it was last mapped (back in 2008); and (2) LiDAR and aerial imaging technologies have improved since 2008 to produce much better resolution products at lower costs. Decisions made with obsolete data can have costly consequences, especially when and where floods occur (Fig. 1).

Figure 1: FEMA’s 100-year floodplain near Johnstown, PA, is displayed with some 3D buildings. Building colors show dollar loss estimates, with higher values in red and orange. Source: Cambria County GIS Center.

So, what is lidar?

LiDAR is a geospatial technology that makes precise distance measurements using laser light (Bolstad, 2019; Abdullah, 2016; NEON Science, 2014; O’Neil-Dunne, 2013). The word “LiDAR” is an acronym for Light Detection and Ranging. When combined with high-precision GNSS (for sensor position) and an IMU (for sensor orientation) (see Figures 2 and 3), LiDAR data can be used to derive digital surface models (DSM), tree canopy models, building models, and bare-earth digital elevation models (DEM). LiDAR systems come in different flavors (e.g., bathymetric vs. terrestrial) and can produce different quality levels (QL) of data (Table 1). LiDAR data support all kinds of geologic, ecologic, and archeologic work (e.g., WDNR, 2019; BEG, 2014; Asner, 2013).


Figure 2: Airborne LiDAR diagram: 1. During flight, the aircraft uses GNSS hardware and software to constantly measure its location, altitude, and time, and to simultaneously ingest and apply GNSS corrections in real time. 2. Simultaneously, the aircraft uses an Inertial Measurement Unit (IMU) to measure aircraft pitch, roll, and yaw. 3. With aircraft position and orientation known, laser light pulses can be emitted from a precisely known point, along a precisely known trajectory, and at a precisely known moment (see NEON Science, 2014).

Table 1: LiDAR Quality Level (QL) characteristics (from USGS, 2018: p23-24).

Quality Level | Nominal pulse spacing (m) | Nominal pulse density (pulses/sq. m) | NVA @ 95% c.i. (m) | VVA @ 95% c.i. (m) | Minimum DEM cell size (m)
QL0 | ≤ 0.35 | ≥ 8.0 | ≤ 0.10 | ≤ 0.15 | 0.5
QL1 | ≤ 0.35 | ≥ 8.0 | ≤ 0.20 | ≤ 0.30 | 0.5
QL2 | ≤ 0.71 | ≥ 2.0 | ≤ 0.20 | ≤ 0.30 | 1.0
QL3 | ≤ 1.41 | ≥ 0.5 | ≤ 0.40 | ≤ 0.60 | 2.0

DEM = digital elevation model; NVA = non-vegetated vertical accuracy; VVA = vegetated vertical accuracy.
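Table 1 is easy to encode if you want to check your own work against the published thresholds later in the lab. Below is a minimal Python sketch (not part of the lab deliverables); the dictionary simply restates Table 1, and the function name is hypothetical.

```python
# USGS LiDAR quality-level thresholds, copied from Table 1 above.
# Tuple order: (max pulse spacing m, min pulse density pts/sq. m,
#               max NVA m, max VVA m, minimum DEM cell size m)
USGS_QL_SPECS = {
    "QL0": (0.35, 8.0, 0.10, 0.15, 0.5),
    "QL1": (0.35, 8.0, 0.20, 0.30, 0.5),
    "QL2": (0.71, 2.0, 0.20, 0.30, 1.0),
    "QL3": (1.41, 0.5, 0.40, 0.60, 2.0),
}

def minimum_dem_cell_size(quality_level: str) -> float:
    """Return the smallest DEM cell size (m) appropriate for a quality level."""
    return USGS_QL_SPECS[quality_level][4]

print(minimum_dem_cell_size("QL2"))  # prints 1.0 (meters)
```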


Figure 3: Near-infrared (NIR) laser pulses are emitted downward in a scanning motion from a moving aircraft and reflected back to the aircraft (at a forward position, of course) depending on the distance traveled, incident angle, type of reflector, and the amount of atmospheric scattering. (Left) Relatively smooth and solid objects on the ground can reflect most of the energy in each pulse as a single return; such returns are characterized as ‘first returns’ and as ‘high intensity’ returns. Building rooftops, bridges, paved roads, bare ground, and exposed bedrock surfaces typically generate high-intensity returns. (Right) Light pulses that are split, refracted, or partially absorbed may return to the forward aircraft position as one or multiple lower-intensity returns (1st, 2nd, …, nth). Trees, for example, can scatter light among their leaves and branches, so trees usually return multiple low-intensity signals from different elevations. (Both) In all cases, the time it takes each emitted laser pulse to travel from the aircraft’s emitting position and arrive at the aircraft’s forward sensing position is used (insert some geometry, trigonometry, and physics equations here) to find the 3-D position of the reflecting spot. Clear deep water (not shown in this figure) absorbs NIR energy (i.e., no pulse energy is returned to the aircraft), so open water bodies tend to create gaps or holes in LiDAR point cloud data whenever NIR LiDAR is used.
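To make that parenthetical a little more concrete, here is a minimal Python sketch of the ranging math under simplifying assumptions (straight-line propagation at the speed of light, sensor position and pulse direction already resolved from GNSS + IMU, atmospheric effects ignored). The variable names and example numbers are hypothetical, not taken from the lab data.

```python
C = 299_792_458.0  # speed of light (m/s); treating air as vacuum is a simplification

def slant_range(two_way_travel_time_s: float) -> float:
    """One-way distance to the reflecting spot from the round-trip pulse time."""
    return C * two_way_travel_time_s / 2.0

def reflector_xyz(sensor_xyz, unit_direction, two_way_travel_time_s):
    """3-D position of the reflecting spot = sensor position + range * pulse direction."""
    r = slant_range(two_way_travel_time_s)
    x0, y0, z0 = sensor_xyz
    dx, dy, dz = unit_direction
    return (x0 + r * dx, y0 + r * dy, z0 + r * dz)

# Example: a pulse fired straight down from about 2,000 m above the ground
# returns after roughly 13.34 microseconds.
print(reflector_xyz((325000.0, 4446000.0, 2300.0), (0.0, 0.0, -1.0), 13.34e-6))
```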

Pennsylvania’s PAMAP program

Between 2006 and 2008, Pennsylvania became one of the first states in the country to capture wall-to-wall aerial photography and QL3 LiDAR data, which were both considered excellent at the time. The QL3 data cost $15M ($17.4M in 2018 dollars) to collect and process, and have since informed billions of dollars of engineering, planning, environmental, and investment decisions. By pure coincidence, these data were collected and released at just the right time to support natural gas resource development, infrastructure and construction management, and natural resource conservation in Marcellus Shale regions. PAMAP LiDAR and image products are still among the most-downloaded datasets from PASDA despite the data no longer representing current conditions at many locations. Demand for updated LiDAR and aerial image products has been growing statewide.


Current status of LiDAR updates in Pennsylvania

Since 2014, the U.S. Geological Survey’s 3-D Elevation Program (3DEP; see USGS, 2019) has partially funded several QL2 LiDAR acquisitions in Pennsylvania (Figure 4) that support national flood hazard mitigation programs, transportation planning, and Chesapeake Bay Program compliance. The work is being accomplished, however, in piecemeal fashion; the individual acquisitions are not always coordinated with one another. Nevertheless, all the new data are eventually posted to PASDA and are free to download.

When federal agencies are involved, most new PA LiDAR data are spatially referenced to one of the two metric UTM grid zones (17N or 18N) and organized using square 1500-meter tiles that align with those grids (Quantum Spatial, Inc., 2018). The data are stored in *.LAS 1.4 format. The filename 18TUK325446.las, for example, indicates the tile is situated in zone 18N (N is assumed because … Pennsylvania) and anchored at 325000 m E, 4460000 m N. We’ll be working with four (4) LiDAR tiles collected in October 2018.
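If you are curious about what one of these LAS 1.4 tiles contains before you ever touch ArcGIS, the open-source laspy package can read the file header directly. This is an optional, hedged sketch: it assumes laspy 2.x is installed and that a tile sits at the path shown, so adjust both to your setup.

```python
import laspy  # assumes laspy 2.x (pip install laspy)

# Hypothetical path; point this at one of your four tiles.
las = laspy.read(r"C:\Geotemp\GIS3_Lab_WilliamsGrove\18TUK325446.las")

print("LAS version :", las.header.version)      # should report 1.4
print("Point count :", las.header.point_count)
print("Min X, Y, Z :", las.header.mins)          # UTM 18N meters if the metadata are right
print("Max X, Y, Z :", las.header.maxs)
```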

Figure 4: The current state of LiDAR data coverage for the Commonwealth. Cartography: Jeff Zimmerman @ Susquehanna River Basin Commission.


Current status of aerial imagery updates in Pennsylvania

High resolution aerial photography is usually collected in Pennsylvania on an ad hoc or a just-as-needed basis. Wealthy counties or cities can often afford to fly annual repeat photography to help them maintain or improve their spatial data. Meanwhile, depopulating rural counties may not have collected any new imagery since 2008. Fortunately, the Pennsylvania Emergency Management Agency (PEMA, 2018) has begun collecting new high resolution aerial imagery (6-inch cell size). Importantly, PEMA captured 4 spectral bands of information (R, G, B, and NIR) and not just the usual three (R, G, B). The four spectral bands can be used in different combinations to display a landscape in natural color (R, G, B), to display a landscape in false color infrared (NIR, R, G), or to derive useful vegetation indices like NDVI (Eq. 1).

NDVI = (NIR − R) / (NIR + R)      (Eq. 1)
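As a quick illustration of Eq. 1, the sketch below computes NDVI with numpy, assuming you have already pulled the NIR and red bands into arrays (for example with arcpy.RasterToNumPyArray or rasterio). The band handling and the tiny epsilon are assumptions for illustration only.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index (Eq. 1)."""
    nir = nir.astype("float32")  # the PEMA bands are 8-bit; promote before dividing
    red = red.astype("float32")
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero over water/shadow

# Values run from about -1 (water, shadow) toward +1 (dense, healthy vegetation).
```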

When only state agencies are involved, new image data in Pennsylvania are spatially referenced to one of the imperial State Plane coordinate grids (PA-North or PA-South) and broken into square 10,000-foot tiles that align with those grids. The image tiles are stored in *.jp2 format. The filename for tile 31002160PAS.jp2, for example, indicates the tile is situated in the PA-South zone and anchored at 310000 USft E, 2160000 USft N.¹

The purpose of this lab is to provide you with a hands-on opportunity to develop your knowledge and skills associated with Pennsylvania’s new aerial imagery and new QL2 LiDAR data, which includes visualizing the data in 3-D and deriving some commonly used datasets from them. We are not going to couch this lab in terms of a current event or some geoenvironmental problem; instead, we are just going to learn how to handle and geoprocess these new data. They will be used across Pennsylvania for years to come. You are in the very first cohort of GIS3 students to ever handle these new QL2 LiDAR data. Your objectives are to:

1. Prepare the new QL2 LiDAR and image data for 3-D visualization and terrain analysis.

2. Visualize your image and LiDAR data and derive some commonly used raster surfaces.

3. Build map figures for others to see what you discovered.

¹ Yes, the convention used for naming aerial image tiles could not be any more annoyingly different than the convention used for naming LiDAR point cloud files.


We’ll focus our attention on a small 9 sq. km (3.5 sq. mi) area centered near Williams Grove Speedway and Park (hereafter, just Williams Grove). Williams Grove hosts a working racetrack and an abandoned amusement park. Williams Grove is located along the Yellow Breeches Creek, along the Cumberland County-York County border, about 3 miles east of Boiling Springs, and about 3 miles south of Mechanicsburg Borough. Use Google Maps or a similar map service to find Williams Grove and to get yourself acquainted with the speedway and the surrounding area. Topographic relief is minimal, save the creek floodplain, but you’ll find multiple land uses and cover types.

You have been given a ZIP file that contains four (4) QL2 LiDAR tiles in *.LAS 1.4 format, one (1) shapefile that represents the image tiling system, and one (1) shapefile that represents the LiDAR tiling system.

Advice

The raw data alone consume a lot of disk space (approx. 2 GB unzipped). I strongly recommend taking advantage of your computer’s fast internal hard drive (the C:\Geotemp folder, best), an external hard drive with a USB3 connection (blue, 2nd best), or an external hard drive with a USB2 connection (white, 3rd best). Do not use thumb drives – their glacial read-write speeds will torture you. Do not use your network drive – you’ll fill it.

Also, I strongly recommend watching O’Neil-Dunne (2019) before starting any work, because most of the methods you need to handle LiDAR data in ArcGIS Pro are presented in his video, not in this handout.

Download the data for this lab and extract all ZIP files into your GIS3/Labs folder.

Start a new session of ArcGIS Pro

1. Use ArcGIS Pro’s Catalog template to create a new project in your GIS3/Labs/GIS3_Lab_WilliamsGrove folder, but uncheck the box for creating yet another new folder. Creating a new ArcGIS Pro project will create an ArcGIS Pro project file (*.aprx), a toolbox (*.tbx), and a file geodatabase (*.gdb), and it will automatically make your new geodatabase your default geodatabase.

2. To facilitate geoprocessing, we’ll use the spatial reference system (SRS) to which the LiDAR tiles are already tied: NAD83 (2011) UTM Zone 18N, metric. That means setting the coordinate system of our map(s) and using the Toolbox’s Environment Settings to specify output coordinate systems accordingly (a scripted version of this setting appears after step 3).


3. Watch O’Neil-Dunne (2019) before continuing.
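If you prefer to set the environment once in the Python window rather than in each tool dialog, a minimal arcpy sketch follows. It assumes the WKID for NAD83 (2011) UTM Zone 18N is 6347 and that your project lives at the path shown, so verify both before relying on it.

```python
import arcpy

# Hypothetical project folder; substitute your own GIS3_Lab_WilliamsGrove path.
arcpy.env.workspace = r"C:\Geotemp\GIS3_Lab_WilliamsGrove"

# NAD83 (2011) UTM Zone 18N, meters (assumed WKID 6347); all tool outputs inherit this SRS.
arcpy.env.outputCoordinateSystem = arcpy.SpatialReference(6347)
```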

In ArcGIS Pro > Catalog view

4. Build a new LAS dataset (in your project folder, not in your geodatabase).

5. Right-click your LAS dataset. Add your *.las files to the LAS dataset and calculate statistics. No surface constraints are needed for our study area. (An arcpy sketch of steps 4-6 appears after step 8.)

6. Review the statistics for each file and for the entire LAS dataset.

7. Right-click your LAS dataset and Add it to a new map, where you can explore your LiDAR data. Next, Add the PEMA tile system shapefile to see how and where it overlaps with your LAS dataset.

8. Leverage the spatial overlap to identify the PEMA image tiles that cover your LAS dataset. Download the linked *.JP2 files into your GIS3_Lab_WilliamsGrove folder. Unzip them with your preferred utility.
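For anyone who wants to script steps 4-6 instead of clicking through Catalog, here is a hedged arcpy sketch; the paths and file names are assumptions, while Create LAS Dataset and LAS Dataset Statistics are standard Data Management geoprocessing tools.

```python
import arcpy

las_folder = r"C:\Geotemp\GIS3_Lab_WilliamsGrove"      # folder holding the four *.las tiles (assumed)
las_dataset = las_folder + r"\WilliamsGrove.lasd"      # LAS dataset goes in the folder, not the geodatabase

# Steps 4-5: build the LAS dataset and compute statistics while building it.
arcpy.management.CreateLasDataset(las_folder, las_dataset, compute_stats="COMPUTE_STATS")

# Step 6: write a per-file statistics summary you can review outside ArcGIS.
arcpy.management.LasDatasetStatistics(las_dataset, out_file=las_folder + r"\lasd_stats.csv")
```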

Back in ArcGIS Pro > Catalog view

9. You might need to right-click > refresh your GIS3_Lab_WilliamsGrove folder to get ArcGIS to realize you just unzipped some new stuff there. Wake up, ArcGIS!!

10. Next, build a new mosaic dataset (in your geodatabase, not in your folder) to manage your image tiles. Keep in mind that your JPEG 2000 images (*.jp2) have 4 bands of spectral data (red, green, blue, and NIR), and each band is stored with unsigned 8-bit pixel depth. The output SRS you assign to your new mosaic dataset (see Step 2) will not match the SRS of your underlying *.jp2 images – that’s okay. (An arcpy sketch of steps 10-11 appears after step 12.)

11. Right-click your mosaic dataset and add your PEMA raster datasets to it. Be sure to specify the input SRS for your *.jp2 files (just import the SRS from one of the files).

12. Add your mosaic dataset to your existing map so you can explore it. You may need to change your map scale (i.e., zoom in) to reveal the 6-inch (15 cm) pixels inside each footprint.
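A hedged arcpy version of steps 10-11 is sketched below; the geodatabase path, mosaic name, and SRS code are assumptions, while Create Mosaic Dataset and Add Rasters To Mosaic Dataset are standard Data Management tools.

```python
import arcpy

gdb = r"C:\Geotemp\GIS3_Lab_WilliamsGrove\GIS3_Lab_WilliamsGrove.gdb"  # assumed project geodatabase
tile_folder = r"C:\Geotemp\GIS3_Lab_WilliamsGrove"                     # folder holding the unzipped *.jp2 tiles

# Step 10: 4-band, 8-bit unsigned mosaic dataset in the output SRS from step 2.
arcpy.management.CreateMosaicDataset(gdb, "PEMA_2018", arcpy.SpatialReference(6347),
                                     num_bands=4, pixel_type="8_BIT_UNSIGNED")

# Step 11: load the PEMA image tiles into the mosaic dataset.
arcpy.management.AddRastersToMosaicDataset(gdb + r"\PEMA_2018", "Raster Dataset", tile_folder)
```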

Back in ArcGIS Pro > Catalog view

13. Add your LAS dataset to a new local scene so you can visualize your point cloud in 3-D.


14. Right-click your new layer and change the layer Symbology based on the LiDAR point attributes. Give your computer a chance to redraw the point cloud before exploring.

a. Elevation (default)

b. Classification

c. Intensity

d. Elevation enhanced by intensity

15. Right-click your layer to Filter your LiDAR data. Revisit O’Neil-Dunne (2019) and step 14 as needed.

a. Bare ground surface: Ground category only, all returns

b. Covered terrain surface: all categories, first return only

i. Use your mosaic dataset to Colorize your LAS dataset

16. Use the knowledge you’re developing about filtering LiDAR point cloud data, one of your LAS dataset layers, and ArcGIS Pro’s LAS Dataset tools to Export (an arcpy sketch of steps 15-18 appears after step 18):

a. A raster Digital Surface Model (DSM) that represents the landscape as it is covered.

b. A raster Digital Elevation Model (DEM) that represents bare earth topography, a ground surface devoid of trees, noise, and anthropogenic structures.

17. To help you visualize your exports, derive hillshade models from your DEM and your DSM.

18. From your DEM only, derive contours (both index and intermediate). By convention, index contours are usually some multiple of 10 elevation units; the cartographer’s choice of contour index/label interval depends on both: a) the amount of terrain relief in the study area that needs to be represented; and b) the amount of space needed on the map for plotting contour labels, other labels, and other map symbols.
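For reference, here is a hedged arcpy sketch of steps 15-18. The layer names, output names, cell size, and contour interval are assumptions you should replace with your own choices, while Make LAS Dataset Layer, LAS Dataset To Raster, HillShade, and Contour are the standard tools involved (the last two require the 3D Analyst extension).

```python
import arcpy
arcpy.CheckOutExtension("3D")  # HillShade and Contour live in 3D Analyst

lasd = r"C:\Geotemp\GIS3_Lab_WilliamsGrove\WilliamsGrove.lasd"          # assumed LAS dataset path
gdb = r"C:\Geotemp\GIS3_Lab_WilliamsGrove\GIS3_Lab_WilliamsGrove.gdb"   # assumed project geodatabase

# Step 15a / 16b: ground class (code 2) only, all returns -> bare-earth DEM.
arcpy.management.MakeLasDatasetLayer(lasd, "ground_only", class_code=[2])
arcpy.conversion.LasDatasetToRaster("ground_only", gdb + r"\dem_1m", "ELEVATION",
                                    sampling_type="CELLSIZE", sampling_value=1.0)

# Step 15b / 16a: all classes, return number 1 (first returns) only -> covered-terrain DSM.
arcpy.management.MakeLasDatasetLayer(lasd, "first_returns", return_values=["1"])
arcpy.conversion.LasDatasetToRaster("first_returns", gdb + r"\dsm_1m", "ELEVATION",
                                    sampling_type="CELLSIZE", sampling_value=1.0)

# Step 17: hillshades for visualization.
arcpy.ddd.HillShade(gdb + r"\dem_1m", gdb + r"\dem_hillshade")
arcpy.ddd.HillShade(gdb + r"\dsm_1m", gdb + r"\dsm_hillshade")

# Step 18: contours from the DEM only (1 m interval assumed; pick your own).
arcpy.ddd.Contour(gdb + r"\dem_1m", gdb + r"\dem_contours", 1.0)
```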

You should have completed at least this much geoprocessing in one week. Bring your results to class and be prepared to discuss them. Use your second week to fix any problems, interpret your results, and build your maps and report.


19. Layout and share at least three (3) map figures to show others what you found.

a. A clean annotated reference map so others can find the study area.

b. An annotated 2-D map that highlights your bare earth hillshade model and topographic contours.

c. A 3-D scene that highlights one or more anthropogenic ‘features’ lurking in the LiDAR point cloud.

In your report, be sure to include:

1. Three tables with an interpretation for each;

a. Table 1: The number of returns and average point spacing in each LiDAR tile. Add a row for totals at the bottom and a column for percentages.

i. Using your table, compare and contrast the average point spacings of your four (4) LiDAR tiles against the nominal pulse spacing threshold associated with QL2 data (Table 1, this handout).

ii. According to the USGS standards shown in Table 1 (this handout), what is the appropriate cell size you should use when exporting raster surfaces from these QL2 LiDAR data? Did you follow the USGS standard?

b. Table 2: The number of returns associated with each unique classification code in your LAS dataset. Add a row for totals at the bottom and a column of percentages. (A scripted way to tally these counts appears after item 4 below.)

c. Table 3: The number of returns associated with each return value (1st, 2nd, …, nth). Add a row for totals at the bottom and add a column for percentages.

i. Using the data you presented in your Table 3, describe and interpret the relationship you see between return value (1st, 2nd, …, nth) and the number of returns.

2. For each raster surface you derive (DEM, DSM, other), report the exact LAS dataset filter and the exact output raster cell size you chose.

3. For each map figure you build (Objective 3, Method 19), build a concise but complete paragraph that describes what is illustrated in your figure. Use your words to guide your reader’s eyes as they look at your map. GIS Analysts don’t just make maps; they also help others read and interpret their maps.

4. Finally, compare and contrast what can be interpreted easily from each of: a) the aerial photography, b) the LiDAR point cloud, c) the DSM hillshade model, and d) the DEM hillshade model.
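If you would rather tally Tables 2 and 3 outside ArcGIS (or double-check the LAS dataset statistics), here is a hedged laspy/numpy sketch for one tile; loop it over all four tiles and sum the counts. The path is an assumption, and laspy 2.x is assumed to be installed.

```python
import laspy
import numpy as np

las = laspy.read(r"C:\Geotemp\GIS3_Lab_WilliamsGrove\18TUK325446.las")  # hypothetical path
total = las.header.point_count

# Table 2: returns per classification code.
classes, class_counts = np.unique(np.asarray(las.classification), return_counts=True)
for code, n in zip(classes, class_counts):
    print(f"class {code}: {n} returns ({100 * n / total:.1f}%)")

# Table 3: returns per return number (1st, 2nd, ..., nth).
returns, return_counts = np.unique(np.asarray(las.return_number), return_counts=True)
for r, n in zip(returns, return_counts):
    print(f"return {r}: {n} returns ({100 * n / total:.1f}%)")
```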


Build a well-written lab report that presents the data and the results of your investigation. Include your name, date, and lab title on the first page. Insert page numbers. Your report should be printed on letter size paper. Set all page margins to be 0.7” except for the left margin, which should be set to 1.2”. Use 1.5 line spacing, set the normal font face to be Candara or Bookman Antiqua, and set the normal font size to be 11 points.

Your report should include five sections with bold and left-justified headings: Purpose, Objectives, Methods and Data, Results and Answers, and Summary. Your Purpose and Objectives sections should be written in your own words and must address the purpose and objectives of this lab.

Your Methods and Data section should contain concise descriptions of your data sources and the methods you used. Annotated cartographic models are always useful, but otherwise include screen captures (as figures) that illustrate how you set up your geoprocessing tools (both parameters and environment settings). PC users can use the Snipping Tool to capture windows on their screens.

The Results and Answers section should include answers to the questions that arose during the process.

In your Summary, describe how well you accomplished your individual objectives and the overall purpose.

Note: All tables and figures must be numbered sequentially, have captions, and be referenced in your text. Table and figure captions should not be orphaned. When appropriate, table and figure captions can be used to declare units of measure. All tables and figures must be inserted inline with your text and not added as attachments. Add a line space before and after each table/caption combo (or figure/caption combo) to buffer them from your adjacent paragraphs. Table columns that contain text strings ought to be left-justified. Table columns that contain numbers should be right-justified with all the decimal points aligned (i.e., all values should be reported with the same decimal precision). Table captions should precede the tables they describe. Figure captions should follow the figures they describe. Use your normal font when building tables; avoid using automatic styles that add useless colors or a lot of text decorations for no apparent reason.

Note: Maps and other figures should be legible when printed (not just on screen). Map layouts exported to TIF images using a high pixel density (600 d.p.i.) usually don’t suffer any pixelated text or jagged linework artifacts, but figures exported to JPG format with the default low pixel density of your screen (72 d.p.i.) will almost always appear pixelated or even illegible when printed.

In GIS3, ‘innovation points’ are available to any and all students who document an attempt – whether successful or unsuccessful – to push themselves and learn more than what was required by the lab assignment.


References (required readings are shown in red)

Abdullah, Qassim. 2016. A Star is Born: The State of New Lidar Technologies. Photogrammetric Engineering and Remote Sensing, May 2016, p. 307-312.

Asner, Greg. 2013. Ecology from the Air. YouTube. Last accessed on February 15, 2020 at https://youtu.be/qCrVpRBBSvY

Bureau of Economic Geology. 2014. Unveiling the Earth's Surface: Airborne Lidar at UT's Bureau of Economic Geology. YouTube. Last accessed on April 4, 2019 at https://youtu.be/JDR0ttLw6_A

Bolstad, Paul. 2019. GIS Fundamentals, 6th Edition. Eider Press.
  Chapter 6: pages 245-251, 262-263, 288-292
  Chapter 7: pages 307-308

Kimerling, Jon A., with Aileen R. Buckley, Phillip C. Muehrcke, and Juliana O. Muehrcke. 2012. Map Use: Reading, Analysis, Interpretation, 7th Edition. Esri Press Academic. ISBN 978-1-58948-190-9.
  Chapter 6: page 124

National Geographic. 2018. Better Images of Cities Than From Satellites? It's Called LIDAR | National Geographic. YouTube. Last accessed on April 4, 2019 at https://youtu.be/iSRK1NIT-vA

NEON Science. 2014. How Does LiDAR Remote Sensing Work? Light Detection and Ranging. YouTube. Last accessed on February 15, 2020 at https://youtu.be/EYbhNSUnIdU

O’Neil-Dunne, Jarlath. 2013. LiDAR 101. YouTube. Last accessed on April 4, 2019 at https://youtu.be/1l0GwRLv2cM

O’Neil-Dunne, Jarlath. 2019. LiDAR Surface Models in ArcGIS Pro. YouTube. Last accessed on April 4, 2019 at https://www.youtube.com/watch?v=L4tVXARSrUo

Pennsylvania Emergency Management Agency. 2018. PEMA 2018 0.5-foot Orthoimagery - Cycle 1 2018. Pennsylvania Spatial Data Access (PASDA). Last accessed on January 20, 2020 at https://www.pasda.psu.edu/uci/DataSummary.aspx?dataset=5104

Quantum Spatial, Inc. 2018. South Central Pennsylvania 2017-2018 QL2 LiDAR, LAS 1.4 files. Pennsylvania Spatial Data Access (PASDA). Last accessed on January 20, 2020 at ftp://ftp.pasda.psu.edu/pub/pasda/usgs/LiDAR2017/LAS/

USGS. 2019. 3D Elevation Program: What is 3DEP? U.S. Department of the Interior, U.S. Geological Survey. Last accessed on April 4, 2019 at https://www.usgs.gov/core-science-systems/ngp/3dep/what-is-3dep

USGS. 2018. Lidar Base Specification, Version 1.3, February 2018. Chapter 4, Techniques and Methods, of Section B, U.S. Geological Survey Standards, Book 11, Collection and Delineation of Spatial Data. U.S. Department of the Interior, U.S. Geological Survey. Last accessed on April 5, 2018 at https://pubs.usgs.gov/tm/11b4/pdf/tm11-B4.pdf

Washington Department of Natural Resources. 2019. The Bare Earth Story Map: How lidar in Washington State exposes geology and natural hazards. ArcGIS Online. Last accessed on January 20, 2020 at https://arcg.is/1DGeqL