What is LiDAR?

A Resource Curated by AirTopo Group

Description

A detailed curated resource for those who would like to learn the science and history behind Lidar in surveying and topography.



Contents

Lidar · Remote sensing · Light · Radar · Geomatics · Archaeology · Geography · Geology · Geomorphology · Seismology · Forestry · Atmospheric physics · Contour line · Laser · National Center for Atmospheric Research · Ultraviolet · Interferometric visibility · Infrared · Aerosol · Cloud · Meteorology · Aircraft · Satellite · Surveying · Micrometre · Computer · Beam splitter · Laser scanning · Azimuth · Optics · Attenuation · Global Positioning System · Inertial measurement unit · National lidar dataset · Agricultural Research Service · Canopy (biology) · Orienteering · Soil survey · LIDAR speed gun · Speed limit enforcement · Wind farm · Structure from motion · CLidar · Lidar detector · Satellite laser ranging · Optical time-domain reflectometer · Optech · TopoFlight · Time-domain reflectometry · Article Sources and Contributors · Image Sources, Licenses and Contributors · Article Licenses

Lidar

[Figure: A FASOR used at the Starfire Optical Range for lidar and laser guide star experiments is tuned to the sodium D2a line and used to excite sodium atoms in the upper atmosphere.]

[Figure: This lidar may be used to scan buildings, rock formations, etc., to produce a 3D model. The lidar can aim its laser beam in a wide range: its head rotates horizontally; a mirror tilts vertically. The laser beam is used to measure the distance to the first object on its path.]

Lidar (also written LIDAR or LiDAR) is a remote sensing technology that measures distance by illuminating a target with a laser and analyzing the reflected light. The term lidar comes from combining the words light and radar.[]

Lidar is popularly known as a technology used to make high-resolution maps, with applications in geomatics, archaeology, geography, geology, geomorphology, seismology, forestry, remote sensing, atmospheric physics,[1] airborne laser swath mapping (ALSM), laser altimetry, and contour mapping.
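In essence the distance measurement is a timing measurement: the instrument records how long a pulse takes to return and converts that delay to a range. A minimal sketch of that conversion (plain Python; the delay value is illustrative, not from the text):

```python
# Time-of-flight ranging: a lidar measures how long a laser pulse takes
# to reach a target and return, then converts that delay to a distance.
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_delay(round_trip_s: float) -> float:
    """Distance to the first reflecting object, in metres."""
    # The pulse covers the sensor-to-target distance twice (out and back),
    # so the one-way range is half the round-trip path length.
    return C * round_trip_s / 2.0

# Example: a return detected 6.67 microseconds after emission
print(round(range_from_delay(6.67e-6), 1))  # roughly 1000 m
```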
History and etymology of lidar/LIDAR

Lidar was developed in the early 1960s, shortly after the invention of the laser, and combined laser-focused imaging with radar's ability to calculate distances by measuring the time for the signal to return. Its first applications were in meteorology, where it was used to measure clouds by the National Center for Atmospheric Research.[]

Although commonly considered to be an acronym, the term lidar is actually a portmanteau of "light" and "radar". The first published mention of lidar makes this clear: "Eventually the laser may provide an extremely sensitive detector of particular wavelengths from distant objects. Meanwhile, it is being used to study the moon by 'lidar' (light radar) and it promises a means of communications, not only all over the solar system, but also with planets of nearby stars";[2] the Oxford English Dictionary supports this etymology.[]

The assumption that lidar was an acronym (LIDAR) came later, beginning in 1970,[3] and was based on the assumption that, since the base term "radar" originally started as an acronym for "RAdio Detection And Ranging", LIDAR must stand for "LIght Detection And Ranging"[4] or "Laser Imaging, Detection and Ranging".[5] Although "radar" is no longer treated as an acronym and is universally uncapitalized, the word "lidar" became capitalized as LIDAR in some publications beginning in the 1980s.[6] Today there is no consensus on capitalization, reflecting uncertainty about whether or not it is an acronym, and, if it is an acronym, whether it should be lowercase, like "radar". Lidar is variously spelled "LIDAR", "LiDAR", "LIDaR", or "Lidar", depending on the publication; the USGS uses both LIDAR and lidar, sometimes in the same document,[7] and the New York Times uses both "lidar" and "Lidar".[8]

General description

Lidar uses ultraviolet, visible, or near-infrared light to image objects and can be used with a wide range of targets, including non-metallic objects, rocks, rain, chemical compounds, aerosols, clouds and even single molecules.[1] A narrow laser beam can be used to map physical features with very high resolution.

Lidar has been used extensively for atmospheric research and meteorology. Downward-looking lidar instruments fitted to aircraft and satellites are used for surveying and mapping; a recent example is the NASA Experimental Advanced Research Lidar.[9] In addition, lidar has been identified by NASA as a key technology for enabling autonomous precision safe landing of future robotic and crewed lunar landing vehicles.[10]

Wavelengths from about 10 micrometres to the UV (ca. 250 nm) are used to suit the target. Typically light is reflected via backscattering. Different types of scattering are used for different lidar applications; most common are Rayleigh scattering, Mie scattering, Raman scattering, and fluorescence. Based on these different kinds of backscattering, the lidar is accordingly called Rayleigh lidar, Mie lidar, Raman lidar, Na/Fe/K fluorescence lidar, and so on.[1] Suitable combinations of wavelengths can allow remote mapping of atmospheric contents by looking for wavelength-dependent changes in the intensity of the returned signal.
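A common back-of-the-envelope way to judge which scattering regime applies to a given target is the size parameter 2πr/λ: values much below 1 behave Rayleigh-like, values near or above 1 are usually treated with Mie theory. The thresholds and particle sizes below are illustrative assumptions, not figures from the article:

```python
# Rough scattering-regime check via the size parameter x = 2*pi*r / wavelength.
# x << 1 is Rayleigh-like; x around 1 or larger is normally handled with Mie theory.
# The 0.1 cut-off and the example radii are illustrative only.
import math

def size_parameter(radius_m: float, wavelength_m: float) -> float:
    return 2.0 * math.pi * radius_m / wavelength_m

for radius, label in [(0.15e-9, "N2 molecule"), (0.5e-6, "fine aerosol"), (10e-6, "cloud droplet")]:
    x = size_parameter(radius, 532e-9)   # 532 nm, a common lidar wavelength
    regime = "Rayleigh-like" if x < 0.1 else "Mie regime"
    print(f"{label}: x = {x:.3g} -> {regime}")
```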
Design

In general there are two kinds of lidar detection schemes: "incoherent" or direct energy detection (which is principally an amplitude measurement) and coherent detection (which is best for Doppler, or phase-sensitive, measurements). Coherent systems generally use optical heterodyne detection, which, being more sensitive than direct detection, allows them to operate at much lower power, but at the expense of more complex transceiver requirements.

In both coherent and incoherent lidar, there are two types of pulse models: micropulse lidar systems and high-energy systems. Micropulse systems have developed as a result of the ever-increasing amount of computer power available combined with advances in laser technology. They use considerably less energy in the laser, typically on the order of one microjoule, and are often "eye-safe," meaning they can be used without safety precautions. High-power systems are common in atmospheric research, where they are widely used for measuring many atmospheric parameters: the height, layering and densities of clouds, cloud particle properties (extinction coefficient, backscatter coefficient, depolarization), temperature, pressure, wind, humidity, and trace gas concentration (ozone, methane, nitrous oxide, etc.).[1]

There are several major components to a lidar system:

1. Laser. 600–1000 nm lasers are most common for non-scientific applications. They are inexpensive, but since they can be focused and easily absorbed by the eye, the maximum power is limited by the need to make them eye-safe; eye-safety is a requirement for most applications. A common alternative, 1550 nm lasers, are eye-safe at much higher power levels since this wavelength is not focused by the eye, but the detector technology is less advanced, so these wavelengths are generally used at longer ranges and lower accuracies. They are also used for military applications, as 1550 nm is not visible in night vision goggles, unlike the shorter 1000 nm infrared laser. Airborne topographic mapping lidars generally use 1064 nm diode-pumped YAG lasers, while bathymetric systems generally use 532 nm frequency-doubled diode-pumped YAG lasers, because 532 nm penetrates water with much less attenuation than 1064 nm. Laser settings include the laser repetition rate (which controls the data collection speed). Pulse length is generally an attribute of the laser cavity length, the number of passes required through the gain material (YAG, YLF, etc.), and Q-switch speed. Better target resolution is achieved with shorter pulses, provided the lidar receiver detectors and electronics have sufficient bandwidth.[1]

2. Scanner and optics. How fast images can be developed is also affected by the speed at which they are scanned. There are several options to scan the azimuth and elevation, including dual oscillating plane mirrors, a combination with a polygon mirror, and a dual-axis scanner (see Laser scanning). Optic choices affect the angular resolution and range that can be detected. A hole mirror or a beam splitter are options to collect a return signal.

3. Photodetector and receiver electronics. Two main photodetector technologies are used in lidars: solid-state photodetectors, such as silicon avalanche photodiodes, and photomultipliers. The sensitivity of the receiver is another parameter that has to be balanced in a lidar design.

4. Position and navigation systems. Lidar sensors that are mounted on mobile platforms such as airplanes or satellites require instrumentation to determine the absolute position and orientation of the sensor. Such devices generally include a Global Positioning System receiver and an Inertial Measurement Unit (IMU); a simplified example of how these measurements are combined with each range follows this list.
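As a rough illustration of how position and navigation data place each return on the ground, the sketch below georeferences a single echo from a cross-track scanning lidar. It is a simplification under stated assumptions: lever arms, boresight calibration, full roll/pitch handling and Earth curvature are all ignored, and the numbers are invented.

```python
# Very simplified "direct georeferencing" of one lidar return: combine the
# platform position (from GPS), its heading (from the IMU), and the measured
# range/scan angle to place the echo in local map coordinates.
import math

def ground_point(easting, northing, altitude, heading_deg, scan_angle_deg, range_m):
    """Return (E, N, height) of the echo for a cross-track scanning lidar."""
    # Beam direction in the aircraft frame: pointing down, tilted across track.
    scan = math.radians(scan_angle_deg)
    down = math.cos(scan) * range_m        # vertical component of the range vector
    across = math.sin(scan) * range_m      # horizontal component, perpendicular to track
    # Rotate the across-track offset into map coordinates using the heading
    # (clockwise from north); "right of track" is (cos h, -sin h) in (E, N).
    hdg = math.radians(heading_deg)
    d_e = across * math.cos(hdg)
    d_n = -across * math.sin(hdg)
    return easting + d_e, northing + d_n, altitude - down

# Aircraft at 1,200 m flying due north, beam 15 degrees off nadir, 1,243 m range
print(ground_point(500_000.0, 4_200_000.0, 1_200.0, 0.0, 15.0, 1_243.0))
```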
3D imaging can be achieved using both scanning and non-scanning systems. "3D gated viewing laser radar" is a non-scanning laser ranging system that applies a pulsed laser and a fast gated camera.

Imaging lidar can also be performed using arrays of high-speed detectors and modulation-sensitive detector arrays, typically built on single chips using CMOS and hybrid CMOS/CCD fabrication techniques. In these devices each pixel performs some local processing such as demodulation or gating at high speed, down-converting the signals to video rate so that the array may be read like a camera. Using this technique many thousands of pixels/channels may be acquired simultaneously.[11] High-resolution 3D lidar cameras use homodyne detection with an electronic CCD or CMOS shutter.[]

A coherent imaging lidar uses synthetic array heterodyne detection to enable a staring single-element receiver to act as though it were an imaging array.[12]

Applications

[Figure: This lidar-equipped mobile robot uses its lidar to construct a map and avoid obstacles.]

Other than those applications listed above, there are a wide variety of applications of lidar, as often mentioned in National LIDAR Dataset programs.

Agriculture

[Figure: Agricultural Research Service scientists have developed a way to incorporate lidar with yield rates on agricultural fields. This technology will help farmers improve their yields by directing their resources toward the high-yield sections of their land.]

Lidar can be used to help farmers determine which areas of their fields to apply costly fertilizer. Lidar can create a topographical map of the fields and reveal the slopes and sun exposure of the farmland. Researchers at the Agricultural Research Service blended this topographical information with the farmland's yield results from previous years. From this information, researchers categorized the farmland into high-, medium-, or low-yield zones.[13] This technology is valuable to farmers because it indicates which areas to apply the expensive fertilizers to achieve the highest crop yield.
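The zoning step described above can be illustrated with a toy example: slope derived from a lidar DEM alongside a classification of historical yields into three zones. All arrays, the cell size and the percentile thresholds below are invented for the illustration, not taken from the study.

```python
# Illustrative only: slope from a small lidar-derived DEM, plus a simple
# high/medium/low yield zoning of a field from a historical yield raster.
import numpy as np

dem = np.array([[100.0, 100.5, 101.2],
                [100.2, 100.9, 101.8],
                [100.4, 101.3, 102.5]])        # elevations in metres, 5 m grid
yield_t_ha = np.array([[9.1, 8.7, 6.2],
                       [9.0, 8.1, 5.9],
                       [8.8, 7.4, 5.1]])       # past yields, tonnes per hectare

dz_dy, dz_dx = np.gradient(dem, 5.0)           # elevation gradients, 5 m spacing
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Split the field into three zones at the 33rd/66th yield percentiles.
lo, hi = np.percentile(yield_t_ha, [33, 66])
zones = np.where(yield_t_ha >= hi, "high",
                 np.where(yield_t_ha >= lo, "medium", "low"))
print(slope_deg.round(1))
print(zones)
```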
Archaeology

Lidar has many applications in the field of archaeology, including aiding in the planning of field campaigns, mapping features beneath forest canopy, and providing an overview of broad, continuous features that may be indistinguishable on the ground.[14] Lidar can also provide archaeologists with the ability to create high-resolution digital elevation models (DEMs) of archaeological sites that can reveal micro-topography otherwise hidden by vegetation. Lidar-derived products can be easily integrated into a Geographic Information System (GIS) for analysis and interpretation. For example, at Fort Beausejour - Fort Cumberland National Historic Site, Canada, previously undiscovered archaeological features below the forest canopy have been mapped that are related to the siege of the fort in 1755. Features that could not be distinguished on the ground or through aerial photography were identified by overlaying hillshades of the DEM created with artificial illumination from various angles. With lidar, the ability to produce high-resolution datasets quickly and relatively cheaply can be an advantage. Beyond efficiency, its ability to penetrate forest canopy has led to the discovery of features that were not distinguishable through traditional geospatial methods and are difficult to reach through field surveys, as in work at Caracol by Arlen Chase and his wife Diane Zaino Chase.[15] The intensity of the returned signal can be used to detect features buried under flat vegetated surfaces such as fields, especially when mapping using the infrared spectrum. The presence of these features affects plant growth and thus the amount of infrared light reflected back.[16] In 2012, lidar was used by a team attempting to find the legendary city of La Ciudad Blanca in the Honduran jungle. During a seven-day mapping period, they found evidence of extensive man-made structures that had eluded ground searches for hundreds of years.[17]

Autonomous vehicles

[Figure: 3D SICK lidar]

Autonomous vehicles use lidar for obstacle detection and avoidance to navigate safely through environments.[18]

Biology and conservation

Lidar has also found many applications in forestry. Canopy heights, biomass measurements, and leaf area can all be studied using airborne lidar systems. Similarly, lidar is used by many industries, including energy and railroad companies and the Department of Transportation, as a faster way of surveying. Topographic maps can also be generated readily from lidar, including for recreational use such as the production of orienteering maps.[19]

In addition, the Save-the-Redwoods League is undertaking a project to map the tall redwoods on California's northern coast. Lidar allows research scientists not only to measure the height of previously unmapped trees but to determine the biodiversity of the redwood forest. Stephen Sillett, who is working with the League on the North Coast Lidar project, claims this technology will be useful in directing future efforts to preserve and protect ancient redwood trees.[20]

Geology and soil science

High-resolution digital elevation maps generated by airborne and stationary lidar have led to significant advances in geomorphology (the branch of geoscience concerned with the origin and evolution of Earth's surface topography). Lidar's ability to detect subtle topographic features such as river terraces and river channel banks, to measure the land-surface elevation beneath the vegetation canopy, to better resolve spatial derivatives of elevation, and to detect elevation changes between repeat surveys has enabled many novel studies of the physical and chemical processes that shape landscapes.[citation needed]

In geophysics and tectonics, a combination of aircraft-based lidar and GPS has evolved into an important tool for detecting faults and for measuring uplift. The output of the two technologies can produce extremely accurate elevation models for terrain, models that can even measure ground elevation through trees. This combination was used most famously to find the location of the Seattle Fault in Washington, USA.[21] It has also measured uplift at Mt. St. Helens by using data from before and after the 2004 uplift.[22] Airborne lidar systems monitor glaciers and have the ability to detect subtle amounts of growth or decline. A satellite-based system, NASA's ICESat, includes a lidar subsystem for this purpose. NASA's Airborne Topographic Mapper[23] is also used extensively to monitor glaciers and perform coastal change analysis. The combination is also used by soil scientists while creating a soil survey. The detailed terrain modeling allows soil scientists to see slope changes and landform breaks which indicate patterns in soil spatial relationships.
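The hillshade overlays that the archaeology and geology passages rely on can be computed directly from a lidar DEM. One common formulation is sketched below; the sun azimuth/altitude values and the synthetic DEM are arbitrary, and sign conventions for aspect vary between tools.

```python
# Hillshade of a DEM under a chosen artificial sun position: combine slope
# and aspect with the illumination direction to get a 0..1 shading value.
import numpy as np

def hillshade(dem, cell_size, azimuth_deg=315.0, altitude_deg=45.0):
    az = np.radians(360.0 - azimuth_deg + 90.0)   # compass azimuth -> math angle
    alt = np.radians(altitude_deg)
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(-dz_dx, dz_dy)
    shaded = (np.sin(alt) * np.cos(slope) +
              np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)              # 0 = full shadow, 1 = fully lit

dem = np.random.default_rng(0).normal(50.0, 2.0, size=(60, 60)).cumsum(axis=1)
print(hillshade(dem, cell_size=1.0).shape)        # (60, 60)
```

Rendering several such grids with different azimuths and comparing them is essentially the "artificial illumination from various angles" technique described above.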
Meteorology and atmospheric environment

The first lidar systems were used for studies of atmospheric composition, structure, clouds, and aerosols. Initially based on ruby lasers, lidar for meteorological applications was constructed shortly after the invention of the laser and represents one of the first applications of laser technology.

Differential absorption lidar (DIAL) is used for range-resolved measurements of a particular gas in the atmosphere, such as ozone, carbon dioxide, or water vapor. The lidar transmits two wavelengths: an "on-line" wavelength that is absorbed by the gas of interest and an off-line wavelength that is not absorbed. The differential absorption between the two wavelengths is a measure of the concentration of the gas as a function of range. DIAL lidars are essentially dual-wavelength backscatter lidars.[citation needed]

Doppler lidar and Rayleigh Doppler lidar are used to measure temperature and/or wind speed along the beam by measuring the frequency of the backscattered light. The Doppler broadening of gases in motion allows the determination of properties via the resulting frequency shift.[24][25] Scanning lidars, such as NASA's HARLIE lidar, have been used to measure atmospheric wind velocity in a large three-dimensional cone.[26] ESA's wind mission ADM-Aeolus will be equipped with a Doppler lidar system in order to provide global measurements of vertical wind profiles.[27] A Doppler lidar system was used in the 2008 Summer Olympics to measure wind fields during the yacht competition.[28] Doppler lidar systems are also now beginning to be successfully applied in the renewable energy sector to acquire wind speed, turbulence, wind veer and wind shear data. Both pulsed[29] and continuous-wave systems[30] are being used: pulsed systems use signal timing to obtain vertical distance resolution, whereas continuous-wave systems rely on detector focusing.

Synthetic array lidar allows imaging lidar without the need for an array detector. It can be used for imaging Doppler velocimetry, ultra-fast frame rate (MHz) imaging, as well as for speckle reduction in coherent lidar.[12] An extensive lidar bibliography for atmospheric and hydrospheric applications is given by Grant.[31]
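A minimal sketch of the DIAL idea follows: with synthetic on-line and off-line returns, the gas number density is recovered from how quickly the ratio of the two signals changes with range. Differential backscatter and extinction terms are neglected, and the cross-section and density values are assumed for the example, not measured.

```python
# Basic DIAL retrieval on synthetic signals:
#   n(R) = 1/(2*delta_sigma) * d/dR ln(P_off(R) / P_on(R))
import numpy as np

delta_sigma = 2.0e-22                      # cm^2, assumed absorption cross-section difference
r = np.linspace(200.0, 2000.0, 10)         # range gates, metres
n_true = 1.0e13 * np.ones_like(r)          # molecules per cm^3 (synthetic profile)

# Build synthetic returns: 1/R^2 falloff, plus on-line absorption by the gas.
dr_cm = np.gradient(r) * 100.0
tau = 2.0 * delta_sigma * np.cumsum(n_true * dr_cm)   # two-way differential optical depth
p_off = 1.0 / r**2
p_on = p_off * np.exp(-tau)

# Retrieval: differentiate the log ratio with respect to range (in cm).
n_retrieved = np.gradient(np.log(p_off / p_on), r * 100.0) / (2.0 * delta_sigma)
print(n_retrieved.round(-11))   # recovers ~1e13 cm^-3 at every gate
```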
Law enforcement

LIDAR speed guns are used by the police to measure the speed of vehicles for speed limit enforcement purposes.[citation needed]

Military

Few military applications are known to be in place and those are classified, but a considerable amount of research is underway into their use for imaging. Higher-resolution systems collect enough detail to identify targets, such as tanks. Examples of military applications of lidar include the Airborne Laser Mine Detection System (ALMDS) for counter-mine warfare by Areté Associates.[32]

A NATO report (RTO-TR-SET-098) evaluated the potential technologies for stand-off detection and discrimination of biological warfare agents. The technologies evaluated were Long-Wave Infrared (LWIR), Differential Scattering (DISC), and Ultraviolet Laser-Induced Fluorescence (UV-LIF). The report concluded: "Based upon the results of the lidar systems tested and discussed above, the Task Group recommends that the best option for the near-term (2008–2010) application of stand-off detection systems is UV LIF."[33] However, in the long term, other techniques such as stand-off Raman spectroscopy may prove to be useful for identification of biological warfare agents.

Short-range compact spectrometric lidar based on laser-induced fluorescence (LIF) would address the presence of bio-threats in aerosol form over critical indoor, semi-enclosed and outdoor venues like stadiums, subways, and airports. This near-real-time capability would enable rapid detection of a bioaerosol release and allow for timely implementation of measures to protect occupants and minimize the extent of contamination.[34]

The Long-Range Biological Standoff Detection System (LR-BSDS) was developed for the US Army to provide the earliest possible standoff warning of a biological attack. It is an airborne system carried by a helicopter to detect man-made aerosol clouds containing biological and chemical agents at long range. The LR-BSDS, with a detection range of 30 km or more, was fielded in June 1997.[35]

Five lidar units produced by the German company Sick AG were used for short-range detection on Stanley, the autonomous car that won the 2005 DARPA Grand Challenge.

A robotic Boeing AH-6 performed a fully autonomous flight in June 2010, including avoiding obstacles using lidar.[36][37]

Mining

Lidar is used in the mining industry for various tasks. The calculation of ore volumes is accomplished by periodic (monthly) scanning in areas of ore removal, then comparing surface data to the previous scan.[citation needed]
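A toy version of that volume calculation differences two gridded surfaces of the same pit. The grids, cell size and elevations below are invented; a real workflow would also co-register and filter the scans before differencing.

```python
# Ore volume estimate by differencing two lidar surface models of a pit.
import numpy as np

cell_area_m2 = 2.0 * 2.0                        # 2 m x 2 m grid cells
surface_before = np.full((4, 4), 315.0)         # elevations last month, metres
surface_after = surface_before.copy()
surface_after[1:3, 1:3] -= 6.0                  # material removed from the centre

# Sum (elevation drop) x (cell area) over cells where material was removed.
removed_m3 = ((surface_before - surface_after).clip(min=0) * cell_area_m2).sum()
print(removed_m3)   # 4 cells * 6 m * 4 m^2 = 96.0 cubic metres
```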
Physics and astronomy

A worldwide network of observatories uses lidars to measure the distance to reflectors placed on the Moon, allowing the Moon's position to be measured with millimetre precision and tests of general relativity to be done. MOLA, the Mars Orbiting Laser Altimeter, used a lidar instrument in a Mars-orbiting satellite (the NASA Mars Global Surveyor) to produce a spectacularly precise global topographic survey of the red planet.

In September 2008, NASA's Phoenix lander used lidar to detect snow in the atmosphere of Mars.[38]

In atmospheric physics, lidar is used as a remote detection instrument to measure densities of certain constituents of the middle and upper atmosphere, such as potassium, sodium, or molecular nitrogen and oxygen. These measurements can be used to calculate temperatures. Lidar can also be used to measure wind speed and to provide information about the vertical distribution of aerosol particles.[citation needed]

At the JET nuclear fusion research facility, in the UK near Abingdon, Oxfordshire, lidar Thomson scattering is used to determine electron density and temperature profiles of the plasma.[39]

Robotics

Lidar technology is being used in robotics for the perception of the environment as well as object classification.[40] The ability of lidar technology to provide three-dimensional elevation maps of the terrain, high-precision distance to the ground, and approach velocity can enable safe landing of robotic and manned vehicles with a high degree of precision.[41] Refer to the Military section above for further examples.

Spaceflight

Lidar is increasingly being utilized for rangefinding and orbital element calculation of relative velocity in proximity operations and stationkeeping of spacecraft. Lidar has also been used for atmospheric studies from space. Using short pulses of laser light beamed from a spacecraft, some of that "light reflects off of tiny particles in the atmosphere and back to a telescope aligned with the laser. By precisely timing the lidar echo, and by measuring how much laser light is received by the telescope, scientists can accurately determine the location, distribution and nature of the particles. The result is a revolutionary new tool for studying constituents in the atmosphere, from cloud droplets to industrial pollutants, that are difficult to detect by other means."[42][43]

Surveying

[Figure: This TomTom mapping van is fitted with five lidars on its roof rack.]

Airborne lidar sensors are used by companies in the remote sensing field. They can be used to create DTMs (Digital Terrain Models) and DEMs (Digital Elevation Models); this is quite a common practice for larger areas, as a plane can take in a 1 km wide swath in one flyover. Greater vertical accuracy, below 50 mm, can be achieved with a lower flyover and a slimmer 200 m swath, even in forest, where it is able to give the height of the canopy as well as the ground elevation. A reference point is needed to link the data in with the WGS (World Geodetic System).[citation needed] In fact, it works a lot like ordinary radar, except that these systems send out narrow pulses or beams of light rather than broad radio waves.

Transportation

Lidar has been used in Adaptive Cruise Control (ACC) systems for automobiles. Systems such as those by Siemens and Hella use a lidar device mounted on the front of the vehicle, such as the bumper, to monitor the distance between the vehicle and any vehicle in front of it.[44] In the event the vehicle in front slows down or is too close, the ACC applies the brakes to slow the vehicle. When the road ahead is clear, the ACC allows the vehicle to accelerate to a speed preset by the driver. Refer to the Military section above for further examples.

Wind farm optimization

Lidar can be used to increase the energy output from wind farms by accurately measuring wind speeds and wind turbulence.[] An experimental[] lidar is mounted on a wind turbine rotor to measure oncoming horizontal winds, and proactively adjust blades to protect components and increase power.[45]
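The wind-speed retrieval in such Doppler systems rests on the relation between the measured frequency shift and the line-of-sight velocity, v = λ·Δf/2 (the factor 2 because the shift is picked up on both the outgoing and returning paths). A one-line illustration with assumed numbers:

```python
# Line-of-sight wind speed from the Doppler shift of the backscattered light.
wavelength_m = 1.55e-6          # a common fibre-laser wind-lidar wavelength (assumed)
freq_shift_hz = 12.9e6          # measured Doppler shift (made-up value)

v_los = wavelength_m * freq_shift_hz / 2.0
print(f"line-of-sight speed: {v_los:.1f} m/s")   # about 10.0 m/s
```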
Solar photovoltaic deployment optimization

Lidar can also be used to assist planners and developers in optimizing solar photovoltaic systems at the city level by determining appropriate rooftops[46] and for determining shading losses.[47]

Other uses

The video for the song "House of Cards" by Radiohead was believed to be the first use of real-time 3D laser scanning to record a music video. The range data in the video is not completely from a lidar, as structured light scanning is also used.[48][49]

Alternative technologies

Recent development of Structure From Motion (SFM) technologies allows delivering 3D images and maps based on data extracted from visual and IR photography. The elevation or 3D data is extracted using multiple parallel passes over the mapped area, yielding both a visual-light image and 3D structure from the same sensor, which is often a specially chosen and calibrated digital camera.

References

[2] James Ring, "The Laser in Astronomy," New Scientist, 20 June 1963, pp. 672–673.
[3] "New Artillery Against Smog: TV and Lidar," Popular Mechanics, April 1970, p. 104.
[4] NOAA, http://www.ngs.noaa.gov/RESEARCH/RSD/main/lidar/lidar.shtml
[5] LIDAR patent on file, http://www.google.com/patents/US20090273770
[6] Google Books search for "lidar", sorted by date of publication, http://books.google.com/
[7] USGS Center for LIDAR Information Coordination and Knowledge, http://lidar.cr.usgs.gov/
[8] New York Times, search for "lidar", http://query.nytimes.com/search/sitesearch/#/lidar
[9] Experimental Advanced Research Lidar, NASA (http://inst.wff.nasa.gov/eaarl/). Retrieved 8 August 2007.
[12] Strauss, C. E. M., "Synthetic-array heterodyne detection: a single-element detector acts as an array" (http://www.opticsinfobase.org/ol/abstract.cfm?id=12612), Opt. Lett. 19, 1609–1611 (1994).
[19] http://www.lidarbasemaps.org
[20] Councillor Quarterly, Summer 2007, Volume 6, Issue 3.
[21] Tom Paulson, "LIDAR shows where earthquake risks are highest," Seattle Post-Intelligencer, 18 April 2001 (http://www.seattlepi.com/local/19144_quake18.shtml).
[22] Mount Saint Helens LIDAR Data, Washington State Geospatial Data Archive, 13 September 2006 (http://wagda.lib.washington.edu/data/type/elevation/lidar/st_helens/). Retrieved 8 August 2007.
[23] Airborne Topographic Mapper, NASA (http://atm.wff.nasa.gov/). Retrieved 8 August 2007.
[24] http://superlidar.colorado.edu/Classes/Lidar2011/LidarLecture14.pdf
[26] Thomas D. Wilkerson, Geary K. Schwemmer, and Bruce M. Gentry, "LIDAR Profiling of Aerosols, Clouds, and Winds by Doppler and Non-Doppler Methods," NASA International H2O Project (2002) (http://harlie.gsfc.nasa.gov/IHOP2002/Pub&Pats/AMOS 2002 final.pdf).
[27] Earth Explorers: ADM-Aeolus, European Space Agency, 6 June 2007 (http://www.esa.int/esaLP/ESAES62VMOC_LPadmaeolus_0.html). Retrieved 8 August 2007.
[28] "Doppler lidar gives Olympic sailors the edge," Optics.org, 3 July 2008 (http://optics.org/cws/article/research/34878). Retrieved 8 July 2008.
[29] http://www.lidarwindtechnologies.com/
[30] http://www.naturalpower.com/zephir
[31] Grant, W. B., "Lidar for atmospheric and hydrospheric studies," in Tunable Laser Applications, 1st Edition, Duarte, F. J., Ed. (Marcel Dekker, New York, 1995), Chapter 7.
[32] http://www.arete.com/index.php?view=stil_mcm
[35] http://articles.janes.com/articles/Janes-Nuclear,-Biological-and-Chemical-Defence/LR-BSDS--Long-Range-Biological-Standoff-Detection-System-United-States.html
[36] Spice, Byron, "Researchers Help Develop Full-Size Autonomous Helicopter" (http://www.cmu.edu/news/blog/2010/Summer/unprecedented-robochopper.shtml), Carnegie Mellon, 6 July 2010. Retrieved 19 July 2010.
[37] Koski, Olivia, "In a First, Full-Sized Robo-Copter Flies With No Human Help" (http://www.wired.com/dangerroom/2010/07/in-a-first-full-sized-robo-copter-flies-with-no-human-help/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed:+wired/index+(Wired:+Index+3+(Top+Stories+2))#ixzz0tk2hAfAQ), Wired, 14 July 2010. Retrieved 19 July 2010.
[38] NASA, "NASA Mars Lander Sees Falling Snow, Soil Data Suggest Liquid Past," NASA.gov, 29 September 2008 (http://www.nasa.gov/mission_pages/phoenix/news/phoenix-20080929.html). Retrieved 9 November 2008.
[39] C. W. Gowers, "Focus On: Lidar-Thomson Scattering Diagnostic on JET," JET.EFDA.org (undated) (http://www.jet.efda.org/pages/focus/lidar/index.html). Retrieved 8 August 2007.
[45] Mikkelsen, Torben, Hansen, Kasper Hjorth, et al., "Lidar wind speed measurements from a rotating spinner" (http://orbit.dtu.dk/getResource?recordId=259451&objectId=1&versionId=1), Danish Research Database & Danish Technical University, 20 April 2010. Retrieved 25 April 2010.
[46] Ha T. Nguyen, Joshua M. Pearce, Rob Harrap, and Gerald Barber, "The Application of LiDAR to Assessment of Rooftop Solar Photovoltaic Deployment Potential on a Municipal District Unit" (http://www.mdpi.com/1424-8220/12/4/4534/pdf), Sensors, 12, pp. 4534–4558 (2012).
[49] Velodyne lidar (http://www.velodyne.com/lidar/lidar.aspx). Retrieved 2 May 2011.
Wang, J., Zhang, J., Roncat, A., Kuenzer, C., Wagner, W., 2009: "Regularizing method for the determination of the backscatter cross section in lidar data," J. Opt. Soc. Am. A, Vol. 26, No. 5 (May 2009), pp. 1071–1079.

External links

The USGS Center for LIDAR Information Coordination and Knowledge (CLICK) (http://lidar.cr.usgs.gov/) - A website intended to "facilitate data access, user coordination and education of lidar remote sensing for scientific needs."
How Lidar Works (http://airborne1.com/how_lidar.html)
LiDAR Research Group (LRG), University of Heidelberg (http://lrg.uni-hd.de)
Forecast 3D Lidar Scanner, manufactured by Autonomous Solutions, Inc. (http://autonomoussolutions.com/forecast-3d-laser-system/): 3D point cloud and obstacle detection
Free online lidar data viewer (http://www.lidarview.com/)

Remote sensing

[Figure: Synthetic aperture radar image of Death Valley colored using polarimetry.]

Remote sensing is the acquisition of information about an object or phenomenon without making physical contact with the object. In modern usage, the term generally refers to the use of aerial sensor technologies to detect and classify objects on Earth (both on the surface, and in the atmosphere and oceans) by means of propagated signals (e.g. electromagnetic radiation emitted from aircraft or satellites).[1][2]

Overview

[Video: This video is about how Landsat was used to identify areas of conservation in the Democratic Republic of the Congo, and how it was used to help map an area called MLW in the north.]

There are two main types of remote sensing: passive remote sensing and active remote sensing.[3] Passive sensors detect natural radiation that is emitted or reflected by the object or surrounding areas. Reflected sunlight is the most common source of radiation measured by passive sensors. Examples of passive remote sensors include film photography, infrared, charge-coupled devices, and radiometers. Active collection, on the other hand, emits energy in order to scan objects and areas, whereupon a sensor then detects and measures the radiation that is reflected or backscattered from the target. RADAR and LiDAR are examples of active remote sensing where the time delay between emission and return is measured, establishing the location, speed and direction of an object.

Remote sensing makes it possible to collect data on dangerous or inaccessible areas. Remote sensing applications include monitoring deforestation in areas such as the Amazon Basin, glacial features in Arctic and Antarctic regions, and depth sounding of coastal and ocean depths. Military collection during the Cold War made use of stand-off collection of data about dangerous border areas. Remote sensing also replaces costly and slow data collection on the ground, ensuring in the process that areas or objects are not disturbed.

Orbital platforms collect and transmit data from different parts of the electromagnetic spectrum, which, in conjunction with larger-scale aerial or ground-based sensing and analysis, provides researchers with enough information to monitor trends such as El Niño and other natural long- and short-term phenomena. Other uses include different areas of the earth sciences such as natural resource management, agricultural fields such as land usage and conservation, and national security and overhead, ground-based and stand-off collection on border areas.[4]

Data acquisition techniques

The basis for multispectral collection and analysis is that of examined areas or objects that reflect or emit radiation that stand out from surrounding areas.

Applications of remote sensing data

Conventional radar is mostly associated with aerial traffic control, early warning, and certain large-scale meteorological data. Doppler radar is used by local law enforcement's monitoring of speed limits and in enhanced meteorological collection such as wind speed and direction within weather systems. Other types of active collection include plasmas in the ionosphere.
Interferometric synthetic aperture radar is used to produce precisedigital elevation models of large scale terrain (See RADARSAT, TerraSAR-X, Magellan). Laser and radar altimeters on satellites have provided a wide range of data. By measuring the bulges of watercaused by gravity, they map features on the seafloor to a resolution of a mile or so. By measuring the height andwavelength of ocean waves, the altimeters measure wind speeds and direction, and surface ocean currents anddirections. Light detection and ranging (LIDAR) is well known in examples of weapon ranging, laser illuminated homing ofprojectiles. LIDAR is used to detect and measure the concentration of various chemicals in the atmosphere, whileairborne LIDAR can be used to measure heights of objects and features on the ground more accurately than with 14. Remote sensing 11radar technology. Vegetation remote sensing is a principal application of LIDAR. Radiometers and photometers are the most common instrument in use, collecting reflected and emitted radiationin a wide range of frequencies. The most common are visible and infrared sensors, followed by microwave,gamma ray and rarely, ultraviolet. They may also be used to detect the emission spectra of various chemicals,providing data on chemical concentrations in the atmosphere. Stereographic pairs of aerial photographs have often been used to make topographic maps by imagery and terrainanalysts in trafficability and highway departments for potential routes. Simultaneous multi-spectral platforms such as Landsat have been in use since the 70s. These thematic mapperstake images in multiple wavelengths of electro-magnetic radiation (multi-spectral) and are usually found on Earthobservation satellites, including (for example) the Landsat program or the IKONOS satellite. Maps of land coverand land use from thematic mapping can be used to prospect for minerals, detect or monitor land usage,deforestation, and examine the health of indigenous plants and crops, including entire farming regions or forests. Hyperspectral imaging produces an image where each pixel has full spectral information with imaging narrowspectral bands over a contiguous spectral range. Hyperspectral imagers are used in various applications includingmineralogy, biology, defence, and environmental measurements. Within the scope of the combat against desertification, remote sensing allows to follow-up and monitor risk areasin the long term, to determine desertification factors, to support decision-makers in defining relevant measures ofenvironmental management, and to assess their impacts.[5]Geodetic Overhead geodetic collection was first used in aerial submarine detection and gravitational data used in militarymaps. This data revealed minute perturbations in the Earths gravitational field (geodesy) that may be used todetermine changes in the mass distribution of the Earth, which in turn may be used for geological studies.Acoustic and near-acoustic Sonar: passive sonar, listening for the sound made by another object (a vessel, a whale etc.); active sonar,emitting pulses of sounds and listening for echoes, used for detecting, ranging and measurements of underwaterobjects and terrain. Seismograms taken at different locations can locate and measure earthquakes (after they occur) by comparing therelative intensity and precise timings.To coordinate a series of large-scale observations, most sensing systems depend on the following: platform location,what time it is, and the rotation and orientation of the sensor. 
High-end instruments now often use positional information from satellite navigation systems. The rotation and orientation are often provided within a degree or two with electronic compasses. Compasses can measure not just azimuth (i.e. degrees to magnetic north), but also altitude (degrees above the horizon), since the magnetic field curves into the Earth at different angles at different latitudes. More exact orientations require gyroscopically aided orientation, periodically realigned by different methods including navigation from stars or known benchmarks.

Data processing

Generally speaking, remote sensing works on the principle of the inverse problem. While the object or phenomenon of interest (the state) may not be directly measured, there exists some other variable that can be detected and measured (the observation), which may be related to the object of interest through the use of a data-derived computer model. The common analogy given to describe this is trying to determine the type of animal from its footprints. For example, while it is impossible to directly measure temperatures in the upper atmosphere, it is possible to measure the spectral emissions from a known chemical species (such as carbon dioxide) in that region. The frequency of the emission may then be related to the temperature in that region via various thermodynamic relations.

The quality of remote sensing data consists of its spatial, spectral, radiometric and temporal resolutions.

Spatial resolution: The size of a pixel that is recorded in a raster image; typically pixels may correspond to square areas ranging in side length from 1 to 1,000 metres (3.3 to 3,300 ft).

Spectral resolution: The wavelength width of the different frequency bands recorded; usually, this is related to the number of frequency bands recorded by the platform. Current Landsat collection is that of seven bands, including several in the infrared spectrum, ranging from a spectral resolution of 0.07 to 2.1 µm. The Hyperion sensor on Earth Observing-1 resolves 220 bands from 0.4 to 2.5 µm, with a spectral resolution of 0.10 to 0.11 µm per band.

Radiometric resolution: The number of different intensities of radiation the sensor is able to distinguish. Typically, this ranges from 8 to 14 bits, corresponding to 256 levels of the gray scale and up to 16,384 intensities or "shades" of colour, in each band. It also depends on the instrument noise.

Temporal resolution: The frequency of flyovers by the satellite or plane; it is only relevant in time-series studies or those requiring an averaged or mosaic image, as in deforestation monitoring. This was first used by the intelligence community, where repeated coverage revealed changes in infrastructure, the deployment of units or the modification/introduction of equipment. Cloud cover over a given area or object makes it necessary to repeat the collection of said location.

In order to create sensor-based maps, most remote sensing systems expect to extrapolate sensor data in relation to a reference point, including distances between known points on the ground. This depends on the type of sensor used. For example, in conventional photographs, distances are accurate in the center of the image, with the distortion of measurements increasing the farther you get from the center. Another factor is that the platen against which the film is pressed can cause severe errors when photographs are used to measure ground distances.
The step in whichthis problem is resolved is called georeferencing, and involves computer-aided matching up of points in the image(typically 30 or more points per image) which is extrapolated with the use of an established benchmark, "warping"the image to produce accurate spatial data. As of the early 1990s, most satellite images are sold fully georeferenced.In addition, images may need to be radiometrically and atmospherically corrected.Radiometric correctionGives a scale to the pixel values, e. g. the monochromatic scale of 0 to 255 will be converted to actual radiancevalues.Topographic correctionIn the rugged mountains, as a result of terrain, each pixel which receives the effective illumination variesconsiderably different. In remote sensing image, the pixel on the shady slope receives weak illumination andhas a low radiance value, in contrast, the pixel on the sunny slope receives strong illumination and has a highradiance value. For the same objects, the pixel radiance values on the shady slope must be very different fromthat on the sunny slope. Different objects may have the similar radiance values. This spectral informationchanges seriously affected remote sensing image information extraction accuracy in the mountainous area. Itbecame the main obstacle to further application on remote sensing images. The purpose of topographiccorrection is to eliminate this effect, recovery true reflectivity or radiance of objects in horizontal conditions. Itis the premise of quantitative remote sensing application.Atmospheric correction 16. Remote sensing 13eliminates atmospheric haze by rescaling each frequency band so that its minimum value (usually realised inwater bodies) corresponds to a pixel value of 0. The digitizing of data also make possible to manipulate thedata by changing gray-scale values.Interpretation is the critical process of making sense of the data. The first application was that of aerial photographiccollection which used the following process; spatial measurement through the use of a light table in bothconventional single or stereographic coverage, added skills such as the use of photogrammetry, the use ofphotomosaics, repeat coverage, Making use of objects known dimensions in order to detect modifications. ImageAnalysis is the recently developed automated computer-aided application which is in increasing use.Object-Based Image Analysis (OBIA) is a sub-discipline of GIScience devoted to partitioning remote sensing (RS)imagery into meaningful image-objects, and assessing their characteristics through spatial, spectral and temporalscale.Old data from remote sensing is often valuable because it may provide the only long-term data for a large extent ofgeography. At the same time, the data is often complex to interpret, and bulky to store. Modern systems tend to storethe data digitally, often with lossless compression. The difficulty with this approach is that the data is fragile, theformat may be archaic, and the data may be easy to falsify. One of the best systems for archiving data series is ascomputer-generated machine-readable ultrafiche, usually in typefonts such as OCR-B, or as digitized half-toneimages. Ultrafiches survive well in standard libraries, with lifetimes of several centuries. They can be created,copied, filed and retrieved by automated systems. 
They are about as compact as archival magnetic media, and yet can be read by human beings with minimal, standardized equipment.

Data processing levels

To facilitate the discussion of data processing in practice, several processing levels were first defined in 1986 by NASA as part of its Earth Observing System[6] and steadily adopted since then, both internally at NASA (e.g.[7]) and elsewhere (e.g.[8]); these definitions are:

Level 0: Reconstructed, unprocessed instrument and payload data at full resolution, with any and all communications artifacts (e.g. synchronization frames, communications headers, duplicate data) removed.
Level 1a: Reconstructed, unprocessed instrument data at full resolution, time-referenced, and annotated with ancillary information, including radiometric and geometric calibration coefficients and georeferencing parameters (e.g. platform ephemeris) computed and appended but not applied to the Level 0 data (or, if applied, in a manner such that Level 0 is fully recoverable from Level 1a data).
Level 1b: Level 1a data that have been processed to sensor units (e.g. radar backscatter cross section, brightness temperature, etc.); not all instruments have Level 1b data; Level 0 data is not recoverable from Level 1b data.
Level 2: Derived geophysical variables (e.g. ocean wave height, soil moisture, ice concentration) at the same resolution and location as Level 1 source data.
Level 3: Variables mapped on uniform space-time grid scales, usually with some completeness and consistency (e.g. missing points interpolated, complete regions mosaicked together from multiple orbits, etc.).
Level 4: Model output or results from analyses of lower-level data (i.e. variables that were not measured by the instruments but instead are derived from these measurements).

A Level 1 data record is the most fundamental (i.e. highest reversible level) data record that has significant scientific utility, and is the foundation upon which all subsequent data sets are produced. Level 2 is the first level that is directly usable for most scientific applications; its value is much greater than the lower levels. Level 2 data sets tend to be less voluminous than Level 1 data because they have been reduced temporally, spatially, or spectrally. Level 3 data sets are generally smaller than lower-level data sets and thus can be dealt with without incurring a great deal of data-handling overhead. These data tend to be generally more useful for many applications. The regular spatial and temporal organization of Level 3 datasets makes it feasible to readily combine data from different sources.

History

[Figure: The TR-1 reconnaissance/surveillance aircraft.]

[Figure: The 2001 Mars Odyssey used spectrometers and imagers to hunt for evidence of past or present water and volcanic activity on Mars.]

The modern discipline of remote sensing arose with the development of flight. The balloonist G. Tournachon (alias Nadar) made photographs of Paris from his balloon in 1858.
Messenger pigeons,kites, rockets and unmanned balloons were also used for early images.With the exception of balloons, these first, individual images were notparticularly useful for map making or for scientific purposes.[citationneeded]Systematic aerial photography was developed for military surveillanceand reconnaissance purposes beginning in World War I and reaching aclimax during the Cold War with the use of modified combat aircraftsuch as the P-51, P-38, RB-66 and the F-4C, or specifically designedcollection platforms such as the U2/TR-1, SR-71, A-5[9] and the OV-1series both in overhead and stand-off collection. A more recentdevelopment is that of increasingly smaller sensor pods such as thoseused by law enforcement and the military, in both manned andunmanned platforms. The advantage of this approach is that thisrequires minimal modification to a given airframe. Later imagingtechnologies would include Infra-red, conventional, doppler andsynthetic aperture radar.[citation needed]The development of artificial satellites in the latter half of the 20thcentury allowed remote sensing to progress to a global scale as of theend of the Cold War. Instrumentation aboard various Earth observingand weather satellites such as Landsat, the Nimbus and more recent missions such as RADARSAT and UARSprovided global measurements of various data for civil, research, and military purposes. Space probes to otherplanets have also provided the opportunity to conduct remote sensing studies in extraterrestrial environments,synthetic aperture radar aboard the Magellan spacecraft provided detailed topographic maps of Venus, whileinstruments aboard SOHO allowed studies to be performed on the Sun and the solar wind, just to name a fewexamples.[citation needed]Recent developments include, beginning in the 1960s and 1970s with the development of image processing ofsatellite imagery. Several research groups in Silicon Valley including NASA Ames Research Center, GTE and ESLInc. developed Fourier transform techniques leading to the first notable enhancement of imagery data.[citation needed]Training and EducationRemote Sensing has a growing relevance in the modern information society. It represents a key technology as part ofthe aerospace industry and bears increasing economic relevance new sensors e.g. TerraSAR-X & RapidEye aredeveloped constantly and the demand for skilled labour is increasing steadily. Furthermore, remote sensingexceedingly influences everyday life, ranging from weather forecasts to reports on climate change or naturaldisasters. As an example, 80% of the German students use the services of Google Earth; in 2006 alone the softwarewas downloaded 100 million times. But studies has shown that only a fraction of them know more about the datathey are working with.[10]There exists a huge knowledge gap between the application and the understanding ofsatellite images. Remote sensing only plays a tangential role in schools, regardless of the political claims tostrengthen the support for teaching on the subject.[11]A lot of the computer software explicitly developed for school 18. Remote sensing 15lessons has not yet been implemented due to its complexity. Thereby, the subject is either not at all integrated intothe curriculum or does not pass the step of an interpretation of analogue images. 
In fact, the subject of remotesensing requires a consolidation of physics and mathematics as well as competences in the fields of media andmethods apart from the mere visual interpretation of satellite images.Many teachers have great interest in the subject remote sensing, being motivated to integrate this topic intoteaching, provided that the curriculum is considered. In many cases, this encouragement fails because of confusinginformation.[12]In order to integrate remote sensing in a sustainable manner organizations like the EGU or digitalearth[13]encourages the development of learning modules and learning portals (e.g. FIS Remote Sensing inSchool Lessons[14]or Landmap Spatial Discovery[15]) promoting media and method qualifications as well asindependent working.Remote Sensing InternshipsOne effective way to teach students the many applications of remote sensing is through an internship opportunity.NASA DEVELOP is one such opportunity, where students work in teams with science advisor(s) and/or partner(s) tomeet some practical need in the community. Working through NASA, this program give students experience inreal-world remote sensing applications, as well as providing valuable training. (More information can be found onthe NASA DEVELOP website[16]Another such program is SERVIR. Supporting by the US Agency of International Development (USAID) andNASA, SERVIR provides students with valuable hands-on experience with remote sensing, while providingend-users with the resources to better respond to a whole host of issues. More information can be found on theSERVIR website[17]Remote Sensing softwareRemote Sensing data is processed and analyzed with computer software, known as a remote sensing application. Alarge number of proprietary and open source applications exist to process remote sensing data. Remote SensingSoftware packages include: TNTmips from MicroImages, PCI Geomatica made by PCI Geomatics, the leading remote sensing software package in Canada, IDRISI from Clark Labs, Image Analyst from Intergraph, and RemoteView made by Overwatch Textron Systems. Dragon/ips is one of the oldest remote sensing packages still available, and is in some cases free.Open source remote sensing software includes: OSSIM, Opticks (software), Orfeo toolbox Others mixing remote sensing and GIS capabilities are: GRASS GIS, ILWIS, QGIS, and TerraLook.According to an NOAA Sponsored Research by Global Marketing Insights, Inc. the most used applications amongAsian academic groups involved in remote sensing are as follows: ERDAS 36% (ERDAS IMAGINE 25% &ERMapper 11%); ESRI 30%; ITT Visual Information Solutions ENVI 17%; MapInfo 17%.Among Western Academic respondents as follows: ESRI 39%, ERDAS IMAGINE 27%, MapInfo 9%, AutoDesk7%, ITT Visual Information Solutions ENVI 17%. 19. Remote sensing 16References[4] http://hurricanes.nasa.gov/earth-sun/technology/remote_sensing.html[5] Begni Grard, Escadafal Richard, Fontannaz Delphine and Hong-Nga Nguyen Anne-Thrse, 2005. Remote sensing: a tool to monitor andassess desertification. Les dossiers thmatiques du CSFD. Issue 2. 44 pp. (http://www.csf-desertification.org/index.php/bibliotheque/publications-csfd/doc_details/28-begni-gerard-et-al-2005-remote-sensing)[6] NASA (1986), Report of the EOS data panel, Earth Observing System, Data and Information System, Data Panel Report, Vol. IIa., NASATechnical Memorandum 87777, June 1986, 62 pp. Available at (http://hdl.handle.net/2060/19860021622)[7] C. L. Parkinson, A. Ward, M. D. King (Eds.) 
Earth Science Reference Handbook A Guide to NASAs Earth Science Program and EarthObserving Satellite Missions, National Aeronautics and Space Administration Washington, D. C. Available at (http://eospso.gsfc.nasa.gov/ftp_docs/2006ReferenceHandbook.pdf)[8] GRAS-SAF (2009), Product User Manual, GRAS Satellite Application Facility, Version 1.2.1, 31 March 2009. Available at (http://www.grassaf.org/general-documents/products/grassaf_pum_v121.pdf)[9] http://toolserver.org/%7Edispenser/cgi-bin/dab_solver.py?page=Remote_sensing&editintro=Template:Disambiguation_needed/editintro&client=Template:Dn[10] Ditter, R., Haspel, M., Jahn, M., Kollar, I., Siegmund, A., Viehrig, K., Volz, D., Siegmund, A. (2012) Geospatial technologies in school theoretical concept and practical implementation in K-12 schools. In: International Journal of Data Mining, Modelling and Management(IJDMMM): FutureGIS: Riding the Wave of a Growing Geospatial Technology Literate Society; Vol. X[11] Stork, E.J., Sakamoto, S.O., and Cowan, R.M. (1999) "The integration of science explorations through the use of earth images in middleschool curriculum", Proc. IEEE Trans. Geosci. Remote Sensing 37, 18011817[12] Bednarz, S.W. and Whisenant, S.E. (2000) Mission geography: linking national geography standards, innovative technologies and NASA,Proc. IGARSS, Honolulu, USA, 27802782 8[13] http://www.digital-earth.eu/[14] http://www.fis.uni-bonn.de/node/92[15] http://www.landmap.ac.uk[16] http://develop.larc.nasa.gov/[17] http://www.nasa.gov/mission_pages/servir/index.htmlFurther reading Campbell, J. B. (2002). Introduction to remote sensing (3rd ed.). The Guilford Press. ISBN1-57230-640-8. Jensen, J. R. (2007). Remote sensing of the environment: an Earth resource perspective (2nd ed.). Prentice Hall.ISBN0-13-188950-8. Jensen, J. R. (2005). Digital Image Processing: a Remote Sensing Perspective (3rd ed.). Prentice Hall. Lentile, Leigh B.; Holden, Zachary A.; Smith, Alistair M. S.; Falkowski, Michael J.; Hudak, Andrew T.; Morgan,Penelope; Lewis, Sarah A.; Gessler, Paul E.; Benson, Nate C.. Remote sensing techniques to assess active firecharacteristics and post-fire effects (http://www.treesearch.fs.fed.us/pubs/24613). International Journal ofWildland Fire. 2006;3(15):319345. Lillesand, T. M.; R. W. Kiefer, and J. W. Chipman (2003). Remote sensing and image interpretation (5th ed.).Wiley. ISBN0-471-15227-7. Richards, J. A.; and X. Jia (2006). Remote sensing digital image analysis: an introduction (4th ed.). Springer.ISBN3-540-25128-6. US Army FM series. US Army military intelligence museum, FT Huachuca, AZ Datla, R.U.; Rice, J.P.; Lykke, K.R.; Johnson, B.C.; Butler, J.J.; Xiong, X.. Best practice guidelines for pre-launchcharacterization and calibration of instruments for passive optical remote sensing (http://nvlpubs.nist.gov/nistpubs/jres/116/2/V116.N02.A05.pdf). Journal of Research of the National Institute of Standards andTechnology. 2011 (MarchApril);116(2):612646. 20. Remote sensing 17External links Remote Sensing (http://www.dmoz.org/Science/Earth_Sciences/Geomatics/Remote_Sensing/) at the OpenDirectory Project Free space images (mosaics) (http://www.terraexploro.com/terralibrary/index.php/space-images) International Journal of Advanced Remote Sensing and GIS (http://www.cloud-journals.com/journal-of-remote-sensing-n-gis-open-access.html)LightThe Sun is Earths primary source of light. 
About 44% of the Sun's electromagnetic radiation that reaches the ground is in the visible light range.

Visible light (commonly referred to simply as light) is electromagnetic radiation that is visible to the human eye, and is responsible for the sense of sight.[1] Visible light has a wavelength in the range of about 380 nanometres (nm), or 380×10⁻⁹ m, to about 740 nanometres, between the invisible infrared, with longer wavelengths, and the invisible ultraviolet, with shorter wavelengths.

Primary properties of visible light are intensity, propagation direction, frequency or wavelength spectrum, and polarisation, while its speed in a vacuum, 299,792,458 metres per second, is one of the fundamental constants of nature. Visible light, as with all types of electromagnetic radiation (EMR), is experimentally found to always move at this speed in a vacuum.

In common with all types of EMR, visible light is emitted and absorbed in tiny "packets" called photons, and exhibits properties of both waves and particles. This property is referred to as wave-particle duality. The study of light, known as optics, is an important research area in modern physics.

In physics, the term light sometimes refers to electromagnetic radiation of any wavelength, whether visible or not.[2][3] This article focuses on visible light. See the electromagnetic radiation article for the general term.

Speed of visible light

The speed of light in a vacuum is defined to be exactly 299,792,458 m/s (approximately 186,282 miles per second). The fixed value of the speed of light in SI units results from the fact that the metre is now defined in terms of the speed of light. All forms of electromagnetic radiation are believed to move at exactly this same speed in vacuum.

Different physicists have attempted to measure the speed of light throughout history. Galileo attempted to measure the speed of light in the seventeenth century. An early experiment to measure the speed of light was conducted by Ole Rømer, a Danish physicist, in 1676. Using a telescope, Rømer observed the motions of Jupiter and one of its moons, Io. Noting discrepancies in the apparent period of Io's orbit, he calculated that light takes about 22 minutes to traverse the diameter of Earth's orbit.[4] However, the size of that orbit was not known at the time. If Rømer had known the diameter of Earth's orbit, he would have calculated a speed of 227,000,000 m/s.

Another, more accurate, measurement of the speed of light was performed in Europe by Hippolyte Fizeau in 1849. Fizeau directed a beam of light at a mirror several kilometres away. A rotating cog wheel was placed in the path of the light beam as it travelled from the source, to the mirror, and then returned to its origin. Fizeau found that at a certain rate of rotation, the beam would pass through one gap in the wheel on the way out and the next gap on the way back. Knowing the distance to the mirror, the number of teeth on the wheel, and the rate of rotation, Fizeau was able to calculate the speed of light as 313,000,000 m/s.
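The arithmetic behind Fizeau's method is easy to reproduce. The short Python sketch below uses illustrative figures; the distance, tooth count, and rotation rate are assumptions chosen to be consistent with the 313,000,000 m/s value quoted above, not Fizeau's recorded laboratory values:

    # Fizeau-style estimate: light goes out through one gap and returns through
    # the next gap, so the wheel advances one tooth-plus-gap period per round trip.
    mirror_distance_m = 8_633        # assumed one-way distance to the mirror
    teeth = 720                      # assumed number of teeth on the wheel
    rotation_rate_hz = 25.2          # assumed rate at which the return beam clears the next gap

    round_trip_m = 2 * mirror_distance_m
    advance_time_s = 1 / (teeth * rotation_rate_hz)   # time to advance one tooth period
    speed_of_light = round_trip_m / advance_time_s
    print(f"estimated c = {speed_of_light:.3e} m/s")  # roughly 3.13e8 m/s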
Léon Foucault used an experiment with rotating mirrors to obtain a value of 298,000,000 m/s in 1862. Albert A. Michelson conducted experiments on the speed of light from 1877 until his death in 1931. He refined Foucault's methods in 1926, using improved rotating mirrors to measure the time it took light to make a round trip from Mt. Wilson to Mt. San Antonio in California. The precise measurements yielded a speed of 299,796,000 m/s.

The effective velocity of light in various transparent substances containing ordinary matter is less than in vacuum. For example, the speed of light in water is about 3/4 of that in vacuum. However, the slowing process in matter is thought to result not from actual slowing of particles of light, but rather from their absorption and re-emission by charged particles in matter.

As an extreme example of the nature of light-slowing in matter, two independent teams of physicists were able to bring light to a "complete standstill" by passing it through a Bose-Einstein condensate of the element rubidium, one team at Harvard University and the Rowland Institute for Science in Cambridge, Mass., and the other at the Harvard-Smithsonian Center for Astrophysics, also in Cambridge.[5] However, the popular description of light being "stopped" in these experiments refers only to light being stored in the excited states of atoms, then re-emitted at an arbitrary later time, as stimulated by a second laser pulse. During the time it had "stopped" it had ceased to be light.

Electromagnetic spectrum and visible light

Electromagnetic spectrum with light highlighted.

Generally, EM radiation, or EMR (the designation "radiation" excludes static electric, magnetic, and near fields), is classified by wavelength into radio, microwave, infrared, the visible region that we perceive as light, ultraviolet, X-rays and gamma rays.

The behaviour of EMR depends on its wavelength. Higher frequencies have shorter wavelengths, and lower frequencies have longer wavelengths. When EMR interacts with single atoms and molecules, its behaviour depends on the amount of energy per quantum it carries.

EMR in the visible light region consists of quanta (called photons) that are at the lower end of the energies that are capable of causing electronic excitation within molecules, which leads to changes in the bonding or chemistry of the molecule. At the lower end of the visible light spectrum, EMR becomes invisible to humans (infrared) because its photons no longer have enough individual energy to cause a lasting molecular change (a change in conformation) in the visual molecule retinal in the human retina. This change triggers the sensation of vision.

There exist animals that are sensitive to various types of infrared, but not by means of quantum absorption. Infrared sensing in snakes depends on a kind of natural thermal imaging, in which tiny packets of cellular water are raised in temperature by the infrared radiation. EMR in this range causes molecular vibration and heating effects, and this is how living animals detect it.

Above the range of visible light, ultraviolet light becomes invisible to humans, mostly because it is absorbed by the tissues of the eye and in particular the lens. Furthermore, the rods and cones located at the back of the human eye cannot detect the short ultraviolet wavelengths, and are in fact damaged by ultraviolet rays, a condition known as snow blindness.[6] Many animals with eyes that do not require lenses (such as insects and shrimp) are able to directly detect ultraviolet visually, by quantum photon-absorption mechanisms, in much the same chemical way that normal humans detect visible light.

Optics

The study of light and the interaction of light and matter is termed optics. The observation and study of optical phenomena such as rainbows and the aurora borealis offer many clues as to the nature of light.

Refraction

An example of refraction of light.
The straw appears bent because of the refraction of light as it enters the liquid from the air.

A cloud illuminated by sunlight.

Refraction is the bending of light rays when passing through a surface between one transparent material and another. It is described by Snell's law:

    n1 sin θ1 = n2 sin θ2

where θ1 is the angle between the ray and the surface normal in the first medium, θ2 is the angle between the ray and the surface normal in the second medium, and n1 and n2 are the indices of refraction, with n = 1 in a vacuum and n > 1 in a transparent substance.

When a beam of light crosses the boundary between a vacuum and another medium, or between two different media, the wavelength of the light changes, but the frequency remains constant. If the beam of light is not orthogonal (or rather normal) to the boundary, the change in wavelength results in a change in the direction of the beam. This change of direction is known as refraction.

The refractive quality of lenses is frequently used to manipulate light in order to change the apparent size of images. Magnifying glasses, spectacles, contact lenses, microscopes and refracting telescopes are all examples of this manipulation.
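As a worked illustration of Snell's law, the Python sketch below computes the refracted angle for a ray passing from air into water; the indices and the 30-degree angle of incidence are example values chosen for the illustration, not figures from the text:

    import math

    # Snell's law: n1 * sin(theta1) = n2 * sin(theta2)
    n1, n2 = 1.00, 1.33          # air, water (approximate indices of refraction)
    theta1 = math.radians(30.0)  # angle of incidence, measured from the surface normal

    theta2 = math.asin(n1 * math.sin(theta1) / n2)
    print(f"refracted angle = {math.degrees(theta2):.1f} degrees")   # about 22 degrees

    # The wavelength shortens in the denser medium while the frequency stays the same:
    wavelength_vacuum_nm = 589.0
    print(f"wavelength in water = {wavelength_vacuum_nm / n2:.0f} nm")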
Light sources

There are many sources of light. The most common light sources are thermal: a body at a given temperature emits a characteristic spectrum of black-body radiation. A simple thermal source is sunlight: the radiation emitted by the chromosphere of the Sun at around 6,000 Kelvin peaks in the visible region of the electromagnetic spectrum when plotted in wavelength units,[7] and roughly 44% of sunlight energy that reaches the ground is visible.[8] Another example is incandescent light bulbs, which emit only around 10% of their energy as visible light and the remainder as infrared. A common thermal light source in history is the glowing solid particles in flames, but these also emit most of their radiation in the infrared, and only a fraction in the visible spectrum.

The peak of the blackbody spectrum is in the deep infrared, at about 10 micrometre wavelength, for relatively cool objects like human beings. As the temperature increases, the peak shifts to shorter wavelengths, producing first a red glow, then a white one, and finally a blue-white colour as the peak moves out of the visible part of the spectrum and into the ultraviolet. These colours can be seen when metal is heated to "red hot" or "white hot". Blue-white thermal emission is not often seen, except in stars (the commonly seen pure-blue colour in a gas flame or a welder's torch is in fact due to molecular emission, notably by CH radicals emitting a wavelength band around 425 nm, and is not seen in stars or pure thermal radiation).

Atoms emit and absorb light at characteristic energies. This produces "emission lines" in the spectrum of each atom. Emission can be spontaneous, as in light-emitting diodes, gas discharge lamps (such as neon lamps and neon signs, mercury-vapor lamps, etc.), and flames (light from the hot gas itself; so, for example, sodium in a gas flame emits characteristic yellow light). Emission can also be stimulated, as in a laser or a microwave maser.

Deceleration of a free charged particle, such as an electron, can produce visible radiation: cyclotron radiation, synchrotron radiation, and bremsstrahlung radiation are all examples of this. Particles moving through a medium faster than the speed of light in that medium can produce visible Cherenkov radiation.

Certain chemicals produce visible radiation by chemiluminescence. In living things, this process is called bioluminescence. For example, fireflies produce light by this means, and boats moving through water can disturb plankton which produce a glowing wake.

Certain substances produce light when they are illuminated by more energetic radiation, a process known as fluorescence. Some substances emit light slowly after excitation by more energetic radiation. This is known as phosphorescence. Phosphorescent materials can also be excited by bombarding them with subatomic particles. Cathodoluminescence is one example. This mechanism is used in cathode ray tube television sets and computer monitors.

A city illuminated by artificial lighting.

Certain other mechanisms can produce light: bioluminescence, Cherenkov radiation, electroluminescence, scintillation, sonoluminescence, and triboluminescence. When the concept of light is intended to include very-high-energy photons (gamma rays), additional generation mechanisms include particle-antiparticle annihilation and radioactive decay.

Units and measures

Light is measured with two main alternative sets of units: radiometry consists of measurements of light power at all wavelengths, while photometry measures light with wavelength weighted with respect to a standardised model of human brightness perception. Photometry is useful, for example, to quantify illumination (lighting) intended for human use. The SI units for both systems are summarised in the following tables.

SI radiometry units:[9]
- Radiant energy (Qe[10]): joule (J); dimension M·L²·T⁻²; energy.
- Radiant flux (Φe[10]): watt (W); M·L²·T⁻³; radiant energy per unit time, also called radiant power.
- Spectral power (Φeλ[10][11]): watt per metre (W·m⁻¹); M·L·T⁻³; radiant power per wavelength.
- Radiant intensity (Ie): watt per steradian (W·sr⁻¹); M·L²·T⁻³; power per unit solid angle.
- Spectral intensity (Ieλ[11]): watt per steradian per metre (W·sr⁻¹·m⁻¹); M·L·T⁻³; radiant intensity per wavelength.
- Radiance (Le): watt per steradian per square metre (W·sr⁻¹·m⁻²); M·T⁻³; power per unit solid angle per unit projected source area; confusingly called "intensity" in some other fields of study.
- Spectral radiance (Leλ[11] or Leν[12]): watt per steradian per metre³ (W·sr⁻¹·m⁻³) or watt per steradian per square metre per hertz (W·sr⁻¹·m⁻²·Hz⁻¹); M·L⁻¹·T⁻³ or M·T⁻²; commonly measured in W·sr⁻¹·m⁻²·nm⁻¹ with surface area and either wavelength or frequency.
- Irradiance (Ee[10]): watt per square metre (W·m⁻²); M·T⁻³; power incident on a surface, also called radiant flux density; sometimes confusingly called "intensity" as well.
- Spectral irradiance (Eeλ[11] or Eeν[12]): watt per metre³ (W·m⁻³) or watt per square metre per hertz (W·m⁻²·Hz⁻¹); M·L⁻¹·T⁻³ or M·T⁻²; commonly measured in W·m⁻²·nm⁻¹ or in 10⁻²² W·m⁻²·Hz⁻¹, known as the solar flux unit.[13]
- Radiant exitance / radiant emittance (Me[10]): watt per square metre (W·m⁻²); M·T⁻³; power emitted from a surface.
- Spectral radiant exitance / spectral radiant emittance (Meλ[11] or Meν[12]): watt per metre³ (W·m⁻³) or watt per square metre per hertz (W·m⁻²·Hz⁻¹); M·L⁻¹·T⁻³ or M·T⁻²; power emitted from a surface per wavelength or frequency.
- Radiosity (Je or Jeλ[11]): watt per square metre (W·m⁻²); M·T⁻³; emitted plus reflected power leaving a surface.
- Radiant exposure (He): joule per square metre (J·m⁻²); M·T⁻².
- Radiant energy density (ωe): joule per metre³ (J·m⁻³); M·L⁻¹·T⁻².

See also: SI, Radiometry, Photometry (compare).
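To connect the radiometric quantities above with the photometric quantities listed next, the following Python sketch converts radiant flux to luminous flux for monochromatic light, using the 683 lm/W constant that defines the candela at 555 nm. The luminosity-function values below are rounded approximations, and the helper function is purely illustrative:

    # Luminous flux of a monochromatic source:
    # flux_lm = 683 lm/W * V(lambda) * radiant flux in watts,
    # where V is the standard (photopic) luminosity function, V(555 nm) = 1.
    KCD = 683.0                                 # lm/W at 555 nm
    V = {555: 1.00, 532: 0.88, 650: 0.11}       # approximate V(lambda) values

    def luminous_flux_lm(radiant_flux_w, wavelength_nm):
        return KCD * V[wavelength_nm] * radiant_flux_w

    # A 1 mW green (532 nm) beam appears several times brighter than a 1 mW red (650 nm) one:
    for wl in (532, 650):
        print(wl, "nm:", round(luminous_flux_lm(1e-3, wl), 3), "lm")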
SI photometry units:[14]
- Luminous energy (Qv[15]): lumen second (lm·s); dimension T·J;[16] units are sometimes called talbots.
- Luminous flux (Φv[15]): lumen (= cd·sr) (lm); J;[16] also called luminous power.
- Luminous intensity (Iv): candela (= lm/sr) (cd); J;[16] an SI base unit; luminous flux per unit solid angle.
- Luminance (Lv): candela per square metre (cd/m²); L⁻²·J; units are sometimes called nits.
- Illuminance (Ev): lux (= lm/m²) (lx); L⁻²·J; used for light incident on a surface.
- Luminous emittance (Mv): lux (= lm/m²) (lx); L⁻²·J; used for light emitted from a surface.
- Luminous exposure (Hv): lux second (lx·s); L⁻²·T·J.
- Luminous energy density (ωv): lumen second per metre³ (lm·s·m⁻³); L⁻³·T·J.
- Luminous efficacy:[15] lumen per watt (lm/W); M⁻¹·L⁻²·T³·J; ratio of luminous flux to radiant flux.
- Luminous efficiency (V): dimensionless (1); also called luminous coefficient.

See also: SI, Photometry, Radiometry (compare).

The photometry units are different from most systems of physical units in that they take into account how the human eye responds to light. The cone cells in the human eye are of three types which respond differently across the visible spectrum, and the cumulative response peaks at a wavelength of around 555 nm. Therefore, two sources of light which produce the same intensity (W/m²) of visible light do not necessarily appear equally bright. The photometry units are designed to take this into account, and therefore are a better representation of how "bright" a light appears to be than raw intensity. They relate to raw power by a quantity called luminous efficacy, and are used for purposes like determining how to best achieve sufficient illumination for various tasks in indoor and outdoor settings. The illumination measured by a photocell sensor does not necessarily correspond to what is perceived by the human eye, and without filters, which may be costly, photocells and charge-coupled devices (CCD) tend to respond to some infrared, ultraviolet or both.

Light pressure

Light exerts physical pressure on objects in its path, a phenomenon which can be deduced from Maxwell's equations, but can be more easily explained by the particle nature of light: photons strike and transfer their momentum. Light pressure is equal to the power of the light beam divided by c, the speed of light. Due to the magnitude of c, the effect of light pressure is negligible for everyday objects. For example, a one-milliwatt laser pointer exerts a force of about 3.3 piconewtons on the object being illuminated; thus, one could lift a U.S. penny with laser pointers, but doing so would require about 30 billion 1-mW laser pointers.[17] However, in nanometer-scale applications such as NEMS, the effect of light pressure is more significant, and exploiting light pressure to drive NEMS mechanisms and to flip nanometer-scale physical switches in integrated circuits is an active area of research.[18]

At larger scales, light pressure can cause asteroids to spin faster,[19] acting on their irregular shapes as on the vanes of a windmill. The possibility of making solar sails that would accelerate spaceships in space is also under investigation.[20][21]

Although the motion of the Crookes radiometer was originally attributed to light pressure, this interpretation is incorrect; the characteristic Crookes rotation is the result of a partial vacuum.[22] This should not be confused with the Nichols radiometer, in which the (slight) motion caused by torque (though not enough for full rotation against friction) is directly caused by light pressure.[23]
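The figures above are easy to check with a few lines of arithmetic. The Python sketch below assumes a fully absorbed beam (a perfectly reflecting surface would feel twice the force) and an assumed penny mass of 2.5 g; under those assumptions the required number of pointers comes out in the billions, the same order of magnitude as the figure quoted above:

    c = 299_792_458.0          # speed of light, m/s
    beam_power_w = 1e-3        # a 1 mW laser pointer

    force_n = beam_power_w / c                                   # radiation force on an absorbing target
    print(f"force from one pointer: {force_n * 1e12:.1f} pN")    # about 3.3 pN

    penny_mass_kg = 2.5e-3     # assumed mass of a U.S. penny
    weight_n = penny_mass_kg * 9.81
    print(f"pointers needed to support the penny: {weight_n / force_n:.1e}")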
Historical theories about light, in chronological order

Classical Greece and Hellenism

In the fifth century BC, Empedocles postulated that everything was composed of four elements: fire, air, earth and water. He believed that Aphrodite made the human eye out of the four elements and that she lit the fire in the eye which shone out from the eye, making sight possible. If this were true, then one could see during the night just as well as during the day, so Empedocles postulated an interaction between rays from the eyes and rays from a source such as the sun.

In about 300 BC, Euclid wrote Optica, in which he studied the properties of light. Euclid postulated that light travelled in straight lines and he described the laws of reflection and studied them mathematically. He questioned that sight is the result of a beam from the eye, for he asks how one sees the stars immediately, if one closes one's eyes, then opens them at night. Of course, if the beam from the eye travels infinitely fast this is not a problem.

In 55 BC, Lucretius, a Roman who carried on the ideas of earlier Greek atomists, wrote: "The light & heat of the sun; these are composed of minute atoms which, when they are shoved off, lose no time in shooting right across the interspace of air in the direction imparted by the shove." (On the Nature of the Universe) Despite being similar to later particle theories, Lucretius's views were not generally accepted.

Ptolemy (c. 2nd century) wrote about the refraction of light in his book Optics.[24]

Classical India

In ancient India, the Hindu schools of Samkhya and Vaisheshika, from around the early centuries CE, developed theories on light. According to the Samkhya school, light is one of the five fundamental "subtle" elements (tanmatra) out of which emerge the gross elements. The atomicity of these elements is not specifically mentioned and it appears that they were actually taken to be continuous.

On the other hand, the Vaisheshika school gives an atomic theory of the physical world on the non-atomic ground of ether, space and time. (See Indian atomism.) The basic atoms are those of earth (prthivi), water (pani), fire (agni), and air (vayu). Light rays are taken to be a stream of high-velocity tejas (fire) atoms. The particles of light can exhibit different characteristics depending on the speed and the arrangements of the tejas atoms.[citation needed] The Vishnu Purana refers to sunlight as "the seven rays of the sun".[citation needed]

The Indian Buddhists, such as Dignāga in the 5th century and Dharmakirti in the 7th century, developed a type of atomism that is a philosophy about reality being composed of atomic entities that are momentary flashes of light or energy. They viewed light as being an atomic entity equivalent to energy.[citation needed]

Descartes

René Descartes (1596-1650) held that light was a mechanical property of the luminous body, rejecting the "forms" of Ibn al-Haytham and Witelo as well as the "species" of Bacon, Grosseteste, and Kepler.[25] In 1637 he published a theory of the refraction of light that assumed, incorrectly, that light travelled faster in a denser medium than in a less dense medium.
Descartes arrived at this conclusion by analogy with the behaviour of sound waves.[citation needed] Although Descartes was incorrect about the relative speeds, he was correct in assuming that light behaved like a wave and in concluding that refraction could be explained by the speed of light in different media.

Descartes was not the first to use such mechanical analogies, but because he clearly asserted that light is only a mechanical property of the luminous body and the transmitting medium, Descartes' theory of light is regarded as the start of modern physical optics.[25]

Particle theory

Pierre Gassendi.

Pierre Gassendi (1592-1655), an atomist, proposed a particle theory of light which was published posthumously in the 1660s. Isaac Newton studied Gassendi's work at an early age, and preferred his view to Descartes' theory of the plenum. He stated in his Hypothesis of Light of 1675 that light was composed of corpuscles (particles of matter) which were emitted in all directions from a source. One of Newton's arguments against the wave nature of light was that waves were known to bend around obstacles, while light travelled only in straight lines. He did, however, explain the phenomenon of the diffraction of light (which had been observed by Francesco Grimaldi) by allowing that a light particle could create a localised wave in the aether.

Newton's theory could be used to predict the reflection of light, but could only explain refraction by incorrectly assuming that light accelerated upon entering a denser medium because the gravitational pull was greater. Newton published the final version of his theory in his Opticks of 1704. His reputation helped the particle theory of light to hold sway during the 18th century. The particle theory of light led Laplace to argue that a body could be so massive that light could not escape from it. In other words, it would become what is now called a black hole. Laplace withdrew his suggestion later, after a wave theory of light became firmly established as the model for light (as has been explained, neither a particle theory nor a wave theory is fully correct). A translation of Newton's essay on light appears in The Large Scale Structure of Space-Time, by Stephen Hawking and George F. R. Ellis.

Wave theory

To explain the origin of colors, Robert Hooke (1635-1703) developed a "pulse theory" and compared the spreading of light to that of waves in water in his 1665 Micrographia ("Observation XI"). In 1672 Hooke suggested that light's vibrations could be perpendicular to the direction of propagation. Christiaan Huygens (1629-1695) worked out a mathematical wave theory of light in 1678, and published it in his Treatise on Light in 1690. He proposed that light was emitted in all directions as a series of waves in a medium called the luminiferous ether. As waves are not affected by gravity, it was assumed that they slowed down upon entering a denser medium.[26]

Thomas Young's sketch of the two-slit experiment showing the diffraction of light. Young's experiments supported the theory that light consists of waves.

The wave theory predicted that light waves could interfere with each other like sound waves (as noted around 1800 by Thomas Young), and that light could be polarised, if it were a transverse wave. Young showed by means of a diffraction experiment that light behaved as waves. He also proposed that different colours were caused by different wavelengths of light, and explained colour vision in terms of three-coloured receptors in the eye. Another supporter of the wave theory was Leonhard Euler.
He argued in Nova theoria lucis et colorum (1746) that diffraction could more easily be explained by a wave theory.

Later, Augustin-Jean Fresnel independently worked out his own wave theory of light, and presented it to the Académie des Sciences in 1817. Siméon Denis Poisson added to Fresnel's mathematical work to produce a convincing argument in favour of the wave theory, helping to overturn Newton's corpuscular theory. By the year 1821, Fresnel was able to show via mathematical methods that polarisation could be explained only by the wave theory of light and only if light was entirely transverse, with no longitudinal vibration whatsoever.

The weakness of the wave theory was that light waves, like sound waves, would need a medium for transmission. The existence of the hypothetical substance luminiferous aether proposed by Huygens in 1678 was cast into strong doubt in the late nineteenth century by the Michelson-Morley experiment.

Newton's corpuscular theory implied that light would travel faster in a denser medium, while the wave theory of Huygens and others implied the opposite. At that time, the speed of light could not be measured accurately enough to decide which theory was correct. The first to make a sufficiently accurate measurement was Léon Foucault, in 1850.[27] His result supported the wave theory, and the classical particle theory was finally abandoned, only to partly re-emerge in the 20th century.

Quantum theory

In 1900 Max Planck, attempting to explain black-body radiation, suggested that although light was a wave, these waves could gain or lose energy only in finite amounts related to their frequency. Planck called these "lumps" of light energy "quanta" (from a Latin word for "how much"). In 1905, Albert Einstein used the idea of light quanta to explain the photoelectric effect, and suggested that these light quanta had a "real" existence. In 1923 Arthur Holly Compton showed that the wavelength shift seen when low-intensity X-rays scattered from electrons (so-called Compton scattering) could be explained by a particle theory of X-rays, but not a wave theory. In 1926 Gilbert N. Lewis named these light quanta particles "photons".

Eventually the modern theory of quantum mechanics came to picture light as (in some sense) both a particle and a wave, and (in another sense) as a phenomenon which is neither a particle nor a wave (which actually are macroscopic phenomena, such as baseballs or ocean waves). Instead, modern physics sees light as something that can be described sometimes with mathematics appropriate to one type of macroscopic metaphor (particles), and sometimes another macroscopic metaphor (water waves), but is actually something that cannot be fully imagined. As in the case of radio waves and the X-rays involved in Compton scattering, physicists have noted that electromagnetic radiation tends to behave more like a classical wave at lower frequencies, but more like a classical particle at higher frequencies, yet never completely loses all qualities of one or the other.
Visible light, which occupies a middle ground in frequency, can easily be shown in experiments to be describable using either a wave or particle model, or sometimes both.

Electromagnetic theory as explanation for all types of visible light and all EM radiation

A linearly polarised light wave frozen in time, showing the two oscillating components of light: an electric field and a magnetic field perpendicular to each other and to the direction of motion (a transverse wave).

In 1845, Michael Faraday discovered that the plane of polarisation of linearly polarised light is rotated when the light rays travel along the magnetic field direction in the presence of a transparent dielectric, an effect now known as Faraday rotation.[] This was the first evidence that light was related to electromagnetism. In 1846 he speculated that light might be some form of disturbance propagating along magnetic field lines.[] Faraday proposed in 1847 that light was a high-frequency electromagnetic vibration, which could propagate even in the absence of a medium such as the ether.

Faraday's work inspired James Clerk Maxwell to study electromagnetic radiation and light. Maxwell discovered that self-propagating electromagnetic waves would travel through space at a constant speed, which happened to be equal to the previously measured speed of light. From this, Maxwell concluded that light was a form of electromagnetic radiation: he first stated this result in 1862 in On Physical Lines of Force. In 1873, he published A Treatise on Electricity and Magnetism, which contained a full mathematical description of the behaviour of electric and magnetic fields, still known as Maxwell's equations. Soon after, Heinrich Hertz confirmed Maxwell's theory experimentally by generating and detecting radio waves in the laboratory, and demonstrating that these waves behaved exactly like visible light, exhibiting properties such as reflection, refraction, diffraction, and interference. Maxwell's theory and Hertz's experiments led directly to the development of modern radio, radar, television, electromagnetic imaging, and wireless communications.

In the quantum theory, photons are seen as wave packets of the waves described in the classical theory of Maxwell. The quantum theory was needed to explain effects even with visual light that Maxwell's classical theory could not (such as spectral lines).

Notes

[1] CIE (1987). International Lighting Vocabulary (http://www.cie.co.at/publ/abst/17-4-89.html). Number 17.4. CIE, 4th edition. ISBN 978-3-900734-07-7. By the International Lighting Vocabulary, the definition of light is: "Any radiation capable of causing a visual sensation directly."
[6] http://www.yorku.ca/eye/lambdas.htm
[7] http://thulescientific.com/LYNCH%20&%20Soffer%20OPN%201999.pdf
[9] Standards organizations recommend that radiometric quantities should be denoted with a suffix "e" (for "energetic") to avoid confusion with photometric or photon quantities.
[10] Alternative symbols sometimes seen: W or E for radiant energy, P or F for radiant flux, I for irradiance, W for radiant emittance.
[11] Spectral quantities given per unit wavelength are denoted with suffix "λ" (Greek) to indicate a spectral concentration.
Spectral functions of wavelength are indicated by "(λ)" in parentheses instead, for example in spectral transmittance, reflectance and responsivity.
[12] Spectral quantities given per unit frequency are denoted with suffix "ν" (Greek), not to be confused with the suffix "v" (for "visual") indicating a photometric quantity.
[13] NOAA / Space Weather Prediction Center (http://www.swpc.noaa.gov/forecast_verification/F10.html) includes a definition of the solar flux unit (SFU).
[14] Standards organizations recommend that photometric quantities be denoted with a suffix "v" (for "visual") to avoid confusion with radiometric or photon quantities.
[15] Alternative symbols sometimes seen: W for luminous energy, P or F for luminous flux, and K for luminous efficacy.
[16] "J" here is the symbol for the dimension of luminous intensity, not the symbol for the unit joules.
[18] See, for example, nano-opto-mechanical systems research at Yale University (http://www.eng.yale.edu/tanglab/research.htm).
[22] P. Lebedev, Untersuchungen über die Druckkräfte des Lichtes, Ann. Phys. 6, 433 (1901).
[25] Theories of Light, from Descartes to Newton, A. I. Sabra, CUP Archive, 1981, p. 48. ISBN 0-521-28436-8, ISBN 978-0-521-28436-3.
[26] Fokko Jan Dijksterhuis, Lenses and Waves: Christiaan Huygens and the Mathematical Science of Optics in the 17th Century (http://books.google.com/books?id=cPFevyomPUIC), Kluwer Academic Publishers, 2004, ISBN 1-4020-2697-8.

References

Radar

A long-range radar antenna, known as ALTAIR, used to detect and track space objects in conjunction with ABM testing at the Ronald Reagan Test Site on Kwajalein Atoll.

Israeli military radar is typical of the type of radar used for air traffic control. The antenna rotates at a steady rate, sweeping the local airspace with a narrow vertical fan-shaped beam, to detect aircraft at all altitudes.

Radar is an object detection system which uses radio waves to determine the range, altitude, direction, or speed of objects. It can be used to detect aircraft, ships, spacecraft, guided missiles, motor vehicles, weather formations, and terrain. The radar dish or antenna transmits pulses of radio waves or microwaves which bounce off any object in their path. The object returns a tiny part of the wave's energy to a dish or antenna which is usually located at the same site as the transmitter.

Radar was secretly developed by several nations before and during World War II. The term RADAR was coined in 1940 by the United States Navy as an acronym for RAdio Detection And Ranging.[1] The term radar has since entered English and other languages as the common noun radar, losing all capitalization.

The modern uses of radar are highly diverse, including air traffic control, radar astronomy, air-defense systems, antimissile systems; marine radars to locate landmarks and other ships; aircraft anticollision systems; ocean surveillance systems, outer space surveillance and rendezvous systems; meteorological precipitation monitoring; altimetry and flight control systems; guided missile target locating systems; and ground-penetrating radar for geological observations. High tech radar systems are associated with digital si