Remote sensing is the acquisition of information about an object or phenomenon without making physical contact with it, in contrast to on-site observation; the term is applied especially to acquiring information about the Earth. Remote sensing is used in numerous fields, including geography, land surveying and most Earth science disciplines (for example, hydrology, ecology, meteorology, oceanography, glaciology and geology); it also has military, intelligence, commercial, economic, planning, and humanitarian applications.
In current usage, the term "remote sensing" generally refers to the use of satellite- or aircraft-based sensor technologies to detect and classify objects on Earth, including on the surface and in the atmosphere and oceans, based on propagated signals (e.g. electromagnetic radiation). It may be split into "active" remote sensing (in which a signal is emitted by a satellite or aircraft and its reflection by the object is detected by the sensor) and "passive" remote sensing (in which the reflection of sunlight is detected by the sensor).
Passive sensors gather radiation that is emitted or reflected by the object or surrounding areas. Reflected sunlight is the most common source of radiation measured by passive sensors. Examples of passive remote sensors include film photography, infrared sensors, charge-coupled devices, and radiometers. Active collection, on the other hand, emits energy in order to scan objects and areas, whereupon a sensor detects and measures the radiation that is reflected or backscattered from the target. RADAR and LiDAR are examples of active remote sensing in which the time delay between emission and return is measured, establishing the location, speed and direction of an object.
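The ranging principle behind active sensors such as radar and LiDAR can be sketched very simply: the round-trip travel time of the emitted pulse, multiplied by the speed of light and halved, gives the distance to the target. A minimal illustration (the function name is ours, not from any particular instrument's software):

```python
# Speed of light in vacuum, in metres per second.
C = 299_792_458.0

def range_from_delay(round_trip_seconds: float) -> float:
    """Distance to a target from the round-trip time of an emitted pulse.

    The pulse travels to the target and back, so the one-way distance
    is half the total path length covered at the speed of light.
    """
    return C * round_trip_seconds / 2.0

# A pulse returning after 1 microsecond indicates a target ~150 m away.
print(range_from_delay(1e-6))  # ~149.9 (metres)
```

Repeated measurements of this range over time are what let such systems derive the speed and direction of a moving target.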
Remote sensing makes it possible to collect data on dangerous or inaccessible areas. Remote sensing applications include monitoring deforestation in areas such as the Amazon Basin, glacial features in Arctic and Antarctic regions, and depth sounding of coastal and ocean waters. Military collection during the Cold War made use of stand-off collection of data about dangerous border areas. Remote sensing also replaces costly and slow data collection on the ground, ensuring in the process that areas or objects are not disturbed.
Orbital platforms collect and transmit data from different parts of the electromagnetic spectrum, which, in conjunction with larger-scale aerial or ground-based sensing and analysis, provides researchers with enough information to monitor trends such as El Niño and other long- and short-term natural phenomena. Other uses include various areas of the Earth sciences such as natural resource management, agricultural applications such as land use and conservation, and national security, including overhead, ground-based and stand-off collection on border areas.
The basis for multispectral collection and analysis is that examined areas or objects reflect or emit radiation that stands out from that of surrounding areas. For a summary of major remote sensing satellite systems see the overview table.
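A widely used illustration of this principle is the normalized difference vegetation index (NDVI), which exploits the fact that healthy vegetation reflects strongly in the near-infrared band while absorbing red light; the band values below are illustrative, not from any real scene:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized difference vegetation index from near-infrared and
    red reflectance values (each typically between 0 and 1).

    Vegetated surfaces yield values well above zero because they
    reflect far more near-infrared than red light.
    """
    return (nir - red) / (nir + red)

# Dense vegetation: high near-infrared, low red reflectance.
print(ndvi(nir=0.5, red=0.1))   # ~0.67
# Bare soil: the two bands are much closer together.
print(ndvi(nir=0.3, red=0.25))  # ~0.09
```

Computed per pixel across a multispectral image, such an index makes vegetated areas stand out numerically from their surroundings, which is exactly the contrast multispectral analysis relies on.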
To coordinate a series of large-scale observations, most sensing systems depend on two things: platform location and the orientation of the sensor. High-end instruments now often use positional information from satellite navigation systems. Rotation and orientation are often provided to within a degree or two by electronic compasses. Compasses can measure not just azimuth (i.e. degrees to magnetic north) but also altitude (degrees above the horizon), since the magnetic field dips into the Earth at different angles at different latitudes. More exact orientations require gyroscopically aided orientation, periodically realigned by methods such as navigation from stars or known benchmarks.
The quality of remote sensing data consists of its spatial, spectral, radiometric and temporal resolutions.
In order to create sensor-based maps, most remote sensing systems need to relate sensor data to reference points, including known distances between points on the ground. How this is done depends on the type of sensor used. For example, in conventional photographs distances are accurate in the center of the image, with distortion of measurements increasing the farther a point lies from the center. Another source of error is the platen against which the film is pressed, which can cause severe errors when photographs are used to measure ground distances. The step in which this problem is resolved is called georeferencing: points in the image (typically 30 or more per image) are matched, with computer assistance, against established benchmarks, and the image is "warped" to produce accurate spatial data. Since the early 1990s, most satellite images have been sold fully georeferenced.
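Operational georeferencing uses dozens of control points and higher-order warps, but the core idea can be sketched with the simplest case: fitting an affine transform from three image-to-ground control points (pure Python; the point values and function names are illustrative only):

```python
def fit_affine(image_pts, ground_pts):
    """Fit a 2-D affine transform mapping image (col, row) coordinates
    to ground (x, y) coordinates from exactly three control points.

    Solves x = a*col + b*row + t separately for the ground x and y
    coordinates using Cramer's rule on the resulting 3x3 system.
    """
    def solve3(rows, rhs):
        def det(m):  # determinant of a 3x3 matrix given as rows
            return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                    - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                    + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
        d = det(rows)
        coeffs = []
        for i in range(3):
            # Replace column i with the right-hand side (Cramer's rule).
            m = [list(r) for r in rows]
            for j in range(3):
                m[j][i] = rhs[j]
            coeffs.append(det(m) / d)
        return coeffs

    rows = [(c, r, 1.0) for c, r in image_pts]
    ax = solve3(rows, [g[0] for g in ground_pts])
    ay = solve3(rows, [g[1] for g in ground_pts])
    return lambda c, r: (ax[0] * c + ax[1] * r + ax[2],
                         ay[0] * c + ay[1] * r + ay[2])

# Control points for a 10 m pixel grid anchored at ground (100, 200);
# ground y decreases as image row increases.
warp = fit_affine([(0, 0), (1, 0), (0, 1)],
                  [(100, 200), (110, 200), (100, 190)])
print(warp(5, 5))  # (150.0, 150.0)
```

With many more control points the same model is fitted by least squares, and residuals at the control points indicate how well the "warp" has corrected the image's distortions.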
In addition, images may need to be radiometrically and atmospherically corrected.
Interpretation is the critical process of making sense of the data. The first application was aerial photographic collection, which used the following process: spatial measurement with a light table, in both conventional single and stereographic coverage; added skills such as photogrammetry; the use of photomosaics and repeat coverage; and making use of objects' known dimensions in order to detect modifications. Image analysis is the more recently developed, automated, computer-aided application, which is in increasingly common use.
Object-Based Image Analysis (OBIA) is a sub-discipline of GIScience devoted to partitioning remote sensing (RS) imagery into meaningful image-objects, and assessing their characteristics at spatial, spectral and temporal scales.
Old data from remote sensing is often valuable because it may provide the only long-term data for a large geographic area. At the same time, the data is often complex to interpret, and bulky to store. Modern systems tend to store the data digitally, often with lossless compression. The difficulty with this approach is that the data is fragile, the format may be archaic, and the data may be easy to falsify. One of the best systems for archiving data series is computer-generated, machine-readable ultrafiche, usually in typefonts such as OCR-B, or as digitized half-tone images. Ultrafiches survive well in standard libraries, with lifetimes of several centuries. They can be created, copied, filed and retrieved by automated systems. They are about as compact as archival magnetic media, and yet can be read by human beings with minimal, standardized equipment.
Generally speaking, remote sensing works on the principle of the inverse problem: while the object or phenomenon of interest (the state) may not be directly measured, there exists some other variable that can be detected and measured (the observation) which may be related to the object of interest through a calculation. The common analogy given to describe this is trying to determine the type of animal from its footprints. For example, while it is impossible to directly measure temperatures in the upper atmosphere, it is possible to measure the spectral emissions from a known chemical species (such as carbon dioxide) in that region. The frequency of the emissions may then be related via thermodynamics to the temperature in that region.
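The temperature-sounding example can be made concrete with the Planck function: the measured spectral radiance at a known emission frequency can be inverted to a "brightness temperature". This is a minimal sketch of that inversion only; real retrievals must also account for the vertical weighting function of the emission and for instrument effects:

```python
import math

# Physical constants, SI units.
H = 6.62607015e-34   # Planck constant, J s
K = 1.380649e-23     # Boltzmann constant, J/K
C = 299792458.0      # speed of light, m/s

def planck_radiance(freq_hz: float, temp_k: float) -> float:
    """Spectral radiance B(nu, T) of a blackbody, W m^-2 sr^-1 Hz^-1."""
    return (2 * H * freq_hz**3 / C**2) / (math.exp(H * freq_hz / (K * temp_k)) - 1)

def brightness_temperature(freq_hz: float, radiance: float) -> float:
    """Invert the Planck function: the temperature a blackbody would
    need in order to emit the observed radiance at this frequency."""
    return (H * freq_hz / K) / math.log(1 + 2 * H * freq_hz**3 / (C**2 * radiance))

# Round trip near the 15-micrometre CO2 emission band (~20 THz):
nu = 2.0e13
observed = planck_radiance(nu, 250.0)
print(brightness_temperature(nu, observed))  # ~250.0 K
```

The round trip illustrates the inverse-problem structure: the state (temperature) is never measured directly, only a related observable (radiance) from which it is calculated.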
To facilitate the discussion of data processing in practice, several processing "levels" were first defined in 1986 by NASA as part of its Earth Observing System and have been steadily adopted since then, both internally at NASA and elsewhere; these definitions are:
Level 0: Reconstructed, unprocessed instrument and payload data at full resolution, with any and all communications artifacts (e.g., synchronization frames, communications headers, duplicate data) removed.
Level 1a: Reconstructed, unprocessed instrument data at full resolution, time-referenced, and annotated with ancillary information, including radiometric and geometric calibration coefficients and georeferencing parameters (e.g., platform ephemeris) computed and appended but not applied to the Level 0 data (or, if applied, in a manner such that Level 0 data are fully recoverable from Level 1a data).
Level 1b: Level 1a data that have been processed to sensor units (e.g., radar backscatter cross section, brightness temperature); not all instruments have Level 1b data; Level 0 data are not recoverable from Level 1b data.
Level 2: Derived geophysical variables (e.g., ocean wave height, soil moisture, ice concentration) at the same resolution and location as the Level 1 source data.
Level 3: Variables mapped on uniform space-time grid scales, usually with some completeness and consistency (e.g., missing points interpolated, complete regions mosaicked together from multiple orbits).
Level 4: Model output or results from analyses of lower-level data (i.e., variables that were not measured by the instruments but are instead derived from those measurements).
A Level 1 data record is the most fundamental (i.e., highest reversible level) data record that has significant scientific utility, and is the foundation upon which all subsequent data sets are produced. Level 2 is the first level that is directly usable for most scientific applications; its value is much greater than that of the lower levels. Level 2 data sets tend to be less voluminous than Level 1 data because they have been reduced temporally, spatially, or spectrally. Level 3 data sets are generally smaller than lower-level data sets and thus can be dealt with without incurring a great deal of data-handling overhead; they tend to be more useful for many applications. The regular spatial and temporal organization of Level 3 data sets makes it feasible to readily combine data from different sources.
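The step from Level 2 to Level 3 can be sketched as binning irregular point observations onto a uniform grid. This is a deliberate simplification: operational Level 3 processors also handle quality flags, weighting, and gap filling, none of which appear here:

```python
from collections import defaultdict

def grid_level3(observations, cell_deg=1.0):
    """Average Level 2 point observations (lat, lon, value) into a
    uniform latitude/longitude grid.

    Each grid cell is keyed by its lower-left corner; the stored
    value is the mean of all observations falling inside the cell.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for lat, lon, value in observations:
        cell = (int(lat // cell_deg) * cell_deg,
                int(lon // cell_deg) * cell_deg)
        sums[cell] += value
        counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

# Three observations: two fall in the (50, 8) cell and are averaged.
obs = [(50.2, 8.7, 300.0), (50.9, 8.1, 302.0), (51.5, 8.4, 290.0)]
print(grid_level3(obs))  # {(50.0, 8.0): 301.0, (51.0, 8.0): 290.0}
```

The regular grid that results is what makes Level 3 products from different instruments easy to compare and combine cell by cell.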
While these processing levels are particularly suitable for typical satellite data processing pipelines, other data level vocabularies have been defined and may be appropriate for more heterogeneous workflows.
The modern discipline of remote sensing arose with the development of flight. The balloonist G. Tournachon (alias Nadar) made photographs of Paris from his balloon in 1858. Messenger pigeons, kites, rockets and unmanned balloons were also used for early images. With the exception of balloons, these first, individual images were not particularly useful for map making or for scientific purposes.
Systematic aerial photography was developed for military surveillance and reconnaissance purposes beginning in World War I and reaching a climax during the Cold War with the use of modified combat aircraft such as the P-51, P-38, RB-66 and the F-4C, or specifically designed collection platforms such as the U2/TR-1, SR-71, A-5 and the OV-1 series, in both overhead and stand-off collection. A more recent development is that of increasingly small sensor pods such as those used by law enforcement and the military, on both manned and unmanned platforms. The advantage of this approach is that it requires minimal modification to a given airframe. Later imaging technologies would include infrared, conventional, Doppler and synthetic aperture radar.
The development of artificial satellites in the latter half of the 20th century allowed remote sensing to progress to a global scale by the end of the Cold War. Instrumentation aboard various Earth observing and weather satellites such as Landsat, Nimbus and more recent missions such as RADARSAT and UARS provided global measurements of various data for civil, research, and military purposes. Space probes to other planets have also provided the opportunity to conduct remote sensing studies in extraterrestrial environments; synthetic aperture radar aboard the Magellan spacecraft provided detailed topographic maps of Venus, while instruments aboard SOHO allowed studies of the Sun and the solar wind, to name just a few examples.
Further developments began in the 1960s and 1970s with the emergence of image processing of satellite imagery. Several research groups in Silicon Valley, including NASA Ames Research Center, GTE, and ESL Inc., developed Fourier transform techniques leading to the first notable enhancement of imagery data. In 1999 the first commercial satellite collecting very high resolution imagery, IKONOS, was launched.
Remote sensing has a growing relevance in the modern information society. It represents a key technology within the aerospace industry and bears increasing economic relevance: new sensors such as TerraSAR-X and RapidEye are developed constantly, and the demand for skilled labour is increasing steadily. Furthermore, remote sensing increasingly influences everyday life, ranging from weather forecasts to reports on climate change or natural disasters. As an example, 80% of German students use the services of Google Earth; in 2006 alone the software was downloaded 100 million times. But studies have shown that only a fraction of them know much about the data they are working with. There exists a huge knowledge gap between the application and the understanding of satellite images. Remote sensing plays only a tangential role in schools, regardless of political claims to strengthen support for teaching the subject. Much of the computer software explicitly developed for school lessons has not yet been adopted, due to its complexity. As a result, the subject is either not integrated into the curriculum at all, or does not progress beyond the interpretation of analogue images. In fact, the subject of remote sensing requires a consolidation of physics and mathematics, as well as competences in the fields of media and methods, beyond mere visual interpretation of satellite images.
Many teachers have great interest in the subject of remote sensing and are motivated to integrate it into their teaching, provided that it fits the curriculum. In many cases, this encouragement fails because of confusing information. In order to integrate remote sensing in a sustainable manner, organizations like the EGU or Digital Earth encourage the development of learning modules and learning portals. Examples include FIS – Remote Sensing in School Lessons, Geospektiv, Ychange, and Spatial Discovery, which promote media and method qualifications as well as independent learning.
Remote sensing data are processed and analyzed with computer software, known as a remote sensing application. A large number of proprietary and open source applications exist to process remote sensing data. Remote sensing software packages include:
Open source remote sensing software includes:
According to an NOAA-sponsored study by Global Marketing Insights, Inc., the most-used applications among Asian academic groups involved in remote sensing are as follows: ERDAS 36% (ERDAS IMAGINE 25% and ERMapper 11%); ESRI 30%; ITT Visual Information Solutions ENVI 17%; MapInfo 17%.
Among Western academic respondents, the results were as follows: ESRI 39%, ERDAS IMAGINE 27%, MapInfo 9%, and AutoDesk 7%.
In education, those who want to go beyond simply looking at satellite image print-outs either use general remote sensing software (e.g. QGIS), Google Earth, StoryMaps, or software and web apps developed specifically for education (e.g. desktop: LeoWorks; online: BLIF).
The first satellite UV/VIS observations simply showed pictures of the Earth's surface and atmosphere. Such satellite images are still used, for instance as input for numerical weather forecasting. The first spectroscopic UV/VIS observations started in 1970 on board the US research satellite Nimbus 4. These measurements (backscatter ultraviolet, BUV, later also called solar BUV, SBUV) operated in nadir geometry, i.e., they measured the solar light reflected from the ground or scattered from the atmosphere. Like the Dobson instruments, the BUV/SBUV instruments measure the intensity in different narrow spectral intervals. The intention of these BUV/SBUV observations was to derive information on the atmospheric O3 profile, since the penetration depth into the atmosphere depends strongly on wavelength. For example, light at the shortest wavelengths has only 'seen' the highest parts of the O3 layer, whereas the longest wavelengths have seen the total column. While in principle the BUV/SBUV measurements worked well, they suffered from instrumental instabilities.
A major breakthrough in UV/VIS satellite remote sensing of the atmosphere took place in 1979 with the launch of the Total Ozone Mapping Spectrometer (TOMS) on Nimbus 7. TOMS is similar to the BUV/SBUV instrument but measures light at longer wavelengths. It is therefore sensitive only to the total O3 column (instead of the O3 profile). However, compared to the BUV/SBUV instruments, the TOMS instruments were much more stable. The TOMS instrument on board Nimbus 7 yielded the longest global data set on O3 to date (1979–1992). This period in particular includes the discovery of the ozone hole. Several further TOMS instruments have been launched on other satellites. Like the Dobson instruments on the ground, they yield very accurate O3 total column densities using a relatively simple method. Apart from events of very strong atmospheric SO2 absorption and high aerosol loading, however, they are sensitive only to O3.
Since April 1995 the first DOAS instrument has been operating from space. The Global Ozone Monitoring Experiment (GOME) was launched on the European research satellite ERS-2. Like SBUV and TOMS, GOME is a nadir-viewing instrument; unlike its predecessors, it covers a large spectral range (240–790 nm) at a total of 4096 wavelengths, arranged in four 'channels' with a spectral resolution between 0.2 and 0.4 nm. Its normal ground pixel size is 320 × 40 km2. Global coverage is achieved after three days. For O3 profile measurements the intensities at short wavelengths are observed (as with the BUV/SBUV instruments). For the determination of the total atmospheric O3 column the intensities at longer wavelengths are used (as with the TOMS instruments). In contrast to the limited spectral information of the BUV/SBUV and TOMS instruments, GOME spectra yield a surplus of spectral information. By applying the DOAS method to these measurements it is thus possible to retrieve a large variety of atmospheric trace gases, the majority of which are very weak absorbers (O3, NO2, BrO, OClO, HCHO, H2O, O2, O4, SO2). In addition, other quantities such as aerosol absorptions, the ground albedo or indices characterising the solar cycle can be analysed. Because of the high sensitivity of GOME it is in particular possible to measure various tropospheric trace gases (NO2, BrO, HCHO, H2O, SO2). A further important advantage is that the GOME spectra can be analysed with respect to a spectrum of direct sunlight, which contains no atmospheric absorptions. Therefore, in contrast to ground-based DOAS measurements, the DOAS analysis of GOME spectra yields total atmospheric column densities rather than the difference between two atmospheric spectra.
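At the heart of such absorption retrievals is the Beer-Lambert law: comparing the measured intensity against a reference spectrum, and dividing the resulting optical depth by the absorber's cross section, yields a column density. The single-wavelength sketch below is a strong simplification of DOAS, which fits many wavelengths at once and first separates broadband extinction; the numerical values are purely illustrative:

```python
import math

def column_density(i_reference: float, i_measured: float,
                   cross_section_cm2: float) -> float:
    """Slant column density (molecules per cm^2) of a single absorber,
    from the Beer-Lambert law I = I0 * exp(-sigma * N):
    N = ln(I0 / I) / sigma.
    """
    return math.log(i_reference / i_measured) / cross_section_cm2

# Illustrative numbers: an O3 UV absorption cross section of ~1e-19 cm^2
# and a 3% intensity dip relative to the reference spectrum.
n = column_density(i_reference=1.00, i_measured=0.97,
                   cross_section_cm2=1e-19)
print(n)  # ~3.05e17 molecules per cm^2
```

This also shows why an extraterrestrial reference spectrum matters: when the reference contains no atmospheric absorption at all, the retrieved quantity is a total column rather than a difference between two partially absorbed spectra.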
In March 2002 a second DOAS satellite instrument, the SCanning Imaging Absorption SpectroMeter for Atmospheric ChartographY (SCIAMACHY), was launched on board the European research satellite Envisat. Compared to GOME it measures over a wider wavelength range (240–2380 nm), which also includes the absorption of several greenhouse gases (CO2, CH4, N2O) and CO in the IR. It also operated in additional viewing modes (nadir, limb, occultation), which allows stratospheric trace gas profiles to be derived. Another advantage is that the ground pixel size for the nadir viewing mode was significantly reduced, to 30 × 60 km2 (in a special mode even to 15 × 30 km2). This is especially important for the observation of tropospheric trace gases, because of the strong spatial gradients occurring for such species. The first tropospheric results of SCIAMACHY showed that it was now possible to identify pollution plumes of individual cities and other big sources.