Until the 20th century, one had to explore a place in person in order to map it. With the advent of aerial photography in World War I (WWI), followed by stereo-optic techniques developed in WWII, mapping became possible from afar. Radar first opened the senses to the invisible. Satellites enabled worldwide observation, and new technologies added a wide array of sensors across the electromagnetic and acoustic spectrums. Now even variations in the earth's gravity, as measured from satellites, are used to plot the contours of the ocean floors.
An aerial photo like this one of San Francisco might be useful for a bird's-eye-view map, but it's not a good source for making modern maps. One needs an orthophoto: an aerial photograph that has been geometrically corrected so that it has a uniform scale, like a map. The photo should be taken from directly overhead, like the one to the right of an intersection in Orange County. The image is then adjusted for the distortion caused by the lens and the tilt of the camera, as well as for the topographic variations in the scene. Once corrected, the orthophoto becomes fully functional when each pixel is registered to its exact geographic position.
Image resolution is expressed as the length on the ground that each square pixel represents. In this aerial photo of Orange County, each pixel represents 6 inches on the ground. One might think that an analog photo would have higher resolution, but that's not necessarily so with today's digital cameras. You can see a comparison of a digital and an analog aerial photo here.
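To make the arithmetic concrete, here is a small sketch of what a 6-inch ground sample distance implies for coverage. The 10,000 x 10,000 pixel frame size is a hypothetical example, not a figure from the article:

```python
# Ground footprint of a digital aerial image at 6 inches per pixel,
# assuming a hypothetical 10,000 x 10,000 pixel frame.
GSD_INCHES = 6            # ground sample distance: 6 inches per pixel
PIXELS_PER_SIDE = 10_000  # hypothetical image dimension

side_feet = PIXELS_PER_SIDE * GSD_INCHES / 12.0   # inches -> feet: 5,000 ft
side_miles = side_feet / 5280.0                   # feet -> miles

print(f"One image covers a square {side_feet:.0f} ft "
      f"({side_miles:.2f} mi) on a side.")
```

At that resolution a single frame covers just under a square mile, which is why large-area products are mosaicked from many frames.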
The USGS's Western Geographic Science Center in Menlo Park, California has produced orthophotos for the US since the 1960s, when it was the Western Mapping Center. Initially it dealt with analog photos and developed improved techniques for processing them. In the 1980s, the Center developed the techniques and production processes for the digital orthophoto quadrangle (DOQ), which remains a standard today.
The digital orthophotos can be seamed together into a large area map. Large collections of orthophotos are used by Geographic Information Systems (GIS) and spatial information displays such as Google Maps and Google Earth.
Other airborne sensors can be employed, such as the infrared camera that produced the image to the right of the Sacramento River delta near Stockton, California. The use of LiDAR (Light Detection and Ranging) has enabled very accurate topographic mapping from aircraft equipped with lasers. By coupling differential GPS with synthetic aperture radar (SAR) and onboard processing, data providers can now produce accurate digital topographic maps and images as they fly, even on a cloudy day.
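The geometry behind airborne laser mapping can be sketched very simply: the GPS fixes the aircraft's position, and each laser range plus scan angle locates a point on the ground. This toy version ignores aircraft pitch and roll (which a real system corrects with an inertial measurement unit), and all numbers are illustrative:

```python
import math

def lidar_ground_point(ac_x, ac_y, ac_alt, scan_angle_deg, slant_range):
    """Toy LiDAR geometry: project a single laser return to the ground.

    ac_x, ac_y, ac_alt -- aircraft position from (differential) GPS, meters
    scan_angle_deg     -- laser angle from nadir, in the cross-track plane
    slant_range        -- measured laser range to the ground, meters

    Simplified: ignores aircraft attitude and Earth curvature.
    """
    theta = math.radians(scan_angle_deg)
    ground_x = ac_x + slant_range * math.sin(theta)    # cross-track offset
    ground_y = ac_y                                    # along-track: simplified
    ground_z = ac_alt - slant_range * math.cos(theta)  # terrain elevation
    return ground_x, ground_y, ground_z

# A return straight down (0 degrees) from 1000 m altitude with a 950 m
# range implies terrain at 50 m elevation directly below the aircraft.
print(lidar_ground_point(0.0, 0.0, 1000.0, 0.0, 950.0))
```

Millions of such returns per flight line, each tagged with a precise GPS position, are what make LiDAR-derived elevation models so dense and accurate.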
The first satellite photos of the earth date to 1959 and since then dozens of satellite programs and systems have added to the imagery archive in the visible, infrared, and radar spectrums.
Resolutions for commercially available images range from 30 meters for NASA/USGS's Landsat down to 0.5 meters for GeoEye-1, though wider-area imagery is also available, especially for environmental studies. The SPOT Image scene of San Francisco's SFO here is 2.5 meters per pixel. Please note that this image has been downsized and compressed, so much of that resolution is not shown here. The classified US intelligence satellites are thought to have resolutions substantially better than the 0.5-meter data available from commercial satellites.
Satellites for weather, mapping, agriculture, environmental measurements and more have been launched by most developed countries, and some of that data is available free or for licensing by private parties. There are at least five privately funded satellite imaging ventures at this time: SPOT Image, DigitalGlobe, GeoEye, ImageSat (EROS), and RapidEye. Each has its own niches of expertise.
Most satellites operate with multiple sensors at varying frequencies in the electromagnetic spectrum. Infrared is common, as in the 4-meter-resolution image of downtown San Diego to the right from GeoEye-1. Thematic maps created from these multispectral images can be used for monitoring agriculture, tracking environmental changes and land use, finding natural resources, and more. As with aircraft, active radar can be used to measure terrestrial elevations (topography) and sea-surface variations. Ground-penetrating radar has even led to archaeological discoveries.
Since the imagery today is digital, each pixel can be readily registered to an accurate geographic position, and the image can then be integrated with other data into a GIS or other application. Because the data is geo-referenced, it can be coupled with digital elevation data and models to produce 3-dimensional views such as this one of an area north of Los Angeles, produced by SPOT Image. With the computing power of today's machines, one can simulate flying through 3D scenes such as these.
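Registering a pixel to a geographic position usually comes down to a small affine calculation. The sketch below uses the six-parameter geotransform convention popularized by the GDAL library; the UTM origin and 0.15 m (roughly 6-inch) pixel size are hypothetical values for illustration:

```python
def pixel_to_geo(geotransform, col, row):
    """Map a pixel (col, row) to map coordinates using a GDAL-style
    six-parameter affine geotransform:
      (origin_x, pixel_width, row_rotation,
       origin_y, col_rotation, pixel_height)
    pixel_height is negative for the usual north-up images.
    """
    ox, pw, rr, oy, cr, ph = geotransform
    x = ox + col * pw + row * rr
    y = oy + col * cr + row * ph
    return x, y

# Hypothetical north-up orthophoto: top-left corner at UTM
# (500000 E, 4200000 N), 0.15 m pixels, no rotation.
gt = (500000.0, 0.15, 0.0, 4200000.0, 0.0, -0.15)
print(pixel_to_geo(gt, 0, 0))        # top-left corner
print(pixel_to_geo(gt, 2000, 1000))  # 300 m east, 150 m south of corner
```

Once every pixel maps to a coordinate this way, draping the image over a digital elevation model for 3D views is straightforward for a GIS.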
Satellite imagery is used extensively in responses to natural disasters. SPOT Image was able to assist in fighting the large-scale wildfires near San Diego in 2007. Here's a sample image as the flames raced toward Rancho Bernardo, CA. Note that the highways could be added for reference thanks to the digital geo-referencing of the image. To give the hundreds of thousands of people affected by the disaster ready access, the images were released over the internet via Google.
SENSING AND MAPPING OTHER WORLDS
Almost by its very nature, oceanography has become a science based on remote sensing under the sea, on the sea and above the sea. The examples are almost too extensive to document. La Jolla’s Scripps Institution of Oceanography is a world leader in ocean and environmental research utilizing remote sensing and marine exploration. Scientists there have developed the techniques to map the world’s ocean floors from satellite radar elevation data. Go to http://sio.ucsd.edu/ to get a taste.
Our solar system explorations and research may be the ultimate in remote sensing. The Jet Propulsion Laboratory (JPL), part of the California Institute of Technology in Pasadena, is the designated program manager for almost all of the robotic missions. Sending a lander tens of millions of miles to Mars to sense the make-up of the planet's geology, atmosphere and potential for life can hardly be surpassed for sensing the remote. Inherent in all of this data collection is mapping the planets. The data is spatially registered and combined in GIS analyses to understand the planets better. Click here to enlarge this photo from the Pathfinder Mission in 1997. Go to http://www.jpl.nasa.gov/ to dig deeper.
But here's a possible topper: a photo of the Arecibo Observatory, a radio telescope in Puerto Rico, taken by the remote sensing satellite GeoEye-1. The 1,000-foot-diameter dish is the largest single-aperture telescope in the world. Used for deep-space radio astronomy, it has gained new renown for its role in collecting data for the SETI@home Project to find signals from other living beings in the universe. It has done much to map the universe, especially in its early times. It also performs research on the high-altitude atmosphere and radar astronomy of the solar system. Shot at 0.5 m resolution, this image has been reduced considerably in size so that your PC can handle it. So this is a remotely sensed image from space of a remote sensor of the universe.
Bringing it back down to Earth, here's a GeoEye-1 photo at 0.5 m resolution of Haiti taken less than 24 hours after the earthquake on January 12, 2010. With virtually all systems down in Haiti, such images are helping assess the damage and are aiding in directing the emergency response. So these tools can be used for real-world applications too.