Six months ago, the location-based augmented reality game Pokémon Go was released. Developed for iOS and Android devices by Niantic, a spinoff of Google, it is at its heart a location data-mining game in which players catch the iconic Pokémon in their own local environments. The marriage of geospatial data and augmented reality is a gamechanger for the geospatial industry, and its reach is evident in the game's more than 100 million Android downloads in its first month on the market.
Aerial imagery provides GIS managers and other professionals with a realistic bird’s-eye view of a region, but it can’t always tell the full story. While an aerial image may present a faithful picture of the Earth’s surface, it can’t always portray attributes like elevation accurately. LiDAR data, which measures terrain and land cover with laser pulses, is one way to fill that gap, but there are other ways to gain information about the land around us and classify the surfaces, vegetation, and other details on the ground.
The human eye can only see the visible-light portion of the electromagnetic spectrum. This is because the color receptors, or cones, in our eyes respond to red, green, and blue wavelengths. However, there are ways for humans to “see” what we can’t – things like ultraviolet (UV) rays, X-rays, gamma rays, infrared, microwaves, and radio waves.
When it comes to the infrared spectrum, just beyond visible light, multispectral cameras capture what the human eye can’t see. These cameras record Near Infrared (NIR) wavelengths, which can then be used to classify the vegetation, impervious surfaces, and other features present in the photographs.
Near Infrared imagery serves a number of purposes; most notably, NIR aerial images allow for visual identification of vegetation presence and health. GIS, agriculture, and local government agencies rely on NIR to detect vegetation, impervious surfaces, pollution, and other features.
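As a rough illustration of how NIR supports vegetation detection, one widely used measure is the Normalized Difference Vegetation Index (NDVI), which compares reflectance in the NIR and red bands: healthy vegetation reflects strongly in NIR and weakly in red, so it scores close to +1, while pavement, rooftops, and water score near zero or below. The sketch below is a minimal Python example; the sample pixel values and the 0–255 scale are assumptions for demonstration, not data from any particular sensor.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values near +1 suggest dense, healthy vegetation; values near zero or
    below typically indicate bare soil, water, or impervious surfaces.
    """
    nir = nir.astype("float64")
    red = red.astype("float64")
    # Guard against division by zero where both bands are dark.
    denom = np.where((nir + red) == 0, 1e-10, nir + red)
    return (nir - red) / denom

# Hypothetical 2x2 patch of pixel values from a multispectral capture (0-255 scale).
nir_band = np.array([[200, 180], [60, 40]])
red_band = np.array([[50, 60], [55, 45]])

print(ndvi(nir_band, red_band))
# High values in the top row suggest vegetation; low values in the
# bottom row suggest pavement, rooftops, or other impervious surfaces.
```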
Why do GIS managers and government agencies need a Near Infrared view? See the rest on the EagleView blog.