Wide Area Aerial Surveillance Technologies Evolve for Homeland Security and Other Applications
WAAS: Seeing the big picture
The Wide Area Aerial Surveillance (WAAS) systems that found improvised explosive devices (IEDs) in Iraq and Afghanistan now lend themselves to domestic counterterrorism, border patrol, drug interdiction, and disaster management.
The Department of Homeland Security (DHS) last March conducted its first test of WAAS technology to track people, vehicles, and objects in areas of 5 to 10 square kilometers. In 2013, Sierra Nevada Corporation and ITT Exelis – makers of the Air Force Gorgon Stare WAAS system – will offer commercial Vigilant Stare services with a sensor field of view covering 16 square kilometers. WAAS becomes wide area persistent surveillance (WAPS) with the integration of long-endurance platforms.
Synoptic sensors on manned and unmanned aircraft and tethered aerostats today collect wide-area video imagery for near-real-time and forensic (after-the-fact) analyses. In Operations Iraqi Freedom and Enduring Freedom, WAAS put kinetic events in context for ground commanders. According to Dr. Michael Eismann, senior scientist for electro-optical and infrared sensors in the Air Force Research Laboratory (AFRL) Sensors Directorate, “The value of that is we and others started to recognize the value for other applications. In countering IEDs, it’s not how do we find them, but how do we find the activity associated with IEDs?”
AFRL, in conjunction with the Los Alamos National Laboratory, developed the Angel Fire and Blue Devil WAAS systems first deployed in Iraq. Typical high-resolution reconnaissance and targeting sensors with narrow, transient fields of view were ill-suited to the Iraqi insurgency. “You could cover a large area but get very poor resolution, or zoom in and look at a combatant but completely lose the context,” observed Eismann. “It was combatant activity that we needed to look at.”
Angel Fire development was sponsored by the Marines and the Joint IED Defeat Organization (JIEDDO). “There was an interest from the Marine Corps to help them with real-time situational awareness,” said Eismann. Commercial imaging technology provided a wide-field-of-view daytime sensor able to update the big picture continuously. “The way we responded was to continuously surveil an area with fine resolution, a large area tens of kilometers square at fine enough resolution that you could put multiple pixels on a combatant.”
An Angel Fire operational assessment in California in late 2006 showed that a wide-field-of-view persistent sensor with 0.5-meter resolution and update rates of one to two frames per second was good enough to watch dismounted fighters. (Full-motion video refreshes at 30 frames per second.) Contract operator SAIC subsequently took the WAAS sensor to Iraq on King Air turboprops. Ground stations enabled analysts to steer and zoom the airborne sensor and see the big picture with 10 seconds of latency. “The hard part of that was not in the sensor hardware needed to do it,” said Eismann. “The hard part was in moving the data down to the ground and out to the people who need it and manage the large data volume.”
Angel Fire analysts could call up imagery in small, independent data packets that saved downlink bandwidth. Data stored on the ground could in turn be “rewound” to reconstruct events. The look-back image exploitation and analysis of WAAS systems aims to determine “patterns of life” and correlate events, anomalies, and vehicle tracks with geographic information systems. Forensic analysis of WAAS imagery can, for example, backtrack IED makers to explosive caches.
Angel Fire’s success in Iraq and broader joint-service Central Command missions drove further WAAS developments. “We recognized a lot of bad things happen at night,” said Eismann. “We had to figure out how to do this in the infrared band to operate at night hours.” The Naval Research Laboratory and other DoD resources contributed wide-field-of-view infrared imaging technology. AFRL meanwhile sought to integrate the day-night Angel Fire with other sensors. According to Eismann, “We recognized that if we take this wide-area motion imagery [WAMI] capability and couple it up with other reconnaissance and surveillance capabilities from other sources – ground-based or airborne – we can get a big gain from the multi-sensor combination in these counterinsurgency missions.”