
NASA: Assured Autonomy for Aviation Transformation

NASA's Aeronautics Research Mission Directorate on NACA's 100th Anniversary

 

One NextGen technology, for example, is a satellite-based system called Automatic Dependent Surveillance-Broadcast, or ADS-B, which uses GPS to locate planes in three dimensions and establish a real-time connection to air traffic control. ADS-B “Out” – the capability for a vehicle to broadcast position information to ground stations and other aircraft – will be required on all aircraft flying in controlled airspace by 2020.
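
In software terms, an ADS-B Out broadcast can be pictured as a small position record transmitted on a fixed schedule. The Python sketch below is a deliberately simplified stand-in for what is, in reality, a compact binary message (the 112-bit Mode S Extended Squitter defined by RTCA DO-260B); every field name and value here is invented for illustration, not the actual wire format.

```python
from dataclasses import dataclass
import time

@dataclass
class AdsbOutReport:
    """Simplified stand-in for an ADS-B Out position broadcast.

    Real ADS-B messages are compact binary squitters; these
    field names are illustrative only.
    """
    icao_address: str    # 24-bit aircraft identifier, as a hex string
    latitude_deg: float
    longitude_deg: float
    altitude_ft: float   # the GPS-derived third dimension the article notes
    timestamp: float

def broadcast_position(gps_fix) -> AdsbOutReport:
    """Package a GPS fix for broadcast to ground stations and other
    aircraft (the ADS-B 'Out' capability). 'gps_fix' is a hypothetical
    object with lat/lon/alt_ft attributes."""
    return AdsbOutReport(
        icao_address="A1B2C3",   # hypothetical identifier
        latitude_deg=gps_fix.lat,
        longitude_deg=gps_fix.lon,
        altitude_ft=gps_fix.alt_ft,
        timestamp=time.time(),
    )
```

ADS-B In, discussed below, is simply the receiving side of the same exchange: ingesting these reports from other broadcasters rather than sending them.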

Before unmanned aircraft can be integrated into the NextGen system, the FAA needs to lay down some rules – a task that faces several obstacles. In a 2014 report commissioned by NASA, the National Research Council listed both the potential benefits and the barriers – legal, social, regulatory, and technological – associated with introducing UAS into the national airspace.

The Federal Aviation Administration gave approval for energy corporation BP and unmanned aircraft systems (UAS) manufacturer AeroVironment to fly an AeroVironment Puma AE (“All Environment”) UAS for aerial surveys in Alaska – the first time the FAA authorized a commercial UAS operation over land. The FAA issued a Certificate of Waiver or Authorization to survey BP pipelines, roads, and equipment at Prudhoe Bay, Alaska, the largest oilfield in the United States. AeroVironment photo

NASA is helping the FAA address technical barriers to integration with a project launched in 2011: the Unmanned Aircraft Systems Integration in the National Airspace System (UAS in the NAS) project.

“NASA’s focus is on those technical areas, and on passing on data that will help officials make appropriate policies and rules to govern this integration,” said Director of Integrated Systems Research Ed Waggoner, D.Sc.

UAS in the NAS creates a test environment in which investigators seek solutions to the following technical challenges:

  • Sense and Avoid (SAA)/Detect and Avoid (DAA) Performance Standards: UAS integration requires assurance that an unmanned aircraft is able to perceive and avoid trouble in the sky. NASA research will aim to develop and validate the minimum operational standards for an aircraft’s ability to safely share airspace (a simplified sketch of such a separation check follows this list).
  • Command and Control (C2) Performance Standards: Currently operating UAS use a data communications link between the remote, ground-based pilot and the aircraft – a link known as the Control and Non-Payload Communications (CNPC) waveform. This link will require its own dedicated and protected bandwidth spectrum, and NASA researchers are in the process of testing a prototype CNPC radio.
  • Integrated Test and Evaluation (IT&E): Simply put, this technical challenge involves creating an advanced simulated testing environment that will generate useful research findings related to the other project issues.
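
To make the sense-and-avoid challenge concrete, here is a minimal Python sketch of the kind of geometric test a detect-and-avoid standard might specify: project own-ship and intruder motion forward in a straight line, find the closest point of approach, and flag any encounter whose predicted miss distance falls below a separation threshold. The threshold and function names are invented for illustration; the actual minimum operational standards under development rely on more elaborate time- and distance-based criteria.

```python
import math

def time_to_cpa(rel_pos, rel_vel):
    """Seconds until closest point of approach, assuming straight-line
    motion. rel_pos/rel_vel are intruder position and velocity relative
    to own ship, in meters and meters/second (x, y, z)."""
    speed_sq = sum(v * v for v in rel_vel)
    if speed_sq == 0.0:
        return 0.0  # no relative motion; the current range is the CPA
    return max(0.0, -sum(p * v for p, v in zip(rel_pos, rel_vel)) / speed_sq)

def predicted_miss_distance(rel_pos, rel_vel):
    """Distance between the aircraft at the moment of closest approach."""
    t = time_to_cpa(rel_pos, rel_vel)
    cpa_point = [p + v * t for p, v in zip(rel_pos, rel_vel)]
    return math.dist(cpa_point, [0.0, 0.0, 0.0])

def violates_separation(rel_pos, rel_vel, threshold_m=1400.0):
    """Flag an intruder projected to breach a (hypothetical) threshold."""
    return predicted_miss_distance(rel_pos, rel_vel) < threshold_m

# Example: an intruder 2 km due east, closing westbound at 50 m/s,
# is on a collision course and triggers the check.
assert violates_separation([2000.0, 0.0, 0.0], [-50.0, 0.0, 0.0])
```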

In March 2012, NASA’s Ikhana unmanned aircraft made the first flights of hardware for the UAS in the NAS project during several test runs at Dryden (now Armstrong) Flight Research Center. The Ikhana was fitted with an ADS-B device – the first time the technology had been used with an unmanned system – and provided detailed position, velocity, and altitude data about itself to air traffic controllers, to pilots of other ADS-B-equipped aircraft in the vicinity, and to its own ground-based crew.

The successful use of ADS-B aboard the Ikhana provided some assurance that an unmanned craft could communicate with others in its vicinity – but it was still, Waggoner pointed out, a one-way demonstration: “We’re looking at how to adapt ADS-B In – a function that will allow the aircraft to receive ADS-B signals from others broadcasting in the vicinity, so you’ll know where other traffic is.” In the next two years, the project will evaluate the performance of an onboard radar system that could be used to detect other aircraft that may or may not also be using ADS-B – a key first step, Waggoner said, in developing detect-and-avoid capabilities for unmanned systems.
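
One way to picture the traffic function Waggoner describes is a fusion step that merges the two sources: ADS-B In reports from cooperating aircraft and onboard radar tracks for everything else. The Python sketch below is an assumption-laden illustration of that merge, not NASA's software; the inputs reuse the hypothetical report fields from the earlier sketch, and radar tracks are assumed to carry only a position.

```python
def update_traffic_picture(adsb_reports, radar_tracks):
    """Combine cooperative (ADS-B In) and non-cooperative (radar)
    traffic into a single picture for the remote pilot."""
    picture = {}
    for report in adsb_reports:               # broadcasters identify themselves
        picture[report.icao_address] = {
            "source": "ADS-B In",
            "position": (report.latitude_deg,
                         report.longitude_deg,
                         report.altitude_ft),
        }
    for i, track in enumerate(radar_tracks):  # radar-only targets get made-up keys
        picture[f"radar-{i}"] = {
            "source": "radar",
            "position": track.position,       # hypothetical track attribute
        }
    return picture
```

Each entry could then be fed to a separation check like the one sketched earlier, regardless of which sensor produced it.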

It’s important to point out, as Waggoner does, that the current research centers on a remote pilot’s ability to avoid trouble detected through an onboard sensor suite. The kind of artificial intelligence that would allow an unmanned aircraft to think for itself – to both sense danger and avoid it – isn’t here yet, at least not at a level that would offer any assurance to a regulatory agency such as the FAA, which continues to require a human pilot in the loop for all flights.

“We have a focus project in the integration of unmanned aircraft into the national airspace that is only touching on the periphery of autonomy,” Waggoner said, “because it is using highly automated systems and various levels of automation. That in itself is not really getting towards full autonomy – though it’s a necessary step that will have to be taken to achieve autonomy for unmanned aircraft.”

 

The Low-Altitude Frontier

The distinction between automation and autonomy is essentially the difference between relegation – assigning discrete, easily performed tasks – and delegation – assigning a given set of mission parameters. Danette Allen, Ph.D., Chief Technologist for Autonomy and Head of the Autonomy Incubator at NASA’s Langley Research Center, explained that it’s the difference between machine-based execution and machine-based decision-making, and it’s a crucial difference, one that points to a yawning gap between the current capabilities of unmanned systems and the future many envision for unmanned flight.

“Think about the movies you’ve seen,” said Allen. “For a mission, you send out multiple pilots, and they’re constantly talking to each other, constantly renegotiating what the mission is based on the health of the vehicles, the status of the pilots. Now think about doing that without people involved. That’s an example of what machine autonomy is.” It’s the difference, she explained, between pressing the button on your toaster and saying to an unmanned aircraft: “Here’s the goal. Go and execute it. I’m not going to tell you how.”
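
Allen’s toaster analogy maps readily onto code. In the schematic Python contrast below (a sketch in which every callable is a hypothetical placeholder, not NASA software), automation executes steps a human already chose, while autonomy loops on sense, decide, and act until a delegated goal is met.

```python
# Automation (relegation): execute discrete tasks a human already chose,
# like pressing the button on a toaster.
def automated_mission(execute):
    for step in ["takeoff", "climb to 400 ft", "survey leg A", "land"]:
        execute(step)               # machine-based execution only

# Autonomy (delegation): hand over the goal and let the machine decide how.
# 'goal_reached', 'sense', 'plan', and 'execute' are placeholder callables
# the vehicle would supply.
def autonomous_mission(goal_reached, sense, plan, execute):
    while not goal_reached():
        execute(plan(sense()))      # machine-based decision-making, then action
```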

By Craig Collins