Military Embedded Systems

Unmanned ground vehicle pilots drones for "eye in the sky"

Story

June 15, 2017

Sally Cole

Senior Editor

Military Embedded Systems

By pairing sensors on an unmanned ground vehicle (UGV) with one or more unmanned aerial systems (UASs), Southwest Research Institute (SwRI) researchers in San Antonio, Texas, can create an "eye in the sky" to improve situational awareness in extreme environments.

By Sally Cole, Senior Editor

The technology enables UGVs to pilot UASs in hazardous environments – whether difficult terrain, the middle of a battlefield, or extreme heat and cold – providing critical situational-awareness data that enables safer maneuvers.

SwRI’s system involves sensors mounted on a ground vehicle to perceive obstacles within its path, along with sensors aboard an aerial vehicle to help detect low terrain that may be obstructed by objects in front of the ground vehicle’s sensor path. Sensors on the ground vehicle also help the aerial vehicle navigate and sense tall objects that might affect its flight path.

The UGV’s control system can control one or more UASs, so the system can use the combined data from the on-ground and in-sky perception sensors to determine the best paths for the ground vehicle and its remotely controlled drones (Figure 1).
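
To illustrate the idea – this is a simplified sketch, not SwRI’s implementation, and all names and values are hypothetical – the snippet below fuses obstacle observations from a ground vehicle and an aerial vehicle into one occupancy grid, then picks the least-obstructed corridor:

    # Illustrative sketch only -- not SwRI's code. Obstacle observations from a
    # ground vehicle and an aerial vehicle are fused into one occupancy grid that
    # a very simple planner can query. All names and values are hypothetical.
    import numpy as np

    GRID = 20  # 20 x 20 cells around the UGV

    def fuse(ground_hits, aerial_hits):
        """Mark cells reported as occupied by either sensor source."""
        grid = np.zeros((GRID, GRID), dtype=bool)
        for x, y in ground_hits + aerial_hits:
            grid[y, x] = True
        return grid

    def best_column(grid):
        """Pick the grid column (candidate corridor) with the fewest occupied cells."""
        costs = grid.sum(axis=0)
        return int(np.argmin(costs))

    if __name__ == "__main__":
        ground = [(5, 3), (5, 4)]             # obstacles seen from the UGV
        aerial = [(12, 7), (12, 8), (13, 8)]  # a ditch only visible from above
        corridor = best_column(fuse(ground, aerial))
        print("least-obstructed corridor (column index):", corridor)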

 

Figure 1: Southwest Research Institute researchers developed technology that allows unmanned ground vehicles to pilot one or more unmanned aerial systems for improved situational awareness. Photo courtesy of SwRI.


“We developed this capability to support defense clients seeking solutions to the challenges of UGVs navigating in extreme environments,” says Ryan Lamm, director of the Applied Sensing Department for SwRI.

The system “directly addresses mobility limitations of UGVs, and is application- and mission-independent,” he says. “Many military applications are envisioned for UGVs – including route clearance; casualty evacuation; logistics and transport; and intelligence, surveillance, and reconnaissance (ISR).”

One mobility limitation SwRI’s system addresses is that navigation sensors on UGVs have difficulty seeing negative obstacles – ditches, holes, and other depressions – in complex terrain, Lamm points out, because the sensors typically can’t be mounted high enough on the UGV to perceive an obstacle’s depth until it’s too late to react.

By offboarding the sensors to a UAS, the UGV can get a view from above. “Today, payload capacities of small UASs are limited. The system that we developed allows the sensor processing to be done onboard the UGV – reducing power requirements and increasing the flight time of the UAS, and thereby extending the mission,” he explains.
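
A minimal sketch of that offboard-processing pattern is shown below, assuming the UAS simply forwards raw frames over a data link while the UGV runs the detection; the queue, the detector, and the threshold are stand-ins for illustration, not SwRI’s design:

    # Hedged sketch of the offboard-processing idea described above: the UAS only
    # captures and transmits raw frames, while detection runs on the UGV's more
    # capable computer. The queue stands in for the radio link; everything here is
    # hypothetical and not based on SwRI code.
    from queue import Queue

    import numpy as np

    link = Queue()  # stands in for the UAS-to-UGV data link

    def uas_capture_and_send(frame):
        """On the UAS: no processing, just push the raw frame onto the link."""
        link.put(frame)

    def ugv_process():
        """On the UGV: pull raw frames and run the (heavier) obstacle detection."""
        frame = link.get()
        # Trivial stand-in for a real detector: flag unusually dark pixels.
        mask = frame < 30
        return int(mask.sum())

    if __name__ == "__main__":
        uas_capture_and_send(np.random.randint(0, 255, size=(480, 640), dtype=np.uint8))
        print("candidate obstacle pixels found on the UGV:", ugv_process())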

It’s also possible to offload navigation from the UAS by having the UGV teleoperate it to fulfill its mission. “One intelligent autonomous robot is essentially in control of one or more less intelligent UASs,” Lamm says. With the UGV controlling both itself and the UASs, its world model – its situational awareness – becomes more complete.
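
That one-controls-many arrangement might look roughly like the sketch below, in which a UGV-side controller issues simple waypoint commands to each UAS it supervises; the classes and interfaces are illustrative assumptions, not SwRI’s software:

    # Minimal sketch of one-to-many teleoperation: the UGV's planner issues simple
    # waypoint commands to each UAS it controls. Class and field names are
    # hypothetical illustrations, not SwRI's interfaces.
    from dataclasses import dataclass

    @dataclass
    class Waypoint:
        north_m: float
        east_m: float
        alt_m: float

    class UasProxy:
        """Stand-in for a command link to one UAS."""
        def __init__(self, name):
            self.name = name

        def goto(self, wp: Waypoint):
            print(f"{self.name}: flying to {wp}")

    class UgvController:
        """The UGV holds the intelligence; the UASs just execute its commands."""
        def __init__(self, uas_list):
            self.uas_list = uas_list

        def position_scouts(self, waypoints):
            for uas, wp in zip(self.uas_list, waypoints):
                uas.goto(wp)

    if __name__ == "__main__":
        ctrl = UgvController([UasProxy("uas-1"), UasProxy("uas-2")])
        ctrl.position_scouts([Waypoint(15, 0, 20), Waypoint(0, 15, 25)])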

So what types of embedded computing and processing does this require? “Its processing requirements aren’t that extensive,” Lamm says. “The system specifically offloads the processing to the UGV, which typically has access to more power and computing capacity.”

Does the system use commercial off-the-shelf (COTS) components? “Absolutely. The system leverages COTS components, but its algorithms are custom,” he adds.

While pairing remote-controlled ground and air vehicles isn’t exactly new, SwRI’s approach is novel because it provides a completely autonomous solution that lets the two systems benefit from each other’s capabilities. Previous systems relied on humans remotely controlling one or both vehicles.

Autonomous control allows a vehicle’s onboard control system to perform its mission independent of a human operator, providing a safe alternative in dangerous environments.

How big is the challenge of pairing the UGV and UAS? “There certainly are challenges to successfully integrate a remote mobile aerial sensor into a world model for a UGV,” Lamm notes. “A few examples include latency in updating the model, the rapidly changing position of the remote sensor, and ensuring the vehicles know each other’s relative position.”
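
Those integration issues can be made concrete with a small sketch: each aerial detection carries a timestamp, is checked against a latency budget, and is shifted by the UAS’s pose relative to the UGV before it enters the world model. The threshold and frame handling here are assumptions for illustration only:

    # Hedged illustration of the integration challenges Lamm lists: an aerial
    # detection is timestamped, checked against a latency budget, and shifted by
    # the UAS's pose relative to the UGV before entering the UGV's world model.
    # The threshold and tuple formats are assumptions for the example only.
    import time

    MAX_AGE_S = 0.25  # discard data older than this (assumed latency budget)

    def to_ugv_frame(detection, uas_pose_rel_ugv, now=None):
        """detection and pose are (x, y, timestamp); returns (x, y) in the UGV frame, or None if stale."""
        now = time.time() if now is None else now
        dx, dy, t_pose = uas_pose_rel_ugv
        x, y, t_det = detection
        if now - t_det > MAX_AGE_S or abs(t_det - t_pose) > MAX_AGE_S:
            return None  # too old, or pose and detection are not time-aligned
        return (x + dx, y + dy)

    if __name__ == "__main__":
        t = time.time()
        print(to_ugv_frame((2.0, 1.0, t), (10.0, -3.0, t)))        # fresh -> fused
        print(to_ugv_frame((2.0, 1.0, t - 1.0), (10.0, -3.0, t)))  # stale -> dropped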

The system’s most surprising capability? “The enhanced situational awareness that can be achieved by fusing in various sensor perspectives,” Lamm says. “The UGV can recognize navigation hazards of the UAS, and the UAS can recognize navigation hazards for the UGV. By fusing this model of the environment together, the UGV can place the UAS in unique positions to give it the information it needs to complete its overall mission.”
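
A simple way to picture that placement step – purely as an illustration, not SwRI’s algorithm – is to choose, among a few candidate vantage points, the one that would reveal the most grid cells the UGV currently cannot see:

    # Illustrative sketch of "place the UAS where it helps most": among candidate
    # vantage points, pick the one covering the most cells occluded from the UGV.
    # Names and data are hypothetical.
    def best_vantage(unseen_cells, candidates):
        """candidates maps vantage name -> set of grid cells visible from there."""
        return max(candidates, key=lambda name: len(candidates[name] & unseen_cells))

    if __name__ == "__main__":
        unseen = {(4, 4), (4, 5), (9, 2)}   # cells occluded from the UGV
        views = {
            "hover_north": {(4, 4), (4, 5), (0, 0)},
            "hover_east": {(9, 2)},
        }
        print("send UAS to:", best_vantage(unseen, views))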

This technology has immediate military applications; it is also guiding SwRI as it develops future commercial solutions for remote inspection and control.

For more information, visit www.swri.org.