Designing multifunction radar and EW systems means staying ahead of the tech curve
Huge data demands - including those from artificial intelligence (AI) and machine learning (ML) applications - are pushing radar and electronic warfare (EW) developers to seek new ways to deliver multifunction systems that also meet strict size, weight, and power (SWaP) requirements.
Radar and EW developers are leveraging high-speed signal-processing devices such as field-programmable gate arrays (FPGAs) and general-purpose graphics processing units (GPGPUs) to meet the U.S. military’s insatiable appetite for data. Meanwhile, AI, neural networks, and ML concepts are increasingly being embraced in EW and radar applications to ensure the U.S. maintains its tactical edge over its adversaries for decades to come.
“Since the enemy will pursue similar strategies and weapons, the challenge is to keep outperforming them with better technology and methods,” says Rodger Hosking, vice president and cofounder at Pentek, Inc. [Upper Saddle River, New Jersey].
To stay ahead, the defense industry is working to address the technical challenges it faces. “Said simply, it is ‘big data,’” says Chris Rappa, product line director for Radio Frequency, Electronic Warfare and Advanced Electronics within BAE Systems FAST Labs [Arlington, Virginia]. “EW and radar systems throw away loads of data that could be useful. It is all about power. We are getting smart on how to optimize power modes to handle these gigantic increases in data, but the challenges of how to process more and more data are not going to be solved over the next three years.”
Addressing these data pain points means getting down to the nitty-gritty of radar and EW systems. And, “One of the greatest challenges will be in the exponential increase in sensor data,” says Tammy Carter, senior product manager for OpenHPEC products for Curtiss-Wright Defense Solutions [Ashburn, Virginia]. “This will drive the need for even faster backplanes, memory/data management, and the associated challenges of reliability. This in turn will drive the need for faster and larger data recorders with more focus on security. The development cycle of these new systems will require more analysts and data scientists to design and verify the new algorithms.”
Staying ahead of adversaries and getting past technical challenges also means upgrading and taking into account obsolescence issues, Noah Donaldson, chief technology officer at Annapolis Micro Systems, explains. “How do customers upgrade to the latest technology quickly? Traditionally, it has taken five to 10 years to upgrade a platform, but that is too long. That’s where commonality and open standards come into play. We work closely with SOSA [Sensor Open Systems Architecture consortium] and VITA [standards organization] to develop modular open architecture products that have standardized hardware profiles. As an example, our WILDSTAR 6XB2 6U Board was developed in alignment with SOSA and VITA standards.” (Figure 1).
Industry players have begun to collaborate to support the advancement of radar/EW systems, says Roy Keeler, senior product and business development manager, Aerospace and Defense, at ADLINK Technology [Washington, D.C.]. Keeler says, “You’re seeing partnerships in industry at the chip-manufacturing level and at the systems level that will enable us to overcome some of the technical challenges that we’re facing.”
What tech is necessary and what isn’t?
“The military is now starting to question why we need dedicated pieces of hardware that each do just one job,” Keeler says. “For that reason, the trend in signal-processing designs for radar and EW systems is toward more configurable systems where, for example, the end user can utilize a radar as an EW receiver or transceiver.”
Requirements for power consumption, efficient data management, and limited space are playing a role in designing these systems. Essentially, some U.S. Department of Defense (DoD) requirements include the addition of “higher-density solutions to support multielement antenna arrays, where each element requires a separate signal processing channel for phased array beam steering,” Hosking says. “Each element may require both receive and transmit functions to support radar and EW countermeasures applications. While size and weight of the electronics for each channel are important, so are power requirements and cost per channel.”
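The per-element channels Hosking describes exist to support phased-array beam steering, which can be illustrated with a minimal sketch. This is an idealized uniform linear array; the element count, spacing, and frequency below are hypothetical, chosen only to show why each element needs its own phase control:

```python
import math
import cmath

def steering_phases(num_elements, spacing_m, wavelength_m, steer_deg):
    """Per-element phase shifts (radians) that steer a uniform linear array."""
    k = 2 * math.pi / wavelength_m            # wavenumber
    theta = math.radians(steer_deg)
    return [-k * n * spacing_m * math.sin(theta) for n in range(num_elements)]

def array_factor(phases, spacing_m, wavelength_m, look_deg):
    """Magnitude of the array's combined response in a given look direction."""
    k = 2 * math.pi / wavelength_m
    theta = math.radians(look_deg)
    return abs(sum(cmath.exp(1j * (k * n * spacing_m * math.sin(theta) + phi))
                   for n, phi in enumerate(phases)))

# Hypothetical 8-element array at 10 GHz (wavelength ~3 cm), half-wavelength spacing
wl = 0.03
phases = steering_phases(8, wl / 2, wl, steer_deg=30.0)

# Response is maximal (coherent sum of all 8 elements) in the steered direction
print(array_factor(phases, wl / 2, wl, 30.0))  # ~8.0
print(array_factor(phases, wl / 2, wl, 0.0))   # ~0.0: off-axis contributions cancel
```

Because every element carries its own phase (and, for transmit, its own amplitude), each one needs a separate receive/transmit signal chain — which is exactly where the per-channel size, weight, power, and cost pressures come from.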
In addition, users want “to install these functions as close to the antenna as possible to minimize cabling that imposes performance penalties due to signal loss and interference,” Hosking continues. “By incorporating RF circuitry, data conversion, and DSP [digital signal processing] functions in a housing near the antenna, wideband digitized signals can be delivered through optical cables, maintaining optimum signal fidelity.”
There is a definite need to readjust components in order to field a cognitive radar or EW system. “We’re also seeing requests for moving general processing to the ARM cores on many FPGA SoCs [field-programmable gate array systems-on-chip], pulling out the SBC [single-board computer] and handling general processing elsewhere in the system via a networked processor,” says Peter Thompson, vice president, product management, at Abaco Systems [Huntsville, Alabama]. “Abaco’s VP430 Direct RF [radio-frequency] Processing System is a 3U VPX COTS [commercial off-the-shelf] solution that leverages the Xilinx ZU27DR RF system-on-chip (RFSoC) technology,” Thompson says (Figure 2).
SWaP and multifunction systems
SWaP requirements are pushing designers to the max, with users desiring “more capability at a lower cost in shrinking form factors,” Rappa explains. “Increases in bandwidth and data rates with lower latency continue to be recurring themes. Lower size, weight, power, and cost desires drive the industry towards multifunction systems, where radar, EW, and other functions need to be combined with minimal performance sacrifice.”
The market has been using GPGPUs and FPGAs to make all these demands a reality: “I am seeing increased interest in using GPGPUs and direct transfer of data from the FPGA to the GPGPUs using GPUDirect to reduce latency,” Carter says. “More systems will have GPGPUs because of their power efficiency and the possibility for reduced SWaP because of the better throughput. For example, a 6U OpenVPX chassis with five server-class Xeon boards and a switch card yields around 5.6 TFLOPS for approximately 1000 watts. In comparison, a 6U OpenVPX single Xeon server-class CPU card paired with two OpenVPX dual NVIDIA Pascal P5000 GPGPU boards pumps out over 13 TFLOPS for around 550 watts.”
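Carter’s comparison works out to roughly a fourfold gain in throughput per watt. A back-of-envelope sketch, using only the figures quoted above:

```python
# Throughput-per-watt comparison using the figures quoted above
xeon_only = 5.6e12 / 1000    # five Xeon boards + switch card: ~5.6 TFLOPS at ~1000 W
xeon_gpgpu = 13e12 / 550     # one Xeon + two dual-P5000 GPGPU boards: >13 TFLOPS at ~550 W

print(f"CPU-only:   {xeon_only / 1e9:.1f} GFLOPS/W")   # ~5.6 GFLOPS/W
print(f"CPU+GPGPU:  {xeon_gpgpu / 1e9:.1f} GFLOPS/W")  # ~23.6 GFLOPS/W
print(f"Efficiency gain: {xeon_gpgpu / xeon_only:.1f}x")
```

That per-watt ratio, rather than peak TFLOPS alone, is what drives the SWaP argument for GPGPU-heavy configurations.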
Convergence of systems
Without a doubt, “the military has turned the heat up on these trends to converge systems that utilize RF such as radar and electronic warfare into one,” says David Jedynak, chief technology officer at Curtiss-Wright [Austin, Texas]. “The military is looking to converge signal-processing tech that does multiple things into one system. From a bird’s-eye view that sounds really easy. However, from a technical level, that proves to be a challenge.”
From a technical perspective, “Combining these capabilities requires smart techniques,” Rappa says. “Adaptive, adaptable, or cognitive capabilities can adjust the function according to the mission demands, potentially reducing the computational requirements to do everything at once.”
Enter AI and ML …
Data and the role of AI, ML tech
The ripple effect of the AI push is resulting in adaptive cognitive radar and EW systems. When it comes to the big data challenge, AI may have the answer.
“The end-all goal for electronic warfare is to realize the concept of cognitive EW,” Jedynak explains. “The question is: How does artificial intelligence enable signal processing? Artificial intelligence and machine learning techniques are helping to automate tasks to ease the end user’s workload.”
“What we’re trying to do and what the industry is trying to do is see through the noise, see through the clutter in a very congested electromagnetic environment,” Keeler adds. “What AI will enable is to preprocess data by using machine learning techniques to learn the environment, then make decisions from the sensor-collected data.
“This way, gobs of data are not being sent back to be filtered in the rear echelons,” Keeler continues. “Artificial intelligence will put more intelligence at the edge, at the forward edge of the battle area, and allow the sensors to filter through all of that noise and unwanted data. Specifically, the user community would like to employ AI to reduce that sensor-to-shooter timeframe. AI will enable that by allowing preprocessing to take place.”
Yet engineers still face a challenge even when putting AI into radar and EW systems. Processing demands will continue and “there are always processing constraints,” Rappa adds.
“Classically, heavy filtering has been used to reduce the processing demands,” he continues. “Filtering, unfortunately, destroys features that AI might use to make better decisions. Therefore, any capability that reduces the necessity to filter becomes an enabler for artificial intelligence, and power efficiency is a big driver. The adjacency of processing with data needs to be prioritized, as data transport has a huge power cost. Improvements in the power-added efficiency of the front end are mandatory. We have prioritized investment in more efficient front ends with capabilities like short-gate GaN-on-SiC (gallium nitride on silicon carbide) and have a qualified 6-inch GaN-on-SiC foundry. Computation in memory, enabling minimal data transport, is also a technology that we are very interested in adopting to enable AI.”
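Rappa’s point that filtering destroys features an AI might use can be shown with a toy example (illustrative only, not a model of any fielded system): a brief wideband event that stands out clearly in raw samples is heavily attenuated by a simple moving-average (low-pass) filter.

```python
def moving_average(x, width):
    """Simple low-pass filter: average of `width` consecutive samples."""
    return [sum(x[max(0, i - width + 1):i + 1]) / width for i in range(len(x))]

# A brief wideband event in an otherwise quiet channel
x = [0.0] * 64
x[40] = 2.0   # the kind of transient feature an AI classifier might key on

filtered = moving_average(x, 8)

print(max(x))         # 2.0  -- feature clearly present in the raw data
print(max(filtered))  # 0.25 -- attenuated 8x by the filter
```

The filter cuts the data rate the back end must handle, but the transient the classifier needed is now an eighth of its original amplitude — which is why techniques that avoid filtering (and the power to move unfiltered data) become AI enablers.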
To continue enabling all these technologies, “BAE Systems signed a cooperative agreement with AFRL [the Air Force Research Laboratory] to transition their short-gate GaN technology to our foundry,” Rappa adds. When it comes to power needs, “This technology has significant improvements in power-added efficiency over the current state of the art.”
For radar and EW systems, “the most important factor in artificial intelligence signal processing is high bandwidth,” Donaldson says. There is a market for systems that require a high volume of data coming in, so much so that “there are new processors coming out that are focused solely on AI.”
With so much investment going into AI, “The idea is to teach and train these systems to automate more tasks,” Jedynak says. “With cognitive EW, if we’ve got the right silicon on platforms, which is suited for machine learning approaches – also meaning deep neural networks, convolutional neural networks, and LSTM [long short-term memory]-type systems – with the right foundation on the platform, we can start enabling more of these capabilities.”
“The increase in available memory on the processor, communication bandwidth for passing data, and overall throughput in embedded signal processing systems enable artificial intelligence for radar and EW applications,” Carter says. “A deep learning reference optimizer and runtime, NVIDIA’s TensorRT, can help radar and EW applications achieve low latency and higher throughput.”
Power also needs to be addressed when incorporating AI or neural networks in radar and EW systems: “What it takes to train a neural network is not the same as what it takes to run it,” Jedynak explains. “It takes a lot of horsepower to train a neural network and a lot of effort needs to go into it. However, once you have a trained network, it can run on much more modest hardware.”
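The train-versus-run disparity Jedynak describes can be sketched with a back-of-envelope FLOP estimate. Every number below is hypothetical, chosen only to show the scale of the gap between one-time offline training and per-input inference:

```python
# All figures are hypothetical, chosen only to illustrate the training/inference gap
params = 5e6                      # assumed network size: 5 million weights
examples = 1e5                    # assumed training-set size
epochs = 50                       # assumed passes over the training set

fwd_flops = 2 * params                     # ~1 multiply-add per weight per forward pass
train_flops_per_example = 3 * fwd_flops    # forward + backward pass (~2x forward cost)

total_training = epochs * examples * train_flops_per_example
per_inference = fwd_flops

print(f"training:  {total_training:.1e} FLOPs (once, offline)")
print(f"inference: {per_inference:.1e} FLOPs per input (fielded hardware)")
print(f"ratio: {total_training / per_inference:.1e}")
```

Even under these modest assumptions the one-time training workload is millions of times the cost of a single inference — which is why training can stay in the data center while the trained network deploys to SWaP-constrained platforms.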
Therefore, emphasis is being placed on the software side of developing radar and EW systems, says Keeler. “When you look at, for instance, NVIDIA and their frameworks for their software, when they first started releasing this in 2015, they had approximately 400,000 downloads. Then, in 2016, that jumped to two million downloads. Then, last year, they had over eight million downloads of their frameworks that deal with artificial intelligence. It’s just starting to ramp up. It’s just starting to gain traction in the industry. AI will really have an impact on EW and radar applications in the future.”
Ramping up AI and ML for military
A glimpse of what AI will be able to do for the military: “The long-term results, if we look at commercial areas that are applying AI and machine learning in systems, such as what DeepMind is doing with AlphaGo and now AlphaGo Zero, these are now teaching themselves how to play games all alone without any other direction,” Jedynak says. “That gives you an idea of where the technology is heading.”