The role of machine learning in autonomous spectrum sharing
Speakers and attendees at the recent NIWeek in Austin, Texas – the annual conference and exhibition held by National Instruments (NI) – discussed the technological obstacles and potential solutions for enabling autonomous spectrum sharing, and touted the U.S. Defense Advanced Research Projects Agency’s (DARPA’s) Spectrum Collaboration Challenge (SC2) as an excellent platform for finding such solutions.
Launched in 2016, SC2’s goal is to create a collaborative machine-learning competition to address radio frequency (RF) spectrum challenges. DARPA experts created SC2 to help users of the existing radio spectrum overcome the problem of clogged spectrum. Demand for radio spectrum has grown steadily over the past century, and in the past several years has increased at a rate of 50 percent per year.
SC2 wants to move away from traditional ways of communicating via one frequency. As DARPA’s Paul Tilghman explained during his keynote speech at NIWeek, one of the biggest obstacles in spectrum management is that “frequency isolation completely dominates our spectrum landscape.”
The problem, Tilghman pointed out, is that radios can no longer stay in one frequency lane and still meet the growing needs of the commercial and military worlds. That’s where SC2 comes in: It’s an open competition to create a unique radio system that autonomously collaborates with other radios without any preassigned spectrum allocation.
A total of 30 teams joined the competition, all competing for $3.75 million in prize money that will be allocated to first-, second-, and third-place winners.
To make this competition a reality, National Instruments teamed up with Johns Hopkins University Applied Physics Laboratory (APL) and DARPA to create one of the largest wireless testbeds ever built. Competitors will use what the group calls the “Colosseum,” physically located at Johns Hopkins, to test their algorithms and eventually create radios that can cognitively collaborate with each other.
The purpose of the Colosseum – a 256-by-256-channel RF emulator – is to recreate the wireless environment. It can calculate in real time more than 65,000 channel interactions among 256 wireless devices, emulating interactions between all types of wireless devices, including Internet of Things (IoT) nodes, cellphones, military radios, and the like.
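A quick back-of-the-envelope check on those figures (a sketch, not DARPA’s published math): a 256-by-256 channel matrix has one entry per transmitter-receiver pair, which is where the “more than 65,000” comes from.

```python
# Toy calculation: why a 256-by-256 emulator implies "more than 65,000"
# channel interactions. Each entry in the channel matrix models the path
# from one device's transmitter to another device's receiver.
N_DEVICES = 256

full_matrix = N_DEVICES * N_DEVICES           # every pair, self included
cross_channels = N_DEVICES * (N_DEVICES - 1)  # device-to-device paths only

print(full_matrix)     # 65536
print(cross_channels)  # 65280
```

Either way of counting lands above 65,000, which is the scale the emulator has to keep up with in real time.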
“The Colosseum is the wireless research environment that we hope will catalyze the advent of autonomous, intelligent, and – most importantly – collaborative radio technology, which will be essential as the population of devices linking wirelessly to each other and to the internet continues to grow exponentially,” Tilghman said in a statement released by DARPA.
NIWeek events provided a glimpse of what the future will hold in terms of spectrum management.
In short, said Manuel Uhm – director of marketing at Ettus Research, a National Instruments company, and chair of the board of directors of the Wireless Innovation Forum – at NIWeek: “This huge effort will enable warfighters and their radios to have spectrum situational awareness through the machine learning and AI [artificial intelligence] capabilities.”
The ideal end goal of the program is to enable autonomous spectrum sharing. “Today we need spectrum sensors (Environmental Sensing Capability, or ESC) and a Spectrum Access System (SAS),” said Uhm. “In the future, it would be amazing if none of that was necessary due to optimized and highly effective collaborative spectrum usage. It will take a long time before DoD [the Department of Defense] or commercial operators put their faith in such algorithms, but one can dream!”
SC2 will address current limitations in the communications arena. “When a naval radar signal is detected by a spectrum sensor, all the radios – known as CBSDs (Citizens Broadband radio Service Devices) – in that area have to stop using that spectrum within 60 seconds of notification,” Uhm said during NIWeek. “Adding more of the intelligence into the radios so they can act more autonomously would be a tremendous benefit, both commercially and for the military.”
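That 60-second evacuation rule can be sketched as a minimal state machine; the class and function names below are hypothetical illustrations, not taken from any real ESC or SAS implementation.

```python
from dataclasses import dataclass

# Illustrative sketch of the CBRS evacuation rule described above.
EVACUATION_DEADLINE_S = 60        # CBSDs must vacate within 60 s of notice
CBRS_BAND_MHZ = (3550.0, 3700.0)  # the shared 3.55-3.7 GHz band

@dataclass
class Cbsd:
    device_id: str
    active_band: tuple = CBRS_BAND_MHZ  # band currently in use, or None

    def vacate(self, band):
        """Stop transmitting in the protected band."""
        if self.active_band == band:
            self.active_band = None

def on_radar_detected(cbsds, protected_band, notified_at_s):
    """Notify every CBSD in the affected area and return the hard
    deadline by which each must have stopped using the band."""
    for cbsd in cbsds:
        cbsd.vacate(protected_band)
    return notified_at_s + EVACUATION_DEADLINE_S
```

In this toy version the decision lives entirely outside the radios; SC2’s premise is to push that intelligence down into the radios themselves.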
The SC2 project’s avenue into enabling machine learning is NI’s universal software radio peripherals (USRPs). “The teams are going to be controlling SDRs [software-defined radios] in multiple different scenarios,” Uhm explained. “The USRPs that the performers are going to be using have SDR and cognitive radio technology by the industry definition today, but the secret sauce isn’t for them to develop the radio part of it. The radio’s a solved problem. The unsolved part is how do you create a radio that is intelligent enough to be able to adapt dynamically to the spectrum situation that it’s facing?”
Competitors will use the testbed to “upload machine learning algorithms into those servers that are connected to the radios and they’ll be judged on the basis of how well those algorithms do on a series of criteria in terms of being able to communicate not just to their team members, but doing so in a way that doesn’t impinge on the communications of other teams,” he added.
An important factor in the challenge is that each emulated channel provides 100 MHz of bandwidth. Moreover, transmission and reception frequencies can be tuned anywhere between 10 MHz and 6 GHz.
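Those two constraints are easy to restate as a tiny sanity check; the helper function here is purely illustrative, not part of any SC2 or Colosseum API.

```python
# Challenge parameters from the article: each emulated channel offers
# 100 MHz of bandwidth, tunable anywhere from 10 MHz to 6 GHz.
TUNING_MIN_HZ = 10e6
TUNING_MAX_HZ = 6e9
CHANNEL_BW_HZ = 100e6

def center_frequency_valid(center_hz: float) -> bool:
    """Return True if a requested center frequency falls inside the
    Colosseum's tunable range (a simplified illustration)."""
    return TUNING_MIN_HZ <= center_hz <= TUNING_MAX_HZ

print(center_frequency_valid(2.4e9))  # True: the Wi-Fi band is in range
print(center_frequency_valid(7e9))    # False: above the 6 GHz ceiling
```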
Given these parameters, collaboration is the operative word in this challenge. “The key is those machine-learning algorithms that are downloaded onto the server, which control the behavior of the radio in reaction to the scenario being faced,” Uhm stated.
Additionally, he said, “In the background, there will also be a system that helps enable spectrum awareness. If you think about spectrum sharing today, there are systems called SAS that are used in CBRS to be aware of all the nodes in a particular area, and what frequency range they’re transmitting in within the 3.55 to 3.7 GHz for CBRS. Then, if a maritime radar signal is sensed, the SAS sends a signal to stop transmitting in that band to all the CBSDs in that geographic vicinity so there’s no interference with the radar. Well, that SAS is a logical component of how you could create situational awareness within Colosseum. It’s also useful in a broader potential wartime scenario or other types of homeland scenario type of events where broad communications is vital.”
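The SAS behavior Uhm describes (sense a radar, then order every CBSD in the geographic vicinity out of the band) can be sketched roughly as follows; all names, the haversine helper, and the protection-radius parameter are illustrative assumptions, not actual WInnForum SAS code.

```python
import math

CBRS_BAND_MHZ = (3550.0, 3700.0)  # 3.55-3.7 GHz CBRS band

def distance_km(a, b):
    """Great-circle (haversine) distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def cbsds_to_notify(cbsds, radar_position, radius_km):
    """Pick out the CBSDs a SAS would order to stop transmitting in the
    CBRS band: those inside the protection radius around the sensed radar."""
    return [c for c in cbsds
            if distance_km(c["position"], radar_position) <= radius_km]
```

A real SAS also tracks which frequency range each node is transmitting in; this sketch only captures the geographic-vicinity part of the decision.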
The Colosseum in the clouds
SDR technology will enable SC2 competitors to integrate in a cloud-computing-like environment. By using such an environment, the testbed will be accessible by all teams from around the country. “Putting such a powerful wireless testbed into the cloud is a first-of-its-kind achievement in and of itself,” Tilghman said, “but the most exciting thing will be to see the rapid evolution of how AI solves wireless communications challenges once the competitors have this unique and powerful resource at their disposal.”
With cloud capabilities, the possibilities of using this testbed are endless. Uhm explained: “If you place the intelligence in the cloud, you have a number of dummy nodes, whether they’re radios or some other type of sensor node that is just essentially passing sensor data to the cloud. In this situation, the cloud is making the decisions and then changing the behavior of the nodes.”
Putting intelligence at the edge “means those nodes now have the capability to have some level of intelligence, as well as being able to communicate with the cloud in most scenarios. By having processing capability right at the edge, it is possible to more quickly make decisions right at the edge. However, having artificial intelligence in a node means that you have to have computing resources, which consume power. For example, we all know how long our smartphone batteries last and none of us are happy about it.”
That brings the competitors to some well-known technical challenges, including processing and power limitations within the scenarios. “A smartphone can do amazing things within the context of a day before it needs to recharge. There are going to be scenarios where battery-operated sensors will need to last more than a day before being recharged, perhaps years. From a practical perspective, those are going to be scenarios where there will be very limited processing capability in the sensor to reduce power consumption. The sensor will communicate critical data to the network for decision-making. But again, that’s also a tradeoff because it also consumes power to transmit back and forth to a cloud. So there are a number of tradeoffs at the end of the day and the best answer will depend on the use case and requirements.”
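The tradeoff Uhm outlines, spending energy computing at the edge versus spending energy shipping raw data to the cloud, can be caricatured with a toy cost model. Every constant below is made up for illustration; real radio and CPU energy figures vary by orders of magnitude across hardware.

```python
# Toy model of the edge-vs-cloud energy tradeoff: all numbers here are
# made-up illustrative constants, not measured values.
ENERGY_PER_TX_BIT_J = 50e-9  # energy to radio one bit to the network
ENERGY_PER_CPU_OP_J = 1e-9   # energy for one local processing operation

def cloud_cost(raw_bits: float) -> float:
    """Dumb sensor: ship every raw bit to the cloud for processing."""
    return raw_bits * ENERGY_PER_TX_BIT_J

def edge_cost(raw_bits: float, ops_per_bit: float, compression: float) -> float:
    """Smart sensor: process locally, then transmit a compressed summary."""
    compute = raw_bits * ops_per_bit * ENERGY_PER_CPU_OP_J
    tx = (raw_bits / compression) * ENERGY_PER_TX_BIT_J
    return compute + tx

raw = 1e6  # one megabit of sensor data
# Light processing with heavy compression favors the edge...
print(cloud_cost(raw) > edge_cost(raw, ops_per_bit=10, compression=100))
# ...but compute-heavy workloads can flip the answer back to the cloud.
print(edge_cost(raw, ops_per_bit=1000, compression=100) > cloud_cost(raw))
```

As Uhm says, the best answer depends on the use case: the crossover point moves with the workload, the radio link, and the battery budget.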
The benefits of these teams running these scenarios are enormous for the military and commercial worlds. “There’s absolutely a huge commercial aspect related to having some level of dynamic capability and cognition in your smartphone,” Uhm said. “And again, some capabilities may require the broader dataset and processing capabilities in the cloud, so your phone just becomes the sensor. Taking some of the technology developed in the DARPA Spectrum Collaboration Challenge and putting that into a commercial context is absolutely on the table and that’s what DARPA would like to see, very honestly.”
The stakes are high: “The spectrum is a multibillion-dollar asset,” Uhm stated. “Maximizing the value of it is critically important, not just for the military, but in particular for commercial operators, and for all of us as end users.”
The story doesn’t stop here. Manuel Uhm and Paul Tilghman will host a full-day workshop on the DARPA Spectrum Collaboration Challenge at WinnComm 2017 (November 13-15). The workshop will also feature multiple speakers discussing the challenges engineers are still facing as well as those that have already been tackled. Check it out at WinnComm to get a better picture of SC2, and visit https://spectrumcollaborationchallenge.com to learn how to get involved.