Calling AI: Researchers Dial in Machine Learning for 5G

Berlin, Virginia Tech teams join effort to accelerate telecom industry with GPUs.
by Soma Velayutham

5G researchers from three top institutions have joined NVIDIA in bringing AI to telecom.

The Heinrich Hertz Institute (HHI), the Technical University of Berlin (TU Berlin) and Virginia Tech are collaborating with NVIDIA to harness the power of GPUs for next-generation cellular networks.

The journey began in October at MWC Los Angeles, where NVIDIA and partners announced plans to enable virtual radio access networks (vRANs) for 5G with GPUs.

NVIDIA also debuted Aerial, a software development kit for accelerating vRANs. And partners Ericsson, Microsoft and Red Hat are working with us to deliver GPU-powered 5G at the edge of the network.

These vRANs will bring cellular network operators the kind of operational efficiencies that cloud service providers already enjoy. Carriers will program network functions in high-level software languages, easing the work of adding new capabilities and deploying capacity where and when it’s needed.

Forging Wireless Ties

Our new research partnerships with HHI, TU Berlin and Virginia Tech will explore multiple ways to accelerate 5G with AI.

They’ll define novel techniques leveraging GPUs that help wireless networks use precious spectrum more efficiently. The work will span research in reinforcement learning and other techniques that build on the product plans announced in October.

HHI, founded in 1928, is part of Germany’s Fraunhofer Society and has a history of pioneering technologies in mobile and optical networking as well as video compression. The collaboration with TU Berlin includes a new 5G test bed with participation from a number of wireless companies in Germany.

“I want to redesign many algorithms in radio access networks (RAN) so we can perform tasks in parallel, and the GPU is a good architecture for this because it exploits massive parallelism,” said Slawomir Stanczak, a professor at TU Berlin and head of HHI’s wireless networking department.

Stanczak’s teams will explore use cases such as adapting AI to deliver improved 5G receivers. “If we are successful, they could offer a breakthrough in dramatically increasing performance and improving spectral efficiency, which is important because spectrum is very expensive,” he said.
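To make the idea concrete, here is a minimal sketch of what swapping one block of a classical receiver chain for a learned component can look like. It is not the HHI/TU Berlin design; the toy QPSK channel, the tiny network and the training loop are all assumptions made purely for illustration.

```python
# Illustrative only: a tiny "learned demapper" that maps noisy QPSK symbols
# to bit probabilities. Not the HHI/TU Berlin receiver design; just a sketch
# of how an AI component could replace one block of a classical receiver.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy QPSK "channel": random bits -> symbols -> additive Gaussian noise.
def make_batch(n, snr_db=10.0):
    bits = torch.randint(0, 2, (n, 2)).float()            # 2 bits per symbol
    symbols = 1 - 2 * bits                                 # map {0,1} -> {+1,-1}
    noise_std = 10 ** (-snr_db / 20)
    rx = symbols + noise_std * torch.randn_like(symbols)   # noisy I/Q samples
    return rx, bits

device = "cuda" if torch.cuda.is_available() else "cpu"

# Small fully connected network acting as the demapper.
model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2)).to(device)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(500):
    rx, bits = make_batch(1024)
    loss = loss_fn(model(rx.to(device)), bits.to(device))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Evaluate raw bit error rate on fresh data.
rx, bits = make_batch(10000)
pred = (model(rx.to(device)) > 0).float().cpu()
print("bit error rate:", (pred != bits).float().mean().item())
```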

In a session for GTC Digital, Stanczak recently described ways to apply AI to the private 5G campus networks that he believes will be market drivers for vRANs. Stanczak chairs a focus group on the use of AI in 5G for the ITU, a leading communications standards group. He’s also the author of a widely cited text on the math behind optimizing wireless networks.

Hitting 5G’s Tight Timing Targets

Work at Virginia Tech is led by Tom Hou, a professor of computer engineering whose team specializes in solving some of the most complex and challenging problems in telecom.

His Ph.D. student, Yan Huang, described in a 2018 paper how he used an NVIDIA Quadro P6000 GPU to solve a complex scheduling problem within the tight 100-microsecond window set by the 5G standard. His latest effort cut the time to 60 microseconds using an NVIDIA V100 Tensor Core GPU.
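Huang’s published algorithm isn’t reproduced here, but the general pattern it builds on, evaluating a scheduling metric for every user and resource-block pair in one batched GPU operation, can be sketched roughly as below. The proportional-fair metric, the problem sizes and the timing harness are assumptions for the sketch, not details from the paper.

```python
# Illustrative only: evaluate a proportional-fair (PF) scheduling metric for
# every (user, resource block) pair in one batched GPU operation, then pick
# the best user per resource block. This is NOT the Virginia Tech algorithm;
# the PF metric, sizes and timing harness are assumptions for the sketch.
import time
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

num_users, num_rbs = 100, 100                               # assumed problem size
rates = torch.rand(num_users, num_rbs, device=device)       # instantaneous rates
avg_tput = torch.rand(num_users, 1, device=device) + 0.1    # historical throughput

def schedule(rates, avg_tput):
    # PF metric: instantaneous rate divided by long-term average throughput.
    metric = rates / avg_tput        # (users, RBs), computed in parallel on the GPU
    return metric.argmax(dim=0)      # best user for each resource block

# Warm up once, then time the scheduling pass.
schedule(rates, avg_tput)
if device == "cuda":
    torch.cuda.synchronize()
t0 = time.perf_counter()
assignment = schedule(rates, avg_tput)
if device == "cuda":
    torch.cuda.synchronize()
print(f"scheduled {num_rbs} RBs in {(time.perf_counter() - t0) * 1e6:.1f} microseconds")
```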

The work “got an enormous response because at that time people using traditional computational techniques would hit roadblocks — no one in the world could solve such a complex problem in 100 microseconds,” said Hou.

“Using GPUs transformed our research group; now we are looking at AI techniques on top of our newly acquired parallel techniques,” he added.

Specifically, Virginia Tech researchers will explore how AI can automatically find and solve, in real time, thorny optimization problems in 5G networks. For instance, AI could uncover new ways to weave multiple services onto a single frequency band, making much better use of spectrum.

“We have found that for some very hard telecom problems, there’s no math formulation, but AI can learn the problem models automatically, enhancing our GPU-based parallel solutions,” said Huang.
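One way to read that remark: when no closed-form model exists, a model can be learned from measurements and then optimized. The sketch below does exactly that on a toy band-splitting problem. It is not the Virginia Tech method, and every detail in it (the simulated utility, the surrogate network, the grid search) is an assumption.

```python
# Illustrative only: with no closed-form model of the network available,
# one approach is to learn a surrogate model from measurements and optimize
# over it. Here a toy "unknown" utility for splitting a band between two
# services is sampled, fit with a small network, then maximized by grid search.
import torch
import torch.nn as nn

torch.manual_seed(0)

def measured_utility(split):
    # Stand-in for a quantity observed from the live network, with noise.
    return torch.sin(3.0 * split) * (1 - split) + 0.05 * torch.randn_like(split)

# Collect "measurements" at random band splits in [0, 1].
splits = torch.rand(512, 1)
utilities = measured_utility(splits)

# Fit a small surrogate model to the observations.
surrogate = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-2)
for _ in range(1000):
    loss = nn.functional.mse_loss(surrogate(splits), utilities)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Optimize over the learned model instead of a hand-derived formula.
candidates = torch.linspace(0, 1, 1001).unsqueeze(1)
best = candidates[surrogate(candidates).argmax()]
print(f"best predicted split: {best.item():.3f}")
```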

Groundswell Starts in AI for 5G

Other researchers, including two who presented papers at GTC Digital, are starting to explore the potential for AI in 5G.

Addressing one of 5G’s top challenges, researchers at Arizona State University showed a new method for directing millimeter wave beams, leveraging AI and the ray-tracing features in NVIDIA Turing GPUs.

And Professor Terng-Yin Hsu described a campus network at Taiwan’s National Chiao Tung University that ran a software-defined cellular base station on NVIDIA GPUs.

“We are very much at the beginning, especially in AI for vRAN,” said Stanczak. “In the end, I think we will use hybrid solutions that are driven both by data and domain knowledge.”

Compared to 4G LTE, 5G targets a much broader set of use cases with a much more complex air interface. “AI methods such as machine learning are promising solutions to tackle these challenges,” said Hou of Virginia Tech.

NVIDIA GPUs bring the programming flexibility of the CUDA and cuDNN environments and the scalability of multiple GPUs connected via NVLink. That makes them the platform of choice for AI on 5G, he said.
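As a small illustration of that point, the sketch below runs a cuDNN-backed convolution through a high-level framework built on CUDA and, when more than one GPU is visible, splits each batch across them. The layer shapes and the use of DataParallel are assumptions chosen for the example, not anything prescribed by Aerial or the researchers.

```python
# Illustrative only: a cuDNN-backed convolution running through a high-level
# framework on top of CUDA, scaled across however many GPUs are visible.
# The layer shapes and use of DataParallel are assumptions for the sketch.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
print("GPUs visible:", torch.cuda.device_count() if device == "cuda" else 0)

# A 1-D convolution over I/Q samples, the kind of primitive cuDNN accelerates.
model = nn.Sequential(nn.Conv1d(2, 16, kernel_size=7, padding=3), nn.ReLU(),
                      nn.Conv1d(16, 2, kernel_size=7, padding=3)).to(device)

if device == "cuda" and torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)   # split each batch across the visible GPUs

batch = torch.randn(64, 2, 4096, device=device)   # 64 blocks of I/Q samples
out = model(batch)
print("output shape:", tuple(out.shape))
```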

Today we stand at a pivot point in the history of telecom. The traditional principles of wireless signal processing are based on decades-old algorithms. AI and deep learning promise a revolutionary new approach, and NVIDIA’s GPUs are at the heart of it.