Telcos are increasingly going DIY when it comes to AI

Move over, ChatGPT. Operators are turning away from proprietary solutions and looking to do more artificial intelligence (AI) work in-house with open source models. According to an Nvidia survey of 450 telecom professionals around the world, the percentage of operators planning to use open source tools jumped from 28% in 2023 to 40% in 2025. Similarly, the percentage of respondents who indicated they will build AI solutions in-house rose from 27% in 2024 to 37% this year.

"They're really looking to do more of this work themselves," Nvidia's Global Head of Business Development for Telco Chris Penrose told Fierce. "They're seeing the importance of them taking control and ownership of becoming an AI center of excellence, of doing more of the training of their own resources."

This, of course, is a bit easier said than done. Penrose noted that the AI skills gap remains the biggest hurdle for operators. Why? Because, as he put it, just because someone is an AI scientist doesn't mean they are also necessarily a generative AI or agentic AI scientist specifically. And in order to attract the right talent, operators need to demonstrate that they have the infrastructure that will allow top-tier employees to do amazing work. See also: GPUs, data center infrastructure, etc.

AvidThink Founder and Principal Roy Chua noted that one of the biggest undertakings operators will face when using open source models is vetting the outputs they get during training.

But having skilled talent matters for more than just training AI. Penrose noted that with the rise of agentic AI, operator engineers will need to figure out how to link their in-house models with those offered by partners.

"It's not going to be one AI, it's going to be a bunch of AIs," Penrose explained. "And so, one of the big things telcos are going to need to think about is how do they interface and link these AIs … how do I stitch these things together? When do I invoke each of these to do what type of work? That's the next thing that they need to be thinking about."

AI-RAN

Getting it right will be critical as operators look to AI both to generate new revenue streams from external-facing services and to streamline their own internal operations. One key area where operators are looking to deploy AI is the radio access network (RAN). Just a year after the launch of the AI-RAN Alliance at Mobile World Congress 2024, a whopping 66% of operators said they are looking to deploy AI services on the RAN. Another 53% are exploring the use of AI to improve spectral efficiency.

But there are different ways to pair AI and the RAN. Chua noted that there's "AI on the RAN" (aka running AI workloads on the network), "AI for the RAN" (which is where spectral efficiency and capacity improvements feature) and "AI and the RAN" (which is where RAN and enterprise workloads share the same compute resources). And while AI on the RAN is relatively straightforward, the others are a little more tricky.

According to Penrose, Nvidia and partners like Fujitsu have already proven it's possible to run the RAN on accelerated compute infrastructure like Nvidia's GPUs and Aerial platform. That work has mostly focused on Layer 1 of the network (aka the physical layer). Now, efforts are focused on Layer 2, also known as the data link layer, to boost spectral efficiency.

"That's a huge deal, that's super powerful," Penrose said of the ability to make better use of limited spectrum resources.

So, where do things stand today in terms of deployments?
Penrose said Nvidia has been public about AI-RAN deployment efforts with T-Mobile and SoftBank, but is also in conversations with other unnamed operators around the world on this front. He added that part of the slowness in moving is that some operators are looking to time AI-RAN deployments with their usual investment cycles. The maturity of solutions from preferred partners is also a factor, he said.

"True field deployments are still at very early stages … But I think you're going to see more live field going out this year and scaling next year," Penrose concluded.

For what it's worth, Chua is a little skeptical that we'll see anything beyond proof-of-concept trials for AI and the RAN this year. Rest assured, we'll have our eyes peeled at MWC for new network AI use cases in a few weeks!

Article credit: https://www.fierce-network.com/wireless/telcos-are-increasingly-going-diy-when-it-comes-ai
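Penrose's point about stitching multiple AIs together is, at its core, an orchestration problem: deciding which model or partner agent gets invoked for which type of work. The sketch below is a minimal, hypothetical illustration of that idea in Python. The agent names, the keyword-based routing rule, and the dispatch helper are all assumptions made for illustration; neither Nvidia nor the operators quoted here describe any specific implementation.

```python
# Hypothetical sketch of routing work among several "AIs" (in-house and partner).
# All names and rules here are illustrative assumptions, not from the article.

from dataclasses import dataclass
from typing import Callable, Set


@dataclass
class Agent:
    name: str
    handles: Set[str]              # task types this AI is trusted with
    run: Callable[[str], str]      # the model or endpoint behind it


def in_house_llm(prompt: str) -> str:
    # Placeholder for an open source model fine-tuned on operator data.
    return f"[in-house model] {prompt}"


def partner_ran_agent(prompt: str) -> str:
    # Placeholder for a partner-hosted agent (e.g., a RAN optimization service).
    return f"[partner agent] {prompt}"


AGENTS = [
    Agent("network-ops", {"ticket_triage", "log_summary"}, in_house_llm),
    Agent("ran-tuning", {"spectral_efficiency"}, partner_ran_agent),
]


def dispatch(task_type: str, prompt: str) -> str:
    """Pick which AI to invoke for a given type of work."""
    for agent in AGENTS:
        if task_type in agent.handles:
            return agent.run(prompt)
    raise ValueError(f"No agent registered for task type: {task_type}")


if __name__ == "__main__":
    print(dispatch("ticket_triage", "Summarize overnight alarms for cell site 1234."))
    print(dispatch("spectral_efficiency", "Suggest scheduler weights for sector A."))
```

In practice the routing logic would be far richer (cost, latency, data residency, and confidence all factor in), but the basic shape is the same: a registry of models and an explicit policy for when each one gets called.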