Telecom & Connectivity | 4 min read

89% of Telecom Executives Are Increasing AI Budgets. The ROI Isn't There Yet.

89% of telecom executives plan to increase AI spending over the next 12 months, but power costs, specialized hardware, and uncertain short-term ROI are delaying full AI-native network deployments.


By Hector Herrera | May 6, 2026 | Telecom

Nearly nine in ten telecom executives plan to increase AI spending over the next 12 months, according to a new state-of-AI-in-telecom report. The top use cases — autonomous network management and AI-RAN (Radio Access Network systems that use AI to optimize spectrum and signal processing) — have clear theoretical returns. The problem: costs are the biggest hurdle, and short-term ROI remains uncertain enough that full AI-native deployments are being delayed at most carriers.

This is the telecom industry's version of a near-universal enterprise AI story: the investment conviction is there; the deployment economics haven't yet caught up.

What Carriers Are Actually Building

The report identifies three primary deployment areas where AI is already generating measurable value in telecom networks:

Autonomous network management: AI systems that monitor network performance, detect anomalies, and make real-time adjustments without human operators. The promise is fewer outages, faster fault resolution, and lower NOC (Network Operations Center) staffing costs. Early deployments are showing results in fault detection speed — catching issues in minutes rather than hours.
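The core of this kind of fault detection is simple in principle: watch a KPI stream and flag readings that deviate sharply from recent behavior. A minimal sketch, using a rolling z-score over a synthetic latency series — every value, window size, and threshold here is illustrative, not taken from the report or any carrier's system:

```python
# Rolling z-score anomaly detection over a network KPI stream.
# All data, window sizes, and thresholds are illustrative assumptions.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(samples, window=10, threshold=3.0):
    """Return indices where a sample deviates more than `threshold`
    standard deviations from the trailing window's mean."""
    history = deque(maxlen=window)
    anomalies = []
    for i, x in enumerate(samples):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(x - mu) / sigma > threshold:
                anomalies.append(i)
        history.append(x)
    return anomalies

# Simulated cell-site latency (ms): steady around 20 ms, one fault spike.
latency = [20.1, 19.8, 20.3, 20.0, 19.9, 20.2, 20.1, 19.7, 20.0, 20.2,
           95.0,   # fault: sudden latency spike
           20.1, 19.9]
print(detect_anomalies(latency))  # → [10]: only the spike is flagged
```

Production systems layer far more on top — multivariate models, topology awareness, automated remediation — but the speed advantage the report describes comes from exactly this pattern: continuous evaluation of every site, every second, at a scale no NOC staff could match.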

AI-RAN: Applying machine learning to radio access networks allows dynamic spectrum allocation and signal optimization that traditional rule-based systems can't match. In dense urban environments where spectrum is constrained and interference is high, AI-RAN can improve throughput and coverage. The challenge: it requires specialized hardware and deep integration with existing RAN infrastructure.

Edge computing + AI: Moving AI inference to the network edge — closer to end devices rather than centralized in the cloud — reduces latency for latency-sensitive applications. For carriers, it also opens a potential new revenue stream: selling edge compute capacity to enterprises that need low-latency AI at the network level.

The Four Cost Barriers

The report is specific about what's holding back AI-native network deployments, and the barriers are structural, not trivial:

  1. Power consumption. AI inference and training workloads are energy-intensive. Running AI continuously at network scale — across thousands of cell sites and data centers — adds meaningful operating cost. For carriers already managing thin margins on connectivity revenue, energy cost is a first-order constraint.

  2. Specialized hardware. AI-RAN requires purpose-built accelerators (GPUs, NPUs) that are expensive and in short supply. Carriers can't just upgrade software on existing hardware. The capital expenditure cycle for network equipment runs 5–10 years, which means full AI-RAN deployment is measured in the same timeframe.

  3. Integration complexity. Telecom networks are extraordinarily complex, multi-vendor environments. Integrating AI systems that span multiple hardware generations, vendor ecosystems, and legacy OSS/BSS (Operations Support Systems / Business Support Systems) platforms requires substantial engineering work and creates long implementation timelines.

  4. Uncertain short-term ROI. The business case for autonomous networks is sound at full deployment scale. At partial deployment — which is where most carriers are — the ROI picture is murkier. You incur the full cost of the AI system but capture only a fraction of the labor and efficiency savings.
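The fourth barrier has a simple arithmetic shape worth making explicit: a large share of the platform cost (licenses, integration, core infrastructure) is fixed and paid up front, while savings scale roughly with coverage. A back-of-envelope sketch — every dollar figure and ratio below is an illustrative assumption, not a number from the report; the point is the shape of the curve:

```python
# Back-of-envelope ROI at partial vs. full deployment.
# All figures are illustrative assumptions, not from the report.

def annual_roi(coverage, platform_cost=50e6, full_savings=80e6,
               fixed_share=0.6):
    """ROI at a given deployment coverage (0.0 to 1.0).
    `fixed_share` of platform cost is paid regardless of coverage;
    the rest, and the savings, scale with coverage."""
    cost = platform_cost * (fixed_share + (1 - fixed_share) * coverage)
    savings = full_savings * coverage
    return (savings - cost) / cost

for coverage in (0.2, 0.5, 1.0):
    print(f"{coverage:.0%} deployed: ROI {annual_roi(coverage):+.0%}")
```

Under these assumed numbers, a 20% deployment runs deeply negative, roughly break-even arrives around half coverage, and the business case only turns clearly positive near full scale — which is exactly why carriers sitting at partial deployment see a murky short-term picture even when the full-scale case is sound.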

Why Carriers Are Investing Anyway

The 89% planning to increase budgets despite these barriers reflects a strategic logic: the carriers that build AI-native networks first will have structural cost and capability advantages that compound over time. Network automation means lower operating costs at scale. AI-optimized spectrum means better coverage at lower capex. The carriers that defer risk spending years catching up to competitors that integrated early.

There's also a competitive threat from non-traditional players. Hyperscalers — Amazon, Google, Microsoft — have both the AI capability and the capital to build private network infrastructure. If carriers don't provide compelling AI-native offerings, some enterprise customers will route around them.

What the Numbers Mean for Vendors

The telecom AI spending surge is a significant revenue opportunity for a specific set of vendors. Ericsson and Nokia, the dominant RAN vendors, are both positioning AI-RAN as a core feature of their current-generation equipment. NVIDIA is a key enabler — GPU acceleration for AI-RAN inference is a real and growing market. Cloud providers are positioning as AI platform partners for network automation.

Smaller AI operations software vendors targeting NOC automation and network analytics are also seeing accelerated interest. The enterprise architecture for autonomous networks requires a stack that most carriers are still assembling.

What to Watch

Track carrier capital expenditure announcements for explicit AI-RAN line items — several major carriers have begun breaking out AI network spending separately, which will provide the clearest picture of deployment pace. Ericsson and Nokia quarterly earnings calls are the best windows into hardware order momentum. The first carrier to announce a fully autonomous network for a major metro market will set the benchmark others are measured against.

Key Takeaways

  • 89% of telecom executives plan to increase AI spending over the next 12 months.
  • Cost is the biggest hurdle: power consumption, specialized hardware, integration complexity, and uncertain short-term ROI.
  • The top use cases are autonomous network management, AI-RAN, and edge AI.
  • Carriers are investing anyway because early movers expect compounding cost and capability advantages — and hyperscalers loom as a competitive threat.


Written by

Hector Herrera

Hector Herrera is the founder of Hex AI Systems, where he builds AI-powered operations for mid-market businesses across 16 industries. He writes daily about how AI is reshaping business, government, and everyday life. 20+ years in technology. Houston, TX.

More from Hector →

Get tomorrow's AI briefing

Join readers who start their day with NexChron. Free, daily, no spam.
