AI Infrastructure · Public

AMD

GPU and CPU challenger in AI compute

Founded 1969 · Santa Clara, CA · 26,000+ employees · Hardware sales
AMD (NMS): $278.39 ▲ +0%
Market Cap Tier: large-cap

Price delayed up to 15 minutes. Source: Yahoo Finance.

Earnings Snapshot

Q4 2025: Revenue $7.6B (+24% YoY). Data center revenue $3.9B (+69%). AI accelerator revenue exceeded $5B for full year 2025.

About AMD

Advanced Micro Devices (AMD) is the primary competitor to NVIDIA in the AI accelerator market. Founded in 1969, AMD designs CPUs and GPUs for data centers, gaming, and embedded systems. Under CEO Lisa Su's leadership since 2014, AMD has transformed from a struggling chipmaker into a serious challenger across every major computing segment.

AMD's MI300X GPU is the company's flagship AI accelerator, designed to compete directly with NVIDIA's H100. With 192GB of HBM3 memory (50% more than the H100), the MI300X targets large language model inference workloads where memory capacity is the bottleneck. AMD has secured significant AI accelerator orders from Microsoft, Meta, and Oracle.

The company's challenge is not hardware performance — the MI300X is competitive on many benchmarks — but software ecosystem. NVIDIA's CUDA has over a decade of developer adoption, making ROCm (AMD's equivalent) a harder sell despite being open-source. AMD is investing heavily in ROCm development and working with major AI frameworks to ensure compatibility.

Technology & Approach

AMD's AI strategy combines its CDNA GPU architecture (optimized for compute) with ROCm, its open-source GPU computing platform. The MI300 series uses a chiplet design that packages GPU compute dies (and, in the MI300A variant, CPU dies) with HBM3 memory, maximizing memory bandwidth. AMD emphasizes memory capacity as a differentiator — enabling larger models to fit in a single GPU's memory without the performance penalty of model parallelism across multiple GPUs. The company is also developing AI capabilities in its EPYC server CPUs and Ryzen AI client processors.
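The memory-capacity argument above reduces to simple arithmetic: if a model's weights alone exceed one GPU's memory, the model must be sharded across GPUs. A minimal sketch, using illustrative numbers (the 70B parameter count, FP16 precision, and 10% headroom reserve are assumptions for illustration, not AMD sizing guidance):

```python
import math

def weights_gib(params_billions: float, bytes_per_param: int = 2) -> float:
    """Memory for model weights in GiB (2 bytes/param assumes FP16/BF16)."""
    return params_billions * 1e9 * bytes_per_param / 2**30

def min_gpus(params_billions: float, gpu_mem_gib: float,
             headroom: float = 0.9) -> int:
    """GPUs needed just to hold the weights, reserving 10% of memory
    for activations and KV cache (a simplifying assumption)."""
    return math.ceil(weights_gib(params_billions) / (gpu_mem_gib * headroom))

# A 70B-parameter model in FP16 needs ~130 GiB for weights alone,
# so it spans two 80 GB-class GPUs but fits on one 192 GB MI300X.
print(round(weights_gib(70), 1))  # ~130.4 GiB
print(min_gpus(70, 80))           # 2 GPUs at 80 GB each
print(min_gpus(70, 192))          # 1 GPU at 192 GB
```

Under these assumptions, the single-GPU fit is the point: serving from one device avoids the cross-GPU communication that tensor parallelism introduces on every forward pass.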

Products & Services

MI300X (Hardware)

Flagship AI accelerator with 192GB HBM3 memory. Designed for LLM inference and training workloads.

ROCm (Platform)

Open-source GPU computing platform. AMD's answer to NVIDIA CUDA for AI and HPC workloads.

EPYC (Hardware)

Server CPU family powering major cloud data centers. AI-optimized variants include dedicated inference accelerators.

Ryzen AI (Hardware)

Client processors with integrated neural processing units (NPUs) for on-device AI inference.

Leadership

Lisa Su
Chair & CEO
Led AMD's turnaround since 2014. One of the most respected CEOs in tech.
Victor Peng
President
Former CEO of Xilinx (acquired by AMD for $49B).
Mark Papermaster
EVP & CTO
Leads technology and engineering across all product lines.

Notable Achievements

  • MI300X secured major orders from Microsoft, Meta, and Oracle
  • Lisa Su named one of Fortune's Most Powerful Women repeatedly
  • Acquired Xilinx for $49B, adding FPGA and adaptive computing capabilities
  • ROCm platform is fully open-source, unlike NVIDIA's proprietary CUDA
  • Data center revenue grew over 100% YoY in 2024


Financial Disclosure: NexChron provides financial data for informational purposes only. This is not investment advice, a recommendation to buy or sell securities, or an offer to transact. Stock prices are delayed up to 15 minutes and sourced from Yahoo Finance. Funding round data is compiled from public reports and may not reflect the most current information. Company valuations, revenue estimates, and financial projections are based on publicly available data and may be inaccurate or outdated. Always consult a qualified financial advisor before making investment decisions. NexChron, its founder, and contributors may hold positions in companies mentioned on this site.