News

Nvidia Acquires Groq Assets for $20B

By Michelle Hawley
Nvidia's largest deal brings in AI inference technology and key talent in-house.

Key Takeaways 

  • Nvidia announces acquisition of Groq's AI chip assets for $20 billion.
  • Groq executives and engineers will join Nvidia, while Groq remains independent.
  • Senior technology executives should assess potential improvements in AI infrastructure and inferencing performance from Nvidia's expanded platform.

Nvidia's $20 billion Groq acquisition is the latest in the chipmaker's aggressive push to dominate AI inference as enterprise demand accelerates.

The transaction, announced Dec. 24, 2025, marks Nvidia's largest deal to date, nearly tripling its previous record acquisition of Mellanox for close to $7 billion in 2019, according to company officials.

Groq founder and CEO Jonathan Ross, President Sunny Madra and other senior leaders will join Nvidia to advance the licensed technology. Groq will continue operating as an independent company under new CEO Simon Edwards, the company's former finance chief. The GroqCloud business is not part of the transaction and will continue without interruption.

Alex Davis, CEO of Disruptive, which led Groq's $750 million financing round in September at a valuation of about $6.9 billion, confirmed the deal came together quickly.


What the Groq Asset Acquisition Adds to Nvidia’s AI Stack

Nvidia is licensing Groq's inference technology through a non-exclusive agreement while acquiring the startup's assets and hiring key personnel. The deal positions Nvidia to expand its generative AI capabilities and strengthen its inference platform.

"We plan to integrate Groq's low-latency processors into the NVIDIA AI factory architecture," said Nvidia CEO Jensen Huang, "extending the platform to serve an even broader range of AI inference and real-time workloads."

What Nvidia Gains and How It Works

  • Low-latency processors: Groq chips designed for faster AI inference tasks
  • Inference technology license: non-exclusive rights to Groq's AI acceleration IP
  • Nvidia AI factory integration: extends the platform to serve broader inference workloads
  • Talent acquisition: key Groq executives and engineers joining Nvidia

Nvidia Tightens Grip on AI Compute With New Investments

Nvidia reached a historic $5 trillion market valuation in October 2025, becoming the first company to achieve this milestone. The company announced $500 billion in AI chip orders as of October 2025, with its H100 and Blackwell processors powering major large language models including ChatGPT.

Nvidia has made significant strategic investments, including plans to invest up to $100 billion in OpenAI while becoming a key data center chip supplier beginning in 2026, and up to $1 billion in AI startup Poolside.

Why Every Tech Giant Is Racing to Build Its Own AI Chips

Tech giants are racing to build custom silicon, forging alliances and acquiring startups to control their AI compute destiny.

Meta acquired Rivos in October 2025 to build custom AI chips and reduce dependency on third-party accelerators. Amazon developed AWS Trainium and Inferentia, Microsoft created Azure Maia and Cobalt, and Google built Axion — all seeking to decrease reliance on external suppliers.

Frontier AI Capacity Consolidates Under US Tech Giants

Strategic partnerships have become critical as companies secure compute resources. Anthropic signed a multi-billion dollar deal with Google for access to one million TPUs in October 2025. OpenAI inked a $38 billion AWS deal in November 2025 to diversify beyond Microsoft Azure.

Approximately 75% of the world's large-scale AI compute capacity now resides in the United States, with Microsoft, OpenAI and Nvidia controlling most frontier compute resources.


Nvidia’s Road to Dominance in GPU and AI Infrastructure

Nvidia, founded in 1993 in California, designs graphics processing units (GPUs) and central processing units (CPUs) and offers cloud services and software development kits supporting AI, high-performance computing and data analytics. The company's hardware underpins much of the machine learning infrastructure powering today's AI applications.

About the Author
Michelle Hawley

Michelle Hawley is an experienced journalist who specializes in reporting on the impact of technology on society. As editorial director at Simpler Media Group, she oversees the day-to-day operations of VKTR, covering the world of enterprise AI and managing a network of contributing writers. She's also the host of CMSWire's CMO Circle and co-host of CMSWire's CX Decoded. With an MFA in creative writing and background in both news and marketing, she offers unique insights on the topics of tech disruption, corporate responsibility, changing AI legislation and more. She currently resides in Pennsylvania with her husband and two dogs.

Main image: Simpler Media Group