Editor's Note: This article has been updated to include the latest data and information.
There is one thing that is certain about AI: It is a hungry beast. It takes power to build and train large language models (LLMs). It takes processing speed to deliver generative AI that can answer complex questions, conduct in-depth research and even hallucinate wild answers to persistent questions at the speed of thought.
All types of AI eat massive amounts of high-speed computing power for lunch — and for breakfast and dinner. That means that at the core of every AI system is a powerful processor, a chip created from silicon and innovation. The companies that build the fastest and smallest chips, perhaps those that can deliver high-power computing without drinking the planet’s resources, will win the heart — and investment dollars — of this industry. But building this technology takes time and huge investment. And the race is already hot.
AI’s ravenous appetite for compute power is pushing the edge of Moore’s Law and forcing chip designers and chipmakers to innovate in ways they haven’t had to in years. Silicon has been creating wealth for decades. But this is a new gold rush.
Here are some of the top chip companies competing in the AI market.
1. NVIDIA
NVIDIA is a dominant force when it comes to AI chips. This dominance has shot the company to the top of the stock market and made it one of the world’s most valuable companies (with a valuation currently sitting at just under $2.8 trillion).
The company’s AI chips dominate the market and are used by everyone from car makers to tech and AI companies. This success has not caused the company to kick back and slow down. In March of 2024, NVIDIA announced the NVIDIA Blackwell platform, which is designed to help organizations build and run real-time generative AI at up to 25 times lower cost and energy consumption than its predecessor. The massive performance leaps provided by this architecture have already started changing everything from gaming technology to the speed at which foundational models work.
2. AMD
NVIDIA is an AI-chip powerhouse, but longtime chipmaker AMD is throwing some serious compute power into the game. It announced the Ryzen AI Pro 300 series chips late last year.
These are the company’s third-generation commercial AI mobile processors, and they offer three times the AI performance of the previous generation. Designed for AI-powered business laptops, they can handle trillions of operations per second and exceed Microsoft’s AI power requirements. Through a partnership between AMD and Microsoft, these are the chips that power the AI built into Microsoft’s Copilot+ PCs.
Related Article: Meet the Startups Taking on Big Tech With Smarter AI Chips
3. Intel
Intel has long been inside the computers we use, though it has taken a bit of a beating from NVIDIA. The company announced a new line of AI chips last year: the Gaudi 3 accelerator, designed to be the backbone for generative AI software and, according to Intel, capable of training LLMs 50% faster than NVIDIA’s H100 processor.
Sales of this chip did not meet expectations, however, and the company didn’t meet its revenue targets in 2024. Experts speculate that this is because other chips — NVIDIA’s Blackwell and AMD’s MI300X — are better suited to training LLMs.
4. Meta
Meta is moving deeper into custom-made AI chips, further demonstrating its commitment to AI. The company’s Meta Training and Inference Accelerator (MTIA) is built to speed up the deep learning workloads behind its services. Meta considers the chip part of a long-term plan to build out the infrastructure supporting its AI-hungry products, is investing heavily in that infrastructure and has released two generations of the chip.
5. Qualcomm
Qualcomm is a chip developer to its core. It started as a mobile chip designer and is now building AI into those mobile chips. A powerful combination.
The company’s Snapdragon processors include dedicated AI engines to efficiently handle machine learning (ML) right on the device — a boost to speed and privacy for the person holding that device. Qualcomm is also building AI into its chips for PCs, cars and internet of things (IoT) devices.
6. Google
Late in 2024, Google announced the general availability of Trillium, the company’s sixth-generation TPU. This Tensor Processing Unit is powerful enough to train and fine-tune large language models, and Google used it to train its AI model Gemini 2.0. The chip also sits at the core of Google Cloud’s AI Hypercomputer, a supercomputing architecture that integrates super powerful hardware into a single system to deliver mind-blowing compute power.
7. Amazon
AWS’s second-generation Trainium chips deliver as much as four times the performance of the company’s first generation. The chips are created with the express purpose of training AI. It’s right there in the name, and many AI developers have adopted them for exactly that purpose.
The company also produces the Inferentia accelerator, a high-performance, low-cost AI inference chip. Add the company’s $4 billion investment in Anthropic and $110 million investment in university-led AI research, and AWS is a major player in the AI universe. Anthropic not only uses AWS’s cloud services, it also uses the Trainium and Inferentia chips to develop and train its future foundation models.
Related Article: Supercomputing for AI: A Look at Today’s Most Powerful Systems
8. Groq
Groq, Inc. — not to be confused with Grok the chatbot — is an AI company founded by a group of former Google engineers. The company builds its processors around a network-on-chip design, and its GroqRack Compute Cluster links as many as 64 of these interconnected chips into a single system. The cluster can optimize data center efficiency, support large-scale deployments and is often used for deep learning, model training and large-scale data processing.
The cluster’s high speed, scalability, wide bandwidth and reliability give it the processing power to manage massive workloads, train LLMs, support research and serve many other applications.
9. Microsoft
We all think of Microsoft as a software company, but it also delivers some major hardware. So, it is perhaps not that surprising that the company announced it was building its own AI chip — the Azure Maia 100 — in November of 2023.
The chip is intended not only to handle the unique demands of AI workloads — vast power, cooling and network capability — but also to integrate seamlessly into Microsoft’s Azure infrastructure. The company sees this as a way to tailor every aspect of its stack to its own needs. Given that Microsoft also holds a big stake in OpenAI, a market-leading product in Microsoft Copilot and cloud AI services through Azure, Maia holds promise.
10. Apple
Apple might be late to the game compared to the other chip companies here, but lagging behind has never stopped Apple from reinventing technologies — or markets. The company is reportedly overhauling its entire Mac line with its own AI-focused M4 chip. The chip boasts Apple’s fastest-ever Neural Engine, capable of 38 trillion operations per second, and is already powering the iPad Pro, giving the company’s flagship power tablet industry-leading speed and performance.