Russ Cohen

The Chipmaker Revolution: Navigating the Data Center Frontier

Apple is best known for consumer electronics that develop a cult-like following. In fact, the company is also a formidable chipmaker. Its very first in-house chip appeared in the iPhone 4 back in 2010, yet today all Apple computers run on its own processors.

Apple’s chips are for its own consumption, though; it doesn’t sell them to others. And because it relies on its own designs, it is no longer beholden to outside chip designers. Apple isn’t a foundry, however. It doesn’t physically make its own processors; like so many others, it turns that function over to Taiwan Semiconductor Manufacturing.

Now Apple is looking to expand. A recent report in The Wall Street Journal said the tech giant is developing its own chips designed to run artificial intelligence (AI) software in data center servers.

The project is reportedly codenamed ACDC, for Apple Chips in Data Center. These processors will target AI inference rather than AI training. The difference: in training, processors develop and refine AI models from known data, while in inference, they apply already-trained models to real-world data.
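To make that distinction concrete, here is a minimal, purely illustrative Python sketch (plain NumPy, unrelated to any actual Apple or Nvidia software): the training loop learns a parameter from known examples, and the inference step simply applies that learned parameter to new input.

```python
# Illustrative sketch of training vs. inference (toy example, not production AI code).
import numpy as np

# --- Training: develop and refine a model from known examples ---
x = np.array([1.0, 2.0, 3.0, 4.0])   # inputs
y = np.array([2.1, 3.9, 6.2, 7.8])   # known outputs (roughly y = 2x)
w = 0.0                               # model parameter to be learned

for _ in range(200):                  # gradient descent on squared error
    grad = np.mean(2 * (w * x - y) * x)
    w -= 0.01 * grad

# --- Inference: apply the already-trained model to new, real-world data ---
new_input = 5.0
prediction = w * new_input
print(f"learned w = {w:.2f}, prediction for {new_input}: {prediction:.2f}")
```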

Apple isn’t alone. Microsoft and Meta Platforms are developing custom inference chips to minimize reliance on third-party chipmakers. But with the field growing more crowded, which are the best chipmaker stocks to buy? The three companies below easily make the cut.

The Reign of NVIDIA (NVDA)

There is no question Nvidia has to top the list. Its graphics processing units (GPUs) dominate right now with a 98% share of the data center GPU market, according to Wells Fargo. Although that is expected to drop to between 94% and 96% as rivals bring their own processors to market, Nvidia will still be the industry leader.

Nvidia’s GPUs target the training side of the equation. Its earlier Tesla and Ampere GPUs were designed to tackle the vast computational challenges AI presented, and its next-generation Hopper GPUs took things to another level, delivering up to six times the performance of the Ampere chips. The Hopper-based H100 is still seen as the benchmark chip for training AI models, the one other chipmakers aspire to match. The latest iteration, the GH200 Grace Hopper Superchip, triples the bandwidth the H100 offered.


But it is also much more expensive, so adoption may not be as quick. Hyperscalers that want unparalleled power may choose the GH200, yet the price opens a wedge for competitors to offer lower-cost chips that can more than get the job done. Still, Nvidia remains the premier AI chipmaker stock to buy.

ARM Holdings (ARM)

British chip stock Arm Holdings is not exactly a chipmaker. Rather, it develops chip architectures and instruction sets that it licenses to customers. Apple’s chips are all Arm-based processors, and the two companies signed an agreement last year that extends through 2040 and beyond.

Beyond Apple, Arm’s clients include Qualcomm and Amazon. These companies pay Arm Holdings royalties on each unit shipped.

The low power requirements of Arm-based designs have made the chips exceptionally popular with mobile handset manufacturers. That same efficiency has put them in demand in data centers, where energy consumption is an increasing concern. The large language model chips from Nvidia, Microsoft, and Amazon all run on Arm’s newest Armv9 technology. While GPUs perform the primary AI functions in data centers, Arm-based chips run alongside them, so Arm Holdings will benefit from the growth AI is creating in data center demand.

Indeed, because of Nvidia’s data center dominance, you could see Arm become an integral part of data center architecture. Nvidia’s latest and most advanced chip, the Grace Blackwell Superchip, integrates the chipmaker’s GPUs with Arm Holdings’ central processing units (CPUs). Arm CEO Rene Haas told analysts, “I think now with Nvidia’s most recent announcement, Grace Blackwell, you are going to see an acceleration of Arm in the data center in these AI applications.”