As AI propels semiconductor stocks, Cantor analysts say Nvidia is their top pick.
Analysts at Cantor Fitzgerald believe that artificial intelligence is “the only meaningful growth driver” for chip stocks, and they have picks for which names stand to gain the most from the projected increase in AI-infrastructure spending by major cloud providers, sovereigns, neoclouds, and enterprises.
Analysts at Cantor said in a note on Wednesday that they anticipate these trends “to continue driving the AI trade” as tech giants revise capital expenditures upward and cite the need to spend more on AI infrastructure, and as businesses that make AI hardware exhibit “strong product cycles.”
Nvidia Corp. (NVDA), which is currently ramping its Blackwell AI platform, was the Cantor team’s top pick. The Blackwell ramp, the analysts said, sets Nvidia “up for meaningful beats/raises,” and its earnings per share could reach $8 next year, supporting their $240 price target. That would be well above the $6.31 consensus estimate for Nvidia’s EPS in the coming fiscal year.
Taiwan Semiconductor Manufacturing Company Ltd. (TW:2330), Advanced Micro Devices Inc. (AMD), Broadcom Inc. (AVGO), and Micron Technology Inc. (MU) are the other chip stocks Cantor expects to benefit from exposure to AI.
AI is now “an area of relative certainty within a very uncertain world” marked by geopolitical tensions and economic difficulties, the analysts said. For instance, TSMC told Bloomberg on Tuesday that the U.S. government has revoked its waiver for exporting certain chip-making technology and equipment to its plant in China, and that its validated end-user (VEU) status will lapse on December 31. The VEU status of SK Hynix Inc. (KR:000660) and Samsung Electronics Co. (KR:005930) was also withdrawn.
The analysts also said that while momentum stocks have lost some of their luster, this is a short-term issue. Recent reports claiming that businesses have struggled to integrate AI into their operations are “overdone,” they added, and they are not worried because the return on capital for the hyperscale cloud providers is “still strong.”
Two months ago, the analysts expected combined capital expenditures at Microsoft Corp. (MSFT), Meta Platforms Inc. (META), Alphabet Inc. (GOOGL) (GOOG), and Amazon.com Inc. (AMZN) to climb 40% this year and 9% in 2026; they now expect growth of 57% this year and 20% in 2026.
The Cantor team also noted that the AI trade is facing “some agita” over reports that the Chinese government is discouraging local companies from adopting American technology. That is particularly true of Nvidia’s H20 chip, which the company is waiting to resume selling to Chinese customers after the Trump administration effectively banned those sales in April.
But Cantor views these problems “as noise today,” and the analysts are reaffirming their “bullish thesis” that AI development and adoption are still in their early stages.
With AMD’s analyst day coming in November, the Cantor analysts said optimism is building around the company’s data-center graphics processing units, with an “intact” outlook for its Instinct line of AI accelerators. Meanwhile, AMD’s client and server central-processing-unit businesses have “only accelerated higher” on higher average selling prices and share gains. Earlier this year, investors had worried that customers were pulling purchases forward to get ahead of potential tariff-driven price increases. The analysts estimate AMD’s earnings per share at about $4 for the year, versus a FactSet consensus of $3.85.
Even though the chip maker’s data-center revenue is expected to remain “relatively small” at an estimated $6.5 billion this year, the analysts have “high hopes around the company’s ability to significantly grow its penetration across the AI landscape,” thanks to AMD’s push to offer rack-scale solutions. They also cited AMD’s “vision of capturing a non-trivial share in large scale training clusters” and growing demand for AI inference.