Micron stock declines after forecast fails to meet lofty AI spending expectations

  • Micron’s shares fell 7 per cent in extended trading, denting this year’s 67 per cent rally built on expectations that it will benefit from AI demand

Ramping up production of high-bandwidth memory chips has been a challenge, according to Micron Technology. Photo: Shutterstock
Micron Technology, the largest US maker of memory chips, declined in late trading after its forecast disappointed investors seeking a bigger payoff from artificial intelligence (AI) mania.
Micron’s fiscal fourth-quarter sales will be US$7.4 billion to US$7.8 billion, the company said in a statement on Wednesday. While the average analyst estimate was US$7.58 billion, some projections were above US$8 billion. Profit will be about US$1.08 a share, excluding certain items, versus a projection of US$1.02.

Though Micron is getting a boost from the global AI boom, demand is still sluggish in its traditional markets, such as personal computers and smartphones. Those sectors are only beginning to recover from a historic slump last year.

The Nasdaq-listed company’s shares fell about 7 per cent in extended trading. Micron had rallied 67 per cent this year before the close, lifted by investor expectations that it will be one of the main beneficiaries of AI spending.

A bare wafer stacker sorts silicon wafers in Building 51 at Micron Technology’s headquarters in Boise, Idaho, on June 10, 2024. Photo: Bloomberg

In the third quarter, which ended May 30, Micron’s revenue rose 82 per cent year on year to US$6.81 billion. The Boise, Idaho-based company reported a profit of 62 cents a share, excluding certain items. That compares with estimated sales of US$6.67 billion and a projected profit of 50 cents a share.

Micron sells a vital component of AI hardware – high-bandwidth memory (HBM) chips – that works with processors from Nvidia Corp to crunch data. HBM chips can serve up information more quickly, helping computing systems develop and run AI models.