Meta Platforms to deploy in-house custom chips this year to power AI drive: internal memo
- The chip, a second generation of an in-house silicon line Meta announced last year, could help to reduce Meta’s dependence on the Nvidia chips that dominate the market
- The deployment of its own chip is a positive turn for Meta’s in-house AI silicon project, after a decision in 2022 to pull the plug on the chip’s first iteration
Facebook owner Meta Platforms plans to deploy into its data centres this year a new version of a custom chip aimed at supporting its artificial intelligence (AI) push, according to an internal company document seen by Reuters on Thursday.
The second-generation chip, from an in-house silicon line that Meta announced last year, could help to reduce the company's dependence on the Nvidia chips that dominate the market and rein in the spiralling costs of running AI workloads as it races to launch AI products.
The world’s biggest social media company has been scrambling to boost its computing capacity for the power-hungry generative AI products it is pushing into its Facebook, Instagram and WhatsApp apps, as well as onto hardware devices like its Ray-Ban smart glasses. The company is spending billions of dollars to amass arsenals of specialised chips and reconfigure its data centres to accommodate them.
At the scale at which Meta operates, a successful deployment of its own semiconductor could potentially shave off hundreds of millions of dollars in annual energy costs and billions in chip purchasing costs, according to Dylan Patel, founder of the silicon research group SemiAnalysis.
The chips, infrastructure and energy required to run AI applications have become a giant sinkhole of investments for tech companies, to some degree offsetting gains made in the rush of excitement around the technology.
A Meta spokesman confirmed the plan to put the updated chip into production in 2024, saying it would work in coordination with the hundreds of thousands of off-the-shelf graphics processing units (GPUs) – the go-to chips for AI – that the company was buying.
“We see our internally developed accelerators to be highly complementary to commercially available GPUs in delivering the optimal mix of performance and efficiency on Meta-specific workloads,” the spokesman said in a statement.