(Bloomberg) — Microsoft Corp. is working with Advanced Micro Devices Inc. on the chipmaker’s expansion into AI processors, according to people familiar with the situation, part of a multi-pronged strategy to secure more of the highly sought-after components.
The companies are collaborating to offer an alternative to Nvidia Corp., which dominates the market for AI-capable chips called GPUs, said the people, who asked not to be identified because the matter is private. The software giant is providing financial backing to support AMD’s efforts and is working with the chipmaker on an in-house Microsoft processor for AI workloads, code-named Athena.
AMD shares jumped more than 6.5% Thursday, Microsoft gained about 1%, and Nvidia stock fell 1.9%. Representatives for Microsoft and AMD declined to comment.
The arrangement is part of a broader push to expand AI processing power, which is in high demand following the explosion of chatbots like ChatGPT and other services built on the technology. Microsoft is one of the largest providers of cloud computing services and a driving force in the adoption of artificial intelligence. The company has poured $10 billion into OpenAI, the maker of ChatGPT, and has vowed to add AI features to its entire suite of software.
Read more: Microsoft Brings OpenAI Chatbot Technology to Office Apps
The move also reflects Microsoft’s deepening involvement in the chip industry. The company has built up its silicon division over the past several years under former Intel Corp. executive Rani Borkar, and the group now has nearly 1,000 employees. The Information reported last month on Microsoft’s development of the Athena AI chip.
Several hundred of those employees are working on the Athena project, and Microsoft has spent about $2 billion on its chip efforts, according to one of the people. But the push does not herald a split with Nvidia. Microsoft intends to keep working closely with that company, whose chips are the workhorses for training and running AI systems. It’s also trying to find ways to get more Nvidia processors, underscoring the pressing shortage that Microsoft and others are facing.
Microsoft’s relationship with OpenAI, along with its newly introduced array of AI services, requires computing power on a scale beyond what the company anticipated when it ordered chips and built data centers. OpenAI’s ChatGPT service has attracted interest from companies that want to use it in their own products or corporate applications, and Microsoft has introduced a chat-based version of Bing and AI-enhanced Office tools.
It’s also updating older products, such as GitHub’s code-generation tool. All of these AI offerings run in Microsoft’s Azure cloud and require expensive, powerful Nvidia processors.
The field is also a major priority for AMD. “We’re very excited about our opportunity in the AI space — that’s our No. 1 strategic priority,” CEO Lisa Su said during the company’s earnings call Tuesday. “We are in the very early stages of the computing era of AI, and the rate of adoption and growth is faster than any other technology in recent history.”
Su also said that AMD has an opportunity to make custom chips for its largest customers to use in their AI data centers.
From Bloomberg Businessweek: Doing, not just chatting, is the next stage in the AI hype cycle
Borkar’s team at Microsoft, which has also worked on chips for servers and Surface PCs, is now prioritizing Project Athena. It is developing a GPU that can be used to train and run AI models. One person said the product is already being tested internally and could be widely available as soon as next year.
Even if the project stays on that schedule, the first release would be just a starting point, the people said. It takes years to build a good chip, and Nvidia has a head start. Nvidia is the chip supplier of choice for many generative AI developers, and its processors are offered by cloud providers including Amazon.com Inc.’s AWS and Google’s cloud unit; Elon Musk has reportedly secured thousands of them for his fledgling AI business.
Creating an alternative to Nvidia’s lineup will be a difficult task. The company offers a package of hardware and software that work together, including chips, a programming language, networking equipment and servers, allowing customers to quickly put new capabilities to work.
That integration is one of the reasons Nvidia has become so dominant. But Microsoft is not alone in trying to develop in-house AI processors. Cloud rival Amazon acquired Annapurna Labs in 2015 and has developed two different AI processors. Alphabet Inc.’s Google also has its own AI training chip.
(Updates with stock prices in the third paragraph.)
© 2023 Bloomberg LP