Microsoft Corp. made its first in-house artificial intelligence chip and cloud computing processor public in an effort to gain greater control over its technology and expand its product line in the fiercely competitive AI computing market. The company also unveiled new software that enables customers to create custom AI assistants.
Microsoft Azure cloud users will now have a new way to create and run AI content-generating programmes, according to the announcement of the Maia 100 chip made on Wednesday at the company’s annual Ignite conference in Seattle. According to Rani Borkar, a vice president in charge of Azure’s chip unit, Microsoft is already testing the chip with its Bing and Office AI products. OpenAI, the company behind ChatGPT and Microsoft’s primary AI partner, is also testing the processor. Maia and the Cobalt server chip will make their debuts in some Microsoft data centres early next year.
At the conference, Microsoft Chief Executive Officer Satya Nadella stated, “Our goal is to ensure that the ultimate efficiency, performance, and scale is something that we can bring to you from us and our partners.” According to him, Maia will initially power Microsoft’s AI apps before becoming accessible to partners and clients.
Microsoft’s multi-year investment demonstrates how important chips have become to staying competitive in the cloud and AI. By producing hardware internally, companies can extract performance and cost benefits from it. The project may also protect Microsoft from becoming unduly reliant on any one supplier, a weakness currently highlighted by the scramble among manufacturers for Nvidia Corp.’s AI chips. Cloud rivals made similar moves before Microsoft’s foray into processors: Amazon.com Inc. purchased a chip manufacturer in 2015 and now sells services based on multiple cloud and artificial intelligence chip types, while Google started granting users access to its AI accelerator processors in 2018.
According to Borkar, for a company the size of Microsoft it is crucial to optimize and integrate every component of its hardware in order to deliver optimal performance, prevent supply-chain bottlenecks and, ultimately, offer clients a choice of infrastructure.
Early next year, Microsoft will also offer customers services based on Nvidia’s most recent H200 chip and Advanced Micro Devices Inc.’s MI300X processor, both of which are designed for AI tasks. Nevertheless, the industry appears to be moving toward in-house chips over the long term. The shift is especially bad news for Intel Corp., whose own AI chip efforts are lagging behind. With Cobalt, Microsoft joins AMD and Amazon in trying to gain share in the server chip market, which Intel presently dominates.
Maia is intended to speed up the enormous volume of data processing that AI systems need to perform tasks like speech and image recognition. Azure Cobalt is a central processing unit with 128 computing cores, or mini processors, making it comparable to AMD and Intel products. The more cores, the better, as they can quickly split work into manageable tasks and complete them all at once. Cobalt also uses designs from Arm Holdings Plc, which supporters say are intrinsically more efficient because they originated in battery-powered devices such as smartphones. Both chips will be manufactured by Taiwan Semiconductor Manufacturing Co.
Microsoft has previously customized chips in a few ways, such as designing Xbox processors with AMD and creating unique chips for the HoloLens goggles and the Xbox Kinect motion controller. Nevertheless, Maia and Cobalt represent its largest and most comprehensive efforts to date—ambitious steps into an industry that is challenging and costly to break into.
It’s an old joke that Microsoft doesn’t get things right until version 3.0, but in the semiconductor space, that is usually true of every vendor. Borkar, who worked at Intel for 27 years, expressed confidence that Microsoft’s initial efforts are impressive. “Next year, we’re going to use these,” she stated.
The company also announced Copilot Studio, software that allows clients to customize Microsoft’s AI assistant software or build their own AI assistants from scratch. Customers can also create ways for Microsoft’s copilot software to appear in their own apps.
Charles Lamanna, a vice president at Microsoft, stated that customer feedback was remarkably consistent: for example, customers who only want supply chain data in a copilot experience shouldn’t be forced to build a whole separate bot.
Generally speaking, Microsoft stated that it is combining all of its AI copilots and Bing Chat AI features into a single software package that users can access.
Additionally, the company announced the following:

- A newly reduced fee for customers who use the sales copilot and the Microsoft 365 Copilot product for Office software.
- Copilot for Azure, an AI assistant designed to help IT managers troubleshoot operating systems, applications, and more.
- A Microsoft survey that revealed a high level of satisfaction with the company’s new corporate AI tools: 77% of users stated they don’t want to switch back to their previous methods after using them.