(Bloomberg) -- Microsoft Corp. announced new tools to help cloud customers build and deploy artificial intelligence applications, the company’s latest effort to get more revenue from generative AI.
Azure AI Foundry will make it easier to switch between the large language models undergirding artificial intelligence. A customer using an older OpenAI product can try a newer one or switch from OpenAI to tools from Mistral or Meta Platforms Inc., cloud computing chief Scott Guthrie said in an interview. Besides mixing and matching models, clients will be able to monitor whether applications are working properly and delivering a decent return on investment.
Microsoft, which announced the new offerings on Tuesday at its annual Ignite conference in Chicago, is giving away the software in the hopes of persuading corporate customers to buy more of its cloud services.
The company currently has 60,000 customers using Azure AI, a cloud service that lets developers build and run applications using any of 1,800 different AI models. But the process remains cumbersome, and it’s hard to keep up with the constant supply of new models and updates. Customers don’t want to redo their applications every time something novel comes along; nor do they want to switch models without knowing which tasks they’re best suited to.
“What developers are often finding is that each new model — even if it’s in the same family of models — has benefits in terms of better answers or better performance on many things, but you might have regressions on other things,” Guthrie said. “If you’re a business that has got a mission critical application, you don’t want to just flip a switch and hope it works.”
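The flexibility Guthrie describes amounts to treating the model as configuration rather than code. As a rough illustration only, not Microsoft's actual Foundry tooling, the Python sketch below uses the standard Azure OpenAI client and makes the deployment name the single switch point; the endpoint, key, API version and deployment names are placeholder assumptions.

```python
import os
from openai import AzureOpenAI

# Placeholder endpoint/key pulled from the environment; values are assumptions.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)

def ask(deployment: str, prompt: str) -> str:
    """Send the same prompt to whichever model deployment is configured."""
    response = client.chat.completions.create(
        model=deployment,  # on Azure this is the deployment name, not a hard-coded model
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Swapping or comparing models becomes a configuration change, not a rewrite.
for deployment in ["gpt-35-turbo", "gpt-4o"]:  # hypothetical deployment names
    print(deployment, "->", ask(deployment, "Summarize today's action items in one sentence."))
```

Because the model is just a parameter, a team can run an older and a newer deployment side by side and compare answers before flipping the switch Guthrie warns against flipping blindly.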
Parts of Foundry come from an older offering called Azure AI Studio. Other features are new, including tools that help companies deploy AI agents, which are digital assistants that can take actions on a user’s behalf.
Guthrie said making it easier for customers to switch between models won’t weaken Microsoft’s close partnership with OpenAI. For one thing, he noted, it will now be simpler to choose the most appropriate OpenAI model for each job. Still, Microsoft knows offering choice is key to attracting and retaining clients.
“For a huge number of use cases, the OpenAI models are absolutely the best today in the industry,” he said. “At the same time, there are different use cases, and sometimes people do have different reasons for wanting to use different things. Choice is also going to be important.”
Even as it tries to persuade customers to invest more in AI, Microsoft has been warning investors that cloud sales growth will decline because the company lacks sufficient data center capacity to meet demand. Guthrie said those constraints are temporary, and the company insists it will have sufficient computing power going forward.
Microsoft, which announced its first homegrown cloud-computing and AI chips at the Ignite conference last year, is also unveiling two new semiconductors. One is a security microprocessor that protects things like encryption and signing keys. Starting next year, the new chip will be installed in every new Microsoft data center server, the company said.
The second offering is a data processing unit, a type of networking chip, also made by Nvidia Corp., that moves data more quickly to computing and AI semiconductors, speeding up tasks. Microsoft and its rivals are chasing increasingly powerful cloud systems to train and run AI models.
“The models are growing so big,” said Rani Borkar, the Microsoft vice president who oversees chip design and development. Each layer of chips, servers, software and other components has to improve and perform optimally, she said. Data processing units are part of that, speeding up network and storage performance while consuming less power, she said.

Other announcements:
- The Azure OpenAI service now runs on Microsoft’s Maia AI chip in addition to competing chips from Nvidia.
- Microsoft 365 Copilot, an AI assistant designed for corporations, has a new feature, in private preview, that automates such repetitive tasks as creating a summary of important action items at the start or end of each day.
- The Teams Copilot will be able to understand and answer questions about slides, web images and other visual content.
- Next year, Teams will be able to translate a user’s words into different languages and speak those languages in the user’s voice.
- Starting next year, PowerPoint’s Copilot will be able to translate presentations into 40 languages.
©2024 Bloomberg L.P.