Microsoft to Add OpenAI to All Products and Offer API to ChatGPT Through Azure
Davos interview offers new insights into Microsoft's plans
Satya Nadella, CEO of Microsoft, sat down for an interview with the Wall Street Journal yesterday at the annual World Economic Forum conference in Davos, Switzerland. Nadella said in response to a question:
“So I think at this point, the way Microsoft is really going to commercialize all of this is Azure has become the place for anybody and everybody who thinks about AI and large-scale training…Second, is we are going to make these foundational models available as platforms…Today we just made the OpenAI Azure Service available generally and then even ChatGPT will be available as an API. And, so, we are very, very excited about that. And then we will incorporate this in our own applications…Every product at Microsoft will have some of the same AI capabilities to completely transform the product.”
Azure Model Training and Services
He also indicated that Microsoft had learned a lot about providing efficient infrastructure for training large language models, and he expects that to be a significant differentiator for Azure. The Azure OpenAI Service debuted in November 2022 and reached general availability on Monday. A blog post announcing the news stated:
With Azure OpenAI Service now generally available, more businesses can apply for access to the most advanced AI models in the world—including GPT-3.5, Codex, and DALL•E 2—backed by the trusted enterprise-grade capabilities and AI-optimized infrastructure of Microsoft Azure, to create cutting-edge applications. Customers will also be able to access ChatGPT—a fine-tuned version of GPT-3.5 that has been trained and runs inference on Azure AI infrastructure—through Azure OpenAI Service soon.
You can already access GPT-3.5 directly through OpenAI’s APIs, and Azure will also offer the ChatGPT fine-tuned version of that model directly to developers. Synthedia has written previously about the close alignment of OpenAI products and features with Microsoft’s products, ranging from the Bing search engine to the Office productivity suite. In addition, AI models consume a lot of computing power. Microsoft wants to promote the use of large AI models and to ensure they run on Azure.
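To make the developer angle concrete, here is a minimal sketch of what calling a GPT-3.5-era completion model through OpenAI’s public REST API looks like, using only the Python standard library. The model name and endpoint reflect OpenAI’s documented completions API at the time; this snippet only assembles the request rather than sending it, since an API key would be required.

```python
# Build (but do not send) a request to OpenAI's completions endpoint.
# The model name "text-davinci-003" is illustrative of the GPT-3.5 series.
import json
import os
import urllib.request

def build_completion_request(prompt, model="text-davinci-003", max_tokens=64):
    """Assemble the HTTP request for a completion call."""
    payload = {"model": model, "prompt": prompt, "max_tokens": max_tokens}
    return urllib.request.Request(
        "https://api.openai.com/v1/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Reads the key from the environment; empty if unset.
            "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
        },
        method="POST",
    )

req = build_completion_request("Summarize the Azure OpenAI Service in one sentence.")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` (given a valid key) returns a JSON body whose generated text lives under `choices[0]["text"]`.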
OpenAI, Azure, and Microsoft
OpenAI needed a lot of computing power, and Microsoft wanted to make an investment. It could make that billion-dollar investment in cash, or it could offer computing credits. Even if it paid in cash, if OpenAI agreed to build and train its models on Azure, then at least some of the Microsoft investment would return to the company as computing fees.
Rumors around the new investment deal suggest it might be as high as $10 billion and could also involve similar round-trip payments back to Microsoft through Azure. The deal may also reduce Microsoft’s risk by entitling it to a healthy share of any profits OpenAI generates, up to the level of the investment.
This was a savvy bet by Microsoft. In 2019, it wasn’t at all certain that OpenAI would ever generate substantial revenue, and it still has not to date. However, demand for the company’s solutions appears to be rising quickly. It may soon be a cash machine, though to serve that demand, the company will need to scale. Over the past five years, the scale required was for training large language models; now it looks like scale will be needed for run-time throughput. Azure can provide this. And, importantly, every computing resource Azure provides is one that AWS and Google do not.
In addition, it is not trivial to set up your own run-time environment for large AI models. It will be far easier for many developers to simply plug into Azure for elastic access to OpenAI services. This will reduce adoption friction and should be beneficial to OpenAI.
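As a rough illustration of that “plug into Azure” path: an Azure OpenAI call targets a customer’s own resource and named model deployment rather than a shared endpoint. The resource name, deployment name, and API version below are placeholders, not real values; as above, the request is only assembled, not sent.

```python
# Sketch of reaching the same models through an Azure OpenAI resource.
# "my-resource" and "my-gpt35-deployment" are hypothetical placeholders.
import json
import os
import urllib.request

def build_azure_completion_request(prompt, resource="my-resource",
                                   deployment="my-gpt35-deployment",
                                   api_version="2022-12-01"):
    """Assemble a completion call against an Azure OpenAI deployment.
    The model is chosen by the deployment, not by a field in the payload."""
    url = (f"https://{resource}.openai.azure.com/openai/deployments/"
           f"{deployment}/completions?api-version={api_version}")
    payload = {"prompt": prompt, "max_tokens": 64}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Azure uses an "api-key" header instead of a bearer token.
            "api-key": os.environ.get("AZURE_OPENAI_KEY", ""),
        },
        method="POST",
    )

azure_req = build_azure_completion_request("Hello")
print(azure_req.full_url)
```

The practical difference for developers is small: swap the endpoint and auth header, and the surrounding application code stays the same, which is exactly the low-friction adoption story described above.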
Sam Altman, OpenAI’s CEO, also seemed to indicate in an interview this week that the company was not restricted from using cloud computing resources offered by Azure competitors. While Microsoft will surely have an advantage in providing these services, it doesn’t mean they will be exclusive.