At Microsoft's Ignite 2023 conference, the company fleshed out its generative AI offerings and strategy and, in some places, put serious distance between itself and the competition.
Here's a look at three Ignite 2023 takeaways. Also see the following for coverage:
- Microsoft launches Azure Models as a Service
- Microsoft launches AI chips, Copilot Studio at Ignite 2023
- Microsoft's full list of Ignite news and video.
Copilot for Azure
With Copilot for Azure, Microsoft is taking direct aim at Google's Duet AI.
Scaling up and down and managing cloud applications and infrastructure has always been difficult for many IT shops. Copilot for Azure lets users, via a natural language chat interface, discover app configurations and infrastructure details and optimize workloads. These chores have been a major pain point for almost any enterprise running in the Azure cloud until now, so Copilot for Azure can be particularly helpful for organizations that want to stay on top of all the services they use and what they cost. Particularly interesting is the option for customers to analyze their observability data with Copilot for Azure, not only to optimize cloud applications but also to diagnose incidents and validate configurations. Copilot for Azure will compete directly with a lot of AIOps and observability vendors.
Copilot for Azure should also simplify management a good bit; many customers have found managing Azure at the infrastructure level more complicated than at other hyperscalers.
The caveat for Copilot for Azure is that it's a first version that still needs to prove its accuracy and worth. Beyond the accuracy and hallucination issues, a wrong recommendation could cost customers more. Copilot for Azure could become the basis for effectively managing the app and infrastructure layer: if it works as advertised when it reaches general availability, enterprises could create a blueprint every time a new app is deployed and continuously optimize it. It remains to be seen how Copilot for Azure usage develops.
Other items worth noting about Copilot for Azure:
- The what-if analysis in the cost and performance module will compete directly against a lot of FinOps vendors that have found a comfortable niche for a while.
- If Microsoft offers the same blanket legal liability/indemnity coverage as it does for its OpenAI-based services, Copilot for Azure could gain some traction.
GitHub Copilot Chat
Copilot started as an experiment in GitHub around code completion for developers, but it is now a front-and-center initiative for Microsoft. GitHub Copilot Chat is embedded into Microsoft 365, Microsoft's security offerings, Dynamics 365 Service Copilot and more. Interestingly, Microsoft seems to be moving away from its Bing chat experimentation: Bing Chat and Bing Chat Enterprise will be rebranded as Copilot, a name that has gained far more traction than the Bing branding.
One of the major announcements is Copilot Studio, which will allow users to design, test, and publish copilots in a model much like OpenAI's custom GPTs. Microsoft is working out how to engage more in community development and build an ecosystem that can make its technology adoption more viral. With the latest announcements, Microsoft is turning GitHub into an AI-powered developer platform rather than the (open) source code platform it used to be. GitHub Copilot Chat in many ways competes against Microsoft Visual Studio, with the exception that developers can freely write open-source code in this development platform, use it as a repository, and compile and deploy in Azure. There is tighter integration with Azure now.
Copilot Studio and GitHub Copilot Chat move Microsoft in the direction of citizen programmers. The original Copilot options let programmers finish lines of code or partial code in development IDEs. The GitHub chat interface lets developers ask for code for the kind of program they are writing; in other words, you don't have to start writing code for Copilot to suggest and finish the task. The risk is that Copilot could suggest half-baked code that lands in production. As a productivity enhancement tool, especially for junior and entry-level developers, GitHub Copilot Chat can offer a lot.
Given the potential for hallucination, accuracy, copyright, and IP issues, Microsoft will probably back these offerings with blanket indemnity and legal liability protection.
With the introduction of Azure Migrate application and code assessment, Microsoft is hoping large existing .NET workloads will move to the Azure cloud faster and more seamlessly, joining the AI and innovation workloads already heading there.
Azure Container Apps, a serverless offering, is a good addition for moving large AI workloads that are not OpenAI API calls to the Azure cloud. With dedicated GPU workload profiles, vector database add-ons, and Azure Container Apps, Microsoft is hoping enterprises will use Azure to build general-purpose or context-sensitive LLMs and SLMs instead of just using OpenAI for inferencing. Building LLMs is where the big money is for now.
With additions to Azure Kubernetes Service, Microsoft is going after AI training workloads and the hosting of LLMs--a massive market. Today, LLMs run where they are trained; with optimized workloads in Azure, there will be fewer manual configurations. In particular, the Kubernetes AI toolchain operator offers LLMOps functionality optimized across CPUs and GPUs. Especially noteworthy is the ability to shift between GPU and CPU based on availability: enterprises could move inferencing workloads to CPU clusters instead of waiting for costly, high-demand GPUs.
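To make the Kubernetes AI toolchain operator idea concrete, here is an illustrative sketch of a workspace manifest in the style of the open-source KAITO project that underlies the operator. The API version, instance type, and model preset name are assumptions for illustration and may differ from what ships in Azure:

```yaml
# Hypothetical KAITO-style workspace: requests GPU nodes and a preset model.
# Field names and values are assumptions based on the open-source project,
# not a confirmed Azure API.
apiVersion: kaito.sh/v1alpha1
kind: Workspace
metadata:
  name: workspace-llm-inference
resource:
  # The operator provisions right-sized GPU nodes for the chosen model.
  instanceType: "Standard_NC12s_v3"
  labelSelector:
    matchLabels:
      apps: llm-inference
inference:
  preset:
    # A preset bundles the model image and a tuned inference configuration.
    name: "falcon-7b"
```

The point of the sketch is the division of labor: the platform team declares a model and a hardware class, and the operator handles node provisioning and inference configuration, which is the "fewer manual configurations" promise described above.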
Bottom line: While other vendors are fighting for LLM creation and model traction, Microsoft has moved into operationalizing LLMs and AI. Those moves will leave Azure rivals scrambling once again to catch up.