Microsoft Ignites the Copilot era

Microsoft kicked off its annual Ignite conference with one message: Microsoft is the Copilot company. In his keynote, Satya Nadella walked us through the layers of the “Copilot Stack” that Microsoft is building. Here’s a slightly condensed recap of the announcements:

1) Hardware and Infrastructure

Microsoft is combining Nvidia, AMD, and its own custom-designed chips on the hardware side. AMD’s MI300X now runs GPT-4 on Azure. Microsoft is also making its own silicon, starting with the Azure Cobalt CPU (which Nadella claims is the fastest among cloud providers) and the Azure Maia AI accelerator.

Nvidia is still the big deal for Microsoft: Azure is offering Nvidia’s newest AI GPUs along with support for confidential computing. Jensen Huang made a cameo and talked at length about how Microsoft is partnering with just about everyone in the ecosystem.

2) Foundation models

All the latest models from OpenAI’s DevDay are now available in Azure. But the limelight goes to the new Models-as-a-Service offering on Azure, where you can use and fine-tune Llama 2, Mistral, and Jais (an Arabic LLM). Microsoft’s homegrown SLM Phi-2 has also launched as open source (under a research license). Satya claims it is 50% better at math reasoning than Phi-1.5 while still being just 2.7B parameters.
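
If you just want to poke at these models from code, here’s a minimal sketch (not from the keynote) of calling a GPT-4 deployment on Azure with the openai Python SDK; the endpoint, key, API version, and deployment name are placeholders for your own setup.

```python
# Minimal sketch: chat completion against an Azure OpenAI GPT-4 deployment.
# The endpoint, key, api_version, and deployment name below are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",
    api_key="YOUR-AZURE-OPENAI-KEY",
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="gpt-4-deployment",  # your *deployment* name in Azure, not the model family
    messages=[{"role": "user", "content": "Summarize the Copilot stack in one sentence."}],
)
print(response.choices[0].message.content)
```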

3) Software for building AI models

Azure AI Studio can now connect to any model endpoint. In layman’s terms, that makes it easier to run AI models locally on Windows (instead of only in the cloud). And with Nvidia’s AI foundry service, you can use Nvidia’s software stack to build custom language models right inside Azure AI.
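
As a rough illustration of the “any endpoint” idea, here’s a hedged sketch of pointing an OpenAI-compatible client at a model served locally instead of in Azure; the base URL, port, and model name are hypothetical and depend on whatever server you run.

```python
# Sketch: talk to a locally hosted, OpenAI-compatible endpoint instead of the cloud.
# The URL and model name are hypothetical; use whatever your local server exposes.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

reply = client.chat.completions.create(
    model="phi-2",  # whichever model the local server is hosting
    messages=[{"role": "user", "content": "Hello from a local endpoint!"}],
)
print(reply.choices[0].message.content)
```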

Rounding out the Azure AI announcements, Microsoft Fabric, its data lakehouse, is now generally available, along with support for mirroring your data from other clouds into Fabric. Microsoft also shipped vector search in Azure AI Search with state-of-the-art reranking for better RAG (currently powering ChatGPT).
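
To make the RAG angle concrete, here’s a rough sketch of the retrieval step using the azure-search-documents SDK with semantic reranking, before handing the results to a chat model; the index name, semantic configuration, and field names are hypothetical placeholders.

```python
# Sketch of the retrieval half of RAG with Azure AI Search semantic reranking.
# Index name, semantic configuration, and the "content" field are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search = SearchClient(
    endpoint="https://YOUR-SEARCH.search.windows.net",
    index_name="docs-index",
    credential=AzureKeyCredential("YOUR-SEARCH-KEY"),
)

results = search.search(
    search_text="What did Microsoft announce about custom chips?",
    query_type="semantic",                   # reranks hits with the semantic ranker
    semantic_configuration_name="default",
    top=3,
)
context = "\n".join(doc["content"] for doc in results)
# ...then pass `context` plus the user's question to your chat model,
# e.g. the Azure OpenAI snippet above.
```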

4) Copilot and Copilot Studio

Now the gears shift away from Azure and towards Microsoft’s consumers. First thing: every AI product is now “Copilot”. Bing Chat is now Copilot, the enterprise version is now Copilot. No more “Copilot for XYZ app”. It’s connected to all the Microsoft apps via the Microsoft Graph.

And this new Copilot is more personalized; for example, it learns your writing style from emails you’ve previously sent. Copilot in Teams can create virtual worlds from text commands using Mesh. It also ships a v1 of agent-like behaviour, carrying out tasks on your behalf.

Copilot Studio is Microsoft’s new playground where you can extend Copilot by adding external data, using plugins from different apps, and creating new automations inside Copilot.

To top it off, we got a glimpse of AI with Mixed Reality, and if I’m bing honest 😉, that stuff is pure sci-fi.
