
Daily Digest: Microsoft made Google sing

PLUS: Nvidia sneaks in a punch.

Daily Digest #286

Hello folks, here’s what we have today:

PICKS
  1. Microsoft kicked off its annual developer conference “Ignite” with one message: Microsoft is the copilot company. In his keynote, Satya Nadella walked us all through the 5 layers of the “Copilot Stack” that Microsoft is building. 🍿Here’s a recap of all the chit-chat (also below)

  2. Lyria - Google's new AI music model. Google announced a new music generation model that can create instrumentals and vocals that sound eerily real. Built on Lyria, Dream Track is a new AI music generation tool launching as an experiment in YouTube Shorts, alongside additional Music AI tools for artists, songwriters and producers. 🍿 Our Summary (also below)

  3. Ignite also came with a bunch of Nvidia news. Nvidia quietly launched a raft of new features alongside Microsoft Ignite: new model families, upgrades to its software stack, wider availability of its tools and more. 🍿 Our Summary (also below)

TOP TOOLS
WHO’S HIRING IN AI
NEWS
QUICK BITES

Microsoft kicked off its annual developer conference “Ignite” with one message: Microsoft is the copilot company. In his keynote, Satya Nadella walked us all through the 5 layers of the “Copilot Stack” that Microsoft is building. Here’s a slightly modified recap of all the chit-chat:

1) Hardware and Infrastructure

Microsoft is combining the power of Nvidia, AMD and its own custom-designed chips for the hardware. AMD’s MI300X now runs GPT-4 on Azure. Microsoft is making custom silicon too, starting with the Azure Cobalt CPU (which Nadella claims is the fastest) and the Azure Maia AI accelerator.

Nvidia is still the big deal for Microsoft: Azure will offer Nvidia’s newest AI hardware, along with new “confidential computing” capabilities. Jensen Huang made a cameo and talked at length about how Microsoft is partnering with just about everyone in the ecosystem.

2) Foundation models

All the latest models from OpenAI’s dev day are now available in Azure. But the limelight is taken by the new models-as-a-service offering on Azure, where you can fine-tune Llama 2, Mistral and Jais (an Arabic model). Microsoft’s homegrown SLM Phi-2 has launched and is open-source (under a research license). Nadella claims it is 50% better at math reasoning than Phi-1.5 while still being just 2.7B parameters.

3) Software for building AI models

Azure AI Studio can now connect to any endpoint. In layman’s terms, it’s now easier to run AI models locally on Windows instead of in the cloud. And with Nvidia’s foundry service, you can use Nvidia’s software stack to build custom language models right inside Azure AI.

To round out the Azure AI announcements: Microsoft Fabric, its data lakehouse, is now generally available, along with support for mirroring your data from other storage clouds inside Fabric. Microsoft has also added vector search with SOTA reranking technology to Azure AI Search for better RAG (it currently powers ChatGPT).
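The retrieve-then-rerank pattern behind that RAG claim is easy to sketch. Here's a toy, pure-Python illustration with made-up documents and vectors (not Azure AI Search's actual API): a cheap cosine-similarity pass narrows the corpus, then a second, more expensive scorer (a simple term-overlap stand-in for a real semantic reranker) reorders the survivors.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def vector_search(query_vec, docs, k=3):
    """First stage: cheap vector similarity over the whole corpus."""
    scored = [(cosine(query_vec, d["vec"]), d) for d in docs]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [d for _, d in scored[:k]]

def rerank(query_terms, candidates, k=1):
    """Second stage: a pricier scorer over just the few candidates.
    Toy term-overlap here; a real system uses a cross-encoder/semantic ranker."""
    def score(doc):
        return len(set(query_terms) & set(doc["text"].lower().split()))
    return sorted(candidates, key=score, reverse=True)[:k]

# Hypothetical mini-corpus with hand-written 2-d "embeddings"
docs = [
    {"text": "Fabric is a data lakehouse", "vec": [0.9, 0.1]},
    {"text": "Vector search retrieves by embedding similarity", "vec": [0.2, 0.95]},
    {"text": "Reranking improves retrieval quality for RAG", "vec": [0.3, 0.9]},
]
candidates = vector_search([0.25, 0.9], docs, k=2)   # broad recall
best = rerank(["reranking", "rag"], candidates, k=1) # precise pick
```

The two-stage shape is the point: the expensive ranker only ever sees a handful of candidates, which is what keeps reranking affordable at scale.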

4) Copilot and Copilot Studio

Now the gears shift away from Azure and towards Microsoft’s consumer products. First thing: every AI product is now “Copilot”. Bing Chat is Copilot, the enterprise chat is Copilot. No more “Copilot for XYZ app”. It’s connected to all the Microsoft apps via the Microsoft Graph.

And this new Copilot is more personalized, learning your style from emails you’ve previously sent, for example. Copilot in Teams can create virtual worlds from text commands using Mesh. It also includes a v1 of agent-like behaviour, doing tasks on your behalf.

Copilot Studio is Microsoft’s new playground where you can enhance Copilot by adding external data, using plugins from different apps, and creating new automations inside Copilot.

To top it off, we got a glimpse of AI with Mixed Reality and if I’m bing honest 😉: that stuff is pure sci-fi.

QUICK BITES

Google announced a new music generation model, Lyria, that can create instrumentals and vocals that sound eerily real. Built on Lyria, Dream Track is a new AI music generation tool launching as an experiment in YouTube Shorts, alongside additional Music AI tools for artists, songwriters and producers.

What is going on here?

Google has released a new music generation model with live experiments within YouTube.

What does this mean?

Dream Track is an experiment on YouTube Shorts using Lyria. It lets creators make 30-second soundtracks with an AI-generated voice in the style of certain artists, like Sia and John Legend. The lyrics, backing track, and vocals are all generated together.

The Music AI tools are designed with input from artists to enhance creativity. You could transform a vocal melody into accompaniment, turn MIDI chords into a choir, or generate new sections.

Any Lyria-generated content will carry an audio watermark called SynthID. It converts the audio into a spectrogram and embeds a digital watermark that is inaudible to humans. Even if the audio is modified, SynthID can still detect whether parts of a song were AI-generated.
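For intuition on how an inaudible watermark can work at all, here's a deliberately simplified sketch that hides a bit pattern in the least-significant bits of 16-bit PCM samples. This is not SynthID's algorithm (Google's scheme works in the spectrogram domain and survives edits far better); it only shows the core idea of a payload riding below the threshold of hearing.

```python
def embed(samples, bits):
    """Force each sample's least-significant bit to carry one payload bit (cycled).
    A +/-1 change in a 16-bit sample is inaudible."""
    out = []
    for i, s in enumerate(samples):
        bit = bits[i % len(bits)]
        out.append((s & ~1) | bit)
    return out

def detect(samples, bits):
    """Fraction of samples whose LSB matches the expected payload.
    Near 1.0 means watermarked; around 0.5 means random/unmarked."""
    hits = sum(1 for i, s in enumerate(samples) if (s & 1) == bits[i % len(bits)])
    return hits / len(samples)

payload = [1, 0, 1, 1, 0, 0, 1, 0]       # the hidden signature
audio = list(range(1000, 1200))          # stand-in for PCM samples
marked = embed(audio, payload)
```

An LSB scheme like this breaks under re-encoding, which is exactly why production watermarks embed the payload in perceptually robust features instead of raw sample bits.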

Why should I care?

Google worked closely with the music industry to align experiments with principles that encourage creativity while protecting artists. Continued engagement will be important for developing these tools responsibly.

The implications are that AI will empower musical creativity in new ways. Songwriters, producers and fans can benefit from technologies that enhance and inspire the process. We're only beginning to explore the possibilities.

QUICK BITES

Nvidia has silently launched a bunch of new features alongside Microsoft Ignite. New model classes, upgrades to the software stack, wider availability of its tools and more.

What is going on here?

Nvidia is making it easier for developers to work with AI, many steps at a time.

What does this mean?

Here’s a recap of what it has announced:

1) Nvidia Foundation Models - A new family of foundation models Nemotron-3-8B with chat and QA variants. Nvidia combines them with a curated list of other enterprise-grade models from third-party providers. Developers can experiment with new NVIDIA AI Foundation Models directly from a browser, test in their applications with NVIDIA AI Foundation Endpoints, then customize using their unique business data.

2) NVIDIA announced an AI foundry service — a collection of NVIDIA AI Foundation Models, NVIDIA NeMo framework and tools, and NVIDIA DGX Cloud AI supercomputing and services — that gives enterprises an end-to-end solution for creating and optimizing custom generative AI models. It’s also applying it with Amdocs in the Telco industry.

3) Faster silicon with confidentiality - Microsoft Azure Cloud will offer new NVIDIA GPU virtual machines. Azure will have H100 VMs, delivering high AI performance. Confidential VMs with H100 GPUs are coming to enable privacy. Azure will add H200 GPUs next year, ideal for large generative AI models.

4) Run heavy computer graphics simulations in the cloud - Nvidia Omniverse now has a cloud version (again, hosted on Azure) to use the platforms without on-prem setup. Autonomous vehicle companies can simulate virtual factories on Omniverse Cloud to increase production quality while saving years of effort and millions of dollars.

5) TensorRT-LLM for Windows will soon be compatible with OpenAI’s popular Chat API through a new wrapper. This will enable hundreds of developer projects and applications to run locally on a PC with RTX starting at 8GB of VRAM, instead of in the cloud — so users can keep private and proprietary data on Windows 11 PCs.
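"Compatible with OpenAI's Chat API" means an app keeps the same request shape and simply points at a local server instead of api.openai.com. A minimal sketch, assuming a hypothetical local wrapper at localhost:8000 (the port and model name are illustrative, not documented TensorRT-LLM defaults):

```python
import json

# Hypothetical base URL of a local OpenAI-compatible wrapper
LOCAL_BASE_URL = "http://localhost:8000/v1"

def chat_request(prompt, model="local-llm"):
    """Build an OpenAI Chat Completions-style request aimed at a local endpoint."""
    url = f"{LOCAL_BASE_URL}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

url, body = chat_request("Summarize Ignite in one line.")
# An app that already speaks the OpenAI API only needs its base URL swapped,
# e.g. POSTing this body to `url` instead of to api.openai.com.
```

That base-URL swap is the whole migration story for existing OpenAI-API apps, which is why a compatible wrapper unlocks "hundreds of developer projects" at once.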

Why should I care?

The new tools and services from Nvidia make it easier for any developer or company to leverage powerful AI, especially generative AI. You now have an end-to-end solution to go from experimenting with models to deploying customized AI.

This means you can more quickly integrate next-generation AI capabilities into your products and services. The optimized models, APIs, and cloud infrastructure remove a lot of the heavy lifting required before.

Ben’s Bites Insights

We have 2 databases that are updated daily, which you can access by sharing Ben’s Bites using the link below:

  • All 10k+ links we’ve covered, easily filterable (1 referral)

  • 6k+ AI company funding rounds from Jan 2022, including investors, amounts, stage etc (3 referrals)
