
Nvidia GTC packs a punch in hardware and software announcements.

Nvidia announced a whole suite of tools to power massive AI models, including a new chip architecture (Blackwell), faster networking, pre-built microservices for easily creating custom AI tools (NIMs), expanded simulation capabilities (Omniverse Cloud APIs), and a foundation model for robotics (GR00T).

What’s going on here?

Nvidia kicked off its GTC developer conference yesterday, and CEO Jensen Huang packed his keynote with announcements.

What does this mean?

Blackwell - A new GPU architecture from Nvidia.

Blackwell offers 2.5x the training performance of Nvidia's previous Hopper chips at FP8 precision, and 5x the inference speed at FP4. It comes to market as the GB200 (Grace Blackwell 200) superchip, which combines two Blackwell GPUs with one Grace CPU and a bunch of other supporting parts. But the real power unlocks with Nvidia's DGX SuperPOD: 11.5 exaflops of AI supercomputing built from racks of GB200 superchips.

NIMs - Nvidia Inference Microservices.

Rather than building AI from scratch, developers can use pre-built AI microservices from Nvidia (NIMs). Think of them as plug-and-play AI tools for specific tasks. Nvidia also lets companies build custom AI "copilots" on top of NIMs using their own proprietary data. NIMs are designed to be easy to deploy across different software platforms and even older GPU models.
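
If NIMs expose standard inference APIs as described, calling one from an application should look much like calling any hosted model endpoint. Below is a minimal sketch using the openai Python client against an OpenAI-compatible NIM endpoint; the base URL, model name, and API key are illustrative placeholders, not details from the keynote.

    # Minimal sketch: query a hosted NIM through an OpenAI-compatible API.
    # Endpoint URL, model id, and key below are assumptions for illustration.
    from openai import OpenAI

    client = OpenAI(
        base_url="https://integrate.api.nvidia.com/v1",  # assumed hosted NIM endpoint
        api_key="YOUR_NVIDIA_API_KEY",                   # placeholder credential
    )

    # Standard chat-completions call against the hosted model.
    response = client.chat.completions.create(
        model="meta/llama3-8b-instruct",  # example model id; actual catalog names may differ
        messages=[{"role": "user", "content": "What do NIMs mean for developers?"}],
        max_tokens=150,
    )

    print(response.choices[0].message.content)

The point of the plug-and-play pitch is that the client code stays the same whether the microservice runs in Nvidia's cloud or on your own GPUs; only the endpoint changes.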

GR00T - Generalist Robot 00 Technology

Project GR00T is an AI foundation model designed specifically for humanoid robots, focused on giving them better movement, perception, and adaptability.

GR00T will run on Jetson Thor, Nvidia's new onboard computer designed specifically for robotics. This likely means smaller but more powerful onboard computers for realistic humanoid robots.
