
AI is getting empathetic with Hume's EVI.

Hume AI previewed a new chatbot called EVI. EVI, short for empathic voice interface, is a conversational AI that understands the tone of your voice and adjusts its language and speech in response.

What’s going on here?

Hume AI’s new demo of an empathic chatbot is the latest surprise in AI.

What does that mean?

First off, you have to try EVI to get a real sense of what capturing emotion from voice looks like. Still, to give you a quick brief, here’s how EVI makes interacting with AI by voice much better.

  1. It responds with human-like tones of voice based on your expressions.

  2. It reacts to your expressions with language that addresses your needs and maximizes your satisfaction.

  3. EVI knows when to speak, because it uses your tone of voice for state-of-the-art end-of-turn detection.

  4. It stops when interrupted, but can always pick up where it left off.

  5. It can notice your reaction to its responses and self-improve over time.

EVI is in preview right now, but devs can apply for early access to the API. Hume also raised a $50M Series B.
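For developers waiting on that access, here’s a rough sense of what a client could look like. This is a minimal sketch in Python, not Hume’s actual SDK: the websocket URL, auth header, and event names are assumptions made up for illustration. The shape mirrors points 3 and 4 above, with the client listening for responses and for an interruption event so it can stop playback when you barge in.

```python
import asyncio
import json

import websockets  # pip install websockets

# Assumed endpoint -- purely illustrative, not Hume's documented API.
EVI_URL = "wss://api.hume.ai/v0/evi/chat"

async def chat(api_key: str) -> None:
    # The header name is an assumption; check Hume's docs once you have access.
    async with websockets.connect(
        EVI_URL, extra_headers={"X-Hume-Api-Key": api_key}
    ) as ws:
        async for raw in ws:
            event = json.loads(raw)
            if event.get("type") == "assistant_message":
                # A response might carry text plus inferred emotion scores.
                print(event.get("text"), event.get("emotions"))
            elif event.get("type") == "interruption":
                # Point 4 above: stop playback when the user barges in.
                print("User interrupted; pausing playback.")

asyncio.run(chat("YOUR_API_KEY"))
```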

Why should I care?

2023 was the year of chat; in 2024 and beyond, voice will be a big part. But who wants to listen to a robotic AI voice? Current AI voices might sound human, but they lose the nuance when speaking long-form text.

Hume is trying to solve that with EVI. If it succeeds, expect many more AI therapy and AI relationship tools soon. Beyond those, one interesting idea is “speech practice”: using the little emotional markers present in your voice to build a public speaking coach.
