Talking ChatGPT gets into the hands of early users.

OpenAI is finally letting people use its Her-like voice chatbot. Not every user gets access to the new Advanced Voice Mode yet, but the alpha is now up and running.

What is going on here?

OpenAI is rolling out an alpha program to test ChatGPT's Advanced Voice Mode.

What does this mean?

Didn’t ChatGPT already have a voice mode? Yes, but this new “advanced mode” is different. Here’s how:

Earlier, ChatGPT simply read its AI model’s text output aloud via a text-to-speech step. Now, the model generates the audio itself.

That means the voice responses are faster and cover a wider range of speed, tone, and emotion (as OpenAI demoed in May).

The video and screen-sharing capabilities from those demos are not included in this alpha program. OpenAI says all ChatGPT Plus users will have access to Advanced Voice Mode by fall.

Why should I care?

OpenAI has been getting a lot of flak for being slow to ship this latest batch of demoed features. Advanced Voice Mode was demoed in mid-May, more than two months ago. That’s an eternity in the current AI scene.

An alpha rollout means the capabilities are finally getting into users’ hands and we’ll get to see them in the wild (not just in curated sample videos). Let’s see if it delivers on the hype.
