ChatGPT and Siri betas battle on iPhone

To get a sense of what a smarter Siri in iOS 18.1 might look like once it appears, OpenAI just introduced a new voice mode in its app, albeit in a limited alpha, meaning not every user will get access to the new tech right away.

Delayed by a month in response to quality concerns, this test of the company’s Advanced Voice Mode in ChatGPT is available to iPhone users who subscribe to the $20-per-month ChatGPT Plus service; it runs on the GPT-4o model.

The company warns that the mode might make mistakes and says access and rate limits are subject to change. It isn’t expected to reach all users until the end of the year, at which point it should be available on Mac, iPhone, and iPad. Subscribers accepted into the alpha group will get an alert in the app and an email inviting them to take part in the test.

“We’ll continue to add more people on a rolling basis and plan for everyone on Plus to have access in the fall,” OpenAI said.

What does Advanced Voice Mode do? 

Effectively, it’s a more powerful chatbot that delivers more natural, real-time conversations with a degree of contextual awareness, meaning it can understand and respond to emotion and non-verbal cues. It also processes prompts faster, which significantly reduces latency within conversations, and it lets you interrupt it at any time to change what it says.

OpenAI first demonstrated the new mode in April, when it showed how the tool can recognize different languages simultaneously and translate them in real time. During that demo, employees were able to interrupt ChatGPT, get it to tell stories in different ways, and more. One thing the bot can no longer do is sound like Scarlett Johansson — it now supports only four preset voices in order to prevent it from being used for impersonation. OpenAI has also put filters in place to block requests to generate music or other copyrighted audio, reflecting legal challenges raised against song-generating AI firms such as Suno.

Video and screen sharing capabilities are not yet available.

How it works

If you are a ChatGPT Plus subscriber running the latest version of the app, and are accepted to the test, you can access the bot from within the app by tapping the Voice icon at the bottom of the screen. You can then switch between the new Advanced mode and the existing Standard mode (better for longer sessions) using an interface at the top of the screen. Privacy concerns mean many Apple users might prefer to access these features via Apple Intelligence.

What about privacy?

Apple Intelligence puts additional safeguards in place to protect people’s privacy. As Wired points out, ChatGPT’s user agreement at present appears to permit OpenAI to use your voice and images for training purposes. In a remarkably quotable line, AI consultant Angus Allan calls ChatGPT a “data hoover on steroids.” He added: “Their privacy policy explicitly states they collect all user input and reserve the right to train their models on this.”

This is less of a problem when ChatGPT is used with Apple Intelligence, as those requests are anonymized and data from those sessions is not used to train ChatGPT models, according to Apple. If that proves true, many Apple users will eventually gravitate to accessing ChatGPT via Apple’s AI as the safest way to use it.

All eyes now will turn to Google, which is expected to introduce similar features within Google Gemini AI soon — features that might also end up being integrated inside Apple Intelligence. The battle of the bots is heating up.

Please follow me on Mastodon, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.