New iOS 17 Accessibility features: Assistive Access, Personal Voice, and Live Speech

While everyone is anticipating WWDC 2023, Apple has pleasantly surprised us by releasing a sneak peek of iOS 17’s new accessibility features. The timing of the announcement, just before Global Accessibility Awareness Day, gave it added significance.

Apple has introduced some of the new features it will roll out later this year. So let’s get an overview of the new accessibility features in iOS 17 before we dive into some hands-on practice.

Apple previews new iOS 17 accessibility features

Apple has long built accessibility features that benefit a wide range of users with disabilities, in keeping with its goal of making people’s lives better. Powered by on-device machine learning, the new features focus on cognitive, vision, hearing, and mobility accessibility.

According to Apple CEO Tim Cook, “Today, we’re excited to share incredible new features that build on our long history of making technology accessible so that everyone has the opportunity to create, communicate, and do what they love.”

“At Apple, we believe technology should be designed to help everyone do what they love. We’re excited to preview new accessibility features to help even more people follow their dreams.” — Tim Cook (@tim_cook), May 16, 2023

Let’s delve into the upcoming iPhone features and learn what Apple offers.

1. Assistive Access

Assistive Access in iOS 17
Image credit: Apple

Assistive Access, a cognitive accessibility feature, will help people with cognitive disabilities use the iPhone and iPad more easily and independently. It simplifies app interfaces and highlights their essential elements, lowering cognitive strain. To get this feature right, Apple worked closely with people with cognitive disabilities.

Essential apps such as Camera, Photos, Music, Calls, and Messages will get high-contrast buttons, large text labels, and configurable options for personal preferences. Whether you prefer a text-based interface or a visual, grid-based layout, Assistive Access will provide a tailored experience that increases usability and independence.

Additionally, for convenience, Apple has integrated Phone and FaceTime into the Calls app. With the emoji-only keyboard and the ability to record a video message to send to loved ones in the Messages app, users with cognitive disabilities can connect visually.

2. Live Speech and Personal Voice Advance Speech Accessibility

Live Speech and Personal Voice Advance Speech Accessibility in iOS 17
Image credit: Apple

For users at risk of losing their ability to speak, such as people with ALS (amyotrophic lateral sclerosis), Apple has introduced Live Speech and Personal Voice for speech accessibility.

With Live Speech, you can type what you want to say during phone and FaceTime calls, as well as in-person conversations, and your iPhone will speak it aloud. Users can also save frequently used words and phrases for quick access mid-conversation.

Personal Voice, on the other hand, lets you create a synthesised voice that sounds like you. You record 15 minutes of audio on your iPhone while reading text prompts, and your Personal Voice is then integrated with Live Speech so you can speak in your own voice. And as usual, the processing happens on device to keep your information private.
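Apple is also expected to expose Personal Voice to third-party apps through the AVFoundation speech APIs. Below is a minimal, untested Swift sketch of what that might look like, assuming the iOS 17 additions `AVSpeechSynthesizer.requestPersonalVoiceAuthorization` and the `isPersonalVoice` voice trait; the helper function name is hypothetical.

```swift
import AVFoundation

// Hypothetical helper: speak a phrase with the user's Personal Voice if one
// is available and authorised, falling back to a default system voice.
@available(iOS 17.0, *)
func speakWithPersonalVoice(_ text: String, using synthesizer: AVSpeechSynthesizer) {
    // Ask the user for permission to use their Personal Voice.
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        let utterance = AVSpeechUtterance(string: text)

        if status == .authorized,
           let personal = AVSpeechSynthesisVoice.speechVoices()
               .first(where: { $0.voiceTraits.contains(.isPersonalVoice) }) {
            utterance.voice = personal  // the user's own trained voice
        } else {
            utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        }

        synthesizer.speak(utterance)
    }
}
```

Note that the `AVSpeechSynthesizer` is passed in rather than created locally: speech is asynchronous, so the app must keep the synthesizer alive until the utterance finishes.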

3. Point and Speak

Point and Speak, a new vision accessibility tool, will be added to Detection Mode in Magnifier. Designed for users who are blind or have low vision, it helps you interact with physical objects that carry text labels: point at any text, and your iPhone will recognise it and read it aloud.

Point and Speak combines input from the camera, the LiDAR Scanner, and on-device machine learning. It works with VoiceOver and can be used alongside other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users with vision disabilities navigate their surroundings.

Apple accessibility Magnifier Point and Speak in iOS 17
Image credit: Apple

Cheers to empowerment!

The upcoming iOS 17 features reflect Apple’s commitment to inclusivity and to empowering users with disabilities worldwide. Thanks to Apple’s collaboration with disability communities, these features address real-life needs, while on-device machine learning safeguards user privacy. I can’t wait to try out the new features and see how they work. How about you?


Leave a comment