Global Accessibility Awareness Day Announcements from Apple

Image: an iPad and two iPhones

by Joe Lonergan

Global Accessibility Awareness Day was on May 18th, and to celebrate, Apple announced some accessibility improvements coming later this year, most likely arriving in iOS 17.

The announcements included features designed to enhance usability for people with cognitive, vision, hearing, and mobility accessibility needs, as well as for people who are nonspeaking or at risk of losing their speech.

In this article we will cover the new features most relevant to users with vision impairment.

Point and Speak
Point and Speak is a new feature coming to the Magnifier app. The Magnifier app has recently added features such as Door Detection, People Detection, and Image Descriptions, and Point and Speak will be another string to its bow. The only catch is that you will need a supported model with LiDAR technology, which in short means any iPhone with Pro or Pro Max in its name, from the iPhone 12 Pro and Pro Max up to the latest iPhone 14 Pro and Pro Max.

By using LiDAR together with the camera and on-device machine learning, Point and Speak announces the text on various buttons as the user moves their finger across them, for example the buttons on a kitchen appliance such as a washing machine or microwave. As you can imagine, it could be useful in many situations. It is not yet known what the optimal use case for Point and Speak will be, but it has heightened speculation about the much-anticipated mixed reality headset expected to be announced at next month's Worldwide Developers Conference. At the moment it is touted to recognise only text, not objects or graphics, but it could be one of the features built into the mixed reality headset. As they say, watch this space.
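For readers curious about what on-device text recognition looks like in practice, here is a minimal sketch using Apple's Vision framework. It is purely illustrative and assumes nothing about how Apple has actually built Point and Speak; the recognizeText function name is our own.

import UIKit
import Vision

// Illustrative sketch only: Point and Speak is a built-in Magnifier feature,
// but the general idea of on-device text recognition can be shown with
// Apple's Vision framework.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    // Ask Vision to find and read text in the image.
    let request = VNRecognizeTextRequest { request, _ in
        let observations = (request.results as? [VNRecognizedTextObservation]) ?? []
        // Keep the best candidate string for each piece of text found.
        let strings = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(strings)
    }
    request.recognitionLevel = .accurate

    // Run the request off the main thread so the UI stays responsive.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([])
        }
    }
}

Point and Speak presumably adds LiDAR depth data on top of this kind of recognition to work out which piece of text sits under the user's finger, but that part is speculation on our side.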

Other improvements were announced for low-vision and VoiceOver users, such as enhanced text size adjustment across various native Mac apps, including Finder, Mail, Notes, Calendar, and Messages.

VoiceOver users who like to use Siri will get more natural and expressive Siri voices, even at higher rates of speech.

Siri itself will become more customisable to suit users' preferences. You will soon be able to customise the rate at which Siri speaks, with options ranging from 0.8x to 2x.

Assistive Access
Assistive Access is a customisable experience that simplifies the Home Screen and core applications such as Messages, Camera, Photos, and Music, displaying only their core functions with larger text and buttons.

In Assistive Access mode, the Phone and FaceTime apps will be combined into a single, basic Calls app. The Home Screen will also be customisable and can be laid out as a grid or a list, depending on user preference.

We can see Assistive Access becoming a great addition for those who want a smartphone but do not want the cognitive overload of dozens of apps taking over their Home Screen, or for those who do not want to navigate many buttons before completing a simple task such as playing their favourite music.

Assistive Access will also offer an emoji-only keyboard and the ability to easily record video messages, which will suit those who prefer to communicate visually.

Live Speech
Apple announced another great feature coming soon called Live Speech, which allows people to type what they want to say and have it spoken aloud during phone calls, FaceTime calls, and in-person conversations. This might be helpful for some deaf-blind communication, as well as for people with speech difficulties who want to communicate in more situations.
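Live Speech is a system feature rather than a developer API, but the basic idea of speaking typed text aloud on the device can be sketched with Apple's long-standing AVSpeechSynthesizer API. This is a rough illustration under our own assumptions, not how Live Speech itself is implemented, and the speakTypedText function name is our own.

import AVFoundation

// Illustrative sketch only: Live Speech is a system-level feature, but the
// underlying idea of speaking typed text aloud can be shown with Apple's
// AVSpeechSynthesizer API.
let synthesizer = AVSpeechSynthesizer()

func speakTypedText(_ typedText: String) {
    let utterance = AVSpeechUtterance(string: typedText)
    // Use an Irish English voice if one is available, otherwise the system default.
    utterance.voice = AVSpeechSynthesisVoice(language: "en-IE")
    // Speak at the default rate; the framework allows the rate to be adjusted
    // between its minimum and maximum speech rates.
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

speakTypedText("I would like a table for two, please.")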

Personal Voice
Apple is also working on a cool feature called Personal Voice. This is geared towards people who may be at risk of losing their speaking voice. The idea is that a person will spend about 15 minutes saying a series of provided phrases, and this will be used to create an on-device AI audio model of their voice.

The exact way this will be used is unclear, but it is thought it will give the user the choice to type phrases and have them spoken aloud in their own voice. Irish readers might be familiar with the well-known news reporter Charlie Bird, who did something similar when he was losing his speaking voice. The difference is that Charlie Bird had almost four decades of recordings to go on, whereas this new type of voice banking technology will be available on a far more widespread and accessible platform from Apple.

Who knows where it will take us; maybe one day you will have the option of a screen reader that speaks in your own voice, but this is not envisaged in the short term.