We don’t expect Apple to announce iOS 17, macOS 14, or iPadOS 17 until its WWDC23 event on June 5, but the company is already starting to share some details ahead of time, specifically in relation to new accessibility features.
The company appears to be clearing the decks for what is sure to be a busy WWDC thanks to the unveiling of the Reality Pro AR/VR headset.
So it’s announcing some of those changes early, including Live Speech, Personal Voice, and Point and Speak in Magnifier.
Apple detailed the changes, including Live Speech, which will allow users to type what they want to say and have it read aloud during FaceTime calls and in-person conversations.
With Live Speech on iPhone, iPad, and Mac, users can type what they want to say to have it be spoken out loud during phone and FaceTime calls as well as in-person conversations. Users can also save commonly used phrases to chime in quickly during lively conversation with family, friends, and colleagues. Live Speech has been designed to support millions of people globally who are unable to speak or who have lost their speech over time.
Blind and low-vision users will also be able to use Point and Speak, an expansion of Detection Mode in Magnifier.
Point and Speak in Magnifier makes it easier for users with vision disabilities to interact with physical objects that have several text labels. For example, while using a household appliance — such as a microwave — Point and Speak combines input from the camera, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across the keypad.
There’s a lot more on offer as well. On Macs for example, Apple tells us that “made for iPhone hearing devices can be paired directly to Mac and be customized to the user’s hearing comfort.”
It’ll be interesting to see whether these new features are expanded further before WWDC23, and whether they’ll roll out alongside the launch of the iPhone 15 series.