Apple has long been improving its accessibility features, and one of its latest updates will help individuals who are nonspeaking or at risk of losing their ability to speak, including by letting them train the device to speak in their own voice.
Live Speech and Personal Voice
Apple is working with community groups that represent people with disabilities to develop features that make its devices easier to use. Later this year, the updates will be available on iPhone, iPad, and Mac.
Live Speech will allow users to type what they want said out loud, which can be useful during phone calls, FaceTime calls, or even in-person conversations. Users can also save commonly used phrases to respond quickly.
The feature is designed for the millions of people who have already lost the ability to speak. Apple also created a companion feature that lets people at risk of losing their speech record their own voice for use with Live Speech.
Personal Voice has users read a randomized set of text prompts for about 15 minutes so the system can learn how they sound. This can be a great tool for people with ALS, as well as other conditions that progressively affect a person's ability to speak.
As for privacy concerns, the feature uses on-device machine learning, which keeps users' information private and secure. Once it is set up, users can type on the device and have it speak to friends and loved ones in their own voice.
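For readers curious how typed-to-spoken text works at a basic level, the sketch below uses Apple's public AVSpeechSynthesizer API to speak typed text entirely on-device. It is only a general illustration of on-device text-to-speech under assumed names (the TypedSpeechDemo class and the sample phrase are hypothetical), not Apple's actual Live Speech or Personal Voice implementation.

```swift
import AVFoundation

// A minimal sketch of on-device text-to-speech: typed text is spoken aloud
// by the system synthesizer without leaving the device. This is a general
// illustration, not Apple's Live Speech or Personal Voice implementation.
final class TypedSpeechDemo {
    private let synthesizer = AVSpeechSynthesizer()

    // Speak whatever the user typed, using a locally available system voice.
    func speak(_ typedText: String) {
        let utterance = AVSpeechUtterance(string: typedText)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}

// Example: a saved phrase spoken during a call or conversation.
let demo = TypedSpeechDemo()
demo.speak("I'd love a coffee, thank you!")
```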
Philip Green, a board member and ALS advocate at the nonprofit Team Gleason, said that being able to tell your friends and family you love them in a voice that sounds like you makes all the difference in the world, and that being able to create that voice in 15 minutes is "extraordinary."
Apple's Inclusivity
Apple CEO Tim Cook said the company has always believed the best technology is technology built for everyone. He added that Apple is excited to share new features that build on its long history of making technology accessible.
The tech giant also announced features aimed at cognitive and vision accessibility. Among them is Assistive Access, which "distills apps and experiences to their essential features" to lighten the cognitive load for users who need it.
With Assistive Access, the Phone and FaceTime apps can be combined into a single Calls app, and the experience also covers Messages, Camera, Photos, and Music. These apps feature a distinct interface with high-contrast buttons and large text labels for easier use.
For instance, users who prefer to communicate visually can use an emoji-only keyboard in Messages or record video messages for loved ones. The Home Screen can also be simplified into a grid layout of their most-used apps.
Katy Schmid, senior director of National Program Initiatives at The Arc of the United States, said that a feature providing a cognitively accessible experience on Apple devices means more open doors to education, employment, safety, and autonomy.