Apple’s iPhone will soon speak with your voice

- Apple introduces Personal Voice and Live Speech for speech-impaired users and those who avoid phone calls.
- Google’s Project Relate and Euphonia improve understanding of non-standard speech.
- Apple previews Assistive Access, Point and Speak, and other accessibility features ahead of WWDC.
Tech companies have been actively working on solutions to assist individuals with speech issues, and Apple is now joining the effort. Previously, Google introduced Project Relate and Euphonia, initiatives aimed at training speech recognition models to better understand “non-standard” speech.
Apple’s solution, called Personal Voice, goes a step further by letting people who may eventually lose their ability to speak create a synthesized voice that sounds like their own. The feature could also benefit individuals who prefer not to make phone calls for various reasons. It would be wonderful if Google introduced a similar feature for Android users as well.
In addition to the Personal Voice feature, Apple is introducing another accessibility tool called Live Speech. With Live Speech, users can type a message that will be spoken aloud during phone calls or FaceTime chats. The feature is reminiscent of real-time text (RTT) calling, but the inclusion of Personal Voice support makes it more personal and tailored to individual needs.
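
For developers curious what typed text-to-speech looks like in code, here is a minimal sketch using AVFoundation’s public speech synthesis API. Live Speech itself is a system feature rather than an API, and the `speak` helper and sample message are purely illustrative; Personal Voice would substitute a voice built from the user’s own recordings for the stock voice used here.

```swift
import AVFoundation

// Minimal sketch of the typed-text-to-speech idea behind Live Speech,
// using the public AVFoundation speech API with a stock system voice.
let synthesizer = AVSpeechSynthesizer()

func speak(_ typedMessage: String) {
    let utterance = AVSpeechUtterance(string: typedMessage)
    // A standard English voice; Personal Voice would replace this with
    // a voice generated from the user's own recordings.
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

speak("Sorry, I'm running about ten minutes late.")
```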
Apple has also announced other accessibility features that will roll out later this year. One is Assistive Access, a simplified, easy-mode interface designed to make the iPhone and iPad more approachable for users with cognitive disabilities. Another, Point and Speak, uses the device’s camera viewfinder to read aloud labels or text as users point at them.
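
To make the idea behind Point and Speak more concrete, here is a rough sketch of the recognize-then-speak pattern using Apple’s public Vision and AVFoundation frameworks. The real feature lives in the Magnifier app and also relies on the pointing gesture and the LiDAR Scanner; the `readAloudText` helper below is a hypothetical illustration, not Apple’s implementation.

```swift
import CoreGraphics
import Vision
import AVFoundation

// Rough sketch: recognize text in a camera frame, then read it aloud.
let synthesizer = AVSpeechSynthesizer()

func readAloudText(in frame: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Take the best candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }
        synthesizer.speak(AVSpeechUtterance(string: lines.joined(separator: ". ")))
    }
    request.recognitionLevel = .accurate

    // Hand the frame to Vision for on-device text recognition.
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])
}
```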
While Apple has not provided a specific release timeline, these features will likely arrive in this year’s iPhone and iPad software updates. More details can be expected at Apple’s WWDC event next month.