Transformative Tech: Apple’s Inclusive Features Unveiled

Apple has announced a series of innovative accessibility features that harness the power of artificial intelligence and machine learning. Notable features include Eye Tracking, which allows users with physical disabilities to control their iPhone or iPad using their eyes, and Music Haptics, which gives users who are deaf or hard of hearing a new way to experience music. Additional features include Vocal Shortcuts, Vehicle Motion Cues and various enhancements for CarPlay and visionOS. These additions underscore Apple’s ongoing commitment to inclusive design and to enhancing the user experience through accessible technology.

  • Apple announces new accessibility features, set to be released later this year, including Eye Tracking, Music Haptics and Vocal Shortcuts.
  • Features are designed to enhance user experience for individuals with physical disabilities, hearing impairments and speech challenges.
  • Apple showcases commitment to inclusive design by leveraging technology like artificial intelligence and machine learning.

Apple Introduces Eye Tracking for iPads and iPhones

Eye Tracking, a groundbreaking feature powered by artificial intelligence, allows users to navigate iPads and iPhones using only their eyes. This feature is primarily designed for individuals with physical disabilities and promises to transform the way they interact with their devices. The technology uses the front-facing camera to set up and calibrate in seconds, and all data used to control this feature is securely stored on the device, ensuring user privacy.
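
Apple has not announced a public API for the system-level Eye Tracking feature, but developers curious about the underlying idea can look at ARKit’s existing face tracking, where the TrueDepth front camera produces an ARFaceAnchor whose lookAtPoint estimates where the user’s gaze converges. The Swift sketch below is purely illustrative of that public analogue (the GazeReader class name is invented), not Apple’s implementation.

```swift
import ARKit

// Illustrative sketch only: Apple's Eye Tracking accessibility feature has
// no announced public API. The nearest public analogue is ARKit face
// tracking, where each ARFaceAnchor exposes a lookAtPoint estimating where
// the user's gaze converges. The class name here is invented.
final class GazeReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking needs a device with a TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is in face-anchor coordinates; mapping it to
            // screen coordinates is omitted for brevity.
            print("Estimated gaze point:", face.lookAtPoint)
        }
    }
}
```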

Music Haptics: A New Musical Experience

Another innovative feature, Music Haptics, is designed to make music more accessible to users who are deaf or hard of hearing. With the feature turned on, the iPhone’s Taptic Engine plays taps, textures and refined vibrations in sync with the music’s audio, offering a unique way to experience music.
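
For a concrete flavour of the mechanism, the sketch below uses the public Core Haptics framework, Apple’s general-purpose developer interface to the Taptic Engine, to play a single sharp tap, the basic building block of an experience like Music Haptics. A music-synced feature would schedule many such events against the audio timeline; treat this as an illustrative sketch, not Apple’s implementation.

```swift
import CoreHaptics

// Illustrative sketch only: Music Haptics is a system feature. Core Haptics
// is the public interface to the Taptic Engine, used here to play one sharp
// transient "tap"; a music-synced experience would schedule many of these
// against the audio timeline.
func playTap() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
    let engine = try CHHapticEngine()
    try engine.start()

    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
        ],
        relativeTime: 0
    )
    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
}
```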

Vocal Shortcuts and Other Speech Features

Apple’s Vocal Shortcuts feature allows iPhone and iPad users to assign custom utterances that Siri can recognise to launch shortcuts and complete complex tasks. It is particularly beneficial for individuals with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS) or the effects of a stroke.

“Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech, so we are thrilled that Apple is bringing these new accessibility features to consumers,” said Mark Hasegawa-Johnson, principal investigator of the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign.
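
Vocal Shortcuts itself is a system feature with no direct public API, but the existing App Intents framework is the developer-facing route by which an app exposes actions that Siri and Shortcuts (and therefore a custom utterance) can trigger. The Swift sketch below is a minimal, hypothetical example; the intent name and phrase are invented for illustration.

```swift
import AppIntents

// Illustrative sketch only: Vocal Shortcuts is a system feature. App Intents
// is the public framework through which an app exposes actions to Siri and
// Shortcuts; the intent name and phrase below are invented for illustration.
struct SendCheckInIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Check-In"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific work would go here (e.g. messaging a caregiver).
        return .result(dialog: "Check-in sent.")
    }
}

struct CheckInShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SendCheckInIntent(),
            phrases: ["Send a check-in with \(.applicationName)"],
            shortTitle: "Check In",
            systemImageName: "checkmark.circle"
        )
    }
}
```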

On the Move: Reducing Motion Sickness and Bringing Voice Control to CarPlay

Vehicle Motion Cues, another new feature, aims to reduce motion sickness for passengers using their iPhone or iPad in moving vehicles – be that by land, sea or air. Through animated dots on the screen’s edges representing changes in vehicle motion, this feature helps reduce the sensory conflict that often leads to motion sickness.
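
There is no public API for Vehicle Motion Cues, but the public Core Motion framework exposes the kind of sensor data such a feature relies on. The sketch below, with an invented class name, reads device acceleration that a UI layer could translate into animated dots at the screen edges; it is an illustration of the idea, not Apple’s implementation.

```swift
import CoreMotion

// Illustrative sketch only: Vehicle Motion Cues is a system feature with no
// public API. Core Motion is the public framework that exposes the sensor
// data such a feature relies on; here, device acceleration is read and could
// be mapped to the offsets of on-screen dots.
final class MotionCueSource {
    private let manager = CMMotionManager()

    func start(onUpdate: @escaping (_ lateral: Double, _ longitudinal: Double) -> Void) {
        guard manager.isDeviceMotionAvailable else { return }
        manager.deviceMotionUpdateInterval = 1.0 / 60.0
        manager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let a = motion?.userAcceleration else { return }
            // Acceleration in g, which a UI layer could translate into
            // animated dot movement along the screen edges.
            onUpdate(a.x, a.z)
        }
    }
}
```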

Accessibility features coming to CarPlay include Voice Control, Colour Filters and Sound Recognition. With Voice Control, users can navigate CarPlay and control apps with just their voice. With Sound Recognition, drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens. For users who are colourblind, Colour Filters make the CarPlay interface visually easier to use, with additional visual accessibility features including Bold Text and Large Text.
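
CarPlay’s Sound Recognition is built into the system, but the public SoundAnalysis framework ships a comparable built-in sound classifier that any app can run over live audio. The sketch below shows the shape of such detection; the “siren” label and 0.8 confidence threshold are assumptions for illustration.

```swift
import AVFoundation
import SoundAnalysis

// Illustrative sketch only: CarPlay's Sound Recognition is built in.
// SoundAnalysis is the public framework with a comparable system sound
// classifier; the "siren" label and 0.8 threshold are assumptions.
final class SirenDetector: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        // Requires microphone permission (NSMicrophoneUsageDescription).
        let format = engine.inputNode.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer

        engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
            analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
        }
        try engine.start()
    }

    // Called by the analyzer with each classification window.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult else { return }
        for match in result.classifications
        where match.identifier == "siren" && match.confidence > 0.8 {
            print("Possible siren detected (confidence: \(match.confidence))")
        }
    }
}
```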

Accessibility Features Coming to visionOS

This year, accessibility features coming to visionOS will include systemwide Live Captions to help everyone follow along with spoken dialogue in live conversations and in audio from apps. With Live Captions for FaceTime in visionOS, more users can easily enjoy the unique experience of connecting and collaborating using their Persona. Apple Vision Pro will add the capability to move captions using the window bar during Apple Immersive Video, as well as support for additional Made for iPhone hearing devices and cochlear hearing processors. Updates for vision accessibility will include Reduce Transparency, Smart Invert and Dim Flashing Lights for users who have low vision or who want to avoid bright lights and frequent flashing.

Features such as VoiceOver, Zoom and Colour Filters can also give users who are blind or have low vision access to spatial computing, while features such as Guided Access can support users with cognitive disabilities. Users can control Vision Pro with any combination of their eyes, hands or voice, and accessibility features including Switch Control, Sound Actions and Dwell Control can also help those with physical disabilities.
 
“Apple Vision Pro is without a doubt the most accessible technology I’ve ever used,” said Ryan Hudson-Peralta, a Detroit-based product designer, accessibility consultant, and cofounder of Equal Accessibility LLC. “As someone born without hands and unable to walk, I know the world was not designed with me in mind, so it’s been incredible to see that visionOS just works. It’s a testament to the power and importance of accessible and inclusive design.”

Final Thoughts

Apple’s announcement of these new accessibility features is a significant step towards inclusivity and a testament to the transformative power of technology. The features enhance the usability of Apple products for individuals with physical disabilities, hearing impairments and speech challenges, as well as for those who experience motion sickness, and they demonstrate Apple’s commitment to designing products for everyone. We look forward to their release later this year.

FAQ

Q: What is Eye Tracking and how does it work?
A: Eye Tracking is a new accessibility feature that allows users with physical disabilities to control iPad and iPhone with just their eyes. It uses the front-facing camera to set up and calibrate in seconds, and users can navigate through apps and activate functions solely with their eyes.

Q: How does Music Haptics make music more accessible?
A: Music Haptics is a feature for users who are deaf or hard of hearing that uses the Taptic Engine in iPhone to play taps, textures and refined vibrations in sync with the music’s audio. This allows users to experience music in a new way.

Q: What are Vocal Shortcuts and who can benefit from them?
A: Vocal Shortcuts allow iPhone and iPad users to assign custom utterances that Siri can understand to launch shortcuts and perform tasks. These are beneficial for users with acquired or progressive conditions that affect speech.

Q: How does Vehicle Motion Cues help reduce motion sickness?
A: Vehicle Motion Cues is a feature for iPhone and iPad that helps reduce motion sickness for users in moving vehicles by representing changes in vehicle motion with animated dots on the screen, reducing sensory conflict without interfering with the main content.

Q: What additional updates are coming to CarPlay for accessibility?
A: CarPlay will introduce Voice Control for navigation and app control, Sound Recognition for alerts on car horns and sirens for users who are deaf or hard of hearing, and Colour Filters for users who are colourblind, along with other visual accessibility features.

Q: What new accessibility features are coming to visionOS?
A: visionOS will introduce systemwide Live Captions for spoken dialogue in live conversations and audio from apps, support for additional Made for iPhone hearing devices, and updates such as Reduce Transparency, Smart Invert and Dim Flashing Lights for users with low vision or sensitivity to bright lights and flashing.

Q: What updates are coming for users who are blind or have low vision?
A: VoiceOver will include new voices and customisation options, Magnifier will offer new modes, and Braille users will have improved input options. Additionally, Hover Typing will show larger text for users with low vision.