iPhone to get a motion sickness-fighting feature, Music Haptics, and more
- Apple is rolling out a set of new accessibility features for the iPhone and iPad.
- These features include vehicle motion cues, music haptics, eye tracking, and more.
- The tech giant plans to launch these features later this year.
Apple is working on a set of new accessibility features for the iPhone and iPad, ranging from a new way to navigate apps with your eyes to a tool that reduces motion sickness in the car.
In a blog post, Apple detailed several of the upcoming accessibility features, one of which lets users control their iPhone or iPad with their eyes. This eye-tracking feature uses the front camera and on-device machine learning to let users navigate their apps with just their eyes, while Dwell Control lets them activate on-screen elements and access additional functions such as physical buttons, swipes, and other gestures.
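Apple hasn't said exactly how the new eye tracking works beyond the front camera and on-device machine learning, but for a rough sense of the underlying idea, developers can already pull a gaze estimate from ARKit's face tracking on devices with a TrueDepth camera. The snippet below is purely an illustration of that existing API, not Apple's implementation:

```swift
// Illustrative only: this uses ARKit's existing face tracking, not the API
// behind Apple's new eye-tracking accessibility feature.
import ARKit
import UIKit

class GazeViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // lookAtPoint is ARKit's estimate of where the user is looking,
        // expressed in the face anchor's coordinate space.
        let gaze = face.lookAtPoint
        print("Gaze estimate: x=\(gaze.x), y=\(gaze.y), z=\(gaze.z)")
    }
}
```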
For users who are deaf or hard of hearing, Apple is also rolling out what it calls “Music Haptics.” This feature uses the Taptic Engine to vibrate the iPhone in sync with the music that’s playing. Apple says the feature works with millions of songs in the Apple Music catalog.
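Apple hasn't published developer details for Music Haptics, but the Taptic Engine it relies on is already exposed to apps through the Core Haptics framework. As a loose illustration of the concept, the sketch below fires a haptic tap for each beat an app has detected on its own; the beatTimes input is hypothetical, not something Core Haptics provides:

```swift
// Illustrative only: Music Haptics is a system-level feature, and Apple hasn't
// documented how it maps songs to vibrations. This sketch just shows Core
// Haptics driving the Taptic Engine at app-supplied beat timestamps.
import CoreHaptics

func playBeatHaptics(beatTimes: [TimeInterval]) throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
    let engine = try CHHapticEngine()
    try engine.start()

    // One sharp transient tap per beat, offset by the beat's timestamp.
    let events = beatTimes.map { time in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8),
            ],
            relativeTime: time
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```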
Another interesting feature the tech giant brought up aims to help those who are prone to motion sickness. Soon, the iPhone and iPad will offer “Vehicle Motion Cues,” animated dots that appear on screen and shift with changes in vehicle motion. The feature is designed to reduce the sensory conflict between what you see and what you feel, without disrupting the reading or watching experience.
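Apple hasn't explained how Vehicle Motion Cues is built, but the general idea can be sketched with existing APIs: read the device's motion sensors with Core Motion and nudge on-screen dots against the sensed acceleration. The SwiftUI example below is only a rough approximation of that concept, not Apple's implementation:

```swift
// Illustrative only: a rough approximation of motion-cue dots using Core Motion
// and SwiftUI, not the actual Vehicle Motion Cues feature.
import SwiftUI
import CoreMotion

final class MotionCueModel: ObservableObject {
    @Published var offset: CGSize = .zero
    private let manager = CMMotionManager()

    func start() {
        guard manager.isDeviceMotionAvailable else { return }
        manager.deviceMotionUpdateInterval = 1.0 / 60.0
        manager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let accel = motion?.userAcceleration else { return }
            // Shift the cue dots opposite to the sensed acceleration so they
            // appear to lag behind the vehicle's movement.
            self?.offset = CGSize(width: -accel.x * 100, height: accel.z * 100)
        }
    }
}

struct MotionCueOverlay: View {
    @StateObject private var model = MotionCueModel()

    var body: some View {
        // A simple row of dots, displaced smoothly by the sensed motion.
        HStack(spacing: 24) {
            ForEach(0..<5, id: \.self) { _ in
                Circle().frame(width: 10, height: 10).opacity(0.4)
            }
        }
        .offset(model.offset)
        .animation(.easeOut(duration: 0.2), value: model.offset)
        .onAppear { model.start() }
    }
}
```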
In addition to these features, Apple also announced Vocal Shortcuts, Voice Control for CarPlay, Live Captions in visionOS, the ability to move captions in Vision Pro, and more. There’s no exact rollout date, but Apple says the features will launch later this year.