Apple's new accessibility features, led by Eye Tracking, let you control your iPhone and iPad with just your eyes

Ahead of Global Accessibility Awareness Day on May 16, 2024, Apple unveiled a number of new accessibility features for the iPhone, iPad, Mac and Vision Pro. Eye Tracking heads a long list of new features, letting you control your iPhone and iPad by moving your eyes.

Eye Tracking, Music Haptics, Vocal Shortcuts and Vehicle Motion Cues will be available on eligible Apple devices later this year. These new accessibility features will most likely arrive with iOS 18, iPadOS 18, visionOS 2 and the next version of macOS.

These accessibility previews have become an annual tradition for Apple. Normally, the curtain is raised a few weeks before WWDC (the Worldwide Developers Conference), which kicks off on June 10, 2024. That should be the event where Apple shows off its next generation of core operating systems and its AI chops.

Eye Tracking looks seriously impressive

(Image credit: Apple)

Eye Tracking looks impressive and is an important way to make the iPhone and iPad even more accessible. As shown in Apple's release and an accompanying video, iPadOS (and iOS) lets you navigate, open apps and even control interface elements using just your eyes. The feature relies on the front-facing camera, artificial intelligence and on-device machine learning throughout the experience.

You can look around the interface and use "Dwell Control" to interact with a button or other element: hold your gaze on it and it activates. Gestures are also handled via eye movements. That means you can look at Safari, Phone or any other app, hold your gaze, and it will open.

Most importantly, all setup and usage data is kept locally on the device, so all you need is your iPhone; no accessory is required to use Eye Tracking. It is designed for people with physical disabilities and builds on other accessible ways to control an iPhone or iPad.
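Apple hasn't said how Eye Tracking is implemented, but the Dwell Control interaction is easy to picture in code: if the gaze holds steady near one point for long enough, treat that as a tap. Here's a minimal sketch in Swift of that idea; the class name, radius and timing values are all hypothetical illustration, not Apple's API.

```swift
import CoreGraphics
import Foundation

/// Minimal dwell-activation sketch: when a stream of gaze points stays
/// within `radius` of one spot for `dwellTime` seconds, report a "tap".
/// Hypothetical illustration only; not how Apple's Eye Tracking works.
final class DwellDetector {
    private let radius: CGFloat          // how far the gaze may drift
    private let dwellTime: TimeInterval  // how long the gaze must hold
    private var anchor: CGPoint?
    private var anchorTime: TimeInterval?

    init(radius: CGFloat = 40, dwellTime: TimeInterval = 1.0) {
        self.radius = radius
        self.dwellTime = dwellTime
    }

    /// Feed gaze samples (e.g. derived from the front-facing camera).
    /// Returns the dwell point once the gaze has held steady long enough.
    func process(gaze: CGPoint, at time: TimeInterval) -> CGPoint? {
        if let a = anchor, let t0 = anchorTime,
           hypot(gaze.x - a.x, gaze.y - a.y) <= radius {
            if time - t0 >= dwellTime {
                anchor = nil             // reset so we don't fire twice
                anchorTime = nil
                return a                 // dwell complete: "tap" here
            }
        } else {
            anchor = gaze                // gaze moved: restart the timer
            anchorTime = time
        }
        return nil
    }
}
```

The real feature is surely far more sophisticated (calibration, smoothing, per-element hit testing), but a hold-to-activate loop like this is the core of any dwell-based pointer.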

Vocal Shortcuts, Music Haptics and Live Captions on Vision Pro

(Image credit: Apple)

Another new accessibility feature is Vocal Shortcuts, designed for iPad and iPhone users with ALS (amyotrophic lateral sclerosis), cerebral palsy, stroke, or “acquired or progressive conditions that affect speech.” It lets you set a custom sound that Siri can learn and identify to launch a specific shortcut or perform a task. It arrives alongside Listen for Atypical Speech, designed for the same users, which opens speech recognition to a wider audience.

These two features build on several introduced in iOS 17, so it’s great to see Apple continuing to innovate. With Atypical Speech in particular, Apple uses artificial intelligence to learn and recognize different types of speech.
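Apple hasn't exposed how Vocal Shortcuts is built, but the trigger concept (hear a chosen utterance, run an action) can be roughed out with the public Speech framework. The sketch below is a stand-in, not Apple's implementation: the real feature learns custom, possibly non-word sounds on-device, while this just scans a live transcription for a phrase. It assumes microphone access and speech-recognition authorization have already been granted.

```swift
import AVFoundation
import Speech

/// Rough stand-in for the Vocal Shortcuts trigger idea: listen to the
/// microphone and run an action when a chosen phrase is heard. Apple's
/// feature learns custom sounds on-device; this sketch just matches a
/// phrase in a live transcription. Assumes authorization is granted.
final class PhraseTrigger {
    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()

    func start(phrase: String, action: @escaping () -> Void) throws {
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        // Stream microphone buffers into the recognition request.
        input.installTap(onBus: 0, bufferSize: 1024, format: format) {
            buffer, _ in self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        recognizer?.recognitionTask(with: request) { result, _ in
            guard let text = result?.bestTranscription.formattedString else { return }
            if text.lowercased().contains(phrase.lowercased()) {
                action()   // e.g. kick off a Shortcuts automation
            }
        }
    }
}
```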

Music Haptics on iPhone is designed to let users who are deaf or hard of hearing experience music. The built-in Taptic Engine, which powers the iPhone’s haptics, plays vibrations such as taps and textures that mirror the audio of a song. At launch it will work with “millions of songs” in Apple Music, and an open API will let developers make music from other sources accessible as well.
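That API hasn't been published yet, but Core Haptics, Apple's existing public framework for the Taptic Engine, shows what tap-style playback looks like in practice. The sketch below plays one sharp transient per beat; the beat times are hypothetical placeholder data, and this is not the Music Haptics API itself.

```swift
import CoreHaptics

// Plays a sharp haptic "tap" at each beat time via the Taptic Engine.
// Illustrates the kind of playback Music Haptics describes; the beat
// times here are placeholder data, not derived from a real song.
func playBeatHaptics(beatTimes: [TimeInterval]) throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
    let engine = try CHHapticEngine()
    try engine.start()

    // One transient event per beat, strong and fairly sharp.
    let events = beatTimes.map { t in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8),
            ],
            relativeTime: t
        )
    }
    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

A real implementation would derive the event times, intensities and textures from the song's audio, which is presumably what the promised API will hand developers.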

In addition, Apple previewed a number of other features and updates. Vehicle Motion Cues will be available on iPhone and iPad and aims to reduce motion sickness with animated dots on the screen that shift when vehicle movement is detected, all without blocking what you’re looking at.
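Apple hasn't described how Vehicle Motion Cues detects movement, but Core Motion already exposes the raw signal. Here's a hedged sketch of the general idea: sample device acceleration and smooth it into an offset that could drive those on-screen dots. The smoothing factor and scaling are guesses for illustration, not Apple's values.

```swift
import CoreGraphics
import CoreMotion

/// Sketch of the Vehicle Motion Cues idea: turn device acceleration
/// into a smoothed offset that animated dots could follow. The alpha
/// and scale constants are illustrative guesses only.
final class MotionCueModel {
    private let manager = CMMotionManager()
    private(set) var dotOffset = CGVector(dx: 0, dy: 0)

    func start() {
        guard manager.isDeviceMotionAvailable else { return }
        manager.deviceMotionUpdateInterval = 1.0 / 60.0
        manager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self, let a = motion?.userAcceleration else { return }
            // Low-pass filter the acceleration so the dots drift
            // smoothly instead of jittering with every bump.
            let alpha: CGFloat = 0.1
            let target = CGVector(dx: CGFloat(a.x) * 100, dy: CGFloat(a.z) * 100)
            self.dotOffset = CGVector(
                dx: self.dotOffset.dx + alpha * (target.dx - self.dotOffset.dx),
                dy: self.dotOffset.dy + alpha * (target.dy - self.dotOffset.dy)
            )
        }
    }

    func stop() { manager.stopDeviceMotionUpdates() }
}
```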

(Image credit: Apple)

An important addition to visionOS – the software that powers Apple Vision Pro – is systemwide Live Captions. This makes captions for spoken dialogue in FaceTime calls and audio from apps immediately visible. Apple notes in the release that it is designed for users who are deaf or hard of hearing, but, like all accessibility features, it can be found in Settings.

Since these are Live Captions on an Apple Vision Pro, you can move and resize the caption window like any other window. Vision accessibility within visionOS will also gain Reduce Transparency, Smart Invert and Dim Flashing Lights.

As for when these will arrive, Apple says in the release only that the new accessibility features are coming “later this year.” We’re keeping a close eye on this and imagine they will ship with next-generation operating systems like iOS 18 and iPadOS 18, meaning those with developer accounts may be able to test these features in upcoming beta releases.

Considering that some of these features are powered by on-device machine learning, accessibility is just one area where Apple believes AI can have an impact. We’ll likely hear the tech giant share more of its thoughts on AI and consumer-ready features at WWDC 2024.
