Accessibility features are a core element of Apple’s many platforms and services, and the company is extending that commitment with new software features coming soon to all of its operating systems.

Today, Apple previewed powerful new software technology meant to make things easier for people with disabilities. The announcement says the new features are designed for those with mobility, vision, cognitive, and hearing disabilities. The press release echoes Apple’s belief that “accessibility is a human right,” and shows that Apple’s devices are customizable in such a way as to make them useful tools for all users.


Apple says that, later this year, it will roll out software updates across all of its operating systems to deliver these new features. New tools like AssistiveTouch will make it easier for those with “limb differences” to use Apple Watch, while the iPad will gain support for third-party eye-tracking hardware. And using on-device intelligence, Apple’s existing VoiceOver screen reader will be much improved for those with vision disabilities.

Apple also has a completely new service coming soon. It’s called SignTime, and it will make it easier for customers who are deaf or hard of hearing to communicate with Retail Customer Care and AppleCare specialists. Customers will be able to use American Sign Language (ASL) in the United States, British Sign Language (BSL) in the United Kingdom, and French Sign Language (LSF) in France. All of this will be possible from within a web browser, with a video call making it possible to have a one-on-one conversation with a sign language interpreter.


“At Apple, we’ve long felt that the world’s best technology should respond to everyone’s needs, and our teams work relentlessly to build accessibility into everything we make,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “With these new features, we’re pushing the boundaries of innovation with next-generation technologies that bring the fun and function of Apple technology to even more people — and we can’t wait to share them with our users.”

Apple has a video showing off AssistiveTouch. As soon as the company uploads it to its official YouTube channel, we will add it here.

Here’s how Apple describes AssistiveTouch for Apple Watch, new eye-tracking hardware support for iPad, and expanded VoiceOver:

AssistiveTouch

To support users with limited mobility, Apple is introducing a revolutionary new accessibility feature for Apple Watch. AssistiveTouch for watchOS allows users with upper body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls. Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench. AssistiveTouch on Apple Watch enables customers who have limb differences to more easily answer incoming calls, control an onscreen motion pointer, and access Notification Center, Control Center, and more.

VoiceOver improvements

Apple is introducing new features for VoiceOver, an industry‑leading screen reader for blind and low vision communities. Building on recent updates that brought Image Descriptions to VoiceOver, users can now explore even more details about the people, text, table data, and other objects within images. Users can navigate a photo of a receipt like a table: by row and column, complete with table headers. VoiceOver can also describe a person’s position along with other objects within images — so people can relive memories in detail, and with Markup, users can add their own image descriptions to personalize family photos.

iPad with eye-tracking support

iPadOS will support third-party eye-tracking devices, making it possible for people to control iPad using just their eyes. Later this year, compatible MFi devices will track where a person is looking onscreen and the pointer will move to follow the person’s gaze, while extended eye contact performs an action, like a tap.

The press release offers much more.

Support for bi-directional hearing aids and more

Apple has made a “significant update” to the Made for iPhone (MFi) hearing devices program, which now supports bi-directional hearing aids. The microphones in these hearing aids will make it possible for those “who are deaf or hard of hearing to have hands-free phone and FaceTime conversations,” with compatible models available later this year.

Apple is also bringing support for recognizing audiograms — charts that show the results of a hearing test — to Headphone Accommodations. Users can quickly customize their audio with their latest hearing test results imported from a paper or PDF audiogram. Headphone Accommodations amplify soft sounds and adjust certain frequencies to suit a user’s hearing.

In an effort to support neurodiversity, Apple is also introducing new background sounds. Here’s how the company describes them:

Everyday sounds can be distracting, discomforting, or overwhelming, and in support of neurodiversity, Apple is introducing new background sounds to help minimize distractions and help users focus, stay calm, or rest. Balanced, bright, or dark noise, as well as ocean, rain, or stream sounds continuously play in the background to mask unwanted environmental or external noise, and the sounds mix into or duck under other audio and system sounds.

And there’s even more coming later this year, including more customization options for Memoji.

Apple is also celebrating Global Accessibility Awareness Day within Apple Fitness+ and other areas.

All told, there is a lot coming from Apple this year in terms of accessibility support.