Apple’s upcoming iOS 19 and macOS 16 releases mark a significant shift in digital accessibility, rolling out features that address concrete challenges for users with disabilities. These updates don’t just tweak existing options—they introduce new tools and information that directly impact how users interact with their devices, apps, and the world around them.
Accessibility Nutrition Labels Arrive on the App Store
Users often struggle to determine whether an app will meet their accessibility needs before downloading it. With iOS 19, the App Store will display Accessibility Nutrition Labels on product pages. These labels provide a clear summary of which accessibility features an app supports, such as VoiceOver, Voice Control, Larger Text, Sufficient Contrast, Reduced Motion, and captions. This change enables people with disabilities to quickly assess whether an app is usable for them, eliminating the need to download and test blindly. For developers, these labels offer a new avenue to communicate their app’s accessibility strengths, reinforcing transparency and trust.
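The labels themselves are declared in App Store Connect rather than in code, but a claim like “VoiceOver supported” ultimately rests on annotations inside the app. Here is a minimal SwiftUI sketch of the kind of markup involved; the view and its strings are illustrative, not taken from any real app:

```swift
import SwiftUI

// Illustrative only: Nutrition Labels are declared in App Store Connect,
// but a "VoiceOver supported" claim rests on annotations like these.
struct PlaybackControls: View {
    @State private var isPlaying = false

    var body: some View {
        Button {
            isPlaying.toggle()
        } label: {
            Image(systemName: isPlaying ? "pause.fill" : "play.fill")
        }
        // Without an explicit label, VoiceOver falls back to the symbol
        // name; an accurate, state-aware label is what makes it usable.
        .accessibilityLabel(isPlaying ? "Pause" : "Play")
        .accessibilityHint("Plays or pauses the current track")
    }
}
```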
Magnifier Comes to Mac: Real-World Zoom, Document Reading, and More
Reading small print or distant objects on a Mac has traditionally required workarounds or external hardware. The new Magnifier app for Mac, debuting with macOS 16, connects directly to a user’s camera—whether it’s the built-in webcam, Continuity Camera from an iPhone, or a USB camera. This lets users zoom in on real-world objects, documents, or whiteboards and view them on a large screen. The app supports multiple live sessions, so users can, for example, follow a presentation while referencing a book. Customization options include brightness, contrast, color filters, and perspective adjustments. Captured views can be grouped and saved, streamlining repeated tasks like reviewing classroom notes or recipes.
Magnifier for Mac also integrates with Accessibility Reader (detailed below), converting physical text into a customizable, readable format for users with low vision or dyslexia.
Braille Access: Full-Featured Note-Taking Across Devices
Typing notes or performing calculations in braille typically requires specialized hardware. Apple’s new Braille Access feature transforms the iPhone, iPad, Mac, and Apple Vision Pro into robust braille note-takers. Users can launch apps, take notes, and perform calculations using Nemeth Braille, a code widely used for math and science in educational settings. The feature supports direct opening of Braille Ready Format (BRF) files, unlocking a library of accessible books and documents. Live Captions can now be transcribed directly onto braille displays in real time, making conversations and lectures more accessible for users who are deafblind or hard of hearing.
Accessibility Reader: Systemwide Customization for Reading
Standard reading modes often fail to accommodate users with dyslexia or low vision. Accessibility Reader, available across iPhone, iPad, Mac, and Apple Vision Pro, introduces a systemwide reading mode that lets users tailor font, color, spacing, and contrast to their preferences. Spoken Content support is built in, and the feature can be launched from any app—making it easier to digest everything from menus to textbooks. The Reader is also embedded in the Magnifier app, so users can interact with real-world text on the fly.
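Accessibility Reader is a system feature, so the sketch below is only an analogue: a minimal SwiftUI reading view with the same kinds of knobs, its settings model and defaults assumed for illustration.

```swift
import SwiftUI

// Assumed, simplified settings model; the real Accessibility Reader is a
// system feature with its own UI for font, color, spacing, and contrast.
struct ReaderView: View {
    let text: String
    @State private var fontSize: Double = 22
    @State private var lineSpacing: Double = 8
    @State private var highContrast = false

    var body: some View {
        ScrollView {
            Text(text)
                .font(.system(size: fontSize, design: .serif))
                .lineSpacing(lineSpacing)
                .foregroundStyle(highContrast ? Color.black : Color.primary)
                .padding()
        }
        .background(highContrast ? Color.white : Color.clear)
    }
}
```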
Live Captions and Audio Tools Expand Across Devices
For people who are deaf or hard of hearing, transcribing spoken content in real time is vital. With iOS 19 and watchOS 12, Live Listen controls and Live Captions are now available on Apple Watch. An iPhone can act as a remote microphone, streaming audio to hearing aids or headphones, while the paired Apple Watch displays real-time captions—allowing users to follow conversations without needing to be tethered to their phone. The Apple Watch also serves as a remote control for Live Listen sessions, providing flexibility in meetings or classrooms.
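Live Captions itself exposes no third-party API, but Apple’s public Speech framework runs the same kind of on-device transcription pipeline. A minimal sketch, omitting the microphone and speech-recognition permission prompts a real app must request:

```swift
import AVFoundation
import Speech

// Analogue only: Live Captions is a system service with no third-party
// API. Permission prompts (microphone, speech recognition) are omitted.
final class CaptionEngine {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_US"))
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private let audioEngine = AVAudioEngine()

    func start(onCaption: @escaping (String) -> Void) throws {
        request.requiresOnDeviceRecognition = true  // keep audio on device

        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { [request] buffer, _ in
            request.append(buffer)
        }
        try audioEngine.start()

        recognizer?.recognitionTask(with: request) { result, _ in
            if let result {
                onCaption(result.bestTranscription.formattedString)
            }
        }
    }
}
```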
On the audio front, Background Sounds now offers more personalization, including EQ settings and automation options, helping users manage distractions or symptoms of tinnitus. Music Haptics on iPhone can be set to vibrate for entire songs or just the vocals, with adjustable intensity for a tailored sensory experience.
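Music Haptics is likewise a system setting rather than an API, but Core Haptics shows how an adjustable-intensity vibration is expressed in code. A minimal sketch, with the single transient beat an assumption made for brevity:

```swift
import CoreHaptics

// Analogue only: this is not the Music Haptics feature itself, just how
// Core Haptics expresses a vibration of adjustable strength.
func playBeat(intensity: Float) throws {
    let engine = try CHHapticEngine()
    try engine.start()

    // A single transient "tap" whose strength scales with `intensity`.
    let event = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: intensity),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
        ],
        relativeTime: 0)

    let pattern = try CHHapticPattern(events: [event], parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: 0)
}
```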
Vision Pro and CarPlay: Broader Accessibility Coverage
Apple Vision Pro’s visionOS receives upgrades that let users zoom in on their surroundings using the main camera. Live Recognition, powered by on-device machine learning, describes environments, locates objects, and reads documents out loud. Developers can tap into a new API for live, hands-free visual interpretation—making real-world navigation more accessible.
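Apple has not published details of that new visionOS API, but the long-standing Vision framework gives a sense of what on-device document reading looks like. A minimal sketch that recognizes text in a captured image:

```swift
import Vision

// Analogue only: the new visionOS camera API is not public; this uses the
// existing Vision framework to read text from a captured image.
func readDocument(_ image: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let lines = (request.results as? [VNRecognizedTextObservation])?
            .compactMap { $0.topCandidates(1).first?.string } ?? []
        completion(lines)
    }
    request.recognitionLevel = .accurate  // slower, higher-quality OCR

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```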
CarPlay adds support for Large Text, making dashboard information easier to read at a glance. Sound Recognition in CarPlay can now alert drivers and passengers to critical sounds like a crying baby, expanding on previous support for horns and sirens. These updates directly improve safety and usability for drivers and passengers with hearing or vision impairments.
Switch Control for Brain Computer Interfaces and Expanded Input Options
Apple is introducing a new protocol for Switch Control that works with Brain Computer Interfaces (BCIs). This technology allows users with severe mobility disabilities to operate their devices using neural signals—removing the need for any physical movement. Eye Tracking and Head Tracking improvements also offer more flexible ways to control devices, with options like dwell selection and reduced steps for keyboard input. QuickPath typing is now available for Eye Tracking and Vision Pro users, speeding up text entry.
Additional Improvements: Sharing, Sound Recognition, and More
Several other updates round out Apple’s accessibility overhaul:
- Share Accessibility Settings: Users can temporarily transfer their custom accessibility settings to another iPhone or iPad, simplifying device sharing in public or social settings.
- Sound Recognition: Now includes Name Recognition, alerting users when their name is called. This is especially useful in group environments or noisy settings; a sketch of the underlying classification pipeline follows this list.
- Assistive Access: Adds a simplified Apple TV app and new tools for developers to create tailored experiences for users with intellectual or developmental disabilities.
- Voice Control: Introduces a programming mode for Xcode, vocabulary syncing across devices, and expanded language support—including Korean, Turkish, Italian, and more.
- Live Captions: Language support now covers English (India, UK, Australia, Singapore), Mandarin, Cantonese, Spanish, French, Japanese, German, and Korean.
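Name Recognition itself has no public API, but Apple’s SoundAnalysis framework exposes the same kind of on-device classification pipeline behind Sound Recognition, with built-in classes that include sirens and crying babies. A minimal sketch, again omitting the microphone permission prompt:

```swift
import AVFoundation
import SoundAnalysis

// Analogue only: Name Recognition is a system feature. This uses Apple's
// built-in sound classifier to flag sounds in the microphone stream.
final class SoundAlerter: NSObject, SNResultsObserving {
    private let audioEngine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)

        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)

        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
            analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
        }
        try audioEngine.start()
        self.analyzer = analyzer
    }

    // Fires for each analyzed window with ranked sound classes.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        print("heard \(top.identifier) (confidence \(top.confidence))")
    }
}
```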
Together, the accessibility features in iOS 19 and macOS 16 represent a clear step forward in making Apple’s ecosystem more inclusive, offering practical solutions to real-world barriers for millions of users.