Latest news with #MagnifierforMac


Forbes
15-05-2025
Apple Goes Beyond 'Reasonable Accommodation' With Inclusive Tech
In most workplaces, and across nearly every college or university in the country, accessibility is built on the concept of reasonable accommodation. The idea is straightforward: if someone with a disability discloses their condition, they can request 'accommodations' such as modified work schedules, note-taking assistance, accessible course materials, or assistive technology, so long as those changes do not cause undue hardship. It is a legal standard. But it is also, in practice, a limited one. Because what 'reasonable accommodation' often assumes is that people must first identify themselves as different, advocate for their needs, and enter into a system where inclusion is conditional — not designed in, but granted.

Now imagine a different model. One that does not wait for a diagnosis or a request. One that empowers the user from the start, not through permission, but through design. That is what Apple has done. On this year's Global Accessibility Awareness Day, Apple announced a sweeping update to its accessibility features — including a new Magnifier for Mac, enhanced hearing tools like Live Listen and Live Captions, and expanded Braille and Vision Pro support. But beyond product updates, Apple sent a bigger signal: accessibility is not a favor or an exception. It is foundational.

In Apple's newly released video introducing Magnifier for Mac, we meet Sophie, a college student with low vision. She wears cool-looking frames that hold her strong prescription lenses, and she dresses stylishly, in a sweater with bold colors and wonderfully artsy jewelry that reflects her personality. The video places us in a college lecture hall. Her professor is guiding the class through the journey of Odysseus, focusing on his character arc: how he changes, adapts, and ultimately returns home transformed. It is a timeless story of growth and resilience, and Sophie is fully present for it.

Sophie uses her iPhone to zoom in on the chalkboard — not just to see, but to understand. Then she opens her Mac, and the same content appears on her screen. With Magnifier for Mac, she adjusts brightness, font properties, contrast, and zoom level, all in real time. What begins as visual clarity becomes cognitive clarity. She is not just improving how she sees. She is enhancing how she learns.

And that distinction is critical. This is no longer about accommodation. It is about personalization. Sophie's use of technology may have begun as a response to her low vision, but what she is really doing is tailoring her learning style to how she best processes information. And that is something all of us can relate to. Some people learn best visually, with charts, images, or organized notes. Others are auditory learners, for whom lectures, podcasts, and conversation help ideas stick. Still others learn through motion and touch. Some need simplicity. Some need stimulation. Some need silence. Others need tools that reduce fatigue or distraction. And most of us learn differently depending on the day.

What makes Apple's approach so powerful is that it does not ask users to prove or explain these needs. It simply equips them to shape their experience. Sophie is not using a magnifier because she has a problem. She is using it because it works for her. And that is true for anyone who wants to learn or work in ways that feel natural. Apple did not wait for Sophie, or anyone, to ask. Instead, it built tools into the system that everyone can access from the start.
This reflects a growing shift in how leaders think about accessibility — moving away from reactive, compliance-driven models toward proactive, inclusive innovation, the shift Robert Ludke explores in his book Case Studies in Disability-Driven Innovation. Apple's newly announced features are not reactive workarounds. They are built-in, empowering, and immediate.

Back in Sophie's classroom, her professor pauses and asks, 'Who has a strong opinion about this?' Sophie raises her hand. 'Yes, Sophie?' the professor responds. That moment — casual to most — says everything. She is not stuck waiting for notes, asking for adjustments, or explaining her tools. She is ready. She is part of the room.

And I feel that moment in my bones. In the past two days, I have watched Apple's Magnifier for Mac video more times than I can count. I kept coming back, not just for what it shows, but for what it stirred in me. I was that kid with profound hearing loss and noticeable hearing aids who struggled. I really struggled. Grade school was a battlefield of confusion, exhaustion, and isolation. There was no tech in the classroom, no captions, no personalization, and no way to participate on my own terms.

So when I saw Sophie move so freely — adjusting her tools, engaging without delay — I bawled my eyes out. She does not carry the silent weight of not seeing or not keeping up. She is learning, contributing, showing up. And in that moment, all the difficult memories from my own school years came rushing back. But they did not just haunt me; they reminded me how far we have come. I wish this had existed when I was growing up. But I am grateful to be here now, to witness what is possible and to tell the next generation: Today is different. Today is full of tools that do not ask you to fight for access; they just hand it to you.

Apple is not just releasing accessibility features. It is modeling what leadership looks like in tech, education, design, and human connection. Sophie is not an exception. She is the new baseline. She represents a generation of learners, employees, and customers who expect systems to meet them where they are, without friction or delay. For leaders, the question is no longer, 'What do we need to do to comply?' The question is, 'What can we do to empower?' Because accessibility done right is not about accommodation. It is about design. It is about strategy. And ultimately, it is about dignity.

Engadget
13-05-2025
Apple brings Magnifier to Macs and introduces a new Accessibility Reader mode
This Thursday is Global Accessibility Awareness Day (GAAD), and as has been its custom for the last few years, Apple's accessibility team is taking this time to share some new assistive features that will be coming to its ecosystem of products. In addition to bringing "Accessibility Nutrition Labels" to the App Store, it's announcing the new Magnifier for Mac, an Accessibility Reader and enhanced Braille Access, as well as a veritable cornucopia of other updates to existing tools. According to the company's press release, this year in particular marks "40 years of accessibility innovation at Apple." It's also 20 years since the company first launched its screen reader, and a significant number of this year's updates are designed to help those with vision impairments.

One of the most noteworthy is the arrival of Magnifier on Macs. The camera-based assistive feature has been available on iPhones and iPads since 2016, letting people point their phones at things around them and get auditory readouts of what's in the scene. Magnifier can also make hard-to-read things easier to see by giving you the option to increase brightness, zoom in, add color filters and adjust the perspective. With Magnifier for Mac, you can use any USB-connected camera or your iPhone (via Continuity Camera) to get feedback on things around you. In a video, Apple showed how a student in a large lecture hall was able to use their iPhone, attached to the top of their MacBook, to make out what was written on a distant whiteboard. Magnifier for Mac also works with Desk View, so you can use it to more easily read documents in front of you. Multiple live session windows will be available, so you can keep up with a presentation through your webcam while using Desk View to, say, read a textbook at the same time.

Magnifier for Mac also works with another new tool Apple is unveiling today — Accessibility Reader. It's a "new systemwide reading mode designed to make text easier to read for users with a wide range of disabilities, such as dyslexia or low vision." Accessibility Reader will be available on iPhones, iPads, Macs and the Apple Vision Pro, and it's pretty much the part of Magnifier that lets you customize your text, with "extensive options for font, color and spacing." It can help minimize distractions by getting rid of clutter, for instance. Accessibility Reader also supports Spoken Content, and as it's built into the Magnifier app, it can be used to make real-world text like signs or menus easier to read as well. You can also launch it from any app, as it's a mode available at the OS level.

For people who are most comfortable writing in Braille, Apple has supported Braille input for years, and more recently started working with Braille displays. This year, the company is bringing Braille Access to iPhones, iPads, Macs and Vision Pros, and it's designed to make taking notes in Braille easier. It will come with a dedicated app launcher that allows people to "open any app by typing with Braille Screen Input or a connected braille device." Braille Access also enables users to take notes in braille format and use Nemeth code for their math and science calculations. It can open files in the Braille Ready Format (BRF), so you can return to your existing documents from other devices. Finally, "an integrated form of Live Captions allows users to transcribe conversations in real time directly on braille displays."
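Magnifier for Mac itself isn't something third-party developers call into, but the camera plumbing described above is: treating an iPhone as just another Mac camera goes through public AVFoundation. As a rough illustration only (not Apple's Magnifier code, and the magnification and filtering layered on top is omitted), here is how a Mac app running macOS 14 or later, with camera permission granted, could discover a Continuity Camera iPhone and start pulling frames from it:

```swift
import AVFoundation

// Look specifically for an iPhone acting as a Continuity Camera.
// (.continuityCamera is an AVCaptureDevice.DeviceType added in macOS 14.)
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.continuityCamera],
    mediaType: .video,
    position: .unspecified
)

let session = AVCaptureSession()

if let iPhone = discovery.devices.first,
   let input = try? AVCaptureDeviceInput(device: iPhone),
   session.canAddInput(input) {
    session.addInput(input)
    // A preview layer or sample-buffer output would attach here;
    // a magnifier-style app would then scale, brighten and filter
    // these frames before display.
    session.startRunning()
}
```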
Wrapping up the vision-related updates is an expansion of such accessibility features in visionOS. The Zoom function, for instance, is getting enhanced to allow wearers to magnify what they see in both virtual reality and, well, actual reality. This uses the Vision Pro's cameras to see what's in your surroundings, and Apple will make a new API available that will "enable approved apps to access the main camera to provide live, person-to-person assistance for visual interpretation in apps like Be My Eyes." Finally, Live Recognition is coming to VoiceOver in the Vision Pro, using on-device machine learning to identify and describe things in your surroundings. It can also read flyers or invitations, for example, and tell you what's on them.

For those who have hearing loss, the Live Listen feature that's already on iPhones will be complemented by controls on the Apple Watch, plus some bonus features. When you start a Live Listen session on your iPhone, which streams what its microphone picks up to your connected AirPods, Beats headphones or compatible hearing aids, you'll soon be able to see Live Captions on your paired Apple Watch. You'll also get controls on your wrist, so you can start, stop or rewind a session. This means you can stay on your couch and start a Live Listen session without having to go all the way over to the kitchen to pick up your iPhone and hear what your partner might be saying while they're cooking. Live Listen also works with the hearing health and hearing aid features introduced on the AirPods Pro 2.

While we're on the topic of sound, Apple is updating its Background Sounds feature, which can help those with tinnitus by playing white noise (or other types of audio) to combat symptoms. Later this year, Background Sounds will offer automatic timers to stop after a set amount of time, automation actions in Shortcuts and a new EQ settings option to personalize the sounds.

Personal Voice, which helps those who are at risk of losing their voice preserve their vocal identity, is also getting a major improvement. When I tested the feature to write a tutorial on how to create your personal voice on your iPhone, I was shocked that it required the user to read out 150 phrases. Not only that, the system needed to percolate overnight to create the personal voice. With the upcoming update, Personal Voices can be generated in under a minute, with only 10 phrases needing to be recorded. The resulting voice also sounds smoother, with less clipping and fewer artifacts. Apple is also adding Spanish-language support for the US and Mexico.

Last year, Apple introduced eye tracking built into iPhones and iPads, as well as vehicle motion cues to alleviate car sickness. This year, it continues to improve those features by bringing the motion cues to Macs, as well as adding new ways to customize the onscreen dots. Meanwhile, eye tracking is getting an option to allow users to dwell or use a switch to confirm selections, among other keyboard typing updates.

Apple's ecosystem is so vast that it's almost impossible to list all the individual accessibility-related changes coming to all the products. I'll quickly shout out Head Tracking, which Apple says will enable people to more easily control their iPhones and iPads by moving their heads, "similar to Eye Tracking." Not much else was shared about this, though currently head tracking on iPhones and iPads is supported through connected devices. The idea that it would be "similar to Eye Tracking" seems to imply integrated support, but we don't know if that is true yet. I've asked Apple for more info and will update this piece with what I find out.
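Stepping back to Personal Voice for a moment: the faster, ten-phrase training flow lives in Apple's own Settings UI rather than in anything developers drive, but apps can already speak with an authorized Personal Voice through the AVSpeechSynthesizer API that shipped with iOS 17. A minimal sketch of that existing surface (the utterance text is just an example):

```swift
import AVFoundation

// Keep a strong reference; a synthesizer deallocated mid-utterance goes silent.
let synthesizer = AVSpeechSynthesizer()

// Ask the user's permission to use their Personal Voice (iOS 17+ API).
AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else { return }

    // Personal Voices appear alongside system voices, flagged by a voice trait.
    let personalVoices = AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.voiceTraits.contains(.isPersonalVoice) }
    guard let voice = personalVoices.first else { return }

    let utterance = AVSpeechUtterance(string: "Spoken in the user's own voice.")
    utterance.voice = voice
    synthesizer.speak(utterance)
}
```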
On the subject of connected devices, Apple is also adding a new protocol to Switch Control that would enable support for Brain Computer Interfaces (BCIs). Theoretically, that would mean brainwave-based control of your devices, and Apple lists iOS, iPadOS and visionOS as the platforms on deck to support the new protocol. Again, it's uncertain whether we can go as far as to say brainwave-based control is coming, and I've also asked Apple for more information on this.

For those who use Apple TV, Assistive Access is getting a new custom Apple TV app featuring a "simplified media player," while Music Haptics on the iPhone will offer the option to turn on haptics for an entire track or just the vocals, as well as general settings to fine-tune the intensity of taps, textures and vibrations. The Sound Recognition feature that alerts those who are deaf or hard of hearing to concerning sounds (like alarms or crying babies) will add Name Recognition to let users know when they are being called. Sound Recognition for CarPlay, in particular, will inform users when it identifies crying children (in addition to the existing support for external noises like horns and sirens). CarPlay will also get support for large text, which should make getting glanceable information easier.

Other updates include greater language support in Live Captions and Voice Control, as well as the ability to share accessibility settings quickly and temporarily across iPads and iPhones, so you can use a friend's device without having to painstakingly customize it to your needs. There are plenty more accessibility rollouts from Apple across its retail locations, Music playlists, Books, Podcasts, TV, News, Fitness+ and the App Store, mostly around greater representation and inclusion.

There isn't much by way of an exact release window for most of the new features and updates I've covered here, though such updates have usually shown up in the next releases of iOS, iPadOS, macOS and visionOS. We'll probably have to wait until the public rollout of iOS 19, iPadOS 19 and more to try these on our own, but for now, most of these seem potentially very helpful. And as always, it's good to see companies design inclusively and consider a wider range of needs.