by David Redmond
At WWDC 2023, Apple officially announced its next big product. The Apple Vision Pro is a mixed-reality headset, and it’s the first mainstream headset with a dedicated screen reader built in. Let’s talk about everything we know about Apple Vision Pro, and work out whether it will be any good for those with low or no vision.
What is Apple Vision Pro?
Vision Pro is a mixed-reality headset. It’s shaped like a bulky pair of swimming goggles and has screens and lenses on the inside. When you wear Vision Pro you see the real world through these screens, with a virtual overlay on top. You can control Vision Pro in three ways: with your eyes, with your hands, and with your voice. For those who are fully sighted, you simply look at an object, pinch your fingers, and it’s activated. Sounds simple. For those with low vision, though, it’s going to be different.
The Vision Pro has VoiceOver built in. When VoiceOver is on, you control it by pinching different fingers together: pinch one pair of fingers to move forward, another to move back, and another to activate. There are also multi-pinch gestures, such as double pinch, triple pinch, and triple pinch and hold. I’d be lying if I said that didn’t sound uncomfortable, but I haven’t tried it yet.
Zoom is also coming, and it looks like a big magnified window floating in your view. We’re also getting colour filters, bold text, text size customisation, and more. By the looks of it, Apple has really lived up to its reputation for making accessible products.
How might a blind person benefit from Vision Pro?
Ahead of WWDC, I was talking with someone online about the then-untitled mixed-reality headset. I pointed out that it could be the first headset of its kind with a screen reader built in. That prompted them to ask, if someone’s a VoiceOver user, how would they benefit from mixed reality? It’s a great question.
The Vision Pro is visual at its core. It’s literally got Vision in its name. But I do think the Vision Pro has the potential to change the game for those with sight loss. Imagine apps like Seeing AI, Be My Eyes, or Soundscape on Vision Pro. Just look at the text and it’s read to you. That’s super cool.
The headset also has eye-scanning tech built in for its Eye Tracking and Optic ID features. Imagine if some of that data could be turned into health data and passed over to the Health app on iPhone. You could use FaceTime on Vision Pro to communicate, and be seen as a realistic model of yourself, meaning you’ll never need to focus your camera again. And all of this is in a first-generation product. With more sensors and more apps, the possibilities for Vision Pro are truly endless.
When Apple Watch was announced, the idea of it replacing talking watches was a far-fetched dream, yet in many cases it has. Vision Pro might be about to do the same to specialised smart glasses, and that’s just the beginning. Yes, the Vision Pro is insanely expensive, but so are the products I mention above. The Apple Watch is more than a talking watch, and Vision Pro is more than OCR smart glasses.
What the future looks like for this tech isn’t clear, but I certainly do feel excited. Not just for Vision Pro itself, but for the future Apple Vision products that will build on it. It’s going to be a wild ride, and I’m so looking forward to it.