Are the Envision Glasses any good?

Image: Envision Glasses next to the heading “Labs Product Review”

by David Redmond

Before I begin, I want to make something very clear. As you’ll read, I had a very mixed experience with the Envision Glasses, but that’s no reflection on Envision more generally. Their app is, in my view, an under-appreciated rival to Seeing AI, and others may well have a much better experience with the Envision Glasses than I did; in fact, I know some have. This review is simply my opinion of the product, based on my experience using it at a specific point in time. The product may improve significantly down the road, but for now, this is my experience.

Out of the box

The glasses came in a case that feels really high quality: you can feel the shape of the word Envision at the top, and it all made for a good first impression.

It was when I took the glasses out of the box that things went downhill. Naturally, I wanted to turn the product on. I felt around, but I couldn’t find a power button. There was one easy-to-find button at the front which felt slightly tacky; I was scared to press it, as it felt like it would detach the frame from the body of the device. Seeking instructions, I went to the Envision app, which told me to look at the back of the frame.

I kept feeling around, and for a solid five minutes I couldn’t find the button. It was there, but goodness it was hard to find. The button is, in my view, not nearly distinctive or tactile enough, and is also hard to hold down for the required six seconds. Regardless, I found it and was now ready to turn on the device.

Except it didn’t turn on. Maybe the battery was dead? I plugged it in, but nothing. Not a sound. I was a bit unsure, as the app said it could take a minute to turn on, but it had been charging for more than a minute and there were no signs of life. I was getting frustrated at this point: I’d spent ages trying to find the button, and even when I did I wasn’t getting anywhere. I put the glasses aside for a bit to do some other work, and then, a chirp. We had life.

The battery had obviously been dead, but there was no indication that the glasses were even charging. This is an experience that really should be better in my view, but I’d put all that aside if the product was good. Let’s give it a go.

Pairing the product was fine, and the tutorials were well done. For the most part, the software sticks to flicks and double taps, which felt natural to me as a VoiceOver user, and the device, while sounding tinny, was easy to hear.

The Features

The device gives you a number of options in the main menu, all accessible through flicks and double taps. One thing I will say for the product is that it’s not short of features.

In the menu you have several items, all centred around a Home option. Below is the menu in full at the time of writing.

  • Help
  • Settings
  • Home
  • Instant text
  • Scan text
  • Call an ally
  • Call an Aira agent
  • Describe scene
  • Recognise cash
  • Find object
  • Detect light
  • More features

In the More features menu you’ve got, wait for it, more features.

  • Batch scan
  • Find people
  • Explore
  • Scan QR code
  • Detect colours

And there you have the full list of device features. It’s a good list, to be fair, but how well do they work? Let’s look at each individually.

Home

The Home option acts as a nice anchor point. When you wake the device up you’ll usually find yourself there, ready to flick left and right.

When you double tap you get a bit of general info such as the time, battery level, your Wi-Fi network, and so on. I would like things to be spoken a bit more naturally; the time, in particular, comes out feeling like a string of numbers rather than the time. Generally, though, it works as a good way to get quick info.

Maybe it would be cool to let users customise this information: keep the core stuff, but allow them to add things like the weather if they want. All in all, a good start.
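As a rough illustration of what I mean, and this is purely my own sketch rather than anything Envision has announced, the home summary could be built from a list of info providers the user toggles on and off:

```python
# My own sketch of a customisable home summary; not an Envision feature.
# Each provider is a function returning one spoken phrase.

from datetime import datetime

def time_provider() -> str:
    # Speak the time naturally rather than as a raw string of digits.
    return "It's " + datetime.now().strftime("%I:%M %p").lstrip("0")

def battery_provider() -> str:
    return "Battery at 80 percent"  # placeholder value for the sketch

def weather_provider() -> str:
    return "Cloudy, 14 degrees"  # an optional extra the user opts into

PROVIDERS = {
    "time": time_provider,
    "battery": battery_provider,
    "weather": weather_provider,
}

def home_summary(enabled: list[str]) -> str:
    """Build the spoken summary from the user's chosen providers."""
    return ". ".join(PROVIDERS[name]() for name in enabled)

# Core info only, or core info plus weather:
print(home_summary(["time", "battery"]))
print(home_summary(["time", "battery", "weather"]))
```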

Instant text

It does what it says on the tin. Need to know what’s on a box or label? No worries. I really like that this feature works without an internet connection, so it could be a handy aid while out shopping.

Scan text

This, for many people, is I think the strongest feature of the product. It gives good guidance, is very accurate, and is in my view nice and snappy. Personally, I found navigating within a document clunky, but documents can be exported, so I don’t mind that too much.

One of the really cool features this mode has is called Ask Envision. Basically, when you have text scanned, you can hold the button at the front of the device (the one that made me nervous about detaching the frame earlier) and ask Envision a question about the text you’ve scanned.

I was really excited about this feature, as I feel the general idea of being able to ask for specific info could be a major leap forward in assistive tech. When reality hits, though, you quickly realise that there’s more work needed.

Some things worked perfectly. The AI had no issues answering questions about letters, but utility bills confused it. “How much will this cost?” I asked excitedly. “I don’t know how much utility bills cost in your region,” came the reply. In my view the feature does have potential, but it needs more time before it’s reliably good.

Call an Ally

When I went to set up an Ally, a notification was sent to my phone, bringing me to the relevant section of the app. You can add someone using their email address, but they do need the Envision Ally app set up already. I would have expected it to send an invite email with instructions, but that’s not the end of the world.

Once I’d added my Ally, I was good to go. This, alongside Aira, is in my view the most useful feature of the product.

I added my boss, JP, as an Ally, so he was able to observe what the camera was seeing and guide me if required.

JP described both the audio and video as being really good, and in his view it was clearer than a Be My Eyes call. From my perspective it was also clear as I listened to JP, even when he switched to using his AirPods.

Overall it’s hard to fault this feature, and it’s in my view the best feature of the product. JP was able to read door signs with ease, and I think we were both impressed.

The only thing of note is that I’d like the Ally to get a better explanation if the Envision Glasses’ battery dies. The battery on my glasses died mid-call, and to JP it just looked like the call had disconnected. More context would be nice.

We didn’t get to test things in low light, but overall this feature felt quite nice to use. Unfortunately, as soon as you leave an area with Wi-Fi you’re out of luck, so using it out and about won’t really be possible.

Image: Laptop and headphones on a desk

Image: Closed door with a sign reading ‘Meeting Room’

Call an Aira agent

To use Aira on Envision, you first need to link your account. I personally liked how this worked. When you double tap the Aira option you’re instructed to go to the menu in the Aira Explorer app and find the Envision option, at which point you can display a QR code. I particularly like the small detail of being reminded on both devices to make sure screen curtain is off.

To scan the QR code I first tried using the scan QR code option. For whatever reason, either my brain or the device skipped the double-tap-to-scan option in Aira, but that’s more on me, I think. Once I figured out what I was doing, pairing was practically instant. I got a nice beep, and everything was well explained.

After this, things generally work a bit like calling an Ally. If you’re an Aira user you probably won’t be disappointed. I did find it was hard to get an agent at times; I’ve never run into this with Aira before, and it almost felt like glasses users were deprioritised compared to app users, but that’s pure speculation on my part.

Describe scene

Describe scene does what it says on the tin. In my experience, it gives you basic but mostly accurate scene descriptions.

It doesn’t give elaborate descriptions, and you’re certainly not going to get as much info as a sighted person, but if it’s purely contextual information you want, it does its job.

Recognise cash

It works, but it’s slow. I generally found the glasses weren’t very snappy at anything, and this was no exception.

I found myself having to flip the notes around and practically wave the money in front of my face before it recognised the cash, so compared to other solutions I wasn’t blown away.

To be fair to Envision, it does work, so if you find the likes of Seeing AI or Cash Reader tough then it might be an option; it’s just not very fast.

Find object

This was a bit of a disappointment for me. Objects which I thought the device should have no issue with turned out to be total non-runners. Speaking of runners, the footwear option was a total write-off, as was the bottles option. Doors were hit and miss, but more miss to be honest, while options like mobile phone gave equally poor results. Contrast didn’t seem to matter: a white teacup on a dark countertop went unfound, and the glasses also couldn’t find an iPhone on a perfectly contrasting table.

A notable exception to the bad results was the sink; it absolutely loved finding the sink. Chairs also seemed to work well, but there were lots of chairs around, so finding one can’t have been hard.

I think LiDAR could have improved things a lot here, as even when objects were found you’d no clue how far away they were, and that’s assuming it finds anything in the first place.

Overall, a bit of a disappointment, honestly.

Detect light

This seemed to work okay: the lower the tone, the darker the environment. If you’ve used this feature in Seeing AI you probably know what to expect here; it does what it says on the tin.

This really is a feature designed for those with no vision, and as such Mairead tried it for us. She found it better than Seeing AI in that the tone disappeared fully when there was no light. It’s simple yet effective, and certainly serves the target market well.
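To illustrate the idea, and this is purely my own sketch rather than anything from Envision, a light detector is essentially a mapping from brightness to pitch, with silence at zero light, the behaviour Mairead preferred:

```python
# Rough sketch of a brightness-to-tone mapping; not Envision's code.
# The frequency range is an illustrative guess.

def light_to_frequency(brightness: float,
                       min_freq: float = 200.0,
                       max_freq: float = 2000.0) -> float | None:
    """Map a brightness reading in [0.0, 1.0] to a tone frequency in Hz.

    Returns None for total darkness, so the tone disappears entirely
    when there is no light at all.
    """
    if brightness <= 0.0:
        return None  # silence in total darkness
    brightness = min(brightness, 1.0)
    # Lower tone for darker surroundings, higher tone for brighter ones.
    return min_freq + (max_freq - min_freq) * brightness

print(light_to_frequency(0.0))   # None: no tone at all
print(light_to_frequency(0.25))  # 650.0 Hz: a low-ish tone
print(light_to_frequency(1.0))   # 2000.0 Hz: a high tone
```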

Batch scan

This feature works the same as scan text. It works well and I had no issues. I think most of what I’d say on this feature has already been covered, but I can confirm that yes, it does what you’d expect.

Find people

Find people is one of those features that just doesn’t feel quite right to me. It feels half-baked.

To add a person to the face recognition list you need to take five photos of them. This felt a bit weird to me, as most people aren’t big fans of photos. Add the fact that it’s for facial recognition and it gets weirder and slightly awkward.

Imagine asking your friend from college if you can take five photos of them so they can be recognised by your fancy blind person glasses. Sorry now, but it felt weird.

Once you’ve done it, it’s much like the explore and find object features, except you’re finding people, not objects. In fact, finding people is a stretch. It really just finds faces in my experience. If Aunty Nora has her back to you it won’t tell you she’s there, or even that a person is there at all. Nora needs to actually be facing you to be recognised as even being a person, and I’d have expected Nora to say hi before the glasses find her.

For a blind person trying to find another blind person it might be useful, but it all felt a bit ropey to me. Finding people is certainly a real problem, don’t get me wrong, but I don’t think this is the solution.

Explore

I was very surprised to find I had a bed in my office, or so I was told. I don’t, as it happens, and as far as I’m concerned nothing in my office looks like a bed either. It seems that Explore uses the same engine as the object recognition discussed above, with the same flaws, just more pronounced.

It’s a shame, as this could really be a great feature with a bit of work. I really do like their use of the clock navigation style (“door at two o’clock” and so on), but the inaccuracies of this feature really negate its benefits in my view.
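For anyone curious how clock-style directions work, here’s a minimal sketch of the idea, my own illustration rather than Envision’s implementation, converting a horizontal angle from straight ahead into a clock position:

```python
# Rough sketch of clock-style direction announcements.
# My own illustration of the idea; not Envision's implementation.

def angle_to_clock(angle_degrees: float) -> int:
    """Convert a horizontal angle to a clock position.

    0 degrees is straight ahead (12 o'clock); positive angles are to
    the right, negative to the left. Each hour covers 30 degrees.
    """
    hour = round(angle_degrees / 30.0) % 12  # nearest hour, wrapped
    return 12 if hour == 0 else hour

def announce(label: str, angle_degrees: float) -> str:
    return f"{label} at {angle_to_clock(angle_degrees)} o'clock"

print(announce("door", 60.0))    # door at 2 o'clock
print(announce("chair", -30.0))  # chair at 11 o'clock
print(announce("sink", 0.0))     # sink at 12 o'clock
```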

Scan QR code

If you have text behind a QR code, this works well and is actually relatively snappy. It’s basic and functional, and in my view it does its job.

Detect colours

It’s hit and miss, but it works. It’s tough at times to know what the glasses are looking at, so it can be hard to tell whether it’s your shirt that’s grey or the wall in the background.

In fairness, though, I don’t know what more could be done on this front. Issues around light and focus are a struggle for every solution in this area in my experience, and Envision certainly isn’t a magic bullet that solves them.

It did accurately describe a teal T-shirt, though, so I’m relatively happy with this feature. Is it amazing? No, but it’s as good as any of its competitors, so I can’t knock it.

Overall impressions

In a way, I don’t know what I was expecting. The Envision Glasses are fine, I guess, but they’re nothing revolutionary. Everything the glasses can do, you can do on your phone, and often the phone experience is better.

There is an advantage to being able to do things with a free hand, though, which needs to be acknowledged. I’m a cane user myself, and I’m certainly familiar with the experience of holding the phone in one hand and the cane in the other, wishing for a third hand. For me, though, too many experiences are missing for me to embrace the glasses in that way.

Maybe if it had something like Soundscape built in, or could do something my phone can’t, like providing traffic light info, then I could see the device having more value.

As of now, it feels very first-gen to me. It’s got lots of features, but I think most need refinement. I’d say the focus should be on improving existing features, but it also needs new ones, to aid with navigation for example.

Personally, I do think the concept is good, but it needs a lot of work. It needs new sensors, maybe a second camera for better viewing angles, better AI, and new features. As I say, it just feels like a first-generation device.

It’s quite slow at times, and the battery life seemed quite poor. Even with all that said, I’d never write it off. The Apple Watch Series 1 was awful, and it took four or five generations to get to a solid point. Could the Envision Glasses be an amazing product after five years of revisions? Absolutely. But are they great in their current form? In my view, honestly speaking, no.

Conclusion

I really wanted to like the Envision Glasses more than I did. I was excited by their AI capabilities, and I’ve spoken before in the newsletter about how big a deal I think VR, AR, and wearables like this could be for those with sight loss.

What Envision shows me is that we’re not there yet. The foundations are in place: a good menu system, a relatively comfortable feel (in my view), and the basic feature set. But we’re still a long way from something groundbreaking. The foundation is there, but the building isn’t built yet.

I really hope it does get built, as I feel the concept has potential, but I couldn’t honestly recommend the product at this time. It’s not ready yet.