Rabbits aren’t for blind people, but they could be

When I heard about the rabbit, I wanted to be one of the first to own an Agent. I was never told it would be adapted for blind and visually impaired people, but I thought all devices these days had to have universal access. It stops talking when you need it to talk, and it is impossible for a blind person like me to set up on their own. There was no way I could enter my Wi-Fi password during setup without being able to hear what was on the screen.

When I heard about what the device could do, I was very excited to add it to the kit of adaptive tools I use to describe and navigate the world, like the Ray-Ban smart glasses and Be My AI. I still haven't been able to get it to work. I have had it for about three weeks now, and I'm very disappointed and fed up. Perhaps there is a setting that turns on some sort of VoiceOver-like function, but the device doesn't talk when it's in settings mode for some reason, so I don't know. I think it could be an amazing tool for blind and visually impaired people if more thought had been put into using it for that purpose.

Didn’t anybody try to set it up with their eyes closed? I think it was a missed opportunity.

Hey Darrin! Thank you so much for sharing your feedback. Because this is a brand-new device category running our own software, we don't have the advantage of accessibility features baked in, as we would if we were using an existing operating system that has been around for a long time.

That being said, I absolutely agree with you that we should work to include this, and I'll bring it up with the team. We are only a small team building something new, but I definitely think there is a lot of potential here for users with accessibility requirements, and that the form factor and voice-first paradigm give us somewhat of a head start.

I can’t make any promises about how quickly we can improve in this area, but it’s definitely something I will be discussing more with the team soon.


That’s all I can ask. I want access to that 360° camera so that it can describe the world around me in a similar way to Be My AI and Envision. It won’t be long before these devices are capable of narrating the world to me in real time. That kind of information is life-changing.


I agree it could be for blind people, but there are a lot of censorship issues. It has some kind of weird conservative Christian programming, and it will literally refuse to describe anything it is programmed to find offensive.

It’s a form of double blinding. It’s bad enough that I lost my vision, but now these AIs are determining what is appropriate or important enough to describe.


I find your will and determination very inspiring.

My mother was blind and elderly, and I was hoping a device like this could have given her more independence, but development is very slow and not as integrated with IoT as I had hoped. There is still time before I reach that age; I hope the r1 will fast-track development to improve the lives of all our parents (and our own).


Hi Atreides,
Indeed, the R1 is close to being a good IoT device for the blind, but it still needs development for good accessibility. There is a post in the suggestions chat if you have ideas for use cases or options to make the rabbit better for blind people.


I think if it achieves anything above your basic “Hey Google…”, that will be an indication of improvement with the r1 beyond its current Lv1.

Please also look here!