(Pocket-lint) – Show and Tell helps Echo Show and Alexa users who are visually impaired. The feature has been available in the US for over a year but is only now launching in the UK. It works on all Echo Show devices, since it relies on the device's camera.
If you’re blind or have low vision, it can be useful to have someone around to tell you, for instance, which box of cereal you’re holding before you pour yourself a bowl. The same goes for jars of food or clothes you’re packing for a trip.
That’s where the Show and Tell feature comes in handy.
To ask Alexa to identify an item, simply say: “Alexa, what am I holding?” or “Alexa, what’s in my hand?”
Amazon developed the feature based on user feedback and worked with the Vista Center for the Blind and Visually Impaired in California on research and development. It lets you hold an item in front of a supported Echo Show and ask Alexa to identify it. Alexa then serves up an answer, so you won’t have to rely on others for help.
“Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment,” explains Amazon.
How does Show and Tell work?
With Show and Tell, blind and visually impaired users can hold up an item to the Echo Show camera and ask Alexa for help identifying the item. This is possible thanks to advanced computer vision and machine learning technologies for object recognition, Amazon explained in a blog post.
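To make the flow concrete, here is a minimal sketch of how such a voice-plus-camera pipeline could be structured. This is purely illustrative: none of these function names are Amazon's real APIs, and the `classify` callable stands in for the computer-vision model Alexa actually uses.

```python
# Hypothetical sketch of a Show-and-Tell-style pipeline.
# The trigger phrases are the ones Amazon documents; everything
# else (function names, thresholds) is an assumption for illustration.

TRIGGER_PHRASES = {"what am i holding", "what's in my hand"}

def is_identify_request(utterance: str) -> bool:
    """Check whether a spoken request matches a Show and Tell trigger."""
    text = utterance.lower().rstrip("?").removeprefix("alexa, ")
    return text in TRIGGER_PHRASES

def identify(frame: bytes, classify) -> str:
    """Run a (stand-in) object-recognition model on a camera frame
    and phrase the answer the way a voice assistant might."""
    label, confidence = classify(frame)
    if confidence < 0.5:
        return "Sorry, I'm not sure what that is. Try holding it closer to the camera."
    return f"It looks like {label}."
```

In use, the assistant would route any matching utterance to `identify()` with the latest camera frame; the low-confidence branch mirrors how Alexa asks users to reposition an item it can't recognise.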
“It’s a tremendous help and a huge time saver because the Echo Show just sits on my counter, and I don’t have to go and find another tool or person to help me identify something. I can do it on my own by just asking Alexa,” said Vista Center Manager Stacie Grijalva.
Writing by Maggie Tillman. Editing by Dan Grabham.