Meta’s smart glasses can now describe what you’re seeing in more detail
Summary
In a recent feature marking Global Accessibility Awareness Day, Meta outlined two new tools aimed at assisting visually impaired users of its Ray-Ban Meta smart glasses.
The first feature uses Meta AI to give more detailed descriptions of a user’s surroundings on request, noting details such as whether the grass in a park has been recently manicured. It will be available in the US, Canada, the UK, Australia and Ireland, with more countries to follow.
The second feature, Call a Volunteer, will go live at the end of this month in all 18 countries where Meta AI is currently supported.
It connects users with a network of more than 8 million sighted volunteers who can assist with tasks such as following a recipe or locating an item on a shelf.
Both features can be activated by saying, “Hey Meta, Be My Eyes”.