Meta's smart glasses will soon provide detailed information regarding visual stimuli


The Ray-Ban Meta glasses are getting an upgrade to better serve the blind and low-vision community. The AI assistant will now provide "detailed responses" about what's in front of users. Meta says it'll kick in "when people ask about their environment." To get started, users just have to opt in via the Device Settings section of the Meta AI app.

The company shared a video of the tool in action in which a blind user asked Meta AI to describe a grassy area in a park. It quickly hopped into action and correctly pointed out a path, trees and a body of water in the distance. The AI assistant was also shown describing the contents of a kitchen.

I could see this being a fun add-on even for...

Source: https://www.engadget.com/ai/metas-smart-glasses-will-soon-provide-detailed-information-regarding-visual-stimuli-153046605.html?src=rss

