By: Nick Gambino

A pretty robust set of AI features is coming to Meta’s Ray-Ban smart glasses in April 2024, according to The New York Times. The new features will allow users to quickly identify things they’re looking at with a simple command.

“Hey Meta, look and tell me how much sugar is in this pack of gummies,” will get you an answer within a few seconds. Meta AI snaps a photo, signaled by the click of a digital shutter, and then gets to work sussing out an answer using its generative AI capabilities.

This neat little feature means you won’t have to bust out your phone every time you want a little information on what you’re looking at. It also bypasses the need to pick up and read the package. Meta AI in these Ray-Ban smart glasses goes further than just packaging info. You can look at an animal or a tree and ask what it is. You can even ask for a recipe based on the ingredients in front of you.

Just think of it as ChatGPT if it had access to your eyes. The “look” command is what tells it to analyze what you’re seeing. Of course, you can still say “Hey Meta” and ask a general question without invoking the camera on the glasses at all.

[Embedded Instagram post shared by Mark Zuckerberg (@zuck)]

Every photo snapped, along with the answers provided by the AI chatbot, is stored in the Meta View phone app for later reference. It’s a great way to turn your smart glasses into a kind of dictation device or “sentient” note taker: ask it questions and it keeps a record of what you were looking at, along with contextual answers that may come in handy later.

Now, this multimodal AI is still in beta, so it’s far from perfect. If you want to start using it, you’ll have to sign up and make it through a waitlist before you gain access.