Meta has announced smart glasses with a screen in the right lens, meaning you can read WhatsApp messages, look at a map or translate a conversation – all from the comfort of your face.
The company describes them as the world’s most advanced AI glasses, and this is the first time it has put a display in its smart Ray-Bans.
Mark Zuckerberg believes such hi-tech specs are the future of portable computing, telling the unveiling event they’re “the only form factor where you can let AI see what you see, hear what you hear”.
Released on 30 September for $799 (£587), the glasses are controlled using a neural band that wraps around the user’s wrist and monitors their hand movements.
A twist of the fingers will turn the glasses’ volume up or down or zoom the camera; two taps of the thumb to the forefinger will close the display; and soon, users will be able to write texts by drawing letters in the air.
“The amount of signals the band can detect is incredible – it has the fidelity to measure movement even before it’s visually perceptible,” said a Meta spokesperson.
The company says the glasses are “designed to help you look up and stay present”, a “technology that keeps you tuned in to the world around you, not distracted from it”.
But I tried the tech on at an event with Meta earlier this month and found the opposite.
I was so distracted by the display that during an interview with Ankit Brahmbhatt, director of product management at Meta, I realised I was watching a game I’d accidentally left on in my lens.
I confessed and asked Mr…