The world of personal AI assistants is crowded. No doubt you've seen or heard the cacophony of complaints around the recent launches of AI-powered gadgets that are meant to act as personal assistants while minimizing use of your smartphone. We've seen the reviewers trash the Humane AI Pin and the Teenage Engineering-designed Rabbit R1.
But even though these devices were highly panned, it doesn't seem to have dampened anyone's spirits about the promise of AI. Moreover, many of the reviewers I've watched say these devices leave them yearning for something like this that actually works and provides AI assistance without the hassle.
And there is a whole host of others vying for space on our lapels and wrists. But what if the future of AI assistance wasn't a gadget we wear, but an extension of what we already wear: our glasses? Hear me out on this: if you are going to interact with a device audibly, where do you want it? The answer, obviously, is close to our mouth.
If we want the computer-vision-powered AI to see what we see, the place to have it is close to our eyes. And if we want to get audible feedback from it, the placement would be as close as possible to our ears so the feedback can be somewhat discreet if we want it to be. And it would be preferable if the auditory feedback didn't require something in our ears that would block out the sound from the rest of our environment.
Having been a Google Glass Explorer and being interested in Artificial Intelligence, I bought a pair of Ray-Ban / Meta smart glasses. Unlike what I saw in the reviews of the other hardware AI assistants, the Meta AI glasses were a breeze to unbox and set up, and they worked perfectly right out of the box.
When I first received them, users had to request early access to the AI assistant, but it's now just part of the software that's installed on the glasses. Updates are all automatic, and the Meta View app handles the interconnection between the smartphone and the glasses; it is simple and straightforward. Yes, the AI has had a few stumbles and missteps, but it's spot-on 99.99 percent of the time.
Although I don't wear them all day long every day, I do use them daily for all kinds of experiments. One thing I am going to use them for is gathering one-minute videos at the next event I attend. Not shots of the crowd, the venue, or the aha moments that happen onstage, but rather up-close-and-personal one-minute interviews with the people participating in the event. Just a quick introduction of themselves and then their aha moment or key takeaway from the event.
The Ray-Ban / Meta smart glasses received the latest Llama 3 update. Meta's AI is now available via Facebook, Facebook Messenger, Instagram, Threads, WhatsApp, and the smart glasses. As far as societal impact goes, I think the Meta AI model will get uptake from the multitude of users on Meta's social platforms. This means Meta can have a huge influence with its AI among mainstream social media audiences. It's available in English in about 10 countries now through the apps and will have worldwide impact for AI.
To be technical about it, Meta's AI isn't just a large language model. Meta has also been training models visually with images and videos using V-JEPA, a new architecture for self-supervised learning. V-JEPA is the vision of Yann LeCun, Meta's Chief AI Scientist. His Video Joint Embedding Predictive Architecture (V-JEPA) model is a crucial step in advancing machine intelligence with a more grounded understanding of the world.
"V-JEPA is a step toward a more grounded understanding of the world so machines can achieve more generalized reasoning and planning," says LeCun, who proposed the original Joint Embedding Predictive Architectures (JEPA) in 2022. "Our goal is to build advanced machine intelligence that can learn more like humans do, forming internal models of the world around them to learn, adapt, and forge plans efficiently in the service of completing complex tasks."
How good is Meta AI? I have tried ChatGPT, Copilot, Gemini, and Zuckerberg's version of artificial intelligence. I have to say, the Ray-Ban / Meta AI glasses are pretty damn good at "looking" at a scene, assessing what they "see," and reporting that to the user, both verbally through the two small speakers mounted right above your ears and in the companion app, where one tab keeps a log of all your interactions with Meta AI.
Here are a couple of examples. I was out for an evening walk and asked the glasses to take a picture of a house that was getting new landscaping after just being renovated. I said, "Hey Meta, what am I looking at?" It replied that I was looking at a pile of stones and construction materials in front of a house, possibly indicating renovation or landscaping work.
Another time I was walking through an area where new fiber had just been laid. One of the sidewalks had been torn out to lay the cable, so a new form, complete with rebar and orange safety caps, was in place awaiting concrete to be poured. Again I queried, and the AI was well versed enough to tell me that I was looking at sidewalk formwork, which is used to pour concrete for a new sidewalk. I was sold after that one; its ability to identify what it was seeing was astounding.
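Meta hasn't disclosed how the glasses' "what am I looking at?" pipeline actually works, but you can get a feel for the scene-description step with an off-the-shelf image-captioning model. The model choice and file name below are just one illustrative way to do it, not what the glasses run.

```python
# Rough approximation of the "Hey Meta, what am I looking at?" step using an
# open image-captioning model (illustrative only -- not the glasses' pipeline).
from PIL import Image
from transformers import BlipProcessor, BlipForConditionalGeneration

processor = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
model = BlipForConditionalGeneration.from_pretrained("Salesforce/blip-image-captioning-base")

def describe_scene(image_path: str) -> str:
    image = Image.open(image_path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=40)
    return processor.decode(output_ids[0], skip_special_tokens=True)

# e.g. a photo of the torn-up sidewalk from the walk described above
print(describe_scene("sidewalk_formwork.jpg"))  # hypothetical file name
```

A production assistant layers a conversational model on top of a step like this so the answer comes back as a sentence rather than a bare caption, which is roughly what the Meta AI replies feel like in practice.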
Meta's AI smart glasses correctly identifying John Werner's Apple Luggable
So what does this all have to do with Fin+AI? I said this back when I had Google Glass: I see eyewear as the ultimate place for a virtual assistant. I see this, or something like this, as the ultimate SpeedPass through life. Having an assistant close to your eyes, ears, and mouth is the best possible placement for voice commands and replies, point-of-view computer vision, and unrestricted access for auditory input.
Are there any payment uses for it yet? No, but could you have the glasses look at a credit card to relay payment information? Absolutely. It could be the AI equivalent of Jumio's NetSwipe solution. Could voice be used to authenticate payments? One hundred percent it could. Could you use the glasses to read a QR code to order products and services? Yes, yes, and yes. Could NFC, GPS, and ultra-wideband chips be embedded in the headgear? My answer is, why aren't they already? Headwear that is both stylish and comfortable? What's cooler than a pair of Wayfarers? They helped make Casey Neistat a social media legend with his painted-up frames.
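None of these payment features exist on the glasses today, but the QR-code piece, at least, is trivial with off-the-shelf tools. Here is a minimal sketch using OpenCV on a captured frame; the file name and what you do with the decoded payload are hypothetical.

```python
# Minimal sketch: decode a QR code from a camera frame with OpenCV.
# The frame file name and the downstream payment step are hypothetical.
import cv2

def read_qr(frame_path: str):
    frame = cv2.imread(frame_path)
    if frame is None:
        raise FileNotFoundError(frame_path)
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(frame)
    return data or None  # empty string means no QR code was found

payload = read_qr("glasses_frame.jpg")
if payload:
    print(f"QR payload: {payload}")  # e.g. a checkout URL handed off to a payment flow
else:
    print("No QR code detected")
```

The harder parts, securely relaying card data and authenticating a payment by voice, are policy and security questions rather than computer-vision ones, which is exactly why this feels like fertile ground for fintech.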
Recently on the Vergecast, the whole crew, Nilay, Alex, and David, who had been clowning on the rickety offerings from Humane and Rabbit, all divulged that they loved the Ray-Ban / Meta offering and said they didn't use them enough. I believe it was Nilay who called it a silent hit. No one is running around like a wingnut the way they did with the Vision Pro; the Meta glasses are low-key cool and just kind of there when you need them. I'm glad I got the transition lenses; I can wear them comfortably inside and out, and I kind of forget I have access to the world's information resting right on the bridge of my nose.
Learn more about AI and Fintech at our upcoming Fin+AI 2024 Conference. It's being held October 2-4, 2024 at Le Meridien in Dania Beach, Florida. Register today and save $$$!
Use discount code EARLYBIRD for a $500 discount off the standard registration pricing. Visit https://www.finaiconference.com and become a part of the AI revolution in finance!