I tried Google's XR glasses, and they already beat my Meta Ray-Bans in 3 ways

Android XR glasses

Kerry Wan/ZDNET

Google presented a slew of new AI tools and features at I/O, dropping the term "Gemini" 95 times and "AI" 92 times. However, the best announcement of the entire show wasn't an AI feature at all. Rather, that title went to one of the two hardware products announced -- the Android XR glasses.

Also: I'm an AI expert, and these 8 announcements at Google I/O impressed me the most

For the first time, Google gave the public a look at its long-awaited smart glasses, which pack Gemini support, in-lens displays, speakers, cameras, and mics into the form factor of traditional glasses. I had the opportunity to wear them for five minutes, during which I went through a demo that used them to get visual Gemini assistance, take photos, and receive navigation directions.

As a Meta Ray-Bans user, I couldn't help but notice the similarities and differences between the two smart glasses -- and the features I now wish my Meta pair had.

1. In-lens display


The biggest difference between the Android XR glasses and the Meta Ray-Bans is the inclusion of an in-lens display. The Android XR glasses have a display that can surface genuinely useful information, including text.


The Meta Ray-Bans have no display at all, and although other smart glasses, such as the Halliday, do, interacting with them involves glancing up at an optical module mounted on the frame, which makes for a less natural experience. That module is also limited in what it can show because it isn't a full live display. The ability to see elements beyond text adds another dimension to the experience.

Also: I've tested the Meta Ray-Bans for months, and these 5 features still surprise me

For example, my favorite part of the demo was using the smart glasses to take a photo. After clicking the button on the top of the frame, I was able to take a photo the same way I do with the Meta Ray-Bans. The difference, however, was that once the image was captured, I could see the result in the display, in full color and in fairly sharp detail.

Although seeing the picture wasn't particularly helpful in itself, it gave me a glimpse of what it would feel like to have an overlay always built into your everyday glasses -- and all the possibilities that opens up.

2. Gemini assistance


Google has continuously improved its Gemini assistant by integrating its most advanced Gemini models, making it an increasingly capable and reliable AI assistant. While the "best" AI assistant ultimately comes down to personal preference and use case, in my experience Gemini has, over the years, surpassed Meta AI, the assistant that currently powers Meta's Ray-Ban smart glasses.

Also: Your Google Gemini assistant is getting 8 useful features -- here's the update log

My preference is based on several factors, including Gemini's more advanced tools, such as Deep Research, stronger code generation, and more nuanced conversational skills, where Gemini currently has an edge over Meta AI. Another notable difference concerns safety.

For example, Gemini has stricter guardrails around generating sensitive content, such as images of political figures, while Meta AI is looser. It's still unclear how many of Gemini's features will carry over to the smart glasses, but if the full experience makes it in, it would give the Android smart glasses a competitive advantage.

3. Lightweight form factor

Android XR glasses

Sabrina Ortiz/ZDNET

Although they don't look very different in their Wayfarer-style design, Google's XR glasses felt noticeably lighter than Meta's. As soon as I put them on, I was a bit shocked at how much lighter they were than I expected. A real comfort verdict would require wearing them for a whole day, and the weight could still change by the time the glasses reach production, but in this moment the lightness felt like a big win.

Also: The best smart glasses unveiled at I/O 2025 weren't made by Google

If the glasses can maintain their current lightweight design, it will be much easier to take full advantage of the AI assistance they offer in everyday life. You wouldn't have to sacrifice comfort, especially around the nose bridge and behind the ears, to wear them for longer stretches. Ultimately, these glasses act as a bridge between AI assistance and the physical world, and that connection only works if you're willing and able to wear them consistently.
