I’m addicted to my Meta glasses, but I can’t wait for augmented reality. I’m an Apple user so I don’t wanna switch to an Android phone, but something tells me there’ll be lots of competition out there.
What if you already have to wear glasses to see properly? Will you be able to get prescription versions with your eyesight correction built in? Are the developers in discussions with opticians and glasses manufacturers?
If google says the homosexual seasonal advertisement is a nominal portion of its campaign; why is it the only seasonal ad I have seen from google on my phone?
The data will be handy. Every app sells your data to half a dozen other data businesses. The hit man entering your house can know where you're sleeping and where you put the laptop they came for (besides you). It's so fortunate for this market that people just blindly click through terms of service, and everyone, including Google, looks at your data as something to sell.
I agree that privacy and data collection are going to be an even bigger issue with this step up in tech. Everything we do and say will be bought and sold. I hope we develop a privacy-focused open-source Linux version soon.
This is only going to get really good when you couple this technology with Neuralink. I think it's going to be weird if everybody walks around on the street with glasses, talking to themselves.
"Where did I put my level?" "I last saw the level on the console next to you." I wonder how far back its memory can go to analyze and give you a proper answer.
I'm gonna wait till they develop more software like filters, so you can just go around watching the world as if you are in a cartoon or a Van Gogh painting.
As a glasses wearer, I wouldn't wear ones as thick or as heavy as these. Why couldn't a level line be projected onto the wall using AR? I think that would have been more useful. I would feel uncomfortable talking to myself on the street; you couldn't tell the loonies and the Gemini users apart! What could it do for you if someone tried to rob you at knifepoint? Would it recognise a robbery taking place, and would it send images/video to an emergency contact and automatically call the police? Would it refuse to respond to the thief after it's been stolen, and could it decide to destroy itself automatically (power-surge its chips) so it has no value to the thief?
They show people wearing what appears to be regular spectacles, but what they’re going to deliver are things that look a lot like Apple Vision Pro. Does that mean the spectacles technology doesn’t really exist?
It exists in prototype form at Meta. They showed them working in a bunch of demos you can see on YouTube. But yeah, they aren't that thin and light yet. It's only a matter of time now.
If these don't detect when you get behind the wheel and then turn off the display automatically, we're gonna be seeing many more vehicular accidents and all our insurance rates will go up accordingly.
Very cool, but as someone who had to wear glasses since he was four years old I would not wear them all the time. At age 17 I very consciously opted for contact lenses. I understand the form, but I personally do not think this will be the ultimate form, but just a stage.
I will likely be a customer. I've wanted AR glasses for a long time now, and we are finally getting close to them being an actual product. Meta's looked very impressive, but the price tag would've been absurdly high. I'm hoping Samsung and Google can build a similar product for a much better price; if these are available under a grand, I'm a customer. What that tech can enable, with a top-notch AI in them, is nothing short of remarkable. I can now be a master carpenter, plumber, doctor, mechanic, etc. They............will.............sell!
For all-day glasses, I thought Even Realities G1 had the right idea, but there are still issues with the UX design. Too much info can be distracting; one person with the G1 even got into a car accident. And it's pretty lightweight in terms of information, requiring you to tilt your head to get info. Imagine it popping up in the middle of your field of view. Scrolling through longer text is doable but cumbersome. Also, the Android XR text display (in the demo) presents a very small amount of text. I wonder if that's even useful, and how it would work if you needed a longer output. Currently I've gotten used to my smartwatch for hands-free (phone-free) notifications, and it seems to serve the purpose. I don't know if smart glasses, especially for daily use, would make life easier or just add more distraction.
I personally wouldn't mind owning a pair of these smart glasses, but I won't be wearing them for the entire day. I also believe there are many people like me who wouldn't mind owning smart glasses, but wearing them all day long will be a problem. 😎💯💪🏾👍🏾
Neuralink is probably the final frontier, not glasses. I do think glasses are the next step, especially something like what Google and Meta are working on. I've heard that Meta won't have production Orion glasses at a price point for a decent-sized market for another 5 years or so, and I can't imagine Google will beat them to market. Though it's gonna be a while, I do think there will be significant demand for these things if they can get the pricing right. I have the Meta Ray-Bans and love them. They're great sunglasses, and for about $80 more you get some interesting tech, including Bluetooth speakers, direct access to Meta AI, and the ability to take pictures and videos.
Those demos are aspirational VFX projects… in other words, imaginary. The Meta Ray-Bans have natural-language, hands-free interaction with AI now, plus all the other stuff.
Privacy would definitely be the concern if voice control is the only or main interface. Voice as an interface also processes way too slowly. Humans love gesture control because it's much more intuitive and efficient; that's why the mouse and the touch screen are still our favourites today. Glasses will only replace the screen. I'm looking forward to more creative innovation on the control-interface side. Maybe Elon's idea of interacting with machines using the mind will be our destination.
I don't know. I really love the tech and innovation behind it, but I don't wear glasses, and I'd find it weird if even more people were talking to themselves...
The only problem with only having glasses is how to scan barcodes to pay. Well, I guess your glasses can scan, but how does the store scan your barcode from Venmo? I guess they will solve this
Lots of videos about these at the moment, but they won't catch on, and all honest reviewers know this. Google has to explore this tech, but until it's a contact lens, it's not going to succeed.
I think it will be an awesome thing to have, and I have high hopes for the new AI glasses. But I don't think this is the last we're going to see of AI itself. There is so much more to learn about generative AI. It's not just about the physical devices anymore; AI is going to be woven into a lot of things, including our everyday lives: video gaming, emerging tech, and other things.
As a blind person, this excites me. It puts me on equal footing in many ways as other people.
Not really you still can't appreciate all the urban decay or bad fashion choices
Great! Wait-
You know this comment confused me for a few seconds, then i remembered there's more than one kind of blindness.
DERP!
I know someone who is blind, who was so excited when Gemini 2.0 Live came out. He can just have his phone out and "see" through Gemini effortlessly. Imagine going on vacation in NYC and being able to see again through this assistant. Or browsing the web without all the hassle. Or sitting on a park bench where you can enjoy the view for the first time and see people passing by. It's incredible.
Anyone who says AI is just hype isn't getting that this tech is already changing people's lives in ways nothing else can.
@@drhxa it’s definitely gonna be something worthwhile. I’m in New York; I’m in the city essentially every other day for both life and professional reasons. Ultimately, it would be nice to look over and be told that a pretty woman is smiling at me, or that there’s a really interesting ice cream shop at the corner with extraordinary flavors, or that a shy little boy is trying to approach a horse in Central Park without it getting spooked or him being noticed. There are all these little intricacies in life that I know are happening around me that I wish I could be a part of, even if only for a second or two. Don’t get me wrong, it’d be really nice to see again (and that might be possible with gene therapies and other technologies that could fix my sight). But this could be something that gives me so much of those little things back.
The literal embodiment of Big Brother whether it's Google or Meta.
The chip in your head will be the final hardware.
And the Matrix - the final virtual environment.
@saint_ofc there is no need for chip, search for sentient world simulation
We will be the final hardware... batteries to feed the machines. I saw the movie.
Neuralink... was thinking the same thing. But never know, there might be something even more inconspicuous and seamless in the future. Perhaps atomic devices like nanobots that will be way more capable and invisible. So far the trajectory looks promising.
please come faster
Soon everyone will be talking to themselves.
But, they'll be looking up again!
😂
No different than now
Until the technology advances to the point where you don't actually need to speak
They already do
I know someone who is blind, who was so excited when Gemini 2.0 Live came out. He can just have his phone out and "see" through Gemini effortlessly. Imagine going on vacation in NYC and being able to see again through this assistant. Or browsing the web without all the hassle. Or sitting on a park bench where you can enjoy the view for the first time in 15 years and see people passing by. It's incredible.
Anyone who says AI is just hype isn't getting that this tech is already changing people's lives in ways nothing else can.
Visiting museums with these is a good use case: personal commentary at your own pace, with the ability to ask related questions and deep-dive into the history of a painting, an artefact, a sculpture, and so on.
Why would you need to if the fidelity of the figmentation was factual enough ?
What's cool is that we're seeing physically working prototypes - they actually work, and seem to work quite well. The catch is the cost: yes, we can build these, but not currently at a cost that anyone can afford (or would be willing to justify). That changes the equation from where we usually find ourselves; the challenge is to get the cost down, either by reducing the costs of the current prototype components or by finding acceptable alternatives at a lower cost (possibly with some compromises in capability). It will be interesting to see where this stands in 5 years.
Nailed it
The same thing was said about Tesla when the Roadster came out. Now Teslas pull up at just about every light around you; the costs came way down pretty fast, in car years. Maybe these glasses will see a price drop soon after early adoption.
I like it when they show prototypes like Meta does. I don't care if they look funny. But when they show videos like the lady walking around with the nice UI, as if we're already there, it sets you up for a big disappointment.
you can buy glasses today..
Are you kidding? Anyone with a 3D printer can use the Gemini API and about $80 worth of hardware to build their own.
Why need a level for a shelf? Shouldn't AI glasses be able to project a horizontal line onto the wall?
I'd be surprised if the glasses don't eventually have gravity and accelerometer sensors that are used with smartphone apps. Sensors are small enough right now to do this, so who knows, maybe we'll see it at release.
I think the point was to tell us that Gemini can record and remember everything.
Do you want it to be straight though?
Or can you envision virtual shelves with virtual objects?
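For what it's worth, projecting a level line wouldn't even need lidar; the IMU the glasses almost certainly carry already gives you the gravity vector. A minimal sketch, assuming a 3-axis accelerometer reading in the glasses' frame (axis conventions here are my own, not from any shipping SDK):

```python
import math

def roll_degrees(ax: float, ay: float, az: float) -> float:
    """Roll (tilt about the viewing axis) from a gravity reading.

    Assumes a frame where, with the head level, gravity reads
    (0, -g, 0); tilting the head rotates that reading, and the
    angle between the reading and straight-down is the roll.
    """
    return math.degrees(math.atan2(ax, -ay))

# A renderer would then counter-rotate the overlay line by -roll,
# so it stays horizontal in the world rather than on the display.
print(roll_degrees(0.0, -9.81, 0.0))   # level head -> 0.0
print(roll_degrees(9.81, 0.0, 0.0))    # head tilted 90 degrees -> 90.0
```

Anchoring the line to a fixed spot on the wall as the head moves is the harder part (that's where SLAM-style tracking comes in), but keeping it level is just this one angle.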
These prototype glasses have been coming out for decades; they'll only be truly usable when people replace their smartphones with them.
I don't think the smartphone will be replaced. But together they can be used to enhance whatever the use case is.
I like how smartphones are 17 years old but these glasses have been coming out for decades...lol
That's just R&D. Nearly everything, from new medications to automobiles to new tech, has been in development for decades (or at least aspects of those products have). For instance, early research into blue LEDs started in the 1970s, with successes in the 1980s, commercialization in the 1990s, and mainstream adoption in the 2000s. Figure a decade for basic research, a decade for translational research, a decade for applied work and early manufacture/early adopters, and a decade for implementation at scale. For major technologies, the cycle is fairly consistent, varying by perhaps a few years.
@@laartwork I remember playing with the Virtual Boy back in 1995.
@@laartwork do some research and let us know exactly. We won't be surprised though. But you….
@3:07 you state “the final form factor of artificial intelligence is going to be glasses”. I’m going to disagree… the final form factor of artificial intelligence is going to be neural implants. First as chips, later as nanotechnology.
I hope I live to see it.
9 out of 10 people don't want dangerous invasive apparatus poked through holes cut in their skull. Glasses will give us control over our privacy.
Quantum computing says otherwise, but we are all going to be long gone before that
Could just as well say genetic modification; "nanotechnology" doesn't mean much when chips are already at sub-micron scale.
@@mik3lang3lo your response makes me curious about what you mean by that argument; do you mind elaborating?
I also don't want to wear glasses all day. They would be useful for navigating around unfamiliar streets or providing stats at sporting events but I enjoy being glasses free 99% of the time. The other big factor is cost.
We are that much closer to robot companions.
It seems to me we are SO close. Take Gemini 2.0 (or anything similar), add one of the multiple humanoid robots that are coming out, and improve their technique with RL and IL, just like Tesla is improving its FSD. We aren't getting them tomorrow, but I think we can see the way. AI isn't solved, but we don't need AGI to get extremely useful assistants. I honestly can't imagine the world not dramatically changing in the coming years. Sorry, I'm hyped lol.
OMG! This is just beginning and blowing my mind. Very interesting information.
The final form factor for humans enhanced with AI will be brain-to-microchip interaction and communication, be it internal, like Neuralink, or external, like how other companies are doing it.
Who needs cameras when you can tap directly into the visual cortex? Borg hive mind in 3..2..1..
@@Justin_Arut 😂 Let's hear what Locutus has to say about this.
The Expanse TV show depicted this well (the stowaway spy).
Enter the booooooorg-ah!
internal obviously
What if you already use sight glasses
With the meta ones you can get prescription lenses
@@michaelnobbs5028 great
Raise your hand if you do not TRUST Google!
"with glasses you'll feel like you're a local"
um, no, you won't, lol
This is just the beginning. Remember, these modalities are there to collect data for future AI training; that's the only reason for them. This is a step towards "AGI," as people love to call it. The question is when to stop training them. That's the million-dollar safety question.
Ah, advertiser’s dream: now you can buy from google not only your browser history and files, but your actual field of view.
glassholes… glassholes everywhere
That was because of the look of the camera. Just like when wearing a Bluetooth headset was the worst... and now everyone does it.
@ … and recording everything except now we also plug it into AI…
And twats like you thinking you are cool for putting first adopters down... When we all know your punk ass will be buying them in just a few years.
So basically now
@@TurdFergusen you've been recorded in public for many, many years now. Shopping centres, airports, bars, people's phones, the list is endless.
Great summary. No light in the background! Well done. I'm happy.
Can't wait to get adverts constantly delivered directly to my retina whilst I'm walking around.
And just like with audio on a TV, they'll crank up the brightness and contrast for longer persistence of vision, to make sure you have to think about their ad for a longer time.
They will make every building full of ads, you will barely see where you're going..
I’m blind and this technology is life-changing. I love my Meta glasses; I’m able to read all kinds of things and get descriptions of the world around me. But I can only really take a picture and ask it to describe it. I can’t say, “Tell me when my taxi has arrived,” for example. It sounds like I would be able to do that with the Google glasses. The Meta glasses also will not describe people, and it sounds like the Google glasses will do this, and that they will recognize people too. If you manage to get a pair, can you test these two use cases?
I think glasses are absolutely the next step, but battery life will be a huge blocker for uptake. If someone has to carry around a charging case to pop them on and off throughout the day, no one's going to want to do that.
The next form will be eye contacts
Followed by intraocular lens implants
For language translation, it's a win.
Using a thin metallic reflective layer on the inner side of the glasses, a laser projects an image onto the retina at human-retina resolution. A camera tracks each eye's position for a seamless, stable, wide-view experience.
As a glasses wearer, can’t wait!!
They'll need to look more like regular glasses and work with prescription lenses. Then again, I've been wanting LASIK for years anyway.
@@Justin_Arut Before Google bought Focals by North and shut them down several years ago, they were getting ready for pre-orders of their second generation, and those could use prescription lenses.
Fingers crossed
As a no glasses wearer, cant wait!
Many people wear glasses on the daily. I can wear sunglasses all day if I need to, and I often wear blue-light-blocking glasses in the evenings or when I'm in front of screens for longer periods. Wearing AI glasses would be no problem at all for me, and I believe it's the same for most people.
i hope they'll let you swap out the model, and i hope they'll make the base open source, like they have with android, so we can have alternatives like lineageos
Yes, big time! I’m ready to wear them all day, every day, coupled with Bluetooth earbuds or whatever they are. I’ve been ready for years. Bring them on!
And by the way, greetings from Taiwan. 🇹🇼🇹🇼🇹🇼🇹🇼🇹🇼🇹🇼🇹🇼🇹🇼
Do you wear your smart watch all day long?
Mobile dev here. The new Google XR SDK is very well written (actually a near copy of visionOS), so it will be easy to get all Android apps running on it in 2D, and the 3D extensions will make it easy to add 3D features, e.g. a 3D solar system in an astronomy app. If they can beat Apple on weight and on obsessive privacy with cameras (allow scanning of buildings and rooms, block faces), and add GPS and Google Maps, Apple might be in trouble. If they can get the gamers off Quest, they might win, though splitting the Android XR ecosystem between Google and Meta is a win for Apple. Xreal so far is winning in on-face entertainment; their latest Xreal One solves most of the on-face problems, though they are still more a hardware company than a software company. I wish a good software company would buy Xreal :)
Cameras, projectors, mics, and speakers are fundamental components for human interaction with technology. The form factor, whether glasses or another wearable device, will depend on advances in hardware, and it will be an exciting area of innovation over the next few years.
1. Classify the image data in your phone gallery by person (or download leaked government data from the dark web 😂)
2. Train a model for face recognition
3. Deploy it in the cloud
4. Connect that cloud app to the AI glasses, and now you have your own James Bond glasses that recognise each person.
I could see using the glasses in specific situations, such as being in a town I'm not familiar with, touring a museum, walking a university campus, etc. For me, I'd prefer AI and my AI agent to be on my phone in my pocket.
What happens if you wear prescription glasses? Does all of this work in conjunction with that?
Definitely! There are plenty of vegan pizza options out there. Most places now offer vegan cheese, and you can load up your pizza with delicious toppings like mushrooms, bell peppers, onions, olives, spinach, and artichokes. Some spots even have specialty vegan pizzas with plant-based meats or creative sauces. If you're making it at home, there are tons of great vegan cheese brands to try, or you can skip the cheese altogether and focus on flavorful veggies and herbs. What kind of vegan pizza are you craving?
Meta Orion and this will finally get mixed reality to the masses 😊
Wouldn’t you have to be online to get the feed? So how do you power them all day?
I've worn glasses since I was 11 years old and have no problem with Gemini as Eyewear. I'll be pre-ordering a pair as soon as possible!
Yes, I'd wear glasses all day long if they weren't so bulky and looked like regular glasses. I'm a pool tech, so I'm outside every day, and I have a route to follow. I usually work alone, so I can listen to music or podcasts all day. This technology would enhance my day-to-day routines, and create a fun working environment.
I'll wear the Raspberry Pi models when they come out, I'd only pay 30 quid for them though
Yeah we need an open source and private version
I'm intrigued by AI-powered glasses, but I'm not sure I want to wear them all day. Can't wait to test them out and see how they change the way we interact with information.
The glasses won't have AI in them. Most likely they will outsource the heavy lifting first to your phone and then to the cloud. So, all the glasses have to do is basically power the sensors and display stuff into your eyeballs. Which is impressive already. But the glasses are probably totally agnostic to whether they will be used for AI or just displaying a screen saver :).
I am definitely interested in multi-input/output glasses. It would revolutionize the field service industry.
So these people who never created anything non-digital didn't think of putting a visual level along with geospatial anchoring so it can draw a shelf outline on the wall? I guess that would require some LiDAR.
When Glass was available, there were barriers with the tools to develop apps. Hopefully, better tools will be available for less sophisticated developers this time, and battery life and privacy and security rules won't be a roadblock.
Seems super useful, but remains as horrifying a surveillance technology as it ever was, and people shouldn't forget that.
You know they will also be collecting data on what you ask. So will you need a separate data plan for the glasses to be on the net?
There is the option to not take part in that.... have you not been doing that with your phone?
Why are you on YouTube then? These comments are used in training LLMs.
People are not ready for this. This type of technology will be very, very frightening to many people.
They should make a La Forge VISOR
Google maps already does most of that, there was no Gemini tech in that Maps demo.
Similar to reading glasses, I would only wear the XR glasses when I needed them.
But I would like to have them.
This is the next step for AI
It will feel so natural to use.
"Episodic" will be the first use case. I can see mechanics, travelers, medical staff, and gamers using it.
Can't wait till they're available. I'm in the market for my first VR endeavour and have been overwhelmed by all the choices lately haha
Finally I will find my keys on the spot.
I’m addicted to my Meta glasses, but I can’t wait for augmented reality. I’m an Apple user, so I don’t wanna switch to an Android phone, but something tells me there’ll be lots of competition out there.
What if you already have to wear glasses to see properly? Will you be able to get prescription versions with your eyesight correction built in? Are the developers in discussions with opticians and glasses manufacturers?
If google says the homosexual seasonal advertisement is a nominal portion of its campaign; why is it the only seasonal ad I have seen from google on my phone?
The data will be handy. Every app sells your data to six-ish other data businesses. The hit man entering your house can know where you are sleeping and where you put the laptop they came for (besides you). It's so fortunate for this market that people just blindly click through terms of service, and everyone, including Google, just looks at your data as something for them to sell.
Give it two months and there will be an open source solution... and likely the hardware for the open source option will be cheaper too!
I agree that privacy and data collection are going to be an even greater issue with this step up in tech. Everything we do and say will be bought and sold. I hope we develop a privacy-focused open-source Linux version soon.
I just wish we would get paid for all that data
This is only going to get really good when you couple this technology with Neuralink. I think it's going to be weird if everybody walks around on the street with glasses talking to themselves.
"Where did I put my level?" "I saw the level last on the console next to you."
I wonder how far its memory can go back to analyze and give you a proper answer.
Are we able to get the glasses with prescription lenses?
I saw the level last on the console next to you - is this overselling or false advertising from Google again?
The old videos made me think, I hope we don’t lose emotion because at some point losing a loved one is not going to be the same
Not for someone normal
so what
This is a reminder, live and experience the moment. Don't get too attached to gadgets.
That would pretty much be the end of social media, including YouTube, though.
I already have to wear prescription glasses throughout the day, so if the bonus of having AI added to them is there, I am totally down for it!!! All love from BR
I prefer to have AI vision turned off when I don’t need it, like OpenAI has done in their Advanced Voice project.
I'm most excited for this technology; it looks so useful for everyday life.
This better not be an ecosystem. That's basically the entire reason the Apple Vision Pro failed: that and its high price.
I'm gonna wait till they develop more software like filters, so you can just go around watching the world as if you are in a cartoon or a Van Gogh painting.
'Small form factor glasses'?
Dude. They make you look like Milhouse from The Simpsons.
As a glasses wearer, I wouldn't wear ones as thick or as heavy as these ones. Why couldn't a level line be projected using AR onto the wall? I think that would have been more useful.
I would feel uncomfortable on the street talking to myself. You couldn't tell the loonies and Gemini AI users apart! What could it do for you if someone tried to rob you at knifepoint? Would it recognise a robbery taking place, and would it send images/video to an emergency contact and automatically call the police? Would it refuse to respond to the thief after it's been stolen, and could it decide to automatically destroy itself (power-surge its chips) so it has no value to the thief?
How would this work for people who already wear glasses?
shelf adjustment level should be shown in glasses!
They show people wearing what appears to be regular spectacles, but what they’re going to deliver are things that look a lot like Apple Vision Pro. Does that mean the spectacles technology doesn’t really exist?
Yeah, it doesn't exist.
It exists in prototype form at Meta. They showed them working in a bunch of demos you can see on YouTube. But yeah, they aren't that thin and light yet. But it's only a matter of time now.
If these don't detect when you get behind the wheel and then turn off the display automatically, we're gonna be seeing many more vehicular accidents and all our insurance rates will go up accordingly.
'a little bit thicker' lol but seriously can't wait
Very cool, but as someone who had to wear glasses since he was four years old I would not wear them all the time. At age 17 I very consciously opted for contact lenses. I understand the form, but I personally do not think this will be the ultimate form, but just a stage.
Next step will be contact lenses for sure.
Will they be available in prescription lenses?!
I will likely be a customer. I've wanted AR glasses for a long time now, and we are finally getting close to them being an actual product. Meta's looked very impressive, but the price tag would've been absurdly high. I'm hoping Samsung & Google can build a similar product, but for a much better price. If these are available under a grand, I'm a customer. What that tech can enable, with a top-notch AI in them, is nothing short of remarkable. I can now be a master carpenter, plumber, doctor, mechanic, etc.
They... will... sell!
I'd wear the glasses a lot but maybe not all day. I would like to try a pair though.
Yeah, I figure glasses are it... though I can see them becoming even less intrusive than glasses eventually. Some sort of thin headband, perhaps.
In 10 years from now it'll be the Google A-Eye Contacts
For all-day glasses, I thought Even Realities G1 had the right idea. But there are still issues with UX design. Too much info can be distracting; one person with the G1 even got into a car accident. And it's pretty lightweight in terms of information, requiring one to tilt one's head to get info. Imagine it popping up in the middle of one's field of view. Scrolling through longer text is doable, but more cumbersome.
Also, the Android XR text display (in the demo) presents a very small amount of text. I wonder if this is even useful, and how it would work if one needs longer output. Currently, I've gotten used to my smartwatch for hands-free (phone-free) notifications. It seems to serve the purpose. I don't know if smart glasses (esp. for daily use) would make life easier, or add more of a distraction.
I personally wouldn't mind owning one of those smart glasses, but I won't be wearing them the entire day. I also believe there are many people like me who wouldn't mind owning smart glasses, but wearing them all day long will be a problem. 😎💯💪🏾👍🏾
I want to use it while learning languages, for electronics, and for fixing my IKEA furniture. Question: what's the battery life of these glasses?
Neuralink is probably the final frontier, not glasses. I do think glasses are the next step, especially something like what Google and Meta are working on. I've heard that Meta won't have production Orion glasses at a price point for a decent-sized market for another 5 years or so. I can't imagine Google will beat them to market. Though it's gonna be a while, I do think there will be some significant demand for these things if they can get the pricing right. I have the Meta Ray-Bans and love them. They're great sunglasses, and for about $80 more you get some interesting tech that includes Bluetooth speakers, direct access to Meta AI, and the ability to take pictures and videos.
Those demos are aspirational VFX projects… in other words, imaginary. The Meta Ray-Bans have natural-language, hands-free interaction with AI now, plus all the other stuff.
Privacy would definitely be the concern if voice control is the only or main interface. Voice as an interface also processes way too slowly. Humans love gesture control, as it’s much more intuitive and efficient for them; that’s why the mouse and touch screen are still our favourites today.
Glasses will only replace the screen. I am looking forward to more creative innovation on the control-interface side. Maybe Elon’s idea of how we interact with machines using the “mind” will be our destination.
I don't know. I really love the tech and innovation behind it, but I don't wear glasses, and I'd find it weird if even more people were talking to themselves...
Where can I buy them?
Captions in real life are gonna be lit!!
Damn and I have developed serious nerd neck for no reason now?!
The only problem with only having glasses is how to scan barcodes to pay.
Well, I guess your glasses can scan, but how does the store scan your barcode from Venmo?
I guess they will solve this
Lots of videos about these at the moment, but won’t catch on and all honest reviewers know this.
Google has to explore this tech, but until it’s a contact lens, it’s not going to succeed
I think it will be an awesome thing to have. I have very good thoughts about the new AI glasses, although I don't think this is the last we are going to see of AI itself. There is so much more to learn about generative AI. It's not just about the physical devices anymore; AI is going to be built into a lot of things in our everyday lives, like video gaming, emergency tech lines, and other things.
After some testing, Gemini 2.0 Flash's live-video mode has poor speech recognition, certainly compared to the same mode in ChatGPT.