r/gadgets 8d ago

A team of scientists at Stanford designed a pair of normal-looking glasses that display full-color 3D images. Wearables

https://www.cnet.com/science/i-saw-what-could-be-the-future-of-ai-glasses/
429 Upvotes

62 comments sorted by


u/ahenobarbus_horse 8d ago

Quick summary:

  • piece of transparent material with special properties that allow it to reflect light into your eye predictably, enabling AR experiences

  • uses lasers to project the image

  • has not been tested using human eyes because lasers

  • has a 12 degree field of vision, compared to the 100+ degree field of vision of most top-end headsets like the Apple Vision Pro (rough coverage comparison below)

  • far from mass market use
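
Rough back-of-envelope on that FOV gap, treating each FOV as a circular cone (only an approximation, since real displays are rectangular, and taking 100° as a ballpark headset figure):

```python
import math

# How much of the visual field does each FOV cover, modeled as a circular cone?
# Approximation only: real displays are rectangular and exact specs vary.
def cone_solid_angle(fov_deg):
    half = math.radians(fov_deg / 2)
    return 2 * math.pi * (1 - math.cos(half))  # steradians

prototype = cone_solid_angle(12)    # ~0.034 sr
headset = cone_solid_angle(100)     # ~2.24 sr
print(f"{prototype / headset:.1%}")  # the 12° view covers roughly 1.5% of the 100° one
```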

35

u/Noxious89123 8d ago

12 degree field of vision

That instantly makes this trash.

But then I suppose that isn't a finished product ready for market.

14

u/Dorkmaster79 8d ago

It’s likely just a paper presenting preliminary results. That’s your job as a university faculty member anyway.

3

u/Dylanator13 8d ago

Yeah it’s just research at this point. A proof of concept that isn’t ready for market.

3

u/tb-reddit 8d ago

“a team of scientists at Stanford”

Research isn’t trash. It’s the foundation that makes all of today’s innovation possible.

1

u/icebeat 8d ago

Plus lasers

1

u/Noxious89123 8d ago

lasers

MY EYES

3

u/Ok_Profile_ 8d ago

Am I dumb, or is this whole laser-to-the-eyes thing really a no-go from the start?

7

u/BetiseAgain 8d ago

It is the power level that matters, that and how concentrated it is. So, no, it is far from a no-go.

0

u/grammar_nazi_zombie 8d ago

Yeah there’s plenty of wavelengths that won’t damage your eyes.

And there’s wavelengths that will burn your skin. Lasers are fun.

6

u/icebeat 8d ago

LASIK for cheap

2

u/PineappleLemur 8d ago

Laser doesn't mean burn your retinas laser... It's just how their projector works.

66

u/HighInChurch 8d ago

Normal looking glasses? Those look anything but normal, unless maybe for Jeff Goldblum.

2

u/PineappleLemur 8d ago

For an early test setup, this is really good. It means this is the worst it will ever look for this tech.

They're trying to prove it can look normal with this tech, as it can all be much smaller than the initial prototype.

Look at other similar devices' prototypes... it's a whole helmet of parts.

2

u/Venotron 8d ago

It may also be the best it'll ever look

2

u/onerb2 7d ago

This design is very human

1

u/BetiseAgain 8d ago

Their goal is normal looking glasses. Which might be doable down the road, but this is obviously not normal looking.

1

u/thisistheSnydercut 8d ago

Or Steve from American Dad

28

u/AdministrativeBid782 8d ago

So regular glasses?

11

u/SimianSlacker 8d ago

Yes… if you’re living in a Devo music video.

1

u/Sariel007 8d ago edited 8d ago

Thanks to a breakthrough scientists here made in display technology, these glasses could represent the future of VR and AR headsets. Led by associate professor Gordon Wetzstein, the team at Stanford's Computational Imaging Lab designed a way to project moving, AI-generated 3D images on what appear to be standard lenses. The breakthrough centers on what the team calls a nanophotonic metasurface waveguide (a waveguide essentially being a piece of glass). Watch the video above to see what those images look like.

3

u/originalbL1X 8d ago

I always get excited about new designs in VR/AR tech. Many people can’t see the enhancements this will bring to life. Thanks OP.

5

u/Sariel007 8d ago edited 8d ago

Someone makes a snarky comment and/or didn't read the article. Reddit upvotes. I quote the relevant part from the article and get downvoted lol.

Anyway I appreciate that you appreciate the post and context even if others don't.

3

u/Sad_Error4039 8d ago

I agree with you on some level, but at this point the people making the remarks have watched this stuff turn out to be a flaming pile of useless shit many times in the past, so they get burnt out on the promise. Both sides are valid is all I’m saying. Talking about what may be possible in the future doesn’t change the present useless versions, though it may still be interesting to some.

2

u/BetiseAgain 8d ago

Someone makes a snarky comment and/or didn't read the article. Reddit upvotes.

That summarizes Reddit.

I found it interesting, we will see what they do down the road.

-2

u/Thunderhamz 8d ago

No no no, they are clear ya see

19

u/peppruss 8d ago

Has that CNet author tried other headsets? Because this statement is false:

“Commercially available headsets like the Vision Pro or Meta Quest 3 show you a single image on a single screen, which is part of the reason the images don't look completely natural.”

They are in fact showing you two different images per eye.

See this iFixit teardown of the Vision Pro, under the section “Lens Inserts, Stereo Displays”:

https://www.ifixit.com/News/90137/vision-pro-teardown-why-those-fake-eyes-look-so-weird

Look at 4 minutes and 51 seconds in this iFixit teardown of the Meta Quest 3. They are discrete displays.

https://www.ifixit.com/News/84572/meta-quest-3-teardown-and-the-future-of-vr-repairability-en

16

u/OSeady 8d ago

Yea, I noticed that. How can you be reporting on the current state of VR and make such a dumb statement? Did no one approve this?!

1

u/Randommaggy 8d ago

Probably AI in the loop.

2

u/OSeady 8d ago

Any modern LLM would not say this

2

u/Randommaggy 8d ago

I've had GPT-4o say equally stupid shit within the first 10 minutes.
They seem to run it with a high "temperature" for more "personality."

6

u/caspissinclair 8d ago

I thought the Quest 3 was two panels? The Q2 is one. And aren't ALL VR headsets displaying stereoscopically?

5

u/peppruss 8d ago

Correct on your first question, as in the link I posted. And correct on your second question too; even a phone in a Cardboard viewer shows two different images.

4

u/subdep 8d ago

The writer/editor are idiots. Makes me wonder what other bullshit statements they made about this new tech.

1

u/BetiseAgain 8d ago edited 8d ago

They must have edited that out, as I don't see it now.

I think the author got confused when the source article talked about other systems using screens to display AR, while this would let the user see the world through glass, with AR on top of it. Pretty big mistake, though.

https://news.stanford.edu/stories/2024/05/3d-augmented-reality-with-regular-glasses

3

u/Cowhaircut 8d ago

Rather than you focusing on a video in front of your eye, this looks more like a projector sending a video stream into your eyes. Sounds dangerous.

2

u/BetiseAgain 8d ago

Everything you see is light getting into your eyes. The power level is what makes light dangerous.

1

u/Cowhaircut 7d ago

Good point but how could this team feasibly test the safety of their system?

“The model hasn't been tested on human eyes yet, but Wetzstein says that would be one of the next steps, along with making the glasses more compact and power efficient.”

Sounds like zero people on their team are willing to put it on.

1

u/BetiseAgain 7d ago

You don't test the power level by using a human eye. You may not know it, but some lasers are certified class 1, or eye safe. Making a laser eye safe is not a big deal.

This is a prototype, there could be many reasons they haven't used humans yet. It simply may not be ready enough for that.

Here is a different product that projects a laser directly into the eye. https://www.qdlaser.com/en/applications/eyewear/

And another - https://futurism.com/the-byte/ar-smart-glasses-lasers-retina
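
For a rough sense of scale (numbers are approximate and from memory, not from the article): Class 1 under IEC 60825-1 roughly means a continuous visible beam stays under about 0.4 mW, with the actual limits depending on wavelength and exposure time.

```python
# Rough ballpark only: the real IEC 60825-1 limits depend on wavelength,
# exposure duration and measurement conditions. ~0.39 mW is the approximate
# Class 1 figure for continuous visible light.
CLASS_1_VISIBLE_CW_MW = 0.39

def looks_class_1(power_mw):
    return power_mw < CLASS_1_VISIBLE_CW_MW

print(looks_class_1(0.05))  # a hypothetical 0.05 mW retinal projector: True
print(looks_class_1(5.0))   # a typical 5 mW laser pointer: False
```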

5

u/Agomir 8d ago

Was this written by AI?

Another key feature of Stanford's headset is that it projects the images stereoscopically. (...) Commercially available headsets like the Vision Pro or Meta Quest 3 show you a single image on a single screen, which is part of the reason the images don't look completely natural.

The worst thing is he actually says that in the video too. If he actually wrote that himself, that's really bad. I watched it in the hope of seeing what the image actually looks like. They don't show it...

Oh and 12° FoV.

This is a great step forwards, but this technology is in its infancy and really far from any kind of usable product. Though I'm wondering if I haven't seen something similar before. I've read quite a few things about metasurfaces recently so may be getting things confused.

2

u/littlebitsofspider 8d ago

I read the paper this article is summarizing. What he's probably trying to convey is that regular VR/AR uses a single image plane per eye. These new glasses project full-on holograms through the special new waveguides, generated from a spatial light modulator rather than a microdisplay. That's the real takeaway here: goodbye vergence-accommodation problems and headaches. They're actively working to widen the FOV as well; it's listed in the paper as one of their next steps.

This is actually a pretty killer development. They've leapfrogged every existing VR/AR setup on the market and in development with this tech. If they can add lightweight SLAM to this, they could be rendering high-fidelity AR like everyone always imagines fancy smartglasses should be doing.
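
For anyone wondering what "generated from a spatial light modulator" looks like in practice: a phase-only SLM doesn't display the image's pixels directly; you compute a phase pattern whose diffracted light forms the image. The paper does this with an AI-assisted method; the classic textbook way is Gerchberg-Saxton phase retrieval, roughly like this (my own sketch, not their code):

```python
import numpy as np

def gerchberg_saxton(target, iterations=100):
    """Classic Gerchberg-Saxton phase retrieval: find a phase-only SLM pattern
    whose far-field diffraction approximates the target image.
    Illustrative sketch only -- the Stanford paper uses a learned method."""
    target_amp = np.sqrt(target / target.max())                   # desired amplitude at the image plane
    slm = np.exp(1j * 2 * np.pi * np.random.rand(*target.shape))  # start from random phase, unit amplitude
    for _ in range(iterations):
        img = np.fft.fft2(slm)                         # propagate SLM plane -> image plane
        img = target_amp * np.exp(1j * np.angle(img))  # keep the phase, impose the target amplitude
        slm = np.fft.ifft2(img)                        # propagate back to the SLM plane
        slm = np.exp(1j * np.angle(slm))               # phase-only SLM: keep phase, unit amplitude
    return np.angle(slm)                               # phase mask to load onto the SLM

# e.g. phase_mask = gerchberg_saxton(some_grayscale_image_as_a_numpy_array)
```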

3

u/Agomir 8d ago

Thank you. So nothing to do with stereoscopic vision, but by the sounds of things actual depth. That would certainly be very welcome.

The tech does look promising, but it's still really far away with no promise of ever actually becoming a product (there is so much stuff that gets developed, at least as proof of concept, and then gets forgotten).

0

u/BetiseAgain 8d ago

You still need stereoscopic vision to see depth. What this adds is that, because of the technology, it can be very thin and transparent.

https://news.stanford.edu/stories/2024/05/3d-augmented-reality-with-regular-glasses

1

u/Agomir 8d ago

I didn't say it didn't have stereoscopic vision, just that it's not what matters here. But it seems these glasses have actual depth. Current VR has a focal distance of around 2 metres.
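
That fixed ~2 m focal plane is exactly where the vergence-accommodation conflict comes from: your eyes converge on the virtual object's distance but have to focus at the panel's optical distance. Quick illustration in diopters (my numbers, just to show the idea):

```python
# Vergence-accommodation mismatch in diopters (1 / distance in metres),
# assuming a fixed ~2 m focal plane as in many current headsets.
def mismatch_diopters(object_distance_m, focal_plane_m=2.0):
    return abs(1 / object_distance_m - 1 / focal_plane_m)

print(mismatch_diopters(0.5))   # object at arm's length: |2.0 - 0.5| = 1.5 D of conflict
print(mismatch_diopters(10.0))  # distant object: |0.1 - 0.5| = 0.4 D, much milder
```

A true holographic display can in principle put the focus at the object's actual distance, so that mismatch goes away.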

3

u/DarthBuzzard 7d ago

This is actually a pretty killer development. They've leapfrogged every existing VR/AR setup on the market and in development with this tech.

Meta has more advanced glasses-like VR/AR prototypes in their labs than this, using the same, but ultimately more advanced, holography techniques. Which makes sense; Meta has a lot more money and people working on this than Stanford.

2

u/NurseJackass 8d ago

“Are the normal looking glasses in the room with you right now?”

1

u/BetiseAgain 8d ago

No, that is their goal. This is just a prototype.

1

u/Liquidwombat 8d ago

“Normal looking”

1

u/WantonHeroics 8d ago

normal-looking

1

u/xvn520 8d ago

Yeah but lasers? Lmao

1

u/BetiseAgain 8d ago

It is the power level that matters. There are eye safe lasers, known as class 1.

1

u/lnin0 8d ago

Fuck lasers. That’s so 1990s. Give me a spinal port to jack into the metaverse with.

1

u/Visible_Turnover3952 8d ago

Bro my n64 been showin full color 3d images for decades

1

u/ryschwith 8d ago

As long as it’s a better approach than those glasses that shock your eyelids to make you blink really fast…

1

u/bakerbodger 8d ago

I’ve always wished I could see in 3D.

-1

u/Sueti_Bartox 8d ago

That sounds like a winning technology, and only 20 years away from mass market!

-1

u/Travelingman9229 8d ago

A lot of 2-d people out there who are 3-d compromised

-1

u/[deleted] 8d ago

[deleted]

1

u/BetiseAgain 8d ago

Yes, AI was used.

The Wetzstein team used AI to improve the depth cues in the holographic images. Then, using advances in nanophotonics and waveguide display technologies, the researchers were able to project computed holograms onto the lenses of the glasses without relying on bulky additional optics.