
AWE 2024 Panel: The Current State and Future Direction of AR Glasses

Introduction

At AWE 2024, I was on a panel discussion titled “The Current State and Future Direction of AR Glasses.” Jeri Ellsworth, CEO of Tilt Five; Ed Tang, CEO of Avegant; Adi Robertson, Senior Reporter at The Verge; and I were on the panel, with Jason McDowell of The AR Show moderating. Jason did an excellent job of moderating and keeping the discussion moving. Still, with only 55 minutes, including questions from the audience, we could only cover a fraction of the topics we had considered discussing. I’m hoping to reconvene this panel sometime. I also want to thank Dean Johnson, Associate Professor at Western Michigan University, who originated the idea and helped me organize this panel. AWE’s video of our panel is available on YouTube.

First, I will outline what was discussed in the panel. Then, I want to follow up on small FOV optical AR glasses and some back-and-forth discussions with AWE Legend Thad Starner.

Outline of the Panel Discussion

The panel covered many topics, and below, I have provided a link to each part of our discussion and added additional information and details for some of the topics.

  • 0:00 Introductions
  • 2:19 Apple Vision Pro (AVP) and why it has stalled. It has been widely reported that AVP sales have stalled. Just before the conference, The Information reported that Apple had suspended Vision Pro 2 development and is now focused on a lower-cost version. I want to point out that a 1984 128K Macintosh would cost over $7,000 adjusted for inflation, and the original 1977 Apple II with 4K of RAM (without a monitor or floppy drive) would cost about $6,700 in today’s dollars. I contend that utility, not price, is the key problem with AVP sales volume and that Apple is thus drawing the wrong conclusion.
  • 7:20 Optical versus Passthrough AR. The panel discusses why their requirements are so different.
  • 11:30 Mentioned Thad Starner and the desire for smaller-FOV optical AR headsets. It turns out that Thad Starner attended our panel, but as I later found out, he arrived late and missed my mentioning him; he later questioned the panel. In 2019, I wrote the article FOV Obsession, which discussed Thad’s SPIE AR/VR/MR presentation about smaller FOVs. Thad is a Georgia Institute of Technology professor and a part-time Staff Researcher at Google (including on Google Glass). He has continuously worn AR devices since his research work at MIT’s Media Lab in the 1990s.
  • 13:50 Does “tethering make sense” with cables or wirelessly?
  • 20:40 Does an AR device have to work outside (in daylight)?
  • 26:49 The need to add displays to today’s audio-AI glasses (e.g., Meta Ray-Ban Wayfarer).
  • 31:45 Making AR glasses less creepy?
  • 35:10 Does it have to be a glasses form factor?
  • 35:55 Monocular versus Biocular
  • 37:25 What did Apple Vision Pro get right (and wrong) regarding user interaction?
  • 40:00 I make the point that eye tracking and gesture recognition on the “Apple Vision Pro is magical until it is not,” paraphrasing Adi Robertson, and I then added, “and then it is damn frustrating.” I also discuss that “it’s not truly hands-free if you have to make gestures with your hands.”
  • 41:48 Waiting for the Superman [savior] company. And do big companies help or crush innovation?
  • 44:20 Vertical integration (Apple’s big advantage)
  • 46:13 Audience Question: When will AR glasses replace a smartphone (enterprise and consumer)
  • 49:05 What is the first use case to break 1 million users in Consumer AR?
  • 49:45 Thad Starner – “Bold Prediction” that the first large application will be with a small FOV (~20 degrees), monocular, and not centered in the user’s vision (off to the ear side by ~8 to 20 degrees), and monochrome would be OK. A smartphone is only about 9 by 16 degrees FOV [or ~20 degrees diagonally when the phone is held at a typical distance].
  • 52:10 Audience Question: Why aren’t more companies going after OSHA (safety) certification?
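As an aside, the smartphone field-of-view numbers quoted in the outline are easy to sanity-check with basic geometry. The sketch below is mine, not from the panel; the ~64 × 128 mm display size and ~430 mm viewing distance are both assumed values:

```python
import math

def angular_size_deg(size_mm: float, distance_mm: float) -> float:
    """Angle subtended by an object of a given size at a given viewing distance."""
    return math.degrees(2 * math.atan(size_mm / (2 * distance_mm)))

# Assumed values: a ~16:9 phone display of roughly 64 x 128 mm,
# held at a typical ~430 mm viewing distance.
width_mm, height_mm, distance_mm = 64.0, 128.0, 430.0
h_fov = angular_size_deg(width_mm, distance_mm)
v_fov = angular_size_deg(height_mm, distance_mm)
d_fov = angular_size_deg(math.hypot(width_mm, height_mm), distance_mm)

print(f"~{h_fov:.0f} x {v_fov:.0f} degrees, ~{d_fov:.0f} degrees diagonal")
```

With these assumptions, the result lands near the ~9 by ~16 degrees cited in the panel; the exact numbers shift with phone size and how close the phone is held.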

Small FOV Optical AR Discussion with Thad Starner

As stated in the outline above, Thad Starner arrived late and missed my discussion of smaller FOVs that mentioned Thad, as I learned after the panel. Thad, who has been continuously wearing AR glasses and researching them since the mid-1990s, brings an interesting perspective. Since I first saw and met him in 2019, he has strongly advocated for AR headsets having a smaller FOV.

Thad also states that the AR headset should have a monocular (single-eye) display positioned 8 to 20 degrees to the ear side of the user’s straight-ahead vision. He also suggests that monochrome is fine for most purposes. Thad stated that his team will soon publish papers backing up these contentions.

In the sections below, I started from the YouTube transcript and did some light editing to make what was said more readable.

My discussion from earlier in the panel:

11:30 Karl Guttag – I think a lot of optical see-through AR gets confabulated with what was going on in VR, because VR made it cheap and easy to get a wide field of view by sticking a cell phone display with some cheap optics in front of your face. You get a wide field of view, and people went crazy about that. I made this point years ago on my blog [2019 article FOV Obsession]. Thad Starner makes this point; he’s one of our Legends at AWE, and I took it to heart many years ago at SPIE AR/VR/MR 2019.

The problem is that as soon as you go beyond about a 30-degree field of view, even projecting forward [with technology advancements], you’re in a helmet, something looking like Magic Leap. And Magic Leap ended up in Nowheresville. [Magic Leap] ended up with 25 to 30% see-through, so it’s not really good see-through, and yet it doesn’t have the image quality you would get from a display shot straight into your eyes. You could get a better image on an Xreal or something like that.

People are confabulating too many different specs when they say they want a wide field of view. The problem is that as soon as you say 50 degrees, and then add, “I need spatial recognition, I want to do SLAM, I want to do this and that,” you’ve spiraled into the helmet. Meta was saying on one of the other panels that they’re looking at about 50 grams [for the Meta Ray-Bans], and my glasses are 23 grams. As soon as you say a 50-degree field of view, you’re over 100 grams and heading to the Moon as you add more and more cameras and all this other stuff. I think that’s one of our bigger problems with AR, really optical AR.

We’re going to see this experiment played out because many companies are working on adding displays to so-called audio-AI glasses. We’re going to see if that works, because companies are getting ready to make glasses with 20- to 30-degree fields of view tied into AI and audio features.

Thad Starner’s comments and the follow-up discussion during the Q&A at the end of the panel:

AWE Legend Thad Starner Wearing Vuzix’s Ultralight Glasses – After the Panel

49:46 Thad – Hi, my name is Thad Starner. I’m a professor at Georgia Tech. I’m going to make a bold prediction here: the first system to sell over a million units will be a small-field-of-view, monocular, non-line-of-sight display, and monochrome is okay. The reason I say that is, number one, I’ve done user studies in my lab that we’ll be publishing soon on this subject. The other thing is that our phones, the most popular interface out there, are only 9 degrees by 16 degrees field of view. Putting something outside of the line of sight means that it doesn’t interrupt you while you’re crossing the street or driving or flying a plane, right? We know these numbers: between 8 and 20 degrees toward the ear and plus or minus 8 degrees [vertically]. I’m looking at Karl [Guttag] here so he can digest all these things.

Karl – I wrote a whole article about it [FOV Obsession]

Thad – And not having a pixel in line of sight, so now feel free to pick me apart and disagree with me.

Jeri – I want to know a price point.

Thad – I think the first market will be captioning for the hard of hearing, not for the deaf. Also, possibly transcription, not translation. At that price point, you’re talking about making the equivalent of reading glasses for people instead of hearing aids. There’s a lot of pushback against hearing aids, but people tend to accept reading glasses, so I’d say you’re probably in the $200 to $300 range.

Ed – I think your prediction is spot on, minus the color green. That’s the only thing I don’t think is going to fly.

Thad – I said monochrome is okay.

Ed – I think the monocular field of view is going to be an entry-level product. In the beginning, I think you will see products that fit that category, with roughly that field of view and roughly that offset angle [not in the center of view]. I agree with that, but I think that’s just the first step. After that, I think you will see a lot of products that do a lot more than monocular, monochrome, offset displays, moving to larger fields of view and binocular. I think that will happen pretty quickly.

Adi – It does feel like somebody tries to do that every 18 months, though. Intel tried to make a pair of glasses that did that, and it’s a little bit what North did. I guess it’s just a matter of throwing the idea at the wall until it takes, because I think it’s a good one.

I was a little taken aback to have Thad call me out as if I had disagreed with him when I had made the point about the advantages of a smaller FOV earlier. Only after the presentation did I find out that he had arrived late. I’m not sure what comment I made that made Thad think I was advocating for a larger FOV in AR glasses.

I want to add that there can be big differences between what consumers and experts will accept in a product. I’m reminded of a story I read in the early 1980s when there was a big debate between very high-resolution monochrome versus lower-resolution color (back then, you could only have one or the other with CRTs) that the head of IBM’s monitor division said, “Color is the least necessary and most desired feature in a monitor.” All the research suggested that resolution was more important for the tasks people did on a computer at the time, but people still insisted on color monitors. Another example is the 1985 New Coke fiasco, in which Coke’s taste studies proved that people liked New Coke better, but it still failed as a product.

In my experience, a big factor is whether the person is being trained to use the device for enterprise or military use versus buying it for their own enjoyment. The military has used monochrome displays on devices, including night vision and heads-up displays, for decades. I like to point out that the requirements change depending on whether the user “is paid to use versus is paying to use” the device. Enterprises and the military care about whether the product gets the job done, and they pay someone to use the device; the consumer has different criteria. I will also agree that there are cases where the user is motivated to be trained, such as Thad’s hard-of-hearing example.

Conclusion on Small FOV Optical AR

First, I agree with Thad’s comments about the smaller FOV and have stated such before. There are also cases outside of enterprise and industrial use where the user is motivated to be trained, such as Thad’s hard-of-hearing example. But while I can’t disagree with Thad or his studies that show having a monocular monochrome image located outside the line of sight is technically better, I think consumers will have a tougher time accepting a monocular monochrome display. What you can train someone to use differs from what they would buy for themselves.

Thad makes a good point that having a biocular display directly in the line of sight can be problematic and even dangerous. At the same time, untrained people don’t like monocular displays outside the line of sight. It becomes (as Ed Tang said in the panel) a point of high friction to adoption.

Based on the many designs I have seen for AR glasses, we will see this all played out. Multiple companies are developing optical see-through AR glasses with monocular green MicroLEDs, color X-cube-based MicroLEDs, and LCOS-based displays with glass form-factor waveguide optics (both diffractive and reflective).

Brilliant Labs Frame AR with AI Glasses & a Little More on the Apple Vision Pro

Introduction

A notice in my LinkedIn feed mentioned that Brilliant Labs has started shipping its new Frame AR glasses. I briefly met with Brilliant CEO Bobak Tavangar at AWE 2023 (right) and got a short demonstration of its “Monocle” prototype. So, I investigated what Brilliant Labs was doing with its new “Frame.”

This started as a very short article, but as I put it together, I thought it would be an interesting example of making design decisions and trade-offs, so it became longer. Looking at the Frames more closely, I found issues that concerned me. I don’t mean to pick on Brilliant Labs here. Any hardware device like the Frames is a massive effort, and they seem genuinely concerned about their customers; I am only pointing out the complexities of supporting AI with AR for a wide audience.

While looking at how the Frame glasses work, I came across some information related to the Apple Vision Pro’s brightness (in nits), discussed last time in Apple Vision Pro Discussion Video by Karl Guttag and Jason McDowall. Just as the Apple Vision Pro’s brightness has been misstated as “5,000 nits,” the Brilliant Labs Frame’s brightness has been misreported as 3,000 nits. In both cases, the nits are the “potential” out of the display device and not the nits “to the eye” after the optics.

I’m also repeating the announcement that I will be at SID’s DisplayWeek next week and AWE next month. If you want to meet, please email meet@kgontech.com.

DisplayWeek (next week) and AWE (next month)

I will be at SID DisplayWeek in May and AWE in June. If you want to meet with me at either event, please email meet@kgontech.com. I usually spend most of my time on the exhibition floor where I can see the technology.


AWE has moved to Long Beach, CA, south of LA, from its prior venue in Santa Clara, and it is about one month later than last year. Last year at AWE, I presented Optical Versus Passthrough Mixed Reality, available on YouTube. This presentation was in anticipation of the Apple Vision Pro.

At AWE, I will be on the PANEL: Current State and Future Direction of AR Glasses on Wednesday, June 19th, from 11:30 AM to 12:25 PM.

There is an AWE speaker discount code – SPKR24D – which provides a 20% discount, and it can be combined with Early Bird pricing (which ends May 9th, 2024 – today as I post this). You can register for AWE here.

Brilliant Labs Monocle & Frame “Simplistic” Optical Designs

Brilliant Labs’ Monocle and Frame use the same basic optical architecture, but it is better hidden in the Frame design. I will start with the Monocle, as it is easier to see the elements and the light path. I was a little surprised that both designs use a very simplistic, non-polarized 50/50 beam splitter, with its drawbacks.

Below (left) is a picture of the Monocle with the light path (in green). The Monocle (and Frame) both use a non-polarizing 50/50 beamsplitter. The splitter projects 50% of the display’s light forward and 50% downward to the (mostly) spherical mirror, which magnifies the image and moves the apparent focus. After reflecting from the mirror, the light is split in half again, and only ~25% of the light goes to the eye. The front-projected image will be a mirrored, unmagnified view of the display and will be fairly bright. Front projection or “eye glow” is generally considered undesirable in social situations and is something most companies try to reduce or eliminate in their optical designs.
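The losses in this unpolarized path can be tallied in a few lines. This is my rough accounting, not Brilliant Labs’ numbers; it treats the splitter as an ideal 50/50 and ignores mirror and surface losses, so both figures are upper bounds:

```python
def unpolarized_birdbath(display_nits: float) -> tuple[float, float]:
    """Trace display light through an ideal, non-polarizing 50/50 beamsplitter.

    First encounter: half the light transmits straight out the front
    ("eye glow"); half reflects down to the curved mirror.
    Second encounter: after the mirror, half of that light passes to the eye.
    """
    front_projection = display_nits * 0.5        # lost forward on the first pass
    to_eye = display_nits * 0.5 * 0.5            # reflect down, then transmit up
    return front_projection, to_eye

# The Frame's display is reported as 3,000 nits (a display-device value).
glow, eye = unpolarized_birdbath(3000.0)
print(f"front projection ~{glow:.0f} nits, to the eye at most ~{eye:.0f} nits")
```

This ~25% ceiling is why the Frame’s to-the-eye brightness works out to at most a quarter of the display-device number.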

The middle picture above shows a photo I took of the Monocle from the outside; you can see the light from the beam splitter projecting forward. Figures 5A and 6 (above right) from Brilliant Labs’ patent application illustrate the construction of the optics. The Monocle is made from two solid optical parts, with the bottom part forming part of the beam splitter and its bottom surface shaped and mirror-coated to form the curved mirror. An issue with the two-piece Monocle construction is that the beam splitter and mirror sit below eye level, requiring the user either to look down to see the image or to position the whole device higher, which results in the user looking through the mirror.

The Frame optics work identically in function, but the size and spacing differ. The optics are formed from three parts, which enables Brilliant Labs to position the beam splitter and mirror nearer the center of the user’s line of sight. But as Brilliant Labs’ documentation shows (right), the new Frame glasses still have the virtual (apparent) image below the line of sight.

Having the image below the line of sight reduces the distortion and artifacts from looking through the beam splitter when looking forward, but it does not eliminate all issues. The top seam of the beam splitter will likely be visible as an out-of-focus line.

The image below shows part of the construction process from a Brilliant Labs YouTube video. Note that the two parts that form the beamsplitter with its 50/50 semi-mirror coating have already been assembled to form the “Top.”

The picture above left, taken by Forbes author Ben Sin, shows a Frame prototype from his article Frame Is The Most ‘Normal’ Looking AI Glasses I’ve Worn Yet. In this picture, the 50/50 beam splitter is evident.

Two Types of Birdbath

As discussed in Nreal Teardown: Part 1, Clones and Birdbath Basics and its Appendix: Second Type of Birdbath, there are two types of “birdbaths” used in AR. The birdbath comprises a curved mirror (or semi-mirror) and a beamsplitter. It is called a “birdbath” because light drops into the curved mirror and reflects back out, like water splashing out of a birdbath. The beamsplitter can be polarized or unpolarized (more on this later). Birdbath elements are often buried in the design, such as in the Lumus optical design (below left) with its curved mirror and beam splitter.

From 2023 AR/VR/MR Lumus Paper – A “birdbath” is one element of the optics

Many AR glasses today use the birdbath to change the focus and act as the combiner. The most common of these designs is one where the user looks through a 50/50 birdbath mirror to see the real world (see the Nreal/Xreal example below right). In this design, a polarized beam splitter is usually used with a quarter waveplate to “switch” the polarization after the reflection from the curved semi-mirror, causing the light to go through the beam splitter on its second pass (see Nreal Teardown: Part 1, Clones and Birdbath Basics for a more detailed explanation). This design is what I refer to as a “look through the mirror” type of birdbath.

Brilliant Labs uses a “Look through the Beamsplitter” type of birdbath. Google Glass is perhaps the most famous product with this birdbath type (below left). This birdbath type has appeared in Samsung patents that were much discussed in the electronic trade press in 2019 (see my 2019 Samsung AR Design Patent—What’s Inside).

LCOS maker RaonTech started showing a look-through-the-beamsplitter reference design in 2018 (below right). The various segments of their optics are labeled below. This design uses a polarizing beam splitter and a quarter waveplate.

Brilliant Labs’ Thin Beam Splitter Causes View Issues

If you look at the RaonTech or Google Glass splitter, you should see that the beam splitter is the full height of the optics. However, in the case of the Frames and Monocle designs (right), the top and bottom beam splitter seams, the 50/50 mirror coating, and the curved mirror are in the middle of the optics and will be visible as out-of-focus blurs to the user.

Pros and Cons of Look-Through-Mirror versus Look-Through-Beamsplitter

The look-through-mirror birdbaths typically use a thin flat/plate beam splitter, and the curved semi-mirror is also thin and “encased in air.” This results in them being relatively light and inexpensive. They also don’t have to deal with the “birefringence” (polarization changing) issues associated with thick optical materials (particularly plastic). The big disadvantage of the look-through-mirror approach is that to see the real world, the user must look through both the beamsplitter and the 50/50 mirror; thus, the real world is dimmed by at least 75%.

The look-through-beamsplitter designs encase the entire light path in either glass or plastic, with multiple glued-together surfaces that are coated or laminated with films. The need to encase the design in a solid means the designs tend to be thicker and more expensive. Worse yet, typical injection-molded plastics are birefringent and can’t be used with polarized optics (beamsplitters and quarter waveplates); either heavy glass or higher-cost resin-molded plastics must be used with polarized elements. Supporting a wider FOV becomes increasingly difficult, as a linear change in FOV results in a roughly cubic increase in the volume of material (either plastic or glass) and, thus, the weight. Bigger optics are also more expensive to make, and there are optical problems when looking through very thick solid optics. You can see in the RaonTech design above how thick the optics get to support a ~50-degree FOV. This approach “only” requires the user to look through the beam splitter, so the view of the real world is dimmed by 50% (twice as much light gets through as with the look-through-mirror method).
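The see-through and weight trade-offs above reduce to a little arithmetic. A minimal sketch of my own, using idealized 50% splitters and mirrors and a simple cubic scaling assumption for optics volume:

```python
def world_light_to_eye(design: str) -> float:
    """Fraction of real-world light reaching the eye, ignoring coating losses.

    look_through_mirror: the world is seen through BOTH the 50/50 curved
    semi-mirror and the beamsplitter, so the losses multiply (0.5 * 0.5).
    look_through_beamsplitter: the world is seen only through the
    beamsplitter (0.5).
    """
    paths = {"look_through_mirror": 0.5 * 0.5, "look_through_beamsplitter": 0.5}
    return paths[design]

print(world_light_to_eye("look_through_mirror"))        # world dimmed by ~75%
print(world_light_to_eye("look_through_beamsplitter"))  # world dimmed by ~50%

# Solid-optics weight: if each linear dimension scales with FOV, the
# material volume (and weight) grows roughly with the cube of the FOV.
for fov_deg in (20, 30, 50):
    print(f"{fov_deg}-deg FOV -> ~{(fov_deg / 20) ** 3:.1f}x the volume of a 20-deg design")
```

The cubic term is why a “small” jump from 20 to 50 degrees can mean over an order of magnitude more glass or plastic.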

Pros and Cons of Polarized Beam Splitter Birdbaths

Most companies with look-through-mirror and look-through-beamsplitter designs, though not Brilliant Labs, have gone with polarizing beam splitters and quarter waveplates to “switch” the polarization when the light reflects off the mirror. Either method requires the display’s light to make one reflective and one transmissive pass via the beam splitter. With a non-polarized 50/50 beam splitter, this means multiplicative 50% losses, or only 25% of the light getting through. With a polarized beam splitter, once the light is polarized (a 50% loss), proper use of quarter waveplates means there are no further significant losses at the beamsplitter.
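The polarized-versus-unpolarized difference for the display light can be summarized the same way. Again, this is a hedged sketch with ideal components; real coatings, waveplates, and polarizers all lose a few percent more:

```python
def display_light_to_eye(polarized_splitter: bool) -> float:
    """Upper-bound fraction of (unpolarized) display light reaching the eye.

    Unpolarized 50/50 splitter: two multiplicative 50% passes -> 25%.
    Polarized splitter + quarter waveplate: ~50% is lost once at the
    initial polarizer; the waveplate then rotates the polarization so
    both splitter passes are ideally lossless afterward.
    """
    return 0.5 if polarized_splitter else 0.5 * 0.5

print(display_light_to_eye(polarized_splitter=False))  # 0.25
print(display_light_to_eye(polarized_splitter=True))   # 0.5
```

In other words, the polarized design roughly doubles display efficiency and, with one more polarizer, also suppresses front projection.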

Another advantage of the polarized optics approach is that front projection can be mostly eliminated (there will be only a little due to scatter). For the look-through-mirror method, this can be accomplished (as discussed in Nreal Teardown: Part 1, Clones and Birdbath Basics) with a second quarter waveplate and a front polarizer. With the look-through-beamsplitter method, a polarizer before the beamsplitter will block the light that would otherwise project forward off the polarized beamsplitter.

As mentioned earlier, using polarized optics becomes much more difficult with the thicker solid optics associated with the look-through-beamsplitter method.

Brilliant Labs Frame Design Decision Options

It seems that at every turn in the decision process for the Frame and Monocle optics, Brilliant Labs chose the simplest and most economical design possible. By not using polarized optics, they gave up brightness and will have significant front projection, but they can use much less expensive injection-molded plastic optics that do not require polarizers and quarter waveplates. They avoided using more expensive waveguides, which would be thinner but require LCOS or MicroLED (inorganic LED) projection engines, which may be heavier and larger, although the latest LCOS and MicroLED engines are getting to be pretty small and light, particularly for a >30-degree FOV (see DigiLens, Lumus, Vuzix, Oppo, & Avegant Optical AR (CES & AR/VR/MR 2023 Pt. 8)).

Frame’s Brightness to the Eye – Likely <25% of 3,000 Nits – Same Problem as Apple Vision Pro Reporting

As discussed in the last article on the Apple Vision Pro (AVP), in the Appendix: Rumor Mill’s 5,000 Nits Apple Vision Pro, reporters and authors constantly make erroneous comparisons between one device’s “display-out nits” and another device’s nits to the eye. Also, as stated last time, companies appear to want this confusion; they benefit when reporters and others use display-device values, so they avoid specifying the nits to the eye.

I could not find an official Brilliant Labs value anywhere, but the company seems to have told reporters that “the display is 3,000 nits,” which may not be a lie but is misleading. Most articles dutifully give the “display number” but fail to say that it is “display device nits” rather than what the user will see, leaving readers to make the mistake, while other reporters make the error themselves.

Digital Trends:

The display on Frame is monocular, meaning the text and graphics are displayed over the right eye only. It’s fairly bright (3,000 nits), though, so readability should be good even outdoors in sunlit areas.

Wearable:

As with the Brilliant Labs Monocle – the clip-on, open-source device that came before Frame – information is displayed in just one eye, with overlays being pumped out at around 3,000 nits brightness.

Android Central, in These AI glasses are being backed by the Pokemon Go CEO, at least made it clear that these were display-device numbers, but I still think most readers wouldn’t know what to do with them. They added the tidbit that the panels are made by Sony, and they discussed pulse width modulation (related to duty cycle). Interestingly, they talk about a short on-time duty cycle causing problems for people sensitive to flicker. In contrast, VR game fans favor a very short on-time duty cycle (what Brad Lynch of SadlyItsBradly refers to as low persistence) to reduce blurring.

Android Central:

A 0.23-inch Sony MicroOLED display can be found inside one of the lenses, emitting 3,000 nits of brightness. Brilliant Labs tells me it doesn’t use PWM dimming on the display, either, meaning PWM-sensitive folks should have no trouble using it.

Below is a summary of Sony OLED microdisplays aimed at the AR and VR markets. In it, the 0.23-type device is listed with a maximum luminance of 3,000 nits. However, from the earlier analysis, we know that at most 25% of the light can get through the Brilliant Labs Frame’s birdbath optics, or at most 750 nits (likely less due to other optical losses). This number assumes that the device is driven at full brightness and that Brilliant Labs is not buying derated devices at a lower price.

I can’t blame Brilliant Labs because almost every company does the same in terms of hiding the ball on to-the-eye brightness. Only companies with comparatively high nits-to-the-eye values (such as Lumus) publish this spec.

Sony Specifications related to the Apple Vision Pro

The Sony specifications list a 3.5K by 4K device. The common industry understanding is that Apple designed a custom backplane for the AVP but used Sony’s OLED process. Notice the spec of 1,000 cd/m² (candelas per square meter = nits) at a 20% duty ratio. While favorable for VR gamers wanting less motion blur, the low on-duty cycle is also a lifetime issue; the display device probably can’t handle the heat from being driven a high percentage of the time.

It would be reasonable to assume that Apple is similarly restricted to about a 20% on-duty cycle. As I reported last time in the Apple Vision Pro Discussion Video by Karl Guttag and Jason McDowall, I have measured the on-duty cycle of the AVP to be about 18.4% or close to Sony’s 20% for their own device.

The 5,000 nits cited by MIT Tech Review are for the raw displays before the optics, whereas the nits for the MQ2 were those going to the eye. The AVP’s (and all other) pancake optics transmit about 11% (or less) of the light from an OLED in the center. With pancake optics, there is the polarization of the OLED (>50% loss), plus a transmissive pass and a reflective pass through a 50/50 mirror, which starts at, at most, 12.5% (50% cubed) before considering all the other losses in the optics. Then there is the on-time duty cycle of the AVP, which I have measured to be about 18.4%. VR devices want the on-time duty cycle to be low to reduce motion blur with rapid head motion and 3-D games; the MQ3 has only a 10.3% on-time duty cycle (shorter duty cycles are easier with LED-illuminated LCDs). So, while the AVP display devices can likely emit about 5,000 nits, the nits reaching the eye are approximately 5,000 nits × 11% × 18.4% ≈ 100 nits.
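That back-of-the-envelope chain multiplies out as follows, using the figures above (~5,000-nit panels, ~11% pancake-optics transmission, and the ~18.4% on-time duty cycle I measured):

```python
def nits_to_eye(display_nits: float, optics_transmission: float, on_duty_cycle: float) -> float:
    """Convert 'display-device nits' to time-averaged nits at the eye by
    scaling for optics throughput and the display's on-time duty cycle."""
    return display_nits * optics_transmission * on_duty_cycle

avp_eye_nits = nits_to_eye(5000.0, 0.11, 0.184)
print(f"Apple Vision Pro, to the eye: ~{avp_eye_nits:.0f} nits")
```

The same function applies to any headset once you know (or estimate) the optics transmission and duty cycle, which is exactly the information the “display nits” headlines omit.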

View Into the Frame Glasses

I don’t want to say that Brilliant Labs is doing anything wrong or that other companies don’t often do the same. Companies often take pictures and videos of new products using non-functional prototypes, either because working versions aren’t ready when shooting or because they look better on camera. Still, I want to point out something I noticed in the pictures of CEO Bobak Tavangar (right) published in many of the articles on the Frame glasses: I didn’t see the curved mirror or the 50/50 beam splitter.

In a high-resolution version of the picture, I could see the split in the optics (below left) but not the darkened rectangle of the 50/50 mirror. So far, I have found only one picture of someone wearing the Frame glasses from Bobak Tavangar’s post on X. It is of a person wearing what appears to be a functional Frame in a clear prototype body (below right). In the dotted line box, you can see the dark rectangle from the 50/50 mirror and a glint from the bottom curved mirror.

I don’t think Brilliant Labs is trying to hide anything, as I can find several pictures that appear to show functional Frames, such as the picture from another Tavangar post on X showing trays full of Frame devices in production (right) or the Forbes picture (earlier, in the optics section).

What was I hoping to show?

I’m trying to show what the Frame looks like when worn to get an idea of the social impact of wearing the glasses. I was looking for a video of someone wearing them with the Frame turned on, but unfortunately, none have surfaced. From the design analysis above, I know they will front-project a small but bright mirrored image of the display off the 50/50 mirror, but I have not found an image showing the working device from the outside looking in.

Exploded View of the Frame Glasses

The figure below is taken from Brilliant Labs’ online manual for the Frame glasses (I edited it to reduce space and inverted the image to make it easier to view). By AR glasses standards, the Frame design is about as simple as possible. The choice of two nose bridge inserts is not shown in the figure below.

There is only one size of glasses, which Brilliant Labs described in their AMA as between a “medium and large” frame. They say the temples are flexible to accommodate many head widths. Because the Frames are monocular, IPD (interpupillary distance) is not the problem it would be with a biocular headset.

AddOptics is making custom prescription lenses for the Frames glasses

Brilliant Labs is partnering with AddOptics to make prescription lenses that can be ‘Precision Bonded’ to Frames using a unique optical lens casting process. For more on AddOptics, see CES 2023 (Part 3) – AddOptics Custom Optics and my short follow-up in Mixed Reality at CES & AR/VR/MR 2024 (Part 2 Mostly Optics).

Bonding to the Frames will make for a cleaner and more compact solution than the more common insert solution, but it will likely be permanent and thus a problem for people whose prescriptions change. In their YouTube AMA, Brilliant Labs said they are working with AddOptics to increase the range of prescription values and support for astigmatism.

They didn’t say anything about bifocal or progressive lens support, which is even more complicated (and may require post-mold grinding). As the virtual image is below the centerline of vision, it would typically be where bifocal and progressive lenses would be designed for reading distance (near vision). In contrast, most AR and VR glasses aim to put the virtual image at 2 meters, considered “far vision.”
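
To put rough numbers on the focus mismatch, optometry works in diopters (the reciprocal of distance in meters). A quick sketch, assuming the ~2-meter virtual image mentioned above and a typical ~40 cm reading distance (the distances here are my illustrative assumptions, not Brilliant Labs specs):

```python
# Diopter comparison between a "far vision" virtual image (~2 m),
# which most AR/VR glasses target, and typical reading distance (~40 cm).
# Distances are illustrative assumptions, not Brilliant Labs specs.

def diopters(distance_m: float) -> float:
    """Optical power needed to focus at a given distance."""
    return 1.0 / distance_m

far_image = diopters(2.0)   # 0.50 D for a 2 m virtual image
reading = diopters(0.4)     # 2.50 D for 40 cm near vision

print(f"2 m virtual image:  {far_image:.2f} D")
print(f"40 cm reading dist: {reading:.2f} D")
print(f"Mismatch:           {reading - far_image:.2f} D")
```

The roughly 2-diopter gap is why a far-focused virtual image sitting in the near-vision (reading) zone of a bifocal or progressive lens would be out of focus for the wearer.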

The Frame’s basic specs

Below, I have collected the basic specs on the Frame glasses and added my estimate for the nits to the eye. Also shown below is their somewhat comical charging adapter (“Mister Charger”). None of these specs are out of the ordinary and are generally at the low end for the display and camera.

  • Monocular 640×400 resolution OLED Microdisplay
  • ~750 nits to the eye (based on reports of a 3,000-nit Sony Micro-OLED display)
    • (assumes a 90% on-time duty cycle)
  • 20-Degree FOV
  • Weight ~40 grams
  • 1280×720 camera
  • Microphone
  • 6 axis IMU
  • Battery 222mAh  (plus 149mAh top-up from charging adapter)
    • With 80mA typical power consumption when operating (0.580mA on standby)
  • CPU nRF52840 Cortex M4F (Nordic ARM)
  • Bluetooth 5.3
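
As a rough sanity check on run time (my arithmetic from the listed specs, not a Brilliant Labs claim), the 222mAh battery at the 80mA typical draw works out to under three hours, with the adapter’s 149mAh top-up adding roughly two more:

```python
# Back-of-the-envelope run-time estimate from the listed specs.
# Assumes the 80 mA "typical" draw is sustained; real usage will vary.

BATTERY_MAH = 222        # internal battery
TOPUP_MAH = 149          # extra capacity from the charging adapter
TYPICAL_DRAW_MA = 80     # typical operating current

hours_internal = BATTERY_MAH / TYPICAL_DRAW_MA                   # ~2.8 hours
hours_with_topup = (BATTERY_MAH + TOPUP_MAH) / TYPICAL_DRAW_MA   # ~4.6 hours

print(f"Internal battery:      ~{hours_internal:.1f} h")
print(f"With adapter top-up:   ~{hours_with_topup:.1f} h")
```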

Everything in AR Today is “AI”

Brilliant Labs is marketing the Frames as “AI Glasses.” The “AI” comes from Brilliant Labs’ Noa ChatGPT client application running on a smartphone. Brilliant Labs says the hardware is “open source” and can be used by other companies’ applications.

I’m assuming the “AI” primarily runs in the Noa cell phone application, which then connects to the cloud for the heavy-lifting AI. According to Brilliant Labs’ video, while on the Monocle the onboard CPU only controlled the display and peripherals, they plan to move some processing onto the Frame’s more capable CPU. Like other “AI” wearables, I expect simple questions will get immediate responses while complex questions will wait on the cloud.

Conclusions

To be fair, designing glasses and wearable AR products for the mass market is difficult. I didn’t intend to pick on Brilliant Labs’ Frames; rather, I am using them as an example.

With a monocular, 20-degree FOV below the center of the person’s view, the Frames are a “data snacking” type of AR device. They will be competing with products like the Humane AI projector (which is a joke — see: Humane AI – Pico Laser Projection – $230M AI Twist on an Old Scam), the Rabbit R1, Meta’s (display-less) Ray-Ban Wayfarers, other “AI” audio glasses, and many AR-AI glasses similar to the Frame that are in development.

This blog normally concentrates on display and optics, and on this score, the Frame’s optics are a “minimal effort” to support low cost and weight. As such, they have a lot of problems, including:

  • Small 20-degree FOV that is set below the eyes and not centered (unless you are lucky with the right IPD)
  • Due to the way the 50/50 beam splitter cuts through the optics, there will be a visible seam. I don’t think this will be pleasant to look through when the display is off (though I have not tried them yet). You could argue that you only put them on “when you need them,” but that negates most use cases.
  • The support for vision correction appears to lock the glasses to a single (current) prescription.
  • Regardless of flexibility, the single-size frame will make the glasses unwearable for many people.
  • The brightness to the eye, probably less than 750 nits, is not enough for general outdoor use in daylight. It might be marginal when combined with clip-on sunglasses or when used in the shade.
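
A quick way to see the daylight problem is the see-through contrast ratio: the virtual image plus the real-world background, divided by the background alone. All the numbers below are my illustrative assumptions (~750 nits to the eye, a ~3,000-nit sunlit scene, ~85% lens transmission), not measured values:

```python
# Rough see-through contrast check for daylight use.
# All inputs are illustrative assumptions, not measured values.

def seethrough_contrast(display_nits: float, ambient_nits: float,
                        transmission: float) -> float:
    """Ratio of (virtual image + background) to background luminance."""
    background = ambient_nits * transmission
    return (background + display_nits) / background

daylight = seethrough_contrast(750, 3000, 0.85)  # ~1.29:1, badly washed out
shade = seethrough_contrast(750, 500, 0.85)      # ~2.76:1, marginal but usable

print(f"Sunlit scene:  {daylight:.2f}:1")
print(f"Shaded scene:  {shade:.2f}:1")
```

The closer the ratio gets to 1:1, the more the virtual image washes out against the real world, which is why sub-1,000-nit optical AR displays generally need shade or sunglass tinting outdoors.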

As a consumer, I hate the charger adapter concept. Why they couldn’t just put a USB-C connector on the glasses is beyond me; the adapter is a friction point for every user. Users typically have dozens of USB-C power cables today, but if you forget or lose the adapter, your device is dead. Since these are supposed to be prescription glasses, needing to take them off to charge them is also problematic.

While I can see the future use model for AI prescription glasses, I think a display, even one with a small FOV, adds significant value. I think Brilliant Labs’ Frames are for early adopters who will accept many faults and difficulties. At least they are reasonably priced, by today’s standards, at $349 and don’t require a subscription for basic services, provided there are not too many complex AI queries requiring the cloud.
