Apple Vision Pro Discussion Video by Karl Guttag and Jason McDowall
Introduction
As discussed in Mixed Reality at CES and the AR/VR/MR 2024 Video (Part 1 – Headset Companies), Jason McDowall of The AR Show recorded over four hours of video discussing the 50 companies I met at CES and AR/VR/MR. The last thing we discussed for about 50 minutes was the Apple Vision Pro (AVP).
The AVP video amounts to a recap of the many articles I have written on the AVP. Where appropriate, I will give links to my more detailed coverage in prior articles and updates rather than rehash that information in this article.
It should be noted that Jason and I recorded the video on March 25th, 2024. Since then, there have been many articles from tech magazines saying the AVP sales are lagging, often citing Bloomberg's Mark Gurman's "Demand for demos is down" and analyst Ming-Chi Kuo reporting, "Apple has cut its 2024 Vision Pro shipments to 400–450k units (vs. market consensus of 700–800k units or more)." While many reviewers cite the price of the AVP, I have contended that price was not the problem, as it was in line with that of a new high-tech device (adjusted for inflation, it is about the same price as the first Apple II). My criticism focuses on the utility and human factors. In high-tech, cost is usually a fixable problem with time and effort, and people will pay more if something is of great utility.
I said the Apple Vision Pro would have utility problems before it was announced (see my 2023 AWE presentation, "Optical Versus Passthrough Mixed Reality," and my articles on the AVP). I'm not about bashing a product or concept; when I find faults, I point them out and show my homework, so to speak, on this blog and in my presentations.
Before the main article, I want to repeat the announcement that I plan to go to DisplayWeek in May and AWE in June. I have also included a short section on YouTube personality/influencer Marques Brownlee's Waveform Podcast and Hugo Barra's (former Head of Oculus at Meta) blog article discussing my controversial (but correct) assessment that the Apple Vision Pro's optics are slightly out of focus/blurry.
DisplayWeek and AWE
I will be at SID DisplayWeek in May and AWE in June. If you want to meet with me at either event, please email meet@kgontech.com. I usually spend most of my time on the exhibition floor where I can see the technology.
AWE has moved to Long Beach, CA, south of LA, from its prior venue in Santa Clara, and it is about one month later than last year. Last year at AWE, I presented Optical Versus Passthrough Mixed Reality, available on YouTube. This presentation was in anticipation of the Apple Vision Pro.
At AWE, I will be on the PANEL: Current State and Future Direction of AR Glasses on Wednesday, June 19th, from 11:30 AM to 12:25 PM with the following panelists:
- Jason McDowall – The AR Show (Moderator)
- Jeri Ellsworth – Tilt Five
- Adi Robertson – The Verge
- Edward Tang – Avegant
- Karl M Guttag – KGOnTech
There is an AWE speaker discount code – SPKR24D – which provides a 20% discount, and it can be combined with Early Bird pricing (which ends May 9th, 2024). You can register for AWE here.
“Controversy” of the AVP Being a Little Blurry Discussed on Marques Brownlee’s Podcast and Hugo Barra’s Blog
As discussed in Apple Vision Pro – Influencing the Influencers & “Information Density,” which included citing this blog on Linus Tech Tips, this blog is read by other influencers, media, analysts, and key people at AR/VR/MR tech companies.
Marques Brownlee (MKBHD), another major YouTube personality, discussed my March 1st article, Apple Vision Pro’s Optics Blurrier & Lower Contrast than Meta Quest 3, on his Waveform Podcast/WVFRM YouTube channel (link to the YouTube discussion). Marques also discussed Hugo Barra’s (former Head of Oculus at Meta) March 11, 2024, “Hot Take” blog article (about 1/3rd of the way down) on my blog article.
According to MKBHD and Hugo Barra, my comments about the Vision Pro are controversial, but they agree that they make sense based on my evidence and their experience. My discussion with Jason was recorded before the Waveform Podcast came out. I’m happy to defend and debate this issue.
Outline of the Video and Additional Information
The times in blue on the left of each subsection link to the section of the YouTube video discussing that subject.
00:16 Ergonomics and Human Factors
I wrote about the issues with the AVP’s human factors design in Apple Vision Pro (Part 2) – Hardware Issues Mechanical Ergonomics. In a later article in CES Part 2, I compared the AVP to the new Sony XR headset in the Sony XR (and others compared to Apple Vision Pro) section.
08:23 Lynx and Hypervision
The article comparing the new Sony XR headset to the AVP also mentioned the Lynx R1, first shown in 2021. But I didn’t realize how much the two were alike until I saw a post somewhere (I couldn’t find it again) by Lynx’s CEO, Stan Larroque, saying how similar they were. It could be a matter of form following function, but how much they are alike from just about any angle is rather striking.
While on the subject of Lynx and Apple: Lynx used optics by Limbak for the Lynx R1. As I broke in December 2022 in Limbak Bought by “Large US Company” (soon revealed to be Apple) and discussed in more detail in a 2022 video with Brad Lynch, I don’t like the R1’s Limbak “catadioptric” (combined mirror and refractive) optics. While the R1 optics are relatively thin, like pancake optics, they cause a significant loss of resolution due to their severe distortion, and worse, they have an optical discontinuity in the center of the image unless the eye is perfectly aligned.
In May 2023, Lynx and Hypervision announced that they were working together. In Apple Vision Pro (Part 4)—Hypervision Pancake Optics Analysis, Hypervision detailed the optics of the Apple Vision Pro. That article also discusses the Hypervision pancake optics it was showing at AR/VR/MR 2023. Hypervision demonstrated single pancake optics with a 140-degree FOV (the AVP is about 90 degrees) and blended dual pancake optics with a 240-degree FOV (see below right).
10:59 Big Screen Beyond Compared to AVP Comfort Issues
When I was at the LA SID One Day conference, I stopped by Big Screen Beyond to try out their headset. I wore Big Screen’s headset for over 2 hours and didn’t have any of the discomfort issues I had with the AVP. With the AVP, my eyes start bothering me after about half an hour and are pretty sore by one hour. There are likely two major factors: one is that the AVP applies pressure to the forehead, and the other is that something is not working right optically with the AVP.
Big Screen Beyond has a silicone gel-like custom interface that is 3-D printed based on a smartphone face scan. Like the AVP, it has magnetic prescription inserts. While the Big Screen Beyond was much more comfortable, its face interface has a large contact area with the face. While not that uncomfortable, I would like something that breathes more. When you remove the headset, you can feel the perspiration evaporating from where the interface was contacting your face. I can’t imagine anyone wearing makeup being happy (the same with the AVP or any headset that presses against the face).
On a side note, I was impressed by Big Screen Beyond’s statement that it is cash flow positive. It is a sign that they are not wildly spending money on frills and that they understand the market they are serving. They are focused on serving dedicated VR gamers who want to connect the headset to a powerful computer.
Related to the Big Screen Beyond interface, a tip I picked up on Reddit is that you can use a silicone face pad made for the Meta Quest 2 or 3 on the AVP’s face interface (see above right). The silicone face pad gives some grip to the face interface and reduces the pressure required to hold the AVP steady. The pad adds about 1mm, but it so happens that I had recently swapped my original AVP face interface for one that is 5mm shorter. Now, I barely need to tighten the headband. A downside to the silicone pad, like the Big Screen Beyond, is that it more or less forms a seal with your face, and you can feel the perspiration evaporating when you remove it.
13:16 Some Basic AVP Information
In the video, I provide some random information about the AVP. I wanted to go into detail here about the often misquoted brightness of the AVP.
I started by saying that I have read or watched many people state that the AVP is much brighter than the Meta Quest 3 (MQ3) or Meta Quest Pro (MQP), giving ridiculously high brightness/nits values for the AVP. As I reported in my March 7th, 2024, comments in the article Apple Vision Pro’s Optics Blurrier & Lower Contrast than Meta Quest 3, the AVP outputs about 100 nits to the eye and is only about 5-10% brighter than the MQ3 and ~20% dimmer than the MQP.
I will explain how this came about in the Appendix at the end. And to this day, if you do a Google search (captured below), it will prominently state that the AVP has a “50-fold improvement over the Meta’s Quest 2, which hits just 100 nits,” citing MIT Technology Review.
Nits are tricky to measure in a headset without the right equipment, and even then, they vary considerably from the center (usually the highest) to the periphery.
The 5,000 nits cited by MIT Tech Review are for the raw displays before the optics, whereas the nits for the MQ2 were those going to the eye. The AVP’s (and all other) pancake optics transmit about 11% (or less) of the light from an OLED in the center. With pancake optics, there is the polarization of the OLED’s light (>50% loss), plus a transmissive pass and a reflective pass through a 50/50 mirror, so transmission starts at no more than 12.5% (50% cubed) before considering all the other losses in the optics. Then, there is the on-time duty cycle of the AVP, which I have measured to be about 18.4%. VR devices want the on-time duty cycle to be low to reduce motion blur with rapid head motion and moving 3-D game content. The MQ3 only has a 10.3% on-time duty cycle (shorter duty cycles are easier with LED-illuminated LCDs). So, while the AVP display devices likely can emit about 5,000 nits, the nits reaching the eye are approximately 5,000 nits x 11% x 18.4% = 100 nits.
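As a sanity check, the loss chain above can be multiplied out in a few lines of Python. The loss factors are this article's measured/estimated values, not Apple specifications:

```python
# Sanity-checking the nits-to-the-eye estimate from the loss chain above.
# These factors are the article's measured/estimated values, not Apple specs.

display_nits = 5000.0        # rumored raw micro-OLED brightness
pancake_transmission = 0.11  # ~11% transmission through the pancake optics (center)
duty_cycle = 0.184           # measured AVP on-time duty cycle (~18.4%)

nits_to_eye = display_nits * pancake_transmission * duty_cycle
print(f"Approximate nits to the eye: {nits_to_eye:.0f}")  # ~100 nits

# The ~11% itself is bounded by the polarized pancake light path:
# >50% polarization loss, then a transmissive and a reflective pass
# through a 50/50 mirror, i.e., at most 0.5^3 = 12.5% before other losses.
print(f"Pancake path upper bound: {0.5 ** 3:.1%}")
```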
18:59 Computer Monitor Replacement is Ridiculous
I wrote a three-part series on why I think monitor replacement by the Apple Vision Pro is ridiculous. Please see Apple Vision Pro (Part 5A) – Why Monitor Replacement is Ridiculous, Part 5B, and Part 5C. There are multiple fundamental problems that neither Apple nor anyone else is close to solving. The slide on the right summarizes some of the big issues.
Nyquist Sampling – Resampling Causes Blurring & Artifacts
I tried to explain the problem in two ways, one based on the frequency domain and the other on the spatial (pixel) domain.
19:29 Frequency Domain Discussion
Anyone familiar with signal processing may remember that a square wave has infinite odd harmonics. Images can be treated like 2-dimensional signals. A series of equally spaced, equal-width horizontal lines looks like a square wave in the vertical dimension. Thus, representing them perfectly when transformed into 3-D space requires infinite resolution. Since the resolution of the AVP (or any VR headset) is limited, there will be artifacts such as blurring, wiggling, and scintillation.
As I pointed out in (Part 5A), computers tend to “cheat” and distort text and graphics to fit on the pixel grid and thus sidestep the Nyquist sampling problem that any VR headset must face when trying to make a 2-D image appear still in 3-D space. Those who know signal processing know that the Nyquist rate is 2x the highest frequency component. However, as noted above, horizontal lines have infinite frequency. Hence, some degradation is inevitable, but then we only have to beat the resolution limit of the eye, which, in effect, acts as a low-pass filter. Unfortunately, the AVP’s display is about 2-3x too low linearly (4-9x in two dimensions) in resolution for the artifacts not to be seen by a person with good vision.
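The harmonic argument above can be demonstrated with a short sketch in pure Python: the DFT of a discrete square wave has energy only in its odd harmonics, and they fall off slowly (roughly as 1/n), so any finite display resolution must cut some of them off:

```python
import cmath

# A discrete square wave (like evenly spaced horizontal lines, viewed along
# one dimension) has energy only in its odd harmonics, which fall off slowly.

N = 64
square = [1.0 if (i % 16) < 8 else -1.0 for i in range(N)]  # period-16 square wave

def dft_mag(signal, k):
    """Magnitude of DFT bin k, normalized by the signal length."""
    n = len(signal)
    return abs(sum(s * cmath.exp(-2j * cmath.pi * k * i / n)
                   for i, s in enumerate(signal))) / n

fundamental = N // 16  # DFT bin of the fundamental frequency
for harmonic in range(1, 6):
    mag = dft_mag(square, fundamental * harmonic)
    print(f"harmonic {harmonic}: magnitude {mag:.3f}")
```

Running this shows the even harmonics at essentially zero while the odd harmonics (1st, 3rd, 5th, ...) stay significant, which is why hard-edged lines and text are so unforgiving of limited resolution.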
22:15 Spatial Domain Discussion
To avoid relying on signal processing theory, in (Part 5A), I gave the example of how a single display pixel can be translated into 3-D space (right). The problem is that a pixel the size of a physical pixel in the headset will always cover parts of four physical pixels. Worse yet, with the slightest movement of a person’s head, how much of each pixel and even which pixels will be constantly changing, causing temporal artifacts such as wiggling and scintillation. The only way to reduce the temporal artifacts is to soften (low pass filter) the image in the resampling process.
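The spatial-domain argument can be sketched with a toy bilinear-overlap calculation. The weights below illustrate the general resampling problem, not the AVP's actual (unpublished) resampling filter:

```python
# Toy spatial-domain sketch: a "virtual" pixel the size of a physical pixel,
# offset by a sub-pixel amount (dx, dy), overlaps up to four physical pixels.
# Generic bilinear overlap weights, not the AVP's actual resampling filter.

def bilinear_weights(dx, dy):
    """Overlap of a unit square offset by (dx, dy) with the four pixels it covers."""
    return [(1 - dx) * (1 - dy),  # upper-left pixel
            dx * (1 - dy),        # upper-right pixel
            (1 - dx) * dy,        # lower-left pixel
            dx * dy]              # lower-right pixel

# The split changes continuously with head movement, which is what drives
# wiggling and scintillation unless the image is softened.
for offset in (0.0, 0.25, 0.5):
    print(offset, [round(w, 3) for w in bilinear_weights(offset, offset)])
```

At a half-pixel offset, the virtual pixel's light is spread equally over four physical pixels (25% each), and any head motion shifts that split from frame to frame.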
23:19 Optics Distortion
In addition to the issues with representing a 2-D image in 3-D space, the AVP’s optics are highly distorting, as discussed in Apple Vision Pro’s (AVP) Image Quality Issues—First Impressions. The optical distortions can be “digitally corrected” but face the same resample issues discussed above.
25:51 Close-Up Center Crop and Foveated Boundary
The figures shown in this part of the video come from Apple Vision Pro’s (AVP) Image Quality Issues – First Impressions, and I will refer you to that article rather than repeat it here.
28:52 AVP’s Pancake Optics and Comparison to MQ3 and Birdbath
Much of this part of the video is covered in more detail in Apple Vision Pro’s (AVP) Image Quality Issues—First Impressions.
Using Eye Tracking for Optics Has Wider Implications
A key point made in the video is that the AVP’s optics are much more “aggressive” than Meta’s, and as a result, they appear to require dynamic eye tracking to work well. I referred to the AVP optics as being “unstable.” The AVP is constantly pre-correcting for distortion and color based on eye tracking. While the use of eye tracking for Foveated Rendering and control input is much discussed by Apple and others, using eye tracking to correct the optics has much more significant implications, which may be why the AVP has to be “locked” onto a person’s face.
Eye tracking for foveated rendering does not have to be very precise, but using it for optical correction does. This leads me to speculate that the AVP requires the facial interface to lock the headset to the face, which is horrible in terms of human factors, to support pre-correcting the optics. This follows my rule: “When smart people do something that appears dumb, it is because the alternative was worse.”
Comparison to (Nreal/Xreal) Birdbath
One part not discussed in the video or that article, but shown in the associated figure (below), is that pancake optics are similar to birdbath optics. Nreal (now Xreal) birdbath optics are discussed in my Nreal teardown series in Nreal Birdbath Overview.
Both pancake and birdbath optics start by polarizing the image from an OLED microdisplay. They use quarter waveplates to “switch” the polarization, causing it to bounce off a polarizer and then pass through it. They both use a 50/50 coated semi-mirror. They both use a combination of refractive (lens) and reflective (mirror) optics. In the case of the birdbath, the polarizer acts as a beam splitter to the OLED display so it does not block the view out, whereas with pancake optics, everything is inline.
31:34 AVP Color Uniformity Problem
The color uniformity and the fact that the color shift moves around with eye movement were discussed in Apple Vision Pro’s Optics Blurrier & Lower Contrast than Meta Quest 3.
32:11 Comparing Resolution vs a Monitor
In Apple Vision Pro’s Optics Blurrier & Lower Contrast than Meta Quest 3, I compared the resolution of the AVP (below left) to various computer monitors (below right) and the Meta Quest 3.
Below is a close-up crop of the center of the same image shown on the AVP, a 28″ monitor, and the Meta Quest 3. See the article for an in-depth explanation.
33:03 Vision OS 1.1 Change in MacBook mirror processing
I received and saw some comments on my Apple Vision Pro’s Optics Blurrier & Lower Contrast than Meta Quest 3 article saying that Vision OS 1.1 MacBook mirroring was sharper. I had just run a side-by-side comparison of displaying an image from a file on the AVP versus displaying the same image via mirroring a MacBook in Apple Vision Pro Displays the Same Image Differently Depending on the Application. So, I downloaded Vision OS 1.1 to the AVP and reran the same test, and I found a clear difference in the rendering of the MacBook mirroring (but not in the display of the image from the AVP file). However, it was not that the MacBook mirror image was sharper per se, but that it was less bold. This is visible even in the thumbnails below (click on them to see the full-size images): note how the text looks less bold on the right side of the left image (OS 1.1) versus the right side of the right image.
Below are crops from the two images above, with the OS 1.1 image on the top and OS 1.0 on the bottom. The MacBook mirroring comes from the right sides of both images. Note how much less bold the text and lines are in the OS 1.1 crop.
35:57 AVP Passthrough Cameras in the Wrong Location
38:43 AVP’s Optics are Soft/Blurry
As stated in Apple Vision Pro’s Optics Blurrier & Lower Contrast than Meta Quest 3, the AVP optics are a little soft. According to Marques Brownlee (see above) and others, my statement has caused controversy. I have heard others question my methods, but I have yet to see any evidence to the contrary.
I have provided my photographic evidence (right) and have seen it with my eyes by swapping headsets back and forth with high-resolution content. For comparison, the same image was displayed on the Meta Quest 3, and the MQ3 was clearly sharper. The “blur” on the AVP is similar to what one would see with a Gaussian blur with a radius of about 0.5 to 1 pixel.
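To get a feel for what a 0.5 to 1 pixel Gaussian blur does, a small sketch computes how much of a single pixel's light stays within that pixel after blurring. The sigma range is my estimate from the photographs, treating the stated blur radius as the Gaussian sigma; it is not a measured AVP point-spread function:

```python
import math

# Rough feel for a 0.5 to 1 pixel Gaussian blur: how much of one pixel's
# light stays in that pixel? (Sigma range is an estimate, not a measured
# AVP point-spread function.)

def gaussian_kernel(sigma, radius=3):
    """Normalized 1-D Gaussian kernel with taps at -radius..+radius pixels."""
    taps = [math.exp(-(i * i) / (2.0 * sigma * sigma))
            for i in range(-radius, radius + 1)]
    total = sum(taps)
    return [t / total for t in taps]

for sigma in (0.5, 1.0):
    k = gaussian_kernel(sigma)
    center = k[len(k) // 2]
    print(f"sigma {sigma}: {center:.1%} of a pixel's light stays in that pixel")
```

With a 1-pixel sigma, well over half of each pixel's light spills into its neighbors, which is consistent with the visibly softer edges in the photographs.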
Please don’t confuse “pixel resolution” with optical sharpness. The AVP has more pixels per degree, but the optics are a bit out of focus and, thus, a little blurry/soft. One theory is that it is being done to reduce the screen door effect (seeing the individual pixels) and make the images on the AVP look smoother.
The slight blurring of the AVP may reduce the screen door effect as the gap between pixels is thinner on the OLED displays than on the MQ3’s LCDs. But jaggies and scintillation are still very visible on the AVP.
41:41 Closing Discussion: “Did Apple Move the Needle?”
The video wraps up with Jason asking the open-ended question, “Did Apple Move the Needle?” I discuss whether it will replace a cell phone, home monitor(s), laptop on the road, or home TV. I think you can guess that I am more than skeptical that the AVP now or in the future will change things for more than a very small fraction of the people who use cell phones, laptops, and TVs. As I say about some conference demos, “Not everything that would make a great theme park experience is something you will ever want in your home to use regularly.”
Appendix: Rumor Mill’s 5,000 Nits Apple Vision Pro
When I searched the Internet to see if anyone had independently reported on the brightness of the AVP, I got the Google search answer in big, bold letters: “5,000 Nits” (right). Then, I went to the source of this answer, and it was none other than the MIT Technology Review. I then thought they must be quoting the display’s brightness, not the headset’s, but it reports that it is a “50-fold improvement over Meta Quest 2,” which is ridiculous.
I see this all the time when companies quote a spec for the display device, and it gets reported as the headset’s brightness/nits to the eye. The companies are a big part of the problem because most headset makers won’t give a number for the brightness to the eye in their specs. I should note that with almost all headset optics, the peak nits in the center will be much higher than in the periphery. Through the years, one thing I have found is that all companies exaggerate brightness in their marketing, whether in lumens for projectors or nits for headsets.
An LCOS or DLP display engine can output over a million nits into a waveguide, but that number is so big (almost never given) that it is not confused with the nits to the eye. Nits are a function of light output (measured in Lumens) and the ability to collimate the light (a function of the size of the light source and illumination optics).
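That relationship between flux, source size, and collimation can be made concrete: luminance (nits) is flux divided by the product of emitting area and beam solid angle, so a small, well-collimated source yields enormous nits. The numbers below are purely hypothetical, chosen only to show the scale:

```python
import math

# Hypothetical example: why a small projection engine can quote "millions of
# nits". Luminance (nits) = luminous flux / (emitting area x beam solid angle).
# All numbers below are made up for illustration, not any real engine's specs.

lumens = 150.0                    # hypothetical engine output (lm)
area_m2 = 0.005 ** 2              # hypothetical 5 mm x 5 mm exit aperture
cone_half_angle_deg = 15.0        # fairly well-collimated beam
solid_angle_sr = 2 * math.pi * (1 - math.cos(math.radians(cone_half_angle_deg)))

nits = lumens / (area_m2 * solid_angle_sr)
print(f"Luminance: {nits:,.0f} nits")  # tens of millions of nits
```

Even modest lumens from a tiny, collimated aperture land in the tens of millions of nits, which is why raw engine or display nits say little about what reaches the eye.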
The “5,000 nits” source was a tweet by Ross Young of DSCC. Part of the Tweet/X thread is copied on the right. A few respondents understood that this could not be the nits to the eye. Responder BattleZxeVR even got the part about the duty cycle being a factor, but that didn’t stop many later responders from getting it wrong.
Citing some other publications that didn’t seem to understand the difference between nits-in versus nits-out:
Quoting from The Daejeon Chronicles (June 2023): Apple Vision Pro Screens: 5,000 Nits of Wholesome HDR Goodness (with my bold emphasis):
Dagogo Altraide of ColdFusion has this to say about the device’s brightness capability:
“The screens have 5,000 nits of peak brightness, and that’s a lot. The Meta Quest 2, for example, maxes out at about 100 nits of brightness and Sony’s PS VR, about 265 nits. So, 5,000 nits is crazy. According to display analyst Ross Young, this 5,000 nits of peak brightness isn’t going to blind users, but rather provide superior contrast, brighter colors and better highlights than any of the other displays out there today.”
Quoting from Mac Rumors (May 2023): Apple’s AR/VR Headset Display Specs: 5000+ Nits Brightness for HDR, 1.41-Inch Diagonal Display and More:
With ~5000 nits brightness or more, the AR/VR headset from Apple would support HDR or high dynamic range content, which is not typical for current VR headsets on the market. The Meta Quest 2, for example, maxes out around 100 nits of brightness and it does not offer HDR, and the HoloLens 2 offers 500 nits brightness. Sony’s PSVR 2 headset has around 265 nits of brightness, and it does have an advertised HDR feature when connected to an HDR display.
The flatpanelshd article (June 2023), Apple Vision Pro: Micro-OLEDs with 3800×3000 pixels & 90/96Hz – a paradigm shift, did understand that the 5,000 nits referred to the display device and not the light to the eye:
DSCC has previously said that the micro-OLED displays deliver over 5000 nits of brightness but a good portion of that is typically lost due to the lenses and the display driving method.
As I wrote in Apple Vision Pro (Part 1) – What Apple Got Right Compared to The Meta Quest Pro, Snazzy Labs had an excellent explanation of the issues with the applications shown by Apple at the AVP announcement (it is a fun and informative video). In another otherwise excellent video, What Reviewers Aren’t Telling You About Apple Vision Pro, I have to give him credit for recognizing that the MIT Tech Review had conflated the display’s brightness with the headset’s brightness, but he then hazarded a guess that it would be, “after the optics, I bet it’s around 1,000 nits.” His guess was “just a bit outside” by about 10x. I do not want to pick on Snazzy Labs, as I love the videos I have seen from them, but I want to point out how much even technically knowledgeable people without a background in optics underestimate the light losses in headset optics.