Today, rather than shooting through AR/MR headsets, I pointed my camera skyward toward the solar eclipse. The forecast called for a lot of cloud cover in the Dallas area, and it looked bad less than two hours before the eclipse was to start. But as the eclipse neared, the clouds mostly parted, so I could get some good pictures in the gaps between passing clouds.
But the sun was still behind a cloud just as totality was about to begin. Fortunately, a few moments into totality, the clouds moved out of the way.
All the images were cropped without any scaling. Click on any image to see it at full size.
During the eclipse, I took pictures whenever the clouds were not blocking the view. At one point, not long after totality, there was a fairly dark low cloud along with some light, wispy clouds. They blocked enough sunlight that I could shoot without the sun filter yet still see the eclipse. The clouds had an interesting effect; it looked like a shot of a crescent moon through clouds.
Photography Info
For the camera people in the audience, the pictures were taken with a Canon R5 (45MP) camera with an RF 100-500mm lens at 500mm plus a 1.4x teleconverter (netting 700mm). That is about half what it would take to get the full sun and corona to fill the frame, so all the pictures below are cropped (except for the last picture with the cloud cover). Most of the pictures were shot on a tripod with a “geared head” (the same one I use to line up the camera to shoot through AR and MR headsets), so once I had the sun lined up in the camera’s LCD display, I only had to turn a couple of knobs occasionally. Except during totality and a few shots when the clouds were blocking, I used a sun filter (silver-black polymer). All shots were taken with exposure bracketing (3 shots at different exposures).
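As a rough sanity check on that focal-length claim, here is a small sketch (mine, not from the original post; the ~0.53° solar angular diameter and 24mm full-frame sensor height are my assumptions):

```python
import math

def image_size_mm(focal_mm, angle_deg):
    """Linear size on the sensor of an object with the given angular size."""
    return focal_mm * math.tan(math.radians(angle_deg))

SUN_DIAMETER_DEG = 0.53    # apparent angular diameter of the sun (assumption)
SENSOR_HEIGHT_MM = 24.0    # full-frame sensor short side (assumption)

# At 700mm, the sun's disk covers only ~6.5mm of the 24mm sensor height;
# roughly double the focal length (leaving room for the corona) would fill the frame.
print(f"{image_size_mm(700, SUN_DIAMETER_DEG):.1f} mm")  # ~6.5 mm
```

This is consistent with "about half what it would take": at 1400mm, the sun's disk alone would cover about 13mm, with the corona extending well beyond the disk.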
Update 1/28/2024 – Based on some feedback from Nimo Planet, I have corrected the description of their computer pod.
Introduction
The “theme” for this article is companies I met with at CES with optical see-through Augmented and Mixed Reality using OLED microdisplays.
I’m off to SPIE AR/VR/MR 2024 in San Francisco as I release this article. So, this write-up will be a bit rushed and likely have more than the usual typos. Then, right after I get back from the AR/VR/MR show, I should be picking up my Apple Vision Pro for testing.
Xreal
Xreal (formerly Nreal) says they shipped 350K units in 2023, more than all other AR/MR companies combined. They had a large booth on the CES floor, which was very busy. They had multiple public and private demo stations.
The birdbath optical architecture that Xreal still uses inherently blocks about 70% of the real-world light, acting like moderately dark sunglasses. About 10% of the display’s light makes it to the eye, which is much more efficient than waveguides, although waveguides are much thinner and more transparent. Xreal claims their newer designs support up to 500 nits, meaning the Sony Micro-OLEDs must output about 5,000 nits.
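The nits arithmetic above can be sketched in a couple of lines (the ~10% efficiency figure is from the paragraph; the function name is my own):

```python
def required_display_nits(eye_nits, optical_efficiency):
    """Display brightness needed to hit a target at-eye brightness."""
    return eye_nits / optical_efficiency

# Birdbath optics: ~10% of the display's light reaches the eye,
# so 500 nits at the eye implies ~5,000 nits from the Micro-OLED.
print(required_display_nits(500, 0.10))  # 5000.0
```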
With investment, volume, and experience, Xreal has improved its optics and image quality. Still, it can’t do much about the inherent limitations of a birdbath, particularly its low transparency. Xreal recently added an LCD dimming shutter to selectively block more or all of the real-world light in the new Xreal Air 2 Pro and the latest Air 2 Ultra, which I was given a demo of at CES.
The earlier Xreal/Nreal headsets were little more than 1920×1080 monitors you wore with a USB-C connection for power and video. Each generation has added more “smarts” to the glasses. The Air 2 Ultra includes dual 3-D IR camera sensors for spatial recognition. Xreal and (to be discussed later) Nimo, among others, have already picked up on Apple’s “Spatial Computing,” referring to their products as affordable ways to get into spatial computing.
Most of the newer headsets support multiple virtual monitors, either via a cell phone or Xreal’s “Beam” compute module, which can mirror or cast one or more virtual displays from a computer, cell phone, or tablet. While there may be multiple virtual monitors, they are still rendered on a 1920×1080 display device. I believe (I forgot to ask) that Xreal uses internal sensors to detect head movement to virtualize the monitors.
Xreal’s Air 2 Ultra demo showcased the new spatial sensors’ ability to recognize hand and finger gestures. Additionally, the sensors could read “bar-coded” dials and slides made from cardboard.
BMW AR Ride Concept (Using Xreal Glasses)
In addition to seeing Xreal devices on their own, I was invited by BMW to take a ride trying out their augmented reality HUD on the streets around the convention center. A video produced by BMW shows a slightly different, abbreviated trip. I should emphasize that this is just an R&D demonstration, not a product that BMW plans to introduce. BMW also made clear that they would be working with other headset makers, but that Xreal’s was the most readily available.
To augment using the Xreal glasses, BMW mounted a head-tracking camera under the rearview mirror. This allows BMW to lock the generated image to the physical car. Specifically, it allowed them to (selectively) block/occlude parts of the virtual image hidden behind the front A-pillar of the car. Not shown in the pictures from BMW below (click on a picture to see it bigger) is that the images would start in the front window, be hidden by the A-pillar, and then continue in the side window.
BMW’s R&D is looking at AR glasses use by both the driver and passengers. They discussed having different content for the driver, which would have to be simplified and more limited than what they could show a passenger. There are many technical and government/legal issues (all 50 U.S. states have different laws regarding HUD displays) with supporting headsets on drivers. From a purely technical perspective, a head-worn AR HUD has many advantages and some disadvantages versus a fixed HUD on the windshield or a dash combiner (too much to get into in this quick article).
Ocutrx (for Low-Vision and other applications)
Ocutrx’s OcuLenz also uses “birdbath” optics. The OcuLenz was originally designed to support people with “low vision,” especially those with macular degeneration and other eye problems that block parts of a person’s vision. People with macular degeneration lose the high-resolution, high-contrast, and color-sensitive parts of their vision. They must rely on other parts of the retina, commonly called peripheral vision (although it may include more than just what is technically considered peripheral vision).
A low-vision headset must have a wide FOV to reach the outer parts of the retina. It must magnify, increase color saturation, and improve contrast beyond what a person with normal vision would want. Note that while these people may be legally blind, they can still see, particularly with their peripheral vision. This is why a headset that still lets them use their peripheral vision is important.
About 20 million people in the US alone have what is considered “low vision,” and roughly another million develop low vision each year as the population ages. It is the biggest identifiable market I know of today for augmented reality headsets. But there is a catch that must be addressed before this market can be served. By the very nature of their condition, people with low vision, who are often elderly, need a lot of professional help while frequently living on a fixed or limited income. Unfortunately, private and government (Medicare/Medicaid) insurance rarely cover either the headset cost or the professional support required. There have been bills before Congress to change this, but so far, nothing has happened of which I am aware. Without a way to pay for the headsets, the volumes are low, which makes the headsets more expensive than they need to be.
In the past, I have reported on Evergaze’s seeBoost, which exited this market while developing its second-generation product for the economic reasons (lack of insurance coverage) above. I also discussed NuEyes with Bradley Lynch in a video after AWE 2022. The economic realities of the low-vision market cause companies like NuEyes and Ocutrx to look for other business opportunities for their headsets. It is frustrating to know that technology could help so many people. I hope to cover this topic in more detail in the future.
Nimo Planet (Nimo)
Nimo Planet (Nimo) makes a small computer that acts as a spatial mouse pointer for AR headsets with a USB-C port for power and video input. It replaces the need for a cell phone and can mirror/cast video from other devices to the headset. Moreover, the Nimo Core is a fully standalone computer running Nimo OS, which simultaneously supports Android, Web, and Unity apps; no other host computer is needed.
According to Nimo, every other multi-screen solution on the market is built on web platforms or as a Unity app, which limits them to running only web views. Nimo OS implements a new stereo rendering and multi-window architecture in AOSP to run multiple Android, Unity, and Web apps simultaneously.
Sightful’s device is similar to Nimo’s in some ways. With Sightful, the computer is built into the keyboard and touchpad, making it a full-fledged computer. Alternatively, Sightful can be viewed as a laptop where the display is a pair of AR glasses rather than a flat panel.
Like Nimo, Xreal’s Beam, and many other new mixed reality devices, Sightful supports multiple windows. I don’t know whether it has cameras for 3-D spatial sensing; I suspect it uses internal motion sensors to track head movement.
Sightful’s basic display specs resemble other birdbath AR glasses designs from companies like Xreal and Rokid. I have not had a chance, however, to compare them seriously.
LetinAR
I have been writing about LetinAR since 2018. LetinAR started with a “Pin Mirror” type of pupil replication. They have now moved on to a series of what I will call “horizontal slat pupil replicators.” These also use total internal reflection (TIR) and a curved mirror to move the focus of the image from an OLED microdisplay before it reaches the various pupil-expanding slats.
While LetinAR’s slat design improves image quality over its earlier pin mirrors, it is still imperfect. When looking through the lenses (without a virtual image), the view is a bit “disturbed” and seems to have diffraction-line effects. Similarly, you can perceive gaps or double images depending on your eye location and movement. LetinAR continues to work on improving this technology. While their image quality is not as good as that of the birdbath designs, they offer much better transparency.
LetinAR seems to be making progress with multiple customers, including Jorjin, which was demonstrating in the LetinAR booth; Sharp, which had a big demonstration in its booth (while Sharp didn’t say whose optics were in the demo, they were obviously LetinAR’s – see pictures below); and the headset discussed above by Nimo.
Conclusions
Sorry, there is no time for major conclusions today. I’m off to the AR/VR/MR Conference and Exhibition.
I will note that regardless of the success of the AVP, Apple has already succeeded in changing the language of augmented and mixed reality. In addition to almost everyone in AR and mixed reality talking about “AI,” many companies now use “Spatial Computing” to refer to their products in their marketing.
As I wrote last time, I met with nearly 40 companies at CES, of which 31 I can talk about. This time, I will go into more detail and share some photos. I picked the companies for this article because they seemed to link together. The Sony XR headset and how it fit on the user’s head was similar to the newer DigiLens Argo headband. DigiLens and the other companies had diffractive waveguides and emphasized lightweight and glass-like form factors.
I would like to remind readers of my saying that “all demos at conferences are magic shows,” something I warn about near the beginning of this blog’s Cynics Guide to CES – Glossary of Terms. I generally no longer try to take “through the optics” pictures at CES. It is difficult to get good representative photos in the short time available, with all the running around and without all the proper equipment. I made an exception for the TCL color MicroLED glasses, as the pictures readily came out better than expected. But at the same time, I was only using test images provided by TCL and not test patterns that I selected. Generally, the toughest test patterns (such as those on my Test Pattern Page) are simple. For example, if you put up a solid white image and see color in the white, you know something is wrong. When you put up colorful pictures with a lot of busy detail (like the colorful parrot in the TCL demo), it is hard to tell what, if anything, is wrong.
Sony XR and DigiLens Headband Mixed Reality (with contrasts to Apple Vision Pro)
Sony XR (and others compared to Apple Vision Pro)
This blog has expressed concerns about the Apple Vision Pro’s (AVP) poor mechanical ergonomics, its complete blocking of peripheral vision, and the terrible placement of its passthrough cameras. My first reaction was that the AVP looked like it was designed by a beginner with too much money and an emphasis on style over functionality. What I consider Apple’s obvious mistakes seem to be addressed in the new Sony XR headset (SonyXR).
The SonyXR shows much better weight distribution, with (likely) the battery and processing moved to a back “bustle” and a rigid frame to transfer the weight for balance. It has been well established with designs such as the Hololens 2 and Meta Quest Pro that this type of design leads to better comfort. Moving a significant amount of the power dissipation to the back also improves heat management by adding a second surface to radiate heat.
The bustle on the back design also avoids the terrible design decision by Apple to have a snag hazard and disconnection nuisance with an external battery and cable.
The SonyXR is shown with enough eye relief to wear typical prescription glasses. This will be a major advantage in many potential XR/MR uses, making the headset easier to share between users. This is particularly important for use cases that are occasional or one-time (e.g., museum tours and other special events). Supporting enough eye relief for glasses is more optically difficult and requires larger optics for the same field of view (FOV).
Another major benefit of the larger eye relief is that it allows for peripheral vision. Peripheral vision is considered to start at about 100 degrees, roughly where a typical VR headset’s FOV stops. While peripheral vision is low in resolution, it is sensitive to motion; it alerts a person to movement so they will turn their head. The saying goes that peripheral vision evolved to keep humans from being eaten by tigers. Translated to the modern world, it keeps you from being hit by moving machinery or running into things that might hurt you.
Another good feature shown in the Sony XR is the flip-up screen. There are so many times when you want to get the screen out of your way quickly. The first MR headset I used that supported this was the Hololens 2.
Another feature of the Hololens 2 is the front-to-back head strap (optional but included). Longtime VR gamer and YouTube personality Brad Lynch of the SadlyItsBradley YouTube channel has tried many VR-type headsets and optional headbands/straps. Brad says that front-to-back straps/pads generally provide the most comfort with extended use. Side-to-side straps, such as on the AVP, generally don’t provide the support where it is needed most. Brad has also said that while a forehead pad, such as on the Meta Quest Pro, helps, headset straps (which are not directly supported on the MQP) are still needed. It is not clear whether the Sony XR headset will have over-the-head straps. Even companies that support/include overhead straps generally don’t show them in the marketing photos and demos as they mess up people’s hair.
The SonyXR cameras are located closer to the user’s eyes. While there are no perfect placements for the two cameras, the further they are from the actual location of the eyes, the more distortion will be caused for making perspective/depth-correct passthrough (for more on this subject, see: Apple Vision Pro Part 6 – Passthrough Mixed Reality (PtMR) Problems).
Lynx also used the headband with a forehead pad, with the back bustle and flip-up screen. Lynx also supports enough eye relief for glasses and good peripheral vision and locates their passthrough cameras near where the eye will be when in use. Unfortunately, I found a lot of problems with the optics Lynx chose for the R1 by the optics design firm Limbak (see also my Lynx R1 discussion with Brad Lynch). Apple has since bought Limbak, and it is likely Lynx will be moving on with other optical designs.
Digilens Argo New Head Band Version at CES 2024
I wrote a lot about the DigiLens Argo in last year’s coverage of CES and the AR/VR/MR conference in DigiLens, Lumus, Vuzix, Oppo, & Avegant Optical AR (CES & AR/VR/MR 2023 Pt. 8). In the section Skull-Gripping “Glasses” vs. Headband or Open Helmet, I discussed how DigiLens had missed an opportunity for both comfort and supporting the wearing of glasses. DigiLens said they took my comments to heart and developed a variation with the rigid headband and flip-up display shown in their suite at CES 2024. DigiLens said this version lets them expand their market (and no, I didn’t get a penny for my input).
The Argo is light enough that it probably doesn’t need an over-the-head band for extra support. If the headband were a ground-up design rather than a modular variation, I would have liked to see the battery and processing moved to a back bustle.
While on the subject of DigiLens, they also had a couple of nice static displays. Pictured below are variations in waveguide thickness they support. Generally, image quality and field of view can be improved by supporting more waveguide layers (with three layers supporting individual red, green, and blue waveguides). DigiLens also had a static display using polarized light to show different configurations they can support for the entrance, expansion, and exit gratings (below right).
Vuzix
Vuzix has been making wearable heads-up displays for about 26 years and has a wide variety of headsets for different applications. Vuzix has been discussed on this blog many times. Vuzix primarily focuses on lightweight and small form factor glasses and attachments with displays.
Vuzix Ultralite Sport (S) and Forward Projection (Eye Glow) Elimination
New this year at CES was Vuzix’s Ultralite Sport (S) model. In addition to being more “sporty” looking, its waveguides are designed to eliminate forward projection (commonly referred to as “eye glow”). Eye glow was famously an issue with most diffractive waveguides, including the Hololens 1 & 2 (see right), Magic Leap 1 & 2, and previous Vuzix waveguide-based glasses.
If the waveguides are canted (tilted) while the exit gratings are still designed to project to the eye, the forward projection is redirected downward at twice the cant angle. Thus, with only a small tilt of the waveguides, the forward projection lands far below the eye line of onlookers (unless they are on the ground).
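The geometry here is the standard mirror-tilt rule: tilting a reflective (or diffractive) surface by θ deflects its output ray by 2θ. A tiny sketch (function name and the 5-degree example are my own):

```python
def eye_glow_deflection_deg(cant_deg):
    """Tilting a reflective surface by theta deflects its reflected ray by 2*theta."""
    return 2.0 * cant_deg

# A modest 5-degree cant sends the forward "eye glow" 10 degrees downward,
# well below the eye line of onlookers.
print(eye_glow_deflection_deg(5))  # 10.0
```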
Ultra Light Displays with Audio (Vuzix/Xander) & Solos
Last year, Vuzix introduced its lightweight (38-gram) Z100 Ultralite, which uses a 640×480 green (only) MicroLED microdisplay. Xander, using Vuzix’s lightweight Z100, has developed speech-to-text captioning glasses for people with hearing difficulties (Xander was in the AARP booth at CES).
While a green-only display with low resolution by today’s standards is not something you will want to watch movies on, there are many uses for a limited amount of text and graphics in a lightweight, small form factor. For example, I got to try Solos audio glasses, which, among other things, use ChatGPT to do on-the-fly language translation. It’s not hard to imagine that a small display showing text could complement what is heard on Solos and similar products, including the Amazon Echo Frames and the Ray-Ban Meta Wayfarer.
Mojie (Green) MicroLED with Plastic Waveguide
Like the Vuzix Z100, the Mojie (a trademark of Meta-Bounds) uses green-only Jade Bird Display 640×480 MicroLEDs with waveguide optics. The big difference is that Mojie, along with the Oppo Air 2 and Meizu MYVU, uses Meta-Bounds resin plastic waveguides. Unfortunately, I didn’t get to the Mojie booth until near closing time at CES, but they were nice enough to give me a short demo. Overall, in weight and size, the Mojie AR glasses are similar to the Vuzix Z100, but I didn’t have the time or demo content to judge image quality. Generally, resin plastic diffractive waveguides to date have had lower image quality than those on a glass substrate.
I have no idea what resin plastic Meta-Bounds uses or if they have their own formula. Mitsui Chemicals and Mitsubishi Chemicals, both of Japan, are known to be suppliers of resin plastic substrate material.
Everysight
ELBIT F35 Helmet and Skylens
Everysight (the company, not the front eye display feature on the Apple Vision Pro) has been making lightweight display glasses, primarily for sports, since about 2018. Everysight spun out of ELBIT, a major defense (including the F35 helmet HUD) and commercial products company. Recently, ELBIT had its AR glasses HUD approved by the FAA for use in the Boeing 737NG series. Everysight uses an optical technology I call “precompensated off-axis.” Everysight (and ELBIT) have an optics engine that projects onto a curved front lens with a partial mirror coating. The projector’s precompensation optics correct for the distortion from hitting a curved mirror off-axis.
The Everysight/ELBIT technology is much more optically efficient than waveguide technologies and more transparent than “birdbath” technologies (the best-known birdbath product today being Xreal’s). The amount of display light versus transparency is a function of the semi-transparent mirror coating. The downside of the Everysight optical system in small-form glasses is that the FOV and eyebox tend to be smaller. The new Everysight Maverick glasses have a 22-degree FOV and produce over 1,000 nits using a 5,000-nit, 640×400-pixel, full-color Sony Micro-OLED.
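From the published numbers, one can back out the implied optical efficiency (my own back-of-envelope calculation, not an Everysight figure):

```python
def optical_efficiency(eye_nits, display_nits):
    """Fraction of the display's light that reaches the eye, implied by specs."""
    return eye_nits / display_nits

# Everysight Maverick: >1,000 nits at the eye from a 5,000-nit Micro-OLED -> ~20%,
# versus roughly 10% for the birdbath designs discussed earlier.
print(optical_efficiency(1000, 5000))  # 0.2
```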
The front lens/mirror elements are inexpensive and interchangeable. But the most technically interesting thing is that Everysight has figured out how to build prescriptions into the front lens. They use a “push-pull” optics arrangement similar to some waveguide headsets (most notably the Hololens 1 & 2 and Magic Leap). The optical surface on the eye side of the lens corrects the virtual display for the user’s eye, and the outside surface of the lens is curved to provide the remaining vision correction for the real world.
TCL RayNeo X2 and RayNeo X2 Lite
I got some good photos through TCL’s RayNeo X2 and the RayNeo X2 Lite. While the two products sound very similar, the image quality of the “Lite” version, which switched to Applied Materials (AMAT) diffractive waveguides, was dramatically better.
The older RayNeo X2s were available to see on the floor and had problems, particularly with the diffraction gratings capturing stray light and the general color quality. I was given a private showing of the newly announced “Lite” version using the AMAT waveguides, and not only were they lighter, but the image quality was much better. The picture on the right below shows the RayNeo X2 (with an unknown waveguide) on the left that captures the stray overhead light (see streaks at the arrows). The picture via the Lite model (with the AMAT waveguide) does not exhibit these streaks, even though the lighting is similar. Although hard to see in the pictures, the color uniformity with the AMAT waveguide also seems better (although not perfect, as discussed later).
Both RayNeo models use three separate Jade Bird Display red, green, and blue MicroLEDs (inorganic) with an X-cube color combiner. X-cubes have long been used in larger 3-panel LCD and LCOS projectors and are formed from four prisms with different dichroic coatings glued together. Jade Bird Display has been demoing this type of color combiner since at least AR/VR/MR 2022 (above). Having worked with 3-panel LCOS projectors in my early days at Syndiant, I know the difficulties of aligning three panels to an X-cube combiner. This alignment is particularly difficult at the size of these MicroLED displays and their small pixels.
I must say that the image quality of the TCL RayNeo X2 Lite exceeded my expectations. Everything seems well aligned in the close-up crop from the same parrot picture (below). Also, there seems to be relatively good color without the wide pixel-to-pixel brightness variation I have seen in past MicroLED displays. While this is quite an achievement for a MicroLED system, the RayNeo X2 Lite has only a modest 640×480 display with a 30-degree diagonal FOV. These specs work out to about 26 pixels per degree, or about half the angular resolution of many other headsets. The picture below was taken with a Canon R5 with a 16mm lens, which, as it turns out, has resolving power close to good human vision.
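The ~26 pixels-per-degree figure follows from the pixel counts and the diagonal FOV (a quick sketch of mine; the diagonal-based formula is a common approximation that ignores distortion across the field):

```python
import math

def pixels_per_degree(h_px, v_px, diag_fov_deg):
    """Angular resolution estimated from pixel counts and diagonal FOV."""
    diag_px = math.hypot(h_px, v_px)  # 640x480 -> 800 pixels on the diagonal
    return diag_px / diag_fov_deg

print(f"{pixels_per_degree(640, 480, 30):.1f} ppd")  # ~26.7 ppd
```

For comparison, many current MR headsets land in the 40-plus pixels-per-degree range, which is where the "about half" comparison comes from.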
Per my warning in the introduction, all demos are magic shows. I don’t know how representative this prototype will be of units in production, and perhaps most importantly, I did not try my test patterns but used the images provided by TCL.
Below is another picture of the parrot taken against a darker background. Looking at the wooden limb under the parrot, you will see it is somewhat reddish on the left and greenish on the right. This might indicate color shifting due to the waveguide, as is common with diffractive waveguides. Once again, taking quick pictures at shows (all these were handheld) and without controlling the source content, it is hard to know. This is why I would like to acquire units for extended evaluations.
The next two pictures, taken against a dark background and a dimly lit room, show what I think should be a white text block on the top. But the text seems to change from a reddish tint on the left to a blueish tint on the right. Once again, this suggests some color shifting across the diffractive waveguide.
Below is the same projected image taken with identical camera settings but with different background lighting.
Below is the same projected flower image with the same camera settings and different lighting.
Another thing I noticed with the Lite/AMAT waveguides is significant front projection/eye glow. I suspect this will be addressed in the future, as has been demonstrated by DigiLens, Dispelix, and Vuzix, as discussed earlier.
Conclusions
The Sony XR headset seems to fix many of the beginner mistakes Apple made with the AVP. With the Argo last year, DigiLens seemed caught between being a full-featured headset and a glasses form factor. The new Argo headband seems like a good industrial form factor that lets people wear normal glasses and flip the display out of the way when desired.
Vuzix, with its newer Ultralite Z100 and Sport models, seems to be emphasizing lightweight functionality. Vuzix and the other waveguide AR glasses makers have not given a clear path for supporting people who need prescription glasses. The most obvious approach is some form of “push-pull,” with a lens before and after the waveguide. Luxexcel had a way to 3-D print prescription push-pull lenses, but Meta bought them. Add Optics (formed by former Luxexcel employees) has another approach using 3-D printed molds. Everysight addresses prescription lenses with the somewhat different push-pull approach that its optical design necessitates.
While not perfect, the TCL color MicroLED glasses, at least in the newer “Lite” version, were much better than I expected. At the same time, one has to recognize that the resolution, FOV, and color uniformity are still not up to some other technologies; to appreciate the result, one has to recognize the technical difficulty. I also want to note that Vuzix has said it is also planning color MicroLED glasses with three microdisplays, but it is not clear whether they will use an X-cube or a waveguide combiner approach.
The moderate success of smart audio glasses may be pointing the way for these ultra-light glasses form factor designs for a consumer AR product. One can readily see where adding some basic text and graphics would be of further benefit. We will know if this category has become successful if Apple enters this market 😁.
I want my readers to know about my first article on Display Daily as a “Senior Analyst” and the article I wrote for the March/April issue of SID’s Information Display. I will be speaking at and attending the upcoming AWE 2023 conference from May 31st through June 2nd. I also recently recorded another AR Show podcast with Jason McDowell, which should be published in a few weeks.
I also wanted you to know I have a lot of travel planned for May, so there may not be much, if anything, published on this blog in May. But I have several articles in the works for this month and should have more to discuss in June.
Display Daily Article On Apple – New “Senior Analyst”
Display Daily, a division of Jon Peddie Research, has just put out an article by me discussing the long-rumored Apple Mixed Reality headset. In some ways, it follows up an article I wrote for Display Daily in 2015 titled VR and AR Head Mounted Displays – Sorry, but there is no Santa Claus.
Display Daily and I are looking at joint written and video projects. I will be teaming up with Display Daily as a “Senior Analyst” on these new projects while continuing to publish this blog.
SID Information Display Magazine’s March/April 2023 Article, “The Crucial Role of Optics in AR/MR”
I was asked to contribute an article to SID’s Information Display Magazine’s printed and online March/April 2023 issue.
The article (available for free download) discusses the most common types of optics and displays used in mixed reality today and what I see as the technologies of the future.
Attending and Presenting at AWE 2023
AWE has been the best conference for seeing a wide variety of AR, VR, and MR headsets for many years. While I mostly spend my time on the show floor and in private meetings to see the “good stuff,” I have been invited to give a presentation this year. The topic of the presentation will be the pros and cons of optical versus video passthrough mixed reality. The conference runs from May 31st to June 2nd. I will be presenting at 9:00 AM on June 2nd.
My Long History with Display Daily (20+ Years) and even Longer with Jon Peddie (40+ Years)
I’ve been interacting with Display Daily and its former parent company, Insight Media, headed by Chris Chinnock (still a Display Daily contributor), since I left Texas Instruments to work on LCOS display devices in 1998. Meko, headed by Bob Raikes, took over Display Daily in 2014, and then in late 2022, Jon Peddie Research acquired it.
I have known market analyst Jon Peddie since the mid-1980s, when I was the chief architect of the TMS34010, the world’s first fully programmable graphics processor, and led the definition of other graphics devices, including the first video DRAM. Jon suggested we work together on some projects, and I have become a Senior Analyst at Display Daily.