
Meta beats suit over tool that lets Facebook users unfollow everything

Meta has defeated a lawsuit—for now—that attempted to invoke Section 230 protections for a third-party tool that would have made it easy for Facebook users to toggle on and off their news feeds as they pleased.

The lawsuit was filed by Ethan Zuckerman, a professor at University of Massachusetts Amherst. He feared that Meta might sue to block his tool, Unfollow Everything 2.0, because Meta threatened to sue to block the original tool when it was released by another developer. In May, Zuckerman told Ars that he was "suing Facebook to make it better" and planned to use Section 230's shield to do it.

Zuckerman's novel legal theory argued that Congress always intended for Section 230 to protect third-party tools designed to empower users to take control over potentially toxic online environments. In his complaint, Zuckerman tried to convince a US district court in California that:



Apple upgrades MacBook Pro with M4 chips

Apple on Wednesday wrapped up Mac week (well, Mac half week) by introducing an updated MacBook Pro. Apple’s most premium laptop is catching up to its brethren with the addition of M4 chips. The Pro and mini are the first two Macs getting the new chip. The Pro will also be the first to sport the […]

© 2024 TechCrunch. All rights reserved. For personal use only.

Apple announces M4 Max chip, debuting on the MacBook Pro

Apple wrapped up a half week of Mac announcements Wednesday by debuting the latest addition to the M-series of chips. A day after announcing the M4 Pro alongside the tiny new Mac mini, the company is showcasing the M4 Max, which is coming to the MacBook Pro line. Like the other members of the M4 […]


Just shy of a major deal

Couple of fun things: 

1. Over a quarter of a million people now subscribe to my newsletter

2. I sold my next book: 

These things aren’t unrelated: over the past decade, the newsletter has turned into a wonderful playground for me: a place where I can work out my ideas, share what I love, and show my work while I come up with the next thing…

Collective creativity


One of the diary-like joys of the Friday newsletter is getting to sit down after a week and figure out if the things in my life have been speaking to each other in any particular way.

Usually, the week is a miscellany — if not cacophony — but often a theme appears.

That theme this week is “collective creativity,” brought about by reading about Prince, jazz, and the work of being in a band. It’s a dense one, and good, I think.

Read it here.

The comfort of drawing Batman

In today’s newsletter, I write about spending half of a flight to Honolulu drawing a comic while freeze-framing Tim Burton’s Batman:

Planes are excellent places to work, but they’re also excellent places to zone out and to play or do “comfort work” — what I’m calling the creative work we return to when we don’t know what else to do.

Drawing Batman, it turns out, is a great comfort to me!

A reader commented that they’d love to sit across from me on a plane, and it suddenly occurred to me that I left out a huge inspiration from the newsletter: I was sitting on the plane diagonally from a kid drawing, which is what made me get out my diary in the first place!

Here are a few blind contour drawings I made of the kid:

And what I wrote in my diary underneath:

there’s a little kid across the aisle from me who has the most chaotic little marker box and I love it. just scribbling little drawings w/ what looks like EXPO markers and crayons and all kinds of random stuff…

Since the letter takes a turn into kids and the aliveness in the lines that they draw, I can’t believe I left out this detail. But that’s what’s so great about putting work in front of people — the minute you do, you remember everything you left out.

Read the whole letter here: “The comfort of drawing Batman”

Anticipation and recall

I will often map out a Tuesday newsletter in my notebook, forget I made a map, and write it without my notes. Then when I go back flipping through my notebook, I discover everything I left out!

Today’s newsletter is about messing around with anticipation and recall to stretch out pleasant events and minimize unpleasant ones.

On the unpleasant side, I left out one of my favorite parts of the section of Katherine Morgan Schafler’s The Perfectionist’s Guide to Losing Control that inspired the letter:

We justify agreeing to get coffee with someone whom we don’t really want to see by saying something like, “It’ll just be half an hour and then I’ll leave.” No. It’ll be the anticipatory anxiety for the week leading up to that half hour, the half hour itself, and then the negative recall of how you felt annoyed and immediately resentful upon sitting down, didn’t want to be there, and couldn’t believe she said that, even though she always says stuff like that, and that’s why you don’t like hanging out with her in the first place….When it comes to agreeing to engage in events we don’t want to engage in, there’s nothing quick about quick catch-up drinks or quick calls or quick meetings.

This adds a layer to the question to ask yourself to avoid accepting invitations you’ll later regret: “Would I do it tomorrow?”

The time travel involved in this calculation is already tricky — who knows how I’ll feel about doing something five minutes from now, let alone five months from now? But if you think about the time leading up to the event and the time coming down from it, suddenly such obligations reveal their bloated shape. 

(“The job never kills anybody,” says John Taylor of Duran Duran. “It’s the fucking stuff you do in between.”)

On the pleasant side, I was reminded of how important it is to have something to look forward to, no matter how silly.

All of this, by the way, is a form of playing with your experience of time: by exploiting anticipation and recall, you’re trying to effectively slow down and speed up certain events, and using your memory to shape the story you want to tell about your experience. 

You can read the whole newsletter here.

Drawing Eno

Yesterday’s newsletter, “Drawing Eno,” was inspired by seeing Gary Hustwit’s new film Eno and how I’ve been drawing Brian Eno lectures and interviews for over 15 years. Here’s a drawing I made from the generative version of the movie I saw:

You can read the rest of the newsletter here.

Notes on travel

Friday’s newsletter was inspired by our recent trip to New Mexico.

It ended on this note about travel:

I am a big believer that travel doesn’t relieve your problems, it throws them into relief. You see your life in a new light and new shadows. The desert light can be good for this. At its peak, it is harsh and unforgiving, but at dusk and dawn it softens, becomes more mysterious. Every trip has its challenges, but I returned home, as I often do, with a sense of perspective and a clarity about what I want to do next. What more could one ask for? (“Go away so you can come back.”)

What I liked most about New Mexico was being in the forests and the deserts outside of town.

In Benjamin Labatut’s The Maniac, a fictional Richard Feynman says:

Los Alamos was high up on a mesa with tall cliffs carved in dark red earth, lots of trees and shrubs all around. The landscape was breathtaking, the most beautiful place I’d ever seen. Coming from New York, I’d never traveled out to the West before, so I really felt like I was in another world. In Mars or something. It had the strange energy of a sacred space, a haven far away from the civilized world, away from prying eyes, farther than God could see. The perfect spot to do the unimaginable.

Read more in “The Land of Enchantment.”

Four Tet on making music

Four Tet’s Three is one of my favorite albums of the year, so I was delighted to come across an interview with Kieran Hebden on the Tape Notes podcast discussing its making. He rarely gives interviews, so before listening, I really knew nothing about him or how he works. It was a delight to hear about the making of a record I’ve spent so much time with. 

Four Tet’s music is extra special to me because my 11-year-old composer and I both love it — I put “Loved” on my February mixtape and Owen put “Lush” on the mixtape we collaborated on this month. It was wild to me to hear Hebden describe how he works in Ableton, drawing the notes on the piano roll instead of playing them on the keyboard. (Something I see Owen do a ton when he’s composing.)

I really loved Hebden’s attitude towards making music after many decades. He says that if he can stay excited about listening to music and enjoy the making of it while also avoiding the trappings of success and the bog of the industry, it actually makes the work more successful. Just a wonderful listen.

When he was asked about his most important piece of equipment, he said his hi-fi system because it’s what helps him listen to music in a level of detail that helps him really explore and hear sounds. (Check out the gigantic ongoing Spotify playlist of what he’s listening to.)

This emphasis on listening came up over and over again in the interview, and I wanted to copy down his advice to other musicians: Listen to more music.

“Listening to a lot of music and really exploring it and doing that level of investigation of really understanding where things have come from.”

He then describes swimming upstream:

If you listen to a current record now that samples an old nineties record, and then you check out the old nineties record, find out that sample’s like an old soul record for the drum break or whatever.

And then you go listen to the old soul record and then you find out who the drummer was who played that drum break. And it’s like, oh, it’s Bernard Purdie or whatever.

And then you look on Wikipedia and check out all the other records he made. And then you’re like, oh, he worked with this producer a lot and you check out what that producer did.

To listen to music in that way and explore it and study it, I think is hugely valuable in terms of learning how to be a good arranger, a good producer, a good musician. The more you take in of understanding the sort of like great music that’s out there and the things that came before, it’s so powerful.

Everything’s there, all the information’s there. And then if you take everything you learn from that and then combine it with your own ideas and your own emotions and stuff, then you sort of set up to sort of push things forward. I think that’s much more useful than spending all your time being like, I’m just gonna be learning what every single thing in Ableton does now for the next few months…

You’ve got to love records so much, he says, that you want to make something that can sit on a shelf alongside the records you love.

It’s a lesson that is true for all creative people: Your output depends on your input.

If you want to be a great musician, you need to listen to more great music. If you want to write great books, you need to read more great books. If you want to make great films…

(Steal like an artist.)

Meta Quest Pro (Part 2) – Block Diagrams & Teardown of Headset and Controller

Introduction

Limas Lin (LinkedIn Contact) is a reader of my blog and wanted to share his block diagram and teardown of the Meta Quest Pro. He sent me these diagrams over a month ago, but I have been busy with videos and blog articles about the companies I met with at CES and AR/VR/MR 2023.

I have more to write about both the Meta Quest Pro and the recent company meetings, but I wanted to get out what I think are excellent diagrams of the Meta Quest Pro. The diagrams show the component locations along with teardown photos, with most, if not all, of the key components identified.

I don’t have anything more to say about these diagrams, or a conclusion, as I think the images speak for themselves. I know it is a major effort to find all the components and present them in such a clear and concise manner, and I want to thank Limas Lin for sharing. You can click on each diagram for a higher-resolution image.

Meta Quest Pro Headset Diagrams

Meta Quest Pro Controller Diagrams

The post Meta Quest Pro (Part 2) – Block Diagrams & Teardown of Headset and Controller first appeared on KGOnTech.

Cambridge Mechatronics and poLight Optics Micromovement (CES/PW Pt. 6)

[March 4th, 2023 Corrections/Updates – poLight informed me of some corrections, better figures, and new information that I have added to the section on poLight. Cambridge Mechatronics informed me about their voltage and current requirements for pixel-shifting (aka wobulation).]

Introduction

For this next entry in my series on companies I met with at CES or Photonics West’s (PW) AR/VR/MR show in 2023, I will be covering two different approaches to what I call “optics micromovement.” Cambridge Mechatronics (CML) uses Shape Memory Alloys (SMA) wires to move optics and devices (including haptics). poLight uses piezoelectric actuators to bend thin glass over their flexible optical polymer. I met with both companies at CES 2023, and they both provided me with some of their presentation material for use in this article.

I would also like to point out that one alternative to moving lenses for focusing is electrically controlled LC lenses. In prior articles, I discussed implementations of LC lenses by FlexEnable (CES & AR/VR/MR Pt. 4 – FlexEnable’s Dimming, Electronic Lenses, & Curved LCDs); Meta (Facebook), with some on DeepOptics (Meta (aka Facebook) Cambria Electrically Controllable LC Lens for VAC? and Meta’s Cambria (Part 2): Is It Half Dome 3?); Magic Leap, with some on DeepOptics (Magic Leap 2 (Pt. 2): Possible Answers from Patent Applications); and DeepOptics itself (CES 2018 (Part 1 – AR Overview)).

After discussing the technologies from CML and poLight, I will get into some of the new uses within AR and VR.

Beyond Camera Focusing and Optical Image Stabilization Uses of Optics Micromovement in AR and VR

Both poLight and CML have cell phone customers using their technology for camera autofocus and optical image stabilization (OIS). This type of technology will also be used in the various cameras found on AR and VR headsets. poLight’s TLens is known to be used in the Magic Leap 2 (as reported by Yole Développement) and in Sharp’s CES 2023 VR prototype (reported by SadlyItsBradley).

While the potential use of their technology in AR and VR camera optics is obvious, both companies are looking at other ways their technologies could support Augmented and Virtual Reality.

Cambridge Mechatronics (CML) – How it works

Cambridge Mechatronics is an engineering firm that makes custom designs for miniature machines using shape memory alloy (SMA) wires. Their business is in engineering the machines for their customers. These machines can move optics or other objects. The SMA wires contract when heated by electrical current passing through them (below left) and act on spring structures to cause movement as the wires contract or relax. Using multiple wires in various structures can produce more complex movement. Another characteristic of the SMA wire is that as it heats and contracts, it becomes thicker and shorter, which reduces its resistance. CML uses this change in resistance as feedback for closed-loop control (below right).
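As a toy illustration of this closed-loop idea (the control law, constants, and function name below are mine, not CML’s), the wire’s resistance can stand in for position in a simple proportional controller:

```python
# Hypothetical sketch of resistance-feedback control for an SMA wire
# actuator: heating current contracts the wire, and the resulting drop
# in resistance serves as the position feedback. All values illustrative.

def sma_control_step(target_resistance_ohms, measured_resistance_ohms,
                     current_ma, gain=5.0, max_current_ma=50.0):
    """Return an updated drive current based on the resistance error.

    Lower resistance means the wire is hotter/more contracted, so if the
    measured resistance is above the target, increase the heating current.
    """
    error = measured_resistance_ohms - target_resistance_ohms
    new_current = current_ma + gain * error
    return max(0.0, min(max_current_ma, new_current))

# One step: the wire is cooler (higher resistance) than the target,
# so the controller drives more current into it.
current = sma_control_step(target_resistance_ohms=10.0,
                           measured_resistance_ohms=10.4,
                           current_ma=20.0)
print(round(current, 2))  # 22.0
```

A real controller would add integral/derivative terms and a thermal model, but the resistance-as-feedback loop is the core of the technique described above.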

Shown (below right) is a 4-wire actuator that can move horizontally, vertically, or rotationally (arrows pointing at the relaxed wires). The SMA wires enable a very thin structure. Below is a still from a CML video showing this type of actuator’s motion.

Below is an 8-wire (2 crossed wires on four sides) mechanism for moving a lens in X, Y, and Z and Pitch and Yaw to control focusing and optical image stabilization (OIS). Below are five still frames from a CML video on how the 8-wire mechanism works.

CML is developing some new SMA technology called “Zero Hold Power.” With this technology, they only need to apply power when moving optics. They suggest this technology would be useful in AR headsets to adjust for temperature variations in optics and support vergence accommodation conflict.

CML’s SMA wire method makes miniature motors and machines that may or may not include optics. With various configurations of wires, springs, levers, ratcheting mechanisms, etc., all kinds of different motions are possible. The issue becomes the mass of the “payload” and how fast the SMA wires can respond.

CML expects that when continuously pixel shifting, they will use less than 3.2V at ~20mA.
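Taken at face value, those figures put an upper bound on the drive power (my arithmetic, not a CML specification):

```python
# Upper-bound power estimate for CML's continuous pixel shifting, from
# the figures above: under 3.2 V at roughly 20 mA.
voltage_v = 3.2
current_a = 0.020
power_mw = round(voltage_v * current_a * 1000, 1)
print(power_mw)  # 64.0 mW while actively shifting
```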

poLight – How It Works

poLight’s TLens uses piezoelectric actuators to bend a thin glass membrane over poLight’s special optically clear, index-matched polymer (see below). This bending changes the lens’s focal point, similar to how the human eye works. The TLens can also be combined with other optics (below right) to support OIS as well as autofocus.

The GIF animation (right) shows how the piezo actuators can bend the top glass membrane to change the lens in the center for autofocus, tilt the lens to shift the image for OIS, or perform autofocus and OIS simultaneously.

poLight also proposes supporting “supra” resolution (pixel shifting) for MicroLEDs by tilting flat glass with poLight’s polymer using piezo actuators to shift pixels optically.

One concern is that poLight’s actuators require up to 50 volts. Generating higher voltages typically comes with some power loss and more components. [Corrected – March 3, 2023] poLight’s companion driver ASIC (PD50) has built-in EMI reduction that minimizes external components (it only requires an external capacitive load), and power/current consumption is kept very low (the TLens®, being an optical device, consumes virtually no power; the majority of the <6mW total power is consumed by the driver ASIC; see the table below).

poLight says that the TLens is about 94% transparent. The front aperture diameter of the TLens, while large enough for small sensor (like a smartphone) cameras, seems small at just over 2mm. The tunable wedge concept could have a much wider aperture as it does not need to form a lens. While the poLight method may result in a more compact design, the range of optics would seem to be limited in both the size of the aperture and how much the optics change.

Uses for Optics Micromovement in AR and VR beyond cameras

Going beyond the established camera uses, including autofocus and OIS, outlined below are some of the uses for these devices in AR and VR:

  • Variable focus, including addressing vergence accommodation conflict (VAC)
  • Super-resolution – shifting the display device or the optic to improve the effective resolution
  • Aiming and moving cameras:
    • When doing VR with camera-passthrough, there are human factor advantages to having the cameras positioned and aimed the same as the person’s eyes.
    • For SLAM and tracking cameras, more area could be covered with higher precision if the cameras rotate.
  • I discussed several uses for MicroLED pixel shifting in CES 2023 (Part 2) – Porotech – The Most Advanced MicroLED Technology:
    • Shifting several LEDs to the same location to average their brightness and correct for any dead or weak pixels should greatly improve yields.
    • Shifting spatial color subpixels (red, green, and blue) to the same location for a full-color pixel. This would be a way to reduce the effective size of a pixel and “cheat” the etendue issue caused by a larger spatial color pixel.
    • Improve resolution, as the MicroLED emission area is typically much smaller than the pitch between pixels. There might be no overlap when shifting, thus giving the full resolution advantage. This technique could also let a device with fewer pixels and fewer connections deliver the same effective resolution, but there will be a tradeoff in the maximum brightness that can be achieved.
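The pixel-shifting ideas above can be illustrated with a toy model (the function name, grid sizes, and values below are made up): four sequential subframes, each optically shifted by half a pixel, are perceived as one frame with twice the resolution in each axis.

```python
# Toy illustration of 2x2 pixel shifting ("wobulation"): four shifted
# subframes interleave into one frame with 2x the resolution per axis.

def wobulate(subframes, n):
    """Interleave four n x n subframes into a 2n x 2n perceived frame.

    subframes is a dict keyed by the (dy, dx) half-pixel shift of each
    subframe, with values given as n x n nested lists.
    """
    out = [[0] * (2 * n) for _ in range(2 * n)]
    for (dy, dx), frame in subframes.items():
        for y in range(n):
            for x in range(n):
                out[2 * y + dy][2 * x + dx] = frame[y][x]
    return out

# Four 1x1 "displays," each shifted to a different half-pixel position,
# yield a 2x2 perceived image.
subframes = {(0, 0): [[1]], (0, 1): [[2]], (1, 0): [[3]], (1, 1): [[4]]}
print(wobulate(subframes, 1))  # [[1, 2], [3, 4]]
```

The same interleaving view also shows the brightness tradeoff: each output position is lit for only a quarter of the frame time.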

Conclusions

It seems clear that future AR and VR systems will require changing optics at a minimum for autofocusing. There is also the obvious need to support focus-changing optics for VAC. Moving/changing optics will find many other uses in future AR and VR systems.

Between poLight and Cambridge Mechatronics (CML), it seems clear that CML’s technology is much more adaptable to a wider range and variety of motions. For example, CML could handle the bigger lenses required for VAC in VR. poLight appears to have an advantage in size for small cameras.

The post Cambridge Mechatronics and poLight Optics Micromovement (CES/PW Pt. 6) first appeared on KGOnTech.


CES & AR/VR/MR Pt. 4 – FlexEnable’s Dimming, Electronic Lenses, & Curved LCDs

Introduction – Combining 2023’s CES and AR/VR/MR

As I wrote last time, I met with over 30 companies between CES and SPIE’s AR/VR/MR conference, meeting about 10 of them twice. Also, since I started publishing articles and videos with SadlyItsBradley on CES, I have received information about other companies, corrections, and updates.

FlexEnable is developing technology that could affect AR, VR, and MR. FlexEnable offers an alternative to Meta Materials’ (not to be confused with Meta/Facebook) electronic dimming technology. Soon after publishing CES 2023 (Part 1) – Meta Materials’ Breakthrough Dimming Technology, I learned about FlexEnable. So to a degree, this article is an update on the Meta Materials article.

Additionally, FlexEnable also has a liquid crystal electronic lens technology. This blog has discussed Meta/Facebook’s interest in electronically switchable lens technology in Imagine Optix Bought By Meta – Half Dome 3’s Varifocal Tech – Meta, Valve, and Apple on Collision Course? and Meta’s Cambria (Part 2): Is It Half Dome 3?.

FlexEnable is also working on biaxially curved LCD technology. In addition to direct display uses, the ability to curve a display as needed will find uses in AR and VR. Curved LCDs could be particularly useful in very wide FOV systems. I briefly discussed Red 6’s helmet having a curved LCD in our AR/VR/MR 2022 video with SadlyItsBradley.

FlexEnable – Flexible/Moldable LC for Dimming, Electronic Lenses, Embedded Circuitry/Transistors, and Curved LCD

FlexEnable has many device structures for making interesting optical technologies that combine custom liquid crystal (LC), Tri-acetyl cellulose (TAC) clear sheets, polymer transistors, and electronic circuitry. While Flexenable has labs to produce prototypes, its business model is to design, develop, and adapt its technologies to its customers’ requirements for transfer to a manufacturing facility.

TAC films are often used in polarizers because they have high light transmittance and low birefringence (variable retardation and, thus, change in polarization). Unlike most plastics, TAC can retain its low birefringence when flexed or heated to its glass point (becomes rubbery but not melted) and molded to a biaxial curve. By biaxially curving, they can match the curvature of lenses or other product features.

FlexEnable’s Biaxially Curvable Dimming

Below is the FlexEnable approach to dimming, which is similar to how a traditional glass LC device is made. The difference is that they use TAC films to enclose the LC instead of glass. FlexEnable has formulated a non-polarization-based LC that can either darken or lighten when an electric field is applied (the undriven state can be transparent or dark). For AR, a transparent state, when undriven, would normally be preferred.

To form a biaxially curved dimmer, the TAC material is heated to its glass point (around 150°C) for molding. Below is the cell structure and an example of a curved dimmer in its transparent and dark state.

FlexEnable biaxially shapeable electronic dimming

The Need for Dimming Technology

As discussed in CES 2023 (Part 1) – Meta Materials’ Breakthrough Dimming Technology, there is a massive need in optical AR to support electronically controlled dimming that A) does not require light to be polarized, and B) has a highly transparent state when not dimming. Electronic dimming supports AR headset use in various ambient light conditions, from outside in the day to darker environments. It will make the virtual content easier to see without blasting very bright light to the eyes. Not only will it reduce system power, but it will also be easier on the eyes.

Magic Leap has demonstrated the usefulness of electronic dimming, with and without segmentation (also known as soft-edge occlusion or pixelated dimming), with their Magic Leap 2 (and discussed with SadlyItsBradley). Segmented dimming allows the light blocking to be selective and more or less track the virtual content to make it look more solid. Because the segmented dimming is out of focus, it can only do “soft-edge occlusion,” where it dims general areas. “Hard-edge occlusion,” which would selectively dim the real world for each pixel in the virtual world, appears impossible with optical AR (but is trivial in VR with camera passthrough).

The biggest problem with the ML2 approach is that it uses polarization-based dimming that blocks about 65% of the light in its most transparent state (and ~80% after the waveguide). I discussed this problem in Magic Leap 2 (Pt. 3): Soft Edge Occlusion, a Solution for Investors and Not Users. The desire (I would say need, as discussed in the Meta Materials article here) for light blocking in AR is undeniable, but blocking 80% of the light in the most transparent state is unacceptable in most applications. Still, Magic Leap has been demonstrating that soft-edge occlusion improves the virtual image.
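Those transparency figures stack multiplicatively; a quick back-of-the-envelope check of what they imply for the waveguide itself (my arithmetic from the numbers above):

```python
# Back-of-the-envelope from the ML2 figures above: the dimmer passes
# ~35% of real-world light in its most transparent state, and the total
# after the waveguide is ~20%, implying the waveguide alone passes
# roughly 20/35 of the light.
dimmer_t = 0.35   # dimmer transmission, most transparent state
total_t = 0.20    # total transmission after the waveguide
waveguide_t = round(total_t / dimmer_t, 2)
print(waveguide_t)  # 0.57
```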

Some of the possible dimming ranges

Dimming Range and Speed

Two main factors affect the darkening range and switching speed: the LC formulation and the cell gap thickness. For a given LC formula, the thicker the gap, the more light it will block in both the transmissive and the dark state.

Like with most LC materials, the switching speed is roughly inversely proportional to the square of the cell gap thickness. For example, if the cell gap is half as thick, the LC will switch about 4 times faster. FlexEnable is not ready to specify the switching speeds.
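A quick sketch of this rule of thumb (the 10 ms reference time is an illustrative number, not a FlexEnable figure):

```python
# Rule-of-thumb sketch: LC switching time scales roughly with the square
# of the cell gap, so halving the gap gives about a 4x faster response.

def scaled_switch_time_ms(t_ref_ms, gap_ratio):
    """Switching time after scaling the cell gap by gap_ratio."""
    return t_ref_ms * gap_ratio ** 2

print(scaled_switch_time_ms(10.0, 0.5))  # 2.5 -> half the gap, ~4x faster
```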

The chart on the right shows some currently possible dimming ranges with different LC formulas and cell thicknesses.

Segmented/Pixelated Dimming

Fast switching speeds become particularly important for supporting segmented dimming (e.g., Magic Leap 2), because the dimming needs to switch about as fast as the display. Stacking two thin cells in series could give both faster switching and a larger dynamic range, as the light blocking would be roughly squared.
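A quick numeric sketch of the stacking tradeoff (the example transmissions are illustrative, not FlexEnable specifications):

```python
# Two identical cells in series multiply their transmissions, so the
# dark-state blocking is roughly squared, at a small cost in the
# clear state.
clear_t, dark_t = 0.90, 0.10            # single-cell transmission

stacked_clear = round(clear_t ** 2, 4)  # 0.81: small clear-state penalty
stacked_dark = round(dark_t ** 2, 4)    # 0.01: ~10x better dark state
print(stacked_clear, stacked_dark)      # 0.81 0.01
```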

FlexEnable supports passive and active (transistor) circuitry to segment/pixelate and control the dimming.

Electronically Controlled Lenses

FlexEnable is also developing what are known as GRIN (Gradient Index) LC lenses. With this type of LC lens, the electric field changes the LC’s refractive index to create a switchable lens effect. The index-changing effect is polarization-specific, so controlling unpolarized light requires a two-layer sandwich (see below left). As evidenced by patent applications, Meta (Facebook) has been studying GRIN and Pancharatnam–Berry Phase (PBP) electronically switchable lenses (for more on the difference between GRIN and PBP switchable lenses, see the Appendix). Meta application 2020/0348528 (Figs. 2 and 12, right) shows using a GRIN-type lens with a Fresnel electrode pattern (what they call a Segmented Phase Profile, or SPP). The same application also discusses PBP lenses.

FlexEnable (left) and Meta Patent Application 2020/0348528 Figs. 2 and 12 (right)

The switchable optical power of the GRIN lens can be increased by making the cell gap thicker, but as stated earlier, the speed of LC switching will reduce by roughly the square of the cell gap thickness. So instead, a Fresnel-like approach can be used, as seen diagrammatically in the Meta patent application figure (above right). This results in a thinner and faster switching lens but with Fresnel-type optical issues.

When used in VR (ex., Meta’s Half Dome 3), the light can be polarized, so only one layer is required per switchable lens.

There is a lot of research in the field of electronically switchable optics. DeepOptics is a company that this blog has referenced a few times since I saw them at CES 2018.

Miniature Electromechanical Focusing – Cambridge Mechatronics and poLight

At CES, I met with Cambridge Mechatronics (CML) and poLight, which have miniature electromechanical focusing and optical image stabilization devices used in cell phones and AR cameras. CML uses shape memory alloy wire to move conventional optics for focusing and stabilization. poLight uses piezoelectric actuators to bend a clear deformable membrane over a clear but soft optical material to form a variable lens. They can also tilt a rigid surface against the soft optical material to support optical image stabilization and pixel shifting (often called wobulation). I plan to cover both technologies in more detail in a future article, but I wanted to mention them here as alternatives to LC-controlled variable focus.

Polymer Transistors and Circuitry

FlexEnable has also developed polymer semiconductors that they claim perform better than amorphous silicon transistors (typically used in flat panel displays). Higher performance translates into smaller transistors. These transistors can be used in an active matrix to control higher-resolution devices.

Biaxially Curved LCD

Combining FlexEnable’s technologies, including their flexible LC cells, circuitry, and polymer semiconductors, results in the ability to make biaxially curved LCD prototype displays (right).

Curved displays and Very Wide FOV

Curved displays become advantageous in making very wide FOV displays. At AWE 2022, Red 6 gave private demonstrations (discussed briefly in my video with SadlyItsBradley) of a 100-degree-FOV military AR headset with no pupil swim (image distortion as the eye moves), incorporating a curved LCD. Pulsar, an optical design consulting company, developed the concept of using a curved display and designed the optics for the new Red 6 prototype. To be clear, Red 6/Pulsar used a curved glass LCD, not one from FlexEnable, but it demonstrates the advantage of curved displays in wide-FOV designs.

Conclusions

In the near term, I find the non-polarized electronic dimming capabilities most interesting for AR. While FlexEnable doesn’t claim to have the light-to-dark range of Meta Materials, they appear to have enough range, particularly on the transparent end, for some AR applications. We must wait to see if the switching speeds are fast enough to support segmented dimming.

Having electronic dimming in a film that can be biaxially curved to fit a design will be seen by many as an advantage over Meta Materials’ rigid, lens-like dimming technology. Currently, at least on specs, Meta Materials has demonstrated a much wider dynamic range from the transparent to the dark state. I would expect FlexEnable’s LC characteristics to continue to improve.

Electronically changeable lenses are seen as a way to address vergence accommodation conflict (VAC) in VR (such as with Meta’s Half-Dome 3). They would be combined with eye tracking or other methods to move the focus based on where the user is looking. Supporting VAC in AR would be much more complex: to keep the focus of the real world from changing, a pre-compensation switchable lens would have to cancel out the effect of the varifocal lens on the real world. This complexity will likely prevent such lenses from being used for VAC in optical AR anytime soon.

Biaxially curved LCDs would seem to offer optical advantages in very wide FOV applications.

Appendix: GRIN vs. Pancharatnam-Berry phase lenses

Simply put, with a GRIN lens, the LC itself acts as the lens; the voltage across the LC and the LC’s thickness determine how the lens behaves. Pancharatnam-Berry phase (PBP) lenses use a uniform LC shutter to change the polarization of light, which in turn controls the effect of a film with the lens function recorded in it; the lens-function film acts or does not act based on the polarization of the light. As stated earlier, Meta has been considering both GRIN and PBP lenses (for example, both are shown in Meta application 2020/0348528).

For more on how GRIN lenses work, see Electrically tunable gradient-index lenses via nematic liquid crystals with a method of spatially extended phase distribution.

For more on PBP lenses, see the Augmented reality near-eye display using Pancharatnam-Berry phase lenses and my article, which discusses Meta’s use in the Half-Dome 3.

GRIN lenses don’t require light to be first polarized, but they require a sandwich of two cells. PBP in an AR application would require the real-world light to be polarized, which would lose more than 50% of the light and cause issues with looking at polarized light displays such as LCDs.

The PBP method would likely support more complex lens functions to be recorded in the films. The Meta Half-Dome 3 used a series of PBP lenses with binary-weighted lens functions (see below).

Meta patent application showing the use of multiple PBP lenses (link to article)

The post CES & AR/VR/MR Pt. 4 – FlexEnable’s Dimming, Electronic Lenses, & Curved LCDs first appeared on KGOnTech.

