Meta beats suit over tool that lets Facebook users unfollow everything

Meta has defeated a lawsuit—for now—that attempted to invoke Section 230 protections for a third-party tool that would have made it easy for Facebook users to toggle on and off their news feeds as they pleased.

The lawsuit was filed by Ethan Zuckerman, a professor at the University of Massachusetts Amherst. He feared that Meta might sue to block his tool, Unfollow Everything 2.0, because Meta had threatened to sue to block the original tool when it was released by another developer. In May, Zuckerman told Ars that he was "suing Facebook to make it better" and planned to use Section 230's shield to do it.

Zuckerman's novel legal theory argued that Congress always intended for Section 230 to protect third-party tools designed to empower users to take control of potentially toxic online environments. In his complaint, Zuckerman tried to convince a US district court in California of that reading.

Meta Quest Pro (Part 2) – Block Diagrams & Teardown of Headset and Controller

Introduction

Limas Lin (LinkedIn Contact) is a reader of my blog and wanted to share his block diagram and teardown of the Meta Quest Pro. He sent me these diagrams over a month ago, but I have been busy with videos and blog articles about the companies I met with at CES and AR/VR/MR 2023.

I have more to write about both the Meta Quest Pro and the recent company meetings, but I wanted to get out what I think are excellent diagrams on the Meta Quest Pro. The diagrams show the component locations and the teardown photos, with most if not all of the key components identified.

I don’t have much more to say about these diagrams, or a conclusion, as I think the images speak for themselves. I know it is a major effort to identify all the components and present them in such a clear and concise manner, and I want to thank Limas Lin for sharing. You can click on each diagram for a higher-resolution image.

Meta Quest Pro Headset Diagrams

Meta Quest Pro Controller Diagrams

The post Meta Quest Pro (Part 2) – Block Diagrams & Teardown of Headset and Controller first appeared on KGOnTech.

Cambridge Mechatronics and poLight Optics Micromovement (CES/PW Pt. 6)

[March 4th, 2023 Corrections/Updates – poLight informed me of some corrections, better figures, and new information that I have added to the section on poLight. Cambridge Mechatronics informed me about their voltage and current requirements for pixel-shifting (aka wobulation).]

Introduction

For this next entry in my series on companies I met with at CES or Photonics West’s (PW) AR/VR/MR show in 2023, I will be covering two different approaches to what I call “optics micromovement.” Cambridge Mechatronics (CML) uses shape memory alloy (SMA) wires to move optics and devices (including haptics). poLight uses piezoelectric actuators to bend thin glass over their flexible optical polymer. I met with both companies at CES 2023, and they both provided me with some of their presentation material for use in this article.

I would also like to point out that one alternative to moving lenses for focusing is electrically controlled LC lenses. In prior articles, I discussed implementations of LC lenses by FlexEnable (CES & AR/VR/MR Pt. 4 – FlexEnable’s Dimming, Electronic Lenses, & Curved LCDs); Meta (Facebook), with some on DeepOptics (Meta (aka Facebook) Cambria Electrically Controllable LC Lens for VAC? and Meta’s Cambria (Part 2): Is It Half Dome 3?); Magic Leap, with some on DeepOptics (Magic Leap 2 (Pt. 2): Possible Answers from Patent Applications); and DeepOptics itself (CES 2018 (Part 1 – AR Overview)).

After discussing the technologies from CML and poLight, I will get into some of their new uses within AR and VR.

Beyond Camera Focusing and Optical Image Stabilization Uses of Optics Micromovement in AR and VR

Both poLight and CML have cell phone customers using their technology for camera autofocus and optical image stabilization (OIS). This type of technology will also be used in the various cameras found on AR and VR headsets. poLight’s TLens is known to be used in the Magic Leap 2 (as reported by Yole Développement) and Sharp’s CES 2023 VR prototype (as reported by SadlyItsBradley).

While the potential use of their technology in AR and VR camera optics is obvious, both companies are looking at other ways their technologies could support Augmented and Virtual Reality.

Cambridge Mechatronics (CML) – How it works

Cambridge Mechatronics is an engineering firm that creates custom designs for miniature machines using shape memory alloy (SMA) wire. Their business is engineering these machines for their customers. The machines can move optics or other objects. The SMA wires contract when heated by electrical current passing through them (below left) and act on spring structures to cause movement as the wires contract or relax. Using multiple wires in various structures enables more complex movement. Another characteristic of SMA wire is that as it heats and contracts, it becomes thicker and shorter, which reduces its resistance. CML uses this change in resistance as feedback for closed-loop control (below right).
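
To make the closed-loop idea concrete, below is a toy simulation of proportional control that uses wire resistance as the feedback signal. Every constant and the plant model are invented for illustration; CML's actual controllers are surely more sophisticated, since SMA exhibits hysteresis and thermal lag.

```python
# Toy simulation of closed-loop SMA actuator control using wire resistance
# as position feedback. All constants are invented for illustration; a real
# controller must also model hysteresis and thermal lag.

def simulate_sma_loop(target_r=10.0, r=12.0, kp=0.5, steps=200):
    """Proportional control: current heats the wire, the wire contracts,
    and contraction lowers its resistance, so we drive the measured
    resistance toward the value corresponding to the desired position."""
    for _ in range(steps):
        error = r - target_r            # positive error: wire too cool/long
        current = max(0.0, kp * error)  # heating is active; cooling is passive
        # Crude plant model: drive current pulls resistance down, while
        # ambient cooling relaxes it back toward the unpowered value (12).
        r += -0.05 * current + 0.01 * (12.0 - r)
    return r

print(f"Settled resistance: {simulate_sma_loop():.2f} ohms")
```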

Shown (below right) is a 4-wire actuator that can move horizontally, vertically, or rotationally (arrows point at the relaxed wires). The SMA wires enable a very thin structure. Below is a still from a CML video showing this type of actuator’s motion.

Below is an 8-wire mechanism (2 crossed wires on each of four sides) for moving a lens in X, Y, and Z plus pitch and yaw to control focusing and optical image stabilization (OIS). Below are five still frames from a CML video on how the 8-wire mechanism works.

CML is developing some new SMA technology called “Zero Hold Power.” With this technology, they only need to apply power when moving the optics. They suggest this technology would be useful in AR headsets to adjust for temperature variations in optics and to address vergence-accommodation conflict.

CML’s SMA wire method makes miniature motors and machines that may or may not include optics. With various configurations of wires, springs, levers, ratcheting mechanisms, etc., all kinds of different motions are possible. The issue becomes the mass of the “payload” and how fast the SMA wires can respond.

CML expects that when continuously pixel shifting, they will use less than 3.2V at ~20mA.
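
Taking those figures at face value, the implied continuous drive power is modest; the actual number will depend on duty cycle and the drive electronics:

```latex
P = V \times I \approx 3.2\,\mathrm{V} \times 20\,\mathrm{mA} = 64\,\mathrm{mW}
```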

poLight – How It Works

poLight’s TLens uses piezoelectric actuators to bend a thin glass membrane over poLight’s special optically clear, index-matched polymer (see below). This bending changes the lens’s focal point, similar to how the human eye works. The TLens can also be combined with other optics (below right) to support OIS and autofocus.
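
As a rough first-order picture of why bending the membrane changes focus, treat the deformed top surface as a single plano-convex refracting surface (a simplification of the real device):

```latex
\frac{1}{f} \approx \frac{n - 1}{R}
```

Here n is the refractive index of the polymer and R is the radius of curvature the actuators impose: more bending (smaller R) gives a stronger lens, while a flat membrane (R approaching infinity) gives no optical power.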

The GIF animation (right) shows how the piezo actuators can bend the top glass membrane to change the lens in the center for autofocus, tilt the lens to shift the image for OIS, or do both at once.

poLight also proposes supporting “supra” resolution (pixel shifting) for MicroLEDs by tilting flat glass with poLight’s polymer using piezo actuators to shift pixels optically.

One concern is that poLight’s actuators require up to 50 volts. Generating higher voltages typically comes with some power loss and more components. [Corrected – March 3, 2023] poLight’s companion driver ASIC (PD50) has built-in EMI reduction that minimizes external components (it only requires an external capacitive load), and power/current consumption is kept very low (the TLens®, being an optical device, consumes virtually no power; the majority of the <6mW total power is consumed by the driver ASIC – see table below).

poLight says that the TLens is about 94% transparent. The front aperture diameter of the TLens, while large enough for small sensor (like a smartphone) cameras, seems small at just over 2mm. The tunable wedge concept could have a much wider aperture as it does not need to form a lens. While the poLight method may result in a more compact design, the range of optics would seem to be limited in both the size of the aperture and how much the optics change.

Uses for Optics Micromovement in AR and VR beyond cameras

Going beyond the established camera uses, including autofocus and OIS, outlined below are some of the uses for these devices in AR and VR:

  • Variable focus, including addressing vergence accommodation conflict (VAC)
  • Super-resolution – shifting the display device or the optic to improve the effective resolution
  • Aiming and moving cameras:
    • When doing VR with camera-passthrough, there are human factor advantages to having the cameras positioned and aimed the same as the person’s eyes.
    • For SLAM and tracking cameras, more area could be covered with higher precision if the cameras rotate.
  • I discussed several uses for MicroLED pixel shifting in CES 2023 (Part 2) – Porotech – The Most Advanced MicroLED Technology:
    • Shifting several LEDs to the same location to average their brightness and correct for any dead or weak pixels should greatly improve yields.
    • Shifting spatial color subpixels (red, green, and blue) to the same location for a full-color pixel. This would be a way to reduce the effective size of a pixel and “cheat” the etendue issue caused by a larger spatial color pixel.
    • Improving resolution, as the MicroLED emission area is typically much smaller than the pitch between pixels; there might be no overlap when shifting, thus giving the full resolution advantage. This technique could allow using fewer pixels with fewer connections, but there will be a tradeoff in the maximum brightness that can be achieved (see the sketch after this list).
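
To illustrate the super-resolution idea from the list above, here is a schematic sketch of 2×2 pixel shifting: four sub-frames, each optically offset by half a pixel, interleave into a grid with twice the linear resolution. The function name and sub-frame model are mine for illustration only; a real system must synchronize the actuator with the display and account for emitter overlap and brightness.

```python
import numpy as np

def wobulate(subframes):
    """Interleave four half-pixel-shifted sub-frames into one image with
    twice the linear resolution. subframes maps (dy, dx) in {0,1}^2 to an
    HxW array; each shift fills its own sub-grid of the output."""
    h, w = subframes[(0, 0)].shape
    out = np.zeros((2 * h, 2 * w))
    for (dy, dx), frame in subframes.items():
        out[dy::2, dx::2] = frame
    return out

rng = np.random.default_rng(0)
frames = {(dy, dx): rng.random((4, 4)) for dy in (0, 1) for dx in (0, 1)}
print(wobulate(frames).shape)  # (8, 8): 4x the pixel count from one panel
```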

Conclusions

It seems clear that future AR and VR systems will require changing optics at a minimum for autofocusing. There is also the obvious need to support focus-changing optics for VAC. Moving/changing optics will find many other uses in future AR and VR systems.

Between poLight and Cambridge Mechatronics (CML), it seems clear that CML’s technology is adaptable to a wider range and more types of motion. For example, CML could handle the bigger lenses required for VAC in VR. poLight appears to have an advantage in size for small cameras.

The post Cambridge Mechatronics and poLight Optics Micromovement (CES/PW Pt. 6) first appeared on KGOnTech.

CES & AR/VR/MR Pt. 4 – FlexEnable’s Dimming, Electronic Lenses, & Curved LCDs

Introduction – Combining 2023’s CES and AR/VR/MR

As I wrote last time, I met with over 30 companies, about 10 of them twice, between CES and SPIE’s AR/VR/MR conferences. Also, since I started publishing articles and videos with SadlyItsBradley on CES, I have received information about other companies, corrections, and updates.

FlexEnable is developing technology that could affect AR, VR, and MR. FlexEnable offers an alternative to Meta Materials’ (not to be confused with Meta/Facebook) electronic dimming technology. Soon after publishing CES 2023 (Part 1) – Meta Materials’ Breakthrough Dimming Technology, I learned about FlexEnable. So to a degree, this article is an update on the Meta Materials article.

FlexEnable also has a liquid crystal electronic lens technology. This blog has discussed Meta/Facebook’s interest in electronically switchable lens technology in Imagine Optix Bought By Meta – Half Dome 3’s Varifocal Tech – Meta, Valve, and Apple on Collision Course? and Meta’s Cambria (Part 2): Is It Half Dome 3?.

FlexEnable is also working on biaxially curved LCD technology. In addition to direct display uses, the ability to curve a display as needed will find uses in AR and VR. Curved LCDs could be particularly useful in very wide FOV systems. I briefly discussed Red 6’s helmet having a curved LCD in our AR/VR/MR 2022 video with SadlyItsBradley.

FlexEnable – Flexible/Moldable LC for Dimming, Electronic Lenses, Embedded Circuitry/Transistors, and Curved LCD

FlexEnable has many device structures for making interesting optical technologies that combine custom liquid crystal (LC), tri-acetyl cellulose (TAC) clear sheets, polymer transistors, and electronic circuitry. While FlexEnable has labs to produce prototypes, its business model is to design, develop, and adapt its technologies to its customers’ requirements for transfer to a manufacturing facility.

TAC films are often used in polarizers because they have high light transmittance and low birefringence (birefringence causes variable retardation and, thus, changes in polarization). Unlike most plastics, TAC retains its low birefringence when flexed or when heated to its glass point (where it becomes rubbery but not melted) and molded to a biaxial curve. By biaxially curving, the films can match the curvature of lenses or other product features.

FlexEnable’s Biaxially Curvable Dimming

Below is the FlexEnable approach to dimming, which is similar to how a traditional glass LC device is made. The difference is that they use TAC films to enclose the LC instead of glass. FlexEnable has formulated a non-polarization-based LC that can either darken or lighten when an electric field is applied (the undriven state can be transparent or dark). For AR, a transparent state, when undriven, would normally be preferred.

To form a biaxially curved dimmer, the TAC material is heated to its glass point (around 150°C) for molding. Below is the cell structure and an example of a curved dimmer in its transparent and dark state.

FlexEnable biaxially shapeable electronic dimming

The Need for Dimming Technology

As discussed in CES 2023 (Part 1) – Meta Materials’ Breakthrough Dimming Technology, there is a massive need in optical AR to support electronically controlled dimming that A) does not require light to be polarized, and B) has a highly transparent state when not dimming. Electronic dimming supports AR headset use in various ambient light conditions, from outside in the day to darker environments. It will make the virtual content easier to see without blasting very bright light to the eyes. Not only will it reduce system power, but it will also be easier on the eyes.

Magic Leap has demonstrated the usefulness of electronic dimming, with and without segmentation (also known as soft-edge occlusion or pixelated dimming), with the Magic Leap 2 (as discussed with SadlyItsBradley). Segmented dimming allows the light blocking to be selective and more or less track the virtual content, making it look more solid. Because the segmented dimming is out of focus, it can only do “soft-edge occlusion,” where it dims general areas. “Hard-edge occlusion,” which would selectively dim the real world for each pixel in the virtual world, appears impossible with optical AR (but is trivial in VR with camera passthrough).

The biggest problem with the ML2 approach is that it used polarization-based dimming that blocks about 65% of the light in its most transparent state (and ~80% after the waveguide). I discussed this problem in Magic Leap 2 (Pt. 3): Soft Edge Occlusion, a Solution for Investors and Not Users. The desire (I would say need, as discussed in the Meta Materials article here) for light blocking in AR is undeniable, but blocking 80% of the light in the most transparent state is unacceptable in most applications. Magic Leap has at least demonstrated that soft-edge occlusion improves the virtual image.

Some of the possible dimming ranges

Dimming Range and Speed

Two main factors affect the darkening range and switching speed: the LC formulation and the cell gap thickness. For a given LC formula, the thicker the gap, the more light it will block in both the transmissive and the dark state.

As with most LC materials, the switching speed scales roughly with the inverse square of the cell gap thickness. For example, if the cell gap is half as thick, the LC will switch about four times faster. FlexEnable is not yet ready to specify switching speeds.
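
In equation form, with τ the switching time and d the cell gap:

```latex
\tau \propto d^{2} \quad\Rightarrow\quad \tau\!\left(\tfrac{d}{2}\right) = \frac{\tau(d)}{4}
```

which matches the example above: halving the gap gives roughly a 4x speedup.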

The chart on the right shows some currently possible dimming ranges with different LC formulas and cell thicknesses.

Segmented/Pixelated Dimming

Fast switching speeds become particularly important for supporting segmented dimming (e.g., Magic Leap 2) because the dimming needs to switch about as fast as the display. Stacking two thin cells in series could give both faster switching and a larger dynamic range, as the light blocking would be roughly squared.
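
The "roughly squared" claim follows from transmissions multiplying through a stack (ignoring interface reflections): if each cell passes a fraction T of the light in its dark state, two identical cells in series pass about T². For example:

```latex
T_{\text{stack}} = T_1 \times T_2 = 0.10 \times 0.10 = 0.01 \quad (1\%\ \text{dark-state transmission})
```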

FlexEnable supports passive and active (transistor) circuitry to segment/pixelate and control the dimming.

Electronically Controlled Lenses

FlexEnable is also developing what are known as GRIN (gradient index) LC lenses. With this type of LC, the electric field changes the LC’s index of refraction to create a switchable lens effect. The index-changing effect is polarization-specific, so to control unpolarized light, a two-layer sandwich is required (see below left). As evidenced by patent applications, Meta (Facebook) has been studying GRIN and Pancharatnam–Berry phase (PBP) electronically switchable lenses (for more on the difference between GRIN and PBP switchable lenses, see the Appendix). Meta application 2020/0348528 (Figs. 2 and 12, right) shows a GRIN-type lens with a Fresnel electrode pattern (what they call a segmented phase profile, or SPP). The same application also discusses PBP lenses.

FlexEnable (left) and Meta Patent Application 2020/0348528 Figs. 2 and 12 (right)

The switchable optical power of the GRIN lens can be increased by making the cell gap thicker, but as stated earlier, the LC switching will slow by roughly the square of the cell gap thickness. So instead, a Fresnel-like approach can be used, as seen diagrammatically in the Meta patent application figure (above right). This results in a thinner, faster-switching lens, but with Fresnel-type optical issues.
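
For reference, an ideal thin lens of focal length f imposes a radially parabolic phase profile in the paraxial approximation; a GRIN LC cell approximates this by varying the index across the aperture, and a Fresnel/SPP electrode pattern keeps the cell thin by wrapping that profile modulo 2π:

```latex
\phi(r) = -\frac{\pi r^{2}}{\lambda f}, \qquad \phi_{\text{Fresnel}}(r) = \phi(r) \bmod 2\pi
```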

When used in VR (ex., Meta’s Half Dome 3), the light can be polarized, so only one layer is required per switchable lens.

There is a lot of research in the field of electronically switchable optics. DeepOptics is a company that this blog has referenced a few times since I saw them at CES 2018.

Miniature Electromechanical Focusing – Cambridge Mechatronics and poLight

At CES, I met with Cambridge Mechatronics (CML) and poLight, which make miniature electromechanical focusing and optical image stabilization devices used in cell phones and AR cameras. CML uses shape memory alloy wire to move conventional optics for focusing and stabilization. poLight uses piezoelectric actuators to bend a clear deformable membrane over a clear but soft optical material to form a variable lens. They can also tilt a rigid surface against the soft optical material for optical image stabilization and pixel shifting (often called wobulation). I plan to cover both technologies in more detail in a future article, but I wanted to mention them here as alternatives to LC-controlled variable focus.

Polymer Transistors and Circuitry

FlexEnable has also developed polymer semiconductors that they claim perform better than amorphous silicon transistors (typically used in flat panel displays). Higher performance translates into smaller transistors. These transistors can be used in an active matrix to control higher-resolution devices.

Biaxially Curved LCD

Combining FlexEnable’s technologies, including the curvable LC cells, circuitry, and polymer semiconductors, results in their ability to make biaxially curved LCD prototype displays (right).

Curved displays and Very Wide FOV

Curved displays become advantageous in making very wide FOV displays. At AWE 2022, Red 6 gave private demonstrations (discussed briefly in my video with SadlyItsBradley) of a 100-degree-FOV military AR headset incorporating a curved LCD, with no pupil swim (image distortion as the eye moves). Pulsar, an optical design consulting company, developed the concept of using a curved display and the optics for the new Red 6 prototype. To be clear, Red 6/Pulsar used a curved glass LCD, not one from FlexEnable, but it demonstrates the advantage of curved displays.

Conclusions

In the near term, I find the non-polarized electronic dimming capabilities most interesting for AR. While FlexEnable doesn’t claim to have the light-to-dark range of Meta Materials, they appear to have enough range, particularly on the transparent end, for some AR applications. We must wait to see if the switching speeds are fast enough to support segmented dimming.

Having electronic dimming in a film that can be biaxially curved to fit a design will be seen by many as an advantage over Meta Materials’ rigid, lens-like dimming technology. Currently, at least on specs, Meta Materials has demonstrated a much wider dynamic range from the transparent to the dark state. I expect FlexEnable’s LC characteristics will continue to improve.

Electronically changeable lenses are seen as a way to address vergence-accommodation conflict (VAC) in VR (such as with Meta’s Half-Dome 3). They would be combined with eye tracking or other methods to move the focus based on where the user is looking. Supporting VAC in AR would be much more complex: to prevent changing the focus of the real world, a pre-compensation switchable lens would have to cancel out the effect on the real world. This complexity will likely prevent their use for VAC in optical AR anytime soon.

Biaxially curved LCDs would seem to offer optical advantages in very wide FOV applications.

Appendix: GRIN vs. Pancharatnam-Berry phase lenses

Simply put, with a GRIN lens, the LC itself acts as the lens. The voltage across the LC and the LC’s thickness affect how the lens works. Pancharatnam–Berry phase (PBP) lenses use a (uniform) LC shutter to change the polarization of light, which controls the effect of a film with the lens function recorded in it. The lens-function film will act or not act based on the polarization of the light. As stated earlier, Meta has been considering both GRIN and PBP lenses (for example, both are shown in Meta application 2020/0348528).

For more on how GRIN lenses work, see Electrically tunable gradient-index lenses via nematic liquid crystals with a method of spatially extended phase distribution.

For more on PBP lenses, see Augmented reality near-eye display using Pancharatnam-Berry phase lenses and my article discussing Meta’s use of them in the Half-Dome 3.

GRIN lenses don’t require light to be first polarized, but they require a sandwich of two cells. PBP in an AR application would require the real-world light to be polarized, which would lose more than 50% of the light and cause issues with looking at polarized light displays such as LCDs.

The PBP method would likely support more complex lens functions to be recorded in the films. The Meta Half-Dome 3 used a series of PBP lenses with binary-weighted lens functions (see below).

Meta patent application showing the use of multiple PBP lenses (link to article)

The post CES & AR/VR/MR Pt. 4 – FlexEnable’s Dimming, Electronic Lenses, & Curved LCDs first appeared on KGOnTech.

The Turtles of the Metaverse

The myth has it that the earth is held up by a World Turtle. When asked what holds up the World Turtle, the sage replies: "another turtle".

You know the rest. It's turtles all the way down.

Depending on what media you read, you might have heard we're living in an Exponential Age. The pace of change is so fast, and is happening along a curve that's so, well, exponential, that it's nearly impossible for the human mind to comprehend.

It's a curve that doesn't just encompass finance or computers, but also climate change and research, genetics and nuclear fusion.

The concept of this age led Packy McCormick to proclaim we might be headed to a trillion-dollar VC future.

Lately, predictions about the Metaverse seem to be rapidly scaling those Exponential peaks. This week Goldman Sachs called it an $8 TRILLION opportunity.

Maybe it's better to create a bar graph of how VCs and analysts value the Metaverse. It would be very Exponential. And it would probably end with Jensen Huang's prediction that "Omniverse or the Metaverse is going to be a new economy that is larger than our current economy."

(Does that mean our current economy will shrink to zero? Or that it will double? Because isn't the Metaverse also part of "our economy"?)

Regardless - you get the point. The Exponential Age is upon us.

I can feel it.

I can't keep up.

My feed is filled with new AI advances and more realistic virtual worlds, with virtual productions that are almost as good as what Hollywood can produce and robots whose facial expressions look a lot like people.

Oh, and a planet increasingly following an exponential trajectory of its own. And a virus that has taught us all to understand the value of a logarithmic chart.

Maybe you can feel it too? This sense that things are happening so fast, that change is sweeping by us like brush fire, and that we barely have time to recognize it let alone run to keep up.

The Age of Turtles

Turtles might be a by-product of the Exponential Age.

They're everywhere.

How you feel about AI stands on the back of your ideas about intelligence which stand on the back of your ideas about the human mind.

These days even the Turing Test stands on a different turtle than we may have imagined. Alvy Ray Smith argues that Turing set the test out as a commentary on a society that tried to evaluate the mind of a gay man. The Turing Test was a subversive way of asking: "how can you chemically try to castrate me? You can't even tell if it's a real human behind that curtain."

Not the turtle I thought his test was standing on.

But there are lots of other turtles.

The Metaverse is full of them.

How you feel about avatars stands on the back of your ideas about our capacity to identify outside of our own bodies, which stands on the back of our ideas about the importance of the physical world, which stands on the back of our ideas of humanity's place in that world.

If you want to go down THAT rabbit hole, the Convivial Society is there to guide you through a lot of Hannah Arendt and Marxist-adjacent commentary. And honestly? I can buy all of it on certain days. And on others, I can see it as an author in need of a good anthropologist.

Too much philosophy and not enough doing is one of the turtles we can stand on. But its shell is fragile.

At one point, I wrote that I thought avatar identity and virtuality was a sort of proxy affirmation of Gödel's Incompleteness Theorem: no matter how deep we go in trying to find the real 'self', we'll always loop back to where we started.

I am what I am and that includes my avatar. There's no point in finding the final turtle, because it's mathematically impossible to prove that there's one in the first place.

(As a side note, the Incompleteness Theorem has deep relevance to the development of the computer - and is where Turing started in the first place. This creates yet another strange loop, where the thing that computers tried to solve ended up creating worlds where their solution was made, well, more 'meta' than we imagined.)

Decoding the Metaverse

The Exponential Age makes it tough to keep up. Meta, Microsoft, Apple, NVIDIA, Niantic...everyone is piling into the Metaverse.

Sure, maybe you're steeped in this stuff like I am. But most of you aren't. You have day jobs and a dog to walk and you really want to have pizza tonight even though you know you shouldn't.

So how do you decode it? How do you figure out which "metaverse" you want to join? How do you gauge how much fear you should have, or how concerned you should be that we're all about to log-out of reality?

Well...the turtles are here to help. Or more precisely, three of them:

  • What does it mean to be human?
  • How should humans relate to each other?
  • How do you describe humanity's relationship to its tools and technology?

And it goes like this: first, answer those questions for yourself, even if only in a loose way. Second, listen for how people talk about the Metaverse in relation to those three things. Third, compare the two.

Samples of Turtles

What It Means to be Human

Let's start with Meta née Facebook.

To Meta, being human means being connected. In their keynote about their move into the Metaverse, it wasn't about finding clean, empty, silent spaces online. It was about connecting. Because to them, that's the human purpose.

Having said that, what they SAY about people and how they act often diverge. Something that goes all the way back to this quote:

[embedded image: quote]

Listen for those signals. Whether it's the belief that being human is about work (Microsoft) or play (Niantic), these companies build entire businesses around a singular view of what it means to be human.

How Should Humans Relate to Each Other?

I had an interesting chat on Twitter about decentralized autonomous organizations (DAOs). And it sort of concluded with this:

[embedded tweet]

Now, first, I don't really take any great exception to what Bruce had to say. It was sane, cautious, and skeptical. All good - because there's a LOT to be skeptical about when it comes to some of these new crypto-based models.

But my second thought was: "hold on...so, you're OK with corporations as a structure because they achieve the same outcomes?"

I think that how humans relate to each other is a central question for our times. I've personally come to believe that the "corporate" experiment has run its course.

Whatever this system is we live in is fundamentally broken. We need something better - and I'm willing to throw the dice a bit to see if we can't find it.

But that's me. That's my turtle.

I respect Bruce's turtle also. For him, a corporation is an outcome producing entity and a DAO is a speculative crap shoot.

That's his turtle and I have my own.

But how companies and communities decide this will have a radical impact on how the Metaverse evolves. If we're all satisfied with letting corporations decide our shared future - that's a choice.

Hanging on to old ideas about the efficiency of markets or the glorious power of the corporation (a legal PERSON in some countries) is also a choice. When you hear companies talk up the virtues of the Metaverse, listen carefully to the language they choose around how we'll all interact together.

Do any of them predict their own obsolescence? Or do they continue to act as if the weather is perfectly fine under their corporate umbrellas?

I'm kind of hoping we can come up with something new.

How do you describe humanity's relationship to its tools and technology?

This is the final turtle. And I'll keep it simple:

  • Almost everything that comes out of traditional Silicon Valley places technology inside the human circle of empathy (as Jaron Lanier describes it)
  • This means that technology is treated as if it has its own place at the same table as us humans. It is treated as if "IT" will save us, help us, improve our lives, make us more connected or wash our dishes.
  • But technology doesn't do any of those things. It's just technology. It doesn't have wants or needs, it isn't an organism or a living thing.
  • We make things. Some of those things we call technology. The technology we make has our values embedded in it because how else could it be? Technology is not neutral. It's a choice we make. It's a gesture towards a future we want, expressed through our tools.
  • In other words, listen carefully to anyone who either tries to privilege technology OR distance themselves from it.

My own turtle? Technology is a conscious choice made by imperfect people.

Do your own research.

What's Your Turtle?

Do you see what I did here?

I created another strange loop.

Because here's the thing: the Metaverse will be massive. It will contain questions that stand on assumptions that stand on questions. It will challenge us to rethink how society is organized, the role of the corporation, the place of waking dreams and the dependence we can create on imaginary places.

As you enter this strange new world, listen carefully to the carnival barkers and corporate shills, the crypto enthusiasts and the coders.

If you listen carefully to how they talk - about people, their relationship to each other, and how they treat their 'tools' - then you can pretty quickly close a lot of doors that should remain shut.

But when you do, you might find yourself questioning whether your own beliefs about these same topics still hold true.

You might start wondering what it means to be human when our capacity for self-expression is no longer shackled to reality, or how we will organize ourselves when corporations aren't the only game in town and when governance and culture happen "on-chain".

And so the strange loop: as we try to understand the Metaverse, we loop back to a deeper understanding of ourselves.

There's a quote I love which was perhaps optimistic for its time. And maybe it's optimistic now also.

When asked about his trip to the moon, Neil Armstrong replied:

"We hope and think that those people shared our belief that this is the beginning of a new era—the beginning of an era when man understands the universe around him, and the beginning of the era when man understands himself."

May you ride most excellent turtles on that journey to understanding.


So...I'd really like to hear from you. If you get this by e-mail, please do reply. I love it when people hit reply.

You can also hit me up on Twitter. I like having chats in the public square when I can.

I also recommend you join the Open Meta Discord (if Discord is your thing).

And if you want something REALLY fun, join me as I explore something that I've been spending a lot of time on. Like, decades. :)

Facebook's Metaverse Is More Real Than You Think

What if the Metaverse isn't all Marvel character skins, Ariana Grande concerts, NFT art galleries or land auctions?

What if, instead, a large chunk of the Metaverse primarily focuses on extensions of 'reality' - a place for your book club to meet inside a re-creation of the novel's setting, your own personal gallery of cute photos of your cat (rendered as 3D holograms), or a place to browse for furniture for your home?

What if one of the largest drivers of traffic to the Metaverse is Facebook? What if people arrive in the world you've created within the Metaverse because of an ad on Facebook or Instagram, or because someone "liked" your build or shared a 3D movie of a concert you held - and it went viral on Instagram?

How will experiences be created when there's an expectation that someone will first engage with some little "nugget" of content on Instagram? How will those experiences be influenced by someone's arrival from a social stream instead of a gaming portal?

These scenarios become plausible when we think about what drives Facebook: while VR headsets or AR glasses might make a lot of money, they will never have the reach or value of 2.89 billion user accounts.

Because if we think of how Facebook would benefit from an open Metaverse, we would realize that the main benefits will be related (at least in the next decade) to how they extend the functionality of the Facebook app and Instagram.

As such, we may see a far more 'real' experience of spatial worlds than we generally imagine. They will extend your Facebook Groups, be an add-on to your Facebook page, or be the result of adding a 3D store when you run a Facebook ad.

And while we may still attend Ariana Grande concerts in the Fortnite corner of the Metaverse or hang out at the Bored Apes Yacht Club, Facebook might also manage to take a slice of value out of that time by getting you there in the first place.

How We Get To The Metaverse

Consider this: you're browsing Instagram and see an ad for a pair of shoes. Even with today's technology (whether in Snapchat, IG or elsewhere) you can view these shoes on your physical feet because of the power of your phone's camera and the supporting AI.

Have a look at this demo (click through to see video on Twitter):

[embedded tweet with video]

Pretty cool! Digital and physical worlds blurring together.

But what happens if (5 years from now) you add a button that says: "visit store in 3D".

  • If you've been 'seeing' these virtual shoes through your AR glasses, the store materializes in the room around you
  • If you're on your phone you can either view the virtual showroom on your phone or bounce it over to your laptop

Or, if you've been shoe shopping in VR, that same store is also accessible. Just hang a left after the dance club while wearing your virtual reality glasses.

Regardless of how you get there, it has been a short hop from some micro piece of content into a 3D experience.

Linking Spatial Experiences

Now, let's say that store has a teleport button. Or maybe you can just walk down the virtual street. And you can visit a sock store owned by another brand.

You've experienced seamless interoperability between an app (Facebook/Instagram) and a 3D space, and the 3D space is connected to others. This interoperability works across devices: from phone to glasses, from VR to computer screen. From one virtual space (or world) to another.

You've entered the Metaverse. And Instagram was a natural entry point.

Oh...and Facebook made bank on that initial ad. Or maybe they even helped the shoe company set up their store.

Facebook doesn't need to own the shoe store in the Metaverse. Sure - maybe they rented some server space. But they might also just link across to a shoe store in Decentraland or Epicland.

They keep making money the way they always have.

But there's more.

Your Avatar Is You

I made the following speculation (click to see the discussion):

[embedded tweet]

Ready Player Me lets you create an avatar. The avatar is 'interoperable'. Meaning, the avatar you create can enter a bunch of different 'worlds'.

No need to create a new one for every virtual space you enter. You reduce the friction which currently exists between being interested and participating: the set-up time has been reduced.

It's an ambitious and much-needed functionality as we head towards interconnected 3D games and virtual worlds.

After all, why do I need to keep signing up every time I enter a new place? And why do I need to create a new avatar for every experience?

My speculation above, however, was in recognition that Facebook has one massive strategic value: it already has 2.89 billion user accounts.

Imagine if it could automatically generate an avatar for every one of those 2.89 billion accounts? Maybe it scrapes their profile photos or something to create an avatar that looks like them right out of the box. No work needed - you just wake up one day and have one.

By eliminating the friction in creating an avatar identity, Facebook would suddenly have a massive avatar population ready to roam the Metaverse.

(By the way, Snapchat is on a similar trajectory. And so is Apple, although with less utility).

And Facebook would want these avatars to be interoperable. Because if Facebook can promise its users that their avatars can travel across the Metaverse then it is giving them an incentive.

Why would your aunt create an avatar using any other system? Facebook already gave her one.

And what about the virtual worlds themselves - all of those spaces which make up the Metaverse?

No brand is going to turn down access to an audience of 2.89 billion. And neither will the little shop keepers, the museums, the companies selling socks or your local council.

That's a lot of users. All of them with avatars. All of them ready to travel.

But...Anonymous!

Now - will those avatars link back to the real names of the people behind them? Will your avatar have your credit card attached to it? Will you have a pop-up friends list so you can invite them along for your book club or whatever? Will your interests on Facebook help to recommend cool places for you to visit in the Metaverse?

Yes. And most people will be fine with that.

Let me repeat that: Yes. And most people will be fine with that.

Most people don't care about pseudoanonymity now. And most people won't. They aren't jumping into another world to play a fantasy game.

They just want to try on shoes.

Or maybe they want to hang out with their book club. Because suddenly every group on Facebook has a little icon that says "meet in 3D". And you don't need to set up an avatar, it's cheap (or free) to rent "space", and you really want to know that you're chatting with your Aunt Sally and not randomCatLover21237.

[By the way - I am not saying any of this because I necessarily support it. But I'm trying to recognize that pseudoanonymity is more of a concern within tech circles than out of them].

Facebook Doesn't Want to Build a Dystopia

(It just wants to make money helping you to get there).

Mark Zuckerberg announced that Facebook will become a Metaverse company within 5 years:

And my hope, if we do this well, I think over the next five years or so, in this next chapter of our company, I think we will effectively transition from people seeing us as primarily being a social media company to being a metaverse company.

Somehow, a consensus emerged that this meant Facebook wants to own the Metaverse. And a lot of people seem to equate this solely with Oculus or with the idea that Facebook has plans for some ad-infested dystopia.

But these ideas don't make either technical or strategic sense:

First, no one can OWN the Metaverse. Mark himself makes this point:

"The metaverse is a vision that spans many companies — the whole industry. You can think about it as the successor to the mobile internet. And it’s certainly not something that any one company is going to build, but I think a big part of our next chapter is going to hopefully be contributing to building that, in partnership with a lot of other companies and creators and developers."

Second, it isn't even technically possible for someone to own the full stack of the Metaverse.

We don't even need to argue over your personal definition of the word "Metaverse".

There are too many players, too many blockchain worlds, too many games, too many Epics and Niantics running around with their own Metaverse plans. All of these 3D spaces will link up because the technology already exists to make that happen.

The only limitation is a lack of agreement on standards.

(And yes, with today's tech, the connections would be flaky, we'll need 5G and edge computing etc etc. I didn't say today's Metaverse would be any good - just that the inventions are already mostly there, we're just searching for consensus on how to hook it all together).

Third, Mark said that Facebook will shift from being a social media company to a metaverse one, but he didn't say he's abandoning his business model. It doesn't make strategic sense for Facebook to throw out its business model, its reliance on its main apps, or its focus on advertising and monetizing attention.

In other words, Facebook is best served by extending what has worked for it so well before.

And finally, Mark himself said that this wasn't about virtual reality:

"But the metaverse isn’t just virtual reality. It’s going to be accessible across all of our different computing platforms; VR and AR, but also PC, and also mobile devices and game consoles. Speaking of which, a lot of people also think about the metaverse as primarily something that’s about gaming. And I think entertainment is clearly going to be a big part of it, but I don’t think that this is just gaming. I think that this is a persistent, synchronous environment where we can be together, which I think is probably going to resemble some kind of a hybrid between the social platforms that we see today, but an environment where you’re embodied in it."

But What About VR?

But doesn't this ignore Oculus and the work Facebook is doing on AR glasses?

Not really. But it's important to see those as components of, rather than key drivers of the Facebook Metaverse strategy.

I personally think that their investment in optical devices will end up being seen as a defensive move against an emerging generation of new 'wearables'.

Just like no one will own the Metaverse, no one will own the AR or VR markets. We might each end up owning (those of us with the means to do so) 3-4 pairs of glasses for different situations and experiences.

Facebook might end up being a market leader in VR. They may even win the battle with Apple, Niantic and a dozen other companies for AR.

But these will still be niche markets for at least a decade.

In the meantime, the Metaverse will be built, will grow, and will attract upwards of a billion or more users. And so at least for the next decade, the true strategic value for Facebook will be driven by its apps, and how they link to these new worlds.

Data In and Data Out

We think of Facebook as a walled garden. But it's not.

Facebook is everywhere. 17% of websites have a Facebook pixel. This pixel lets Facebook track you...even when you aren't on Facebook. 8.4 million websites send data back to Facebook.

And then there's social sign-in. 90% of people who use social login on sites other than Facebook use...Facebook.

In other words, a large part of Facebook's business model has always been collating data from outside its "main apps".

Why would the Metaverse be any different?

Facebook doesn't own those 8.4 million websites.

With the Metaverse, the ways that content will flow across digital domains will shift.

Scan Everything

I got to thinking about this after a lengthy chat with Keith Jordan, one of the smartest technologists around (in large part because his insights are grounded in, well, reality).

He outlined a scenario where Facebook could use photogrammetry to capture memories of an event and allow you to relive the experience in 3D.

Take a scan of the wedding cake, the bride on the steps of the church, and your drunk uncle passed out in the corner and bingo - you can now relive the special day in the Metaverse (or post the scans on your Facebook page).

Add scanning to Instagram and you suddenly have a new generation of photos. And in order to properly "see" a recreation of your wedding day - a virtual environment seems pretty ideal.

It's just another example of how Facebook will attempt to capture value as information and 3D content moves across domains.

And if Facebook can take a huge chunk of the gateways into the Metaverse, the avatars we use to travel when we get there, and the apps we post our Metaverse memories to...then don't they get a bigger win than selling a bunch more VR headsets?

Or maybe you think Facebook is planning to change its business model? If so, I'd suggest this deep dive by Napkin Math on "Is Facebook Fixable?". I concur with most of what it says. But leave it to you to make your own decisions.

A Metaverse Strategy

None of this is possible for Facebook without an open Metaverse. At the simplest level, the above scenarios aren't possible unless:

  • There is a universal standard for linking to a 3D space. You need to be able to click that button in the shoe ad and have it take you to a shoe store
  • There's a universal standard for allowing avatars to move seamlessly into a 3D world. If you need to sign up for a new account every time you click that "see in 3D" button, no one will come.

Now, I don't know what Facebook's real plans are. But the above speculation serves a few purposes:

  • It helps to visualize that the emergence of the Metaverse will be driven, in part, by the interests of extremely large and well-financed players.
  • It shows how a company like Facebook can be supportive of an open Metaverse because it has the most to gain: it's one of the few companies in the world with a billion people who are 'avatar ready'
  • It reminds us that the Metaverse may have huge sections which are more "real" than we might imagine. It will have book clubs or city council offices, scans of our pets or 3D photos of our wedding, endless stores with shoes or furniture.
  • It demonstrates that while we might want pseudoanonymity or distributed blockchain identity systems - this will come up against the awesome power of Facebook and its 2.89 billion accounts. Most of whom don't care. And most of whom, frankly, just want to meet up with their best friend or uncle, and see them by their name.
  • It helps us to understand that many of the entry points to the Metaverse might use "legacy" systems. As soon as we create a universal standard for virtual world URLs, Facebook can start to monetize the links. A shoe ad on Instagram is as likely a way to get into the Metaverse as any other source - and even a concert needs Facebook's insane reach if it wants more than 100 people to show up.
  • And finally, it reminds us that the Metaverse is about experiences. No one cares whether the Metaverse starts or ends at their glasses. They care that they were able to hang out with their friends, or buy shoes, or see Ariana Grande - whether she appears on the floor of their living room, via a VR headset, or on a computer screen. You were synchronous with others. You were there. And the device you use to see her won't really matter to you the user.

Now, this is just one hypothesis for Facebook's Metaverse future. But even if it's the correct hypothesis there's still a wild card.

Because Facebook isn't the only player.

As much as Facebook and the other big players will be throwing billions at the strategic problem of ensuring their relevance in the Metaverse, they're still up against a powerful force: the people with a passion for building something better, who want to create something that leaves some of these old paradigms behind.

In other words, Facebook may be big and powerful, but the mighty have fallen before.

And they will fall again. Just don't expect them to go down without a fight.

Facebook, the Metaverse and Building Bridges

Facebook is all in on the metaverse.

Their intention is to pivot from being a social media company to a metaverse company:

“And my hope, if we do this well, I think over the next five years or so, in this next chapter of our company, I think we will effectively transition from people seeing us as primarily being a social media company to being a metaverse company,” Zuckerberg said in an interview with the Verge.

Today, Andrew Bosworth (Boz), who leads Facebook Reality Labs, announced the formation of a metaverse product group that will pull in heavyweights from Instagram (Vishal Shah, head of product) and Facebook Gaming (Vivek Sharma, who will lead the Horizon teams), while Oculus OG Jason Rubin will lead the Content team.

Continents on the Metaverse

Facebook joins Epic Games (makers of Fortnite), Niantic (makers of Pokémon Go, which is creating a metaverse at physical world-scale), Roblox and Apple (who will never call it the metaverse) in trying to build the next evolution in computing.

I recently wrote that the metaverse is a sort of cultural proxy term for a major shift in computing:

  • From one that is (mostly) flat and two-dimensional to one which is spatial
  • From a time when reality itself is separate from the digital, to a time when real-time digital content is mapped directly onto our own world (truly a "meta" layer on physical reality)
  • From a time when 3D environments are mostly for games, to one when we consume the majority of our entertainment and spend increasing amounts of our social and work time in 3D digital spaces.

The evidence of this shift can be found in everything from user hours 'in-worlds' to how much money is being spent, from virtual concerts that pull in millions of concurrent attendees to the increased capacity for our phones to take 3D snapshots or to scan the room with LiDAR in order to enhance augmented reality.

These provable trendlines, however, don't make a "metaverse". The Metaverse implies travel. It implies that we can move between 3D experiences as easily as we click a link in a browser.

(Although there are a lot of bottlenecks with this too, as you experience every time you hit a sign-up form, a paywall, or need to enter your credit card number YET AGAIN.)

Imagine needing one browser for social media, another for shopping, and another to read the news.

That's the experience of the metaverse today: you need one client for Roblox, an app to play Pokémon Go, a different app for Fortnite, and you even need special equipment to jump into VR.

And just like the web, you also need different 'identities' (sign-ups) and wallets to fully participate in these spaces.

The continents are getting bigger but there's no way to move between them and we can't easily take our identity, possessions and money with us.

As Facebook describes it: "But to achieve our full vision of the Metaverse, we also need to build the connective tissue between these spaces -- so you can remove the limitations of physics and move between them with the same ease as moving from one room in your home to the next."

Oh: I bolded the word "our". We'll come back to that.

The Cultural Layer of the Metaverse and the Tech Stack

Right now, culturally, the metaverse is a big amorphous blob against which a lot of ideas are being attached.

Most people have never heard of it. Most people don't care whether Hololens will ever become a consumer device and they aren't sitting at home wondering about the FOV on a pair of Nreal glasses. They only care whether it sounds cool and what they'll be able to do once they get there - even if "there" is some generalized future.

Javier interpreted my observation of this fact as a disservice:

[embedded tweet]

And Javier is right: because beneath all of this top level noise is the hard work of creating the actual technology that will make it work, and the even tougher job of creating the standards that will help make all of this seamless interconnectivity possible.

We can talk for days in generalities about 'the metaverse'.

But that talk MATTERS, because culture eventually translates into requests for development, for projects.

"Give me one of those NFT things," is what they're yelling right now in some headquarters somewhere. And next up (believe me, I've been asked): "get me onto the Metaverse!"

Meanwhile, programmers and developers sit in a room and actually make stuff.

And what they develop will have embedded values. Code is not agnostic. Code makes a statement about what we believe, whether it's through insisting that you use your real name and birth gender or that transactions be distributed (or centralized).

Code becomes culture and culture becomes code. Usually the former, as anonymized email, the lack of micropayments, and other design choices in how the Internet was built make clear.

Which brings us back to that "our" word that Boz used: "But to achieve our full vision of the Metaverse, we also need to build the connective tissue between these spaces."

Now, maybe when he said "OUR" and "WE" he meant all of us: but unless we're all reporting in to this new product team that he mentioned in the next paragraph, I think he's really talking about Facebook.

And so let's be concrete: we'd better be concerned about who's making the decisions about how all of these worlds will interconnect; how anonymity, pseudoanonymity, and identity will be handled (signed into VR with your Facebook account lately?); who will have access to our bank records once we arrive; whether users will value sex or shopping, creativity or socializing; and how the whole damn thing will be paid for in the first place.

You Build the World

We're at an inflection point. The buzz will die down. Or, more accurately, it will come in waves. Everyone will talk about the metaverse for a few weeks and then people will get back to work.

But Facebook's announcement is another proof point that the largest players around are making huge and very public bets on the 'metaverse'.

But we all have agency here. Facebook doesn't have an ordained right to determine what the metaverse should be, or how the connective tissue should be designed.

Thankfully, I actually have some faith in the Reality Labs team and their willingness to publish, share, and participate in open source initiatives. (Their bosses on the other hand are a different story).

The storylines are being drawn: some of them highly tangible and specific (passthrough VR!) and some of them vague and aspirational.

At the one end of the spectrum are the big players like Facebook. And they'll either become... well, Facebook, but making even more money from the ever-greater amounts of time we spend with them.

Or, they might become AOL. Trying to build a giant closed garden and then discovering it all blown away by the tsunami of a more open metaverse.

Because at the other end of the spectrum is a grass-roots led version of the metaverse. One in which NFTs aren't just over-inflated fan clubs, but a proof of concept that value can be decentralized and ripped from the control of centralized corporations and governments.

In this more open metaverse, maybe the values of creativity, beautiful code and co-creation can help to get these new worlds right.

In truth, both will exist: huge continents and smaller islands, open seas and walled kingdoms.

But perhaps the interplay between the two will allow for a convergence of culture and code which values the user first, and the very human experience of entering whole new worlds.
