Yesterday — 8 November 2024
New on MIT Technology Review

The Download: AI vs quantum, and the future of reproductive rights in the US

8 November 2024 at 14:10

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

Why AI could eat quantum computing’s lunch

Tech companies have been funneling billions of dollars into quantum computers for years. The hope is that they’ll be a game changer for fields as diverse as finance, drug discovery, and logistics.

But while the field struggles with the realities of tricky quantum hardware, another challenger is making headway in some of these most promising use cases. AI is now being applied to fundamental physics, chemistry, and materials science in a way that suggests quantum computing’s purported home turf might not be so safe after all. Read the full story.

—Edd Gent

What’s next for reproductive rights in the US

This week, it wasn’t just the future president of the US that was on the ballot. Ten states also voted on abortion rights.

Two years ago, the US Supreme Court overturned Roe v. Wade, a legal decision that protected the right to abortion. Since then, abortion bans have been enacted in multiple states, and millions of people in the US have lost access to local clinics.

Now, some states are voting to extend and protect access to abortion. Missouri, a state that has long restricted access, even voted to overturn its ban. But it’s not all good news for proponents of reproductive rights. Read the full story.

—Jessica Hamzelou

This story is from The Checkup, our weekly newsletter giving you the inside track on all things biotech. Sign up to receive it in your inbox every Thursday.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Black Americans received racist texts threatening them with slavery 
Some of the messages claim to be from Trump supporters or the Trump administration. (WP $)
+ What Trump’s last tenure as president can teach us about what’s coming. (New Yorker $)
+ The January 6 rioters are hoping for early pardons and release. (Wired $)

2 China is shoring up its economy to the tune of $1.4 trillion
It’s bracing itself for increased trade tensions with a Trump-governed US. (FT $)
+ The country’s chip industry has a plan too. (Reuters)
+ We’re witnessing the return of Trumponomics. (Economist $)
+ Here’s how the tech markets have reacted to his reelection. (Insider $)

3 How crypto came out on top
Trump is all in, even if he previously dismissed it as a scam. (Bloomberg $)
+ Enthusiasts are hoping for less regulation and more favorable legislation. (Time $)

4 A weight-loss drug contributed to the death of a nurse in the UK
Susan McGowan took two doses of Mounjaro in the weeks before her death. (BBC)
+ It’s the first known death to be officially linked to the drug in the UK. (The Guardian)

5 An academic’s lawsuit against Meta has been dismissed
Ethan Zuckerman sought protection from the firm so he could build an unfollowing tool. (NYT $)

6 How the Republicans won online
The right-wing influencer ecosystem is extremely powerful and effective. (The Atlantic $)
+ The left doesn’t really have an equivalent network. (Vox)
+ X users are considering leaving the platform in protest (again). (Slate $)

7 What does the future of America’s public health look like?
Noted conspiracy theorist and anti-vaxxer RFK Jr could be in charge soon. (NY Mag $)
+ Letting Kennedy “go wild on health” is not a great sign. (Forbes $)
+ His war on fluoride in drinking water is already underway. (Politico)

8 An AI-created portrait of Alan Turing has sold for $1 million
Just… why? (The Guardian)
+ Why artists are becoming less scared of AI. (MIT Technology Review)

9 How to harness energy from space
A relay system of transmitters could help beam it back to Earth. (IEEE Spectrum)
+ The quest to figure out farming on Mars. (MIT Technology Review)

10 AI-generated videos are not interesting
That’s according to the arbiters of what is and isn’t interesting over at Reddit. (404 Media)
+ What’s next for generative video. (MIT Technology Review)

Quote of the day

“That’s petty, right? How much does one piece of fruit per day cost?”

—A former Intel employee reacts to the news that the embattled company is planning to restore free coffee privileges for its staff—but not free fruit, Insider reports.

The big story

Recapturing early internet whimsy with HTML

December 2023

Websites weren’t always slick digital experiences. 

There was a time when surfing the web involved opening tabs that played music against your will and sifting through walls of text on a colored background. In the 2000s, before Squarespace and social media, websites were manifestations of individuality—built from scratch using HTML, by users who had some knowledge of code. 

Scattered across the web are communities of programmers working to revive this seemingly outdated approach. And the movement is anything but a superficial appeal to retro aesthetics—it’s about celebrating the human touch in digital experiences. Read the full story.

—Tiffany Ng

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)

+ Sandwiches through the ages is a pretty great subject for a book.
+ Art Garfunkel and Paul Simon are getting the band back together! (kind of)
+ Instant mashed potatoes have a bad reputation. But it doesn’t have to be this way.
+ Here’s what an actual robot apocalypse would look like (thanks Will!) 🤖

Before yesterday
New on MIT Technology Review

What’s next for reproductive rights in the US

7 November 2024 at 19:00

This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.

Earlier this week, Americans cast their votes in a seminal presidential election. But it wasn’t just the future president of the US that was on the ballot. Ten states also voted on abortion rights.

Two years ago, the US Supreme Court overturned Roe v. Wade, a legal decision that protected the right to abortion. Since then, abortion bans have been enacted in multiple states, and millions of people in the US have lost access to local clinics.

Now, some states are voting to extend and protect access to abortion. This week, seven states voted in support of such measures. And voters in Missouri, a state that has long restricted access, have voted to overturn its ban.

It’s not all good news for proponents of reproductive rights—some states voted against abortion access. And questions remain over the impact of a second term under former president Donald Trump, who is set to return to the post in January.

Roe v. Wade, the legal decision that enshrined a constitutional right to abortion in the US in 1973, guaranteed the right to an abortion up to the point of fetal viability, which is generally considered to be around 24 weeks of pregnancy. It was overturned by the US Supreme Court in the summer of 2022.

Within 100 days of the decision, 13 states had enacted total bans on abortion from the moment of conception. Clinics in these states could no longer offer abortions. Other states also restricted abortion access. In that 100-day period, 66 of the 79 clinics across 15 states stopped offering abortion services, and 26 closed completely, according to research by the Guttmacher Institute.

The political backlash to the decision was intense. This week, abortion was on the ballot in 10 states: Arizona, Colorado, Florida, Maryland, Missouri, Montana, Nebraska, Nevada, New York, and South Dakota. And seven of them voted in support of abortion access.

The impact of these votes will vary by state. Abortion was already legal in Maryland, for example. But the new measures should make it more difficult for lawmakers to restrict reproductive rights in the future. In Arizona, abortions after 15 weeks had been banned since 2022. There, voters approved an amendment to the state constitution that will guarantee access to abortion until fetal viability.

Missouri was the first state to enact an abortion ban once Roe v. Wade was overturned. The state’s current Right to Life of the Unborn Child Act prohibits doctors from performing abortions unless there is a medical emergency. It has no exceptions for rape or incest. This week, the state voted to overturn that ban and protect access to abortion up to fetal viability. 

Not all states voted in support of reproductive rights. Amendments to expand access failed to garner enough support in Nebraska, South Dakota, and Florida. In Florida, for example, where abortions after six weeks of pregnancy are banned, an amendment to protect access until fetal viability got 57% of the vote, falling just short of the 60% the state required for it to pass.

It’s hard to predict how reproductive rights will fare over the course of a second Trump term. Trump himself has been inconsistent on the issue. During his first term, he installed members of the Supreme Court who helped overturn Roe v. Wade. During his most recent campaign he said that decisions on reproductive rights should be left to individual states.

Trump, himself a Florida resident, has refused to comment on how he voted in the state’s recent ballot question on abortion rights. When asked, he said that the reporter who posed the question “should just stop talking about that,” according to the Associated Press.

State decisions can affect reproductive rights beyond abortion access. Just look at Alabama. In February, the Alabama Supreme Court ruled that frozen embryos can be considered children under state law. Embryos are routinely cryopreserved in the course of in vitro fertilization treatment, and the ruling was considered likely to significantly restrict access to IVF in the state. (In March, the state passed another law protecting clinics from legal repercussions should they damage or destroy embryos during IVF procedures, but the status of embryos remains unchanged.)

The fertility treatment became a hot topic during this year’s campaign. In October, Trump bizarrely referred to himself as “the father of IVF.” That title is usually reserved for Robert Edwards, the British researcher who won the 2010 Nobel prize in physiology or medicine for developing the technology in the 1970s.

Whatever is in store for reproductive rights in the US in the coming months and years, all we’ve seen so far suggests that it’s likely to be a bumpy ride.


Now read the rest of The Checkup

Read more from MIT Technology Review’s archive

My colleague Rhiannon Williams reported on the immediate aftermath of the decision that reversed Roe v. Wade when it was announced a couple of years ago. 

The Alabama Supreme Court ruling on embryos could also affect the development of technologies designed to serve as “artificial wombs,” as Antonio Regalado explained at the time.

Other technologies are set to change the way we have babies. Some, which could lead to the creation of children with four parents or none at all, stand to transform our understanding of parenthood.  

We’ve also reported on attempts to create embryo-like structures using stem cells. These structures look like embryos but are created without eggs or sperm. There’s a “wild race” afoot to make these more like the real thing. But both scientific and ethical questions remain over how far we can—and should—go.

My colleagues have been exploring what the US election outcome might mean for climate policies. Senior climate editor James Temple writes that Trump’s victory is “a stunning setback for climate change.” And senior reporter Casey Crownhart explains how efforts including a trio of laws implemented by the Biden administration, which massively increased climate funding, could be undone.

From around the web

Donald Trump has said he’ll let Robert F. Kennedy Jr. “go wild on health.” Here’s where the former environmental lawyer and independent candidate—who has no medical or public health degrees—stands on vaccines, fluoride, and the Affordable Care Act. (New York Times)

Bird flu has been detected in pigs on a farm in Oregon. It’s a worrying development that virologists were dreading. (The Conversation)

And, in case you need it, here’s some lighter reading:

Scientists are sequencing the DNA of tiny marine plankton for the first time. (Come for the story of the scientific expedition; stay for the beautiful images of jellies and sea sapphires.) (The Guardian)

Dolphins are known to communicate with whistles and clicks. But scientists were surprised to find a “highly vocal” solitary dolphin in the Baltic Sea. They think the animal is engaging in “dolphin self-talk.” (Bioacoustics)

How much do you know about baby animals? Test your knowledge in this quiz. (National Geographic)

Why AI could eat quantum computing’s lunch

By: Edd Gent
7 November 2024 at 15:00

Tech companies have been funneling billions of dollars into quantum computers for years. The hope is that they’ll be a game changer for fields as diverse as finance, drug discovery, and logistics.

Those expectations have been especially high in physics and chemistry, where the weird effects of quantum mechanics come into play. In theory, this is where quantum computers could have a huge advantage over conventional machines.

But while the field struggles with the realities of tricky quantum hardware, another challenger is making headway in some of these most promising use cases. AI is now being applied to fundamental physics, chemistry, and materials science in a way that suggests quantum computing’s purported home turf might not be so safe after all.

The scale and complexity of quantum systems that can be simulated using AI is advancing rapidly, says Giuseppe Carleo, a professor of computational physics at the Swiss Federal Institute of Technology (EPFL). Last month, he coauthored a paper published in Science showing that neural-network-based approaches are rapidly becoming the leading technique for modeling materials with strong quantum properties. Meta also recently unveiled an AI model trained on a massive new data set of materials that has jumped to the top of a leaderboard for machine-learning approaches to material discovery.

Given the pace of recent advances, a growing number of researchers are now asking whether AI could solve a substantial chunk of the most interesting problems in chemistry and materials science before large-scale quantum computers become a reality. 

“The existence of these new contenders in machine learning is a serious hit to the potential applications of quantum computers,” says Carleo. “In my opinion, these companies will find out sooner or later that their investments are not justified.”

Exponential problems

The promise of quantum computers lies in their potential to carry out certain calculations much faster than conventional computers. Realizing this promise will require much larger quantum processors than we have today. The biggest devices have just crossed the thousand-qubit mark, but achieving an undeniable advantage over classical computers will likely require tens of thousands, if not millions. Once that hardware is available, though, a handful of quantum algorithms, like the encryption-cracking Shor’s algorithm, have the potential to solve problems exponentially faster than classical algorithms can. 

But for many quantum algorithms with more obvious commercial applications, like searching databases, solving optimization problems, or powering AI, the speed advantage is more modest. And last year, a paper coauthored by Microsoft’s head of quantum computing, Matthias Troyer, showed that these theoretical advantages disappear if you account for the fact that quantum hardware operates orders of magnitude slower than modern computer chips. The difficulty of getting large amounts of classical data in and out of a quantum computer is also a major barrier. 

So Troyer and his colleagues concluded that quantum computers should instead focus on problems in chemistry and materials science that require simulation of systems where quantum effects dominate. A computer that operates along the same quantum principles as these systems should, in theory, have a natural advantage here. In fact, this has been a driving idea behind quantum computing ever since the renowned physicist Richard Feynman first proposed it.

The rules of quantum mechanics govern many things with huge practical and commercial value, like proteins, drugs, and materials. Their properties are determined by the interactions of their constituent particles, in particular their electrons—and simulating these interactions in a computer should make it possible to predict what kinds of characteristics a molecule will exhibit. This could prove invaluable for discovering things like new medicines or more efficient battery chemistries, for example. 

But the intuition-defying rules of quantum mechanics—in particular, the phenomenon of entanglement, which allows the quantum states of distant particles to become intrinsically linked—can make these interactions incredibly complex. Precisely tracking them requires complicated math that gets exponentially tougher the more particles are involved. That can make simulating large quantum systems intractable on classical machines.
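To make that scaling concrete, here is a minimal Python sketch (our illustration, not drawn from the article): describing a general quantum state classically takes one complex amplitude per basis configuration, and that count doubles with every particle added.

```python
# Toy illustration of why exact classical simulation hits a wall:
# a general n-qubit state needs 2**n complex amplitudes to write down.

def amplitudes_needed(n_qubits: int) -> int:
    # One complex amplitude per basis state of the n-qubit system.
    return 2 ** n_qubits

for n in (10, 30, 50):
    amps = amplitudes_needed(n)
    # 16 bytes per complex amplitude (two 64-bit floats).
    gib = amps * 16 / 2**30
    print(f"{n} qubits: {amps:.3e} amplitudes, about {gib:,.0f} GiB")
```

At 50 qubits the state vector alone would occupy roughly 16 petabytes, which is why exactly tracking strongly entangled systems quickly becomes intractable on classical machines.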

This is where quantum computers could shine. Because they also operate on quantum principles, they are able to represent quantum states much more efficiently than is possible on classical machines. They could also take advantage of quantum effects to speed up their calculations.

But not all quantum systems are the same. Their complexity is determined by the extent to which their particles interact, or correlate, with each other. In systems where these interactions are strong, tracking all these relationships can quickly explode the number of calculations required to model the system. But in most that are of practical interest to chemists and materials scientists, correlation is weak, says Carleo. That means their particles don’t affect each other’s behavior significantly, which makes the systems far simpler to model.

The upshot, says Carleo, is that quantum computers are unlikely to provide any advantage for most problems in chemistry and materials science. Classical tools that can accurately model weakly correlated systems already exist, the most prominent being density functional theory (DFT). The insight behind DFT is that all you need to understand a system’s key properties is its electron density, a measure of how its electrons are distributed in space. This makes for much simpler computation but can still provide accurate results for weakly correlated systems.

Simulating large systems using these approaches requires considerable computing power. But in recent years there’s been an explosion of research using DFT to generate data on chemicals, biomolecules, and materials—data that can be used to train neural networks. These AI models learn patterns in the data that allow them to predict what properties a particular chemical structure is likely to have, but they are orders of magnitude cheaper to run than conventional DFT calculations. 

This has dramatically expanded the size of systems that can be modeled—to as many as 100,000 atoms at a time—and how long simulations can run, says Alexandre Tkatchenko, a physics professor at the University of Luxembourg. “It’s wonderful. You can really do most of chemistry,” he says.

Olexandr Isayev, a chemistry professor at Carnegie Mellon University, says these techniques are already being widely applied by companies in chemistry and life sciences. And for researchers, previously out of reach problems such as optimizing chemical reactions, developing new battery materials, and understanding protein binding are finally becoming tractable.

As with most AI applications, the biggest bottleneck is data, says Isayev. Meta’s recently released materials data set was made up of DFT calculations on 118 million molecules. A model trained on this data achieved state-of-the-art performance, but creating the training material took vast computing resources, well beyond what’s accessible to most research teams. That means fulfilling the full promise of this approach will require massive investment.

Modeling a weakly correlated system using DFT is not an exponentially scaling problem, though. This suggests that with more data and computing resources, AI-based classical approaches could simulate even the largest of these systems, says Tkatchenko. Given that quantum computers powerful enough to compete are likely still decades away, he adds, AI’s current trajectory suggests it could reach important milestones, such as precisely simulating how drugs bind to a protein, much sooner.

Strong correlations

When it comes to simulating strongly correlated quantum systems—ones whose particles interact a lot—methods like DFT quickly run out of steam. While more exotic, these systems include materials with potentially transformative capabilities, like high-temperature superconductivity or ultra-precise sensing. But even here, AI is making significant strides.

In 2017, EPFL’s Carleo and Microsoft’s Troyer published a seminal paper in Science showing that neural networks could model strongly correlated quantum systems. The approach doesn’t learn from data in the classical sense. Instead, Carleo says, it is similar to DeepMind’s AlphaZero model, which mastered the games of Go, chess, and shogi using nothing more than the rules of each game and the ability to play itself.

In this case, the rules of the game are provided by Schrödinger’s equation, which can precisely describe a system’s quantum state, or wave function. The model plays against itself by arranging particles in a certain configuration and then measuring the system’s energy level. The goal is to reach the lowest energy configuration (known as the ground state), which determines the system’s properties. The model repeats this process until energy levels stop falling, indicating that the ground state—or something close to it—has been reached.

The power of these models is their ability to compress information, says Carleo. “The wave function is a very complicated mathematical object,” he says. “What has been shown by several papers now is that [the neural network] is able to capture the complexity of this object in a way that can be handled by a classical machine.”

Since the 2017 paper, the approach has been extended to a wide range of strongly correlated systems, says Carleo, and results have been impressive. The Science paper he published with colleagues last month put leading classical simulation techniques to the test on a variety of tricky quantum simulation problems, with the goal of creating a benchmark to judge advances in both classical and quantum approaches.

Carleo says that neural-network-based techniques are now the best approach for simulating many of the most complex quantum systems they tested. “Machine learning is really taking the lead in many of these problems,” he says.

These techniques are catching the eye of some big players in the tech industry. In August, researchers at DeepMind showed in a paper in Science that they could accurately model excited states in quantum systems, which could one day help predict the behavior of things like solar cells, sensors, and lasers. Scientists at Microsoft Research have also developed an open-source software suite to help more researchers use neural networks for simulation.

One of the main advantages of the approach is that it piggybacks on massive investments in AI software and hardware, says Filippo Vicentini, a professor of AI and condensed-matter physics at École Polytechnique in France, who was also a coauthor on the Science benchmarking paper: “Being able to leverage these kinds of technological advancements gives us a huge edge.”

There is a caveat: Because the ground states are effectively found through trial and error rather than explicit calculations, they are only approximations. But this is also why the approach could make progress on what has looked like an intractable problem, says Juan Carrasquilla, a researcher at ETH Zurich, and another coauthor on the Science benchmarking paper.

If you want to precisely track all the interactions in a strongly correlated system, the number of calculations you need to do rises exponentially with the system’s size. But if you’re happy with an answer that is just good enough, there’s plenty of scope for taking shortcuts. 

“Perhaps there’s no hope to capture it exactly,” says Carrasquilla. “But there’s hope to capture enough information that we capture all the aspects that physicists care about. And if we do that, it’s basically indistinguishable from a true solution.”

And while strongly correlated systems are generally too hard to simulate classically, there are notable instances where this isn’t the case. That includes some systems that are relevant for modeling high-temperature superconductors, according to a 2023 paper in Nature Communications.

“Because of the exponential complexity, you can always find problems for which you can’t find a shortcut,” says Frank Noe, research manager at Microsoft Research, who has led much of the company’s work in this area. “But I think the number of systems for which you can’t find a good shortcut will just become much smaller.”

No magic bullets

However, Stefanie Czischek, an assistant professor of physics at the University of Ottawa, says it can be hard to predict what problems neural networks can feasibly solve. For some complex systems they do incredibly well, but then on other seemingly simple ones, computational costs balloon unexpectedly. “We don’t really know their limitations,” she says. “No one really knows yet what are the conditions that make it hard to represent systems using these neural networks.”

Meanwhile, there have also been significant advances in other classical quantum simulation techniques, says Antoine Georges, director of the Center for Computational Quantum Physics at the Flatiron Institute in New York, who also contributed to the recent Science benchmarking paper. “They are all successful in their own right, and they are also very complementary,” he says. “So I don’t think these machine-learning methods are just going to completely put all the other methods out of business.”

Quantum computers will also have their niche, says Martin Roetteler, senior director of quantum solutions at IonQ, which is developing quantum computers built from trapped ions. While he agrees that classical approaches will likely be sufficient for simulating weakly correlated systems, he’s confident that some large, strongly correlated systems will be beyond their reach. “The exponential is going to bite you,” he says. “There are cases with strongly correlated systems that we cannot treat classically. I’m strongly convinced that that’s the case.”

In contrast, he says, a future fault-tolerant quantum computer with many more qubits than today’s devices will be able to simulate such systems. This could help find new catalysts or improve understanding of metabolic processes in the body—an area of interest to the pharmaceutical industry.

Neural networks are likely to increase the scope of problems that can be solved, says Jay Gambetta, who leads IBM’s quantum computing efforts, but he’s unconvinced they’ll solve the hardest challenges businesses are interested in.

“That’s why many different companies that essentially have chemistry as their requirement are still investigating quantum—because they know exactly where these approximation methods break down,” he says.

Gambetta also rejects the idea that the technologies are rivals. He says the future of computing is likely to involve a hybrid of the two approaches, with quantum and classical subroutines working together to solve problems. “I don’t think they’re in competition. I think they actually add to each other,” he says.

But Scott Aaronson, who directs the Quantum Information Center at the University of Texas, says machine-learning approaches are directly competing against quantum computers in areas like quantum chemistry and condensed-matter physics. He predicts that a combination of machine learning and quantum simulations will outperform purely classical approaches in many cases, but that won’t become clear until larger, more reliable quantum computers are available.

“From the very beginning, I’ve treated quantum computing as first and foremost a scientific quest, with any industrial applications as icing on the cake,” he says. “So if quantum simulation turns out to beat classical machine learning only rarely, I won’t be quite as crestfallen as some of my colleagues.”

One area where quantum computers look likely to have a clear advantage is in simulating how complex quantum systems evolve over time, says EPFL’s Carleo. This could provide invaluable insights for scientists in fields like statistical mechanics and high-energy physics, but it seems unlikely to lead to practical uses in the near term. “These are more niche applications that, in my opinion, do not justify the massive investments and the massive hype,” Carleo adds.

Nonetheless, the experts MIT Technology Review spoke to said a lack of commercial applications is not a reason to stop pursuing quantum computing, which could lead to fundamental scientific breakthroughs in the long run.

“Science is like a set of nested boxes—you solve one problem and you find five other problems,” says Vicentini. “The complexity of the things we study will increase over time, so we will always need more powerful tools.”

The Download: what Trump’s victory means for the climate

7 November 2024 at 14:10

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

Trump’s win is a tragic loss for climate progress

—James Temple

Donald Trump’s decisive victory is a stunning setback for the fight against climate change.

The Republican president-elect’s return to the White House means the US is going to squander precious momentum, unraveling hard-won policy progress that was just beginning to pay off, all for the second time in less than a decade. 

It comes at a moment when the world can’t afford to waste time, with nations far off track from any emissions trajectories that would keep our ecosystems stable and our communities safe. 

Trump could push the globe into even more dangerous terrain by defanging President Joe Biden’s signature climate laws, exacerbating the dangers of heat waves, floods, wildfires, droughts, and famine, and increasing deaths and disease from air pollution. And this time round, I fear it will be far worse. Read the full story.

The US is about to make a sharp turn on climate policy

The past four years have seen the US take climate action seriously, working with the international community and pumping money into solutions. Now, we’re facing a period where things are going to be very different. This is what the next four years will mean for the climate fight. Read the full story.

—Casey Crownhart

This story is from The Spark, a newsletter we send out every Wednesday. If you want to stay up-to-date with all the latest goings-on in climate and energy, sign up.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Tech leaders are lining up to congratulate Donald Trump 
In a bid to placate the famously volatile President-elect. (FT $)
+ Many are seeking to rebuild bridges that have fractured since his last tenure. (CNBC)
+ Particularly Jeff Bezos, who has had a fractious relationship with Trump. (NY Mag $)
+ Expect less regulation, more trade upheaval, and a whole lot more Elon Musk. (WP $)

2 Election deniers have gone mysteriously silent
It’s almost as if their claims of fraud were baseless in the first place. (NYT $)
+ It looks like influencer marketing campaigns really did change minds. (Wired $)

3 How Elon Musk is likely to slash US government spending
He has a long history of strategic cost-cutting in his own businesses. (WSJ $) 
+ His other ventures are on course for favorable government treatment. (Reuters)
+ It’s easy to forget that Musk claims to have voted Democrat in 2020 and 2016. (WP $)

4 Google could be spared being broken up
Trump has expressed skepticism about the antitrust proposal. (Reuters)
+ It’s far from the only reverse-ferret we’re likely to see. (Economist $)

5 How progressive groups are planning for a future under Trump
Alliances are meeting today to form networks of resources. (Fast Company $)

6 Australia wants to ban under-16s from accessing social media
But it’s not clear how it could be enforced. (The Guardian)
+ The proposed law could come into force as soon as next year. (BBC)
+ Roblox has made sweeping changes to its child safety policies. (Bloomberg $)
+ Child online safety laws will actually hurt kids, critics say. (MIT Technology Review)

7 It looks like OpenAI just paid $10 million for a URL
Why ChatGPT when you could just chat.com? (The Verge)
+ How ChatGPT search paves the way for AI agents. (MIT Technology Review)

8 Women in the US are exploring swearing off men altogether
Social media interest in a Korean movement advocating for a man-free life is soaring. (WP $)

9 Gen Z can’t get enough of manifesting
TikTok is teaching them how to will their way to a better life. (Insider $)

10 Tattoo artists are divided over whether they should use AI 
AI-assisted designs have been accused of lacking soul. (WSJ $)

Quote of the day

“Don’t worry, I won’t judge — much. Maybe just an eye roll here and there.”

—Lily, a sarcastic AI teenage avatar and star of language learning app Duolingo, greets analysts tuning into the company’s earnings call, Insider reports.

The big story

The great commercial takeover of low-Earth orbit

April 2024

NASA designed the International Space Station to fly for 20 years. It has lasted six years longer than that, though it is showing its age, and NASA is currently studying how to safely destroy the space laboratory by around 2030. 

The ISS never really became what some had hoped: a launching point for an expanding human presence in the solar system. But it did enable fundamental research on materials and medicine, and it helped us start to understand how space affects the human body. 

To build on that work, NASA has partnered with private companies to develop new, commercial space stations for research, manufacturing, and tourism. If they are successful, these companies will bring about a new era of space exploration: private rockets flying to private destinations. They’re already planning to do it around the moon. One day, Mars could follow. Read the full story.

—David W. Brown

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)

+ Who doesn’t love a smeared makeup look?
+ Time to snuggle up: it’s officially Nora Ephron season. 🍁🧣
+ Walking backwards—don’t knock it ‘til you’ve tried it. It’s surprisingly good for you.
+ Feeling stressed? Here’s how to calm your mind in times of trouble.

Trump’s win is a tragic loss for climate progress

7 November 2024 at 00:46

Donald Trump’s decisive victory is a stunning setback for the fight against climate change.

The Republican president-elect’s return to the White House means the US is going to squander precious momentum, unraveling hard-won policy progress that was just beginning to pay off, all for the second time in less than a decade. 

It comes at a moment when the world can’t afford to waste time, with nations far off track from any emissions trajectories that would keep our ecosystems stable and our communities safe. Under the policies in place today, the planet is already set to warm by more than 3 °C over preindustrial levels in the coming decades.

Trump could push the globe into even more dangerous terrain, by defanging President Joe Biden’s signature climate laws. In fact, a second Trump administration could boost greenhouse-gas emissions by 4 billion tons through 2030 alone, according to an earlier analysis by Carbon Brief, a well-regarded climate news and data site. That will exacerbate the dangers of heat waves, floods, wildfires, droughts, and famine and increase deaths and disease from air pollution, inflicting some $900 billion in climate damages around the world, Carbon Brief found.

I started as the climate editor at MIT Technology Review just as Trump came into office the last time. Much of the early job entailed covering his systematic unraveling of the modest climate policy and progress that President Barack Obama had managed to achieve. I fear it will be far worse this time, as Trump ambles into office feeling empowered and aggrieved, and ready to test the rule of law and crack down on dissent. 

This time his administration will be staffed all the more by loyalists and ideologues, who have already made plans to drive civil servants with expertise and experience out of federal agencies including the Environmental Protection Agency. He’ll be backed by a Supreme Court that he moved well to the right, and which has already undercut landmark environmental doctrines and weakened federal regulatory agencies. 

This time the setbacks will sting more, too, because the US did finally manage to pass real, substantive climate policy, through the slimmest of congressional margins. The Inflation Reduction Act and Bipartisan Infrastructure Law allocated massive amounts of government funding to accelerating the shift to low-emissions industries and rebuilding the US manufacturing base around a clean-energy economy. 

Trump has made clear he will strive to repeal as many of these provisions as he can, tempered perhaps only by Republicans who recognize that these laws are producing revenue and jobs in their districts. Meanwhile, throughout the prolonged presidential campaign, Trump or his surrogates pledged to boost oil and gas production, eliminate federal support for electric vehicles, end pollution rules for power plants, and remove the US from the Paris climate agreement yet again. Each of those goals stands in direct opposition to the deep, rapid emissions cuts now necessary to prevent the planet from tipping past higher and higher temperature thresholds.

Project 2025, considered a blueprint for the early days of a second Trump administration despite his insistence to the contrary, calls for dismantling or downsizing federal institutions including the National Oceanic and Atmospheric Administration and the Federal Emergency Management Agency. That could cripple the nation’s ability to forecast, track, or respond to storms, floods, and fires like those that have devastated communities in recent months.

Observers I’ve spoken to fear that the Trump administration will also return the Department of Energy, which under Biden had evolved its mission toward developing low-emissions technologies, to the primary task of helping companies dig up more fossil fuels.

The US election could create global ripples as well, and very soon. US negotiators will meet with their counterparts at the annual UN climate conference that kicks off next week. With Trump set to move back into the White House in January, they will have little credibility or leverage to nudge other nations to step up their commitments to reducing emissions. 

But those are just some of the direct ways that a second Trump administration will enfeeble the nation’s ability to drive down emissions and counter the growing dangers of climate change. He also has considerable power to stall the economy and sow international chaos amid escalating conflicts in Europe and the Middle East. 

Trump’s eagerness to enact tariffs, slash government spending, and deport major portions of the workforce may stunt growth, drive up inflation, and chill investment. All that would make it far more difficult for companies to raise the capital and purchase the components needed to build anything in the US, whether that means wind turbines, solar farms, and seawalls or buildings, bridges, and data centers. 

President-elect Donald Trump speaks at an election night event in West Palm Beach, Florida.
WIN MCNAMEE/GETTY IMAGES

His clumsy handling of the economy and international affairs may also help China extend its dominance in producing and selling the components that are crucial to the energy transition, including batteries, EVs, and solar panels, to customers around the globe.

If one job of a commentator is to find some perspective in difficult moments, I admit I’m mostly failing in this one.

The best I can do is to say that there will be some meaningful lines of defense. For now, at least, state leaders and legislatures can continue to pass and implement stronger climate rules. Other nations could step up their efforts to cut emissions and assert themselves as global leaders on climate. 

Private industry will likely continue to invest in and build businesses in climate tech and clean energy, since solar, wind, batteries, and EVs have proved themselves as competitive industries. And technological progress can occur no matter who is sitting in the round room on Pennsylvania Avenue, since researchers continue striving to develop cleaner, cheaper ways of producing our energy, food, and goods.

By any measure, the job of addressing climate change is now much harder. Nothing, however, has changed about the stakes. 

Our world doesn’t end if we surpass 2 °C, 2.5 °C, or even 3 °C, but it will steadily become a more dangerous and erratic place. Every tenth of a degree remains worth fighting for—whether two, four, or a dozen years from now—because every bit of warming that nations pull together to prevent eases future suffering somewhere.

So as the shock wears off and the despair begins to lift, the core task before us remains the same: to push for progress, whenever, wherever, and however we can. 

The US is about to make a sharp turn on climate policy

6 November 2024 at 17:29

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Voters have elected Donald Trump to a second term in the White House.

In the days leading up to the election, I kept thinking about what four years means for climate change right now. We’re at a critical moment that requires decisive action to rapidly slash greenhouse-gas emissions from power plants, transportation, industry, and the rest of the economy if we’re going to achieve our climate goals.

The past four years have seen the US take climate action seriously, working with the international community and pumping money into solutions. Now, we’re facing a period where things are going to be very different. A Trump presidency will have impacts far beyond climate, but for the sake of this newsletter, we’ll stay focused on what four years means in the climate fight as we start to make sense of this next chapter. 

Joe Biden arguably did more to combat climate change than any other American president. One of his first actions in office was rejoining the Paris climate accord—Trump pulled out of the international agreement to fight climate change during his first term in office. Biden then quickly set a new national goal to cut US carbon emissions in half, relative to their peak, by 2030.

The Environmental Protection Agency rolled out rules for power plants to slash pollution that harms both human health and the climate. The agency also announced new regulations for vehicle emissions to push the country toward EVs.

And the cornerstone of the Biden years has been unprecedented climate investment. A trio of laws—the Bipartisan Infrastructure Law, the CHIPS and Science Act, and the Inflation Reduction Act—pumped hundreds of billions of dollars into infrastructure and research, much of it on climate.

Now, this ship is about to make a quick turn. Donald Trump has regularly dismissed the threat of climate change and promised throughout the campaign to counter some of Biden’s key moves.

We can expect to see a dramatic shift in how the US talks about climate on the international stage. Trump has vowed to once again withdraw from the Paris agreement. Things are going to be weird at the annual global climate talks that kick off next week.

We can also expect to see efforts to undo some of Biden’s key climate actions, most centrally the Inflation Reduction Act, as my colleague James Temple covered earlier this year.

What, exactly, Trump can do will depend on whether Republicans take control of both houses of Congress. A clean sweep would open up more lanes for targeting legislation passed under Biden. (As of sending this email, Republicans have secured enough seats to control the Senate, but the House is uncertain and could be for days or even weeks.)

I don’t think the rug will be entirely pulled out from under the IRA—portions of the investment from the law are beginning to pay off, and the majority of the money has gone to Republican districts. But there will certainly be challenges to pieces, especially the EV tax credits, which Trump has been laser-focused on during the campaign.

This all adds up to a very different course on climate than what many had hoped we might see for the rest of this decade.

A Trump presidency could add 4 billion metric tons of carbon dioxide emissions to the atmosphere by 2030 over what was expected from a second Biden term, according to an analysis published in April by the website Carbon Brief (this was before Biden dropped out of the race). That projection sees emissions under Trump dropping by 28% below the peak by the end of the decade—nowhere near the 50% target set by Biden at the beginning of his term.

The US, which is currently the world’s second-largest greenhouse-gas emitter and has added more climate pollution to the atmosphere than any other nation, is now very unlikely to hit Biden’s 2030 goal. That’s basically the final nail in the coffin for efforts to limit global warming to 1.5 °C (2.7 °F) over preindustrial levels.

In the days, weeks, and years ahead we’ll be covering what this change will mean for efforts to combat climate change and to protect the most vulnerable from the dangerous world we’re marching toward—indeed, already living in. Stay tuned for more from us.


Now read the rest of The Spark

Related reading

Trump wants to unravel Biden’s landmark climate law. Read our coverage from earlier this year to see what’s most at risk

It’s been two years since the Inflation Reduction Act was passed, ushering in hundreds of billions of dollars in climate investment. Read more about the key provisions in this newsletter from August


Another thing

Jennifer Doudna, one of the inventors of the gene-editing tool CRISPR, says the tech could be a major tool to help address climate change and deal with the growing risks of our changing world. 

The hope is that CRISPR’s ability to chop out specific pieces of DNA will make it faster and easier to produce climate-resilient crops and livestock, while avoiding the pitfalls of previous attempts to tweak the genomes of plants and animals. Read the full story from my colleague James Temple.

Keeping up with climate  

Startup Redoxblox is building a technology that’s not exactly a thermal battery, but it’s not not a thermal battery either. The company raised just over $30 million to build its systems, which store energy in both heat and chemical bonds. (Heatmap)

It’s been a weird fall in the US Northeast—a rare drought has brought a string of wildfires, and New York City is seeing calls to conserve water. (New York Times)

It’s been bumpy skies this week for electric-plane startups. Beta Technologies raised over $300 million in funding, while Lilium may be filing for insolvency soon. (Canary Media)

→ The runway for futuristic electric planes is still a long one. (MIT Technology Review)

Meta’s plan to build a nuclear-powered AI data center has been derailed by a rare species of bee living on land earmarked for the project. (Financial Times)

The atmospheric concentration of methane—a powerful greenhouse gas—has been mysteriously climbing since 2007, and that growth nearly doubled in 2020. Now scientists may have finally figured out the culprits: microbes in wetlands that are getting warmer and wetter. (Washington Post)

Greenhouse-gas emissions from the European Union fell by 8% in 2023. The drop is thanks to efforts to shut down coal-fired power plants and generate more electricity from renewables like solar and wind. (The Guardian)

Four electric school buses could help officials figure out how to charge future bus fleets. A project in Brooklyn will aim to use onsite renewables and smart charging to control the costs and grid stress of EV charging depots. (Canary Media)

Delivering the next-generation barcode

The world’s first barcode, designed in 1948, took more than 25 years to make it out of the lab and onto a retail package. Since then, the barcode has done much more than make grocery checkouts faster—it has remade our understanding of how physical objects can be identified and tracked, creating a new pace and set of expectations for the speed and reliability of modern commerce.

Nearly eighty years later, a new iteration of that technology, which encodes data in two dimensions, is poised to take the stage. Today’s 2D barcode is not only out of the lab but “open to a world of possibility,” says Carrie Wilkie, senior vice president of standards and technology at GS1 US.

2D barcodes encode substantially more information than their 1D counterparts. This enables them to link physical objects to a wide array of digital resources. For consumers, 2D barcodes can provide a wealth of product information, from food allergens, expiration dates, and safety recalls to detailed medication use instructions, coupons, and product offers. For businesses, 2D barcodes can enhance operational efficiencies, create traceability at the lot or item level, and drive new forms of customer engagement.

An array of 2D barcode types supports the information needs of a variety of industries. The GS1 DataMatrix, for example, is used on medication or medical devices, encoding expiration dates, batch and lot numbers, and FDA National Drug Codes. The QR Code is familiar to consumers who have used one to open a website from their phone. Adding a GS1 Digital Link URI to a QR Code enables it to serve two purposes: as both a traditional barcode for supply chain operations, enabling tracking throughout the supply chain and price lookup at checkout, and also as a consumer-facing link to digital information, like expiry dates and serial numbers.
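The dual-purpose design described above works because a GS1 Digital Link URI packs standard GS1 Application Identifiers (GTIN, batch/lot, expiration date) into an ordinary web address that a QR Code can carry. As a rough illustration of that structure (the GTIN, lot value, and resolver domain below are placeholder examples, not drawn from the report), such a URI can be assembled like this:

```python
# Minimal sketch of building a GS1 Digital Link URI.
# The GTIN, lot, expiry, and domain are illustrative placeholders.
from urllib.parse import quote

def gs1_digital_link(gtin, lot=None, expiry=None, domain="https://id.gs1.org"):
    """Assemble a GS1 Digital Link URI.

    GS1 Application Identifiers used here:
      01 = GTIN (path), 10 = batch/lot (path),
      17 = expiration date, YYMMDD (query parameter).
    """
    path = f"/01/{gtin}"
    if lot:
        # Percent-encode the lot value in case it contains reserved characters.
        path += f"/10/{quote(lot, safe='')}"
    uri = domain + path
    if expiry:
        uri += f"?17={expiry}"
    return uri

uri = gs1_digital_link("09506000134352", lot="ABC123", expiry="261231")
print(uri)  # https://id.gs1.org/01/09506000134352/10/ABC123?17=261231
```

A scanner at checkout can extract the GTIN from the path for price lookup, while a consumer’s phone simply follows the same string as a link — the two roles the report describes.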

Regardless of type, however, all 2D barcodes require a business ecosystem backed by data. To capture new value from advanced barcodes, organizations must supply and manage clean, accurate, and interoperable data around their products and materials. For 2D barcodes to deliver on their potential, businesses will need to collaborate with partners, suppliers, and customers and commit to common data standards across the value chain.

Driving the demand for 2D barcodes

Shifting to 2D barcodes—and enabling the data ecosystems behind them—will require investment by business. Consumer engagement, compliance, and sustainability are among the many factors driving this transition.

Real-time consumer engagement: Today’s customers want to feel connected to the brands they interact with and purchase from. Information is a key element of that engagement and empowerment. “When I think about customer satisfaction,” says Leslie Hand, group vice president for IDC Retail Insights, “I’m thinking about how I can provide more information that allows them to make better decisions about their own lives and the things they buy.”

2D barcodes can help by connecting consumers to online content in real time. “If, by using a 2D barcode, you have the capability to connect to a consumer in a specific region, or a specific store, and you have the ability to provide information to that consumer about the specific product in their hand, that can be a really powerful consumer engagement tool,” says Dan Hardy, director of customer operations for HanesBrands, Inc. “2D barcodes can bring brand and product connectivity directly to an individual consumer, and create an interaction that supports your brand message at an individual consumer/product level.”

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

The Download: ice-melting robots, and genetically modified trees

6 November 2024 at 14:10

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

Life-seeking, ice-melting robots could punch through Europa’s icy shell

At long last, NASA’s Europa Clipper mission is on its way. It launched on October 14 and is now en route to its target: Jupiter’s ice-covered moon Europa, whose frozen shell almost certainly conceals a warm saltwater ocean. When the spacecraft gets there, it will conduct dozens of close flybys in order to determine what that ocean is like and, crucially, where it might be hospitable to life.

Europa Clipper is still years away from its destination—it is not slated to reach the Jupiter system until 2030. But that hasn’t stopped engineers and scientists from working on what would come next if the results are promising: a mission capable of finding evidence of life itself. Read the full story.

—Robin George Andrews

GMOs could reboot chestnut trees

Living as long as a thousand years, the American chestnut tree once dominated parts of the Eastern forest canopy, with many Native American nations relying on them for food. But by 1950, the tree had largely succumbed to a fungal blight probably introduced by Japanese chestnuts.

As recently as last year, it seemed the 35-year effort to revive the American chestnut might grind to a halt. Now, American Castanea, a new biotech startup, has created more than 2,500 transgenic chestnut seedlings—likely the first genetically modified trees to be considered for federal regulatory approval as a tool for ecological restoration. Read the full story.

—Anya Kamenetz

This piece is from the latest print issue of MIT Technology Review, which is all about the weird and wonderful world of food. If you don’t already, subscribe to receive future copies once they land.

MIT Technology Review Narrated: Why Congo’s most famous national park is betting big on crypto

In an attempt to protect its forests and famous wildlife, Virunga has become the first national park to run a Bitcoin mine. But some are wondering what crypto has to do with conservation.

This is our latest story to be turned into a MIT Technology Review Narrated podcast. In partnership with News Over Audio, we’ll be making a selection of our stories available, each one read by a professional voice actor. You’ll be able to listen to them on the go or download them to listen to offline.

We’re publishing a new story each week on Spotify and Apple Podcasts, including some taken from our most recent print magazine. Just navigate to MIT Technology Review Narrated on either platform, and follow us to get all our new content as it’s released.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Donald Trump has won the US Presidential election 
He’s the first president with a criminal conviction and two impeachments under his belt. (WP $)
+ The crypto industry is rejoicing at the news as bitcoin leapt to a record high. (NYT $)
+ In fact, a blockchain entrepreneur won the Ohio Senate race. (CNBC)
+ What comes next is anyone’s guess. (The Atlantic $)

2 Trump’s victory is music to Elon Musk’s ears
He’s been promised a new role as head of a new Department of Government Efficiency. (FT $)
+ Musk is being sued over his $1 million giveaways during the election campaign. (Reuters)
+ The billionaire used X as his own personal megaphone to stir up dissent. (The Atlantic $)

3 Abortion rights are now under further threat 
Particularly pills sent by mail. (Vox)
+ Trump’s approach to discussing abortion has been decidedly mixed. (Bloomberg $)

4 Trump could be TikTok’s last hope for survival in the US
Now he’s stopped threatening to ban it, that is. (The Information $)

5 Perplexity is approaching a $9 billion valuation
Thanks to the company’s fourth round of funding this year. (WSJ $)
+ Microsoft has reportedly expressed interest in acquiring the AI search startup. (The Information $)

6 The iPhone could be Apple’s last major cash cow
The company has acknowledged that its other devices may never reach the same heady heights. (FT $)
+ Nvidia has overtaken Apple as the world’s largest company. (Bloomberg $)

7 The Mozilla Foundation is getting rid of its advocacy division
The team prioritized fighting for a free and open web. (TechCrunch)

8 China plans to slam a spacecraft into an asteroid
Following in the footsteps of America’s successful 2022 mission. (Economist $)
+ Watch the moment NASA’s DART spacecraft crashed into an asteroid. (MIT Technology Review)

9 The Vatican’s anime mascot has been co-opted into AI porn
That didn’t take long. (404 Media)

10 Gigantic XXL TVs are the gift of the season
It’s cheaper than ever to fit your home out with a jumbotron screen. (CNN)

Quote of the day

“This is what happens when you mess with the crypto army.”

—Crypto twin Cameron Winklevoss celebrates the victory of blockchain entrepreneur Bernie Moreno, new Senator-elect for Ohio, in a post on X.

The big story

How covid conspiracies led to an alarming resurgence in AIDS denialism

August 2024

Several million people were listening in February when Joe Rogan falsely declared that “party drugs” were an “important factor in AIDS.” His guest on The Joe Rogan Experience, the former evolutionary biology professor turned contrarian podcaster Bret Weinstein, agreed with him.

Speaking to the biggest podcast audience in the world, the two men were promoting dangerous and false ideas—ideas that were in fact debunked and thoroughly disproved decades ago.

These comments and others like them add up to a small but unmistakable resurgence in AIDS denialism—a false collection of theories arguing either that HIV doesn’t cause AIDS or that there’s no such thing as HIV at all.

These claims had largely fallen out of favor until the coronavirus arrived. But, following the pandemic, a renewed suspicion of public health figures and agencies is giving new life to ideas that had long ago been pushed to the margins. Read the full story.

—Anna Merlan

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)

+ Full Moon Matinee is an amazing crime drama resource on YouTube: complete with some excellent acting courtesy of its host.
+ This is your sign to pick a name and cheer on random strangers during a marathon. I guarantee you’ll make their day!
+ There’s no wrong way to bake a sweet potato, but some ways are better than others.
+ Are you a screen creeper? I know I am.

The Download: inside animals’ minds, and how to make AI agents useful

5 November 2024 at 14:10

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

What do jumping spiders find sexy? How DIY tech is offering insights into the animal mind.

Studying the minds of other animals comes with a challenge that human psychologists don’t usually face: Your subjects can’t tell you what they’re thinking. 

To get answers from animals, scientists need to come up with creative experiments to learn why they behave the way they do. Sometimes this requires designing and building experimental equipment from scratch. 

These contraptions can range from ingeniously simple to incredibly complex, but all of them are tailored to help answer questions about the lives and minds of specific species. Do honeybees need a good night’s sleep? What do jumping spiders find sexy? Do falcons like puzzles? For queries like these, off-the-shelf gear simply won’t do. Check out these contraptions custom-built by scientists to help them understand the lives and minds of the animals they study.

—Betsy Mason

This piece is from the latest print issue of MIT Technology Review, which is all about the weird and wonderful world of food. If you don’t already, subscribe to receive future copies once they land.

How ChatGPT search paves the way for AI agents

It’s been a busy few weeks for OpenAI. Alongside updates to its new Realtime API platform, which will allow developers to build apps and voice assistants more quickly, it recently launched ChatGPT search, which allows users to search the internet using the chatbot.

Both developments pave the way for the next big thing in AI: agents. These AI assistants can complete complex chains of tasks, such as booking flights. OpenAI’s strategy is to both build agents itself and allow developers to use its software to build their own agents, and voice will play an important role in what agents will look and feel like.

Melissa Heikkilä, our senior AI reporter, sat down with Olivier Godement, OpenAI’s head of product for its platform, and Romain Huet, head of developer experience, last week to hear more about the two big hurdles that need to be overcome before agents can become a reality. Read the full story.

This story is from The Algorithm, our weekly newsletter giving you the inside track on all things AI. Sign up to receive it in your inbox every Monday.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 America is heading to the polls
Here’s how Harris and Trump will attempt to lead the US to tech supremacy. (The Information $)
+ The ‘Stop the Steal’ election denial movement is preparing to contest the vote. (WP $)
+ The muddy final polls suggest it’s still all to play for. (Vox)

2 Abortion rights are on the 2024 ballot
A lack of access to basic health care has led to the deaths of at least four women. (NY Mag $)
+ Nine states will decide whether to guarantee their residents abortion access. (Fortune)
+ If Trump wins he could ban abortion nationwide, even without Congress. (Politico)

3 Inside New York’s election day wargames
Tech, business and policy leaders gathered to thrash out potential risks. (WSJ $)
+ Violence runs throughout all aspects of this election cycle. (FT $)

4 Elon Musk’s false and misleading X election posts have billions of views
In fact, they’ve been viewed twice as much as all X’s political ads this year. (CNN)
+ Musk’s decision to hitch himself to Trump may end up backfiring, though. (FT $)

5 Meta will permit the US military to use its AI models
It’s an interesting update to its previous policy, which explicitly banned its use for military purposes. (NYT $)
+ Facebook has kept a low profile during the election cycle. (The Atlantic $)
+ Inside the messy ethics of making war with machines. (MIT Technology Review)

6 The hidden danger of pirated software
It’s not just viruses you should be worried about. (404 Media)

7 Apple is weighing up expanding into smart glasses
Where Meta leads, Apple may follow. (Bloomberg $)
+ The coolest thing about smart glasses is not the AR. It’s the AI. (MIT Technology Review)

8 India’s lithium plans may have been a bit too ambitious
Reports of a major lithium reserve appear to have been massively overblown. (Rest of World)
+ Some countries are ending support for EVs. Is it too soon? (MIT Technology Review)

9 Your air fryer could be surveilling you
Household appliances are now mostly smart, and stuffed with trackers. (The Guardian)

10 How to stay sane during election week
Focus on what you can control, and try to let go of what you can’t. (WP $)
+ Here’s how election gurus are planning to cope in the days ahead. (The Atlantic $)
+ How to log off. (MIT Technology Review)

Quote of the day

“We’re in kind of the ‘throw spaghetti at the wall’ moment of politics and AI, where this intersection allows people to try new things for propaganda.”

—Rachel Tobac, chief executive of ethical hacking company SocialProof Security, tells the Washington Post why a deepfake video of Martin Luther King endorsing Donald Trump is being shared online in the closing hours of the presidential race.

The big story

The hunter-gatherer groups at the heart of a microbiome gold rush

December 2023

Over the last couple of decades, scientists have come to realize just how important the microbes that crawl all over us are to our health. But some believe our microbiomes are in crisis—casualties of an increasingly sanitized way of life. Disturbances in the collections of microbes we host have been associated with a whole host of diseases, ranging from arthritis to Alzheimer’s.

Some might not be completely gone, though. Scientists believe many might still be hiding inside the intestines of people who don’t live in the polluted, processed environment that most of the rest of us share.

They’ve been studying the feces of people like the Yanomami, an Indigenous group in the Amazon, who appear to still have some of the microbes that other people have lost. But they’re having to navigate an ethical minefield in order to do so. Read the full story.

—Jessica Hamzelou

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)

+ Move over Moo Deng—Haggis the baby pygmy hippo is the latest internet star!
+ To celebrate the life of the late, great Quincy Jones, check out this sensational interview in which he spills the beans on everything from the Beatles’ musical shortcomings to who shot Kennedy. Thank you for the music, Quincy.
+ The color of the season? Sage green, apparently.
+ Dinosaurs are everywhere, you just need to look for them.

OpenAI brings a new web search tool to ChatGPT

ChatGPT can now search the web for up-to-date answers to a user’s queries, OpenAI announced today. 

Until now, ChatGPT was mostly restricted to generating answers from its training data, which is current up to October 2023 for GPT-4o, and had limited web search capabilities. Searches about generalized topics will still draw on this information from the model itself, but now ChatGPT will automatically search the web in response to queries about recent information such as sports, stocks, or news of the day, and can deliver rich multi-media results. Users can also manually trigger a web search, but for the most part, the chatbot will make its own decision about when an answer would benefit from information taken from the web, says Adam Fry, OpenAI’s product lead for search.
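
OpenAI has not published how ChatGPT decides when to search, but the behavior Fry describes, answering general-knowledge questions from the model and searching for recency-sensitive ones, can be illustrated with a toy router. This is a hypothetical sketch, not OpenAI's implementation; the cue list and function name are invented for illustration.

```python
# Toy sketch of a "should this query trigger a web search?" router.
# A real system would use a learned classifier; this keyword heuristic
# only illustrates the idea that recency-sensitive queries go to the
# web while stable general knowledge is answered from the model itself.
RECENCY_CUES = {
    "today", "latest", "current", "this week",
    "stock", "score", "news", "weather", "near me",
}

def should_search(query: str) -> bool:
    """Return True if the query looks like it needs fresh information."""
    q = query.lower()
    return any(cue in q for cue in RECENCY_CUES)

print(should_search("What is the capital of France?"))        # False
print(should_search("What's the latest Apple stock price?"))  # True
```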

“Our goal is to make ChatGPT the smartest assistant, and now we’re really enhancing its capabilities in terms of what it has access to from the web,” Fry tells MIT Technology Review. The feature is available today for the chatbot’s paying users. 

ChatGPT triggers a web search when the user asks about local restaurants in this example

While ChatGPT search, as it is known, is initially available to paying customers, OpenAI intends to make it available for free later, even when people are logged out. The company also plans to combine search with its voice features and Canvas, its interactive platform for coding and writing, although these capabilities will not be available in today’s initial launch.

The company unveiled a standalone prototype of web search in July. Those capabilities are now built directly into the chatbot. OpenAI says it has “brought the best of the SearchGPT experience into ChatGPT.” 

OpenAI is the latest tech company to debut an AI-powered search assistant, challenging similar tools from competitors such as Google, Microsoft, and startup Perplexity. Meta, too, is reportedly developing its own AI search engine. As with Perplexity’s interface, users of ChatGPT search can interact with the chatbot in natural language, and it will offer an AI-generated answer with sources and links to further reading. In contrast, Google’s AI Overviews offer a short AI-generated summary at the top of the website, as well as a traditional list of indexed links. 

These new tools could eventually challenge Google’s 90% market share in online search. AI search is a very important way to draw more users, says Chirag Shah, a professor at the University of Washington, who specializes in online search. But he says it is unlikely to chip away at Google’s search dominance. Microsoft’s high-profile attempt with Bing barely made a dent in the market, Shah says. 

Instead, OpenAI is trying to create a new market for more powerful and interactive AI agents, which can take complex actions in the real world, Shah says. 

The new search function in ChatGPT is a step toward these agents. 

It can also deliver highly contextualized responses that take advantage of chat histories, allowing users to go deeper in a search. Currently, ChatGPT search is able to recall conversation histories and continue the conversation with questions on the same topic. 

ChatGPT itself can also remember things about users that it can use later—sometimes it does this automatically, or you can ask it to remember something. Those “long-term” memories affect how it responds to chats. Search doesn’t have this yet—a new web search starts from scratch—but it should get this capability in the “next couple of quarters,” says Fry. When it does, OpenAI says the tool will deliver far more personalized results based on what it knows.

“Those might be persistent memories, like ‘I’m a vegetarian,’ or it might be contextual, like ‘I’m going to New York in the next few days,’” says Fry. “If you say ‘I’m going to New York in four days,’ it can remember that fact and the nuance of that point,” he adds. 

To help develop ChatGPT’s web search, OpenAI says it leveraged its partnerships with news organizations such as Reuters, the Atlantic, Le Monde, the Financial Times, Axel Springer, Condé Nast, and Time. However, its results include information not only from these publishers, but any other source online that does not actively block its search crawler.   

It’s a positive development that ChatGPT will now be able to retrieve information from these reputable online sources and generate answers based on them, says Suzan Verberne, a professor of natural-language processing at Leiden University, who has studied information retrieval. It also allows users to ask follow-up questions.

But despite the enhanced ability to search the web and cross-check sources, the tool is not immune from the persistent tendency of AI language models to make things up or get it wrong. When MIT Technology Review tested the new search function and asked it for vacation destination ideas, ChatGPT suggested “luxury European destinations” such as Japan, Dubai, the Caribbean islands, Bali, the Seychelles, and Thailand. It offered as a source an article from the Times, a British newspaper, which listed these locations as well as those in Europe as luxury holiday options.

“Especially when you ask about untrue facts or events that never happened, the engine might still try to formulate a plausible response that is not necessarily correct,” says Verberne. There is also a risk that misinformation might seep into ChatGPT’s answers from the internet if the company has not filtered its sources well enough, she adds. 

Another risk is that the current push to access the web through AI search will disrupt the internet’s digital economy, argues Benjamin Brooks, a fellow at Harvard University’s Berkman Klein Center, who previously led public policy for Stability AI, in an op-ed published by MIT Technology Review today.

“By shielding the web behind an all-knowing chatbot, AI search could deprive creators of the visits and ‘eyeballs’ they need to survive,” Brooks writes.

Chasing AI’s value in life sciences

Inspired by an unprecedented opportunity, the life sciences sector has gone all in on AI. For example, in 2023, Pfizer introduced an internal generative AI platform expected to deliver $750 million to $1 billion in value. And Moderna partnered with OpenAI in April 2024, scaling its AI efforts to deploy ChatGPT Enterprise, embedding the tool’s capabilities across business functions from legal to research.

In drug development, German pharmaceutical company Merck KGaA has partnered with several AI companies for drug discovery and development. And Exscientia, a pioneer in using AI in drug discovery, is taking more steps toward integrating generative AI drug design with robotic lab automation in collaboration with Amazon Web Services (AWS).

Given rising competition, higher customer expectations, and growing regulatory challenges, these investments are crucial. But to maximize their value, leaders must carefully consider how to balance the key factors of scope, scale, speed, and human-AI collaboration.

The early promise of connecting data

The common refrain from data leaders across all industries—but specifically from those within data-rich life sciences organizations—is “I have vast amounts of data all over my organization, but the people who need it can’t find it,” says Dan Sheeran, general manager of health care and life sciences for AWS. And in a complex healthcare ecosystem, data can come from multiple sources, including hospitals, pharmacies, insurers, and patients.

“Addressing this challenge,” says Sheeran, “means applying metadata to all existing data and then creating tools to find it, mimicking the ease of a search engine. Until generative AI came along, though, creating that metadata was extremely time consuming.”

ZS’s global head of the digital and technology practice, Mahmood Majeed, notes that his teams regularly work on connected data programs, because “connecting data to enable connected decisions across the enterprise gives you the ability to create differentiated experiences.”

Majeed points to Sanofi’s well-publicized example of connecting data with its analytics app, plai, which streamlines research and automates time-consuming data tasks. With this investment, Sanofi reports reducing research processes from weeks to hours and the potential to improve target identification in therapeutic areas like immunology, oncology, or neurology by 20% to 30%.

Achieving the payoff of personalization

Connected data also allows companies to focus on personalized last-mile experiences. This involves tailoring interactions with healthcare providers and understanding patients’ individual motivations, needs, and behaviors.

Early efforts around personalization have relied on “next best action” or “next best engagement” models to do this. These traditional machine learning (ML) models suggest the most appropriate information for field teams to share with healthcare providers, based on predetermined guidelines.

When compared with generative AI models, more traditional machine learning models can be inflexible, unable to adapt to individual provider needs, and they often struggle to connect with other data sources that could provide meaningful context. Therefore, the insights can be helpful but limited.  

Sheeran notes that companies have a real opportunity to use connected data for better decision-making: “Because the technology is generative, it can create context based on signals. How does this healthcare provider like to receive information? What insights can we draw about the questions they’re asking? Can their professional history or past prescribing behavior help us provide a more contextualized answer? This is exactly what generative AI is great for.”

Beyond this, pharmaceutical companies spend millions of dollars annually to customize marketing materials. They must ensure the content is translated, tailored to the audience, and consistent with regulations for each location where they offer products and services. A process that usually takes weeks to develop individual assets has become a perfect use case for generative copy and imagery. With generative AI, the process is reduced from weeks to minutes and creates competitive advantage with lower costs per asset, Sheeran says.

Accelerating drug discovery with AI, one step at a time

Perhaps the greatest hope for AI in life sciences is its ability to generate insights and intellectual property using biology-specific foundation models. Sheeran says, “Our customers have seen the potential for very, very large models to greatly accelerate certain discrete steps in the drug discovery and development processes.” He continues, “Now we have a much broader range of models available, and an even larger set of models coming that tackle other discrete steps.”

By Sheeran’s count, there are approximately six major categories of biology-specific models, each containing five to 25 models under development or already available from universities and commercial organizations.

The intellectual property generated by biology-specific models is a significant consideration, supported by services such as Amazon Bedrock, which ensures customers retain control over their data, with transparency and safeguards to prevent unauthorized retention and misuse.

Finding differentiation in life sciences with scope, scale, and speed

Organizations can differentiate with scope, scale, and speed, while determining how AI can best augment human ingenuity and judgment. “Technology has become so easy to access. It’s omnipresent. What that means is that it’s no longer a differentiator on its own,” says Majeed. He suggests that life sciences leaders consider:

Scope: Have we zeroed in on the right problem? By clearly articulating the problem relative to the few critical things that could drive advantage, organizations can identify technology and business collaborators and set standards for measuring success and driving tangible results.

Scale: What happens when we implement a technology solution on a large scale? The highest-priority AI solutions should be the ones with the most potential for results. Scale determines whether an AI initiative will have a broader, more widespread impact on a business, which provides the window for a greater return on investment, says Majeed.

By thinking through the implications of scale from the beginning, organizations can be clear on the magnitude of change they expect and how bold they need to be to achieve it. The boldest commitment to scale is when companies go all in on AI, as Sanofi is doing, setting goals to transform the entire value chain and setting the tone from the very top.

Speed: Are we set up to quickly learn and correct course? Organizations that can rapidly learn from their data and AI experiments, adjust based on those learnings, and continuously iterate are the ones that will see the most success. Majeed emphasizes, “Don’t underestimate this component; it’s where most of the work happens. A good partner will set you up for quick wins, keeping your teams learning and maintaining momentum.”

Sheeran adds, “ZS has become a trusted partner for AWS because our customers trust that they have the right domain expertise. A company like ZS has the ability to focus on the right uses of AI because they’re in the field and on the ground with medical professionals, which gives them the ability to constantly stay ahead of the curve by exploring the best ways to improve their current workflows.”

Human-AI collaboration at the heart

Despite the allure of generative AI, the human element is the ultimate determinant of how it’s used. In certain cases, traditional technologies outperform it, with less risk, so understanding what it’s good for is key. By cultivating broad technology and AI fluency throughout the organization, leaders can teach their people to find the most powerful combinations of human-AI collaboration for technology solutions that work. After all, as Majeed says, “it’s all about people—whether it’s customers, patients, or our own employees’ and users’ experiences.”

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

The Download: US house-building barriers, and a fusion energy facility tour

31 October 2024 at 13:10

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

Housing is an election issue. But the US sucks at it.

Ahead of abortion access, ahead of immigration, and way ahead of climate change, US voters under 30 are most concerned about one issue: housing affordability. And it’s not just young voters who say soaring rents and eye-watering home sale prices are among their top worries. For the first time in recent memory, the cost of housing could be a major factor in the presidential election.  

It’s not hard to see why. From the beginning of the pandemic to early 2024, US home prices rose by 47%. In large swaths of the country, buying a home is no longer a possibility even for those with middle-class incomes. 

Permitting delays and strict zoning rules create huge obstacles to building more and faster—as do other widely recognized issues, like the political power of NIMBY activists across the country and an ongoing shortage of skilled workers. But there is also another, less talked-about problem: We’re not very efficient at building, and we seem somehow to be getting worse. Read the full story.

—David Rotman

Inside a fusion energy facility

—Casey Crownhart

On an overcast day in early October, I picked up a rental car and drove to Devens, Massachusetts, to visit a hole in the ground.

Commonwealth Fusion Systems has raised over $2 billion in funding since it spun out of MIT in 2018, all in service of building the first commercial fusion reactor. The plan is to have it operating by 2026.

I visited the company’s site recently to check in on progress. Things are starting to come together and, looking around the site, I found it becoming easier to imagine a future that could actually include fusion energy. But there’s still a lot of work left to do. Read the full story.

This story is from The Spark, our weekly climate and energy newsletter. Sign up to receive it in your inbox every Wednesday.

MIT Technology Review Narrated: How gamification took over the world

Instead of liberating us from drudgery and maximizing our potential, gamification has turned out to be just another tool for coercion, distraction, and control. Why did we fall for it?

This is our latest story to be turned into a MIT Technology Review Narrated podcast. In partnership with News Over Audio, we’ll be making a selection of our stories available, each one read by a professional voice actor. You’ll be able to listen to them on the go or download them to listen to offline.

We’re publishing a new story each week on Spotify and Apple Podcasts, including some taken from our most recent print magazine.

Just navigate to MIT Technology Review Narrated on either platform, and follow us to get all our new content as it’s released.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Bird flu has been found in a pig in the US for the first time 
The USDA says it’s not cause for panic. But it’s certainly cause for concern. (Reuters)
+ Why virologists are getting increasingly nervous about bird flu. (MIT Technology Review)
 
2 Elon Musk has turned X into a political weapon 
This is what $44 billion bought him: the ability to flood the zone with falsehoods during an election. (The Atlantic $)
+ X’s crowdsourced fact-checking program is falling woefully short. (WP $)
+ And it’s not just X. YouTube is full of election conspiracy content too. (NYT $)
+ Spare a thought for the election officials who have to navigate this mess. (NPR)
 
3 Europe’s big tech hawks are nervously eyeing the US election
Biden was an ally in their efforts to crack down. Either of his potential successors look like a less sure bet. (Wired $)
+ Attendees regularly fail to disclose their links to big tech at EU events. (The Guardian)
 
4 The AI boom is being powered by concrete
It’s a major ingredient for data centers and the power plants being built to serve them—and a climate disaster. (IEEE Spectrum)
+ How electricity could help tackle a surprising climate villain. (MIT Technology Review)
 
5 What makes human brains so special? 🧠
Much of the answer is still a mystery—but researchers are uncovering more and more promising leads. (Nature)
+ Tech that measures our brainwaves is 100 years old. How will we be using it 100 years from now? (MIT Technology Review)
 
6 Boston Dynamics’ humanoid robot is getting much more capable
If its latest video, in which it autonomously picks up and moves car parts, is anything to go by. (TechCrunch)
+ A skeptic’s guide to humanoid-robot videos. (MIT Technology Review)
 
7 Alexa desperately needs a revamp
The voice assistant was launched 10 years ago, and it’s been disappointing us ever since. (The Verge)
 
8 We’re sick of algorithms recommending us stuff
Lots of people are keen to turn back to guidance from other humans. (New Yorker $)
+ If you’re one of them, I have bad news: AI is going to make the problem much worse. (Fortune $)
 
9 Russia fined Google $20,000,000,000,000,000,000,000,000,000,000,000
That’s more money than exists on Earth but sure, don’t let that stop you. (The Register)
 
10 What is going on with Mark Zuckerberg recently?
He’s using clothes to rebrand himself and… it’s kinda working?! (Slate)

Quote of the day

“It’s what happens when you let a bunch of grifters take over.”

—A Trumpworld source explains to Wired why Donald Trump’s ground campaign in Michigan is so chaotic.

The big story

A day in the life of a Chinese robotaxi driver

July 2022

When Liu Yang started his current job, he found it hard to go back to driving his own car: “I instinctively went for the passenger seat. Or when I was driving, I would expect the car to brake by itself,” says the 33-year-old Beijing native, who joined the Chinese tech giant Baidu in January 2021 as a robotaxi driver.

Liu is one of the hundreds of safety operators employed by Baidu, “driving” five days a week in Shougang Park. But despite having only worked for the company for 19 months, he already has to think about his next career move, as his job will likely be eliminated within a few years. Read the full story.

—Zeyi Yang

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)

+ Happy Halloween! Check out some of the best spine-chilling classic novels.
+ If scary movies are more your jam, I’ve still got you covered.
+ These photo montages of music fans outside concerts are incredible. 
+ Love that this guy went from being terrified of rollercoasters to designing them.
+ You’ll probably never sort your life out. And that’s OK.

Inside a fusion energy facility

31 October 2024 at 11:00

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

On an overcast day in early October, I picked up a rental car and drove to Devens, Massachusetts, to visit a hole in the ground.

Commonwealth Fusion Systems has raised over $2 billion in funding since it spun out of MIT in 2018, all in service of building the first commercial fusion reactor. The company has ambitions to build power plants, but currently the goal is to finish putting together its first demonstration system, the SPARC reactor. The plan is to have it operating by 2026.

I visited the company’s site recently to check in on progress. Things are starting to come together around the hole in the floor where SPARC will eventually be installed. Looking around the site, I found it becoming easier to imagine a future that could actually include fusion energy. But there’s still a lot of work left to do. 

Fusion power has been a dream for decades. The idea is simple: Slam atoms together and use the energy that’s released to power the world. The systems would require small amounts of abundant fuel and wouldn’t produce dangerous waste. The problem is, executing this vision has been much slower than many had hoped.

Commonwealth is one of the leaders in commercial fusion. My colleague James Temple wrote a feature story, published in early 2022, about the company’s attempts to bring the technology to reality. At the time, the Devens location was still a muddy construction site, with the steel and concrete just starting to go into the ground.

Things are much more polished now—when I visited earlier this month, I pulled into one of the designated visitor parking spots and checked in at a reception desk in a bustling office building before beginning my tour. There were two main things to see: the working magnet factory and the cluster of buildings that will house and support the SPARC reactor.

We started in the magnet factory. SPARC is a tokamak, a device relying on powerful magnets to contain the plasma where fusion reactions take place. There will be three different types of magnets in SPARC, all arranged to keep the plasma in position and moving around in the right way.

The company is making its own magnets powered with tape made from a high-temperature superconductor, which generates a magnetic field when an electric current runs through it. SPARC will contain thousands of miles’ worth of this tape in its magnets. In the factory, specialized equipment winds up the tape and tucks it into metal cases, which are then stacked together and welded into protective shells.  
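
The physics behind that claim can be sketched with the textbook solenoid approximation (an idealization, not CFS’s actual coil geometry): the field inside a long current-carrying coil is

\[
B = \mu_0 \, n \, I
\]

where $n$ is the number of turns per unit length and $I$ is the current. Because superconducting tape carries very large currents without resistive losses, it can produce much stronger fields from compact magnets than conventional copper coils can.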

After our quick loop around the magnet factory, I donned a helmet, neon vest, and safety glasses and got a short safety talk that included a stern warning to not stare directly at any welding. Then we walked across a patio and down a gravel driveway to the main complex of buildings that will house the SPARC reactor.

Except for some remaining plywood stairs and dust, the complex appeared to be nearly completed. There’s a huge wall of glass on the front of the building—a feature intended to show that the company is open with the community about the goings-on inside, as my tour guide, chief marketing officer Joe Paluska, put it.  

Four main buildings surround the central tokamak hall. These house support equipment needed to cool down the magnets, heat up the plasma, and measure conditions in the reactor. Most of these big, industrial systems that support SPARC are close to being ready to turn on or are actively being installed, explained Alex Creely, director of tokamak operations, in a call after my tour.

When it was finally time to see the tokamak hall that will house SPARC, we had to take a winding route to get there. A maze of concrete walls funneled us to the entrance, and I lost track of my left and right turns. Called the labyrinth, this is a safety feature, designed to keep stray neutrons from escaping the hall once the reactor is operating. (Neutrons are a form of radiation, and enough exposure can be dangerous to humans.) 

Finally, we stepped into a cavernous space. From our elevated vantage point on a metal walkway, we peered down into a room with gleaming white floors and equipment scattered around the perimeter. At the center was a hole, covered with a tarp and surrounded by bright-yellow railings. That empty slot is where the star of the show, SPARC, will eventually be installed.

tokamak hall at Commonwealth Fusion Systems
The tokamak hall at Commonwealth Fusion Systems will house the company’s SPARC reactor.
COMMONWEALTH FUSION SYSTEMS

While there’s still very little tokamak in the tokamak hall right now, Commonwealth has an ambitious timeline planned: The goal is to have SPARC running and the first plasma in the reactor by 2026. The company plans to demonstrate that it can produce more energy in the reactor than is needed to power it (a milestone known as Q>1 in the fusion world) by 2027.
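
The Q>1 milestone has a compact, standard definition in the fusion community:

\[
Q \equiv \frac{P_{\text{fusion}}}{P_{\text{heating}}} > 1
\]

where $P_{\text{fusion}}$ is the fusion power released by the plasma and $P_{\text{heating}}$ is the external power injected to heat it. Note that this is the scientific gain of the plasma itself; a commercial plant must clear a much higher bar once the electricity needed to run magnets, cooling, and other support systems is counted.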

When we published our 2022 story on Commonwealth, the plan was to flip on the reactor and reach the Q>1 milestone by 2025, so the timeline has slipped. It’s not uncommon for big projects in virtually every industry to take longer than expected. But there’s an especially long and fraught history of promises and missed milestones in fusion. 

Commonwealth has certainly made progress over the past few years, and it’s getting easier to imagine the company actually turning on a reactor and meeting the milestones the field has been working toward for decades. But there’s still a tokamak-shaped hole in suburban Massachusetts waiting to be filled. 


Now read the rest of The Spark

Related reading

Read our 2022 feature on Commonwealth Fusion Systems and its path to commercializing fusion energy here

In late 2022, a reactor at a national lab in the US generated more energy than was put in, a first for the industry. Here’s what meeting that milestone actually means for clean energy

There’s still a lot of research to be done in fusion—here’s what’s coming next

Another company called Helion says its first fusion power plant is five years away. Experts are skeptical, to say the least.

AI e-waste
PHOTO ILLUSTRATION BY SARAH ROGERS/MITTR | PHOTOS GETTY

Another thing

Generative AI will add to our growing e-waste problem. A new study estimates that AI could add up to 5 million tons of e-waste by 2030. 

It’s a small fraction of the total, but there’s still good reason to think carefully about how we handle discarded servers and high-performance computing equipment, according to experts. Read more in my latest story.

Keeping up with climate  

New York City will buy 10,000 induction stoves from a startup called Copper. The stoves will be installed in public housing in the city. (Heatmap)

Demand is growing for electric cabs in India, but experts say there’s not nearly enough supply to meet it. (Rest of World)

Pivot Bio aims to tweak the DNA of bacteria so they can help deliver nutrients to plants. The company is trying to break into an industry dominated by massive agriculture and chemical companies. (New York Times)

→ Check out our profile of Pivot Bio, which was one of our 15 Climate Tech Companies to Watch this year. (MIT Technology Review)

At least 62 people are dead and many more are missing in dangerous flooding across Spain. (Washington Post)

A massive offshore wind lease sale this week offered up eight patches of ocean off the coast of Maine in the US. Four sold, opening the door for up to 6.8 gigawatts of additional offshore wind power. (Canary Media)

Climate change contributed to the deaths of 38,000 people across Europe in the summer of 2022, according to a new study. (The Guardian)

→ The legacy of Europe’s heat waves will be more air-conditioning, and that could be its own problem. (MIT Technology Review)

There are nearly 9,000 public fast-charging sites in the US, and a surprising wave of installations in the Midwest and Southeast. (Bloomberg)

Some proposed legislation aims to ban factory farming, but determining what that category includes is way more complicated than you might think. (Ambrook Research)

The surprising barrier that keeps us from building the housing we need

31 October 2024 at 10:00

Ahead of abortion access, ahead of immigration, and way ahead of climate change, US voters under 30 are most concerned about one issue: housing affordability. And it’s not just young voters who are identifying soaring rents and eye-watering home sale prices as among their top worries. For the first time in recent memory, the cost of housing could be a major factor in the presidential election.  

It’s not hard to see why. From the beginning of the pandemic to early 2024, US home prices rose by 47%. In large swaths of the country, buying a home is no longer a possibility even for those with middle-class incomes. For many, that marks the end of an American dream built around owning a house. Over the same time, rents have gone up 26%.

Vice President Kamala Harris has offered an ambitious plan to build more: “Right now, a serious housing shortage is part of what is driving up cost,” she said last month in Las Vegas. “So we will cut the red tape and work with the private sector to build 3 million new homes.” Included in her proposals is a $40 billion innovation fund to support housing construction.

Former president Donald Trump, meanwhile, has also called for cutting regulations but mostly emphasizes a far different way to tackle the housing crunch: mass deportation of the immigrants he says are flooding the country, and whose need for housing he claims is responsible for the huge jump in prices. (While a few studies show some local impact on the cost of housing from immigration in general, the effect is relatively small, and there is no plausible economic scenario in which the number of immigrants over the last few years accounts for the magnitude of the increase in home prices and rents across much of the country.)

The opposing views offered by Trump and Harris have implications not only for how we try to lower home prices but for how we view the importance of building. This attention to the housing crisis also reveals a broader issue with the construction industry at large: the sector has been tech-averse for decades, and it has become less productive over the past 50 years.

The reason for the current rise in the cost of housing is clear to most economists: a lack of supply. Simply put, we don’t build enough houses and apartments, and we haven’t for years. Depending on how you count it, the US has a shortage of around 1.2 million to more than 5.5 million single-family houses.

Permitting delays and strict zoning rules create huge obstacles to building more and faster—as do other widely recognized issues, like the political power of NIMBY activists across the country and an ongoing shortage of skilled workers. But there is also another, less talked-about problem that’s plaguing the industry: We’re not very efficient at building, and we seem somehow to be getting worse.

Together these forces have made it more expensive to build houses, leading to increases in prices. Albert Saiz, a professor of urban economics and real estate at MIT, calculates that construction costs account for more than two-thirds of the price of a new house in much of the country, including the Southwest and West, where much of the building is happening. Even in places like California and New England, where land is extremely expensive, construction accounts for 40% to 60% of the value of a new home, according to Saiz.

Part of the problem, Saiz says, is that “if you go to any construction site, you’ll see the same methods used 30 years ago.”

The productivity woes are evident across the construction industry, not just in the housing sector. From clean-energy advocates dreaming of renewables and an expanded power grid to tech companies racing to add data centers, everyone seems to agree: We need to build more and do it quickly. The practical reality, though, is that it costs more, and takes more time, to construct anything.

For decades, companies across the industry have largely ignored ways they could improve the efficiency of their operations. They have shunned data science and the kinds of automation that have transformed other sectors of the economy. According to an estimate by the McKinsey Global Institute, construction, one of the largest parts of the global economy, is the least digitized major sector worldwide—and it isn’t even close.

The reality is that even if we ease the endless permitting delays and begin cutting red tape, we will still be faced with a distressing fact: The construction industry is not very efficient when it comes to building stuff.

The awful truth

Productivity is our best measure of long-term progress in an industry, at least according to economists. Technically, it’s a measure of how much a worker can produce; as companies adopt more efficient practices and new technologies, productivity grows and businesses can make stuff (in this case, homes and buildings) faster and more cheaply. Yet something shocking has happened in the construction industry: Productivity seems to have stalled and even gone into reverse over the last few decades.

In a recent paper called “The Strange and Awful Path of Productivity in the US Construction Sector,” two leading economists at the University of Chicago showed that productivity growth in US construction came to a halt beginning around 1970. Productivity is notoriously difficult to quantify, but the Chicago researchers calculated it in one of the key parts of the construction business: housing. They found that the number of houses or total square footage (houses are getting bigger) built per employee each year was flat or even falling over the last 50 years. And the researchers believe the lack of productivity growth holds true for all different types of construction.

Chad Syverson, one of the authors, admits he is still trying to pinpoint the reason—“It’s probably a few things.” While he says it’s difficult to quantify the specific impact of various factors on productivity, including the effects of regulatory red tape and political fights that often delay construction, “part of the industry’s problem is its own operational inefficiency,” he says. “There’s no doubt about it.” In other words, the industry just isn’t very innovative.

The lack of productivity in construction over the last half-century, at a time when all other sectors grew dramatically, is “really amazing,” he says—and not in a good way.

US manufacturing, in contrast, continued growing at around 2% to 3% annually over the same period. Auto workers, as a result, now produce far more cars than they once did, leading to cheaper vehicles if you adjust for inflation (and, by most measures, safer and better ones).

Productivity in construction is not just a US problem, according to the McKinsey Global Institute, which has tracked the issue for nearly a decade. Not all countries are faring as badly as the US, but worldwide construction productivity has been flat over the last few decades, says Jan Mischke, who heads the McKinsey work.

Beyond adding to the costs and threatening the financial viability of many planned projects, Mischke says, the lack of productivity is “reflected in all the mess, time and cost overruns, concerns about quality, rework, and all the things that everyone who has ever built anything will have seen.” 

The nature of construction work can make it difficult to improve longstanding processes and introduce new technologies, he says: “Most other sectors become better over time by doing the same thing twice or three times or 3 million times. They learn and improve. All that is essentially missing in construction, where every single project starts from scratch and reinvents the wheel.”

Mischke also sees another reason for the industry’s lack of productivity: the “misaligned incentives” of the various players, who often make more money the longer a project takes.

Though the challenges are endemic to the business, Mischke adds that builders can take steps to overcome them by moving to digital technologies, implementing more standardized processes, and improving the efficiency of their business practices.

“Most other sectors become better over time by doing the same thing twice or three times or 3 million times. All that is essentially missing in construction.”

It’s an urgent problem to solve as many countries race to build housing, expand clean-energy capabilities, and update infrastructure like roads and airports. In their latest report, the McKinsey researchers warn of the dangers if productivity doesn’t improve: “The net-zero transition may be delayed, growth ambitions may be deferred, and countries may struggle to meet the infrastructure and housing needs for their populations.”

But the report also says there’s a flip side to the lack of progress in much of the industry: Individual companies that begin to improve their efficiency could gain a huge competitive advantage.

Building on the data

When Jit Kee Chin joined Suffolk Construction as its chief data officer in 2017, the title was unique in the industry. But Chin, armed with a PhD in experimental physics from MIT and a 10-year stint at McKinsey, brought to the large Boston-based firm the kind of technical and management expertise often missing from construction companies. And she recognized that large construction projects—including the high-rise apartment buildings and sprawling data centers that Suffolk often builds—generate vast amounts of useful data.

At the time, much of the data was siloed; information on the progress of a project was in one place, scheduling in another, and safety data and reports in yet another. “The systems didn’t talk to each other, and it was very difficult to cross-correlate,” says Chin. Getting all the data together so it could be understood and utilized across the business was an early task.

“Almost all construction companies are talking about how to better use their data now,” says Chin, who is currently Suffolk’s CTO, and since her hiring, “a couple others have even appointed chief data officers.” But despite such encouraging signs, she sees the effort to improve productivity in the industry as still very much a work in progress.  

One ongoing and obvious target: the numerous documents that are constantly being revised as they move along from architect to engineers to subcontractors. It’s the lifeblood of any construction project, and Chin says the process “is by no means seamless.” Architects and subcontractors sometimes use different software; meanwhile, the legally binding documents spelling out details of a project are still circulated as printouts. A more frictionless flow of information among the multitude of players is critical to better coordinate the complex building process.

Ultimately, though, building is a physical activity. And while automation has largely been absent from building trades, robots are finally cheap enough to be attractive to builders, especially companies facing a shortage of workers. “The cost of off-the-shelf robotic components has come down to a point where it is feasible to think of simple robots automating a very repetitive task,” says Chin. And advances in robotic image recognition, lidar, AI, and dexterity, she says, mean robots are starting to be able to safely navigate construction sites.

One step in construction where digital designs meet the physical world is the process of laying out blueprints for walls and other structures on the floor of a building. It’s an exacting, time-consuming manual practice, prone to errors.

The Dusty Robotics field printer marks the layout for walls and other structures.
DUSTY ROBOTICS

And startups like Dusty Robotics are betting it’s an almost perfect application for a Roomba-like robot. Tessa Lau, its CEO, recalls that when she researched the industry before founding the company in 2018, she was struck by seeing “people on their hands and knees snapping chalk lines.”

Based in Silicon Valley, the company builds a box-shaped machine that scoots about a site on sturdy wheels to mark the layout. Though the company often markets it as a field printer to allay any fears about automation, it’s an AI-powered robot that uses advanced sensors to plan and guide its travels.

Not only does the robot automate a critical job, but because that task is so central in the construction process, it also helps open a digital window into the overall workflow of a project.

A history lesson

Whatever the outcome of the upcoming election, don’t hold your breath waiting for home prices to fall; even if we do build more (or somehow decrease demand), it will probably take years for the supply to catch up. But the political spotlight on housing affordability could be a rare opportunity to focus on the broad problem of construction productivity.  

While some critics have argued that Harris’s plan is too vague and lacks the ambition required to solve the housing crisis, her message that we need to build more and faster is the right one. “It takes too long and it costs too much to build. Whether it’s a new housing development, a new factory, or a new bridge, projects take too long to go from concept to reality,” Harris said in a speech in late September. Then she asked: “You know how long it took to build [the Empire State Building]?”

Harris stresses cutting red tape to unleash a building boom. That’s critical, but it’s only part of the long-term answer. The construction of the famous New York City skyscraper took just over a year in 1931—a feat that provides valuable clues to how the industry itself can finally increase its productivity.

The explanation for why it was built so quickly has less to do with new technologies—in fact, the engineers mostly opted for processes and materials that were familiar and well-tested at the time—and more to do with how the project leaders managed every aspect of the design and construction process for speed and efficiency. The activity of the thousands of workers was carefully scheduled and tracked, and the workflow was highly choreographed to minimize delays. Even the look of the 1,250-foot building was largely a result of choosing the fastest and simplest way to build.

To a construction executive like Suffolk’s Chin, who estimates it would take at least four years to construct such a building today, the lessons of the Empire State Building resonate, especially the operational discipline and the urgency to finish the structure as quickly as possible. “It’s a stark difference when you think about how much time it took and how much time it would take to build that building now,” she says.

If we want an affordable future, the construction business needs to recapture that sense of urgency and efficiency. To do so, the industry will need to change the way it operates and alter its incentive structures; it will need to incorporate the right mix of automation and find financial models that will transform outdated business practices. The good news is that advances in data science, automation, and AI are offering companies new opportunities to do just that.

The hope, then, is that capitalism will do capitalism. Innovative firms will (hopefully) build more cheaply and faster, boost their profits, and become more competitive. Such companies will prosper, and others will begin to mimic the early adopters, investing in the new technologies and business models. In other words, the reality of seeing some builders profit by using data and automation will finally help drag the construction industry into the modern digital age.

The Download: coping in a time of arrhythmia, and DNA data storage

30 October 2024 at 13:10

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

The arrhythmia of our current age  

Arrhythmia means the heart beats, but not in proper time—a critical rhythm of life suddenly going rogue and unpredictable. It’s frightening to experience, but what if it’s also a good metaphor for our current times? That a pulse once seemingly so steady is now less sure. Perhaps this wobbliness might be extrapolated into a broader sense of life in the 2020s. 

Maybe you feel it, too—that the world seems to have skipped more than a beat or two as demagogues rant and democracy shudders, hurricanes rage, and glaciers dissolve. We can’t stop watching tiny screens where influencers pitch products we don’t need alongside news about senseless wars that destroy, murder, and maim tens of thousands. 

All the resulting anxiety has been hard on our hearts—literally and metaphorically. Read the full story.

—David Ewing Duncan

An easier-to-use technique for storing data in DNA is inspired by our cells

The news: It turns out that you don’t need to be a scientist to encode data in DNA. Researchers have been working on DNA-based data storage for decades, but a new template-based method inspired by our cells’ chemical processes is easy enough for even nonscientists to practice. 

Some background: So far, the process of storing data in DNA has been expensive, time consuming, and error prone. It also required skilled expertise to carry out. 

The details: The new method is more efficient and easy enough that anyone can do it. The researchers enlisted 60 students—studying all sorts of topics, not just science—to test it out, and the trial was a success. It could pave the way for an unusual but ultra-stable way to store information. Read the full story. 

—Jenna Ahart

Read next: We’re making more data than ever. What can—and should—we save for future generations? And will they be able to understand it? Read our feature all about the race to save our online lives from a digital dark age.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Facebook is auto-generating militia group pages
Rather than shutting extremist content down, it’s actually lending a helping hand. (Wired $)
X is shoving political content into people’s feeds, whether they want it or not. (WSJ $)
Some users say they’re being paid thousands of dollars by X to promote misinformation. (BBC)

2 OpenAI is working on its first in-house chip with Broadcom and TSMC
It’s abandoned ambitious plans to manufacture its own chips. Instead, it’s focusing on the design stage of the process. (Reuters $)
Chip designer Arm could become one of the biggest beneficiaries of the AI boom. (FT $)

3 Elon Musk has built a compound for his children and their mothers
It is an… unconventional set-up to say the least. (NYT $) 
Musk fans are losing a lot of money to crypto scams. (Gizmodo)

4 A quarter of new code at Google is now AI-generated 
That fascinating fact emerged from CEO Sundar Pichai himself on the company’s latest earnings call. (The Verge)
GitHub Copilot will switch from only using OpenAI’s models to a multi-model approach. (Ars Technica)
How AI assistants are already changing the way code gets made. (MIT Technology Review)

5 This app can operate your smartphone for you 
If you live in China anyway—but companies everywhere are working on the same capabilities. (South China Morning Post)
LinkedIn has launched an AI agent that purports to do a whole range of recruitment tasks. (TechCrunch)

6 Universal is building an AI music generator 
But it’s a long way off from demoing it just yet. (The Verge)
Rival AI music startups face a big barrier: licensing copyrighted music is very expensive. (MIT Technology Review)

7 Kids are getting around school smartphone bans with smartwatches
But it seems it’s anxious parents that are really driving adoption. (Wired $)

8 Reddit just turned a profit for the first time
It has almost 100 million daily users now. (FT $) 

9 AI is coming to the world of dance 💃
You still need human bodies—but AI is helping with choreography and set designs. (The Guardian)

10 A PhD student found a lost city in Mexico by accident
Luke Auld-Thomas stumbled across a vast ancient Maya city while studying online Lidar survey data. (BBC)

Quote of the day

“Compared to what AI boosters were predicting after ChatGPT was released, this is a glacial pace of adoption.”

—Arvind Narayanan, a computer science professor at Princeton University, in a post on X digging into a study which found that only 0.5% to 3.5% of work hours involve generative AI.

The big story

How Worldcoin recruited its first half a million test users

worldcoin orb
WORLDCOIN

April 2022

In December 2021, residents of the village of Gunungguruh, Indonesia, were curious when technology company Worldcoin turned up at a local school. It was pitched as a “new, collectively owned global currency that will be distributed fairly to as many people as possible,” in exchange for an iris scan and other personal data.

Gunungguruh was not alone in receiving a visit from Worldcoin. MIT Technology Review has interviewed over 35 individuals in six countries who either worked for or on behalf of Worldcoin, had been scanned, or were unsuccessfully recruited to participate.

Our investigation reveals wide gaps between Worldcoin’s public messaging, which focused on protecting privacy, and what users experienced. We found that the company’s representatives used deceptive marketing practices and failed to obtain meaningful informed consent. Read the full investigation.

—Eileen Guo and Adi Renaldi

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)

+ Can you guess these movies from their French name?

+ Why leopard print is an eternally solid style choice. ($)

+ Sitting all day screws our bodies up, but these stretches can help.

+ You can pretty much pinpoint the exact hour you hit peak happiness on vacation. ($)

The arrhythmia of our current age

Thumpa-thumpa, thumpa-thumpa, bump, 

thumpa, skip, 

thumpa-thump, pause …

My heart wasn’t supposed to be beating like this. Way too fast, with bumps, pauses, and skips. On my smartwatch, my pulse was topping out at 210 beats per minute and jumping every which way as my chest tightened. Was I having a heart attack? 

The day was July 4, 2022, and I was on a 12-mile bike ride on Martha’s Vineyard. I had just pedaled past Inkwell Beach, where swimmers sunbathed under colorful umbrellas, and into a hot, damp headwind blowing off the sea. That’s when I first sensed a tugging in my chest. My legs went wobbly. My head started to spin. I pulled over, checked my watch, and discovered that I was experiencing atrial fibrillation—a fancy name for a type of arrhythmia. The heart beats, but not in the proper time. Atria are the upper chambers of the heart; fibrillation means an attack of “uncoordinated electrical activity.”   

I recount this story less to describe a frightening moment for me personally than to consider the idea of arrhythmia—a critical rhythm of life suddenly going rogue and unpredictable, triggered by … what? That July afternoon was steamy and over 90 °F, but how many times had I biked in heat far worse? I had recently recovered from a not-so-bad bout of covid—my second. Plus, at age 64, I wasn’t a kid anymore, even if I didn’t always act accordingly.  

Whatever the proximal cause, what was really gripping me on July 4, 2022, was the idea of arrhythmia as metaphor. That a pulse once seemingly so steady was now less sure, and how this wobbliness might be extrapolated into a broader sense of life in the 2020s. I know it’s quite a leap from one man’s abnormal ticker to the current state of an entire species and era, but that’s where my mind went as I was taken to the emergency department at Martha’s Vineyard Hospital. 

Maybe you feel it, too—that the world seems to have skipped more than a beat or two as demagogues rant and democracy shudders, hurricanes rage, glaciers dissolve, and sunsets turn a deeper orange as fires spew acrid smoke into the sky, and into our lungs. We can’t stop watching tiny screens where influencers pitch products we don’t need alongside news about senseless wars that destroy, murder, and maim tens of thousands. Poverty remains intractable for billions. So does loneliness and a rising crisis in mental health even as we fret over whether AI is going to save us or turn us into pets; and on and on.

For most of my life, I’ve leaned into optimism, confident that things will work out in the end. But as a nurse admitted me and attached ECG leads to my chest, I felt a wave of doubt about the future. Lying on a gurney, I watched my pulse jump up and down on a monitor, erratically and still way too fast, as another nurse poked a needle into my hand to deliver an IV bag of saline that would hydrate my blood vessels. Soon after, a young, earnest doctor came in to examine me, and I heard the word uttered for the first time. 

“You are having an arrhythmia,” he said.

Even with my heart beating rat-a-tat-tat, I couldn’t help myself. Intrigued by the word, which I had heard before but had never really heard, I pulled out the phone that is always at my side and looked it up.

ar·rhyth·mi·a
Noun: “a condition in which the heart beats with an irregular or abnormal rhythm.” Greek a-, “without,” and rhuthmos, “rhythm.”

I lay back and closed my eyes and let this Greek origin of the word roll around in my mind as I repeated it several times—rhuthmos, rhuthmos, rhuthmos.

Rhythm, rhythm, rhythm …

I tapped my finger to follow the beat of my heart, but of course I couldn’t, because my heart wasn’t beating in the steady and predictable manner that my finger could easily have followed before July 4, 2022. After all, my heart was built to tap out in a rhythm, a rhuthmos—not an arhuthmos. 

Later I discovered that the Greek rhuthmos, ῥυθμός, like the English rhythm, refers not only to heartbeats but to any steady motion, symmetry, or movement. For the ancient Greeks this word was closely tied to music and dance; to the physics of vibration and polarity; to a state of balance and harmony. The concept of rhuthmos was incorporated into Greek classical sculptures using a strict formula of proportions called the Kanon, an example being the Doryphoros (Spear Bearer) originally by the fifth-century sculptor Polykleitos. Standing today in the Acropolis Museum in Athens, this statue appears to be moving in an easy fluidity, a rhuthmos that’s somehow drawn out of the milky-colored stone. 

The Greeks also thought of rhuthmos as harmony and balance in emotions, with Greek playwrights penning tragedies where the rhuthmos of life, nature, and the gods goes awry. “In this rhythm, I am caught,” cries Prometheus in Aeschylus’s Prometheus Bound, where rhuthmos becomes a steady, unrelenting punishment inflicted by Zeus when Prometheus introduces fire to humans, providing them with a tool previously reserved for the gods. Each day Prometheus, who is chained to a rock, has his liver eaten out by an eagle, only to have the liver grow back each night, a cycle repeated day after day in a steady beat for an eternity of penance, pain, and vexation.

In modern times, cardiologists have used rhuthmos to refer to the physical beating of the muscle in our chests that mixes oxygen and blood and pumps it through 60,000 miles of veins, arteries, and capillaries to fingertips, toe tips, frontal cortex, kidneys, eyes, everywhere. In 2006, the journal Rhythmos launched as a quarterly medical publication that focuses on cardiac electrophysiology. This subspecialty of cardiology involves the electrical signals animating the heart with pulses that keep it beating steadily—or, for me in the summer of 2022, not. 

The question remained: Why?

As far as I know, I wasn’t being punished by Zeus, although I couldn’t entirely rule out the possibility that I had annoyed some god or goddess and was catching hell for it. Possibly covid was the culprit—that microscopic bundle of RNA with the power of a god to mess with us mortals—but who knows? As science learns more about this pernicious bug, evidence suggests that it can play havoc with the nervous system and tissue that usually make sure the heart stays in rhuthmos. 

A-fib also can be instigated by even moderate imbibing of alcohol, by aging, and sometimes by a gene called KCNQ1. Mutations in this gene “appear to increase the flow of potassium ions through the channel formed with the KCNQ1 protein,” according to MedlinePlus, part of the National Library of Medicine. “The enhanced ion transport can disrupt the heart’s normal rhythm, resulting in atrial fibrillation.” Was a miscreant mutation playing a role in my arrhythmia?

Angst and fear can influence A-fib too. I had plenty of both during the pandemic, along with most of humanity. Lest we forget—and we’re trying really, really hard to forget—covid anxiety continued to rage in the summer of 2022, even after vaccines had arrived and most of the world had reopened. 

Back then, the damage done to fragile brains forced to shelter in place for months and months was still fresh. Cable news and social media continued to amplify the terror of seeing so many people dead or facing permanent impairment. Politics also seemed out of control, with demagogues—another Greek word—running amok. Shootings, invasions, hatred, and fury seemed to lurk everywhere. This is one reason I stopped following the news for days at a time—something I had never done, as a journalist and news junkie. I felt that my fragile heart couldn’t bear so much visceral tragedy, so much arhuthmos.

We each have our personal stories from those dark days. For me, covid came early in 2020 and led to a spring and summer with a pervasive brain fog, trouble breathing, and eventually a depression of the sort that I had never experienced before. At the same time, I had friends who ended up in the ICU, and I knew people whose parents and other relatives had passed. My mother was dying of dementia, and my father had been in and out of the ICU a half-dozen times with myasthenia gravis, an autoimmune disease that can be fatal. This family dissolution had started before covid hit, but the pandemic made the implosion of my nuclear family seem worse and undoubtedly contributed to the failure of my heart’s pulse to stay true. 

Likewise, the wider arhuthmos some of us are feeling now began long before the novel coronavirus shut down ordinary life in March 2020. Statistics tell us that anxiety, stress, depression, and general mental unhealthiness have been steadily ticking up for years. This seems to suggest that something bigger has been going on for some time—a collective angst that seems to point to the darker side of modern life itself. 

Don’t get me wrong. Modern life has provided us with spectacular benefits—Manhattan, Boeing 787 Dreamliners, IMAX films, cappuccinos, and switches and dials on our walls that instantly illuminate or heat a room. Unlike our ancestors, most of us no longer need to fret about when we will eat next or whether we’ll find a safe place to sleep, or worry that a saber-toothed tiger will eat us. Nor do we need to experience an A-fib attack without help from an eager and highly trained young doctor, an emergency department, and an IV to pump hydration into our veins. 

But there have been trade-offs. New anxieties and threats have emerged to make us feel uneasy and arrhythmic. These start with an uneven access to things like emergency departments, eager young doctors, shelter, and food—which can add to anxiety not only for those without them but also for anyone who finds this situation unacceptable. Even being on the edge of need can make the heart gambol about.

Consider, too, the basic design features of modern life, which tend toward straight lines—verticals and horizontals. This comes from an instinct we have to tidy up and organize things, and from the fact that verticals and horizontals in architecture are stable and functional. 

All this straightness, however, doesn’t always sit well with brains that evolved to see patterns and shapes in the natural world, which isn’t horizontal and vertical. Our ancestors looked out over vistas of trees and savannas and mountains that were not made from straight lines. Crooked lines, a bending tree, the fuzzy contour of a grassy vista, a horizon that bobs and weaves—these feel right to our primordial brains. We are comforted by the curve of a robin’s breast and the puffs and streaks and billows of clouds high in the sky, the soft earth under our feet when we walk.

Not to overly romanticize nature, which can be violent, unforgiving, and deadly. Devastating storms and those predators with sharp teeth were a major reason why our forebears lived in trees and caves and built stout huts surrounded by walls. Homo sapiens also evolved something crucial to our survival—optimism that they would survive and prevail. This has been a powerful tool—one of the reasons we are able to forge ahead, forget the horrors of pandemics and plagues, build better huts, and learn to make cappuccinos on demand. 

As one of the great optimists of our day, Kevin Kelly, has said: “Over the long term, the future is decided by optimists.” 

But is everything really okay in this future that our ancestors built for us? Is the optimism that’s hardwired into us and so important for survival and the rise of civilization one reason for the general anxiety we’re feeling in a future that has in some crucial ways turned out less ideal than those who constructed it had hoped? 

At the very least, modern life seems to be downplaying elements that are as critical to our feelings of safety as sturdy walls, standing armies, and clean ECGs—and truly more crucial to our feelings of happiness and prosperity than owning two cars or showing off the latest swimwear on Miami Beach. These fundamentals include love and companionship, which statistics tell us are in short supply. Today millions have achieved the once optimistic dream of living like minor pharaohs and kings in suburban tract homes and McMansions, yet inadvertently many find themselves separated from the companionship and community that are basic human cravings. 

Modern science and technology can be dazzling and good and useful. But they’ve also been used to design things that hurt us broadly while spectacularly benefiting just a few of us. We have let the titans of social media hijack our genetic cravings to be with others, our need for someone to love and to love us, so that we will stay glued to our devices, even in the ED when we think we might be having a heart attack. Processed foods are designed to play on our body’s craving for sweets and animal fat, something that evolution bestowed so we would choose food that is nutritious and safe to eat (mmm, tastes good) and not dangerous (ugh, sour milk). But now their easy abundance overwhelms our bodies and makes many of us sick. 

We invented money so that acquiring things and selling what we make in order to live better would be faster and easier. In the process, we also invented a whole new category of anxiety—about money. We worry about having too little of it and sometimes too much; we fear that someone will steal it or trick us into spending it on things we don’t need. Some of us feel guilty about not spending enough of it on feeding the hungry or repairing our climate. Money also distorts elections, which require huge amounts of it. You may have gotten a text message just now, asking for some to support a candidate you don’t even like. 

The irony is that we know how to fix at least some of what makes us on edge. For instance, we know we shouldn’t drive gas-guzzling SUVs and that we should stop looking at endless perfect kitchens, too-perfect influencers, and 20-second rants on TikTok. We can feel helpless even as new ideas and innovations proliferate. This may explain one of the great contradictions of this age of arrhythmia—one demonstrated in a 2023 UNESCO global survey about climate change that questioned 3,000 young people from 80 different countries, aged 16 to 24. Not surprisingly, 57% were “eco-anxious.” But an astonishing 67% were “eco-optimistic,” meaning many were both anxious and hopeful. 

Me too. 

All this anxiety and optimism have been hard on our hearts—literally and metaphorically. Too much worry can cause this fragile muscle to break down, to lose its rhythm. So can too much of modern life. Cardiovascular disease remains the No. 1 killer of adults, in the US and most of the world, with someone in America dying of it every 33 seconds, according to the Centers for Disease Control and Prevention. The incidence of A-fib has tripled in the past 50 years (possibly because we’re diagnosing it more); it afflicted almost 50 million people globally in 2016.


For me, after that initial attack on Martha’s Vineyard, the A-fib episodes kept coming. I charted them on my watch, the blips and pauses in my pulse, the moments when my heart raced at over 200 beats per minute, causing my chest to tighten and my throat to feel raw. Sometimes I tasted blood, or thought I did. I kept bicycling through the summer and fall of 2022, gingerly watching my heart rate to see if I could keep the beats from taking a sudden leap from normal to out of control. 

When an arrhythmic episode happened, I struggled to catch my breath as I pulled over to the roadside to wait for the misfirings to pass. Sometimes my mind grew groggy, and I got confused. It became difficult during these cardio-disharmonious moments to maintain my cool with other people. I became less able to process the small setbacks that we all face every day—things I previously had been able to let roll off my back. 

Early in 2023 I had my heart checked by a cardiologist. He conducted an echocardiogram and had me jog on a treadmill hooked up to monitors. “There has been no damage to your heart,” he declared after getting the results, pointing to a black-and-white video of my heart muscle contracting and constricting, drawing in blood and pumping it back out again. I felt relieved, although he also said that the A-fib was likely to persist, so he prescribed a blood thinner called Eliquis as a precaution to prevent stroke. Apparently, during unnatural pauses in one’s heartbeat, blood can clot and send tiny, scab-like fragments into the brain, potentially clogging up critical capillaries and other blood vessels. “You don’t want that to happen,” said the cardiologist.

Toward the end of my heart exam, the doctor mentioned a possible fix for my arrhythmia. I was skeptical, although what he proposed turned out to be one of the great pluses of being alive right now—a solution that was unavailable to my ancestors or even to my grandparents. “It’s called a heart ablation,” he said. The procedure, a simple operation, redirects errant electric signals in the heart muscle to restore a normal pattern of beating. Doctors will run a tube into your heart, find the abnormal tissue throwing off the rhythm, and zap it with either extreme heat, cold, or (the newest option) electrical pulses. There are an estimated 240,000 such procedures a year in the United States. 

“Can you really do that?” I asked.

“We can,” said the doctor. “It doesn’t always work the first time. Sometimes you need a second or third procedure, but the success rate is high.”

A few weeks later, I arrived at Beth Israel Hospital in Boston at 11 a.m. on a Tuesday. My first cardiologist was unavailable to do the procedure, so after being prepped in the pre-op area I was greeted by Andre d’Avila, a specialist in cardiac electrophysiology, who explained again how the procedure worked. He said that he and an electrophysiology fellow would insert long, snakelike catheters through the femoral veins in my groin; the catheters contain wires tipped with a tiny ultrasound camera and a cauterizer that would be used to selectively and carefully burn the surfaces of my atrial muscles. The idea was to create patterns of scar tissue to block and redirect the errant electrical signals and restore a steady rhuthmos to my heart. The whole thing would take about two or three hours, and I would likely be going home that afternoon.

Moments later, an orderly came and wheeled me through busy hallways to an OR where Dr. d’Avila introduced the technicians and nurses on his OR team. Monitors pinged and machines whirred as an anesthesiologist placed a mask over my mouth and nose, and I slipped into unconsciousness. 

The ablation was a success. Since I woke up, my heart has kept a steady beat, restoring my internal rhuthmos, even if the procedure sadly did not repair the myriad worrisome externalities—the demagogues, carbon footprints, and the rest. Still, the undeniably miraculous singeing of my atrial muscles left me with a realization that if human ingenuity can fix my heart and restore its rhythm, shouldn’t we be able to figure out how to fix other sources of arhuthmos in our lives? 

We already have solutions to some of what ails us. We know how to replace fossil fuels with renewables, make cities less sharp-edged, and create smart gizmos and apps that calm our minds rather than agitating them. 

For my own small fix, I thank Dr. d’Avila and his team, and the inventors of the ablation procedure. I also thank Prometheus, whose hubris in bringing fire to mortals literally saved me by providing the hot-tipped catalyst to repair my ailing heart. Perhaps this can give us hope that the human species will bring the larger rhythms of life into a better, if not perfect, beat. Call me optimistic, but also anxious, about our prospects even as I can now place my finger on my wrist and feel once again the steady rhuthmos of my heart.

An easier-to-use technique for storing data in DNA is inspired by our cells 

30 October 2024 at 11:00

It turns out that you don’t need to be a scientist to encode data in DNA. Researchers have been working on DNA-based data storage for decades, but a new template-based method inspired by our cells’ chemical processes is easy enough for even nonscientists to practice. The technique could pave the way for an unusual but ultra-stable way to store information. 

The idea of storing data in DNA was first proposed in the 1950s by the physicist Richard Feynman. Genetic material has exceptional storage density and durability; a single gram of DNA can store a trillion gigabytes of data and retain the information for thousands of years. Decades later, a team led by George Church at Harvard University put the idea into practice, encoding a 53,400-word book.

This early approach relied on DNA synthesis—stringing genetic sequences together piece by piece, like beads on a thread, using the four nucleotide building blocks A, T, C, and G to encode information. The process was expensive, time-consuming, and error-prone, creating only one bit (or an eighth of a byte) with each nucleotide added to a strand. Crucially, the process required specialized expertise to carry out.

The new method, published in Nature last week, is more efficient, storing 350 bits at a time by encoding strands in parallel. Rather than hand-threading each DNA strand, the team assembles strands from pre-built DNA bricks about 20 nucleotides long, encoding information by altering some and not others along the way. Peking University’s Long Qian and team got the idea for such templates from the way cells share the same basic set of genes but behave differently in response to chemical changes in DNA strands. “Every cell in our bodies has the same genome sequence, but genetic programming comes from modifications to DNA. If life can do this, we can do this,” she says. 

Qian and her colleagues encoded data through methylation, a chemical reaction that switches genes on and off by attaching a methyl compound—a small methane-related molecule. Once the bricks are locked into their assigned spots on the strand, researchers select which bricks to methylate, with the presence or absence of the modification standing in for binary values of 0 or 1. The information can then be deciphered using nanopore sequencers to detect whether a brick has been methylated. In theory, the new method is simple enough to be carried out without detailed knowledge of how to manipulate DNA.

The storage capacity of each DNA strand caps off at roughly 70 bits. For larger files, researchers splintered data into multiple strands identified by unique barcodes encoded in the bricks. The strands were then read simultaneously and sequenced according to their barcodes. With this technique, researchers encoded the image of a tiger rubbing from the Han dynasty, troubleshooting the encoding process until the image came back with no errors. The same process worked for more complex images, like a photorealistic print of a panda. 
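The barcoding step can be sketched the same way. The 70-bit payload size comes from the article; representing the barcode as a simple integer index is an assumption made for clarity—the real system encodes barcodes in the bricks themselves.

```python
import random

PAYLOAD_BITS = 70  # approximate per-strand capacity reported in the article

def split_into_strands(bits):
    """Chunk a long bit string into barcoded ~70-bit strand payloads."""
    n_strands = (len(bits) + PAYLOAD_BITS - 1) // PAYLOAD_BITS
    return [
        {"barcode": i, "payload": bits[i * PAYLOAD_BITS:(i + 1) * PAYLOAD_BITS]}
        for i in range(n_strands)
    ]

def reassemble(strands):
    """Reads come back in arbitrary order; sort by barcode to rebuild the file."""
    return "".join(s["payload"] for s in sorted(strands, key=lambda s: s["barcode"]))

data = "".join(random.choice("01") for _ in range(500))
strands = split_into_strands(data)
random.shuffle(strands)              # simulate unordered sequencing reads
assert reassemble(strands) == data
```

The key design point is that the strands carry their own addresses, so sequencing order does not matter—the decoder reconstructs the file no matter how the molecules come off the machine.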

To gauge the real-world applicability of their approach, the team enlisted 60 students from diverse academic backgrounds—not just scientists—to encode any writing of their choice. The volunteers transcribed their writing into binary code through a web server. Then, with a kit sent by the team, they pipetted an enzyme into a 96-well plate of the DNA bricks, marking which would be methylated. The team then ran the samples through a sequencer to read the resulting DNA strands. Once the computer received the sequence, researchers ran a decoding algorithm and sent the restored message back to a web server for students to retrieve with a password. The writing came back with a 1.4% error rate in letters, and the errors were eventually corrected with language models. 

Once it’s more thoroughly developed, Qian sees the technology becoming useful as long-term storage for archival information that isn’t accessed every day, like medical records, financial reports, or scientific data.  

The success nonscientists achieved using the technique in coding trials suggests that DNA storage could eventually become a practical technology. “Everyone is storing data every day, and so to compete with traditional data storage technologies, DNA methods need to be usable by the everyday person,” says Jeff Nivala, co-director of University of Washington’s Molecular Information Systems Lab. “This is still an early demonstration of going toward nonexperts, but I think it’s pretty unique that they’re able to do that.”

DNA storage still has many strides left to make before it can compete with traditional data storage. The new system is more expensive than either traditional data storage techniques or previous DNA-synthesis methods, Nivala says, though the encoding process could become more efficient with automation on a larger scale. With future development, template-based DNA storage might become a more secure method of tackling ever-climbing data demands. 

Cultivating the next generation of AI innovators in a global tech hub

A few years ago, I had to make one of the biggest decisions of my life: continue as a professor at the University of Melbourne or move to another part of the world to help build a brand new university focused entirely on artificial intelligence.

With the rapid development we have seen in AI over the past few years, I came to the realization that educating the next generation of AI innovators in an inclusive way and sharing the benefits of technology across the globe is more important than maintaining the status quo. I therefore packed my bags for the Mohammed bin Zayed University of Artificial Intelligence (MBZUAI) in Abu Dhabi.

The world in all its complexity

Today, the rewards of AI are mostly enjoyed by a few countries in what the Oxford Internet Institute dubs the “Compute North.” These countries, such as the US, the UK, France, Canada, and China, have dominated research and development, and built state-of-the-art AI infrastructure capable of training foundational models. This should come as no surprise, as these countries are home to many of the world’s top universities and large tech corporations.

But this concentration of innovation comes at a cost for the billions of people who live outside these dominant countries and have different cultural backgrounds.

Large language models (LLMs) are illustrative of this disparity. Researchers have shown that many of the most popular multilingual LLMs perform poorly with languages other than English, Chinese, and a handful of other (mostly) European languages. Yet, there are approximately 6,000 languages spoken today, many of them in communities in Africa, Asia, and South America. Arabic alone is spoken by almost 400 million people and Hindi has 575 million speakers around the world.

For example, LLaMA 2 performs up to 50% better in English compared to Arabic, when measured using the LM-Evaluation-Harness framework. Meanwhile, Jais, an LLM co-developed by MBZUAI, exceeds LLaMA 2 in Arabic and is comparable to Meta’s model in English (see table below).

The chart shows that the only way to develop AI applications that work for everyone is by creating new institutions outside the Compute North that consistently and conscientiously invest in building tools designed for the thousands of language communities across the world.

Environments of innovation

One way to design new institutions is to study history and understand how today’s centers of gravity in AI research emerged decades ago. Before Silicon Valley earned its reputation as the center of global technological innovation, it was called Santa Clara Valley and was known for its prune farms. The main catalyst for its transformation was Stanford University, which had built a reputation as one of the best places in the world to study electrical engineering. Over the years, through a combination of government-led investment in grants and focused research, the university birthed countless inventions that advanced computing and created a culture of entrepreneurship. The results speak for themselves: Stanford alumni have founded companies such as Alphabet, NVIDIA, Netflix, and PayPal, to name a few.

Today, like MBZUAI’s predecessor in Santa Clara Valley, we have an opportunity to build a new technology hub centered around a university.

And that’s why I chose to join MBZUAI, the world’s first research university focused entirely on AI. From MBZUAI’s position at the geographical crossroads of East and West, our goal is to attract the brightest minds from around the world and equip them with the tools they need to push the boundaries of AI research and development.

A community for inclusive AI

MBZUAI’s student body comes from more than 50 different countries around the globe. It has attracted top researchers such as Monojit Choudhury from Microsoft, Elizabeth Churchill from Google, Ted Briscoe from the University of Cambridge, Sami Haddadin from the Technical University of Munich, and Yoshihiko Nakamura from the University of Tokyo, just to name a few.

These scientists may be from different places but they’ve found a common purpose at MBZUAI with our interdisciplinary nature, relentless focus on making AI a force for global progress, and emphasis on collaboration across disciplines such as robotics, NLP, machine learning, and computer vision.

In addition to traditional AI disciplines, MBZUAI has built departments in sibling areas that can both contribute to and benefit from AI, including human-computer interaction, statistics and data science, and computational biology.

Abu Dhabi’s commitment to MBZUAI is part of a broader vision for AI that extends beyond academia. MBZUAI’s scientists have collaborated with G42, an Abu Dhabi-based tech company, on Jais, an Arabic-centric LLM that is the highest-performing open-weight Arabic LLM; and also NANDA, an advanced Hindi LLM. MBZUAI’s Institute of Foundational Models has created LLM360, an initiative designed to level the playing field of large model research and development by publishing fully open source models and datasets that are competitive with closed source or open weights models available from tech companies in North America or China.

MBZUAI is also developing language models that specialize in Turkic languages, which have traditionally been underrepresented in NLP, yet are spoken by millions of people.

Another recent project has brought together native speakers of 26 languages from 28 different countries to compile a benchmark dataset that evaluates the performance of vision language models and their ability to understand cultural nuances in images.

These kinds of efforts to expand the capabilities of AI to broader communities are necessary if we want to maintain the world’s cultural diversity and provide everyone with AI tools that are useful to them. At MBZUAI, we have created a unique mix of students and faculty to drive globally inclusive AI innovation for the future. By building a broad community of scientists, entrepreneurs, and thinkers, the university is increasingly establishing itself as a driving force in AI innovation that extends far beyond Abu Dhabi, with the goal of developing technologies that are inclusive for the world’s diverse languages and cultures.

This content was produced by the Mohamed bin Zayed University of Artificial Intelligence. It was not written by MIT Technology Review’s editorial staff.

The Download: mysterious exosomes, and AI’s e-waste issue

29 October 2024 at 13:35

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

Exosomes are touted as a trendy cure-all. We don’t know if they work.

There’s a trendy new cure-all in town—you might have seen ads pop up on social media or read rave reviews in beauty magazines. 

Exosomes are being touted as a miraculous treatment for hair loss, aging skin, acne, eczema, pain conditions, long covid, and even neurological diseases like Parkinson’s and Alzheimer’s. That’s, of course, if you can afford the price tag—which can stretch to thousands of dollars.

But there’s a big problem with these big promises: We don’t fully understand how exosomes work—or what they even really are. Read our story

—Jessica Hamzelou

AI will add to the e-waste problem. Here’s what we can do about it.

The news: Generative AI could add up to 5 million metric tons of e-waste in total by 2030, according to a new study. That’s a relatively small fraction of the current global total of over 60 million metric tons of e-waste each year. However, it’s still a significant part of a growing problem.

Under the hood: The primary contributor is high-performance computing hardware that’s used in data centers and server farms. That equipment is full of valuable metals and hazardous materials, and it’s being replaced at a rapid rate as AI companies race to adopt the most cutting-edge hardware to power their models.

What can be done: Extending hardware’s lifespan is one of the most significant ways to cut down on e-waste. Refurbishing and reusing components can also play a major role, as can designing hardware in ways that make it easier to recycle and upgrade. Read the full story.

—Casey Crownhart

Militaries are great testing grounds for AI tech, says Palmer Luckey

War is a catalyst for technological change, and the last couple of years have been marred by high-profile conflicts around the world. Geopolitical tensions are still rising. 

Silicon Valley players are poised to benefit. One of them is Palmer Luckey, the founder of the virtual-reality headset company Oculus, which he sold to Facebook for $2 billion. After Luckey’s highly public ousting from Meta, he founded Anduril, which focuses on drones, cruise missiles, and other AI-enhanced technologies for the US Department of Defense. The company is now valued at $14 billion. We interviewed Luckey about his new project: headsets for the military.

But the use of AI for the military is a controversial topic, with a long and bitter history that stretches from Project Maven to killer robots. Read the full story.

—Melissa Heikkilä

This story is from The Algorithm, our weekly newsletter all about the latest in AI. Sign up to receive it in your inbox every Monday.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Strava is leaking the location of foreign leaders
Their bodyguards’ runs are revealing more than they ought to. (Le Monde)
+ It’s shockingly easy to buy sensitive data about US military personnel. (MIT Technology Review)

2 A man who used AI to make child sexual abuse images has been jailed
His 18-year sentence is the first of its kind in the UK. (FT $)

3 Here’s what Trump plans to do if he wins a second term
The 900-page Project 2025 document provides plenty of hints. (The Verge)
+ It would be hard for him to roll back the Green New Deal—but not impossible. (Axios)
+ Russia, China and Iran are interfering in the election. (NYT $)
+ But cybercriminals may pose an even greater threat. (Wired $)

4 Apple Intelligence is here 
But it seems it’s still kinda dumb. (WP $)
+ Meta is reportedly building its own AI search engine. (The Information $)
+ The trouble is, AI chatbots make stuff up. And it’s not a fully fixable problem. (MIT Technology Review)

5 Medium is drowning in AI slop
Almost half of the posts on there now are probably AI-generated. (Wired $)

6 What steampunk can teach tech today
We’re too keen on removing friction—people still like fiddling with dials and gears. (New Yorker $)
+ Prosthetics designers are coming up with new ways to augment our bodies. (MIT Technology Review)

7 This is what wargaming looks like now
Militaries around the world use software called Command PE built by a tiny British game publisher. (WSJ $)

8 TikTok’s founder has become China’s richest man 
Zhang Yiming’s wealth has almost doubled in the last year, to $49 billion. (BBC)
+ How China takes extreme measures to keep teens off TikTok. (MIT Technology Review)

9 How complex life started to flourish 🦠
You can thank eukaryotes, a type of cell that emerged about 3 billion years ago. (Quanta $)

10 Oregon Trail is being turned into an action-comedy movie
With musical numbers. Yes, seriously. (Hollywood Reporter)

Quote of the day

“I thought it would conquer the world.”

Tim Walz, the Democratic nominee for vice president, spoke for us all (well, for me anyway), when he waxed lyrical about the 1999 Sega Dreamcast video game console on a Twitch stream last weekend, the Washington Post reports.

 The big story

Meet the radio-obsessed civilian shaping Ukraine’s drone defense


September 2024

Drones have come to define the brutal conflict in Ukraine that has now dragged on for more than two and a half years. And most rely on radio communications—a technology that Serhii “Flash” Beskrestnov has obsessed over since childhood.

While Flash is now a civilian, the former officer has still taken it upon himself to inform his country’s defense in all matters related to radio. He studies Russian transmissions and tries to learn about the problems facing troops.

In this race for survival—as each side constantly tries to best the other, only to start all over again when the other inevitably catches up—Ukrainian soldiers need to develop creative solutions, and fast. As Ukraine’s wartime radio guru, Flash may just be one of their best hopes for doing that. Read the full story.

—Charlie Metcalfe

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)

+ Timothée Chalamet turned up at his own look-alike contest in New York last weekend. Spoiler alert: he didn’t win. 

+ Learn these basic rules to make veg-based meals delicious.

+ There’s something very special about ancient trees.

+ Do you tend to please everyone but yourself? Here’s how to stop. (NYT $)
