
Trump’s win is a tragic loss for climate progress

Donald Trump’s decisive victory is a stunning setback for the fight against climate change.

The Republican president-elect’s return to the White House means the US is going to squander precious momentum, unraveling hard-won policy progress that was just beginning to pay off, all for the second time in less than a decade. 

It comes at a moment when the world can’t afford to waste time, with nations far off track from any emissions trajectories that would keep our ecosystems stable and our communities safe. Under the policies in place today, the planet is already set to warm by roughly 3 °C over preindustrial levels by the end of the century.

Trump could push the globe into even more dangerous terrain by defanging President Joe Biden’s signature climate laws. In fact, a second Trump administration could boost greenhouse-gas emissions by 4 billion tons through 2030 alone, according to an earlier analysis by Carbon Brief, a well-regarded climate news and data site. That would exacerbate the dangers of heat waves, floods, wildfires, droughts, and famine and increase deaths and disease from air pollution, inflicting some $900 billion in climate damages around the world, Carbon Brief found.

I started as the climate editor at MIT Technology Review just as Trump came into office the last time. Much of the early job entailed covering his systematic unraveling of the modest climate policy and progress that President Barack Obama had managed to achieve. I fear it will be far worse this time, as Trump ambles into office feeling empowered and aggrieved, and ready to test the rule of law and crack down on dissent. 

This time his administration will be staffed all the more by loyalists and ideologues, who have already made plans to force civil servants with expertise and experience out of federal agencies including the Environmental Protection Agency. He’ll be backed by a Supreme Court that he moved well to the right, and which has already undercut landmark environmental doctrines and weakened federal regulatory agencies. 

This time the setbacks will sting more, too, because the US did finally manage to pass real, substantive climate policy, through the slimmest of congressional margins. The Inflation Reduction Act and Bipartisan Infrastructure Law allocated massive amounts of government funding to accelerating the shift to low-emissions industries and rebuilding the US manufacturing base around a clean-energy economy. 

Trump has made clear he will strive to repeal as many of these provisions as he can, tempered perhaps only by Republicans who recognize that these laws are producing revenue and jobs in their districts. Meanwhile, throughout the prolonged presidential campaign, Trump or his surrogates pledged to boost oil and gas production, eliminate federal support for electric vehicles, end pollution rules for power plants, and remove the US from the Paris climate agreement yet again. Each of those goals stands in direct opposition to the deep, rapid emissions cuts now necessary to prevent the planet from tipping past higher and higher temperature thresholds.

Project 2025, considered a blueprint for the early days of a second Trump administration despite his insistence to the contrary, calls for dismantling or downsizing federal institutions including the National Oceanic and Atmospheric Administration and the Federal Emergency Management Agency. That could cripple the nation’s ability to forecast, track, or respond to storms, floods, and fires like those that have devastated communities in recent months.

Observers I’ve spoken to fear that the Trump administration will also return the Department of Energy, which under Biden had evolved its mission toward developing low-emissions technologies, to the primary task of helping companies dig up more fossil fuels.

The US election could create global ripples as well, and very soon. US negotiators will meet with their counterparts at the annual UN climate conference that kicks off next week. With Trump set to move back into the White House in January, they will have little credibility or leverage to nudge other nations to step up their commitments to reducing emissions. 

But those are just some of the direct ways that a second Trump administration will enfeeble the nation’s ability to drive down emissions and counter the growing dangers of climate change. He also has considerable power to stall the economy and sow international chaos amid escalating conflicts in Europe and the Middle East. 

Trump’s eagerness to enact tariffs, slash government spending, and deport major portions of the workforce may stunt growth, drive up inflation, and chill investment. All that would make it far more difficult for companies to raise the capital and purchase the components needed to build anything in the US, whether that means wind turbines, solar farms, and seawalls or buildings, bridges, and data centers. 

President-elect Donald Trump speaks at an election night event in West Palm Beach, Florida.
WIN MCNAMEE/GETTY IMAGES

His clumsy handling of the economy and international affairs may also help China extend its dominance in producing and selling the components that are crucial to the energy transition, including batteries, EVs, and solar panels, to customers around the globe.

If one job of a commentator is to find some perspective in difficult moments, I admit I’m mostly failing in this one.

The best I can do is to say that there will be some meaningful lines of defense. For now, at least, state leaders and legislatures can continue to pass and implement stronger climate rules. Other nations could step up their efforts to cut emissions and assert themselves as global leaders on climate. 

Private industry will likely continue to invest in and build businesses in climate tech and clean energy, since solar, wind, batteries, and EVs have proved themselves as competitive industries. And technological progress can occur no matter who is sitting in the Oval Office, since researchers continue striving to develop cleaner, cheaper ways of producing our energy, food, and goods.

By any measure, the job of addressing climate change is now much harder. Nothing, however, has changed about the stakes. 

Our world doesn’t end if we surpass 2 °C, 2.5 °C, or even 3 °C, but it will steadily become a more dangerous and erratic place. Every tenth of a degree remains worth fighting for—whether two, four, or a dozen years from now—because every bit of warming that nations pull together to prevent eases future suffering somewhere.

So as the shock wears off and the despair begins to lift, the core task before us remains the same: to push for progress, whenever, wherever, and however we can. 

The US is about to make a sharp turn on climate policy

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Voters have elected Donald Trump to a second term in the White House.

In the days leading up to the election, I kept thinking about what four years means for climate change right now. We’re at a critical moment that requires decisive action to rapidly slash greenhouse-gas emissions from power plants, transportation, industry, and the rest of the economy if we’re going to achieve our climate goals.

The past four years have seen the US take climate action seriously, working with the international community and pumping money into solutions. Now, we’re facing a period where things are going to be very different. A Trump presidency will have impacts far beyond climate, but for the sake of this newsletter, we’ll stay focused on what four years means in the climate fight as we start to make sense of this next chapter. 

Joe Biden arguably did more to combat climate change than any other American president. One of his first actions in office was rejoining the Paris climate accord—Trump pulled out of the international agreement to fight climate change during his first term in office. Biden then quickly set a new national goal to cut US carbon emissions in half, relative to their peak, by 2030.

The Environmental Protection Agency rolled out rules for power plants to slash pollution that harms both human health and the climate. The agency also announced new regulations for vehicle emissions to push the country toward EVs.

And the cornerstone of the Biden years has been unprecedented climate investment. A trio of laws—the Bipartisan Infrastructure Law, the CHIPS and Science Act, and the Inflation Reduction Act—pumped hundreds of billions of dollars into infrastructure and research, much of it on climate.

Now, this ship is about to make a quick turn. Donald Trump has regularly dismissed the threat of climate change and promised throughout the campaign to counter some of Biden’s key moves.

We can expect to see a dramatic shift in how the US talks about climate on the international stage. Trump has vowed to once again withdraw from the Paris agreement. Things are going to be weird at the annual global climate talks that kick off next week.

We can also expect to see efforts to undo some of Biden’s key climate actions, most centrally the Inflation Reduction Act, as my colleague James Temple covered earlier this year.

What, exactly, Trump can do will depend on whether Republicans take control of both houses of Congress. A clean sweep would open up more lanes for targeting legislation passed under Biden. (As of sending this email, Republicans have secured enough seats to control the Senate, but the House is uncertain and could be for days or even weeks.)

I don’t think the rug will be entirely pulled out from under the IRA—portions of the investment from the law are beginning to pay off, and the majority of the money has gone to Republican districts. But there will certainly be challenges to pieces, especially the EV tax credits, which Trump has been laser-focused on during the campaign.

This all adds up to a very different course on climate than what many had hoped we might see for the rest of this decade.

A Trump presidency could add 4 billion metric tons of carbon dioxide emissions to the atmosphere by 2030 over what was expected from a second Biden term, according to an analysis published in April by the website Carbon Brief (this was before Biden dropped out of the race). That projection sees emissions under Trump dropping by 28% below the peak by the end of the decade—nowhere near the 50% target set by Biden at the beginning of his term.
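To see what those percentages mean in absolute terms, here’s a back-of-envelope sketch in Python. The peak value is my own rough assumption (US net greenhouse-gas emissions peaked around 2005), not a number from Carbon Brief’s analysis:

```python
# Illustrative arithmetic for the trajectories described above.
# ASSUMPTION: a US emissions peak of ~6.6 billion metric tons CO2e
# (roughly in line with EPA inventories); not from the article.
PEAK_GTCO2E = 6.6

trump_2030 = PEAK_GTCO2E * (1 - 0.28)   # ~28% below peak (Carbon Brief)
target_2030 = PEAK_GTCO2E * (1 - 0.50)  # 50% below peak (Biden's goal)

print(f"2030 emissions, ~28% cut: {trump_2030:.1f} GtCO2e")
print(f"2030 emissions, 50% cut:  {target_2030:.1f} GtCO2e")
print(f"Annual gap in 2030:       {trump_2030 - target_2030:.1f} GtCO2e")
```

An annual gap on the order of 1.5 billion tons, accumulating over the back half of the decade, is consistent in rough magnitude with Carbon Brief’s 4-billion-ton cumulative estimate.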

The US, which is currently the world’s second-largest greenhouse-gas emitter and has added more climate pollution to the atmosphere than any other nation, is now very unlikely to hit Biden’s 2030 goal. That’s basically the final nail in the coffin for efforts to limit global warming to 1.5 °C (2.7 °F) over preindustrial levels.

In the days, weeks, and years ahead we’ll be covering what this change will mean for efforts to combat climate change and to protect the most vulnerable from the dangerous world we’re marching toward—indeed, already living in. Stay tuned for more from us.


Now read the rest of The Spark

Related reading

Trump wants to unravel Biden’s landmark climate law. Read our coverage from earlier this year to see what’s most at risk

It’s been two years since the Inflation Reduction Act was passed, ushering in hundreds of billions of dollars in climate investment. Read more about the key provisions in this newsletter from August


Another thing

Jennifer Doudna, one of the inventors of the gene-editing tool CRISPR, says the tech could be a major tool to help address climate change and deal with the growing risks of our changing world. 

The hope is that CRISPR’s ability to chop out specific pieces of DNA will make it faster and easier to produce climate-resilient crops and livestock, while avoiding the pitfalls of previous attempts to tweak the genomes of plants and animals. Read the full story from my colleague James Temple.

Keeping up with climate  

Startup Redoxblox is building a technology that’s not exactly a thermal battery, but it’s not not a thermal battery either. The company raised just over $30 million to build its systems, which store energy in both heat and chemical bonds. (Heatmap)

It’s been a weird fall in the US Northeast—a rare drought has brought a string of wildfires, and New York City is seeing calls to conserve water. (New York Times)

It’s been bumpy skies this week for electric-plane startups. Beta Technologies raised over $300 million in funding, while Lilium may be filing for insolvency soon. (Canary Media)

→ The runway for futuristic electric planes is still a long one. (MIT Technology Review)

Meta’s plan to build a nuclear-powered AI data center has been derailed by a rare species of bee living on land earmarked for the project. (Financial Times)

The atmospheric concentration of methane—a powerful greenhouse gas—has been mysteriously climbing since 2007, and that growth nearly doubled in 2020. Now scientists may have finally figured out the culprits: microbes in wetlands that are getting warmer and wetter. (Washington Post)

Greenhouse-gas emissions from the European Union fell by 8% in 2023. The drop is thanks to efforts to shut down coal-fired power plants and generate more electricity from renewables like solar and wind. (The Guardian)

Four electric school buses could help officials figure out how to charge future bus fleets. A project in Brooklyn will aim to use onsite renewables and smart charging to control the costs and grid stress of EV charging depots. (Canary Media)

Azerbaijan Plans Caspian-Black Sea Energy Corridor



Azerbaijan next week will garner much of the attention of the climate tech world, and not just because it will host COP29, the United Nations’ giant annual climate change conference. The country is promoting a grand, multi-nation plan to generate renewable electricity in the Caucasus region and send it thousands of kilometers west, under the Black Sea, and into energy-hungry Europe.

The transcontinental connection would start with wind, solar, and hydropower generated in Azerbaijan and Georgia, and offshore wind power generated in the Caspian Sea. Long-distance lines would carry up to 1.5 gigawatts of clean electricity to Anaklia, Georgia, at the east end of the Black Sea. An undersea cable would move the electricity across the Black Sea and deliver it to Constanta, Romania, where it could be distributed further into Europe.

The scheme’s proponents say this Caspian-Black Sea energy corridor will help decrease global carbon emissions, provide dependable power to Europe, modernize developing economies at Europe’s periphery, and stabilize a region shaken by war. Organizers hope to build the undersea cable within the next six years at an estimated cost of €3.5 billion (US $3.8 billion).

To accomplish this, the governments of the involved countries must quickly overcome a series of technical, financial, and political obstacles. “It’s a huge project,” says Zviad Gachechiladze, a director at Georgian State Electrosystem, the agency that operates the country’s electrical grid, and one of the architects of the Caucasus green-energy corridor. “To put it in operation [by 2030]—that’s quite ambitious, even optimistic,” he says.

Black Sea Cable to Link Caucasus and Europe

The technical linchpin of the plan is the successful construction of a high-voltage direct-current (HVDC) submarine cable in the Black Sea. It’s a formidable task: the cable would stretch across nearly 1,200 kilometers of water, most of it more than 2 km deep and, since Russia’s invasion of Ukraine, littered with floating mines. By contrast, the longest existing submarine power cable—the North Sea Link—carries 1.4 GW across 720 km between England and Norway, at depths of up to 700 meters.

As ambitious as Azerbaijan’s plans sound, longer undersea connections have been proposed. The Australia-Asia PowerLink project aims to produce 6 GW at a vast solar farm in Northern Australia and send about a third of it to Singapore via a 4,300-km undersea cable. The Morocco-U.K. Power Project would send 3.6 GW over 3,800 km from Morocco to England. A similar attempt by Desertec to send electricity from North Africa to Europe ultimately failed.

Building such cables involves laying and stitching together lengths of heavy submarine power cables from specialized ships—the expertise for which lies with just two companies in the world. In an assessment of the Black Sea project’s feasibility, the Milan-based consulting and engineering firm CESI determined that the undersea cable could indeed be built, and estimated that it could carry up to 1.5 GW—enough to supply over 2 million European households.
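As a rough sanity check on that households figure: 1.5 GW running most of the year delivers on the order of 10 terawatt-hours annually. The utilization factor and per-household consumption below are illustrative assumptions, not numbers from CESI:

```python
# Back-of-envelope check of the "over 2 million households" claim.
# ASSUMPTIONS (mine, not CESI's): ~70% average utilization of the
# link and ~4 MWh of electricity per European household per year.
CAPACITY_GW = 1.5
HOURS_PER_YEAR = 8760
UTILIZATION = 0.7
HOUSEHOLD_MWH_PER_YEAR = 4.0

delivered_twh = CAPACITY_GW * HOURS_PER_YEAR * UTILIZATION / 1000
households = delivered_twh * 1e6 / HOUSEHOLD_MWH_PER_YEAR  # TWh -> MWh

print(f"Energy delivered per year: {delivered_twh:.1f} TWh")
print(f"Households supplied:       {households / 1e6:.1f} million")
# -> about 9.2 TWh per year, or roughly 2.3 million households
```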

But to fill that pipe, countries in the Caucasus region would have to generate much more green electricity. For Georgia, that will mostly come from hydropower, which already generates over 80 percent of the nation’s electricity. “We are a hydro country. We have a lot of untapped hydro potential,” says Gachechiladze.

Azerbaijan and Georgia Plan Green Energy Corridor

Generating hydropower can also generate opposition, because of the way dams alter rivers and landscapes. “There were some cases when investors were not able to construct power plants because of opposition of locals or green parties” in Georgia, says Salome Janelidze, a board member at the Energy Training Center, a Georgian government agency that promotes the country’s energy sector and provides education about it.

“It was definitely a problem and it has not been totally solved,” says Janelidze. But “to me it seems it is doable,” she says. “You can procure and construct if you work closely with the local population and see them as allies rather than adversaries.”

For Azerbaijan, most of the electricity would be generated by wind and solar farms funded by foreign investment. Masdar, the renewable-energy developer of the United Arab Emirates government, has been investing heavily in wind power in the country. In June, the company broke ground on a trio of wind and solar projects with 1 GW capacity. It intends to develop up to 9 GW more in Azerbaijan by 2030. ACWA Power, a Saudi power-generation company, plans to complete a 240-MW solar plant in the Absheron and Khizi districts of Azerbaijan next year and has struck a deal with the Azerbaijani Ministry of Energy to install up to 2.5 GW of offshore and onshore wind.

CESI is currently running a second study to gauge the practicality of the full breadth of the proposed energy corridor—from the Caspian Sea to Europe—with a transmission capacity of 4 to 6 GW. But that beefier interconnection will likely remain out of reach in the near term. “By 2030, we can’t claim our region will provide 4 GW or 6 GW,” says Gachechiladze. “1.3 is realistic.”

COP29: Azerbaijan’s Renewable Energy Push

Signs of political support have surfaced. In September, Azerbaijan, Georgia, Romania, and Hungary created a joint venture, based in Romania, to shepherd the project. Those four countries in 2022 inked a memorandum of understanding with the European Union to develop the energy corridor.

The involved countries are in the process of applying for the cable to be selected as an EU “project of mutual interest,” making it an infrastructure priority for connecting the union with its neighbors. If selected, “the project could qualify for 50 percent grant financing,” says Gachechiladze. “It’s a huge budget. It will improve drastically the financial condition of the project.” The commissioner responsible for EU enlargement policy projected that the union would pay an estimated €2.3 billion ($2.5 billion) toward building the cable.

Whether next week’s COP29, held in Baku, Azerbaijan, will help move the plan forward remains to be seen. In preparation for the conference, advocates of the energy corridor have been taking international journalists on tours of the country’s energy infrastructure.

Looming over the project are security issues that threaten to thwart it. Shipping routes in the Black Sea have become less dependable and safe since Russia’s invasion of Ukraine. To the south, tensions between Armenia and Azerbaijan remain after the recent war and ethnic violence.

In order to improve relations, many advocates of the energy corridor would like to include Armenia. “The cable project is in the interests of Georgia, it’s in the interests of Armenia, it’s in the interests of Azerbaijan,” says Agha Bayramov, an energy geopolitics researcher at the University of Groningen, in the Netherlands. “It might increase the chance of them living peacefully together. Maybe they’ll say, ‘We’re responsible for European energy. Let’s put our egos aside.’”

New Carrier Fluid Makes Hydrogen Way Easier to Transport



Imagine pulling up to a refueling station and filling your vehicle’s tank with liquid hydrogen, as safe and convenient to handle as gasoline or diesel, without the need for high-pressure tanks or cryogenic storage. This vision of a sustainable future could become a reality if a Calgary, Canada–based company, Ayrton Energy, can scale up its innovative method of hydrogen storage and distribution. Ayrton’s technology could make hydrogen a viable, one-to-one replacement for fossil fuels in existing infrastructure like pipelines, fuel tankers, rail cars, and trucks.

The company’s approach is to use liquid organic hydrogen carriers (LOHCs) to make it easier to transport and store hydrogen. The method chemically bonds hydrogen to carrier molecules, which absorb hydrogen molecules and make them more stable—kind of like hydrogenating cooking oil to produce margarine.

A researcher pours a sample of Ayrton’s LOHC fluid into a vial.
Ayrton Energy

The approach would allow liquid hydrogen to be transported and stored in ambient conditions, rather than in the high-pressure, cryogenic tanks (to hold it at temperatures below −252 °C) currently required for keeping hydrogen in liquid form. It would also be a big improvement on gaseous hydrogen, which is highly volatile and difficult to keep contained.

Founded in 2021, Ayrton is one of several companies across the globe developing LOHCs, including Japan’s Chiyoda and Mitsubishi, Germany’s Covalion, and China’s Hynertech. But toxicity, energy density, and input energy issues have limited LOHCs as contenders for making liquid hydrogen feasible. Ayrton says its formulation eliminates these trade-offs.

Safe, Efficient Hydrogen Fuel for Vehicles

Conventional LOHC technologies used by most of the aforementioned companies rely on substances such as toluene, which forms methylcyclohexane when hydrogenated. These carriers pose safety risks due to their flammability and volatility. Hydrogenious LOHC Technologies in Erlangen, Germany, and other hydrogen fuel companies have shifted toward dibenzyltoluene, a more stable carrier that holds more hydrogen per unit volume than methylcyclohexane, though it requires higher temperatures (and thus more energy) to bind and release the hydrogen. Dibenzyltoluene hydrogenation occurs at between 3 and 10 megapascals (30 and 100 bar) and 200–300 °C, compared with 10 MPa (100 bar) and just under 200 °C for methylcyclohexane.
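For concreteness, the conventional toluene cycle works like this (standard chemistry, not a description of Ayrton’s proprietary carrier): each carrier molecule reversibly binds three hydrogen molecules.

```latex
% Toluene <-> methylcyclohexane, the conventional LOHC cycle:
% hydrogenation stores the H2; dehydrogenation releases it.
\mathrm{C_7H_8}\ (\text{toluene}) \;+\; 3\,\mathrm{H_2}
\;\rightleftharpoons\;
\mathrm{C_7H_{14}}\ (\text{methylcyclohexane})
```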

Ayrton’s proprietary oil-based hydrogen carrier not only captures and releases hydrogen with less input energy than is required for other LOHCs, but also stores more hydrogen than methylcyclohexane can—55 kilograms per cubic meter compared with methylcyclohexane’s 50 kg/m³. Dibenzyltoluene holds more hydrogen per unit volume (up to 65 kg/m³), but Ayrton’s approach to infusing the carrier with hydrogen atoms promises to cost less. Hydrogenation or dehydrogenation with Ayrton’s carrier fluid occurs at 0.1 megapascal (1 bar) and about 100 °C, says founder and CEO Natasha Kostenuk. And as with the other LOHCs, after hydrogenation it can be transported and stored at ambient temperatures and pressures.
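Translating those volumetric figures into energy terms makes the comparison concrete. The script below is illustrative; the 33.3 kWh/kg figure is hydrogen’s lower heating value, a standard constant supplied here rather than taken from the article:

```python
# Energy stored per cubic meter of carrier fluid, using the hydrogen
# densities quoted above and hydrogen's lower heating value.
H2_LHV_KWH_PER_KG = 33.3  # standard constant, not from the article

carriers_kg_per_m3 = {
    "methylcyclohexane": 50,  # kg H2 per m^3 (from the article)
    "Ayrton carrier": 55,
    "dibenzyltoluene": 65,
}

for name, density in carriers_kg_per_m3.items():
    mwh_per_m3 = density * H2_LHV_KWH_PER_KG / 1000
    print(f"{name:18s} {density:3d} kg/m^3  ~ {mwh_per_m3:.2f} MWh/m^3")
```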

“Judges described [Ayrton’s approach] as a critical technology for the deployment of hydrogen at large scale.” —Katie Richardson, National Renewable Energy Lab

Ayrton’s LOHC fluid is as safe to handle as margarine, but it’s still a chemical, says Kostenuk. “I wouldn’t drink it. If you did, you wouldn’t feel very good. But it’s not lethal,” she says.

Kostenuk and fellow Ayrton cofounder Brandy Kinkead (who serves as the company’s chief technical officer) were originally trying to bring hydrogen generators to market to fill gaps in the electrical grid. “We were looking for fuel cells and hydrogen storage. Fuel cells were easy to find, but we couldn’t find a hydrogen storage method or medium that would be safe and easy to transport to fuel our vision of what we were trying to do with hydrogen generators,” Kostenuk says. During the search, they came across LOHC technology but weren’t satisfied with the trade-offs demanded by existing liquid hydrogen carriers. “We had the idea that we could do it better,” she says. The duo pivoted, adjusting their focus from hydrogen generators to hydrogen storage solutions.

“Everybody gets excited about hydrogen production and hydrogen end use, but they forget that you have to store and manage the hydrogen,” Kostenuk says. Incompatibility with current storage and distribution has been a barrier to adoption, she says. “We’re really excited about being able to reuse existing infrastructure that’s in place all over the world.” Ayrton’s hydrogenated liquid has fuel-cell-grade (99.999 percent) hydrogen purity, so there’s no advantage in using pure liquid hydrogen with its need for subzero temperatures, according to the company.

The main challenge the company faces is the set of issues that come along with any technology scaling up from pilot-stage production to commercial manufacturing, says Kostenuk. “A crucial part of that is aligning ourselves with the right manufacturing partners along the way,” she notes.

Asked about how Ayrton is dealing with some other challenges common to LOHCs, Kostenuk says Ayrton has managed to sidestep them. “We stayed away from materials that are expensive and hard to procure, which will help us avoid any supply chain issues,” she says. By performing the reactions at such low temperatures, Ayrton can get its carrier fluid to withstand 1,000 hydrogenation-dehydrogenation cycles before it no longer holds enough hydrogen to be useful. Conventional LOHCs are limited to a couple of hundred cycles before the high temperatures required for bonding and releasing the hydrogen break down the fluid and diminish its storage capacity, Kostenuk says.

Breakthrough in Hydrogen Storage Technology

In acknowledgement of what Ayrton’s nontoxic, oil-based carrier fluid could mean for the energy and transportation sectors, the U.S. National Renewable Energy Lab (NREL) at its annual Industry Growth Forum in May named Ayrton an “outstanding early-stage venture.” A selection committee of more than 180 climate tech and cleantech investors and industry experts chose Ayrton from a pool of more than 200 initial applicants, says Katie Richardson, group manager of NREL’s Innovation and Entrepreneurship Center, which organized the forum. The committee based its decision on the company’s innovation, market positioning, business model, team, next steps for funding, technology, capital use, and quality of pitch presentation. “Judges described Ayrton’s approach as a critical technology for the deployment of hydrogen at large scale,” Richardson says.

As a next step toward enabling hydrogen to push gasoline and diesel aside, “we’re talking with hydrogen producers who are right now offering their customers cryogenic and compressed hydrogen,” says Kostenuk. “If they offered LOHC, it would enable them to deliver across longer distances, in larger volumes, in a multimodal way.” The company is also talking to some industrial site owners who could use the hydrogenated LOHC for buffer storage to hold onto some of the energy they’re getting from clean, intermittent sources like solar and wind. Another natural fit, she says, is energy service providers that are looking for a reliable method of seasonal storage beyond what batteries can offer. The goal is to eventually scale up enough to become the go-to alternative (or perhaps the standard) fuel for cars, trucks, trains, and ships.

Inside a fusion energy facility

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

On an overcast day in early October, I picked up a rental car and drove to Devens, Massachusetts, to visit a hole in the ground.

Commonwealth Fusion Systems has raised over $2 billion in funding since it spun out of MIT in 2018, all in service of building the first commercial fusion reactor. The company has ambitions to build power plants, but currently the goal is to finish putting together its first demonstration system, the SPARC reactor. The plan is to have it operating by 2026.

I visited the company’s site recently to check in on progress. Things are starting to come together around the hole in the floor where SPARC will eventually be installed. Looking around the site, I found it becoming easier to imagine a future that could actually include fusion energy. But there’s still a lot of work left to do. 

Fusion power has been a dream for decades. The idea is simple: Slam atoms together and use the energy that’s released to power the world. The systems would require small amounts of abundant fuel and wouldn’t produce dangerous waste. The problem is, executing this vision has been much slower than many had hoped.

Commonwealth is one of the leaders in commercial fusion. My colleague James Temple wrote a feature story, published in early 2022, about the company’s attempts to bring the technology to reality. At the time, the Devens location was still a muddy construction site, with the steel and concrete just starting to go into the ground.

Things are much more polished now—when I visited earlier this month, I pulled into one of the designated visitor parking spots and checked in at a reception desk in a bustling office building before beginning my tour. There were two main things to see: the working magnet factory and the cluster of buildings that will house and support the SPARC reactor.

We started in the magnet factory. SPARC is a tokamak, a device relying on powerful magnets to contain the plasma where fusion reactions take place. There will be three different types of magnets in SPARC, all arranged to keep the plasma in position and moving around in the right way.

The company is making its own magnets powered with tape made from a high-temperature superconductor, which generates a magnetic field when an electric current runs through it. SPARC will contain thousands of miles’ worth of this tape in its magnets. In the factory, specialized equipment winds up the tape and tucks it into metal cases, which are then stacked together and welded into protective shells.  

After our quick loop around the magnet factory, I donned a helmet, neon vest, and safety glasses and got a short safety talk that included a stern warning to not stare directly at any welding. Then we walked across a patio and down a gravel driveway to the main complex of buildings that will house the SPARC reactor.

Except for some remaining plywood stairs and dust, the complex appeared to be nearly completed. There’s a huge wall of glass on the front of the building—a feature intended to show that the company is open with the community about the goings-on inside, as my tour guide, chief marketing officer Joe Paluska, put it.  

Four main buildings surround the central tokamak hall. These house support equipment needed to cool down the magnets, heat up the plasma, and measure conditions in the reactor. Most of these big, industrial systems that support SPARC are close to being ready to turn on or are actively being installed, explained Alex Creely, director of tokamak operations, in a call after my tour.

When it was finally time to see the tokamak hall that will house SPARC, we had to take a winding route to get there. A maze of concrete walls funneled us to the entrance, and I lost track of my left and right turns. Called the labyrinth, this is a safety feature, designed to keep stray neutrons from escaping the hall once the reactor is operating. (Neutrons are a form of radiation, and enough exposure can be dangerous to humans.) 

Finally, we stepped into a cavernous space. From our elevated vantage point on a metal walkway, we peered down into a room with gleaming white floors and equipment scattered around the perimeter. At the center was a hole, covered with a tarp and surrounded by bright-yellow railings. That empty slot is where the star of the show, SPARC, will eventually be installed.

The tokamak hall at Commonwealth Fusion Systems will house the company’s SPARC reactor.
COMMONWEALTH FUSION SYSTEMS

While there’s still very little tokamak in the tokamak hall right now, Commonwealth has an ambitious timeline planned: The goal is to have SPARC running and the first plasma in the reactor by 2026. The company plans to demonstrate that it can produce more energy in the reactor than is needed to power it (a milestone known as Q>1 in the fusion world) by 2027.
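In symbols, that milestone is the fusion gain factor, a standard quantity across the field rather than anything specific to SPARC:

```latex
% Fusion gain: fusion power produced divided by the external heating
% power delivered to the plasma. Q > 1 means the plasma releases more
% fusion energy than was injected to heat it.
Q \;=\; \frac{P_{\text{fusion}}}{P_{\text{heating}}} \;>\; 1
```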

When we published our 2022 story on Commonwealth, the plan was to flip on the reactor and reach the Q>1 milestone by 2025, so the timeline has slipped. It’s not uncommon for big projects in virtually every industry to take longer than expected. But there’s an especially long and fraught history of promises and missed milestones in fusion. 

Commonwealth has certainly made progress over the past few years, and it’s getting easier to imagine the company actually turning on a reactor and meeting the milestones the field has been working toward for decades. But there’s still a tokamak-shaped hole in suburban Massachusetts waiting to be filled. 


Now read the rest of The Spark

Related reading

Read our 2022 feature on Commonwealth Fusion Systems and its path to commercializing fusion energy here

In late 2022, a reactor at a national lab in the US generated more energy than was put in, a first for the industry. Here’s what meeting that milestone actually means for clean energy

There’s still a lot of research to be done in fusion—here’s what’s coming next

Another company called Helion says its first fusion power plant is five years away. Experts are skeptical, to say the least.

AI e-waste
PHOTO ILLUSTRATION BY SARAH ROGERS/MITTR | PHOTOS GETTY

Another thing

Generative AI will add to our growing e-waste problem. A new study estimates that AI could add up to 5 million tons of e-waste by 2030. 

It’s a small fraction of the total, but there’s still good reason to think carefully about how we handle discarded servers and high-performance computing equipment, according to experts. Read more in my latest story

Keeping up with climate  

New York City will buy 10,000 induction stoves from a startup called Copper. The stoves will be installed in public housing in the city. (Heatmap)

Demand is growing for electric cabs in India, but experts say there’s not nearly enough supply to meet it. (Rest of World)

Pivot Bio aims to tweak the DNA of bacteria so they can help deliver nutrients to plants. The company is trying to break into an industry dominated by massive agriculture and chemical companies. (New York Times)

→ Check out our profile of Pivot Bio, which was one of our 15 Climate Tech Companies to Watch this year. (MIT Technology Review)

At least 62 people are dead and many more are missing in dangerous flooding across Spain. (Washington Post)

A massive offshore wind lease sale this week offered up eight patches of ocean off the coast of Maine in the US. Four sold, opening the door for up to 6.8 gigawatts of additional offshore wind power. (Canary Media)

Climate change contributed to the deaths of 38,000 people across Europe in the summer of 2022, according to a new study. (The Guardian)

→ The legacy of Europe’s heat waves will be more air-conditioning, and that could be its own problem. (MIT Technology Review)

There are nearly 9,000 public fast-charging sites in the US, and a surprising wave of installations in the Midwest and Southeast. (Bloomberg)

Some proposed legislation aims to ban factory farming, but determining what that category includes is way more complicated than you might think. (Ambrook Research)

Nuclear Fusion’s New Idea: An Off-the-Shelf Stellarator



For a machine that’s designed to replicate a star, the world’s newest stellarator is a surprisingly humble-looking apparatus. The kitchen-table-size contraption sits atop stacks of bricks in a cinder-block room at the Princeton Plasma Physics Laboratory (PPPL) in Princeton, N.J., its parts hand-labeled in marker.

The PPPL team invented this nuclear-fusion reactor, completed last year, using mainly off-the-shelf components. Its core is a glass vacuum chamber surrounded by a 3D-printed nylon shell that anchors 9,920 meticulously placed permanent rare-earth magnets. Sixteen copper-coil electromagnets resembling giant slices of pineapple wrap around the shell crosswise.

This article is part of our special report, “Reinventing Invention: Stories from Innovation’s Edge.”

The arrangement of magnets forms the defining feature of a stellarator: an entirely external magnetic field that directs charged particles along a spiral path to confine a superheated plasma. Within this enigmatic fourth state of matter, atoms that have been stripped of their electrons collide, their nuclei fusing and releasing energy in the same process that powers the sun and other stars. Researchers hope to capture this energy and use it to produce clean, zero-carbon electricity.
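The spiral path itself is textbook charged-particle motion, not anything unique to this device: a particle of mass m and charge q moving with speed v⊥ across a magnetic field of strength B circles the field line with the gyroradius below, so stronger fields wind particles into tighter, better-confined spirals.

```latex
% Gyro (Larmor) radius: the radius of a charged particle's circular
% motion around a magnetic-field line.
r_L = \frac{m\,v_\perp}{|q|\,B}
```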

PPPL’s new reactor is the first stellarator built at this government lab in 50 years. It’s also the world’s first stellarator to employ permanent magnets, rather than just electromagnets, to coax plasma into an optimal three-dimensional shape. Costing only US $640,000 and built in less than a year, the device stands in contrast to prominent stellarators like Germany’s Wendelstein 7-X, a massive, tentacled machine that took $1.1 billion and more than 20 years to construct.

Sixteen copper-coil electromagnets resembling giant slices of pineapple wrap around the stellarator’s shell.
Jayme Thornton

PPPL researchers say their simpler machine demonstrates a way to build stellarators far more cheaply and quickly, allowing researchers to easily test new concepts for future fusion power plants. The team’s use of permanent magnets may not be the ticket to producing commercial-scale energy, but PPPL’s accelerated design-build-test strategy could crank out new insights on plasma behavior that could push the field forward more rapidly.

Indeed, the team’s work has already spurred the formation of two stellarator startups that are testing their own PPPL-inspired designs, which their founders hope will lead to breakthroughs in the quest for fusion energy.

Are Stellarators the Future of Nuclear Fusion?

The pursuit of energy production through nuclear fusion is considered by many to be the holy grail of clean energy. And it’s become increasingly important as a rapidly warming climate and soaring electricity demand have made the need for stable, carbon-free power ever more acute. Fusion offers the prospect of a nearly limitless source of energy with no greenhouse gas emissions. And unlike conventional nuclear fission, fusion comes with no risk of meltdowns or weaponization, and no long-lived nuclear waste.

Fusion reactions have powered the sun since it formed an estimated 4.6 billion years ago, but they have never served to produce usable energy on Earth, despite decades of effort. The problem isn’t whether fusion can work. Physics laboratories and even a few individuals have successfully fused the nuclei of hydrogen, liberating energy. But to produce more power than is consumed in the process, simply fusing atoms isn’t enough.

Fueled by free pizza, grad students meticulously placed 9,920 permanent rare-earth magnets inside the stellarator’s 3D-printed nylon shell.
Jayme Thornton

The past few years have brought eye-opening advances from government-funded fusion programs such as PPPL and the Joint European Torus, as well as private companies. Enabled by gains in high-speed computing, artificial intelligence, and materials science, nuclear physicists and engineers are toppling longstanding technical hurdles. And stellarators, a once-overlooked approach, are back in the spotlight.

“Stellarators are one of the most active research areas now, with new papers coming out just about every week,” says Scott Hsu, the U.S. Department of Energy’s lead fusion coordinator. “We’re seeing new optimized designs that we weren’t capable of coming up with even 10 years ago. The other half of the story that’s just as exciting is that new superconductor technology and advanced manufacturing capabilities are making it more possible to actually realize these exquisite designs.”

Why Is Plasma Containment Important in Fusion Energy?

For atomic nuclei to fuse, the nuclei must overcome their natural electrostatic repulsion. Extremely high temperatures—in the millions of degrees—will get the particles moving fast enough to collide and fuse. Deuterium and tritium, isotopes of hydrogen with, respectively, one and two neutrons in their nuclei, are the preferred fuels for fusion because their nuclei can overcome the repulsive forces more easily than those of heavier atoms.

Heating these isotopes to the required temperatures strips electrons from the atomic nuclei, forming a plasma: a maelstrom of positively charged nuclei and negatively charged electrons. The trick is keeping that searingly hot plasma contained so that some of the nuclei fuse.
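The workhorse reaction, in standard textbook form (values that apply to any deuterium-tritium machine, not just those in this story):

```latex
% Deuterium-tritium fusion: about 17.6 MeV released per reaction,
% most of it carried off by the neutron.
\mathrm{^{2}_{1}D} \;+\; \mathrm{^{3}_{1}T} \;\rightarrow\;
\mathrm{^{4}_{2}He}\;(3.5\ \mathrm{MeV}) \;+\; n\;(14.1\ \mathrm{MeV})
```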

Currently, there are two main approaches to containing plasma. Inertial confinement uses high-energy lasers or ion beams to rapidly compress and heat a small fuel pellet. Magnetic confinement uses powerful magnetic fields to guide the charged particles along magnetic-field lines, preventing these particles from drifting outward.

Many magnetic-confinement designs—including the $24.5 billion ITER reactor under construction since 2010 in the hills of southern France—use an internal current flowing through the plasma to help to shape the magnetic field. But this current can create instabilities, and even small instabilities in the plasma can cause it to escape confinement, leading to energy losses and potential damage to the hardware.

Stellarators like PPPL’s are a type of magnetic confinement, with a twist.

How the Stellarator Was Born

Located at the end of Stellarator Road and a roughly 5-kilometer drive from Princeton University’s leafy campus, PPPL is one of 17 U.S. Department of Energy labs, and it employs about 800 scientists, engineers, and other workers. Hanging in PPPL’s lobby is a black-and-white photo of the lab’s founder, physicist Lyman Spitzer, smiling as he shows off the fanciful-looking apparatus he invented and dubbed a stellarator, or “star generator.”

According to the lab’s lore, Spitzer came up with the idea while riding a ski lift at Aspen Mountain in 1951. Enrico Fermi had observed that a simple toroidal, or doughnut-shaped, magnetic-confinement system wouldn’t be sufficient to contain plasma for nuclear fusion because the charged particles would drift outward and escape confinement.

“This technology is designed to be a stepping stone toward a fusion power plant.”

Spitzer determined that a figure-eight design with external magnets could create helical magnetic-field lines that would spiral around the plasma and more efficiently control and contain the energetic particles. That configuration, Spitzer reasoned, would be efficient enough that it wouldn’t require large currents running through the plasma, thus reducing the risk of instabilities and allowing for steady-state operation.

“In many ways, Spitzer’s brilliant idea was the perfect answer” to the problems of plasma confinement, says Steven Cowley, PPPL’s director since 2018. “The stellarator offered something that other approaches to fusion energy couldn’t: a stable plasma field that can sustain itself without any internal current.”

Spitzer’s stellarator quickly captured the imagination of midcentury nuclear physicists and engineers. But the invention was ahead of its time.

Tokamaks vs. Stellarators

The stellarator’s lack of toroidal symmetry made it challenging to build. The external magnetic coils needed to be precisely engineered into complex, three-dimensional shapes to generate the twisted magnetic fields required for stable plasma confinement. In the 1950s, researchers lacked the high-performance computers needed to design optimal three-dimensional magnetic fields and the engineering capability to build machines with the requisite precision.

Meanwhile, physicists in the Soviet Union were testing a new configuration for magnetically confined nuclear fusion: a doughnut-shaped device called a tokamak—a Russian acronym that stands for “toroidal chamber with magnetic coils.” Tokamaks bend an externally applied magnetic field into a helical field inside by sending a current through the plasma. They seemed to be able to produce plasmas that were hotter and denser than those produced by stellarators. And compared with the outrageously complex geometry of stellarators, the symmetry of the tokamaks’ toroidal shape made them much easier to build.

Lyman Spitzer in the early 1950s built the first stellarator, using a figure-eight design and external magnets.
PPPL

Following the lead of other nations’ fusion programs, the DOE shifted most of its fusion resources to tokamak research. PPPL converted Spitzer’s Model C stellarator into a tokamak in 1969.

Since then, tokamaks have dominated fusion-energy research. But by the late 1980s, the limitations of the approach were becoming more apparent. In particular, the currents that run through a tokamak’s plasma to stabilize and heat it are themselves a source of instabilities as the currents get stronger.

To force the restive plasma into submission, the geometrically simple tokamaks need additional features that increase their complexity and cost. Advanced tokamaks—there are about 60 currently operating—have systems for heating and controlling the plasma and massive arrays of magnets to create the confining magnetic fields. They also have cryogenics to cool the magnets to superconducting temperatures a few meters away from a 150 million °C plasma.

Tokamaks thus far have produced energy only in short pulses. “After 70 years, nobody really has even a good concept for how to make a steady-state tokamak,” notes Michael Zarnstorff, a staff research physicist at PPPL. “The longest pulse so far is just a few minutes. When we talk to electric utilities, that’s not actually what they want to buy.”

Computational Power Revives the Stellarator

With tokamaks gobbling up most of the world’s public fusion-energy funds, stellarator research lay mostly dormant until the 1980s. Then, some theorists started to put increasingly powerful computers to work to help them optimize the placement of magnetic coils to more precisely shape the magnetic fields.

The effort got a boost in 1981, when then-PPPL physicist Allen Boozer invented a coordinate system—known in the physics community as Boozer coordinates—that helps scientists understand how different configurations of magnets affect magnetic fields and plasma confinement. They can then design better devices to maintain stable plasma conditions for fusion. Boozer coordinates can also reveal hidden symmetries in the three-dimensional magnetic-field structure, which aren’t easily visible in other coordinate systems. These symmetries can significantly improve plasma confinement, reduce energy losses, and make the fusion process more efficient.
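One compact way to state the payoff, borrowed from the stellarator literature as a gloss rather than from PPPL’s own materials: in Boozer coordinates, a field is quasi-symmetric when its strength depends on the two angles only through a single helical combination.

```latex
% Quasi-symmetry in Boozer coordinates: psi labels the flux surface;
% theta and zeta are the poloidal and toroidal Boozer angles; M and N
% are fixed integers. Quasi-axisymmetry (as in NCSX and Muse) is N = 0.
|\mathbf{B}| = B(\psi,\; M\theta - N\zeta)
```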

“We’re seeing new optimized designs we weren’t capable of coming up with 10 years ago.”

“The accelerating computational power finally allowed researchers to challenge the so-called fatal flaw of stellarators: the lack of toroidal symmetry,” says Boozer, who is now a professor of applied physics at Columbia University.

The new insights gave rise to stellarator designs that were far more complex than anything Spitzer could have imagined [see sidebar, “Trailblazing Stellarators”]. Japan’s Large Helical Device came online in 1998 after eight years of construction. The University of Wisconsin’s Helically Symmetric Experiment, whose magnetic-field coils featured an innovative quasi-helical symmetry, took nine years to build and began operation in 1999. And Germany’s Wendelstein 7-X—the largest and most advanced stellarator ever built—produced its first plasma in 2015, after more than 20 years of design and construction.

Experiment Failure Leads to New Stellarator Design

In the late 1990s, PPPL physicists and engineers began designing their own version, called the National Compact Stellarator Experiment (NCSX). Envisioned as the world’s most advanced stellarator, it employed a new magnetic-confinement concept called quasi-axisymmetry—a compromise that mimics the symmetry of a tokamak while retaining the stability and confinement benefits of a stellarator by using only externally generated magnetic fields.

“We tapped into every supercomputer we could find,” says Zarnstorff, who led the NCSX design team, “performing simulations of hundreds of thousands of plasma configurations to optimize the physics properties.”

Three Ways to Send Atoms on a Fantastical Helical Ride

An illustration of three different types of stellarators.


But the design was, like Spitzer’s original invention, ahead of its time. Engineers struggled to meet the precise tolerances, which allowed for a maximum variation from assigned dimensions of only 1.5 millimeters across the entire device. In 2008, with the project tens of millions of dollars over budget and years behind schedule, NCSX was canceled. “That was a very sad day around here,” says Zarnstorff. “We got to build all the pieces, but we never got to put it together.”

Now, a segment of the NCSX vacuum vessel—a contorted hunk made from the superalloy Inconel—towers over a lonely corner of the C-Site Stellarator Building on PPPL’s campus. But if its presence is a reminder of failure, it is equally a reminder of the lessons learned from the $70 million project.

For Zarnstorff, the most important insights came from the engineering postmortem. Engineers concluded that, even if they had managed to successfully build and operate NCSX, it was doomed by the lack of a viable way to take the machine apart for repairs or reconfigure the magnets and other components.

With the experience gained from NCSX and PPPL physicists’ ongoing collaborations with the costly, delay-plagued Wendelstein 7-X program, the path forward became clearer. “Whatever we built next, we knew we needed to make it less expensively and more reliably,” says Zarnstorff. “And we knew we needed to build it in a way that would allow us to take the thing apart.”

A Testbed for Fusion Energy

In 2014, Zarnstorff began thinking about building a first-of-its-kind stellarator that would use permanent magnets, rather than electromagnets, to create its helical field, while retaining electromagnets to shape the toroidal field. (Electromagnets generate a magnetic field when an electric current flows through them and can be turned on or off, whereas permanent magnets produce a constant magnetic field without needing an external power source.)

Even the strongest permanent magnets wouldn’t be capable of confining plasma robustly enough to produce commercial-scale fusion power. But they could be used to create a lower-cost experimental device that would be easier to build and maintain. And that, crucially, would allow researchers to easily adjust and test magnetic fields that could inform the path to a power-producing device.

PPPL dubbed the device Muse. “Muse was envisioned as a testbed for innovative magnetic configurations and improving theoretical models,” says PPPL research physicist Kenneth Hammond, who is now leading the project. “Rather than immediate commercial application, it’s more focused on exploring fundamental aspects of stellarator design and plasma behavior.”

The Muse team designed the reactor with two independent sets of magnets. To coax charged particles into a corkscrew-like trajectory, small permanent neodymium magnets are arranged in pairs and mounted to a dozen 3D-printed panels surrounding the glass vacuum chamber, which was custom-made by glass blowers. Adjacent rows of magnets are oriented in opposite directions, twisting the magnetic-field lines at the outside edges.

Outside the shell, 16 electromagnets composed of circular copper coils generate the toroidal part of the magnetic field. These very coils were mass-produced by PPPL in the 1960s, and they have been a workhorse for rapid prototyping in numerous physics laboratories ever since.

“In terms of its ability to confine particles, Muse is two orders of magnitude better than any stellarator previously built,” says Hammond. “And because it’s the first working stellarator with quasi-axisymmetry, we will be able to test some of the theories we never got to test on NCSX.”

The neodymium magnets are a little bigger than a button magnet that might be used to hold a photo to a refrigerator door. Despite their compactness, they pack a remarkable punch. During my visit to PPPL, I turned a pair of magnets in my hands, alternating their polarities, and found it difficult to push them together and pull them apart.

Graduate students did the meticulous work of placing and securing the magnets. “This is a machine built on pizza, basically,” says Cowley, PPPL’s director. “You can get a lot out of graduate students if you give them pizza. There may have been beer too, but if there was, I don’t want to know about it.”

The Muse project was financed by internal R&D funds and used mostly off-the-shelf components. “Having done it this way, I would never choose to do it any other way,” Zarnstorff says.

Stellarex and Thea Energy Advance Stellarator Concepts

Now that Muse has demonstrated that stellarators can be made quickly, cheaply, and highly accurately, companies founded by current and former PPPL researchers are moving forward with Muse-inspired designs.

Zarnstorff recently cofounded a company called Stellarex. He says he sees stellarators as the best path to fusion energy, but he hasn’t landed on a magnet configuration for future machines. “It may be a combination of permanent and superconducting electromagnets, but we’re not religious about any one particular approach; we’re leaving those options open for now.” The company has secured some DOE research grants and is now focused on raising money from investors.

Thea Energy, a startup led by David Gates, who until recently was the head of stellarator physics at PPPL, is further along with its power-plant concept. Like Muse, Thea focuses on simplified manufacture and maintenance. Unlike Muse, the Thea concept uses planar (flat) electromagnetic coils built of high-temperature superconductors.




“The idea is to use hundreds of small electromagnets that behave a lot like permanent magnets, with each creating a dipole field that can be switched on and off,” says Gates. “By using so many individually actuated coils, we can get a high degree of control, and we can dynamically adjust and shape the magnetic fields in real time to optimize performance and adapt to different conditions.”

The company has raised more than $23 million and is designing and prototyping its initial project, which it calls Eos, in Kearny, N.J. “At first, it will be focused on producing neutrons and isotopes like tritium,” says Gates. “The technology is designed to be a stepping stone toward a fusion power plant called Helios, with the potential for near-term commercialization.”

Stellarator Startup Leverages Supercomputing

Of all the private stellarator startups, Type One Energy is the most well funded, having raised $82.5 million from investors that include Bill Gates’s Breakthrough Energy Ventures. Type One’s leaders contributed to the design and construction of both the University of Wisconsin’s Helically Symmetric Experiment and Germany’s Wendelstein 7-X stellarators.

The Type One stellarator design utilizes a highly optimized magnetic-field configuration designed to improve plasma confinement. Optimization can relax the stringent construction tolerances typically required for stellarators, making them easier and more cost-effective to engineer and build.

Type One’s design, like that of Thea Energy’s Eos, makes use of high-temperature superconducting magnets, which provide higher magnetic strength, require less cooling power, and could lower costs and allow for a more compact and efficient reactor. The magnets were designed for a tokamak, but Type One is modifying the coil structure to accommodate the intricate twists and turns of a stellarator.

In a sign that stellarator research may be moving from mainly scientific experiments into the race to field the first commercially viable reactor, Type One recently announced that it will build “the world’s most advanced stellarator” at the Bull Run Fossil Plant in Clinton, Tenn. To construct what it’s calling Infinity One—expected to be operational by early 2029—Type One is teaming up with the Tennessee Valley Authority and the DOE’s Oak Ridge National Laboratory.

“As an engineering testbed, Infinity One will not be producing energy,” says Type One CEO Chris Mowry. “Instead, it will allow us to retire any remaining risks and sign off on key features of the fusion pilot plant we are currently designing. Once the design validations are complete, we will begin the construction of our pilot plant to put fusion electrons on the grid.”

To help optimize the magnetic-field configuration, Mowry and his colleagues are utilizing Summit, one of Oak Ridge’s most powerful supercomputers. Summit is capable of performing more than 200 million times as many operations per second as the supercomputers of the early 1980s, when Wendelstein 7-X was first conceptualized.

AI Boosts Fusion Reactor Efficiency

Advances in computational power are already leading to faster design cycles, greater plasma stability, and better reactor designs. Ten years ago, an analysis of a million different configurations would have taken months; now a researcher can get answers in hours.

And yet, there are an infinite number of ways to make any particular magnetic field. “To find our way to an optimum fusion machine, we may need to consider something like 10 billion configurations,” says PPPL’s Cowley. “If it takes months to make that analysis, even with high-performance computing, that’s still not a route to fusion in a short amount of time.”

In the hope of shortcutting some of those steps, PPPL and other labs are investing in artificial intelligence and using surrogate models that can search and then rapidly home in on promising solutions. “Then, you start running progressively more precise models, which bring you closer and closer to the answer,” Cowley says. “That way we can converge on something in a useful amount of time.”
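
As a concrete, deliberately simplified illustration of that workflow, the sketch below screens a million candidate “configurations” with a cheap polynomial surrogate and spends expensive evaluations only on a shortlist. The toy objective and every name in it are assumptions for illustration, not PPPL’s actual tooling.

```python
# Minimal sketch of surrogate-assisted optimization in the spirit Cowley
# describes. "expensive_model" stands in for a high-fidelity confinement
# simulation; the surrogate is a cheap polynomial fit. Everything here is
# illustrative -- it is not PPPL's actual code.
import numpy as np

rng = np.random.default_rng(0)

def expensive_model(x):
    """Stand-in for a costly physics simulation (lower is better)."""
    return (x - 1.7) ** 2 + 0.1 * np.sin(8 * x)

# 1. Run the expensive model on a small sample of configurations.
x_train = rng.uniform(0, 3, size=12)
y_train = expensive_model(x_train)

# 2. Fit a cheap surrogate (a cubic polynomial) to those samples.
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=3))

# 3. Screen a million candidate configurations with the surrogate -- fast.
candidates = rng.uniform(0, 3, size=1_000_000)
shortlist = candidates[np.argsort(surrogate(candidates))[:5]]

# 4. Spend expensive evaluations only on the shortlist.
best = shortlist[np.argmin(expensive_model(shortlist))]
print(f"best configuration found: {best:.3f}")
```

In a production pipeline the surrogate itself would be refit around the shortlist and the loop repeated, each pass trading cheap approximate searches for a handful of costly, precise evaluations.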

But the biggest remaining hurdles for stellarators, and magnetic-confinement fusion in general, involve engineering challenges rather than physics challenges, say Cowley and other fusion experts. These include developing materials that can withstand extreme conditions, managing heat and power efficiently, advancing magnet technology, and integrating all these components into a functional and scalable reactor.

Over the past half decade, the vibe at PPPL has grown increasingly optimistic, as new buildings go up and new researchers arrive on Stellarator Road to become part of what may be the grandest scientific challenge of the 21st century: enabling a world powered by safe, plentiful, carbon-free energy.

PPPL recently broke ground on a new $110 million office and laboratory building that will house theoretical and computational scientists and support the work in artificial intelligence and high-performance computing that is increasingly propelling the quest for fusion. The new facility will also provide space for research supporting PPPL’s expanded mission into microelectronics, quantum sensors and devices, and sustainability sciences.

PPPL researchers’ quest will take a lot of hard work and, probably, a fair bit of luck. Stellarator Road may be only a mile long, but the path to success in fusion energy will certainly stretch considerably farther.

Trailblazing Stellarators


In contrast to Muse’s relatively simple, low-cost approach, these pioneering stellarators are some of the most technically demanding machines ever built, with intricate electromagnetic coil systems and complex geometries that require precise engineering. The projects have provided valuable insights into plasma confinement, magnetic-field optimization, and the potential for steady-state operation, and moved the scientific community closer to achieving practical and sustainable fusion energy.

Large Helical Device (LHD)




Helically Symmetric Experiment (HSX)




Wendelstein 7-X (W7-X)



This article appears in the November 2024 print issue as “An Off-the-Shelf Stellarator.”

Why agriculture is a tough climate problem to solve

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

As a climate reporter, I’m all too aware of the greenhouse-gas emissions that come from food production. And yet, I’m not a vegan, and I do enjoy a good cheeseburger (at least on occasion). 

It’s a real problem, from a climate perspective at least, that burgers taste good, and so do chicken sandwiches and cheese and just about anything that has butter in it. It can be hard to persuade people to change their eating habits, especially since food is tied up in our social lives and our cultures. 

We could all stand to make some choices that could reduce the emissions associated with the food on our plates. But the longer I write about agriculture and climate, the more I think we’re also going to need to innovate around people’s love for burgers—and fix our food system not just in the kitchen, but on the farm. 

If we lump in everything it takes to get food grown, processed, and transported to us, agriculture accounts for between 20% and 35% of annual global greenhouse-gas emissions. (The range is huge because estimates can vary in what they include and how they account for things like land use, the impact of which is tricky to measure.) 

So when it came time to put together our list of 15 Climate Tech Companies to Watch, which we released earlier this month, we knew we wanted to represent the massive challenge that is our food system. 

We ended up choosing two companies in agriculture for this year’s list, Pivot Bio and Rumin8. My colleague James Temple and I spoke with leaders from both these businesses at our recent Roundtables online event, and it was fascinating to hear from them about the problems they’re trying to solve and how they’re doing it. 

Pivot Bio is using microbes to help disrupt the fertilizer industry. Today, applying nitrogen-based fertilizers to fields is basically like putting gas into a leaky gas tank, as Pivot cofounder Karsten Temme put it at the event. 

Plants rely on nitrogen to grow, but they fail to take up a lot of the nitrogen in fertilizers applied in the field. Since fertilizer requires a ton of energy to produce and can wind up emitting powerful greenhouse gases if plants don’t use it, that’s a real problem.

Pivot Bio uses microbes to help get nitrogen from the air into plants, and the company’s current generation of products can help farmers cut fertilizer use by 25%. 

Rumin8 has its sights set on cattle, making supplements that help them emit less methane, a powerful greenhouse gas. Cows have a complicated digestive system that involves multiple stomachs and a whole lot of microbes that help them digest food. Those microbes produce methane that the cows then burp up. “It’s really rude of them,” quipped Matt Callahan, Rumin8’s cofounder and counsel, at the event. 

In part because of the powerful warming effects of methane, beef is among the worst foods for the climate. Beef can account for up to 10 times more greenhouse-gas emissions than poultry, for example. 

Rumin8 makes an additive that can go into the food or water supply of dairy and beef cattle that can help reduce the methane they burp up. The chemical basically helps the cows use that gas as energy instead, so it can boost their growth—a big benefit to farmers. The company has seen methane reductions as high as 90%, depending on how the cow is getting the supplement (effects aren’t as strong for beef cattle, which often don’t have as close contact with farmers and may not get as strong a dose of the supplement over time as dairy cattle do). 

My big takeaway from our discussion, and from researching and picking the companies on our list this year, is that there’s a huge range of work being done to cut emissions from agriculture on the product side. That’s crucial, because I’m personally skeptical that a significant chunk of the world is going to quickly and voluntarily give up all the tasty but emissions-intensive foods that they’re used to. 

That’s not to say individual choices can’t make a difference. I love beans and lentils as much as the next girl, and we could all stand to make choices that cut down our individual climate impact. And it doesn’t have to be all or nothing. Anyone can choose to eat a little bit less beef specifically, and fewer meat and animal products in general (which tend to be more emissions-intensive than plant-based options). Another great strategy is to focus on cutting down your food waste, which not only reduces emissions but also saves you money. 

But with appetites and budgets for beef and other emissions-intensive foods continuing to grow worldwide, I think we’re also going to need to see a whole lot of innovation that helps lower the emissions of existing food products that we all know and love, including beef. 

There’s no one magic solution that’s going to solve our climate problem in agriculture. The key is going to be both shifting diets through individual and community action and adopting new, lower-emissions options that companies bring to the table. 


Now read the rest of The Spark

Related reading

If you missed our Roundtables event “Producing Climate-Friendly Food,” you can check out the recording here. And for more details on the businesses we mentioned, read our profiles on Pivot Bio and Rumin8 from our 2024 list of 15 Climate Tech Companies to Watch. 

There are also some fascinating climate stories in the new, food-focused issue of our print magazine. 


Another thing

As more EVs hit the roads, there’s a growing concern about battery fires, which are a relatively rare but dangerous occurrence. 

Aspen Aerogels is making super-light materials that can help suppress battery fires, and the company just got a huge boost from the US Department of Energy. Read more about the $670.6 million loan and the details of the technology in my latest story.

Keeping up with climate  

Hurricane Milton disrupted the supply of fresh drinking water, so a Florida hospital deployed a machine to harvest it out of the air. (Wired)

There may be a huge supply of lithium in an underground brine reservoir in Arkansas. Using this source of the crucial battery metal will require companies to scale up new ways of extracting it. (New York Times)

There’s been a flurry of new deals between Big Tech and the nuclear industry, but Amazon is going one step further with its latest announcement. The company is supporting development of a new project rather than just agreeing to step in once electricity is ready. (Heatmap)
→ Here’s why Microsoft is getting involved in a plan to revive a nuclear reactor at Three Mile Island. (MIT Technology Review)

Japan’s most popular rice is in danger because of rising temperatures. Koshihikari rice has a low tolerance for heat, and scientists are racing to breed new varieties that can handle a changing climate. (New York Times)

There are some pretty straightforward solutions that could slash methane emissions from landfills, including requiring more sites to install gas-capture systems. Landfills are the third-largest source of the powerful greenhouse gas. (Canary Media)

Heat pump sales have slowed in the US and stalled in Europe. The technology is struggling in part because of high interest rates, increasing costs, and misinformation about the appliances. (Washington Post)
→ Here’s everything you need to know about how heat pumps work. (MIT Technology Review)

A Note from the Editor

What are we going to eat? It is the eternal question. We humans have been asking ourselves this for as long as we have been human. The question itself can be tedious, exciting, urgent, or desperate, depending on who is asking and where. There are many parts of the world where there is no answer. 

Famine is a critical issue in Gaza, Sudan, Syria, Myanmar, and Mali, among other places. As I write this, there are people going hungry tonight in western North Carolina because of the unprecedented flooding brought on by the aftermath of Hurricane Helene. And even when hunger isn’t an acute issue, it can remain a chronic one. Some 2.3 billion people around the world suffer from food insecurity, according to the World Health Organization. In the United States alone, the USDA has found that more than 47 million people live in food-insecure households. 

This issue is all about food, and more to the point, how we can use technology—high tech and low tech—to feed more people. 

Jonathan W. Rosen explores how some in Africa are tackling hunger by reviving nearly forgotten indigenous crops. These crops are often more resilient to climate change and better suited for the region than some of the more traditional ones embraced by agribusiness. Developing and promoting them could help combat food insecurity across the continent. But as is the case with many such initiatives, a lot hinges on sufficient investment and attention. 

At the high-tech end of the spectrum, Claire L. Evans looks into the startups seeking to create food literally out of thin air. In work based in part on decades-old NASA research, a new generation of researchers is developing carbon-hungry bacteria that will munch on greenhouse gases and grow into edible foodstuff. Yum? 

David W. Brown takes us to Mars—or a small simulacrum of it. If we are ever to spend any time on Mars, we’re going to need to grow our own food there. But there’s a problem. Well, there are a lot of problems! The soil is poisonous, for starters. And we don’t actually have any of it here to experiment with. But if the effort to make that soil arable pays off, it could not only help us bring life to Mars—it could also help support life here on Earth, converting deserts and poisoned wastelands into farmland.

As a reminder that technology is not always the answer, Douglas Main’s cover story takes on the issue of herbicide-resistant weeds. In the past few decades, more and more plants have evolved to develop this type of resistance. Even glyphosate—the chemical in Monsanto’s Roundup, which was initially marketed as being impervious to resistance—has been outpaced by some superweeds in the last 20 years. And the problem is just, well, growing. Nicola Twilley’s research on artificial refrigeration also reveals how technological advances can sometimes harm our food supply even as they help advance it. In our Q&A with her, she explains how the refrigerator has made food safer and more convenient—but at a huge cost in environmental damage (and flavor).

You won’t find only stories on food in this issue. Anna Merlan describes how the new face of AIDS denialism grew out of the choose-your-own-science school of covid vaccine trutherism—and how that movement basically threatens all of public health. Betsy Mason covers fascinating experiments in animal behavior—did you know that sleepy bees are less productive? And from Paolo Bacigalupi we have a new short story I have not stopped thinking about since I first read it. I hope you love it too. 

Super-light materials that help suppress EV battery fires just got a big boost

A company making fire-suppressing battery materials just got a $670.6 million loan commitment from the US Department of Energy.

Aspen Aerogels makes insulating materials that can be layered inside an EV’s battery to prevent or slow heat and fires from spreading within the pack. The company is building a new factory in Georgia to produce its materials, and the DOE’s Loan Programs Office will provide the massive loan to help it finish building the plant. 

As more EVs hit the roads, concern is growing about the relatively rare but dangerous problem of battery fires. While gas-powered cars catch fire at higher rates, battery fires can be harder to put out and are at greater risk of reigniting, creating dangerous situations for drivers and first responders. Materials like Aspen Aerogels’ thermal barriers can help improve battery safety.

“I think the goal is to really make sure that they’re helping to achieve critical battery safety goals that we all share,” says Jigar Shah, director of the Loan Programs Office.

Automakers including General Motors, Toyota, and Audi already buy Aspen Aerogels’ materials to use in their vehicles. If the new factory starts up as planned and ramps to full capacity, it could supply material for over two million EVs annually.

When a lithium-ion battery is damaged or short-circuits, it can go into a process called thermal runaway, a feedback loop of heat and chemical reactions that can lead to a fire or explosion. Electric vehicles’ battery packs are made up of many small battery cells wired together—so there’s a risk that a problem in one cell can spread to the rest of the pack.

The thermal barriers the company makes can be tucked between cells, creating an obstacle that can suppress that spread. Depending on how an automaker uses the materials, aerogel insulation can at a minimum slow down the propagation of thermal runaway, giving a driver enough time to get out of the car. Or automakers can use the materials to design batteries that can confine a bad cell or a group of cells, so “instead of having a car-melting fire, you have a more isolated event,” says Don Young, the company’s CEO. 

Aerogels are very good at insulating to maintain hot or cold temperatures, since they’re mostly made of microscopic pockets of air. Aspen won research grants from NASA to explore the use of its materials for spacesuits and other applications in the early 2000s, and it has sold materials for equipment in facilities including oil refineries and liquefied-natural-gas terminals in the decades since, Young says.

The company began using its aerogels in battery materials in 2021. The start was a partnership with General Motors, Young says—the automaker was having issues with Chevy Bolt batteries catching fire at the time. 

While aerogels can help reduce the severity of battery fires, they can’t entirely prevent thermal runaway. “Currently, we are not aware of any commercial technology that reliably prevents thermal runaway,” says In Taek Song, a researcher at LG Chem and part of a team that recently published research on safety devices for lithium-ion batteries, via email. Lithium-ion batteries contain flammable materials and can store a lot of energy. 

Automakers and battery manufacturers already put some measures in place to lower the risk of thermal runaway, including battery management systems that can detect and control battery conditions to prevent fires before they occur. Thermal insulation materials—including those made with aerogels—are part of the growing arsenal that can limit the damage if thermal runaway does occur.

One potential drawback to those materials is that they add bulk to a battery, which reduces energy density—the amount of energy that a battery can store in a certain volume or weight. Higher energy density translates to longer range for an EV, a crucial selling point for many drivers. The benefit of aerogels is that they’re super-light, since they’re mostly air—so they don’t limit energy density as much as other materials might. 

Aspen’s thermal barriers are typically between one and four millimeters thick and can be stacked between cells. Depending on the automaker and vehicle in question, the cost to incorporate it in an EV runs between $300 and $1,000, Young says. 
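
To get a feel for the trade-off, here is a back-of-the-envelope calculation. The barrier thickness comes from Aspen’s stated one-to-four-millimeter range; the cell numbers are assumed for illustration and are not the company’s specs.

```python
# Rough, illustrative arithmetic: how millimeter-scale barriers between
# cells dilute a pack's volumetric energy density. Cell figures are assumed.
cell_thickness_mm = 12.0   # assumed pouch-cell thickness
cell_energy_wh = 250.0     # assumed energy stored per cell
barrier_mm = 2.0           # within Aspen's stated 1-4 mm range

density_bare = cell_energy_wh / cell_thickness_mm
density_padded = cell_energy_wh / (cell_thickness_mm + barrier_mm)

penalty = 1 - density_padded / density_bare
print(f"energy-density penalty from barriers: {penalty:.0%}")  # about 14%
```

A heavier insulator of the same thickness would impose the same volumetric penalty while also adding weight, which is why the near-weightlessness of aerogels matters.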

A pencil resting on a PyroThin thermal barrier to show its comparative thickness.
COURTESY OF ASPEN AEROGEL

The market is ramping up quickly. When the company began selling its battery materials in 2021, it did roughly $7 million in sales. In 2023 it had reached $110 million, and that’s on track to more than double again in 2024, Young says. 

Aspen Aerogels currently makes materials for EV batteries at its factory in Rhode Island, which also makes materials for other businesses, including the oil and gas industry. “We’re just busting at the seams of that plant,” Young says. The DOE loan will support construction of a new facility in Georgia, which will be entirely dedicated to making material for EV batteries. The plan is to have that facility running by early 2027, Young says. 

“This loan is to really get them at scale for their first commercial facility in Georgia,” Shah says. The company will need to meet certain financial and technical requirements to finalize the funding. 

“This loan is critically important to us, to help us with the completion of that project,” Young says. 

Correction: A caption has been updated to correctly identify the material pictured as a thermal barrier.

The quest to protect farmworkers from extreme heat

On July 21, 2024, temperatures soared in many parts of the world, breaking the record for the hottest day ever recorded on the planet.

The following day—July 22—the record was broken again.

But even as the heat index rises each summer, the people working outdoors to pick fruits, vegetables, and flowers for American tables keep laboring in the sun.

The consequences can be severe, leading to illnesses such as heat exhaustion or heatstroke. Body temperature can rise so high that farmworkers are “essentially … working with fevers,” says Roxana Chicas, an assistant professor at Emory University’s School of Nursing. In one study by Chicas’s research team, most farmworkers tested were chronically dehydrated, even when they drank fluids throughout the day. And many showed signs of developing acute kidney injury after just one workday.

Chicas is part of an Emory research program that has been investigating farmworker health since 2009. Emphasizing collaboration between researchers and community members, the team has spent years working with farmworkers to collect data on kidney function, the risk of heat illness, and the effectiveness of cooling interventions.

The team is now developing an innovative sensor that tracks multiple vital signs, with the goal of anticipating heat illness and issuing an alert before a worker falls ill.

If widely adopted and consistently used, it could represent a way to make workers safer on farms even without significant heat protections. Right now, with limited rules on such protections, workers are often responsible for their own safety. “The United States is primarily focused on educating workers on drinking water [and] the symptoms of heat-related illness,” says Chicas, who leads a field team that tested the sensor in Florida last summer.

The sensor project, a collaboration between Emory and engineers at the Georgia Institute of Technology, got its start in 2022, when the team was awarded a $2.46 million, four-year grant from the National Institute of Environmental Health Sciences. The sensor is now able to continuously measure skin temperature, heart rate, and physical activity. A soft device meant to be worn on the user’s chest, it was designed with farmworkers’ input; it’s not uncomfortable to wear for several hours in the heat, it won’t fall off because of sweat, and it doesn’t interfere with the physical movement necessary to do agricultural work.

To translate the sensor data into useful warnings, the team is now working on building a model to predict the risk of heat-related injury.
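
The team hasn’t published its model, but the general shape of such a system is straightforward: stream features like skin temperature, heart rate, and activity into a classifier, and alert when the predicted risk crosses a threshold. Here is a minimal sketch trained on synthetic data; the features, thresholds, and labels are all assumptions for illustration, not the Emory and Georgia Tech team’s model.

```python
# Illustrative sketch only: a simple classifier mapping wearable-sensor
# readings to heat-illness risk. The features, thresholds, and labels are
# assumptions for illustration -- not the Emory/Georgia Tech team's model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 500

# Synthetic training windows: skin temperature (deg C), heart rate (bpm),
# and physical-activity level (arbitrary 0-1 scale).
X = np.column_stack([
    rng.normal(35.0, 1.2, n),
    rng.normal(110, 20, n),
    rng.uniform(0, 1, n),
])
# Toy labels: flag windows where temperature and heart rate are both elevated.
y = ((X[:, 0] > 36.0) & (X[:, 1] > 120)).astype(int)

model = LogisticRegression().fit(X, y)

# A new reading from the chest sensor; a high probability triggers an alert.
reading = np.array([[36.8, 132.0, 0.7]])
risk = model.predict_proba(reading)[0, 1]
if risk > 0.5:
    print(f"ALERT: estimated heat-illness risk {risk:.0%}")
```

A deployed version would train on real field data labeled with clinical outcomes, and the alert threshold would be tuned to balance missed warnings against alarm fatigue.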

Chicas understands what drives migrant workers to the United States to labor on farms in the hot sun. When she was a child, her own family immigrated to the US to seek work, settling in Georgia. She remembers listening to stories from farmworker family members and friends about how hot it was in the fields—about how they would leave their shifts with headaches.

But because farmworkers are largely from Latin America (63% were born in Mexico) and nearly half are undocumented, “it’s difficult for [them] to speak up about [their] working conditions,” says Chicas. Workers are usually careful not to draw attention that “may jeopardize their livelihoods.”

They’re more likely to do so if they’re backed up by an organization like the Farmworker Association of Florida, which organizes agricultural workers in the state. FWAF has collaborated with the Emory program for more than a decade, recruiting farmworkers to participate in the studies and help guide them. 

There’s “a lot of trust” between those involved in the program, says Ernesto Ruiz, research coordinator at FWAF. Ruiz, who participated in data collection in Florida this past year, says there was a waiting list to take part in the project because there was so much interest—even though participants had to arrive at the break of dawn before a long day of work.


Participants had their vital signs screened in support of the sensor research. They also learned about their blood glucose levels, cholesterol, triglycerides, HDL, and LDL. These readings, Ruiz says, “[don’t] serve any purpose from the standpoint of a predictive variable for heat-related injury.” But community members requested the additional health screenings because farmworkers have little to no access to health care. If health issues are found during the study, FWAF will work to connect workers to health-care providers or free or low-cost clinics.

“Community-based participatory research can’t just be extractive, eliciting data and narratives,” Ruiz says. “It has to give something in return.”

Work on technology to measure heat stress in farmworkers could feed back into policy development. “We need to be able to document empirically, with uncontroversial evidence, the brutal working conditions that farmworking communities face and the toll it takes on their bodies,” Ruiz says.

Though the Biden administration has proposed regulations, there are currently no federal standards in place to protect workers from extreme heat. (Only five states have their own heat standards.) Areas interested in adding protections can face headwinds. In Florida, for example, after Miami-Dade County proposed heat protection standards for outdoor workers, the state passed legislation preventing localities from issuing their own heat rules, pointing to the impact such standards could have on employers.

Meanwhile, temperatures continue to rise. With workers “constantly, chronically” exposed to heat in an environment without protective standards, says Chicas, the sensor could offer its own form of protection. 

Kalena Thomhave is a freelance journalist based in Pittsburgh.

Africa fights rising hunger by looking to foods of the past

The first time the rains failed, the farmers of Kanaani were prepared for it. It was April of 2021, and as climate change had made the weather increasingly erratic, families in the eastern Kenyan village had grown used to saving food from previous harvests. But as another wet season passed with barely any rain, and then another, the community of small homesteads, just off the main road linking Nairobi to the coast of the Indian Ocean, found itself in a full-fledged hunger crisis. 

By the end of 2022, Danson Mutua, a longtime Kanaani resident, counted himself lucky that his farm still had pockets of green: Over the years, he’d gradually replaced much of his maize, the staple crop in Kenya and several other parts of Africa, with more drought-resistant crops. He’d planted sorghum, a tall grass capped with tufts of seeds that look like arrowheads, as well as protein-rich legumes like pigeon peas and green gram, which don’t require any chemical fertilizers and are also prized for fixing nitrogen in soils. Many of his neighbors’ fields were completely parched. Cows, with little to eat themselves, had stopped producing milk; some had started dying. While it was still possible to buy grain at the local market, prices had spiked, and few people had the cash to pay for it. 

Mutua, a father of two, began using his bedroom to secure the little he’d managed to harvest. “If I left it out, it would have disappeared,” he told me from his home in May, 14 months after the rains had finally returned and allowed Kanaani’s farmers to begin recovering. “People will do anything to get food when they’re starving.”

The food insecurity facing Mutua and his neighbors is hardly unique. In 2023, according to the United Nations’ Food and Agriculture Organization, or FAO, an estimated 733 million people around the world were “undernourished,” meaning they lacked sufficient food to “maintain a normal, active, and healthy life.” After falling steadily for decades, the prevalence of global hunger is now on the rise—nowhere more so than in sub-Saharan Africa, where conflicts, economic fallout from the covid-19 pandemic, and extreme weather events linked to climate change pushed the share of the population considered undernourished from 18% in 2015 to 23% in 2023. The FAO estimates that 63% of people in the region are “food insecure”—not necessarily undernourished but unable to consistently eat filling, nutritious meals.

In Africa, like anywhere, hunger is driven by many interwoven factors, not all of which are a consequence of farming practices. Increasingly, though, policymakers on the continent are casting a critical eye toward the types of crops in farmers’ plots, especially the globally dominant and climate-vulnerable grains like rice, wheat, and above all, maize. Africa’s indigenous crops are often more nutritious and better suited to the hot and dry conditions that are becoming more prevalent, yet many have been neglected by science, which means they tend to be more vulnerable to diseases and pests and yield well below their theoretical potential. Some refer to them as “orphan crops” because of this. 

Efforts to develop new varieties of many of these crops, by breeding for desired traits, have been in the works for decades—through state-backed institutions, a continent-wide research consortium, and underfunded scientists’ tinkering with hand-pollinated crosses. Now those endeavors have gotten a major boost: In 2023, the US Department of State, in partnership with the African Union, the FAO, and several global agriculture institutions, launched the Vision for Adapted Crops and Soils, or VACS, a new Africa-focused initiative that seeks to accelerate research and development for traditional crops and help revive the region’s long-depleted soils. VACS, which had received funding pledges worth $200 million as of August, marks an important turning point, its proponents say—not only because it’s pumping an unprecedented flow of money into foods that have long been disregarded but because it’s being driven by the US government, which has often promoted farming policies around the world that have helped entrench maize and other food commodities at the expense of local crop diversity.

It may be too soon to call VACS a true paradigm shift: Maize is likely to remain central to many governments’ farming policies, and the coordinated crop R&D the program seeks to hasten is only getting started. Many of the crops it aims to promote could be difficult to integrate into commercial supply chains and market to growing urban populations, which may be hesitant to start eating like their ancestors. Some worry that crops farmed without synthetic fertilizers and pesticides today will be “improved” in a way that makes farmers more dependent on these chemicals—in turn, raising farm expenses and eroding soil fertility in the long run. Yet for many of the policymakers, scientists, and farmers who’ve been championing crop diversity for decades, this high-level attention is welcome and long overdue.

“One of the things our community has always cried for is how to raise the profile of these crops and get them on the global agenda,” says Tafadzwa Mabhaudhi, a longtime advocate of traditional crops and a professor of climate change, food systems, and health at the London School of Hygiene and Tropical Medicine, who comes from Zimbabwe.

Now the question is whether researchers, governments, and farmers like Mutua can work together in a way that gets these crops onto plates and provides Africans from all walks of life with the energy and nutrition that they need to thrive, whatever climate change throws their way.

A New World addiction

Africa’s love affair with maize, which was first domesticated several thousand years ago in central Mexico, dates to a period known as the Columbian exchange, when the trans-Atlantic flow of plants, animals, metals, diseases, and people—especially enslaved Africans—dramatically reshaped the world economy. The new crop, which arrived in Africa sometime after 1500 along with other New World foods like beans, potatoes, and cassava, was tastier and required less labor than indigenous cereals like millet and sorghum, and under the right conditions it could yield significantly more calories. It quickly spread across the continent, though it didn’t begin to dominate until European powers carved up most of Africa into colonies in the late 19th century. Its uptake was greatest in southern Africa and Kenya, which both had large numbers of white settlers. These predominantly British farmers, tilling land that had often been commandeered from Africans, began adopting new maize varieties that were higher yielding and more suitable for mechanized milling—albeit less nutritious—than both native grains and the types of maize that had been farmed locally since the 16th century. 

“People plant maize, harvest nothing, and still plant maize the next season. It’s difficult to change that mindset.”

Florence Wambugu, CEO, Africa Harvest

Eager to participate in the new market economy, African farmers followed suit; when hybrid maize varieties arrived in the 1960s, promising even higher yields, the binge only accelerated. By 1990, maize accounted for more than half of all calories consumed in Malawi and Zambia and at least 20% of calories eaten in a dozen other African countries. Today, it remains omnipresent—as a flour boiled into a sticky paste; as kernels jumbled with beans, tomatoes, and a little salt; or as fermented dumplings steamed and served inside the husk. Florence Wambugu, CEO of Africa Harvest, a Kenyan organization that helps farmers adopt maize alternatives, says the crop has such cultural significance that many insist on cultivating it even where it often fails. “People plant maize, harvest nothing, and still plant maize the next season,” she says. “It’s difficult to change that mindset.”

Maize and Africa have never been a perfect match. The plant is notoriously picky, requiring nutrient-rich soils and plentiful water at specific moments. Many of Africa’s soils are naturally deficient in key elements like nitrogen and phosphorus. Over time, the fertilizers needed to support hybrid varieties, often subsidized by governments, depleted soils even further. Large portions of Africa’s inhabited areas are also dry or semi-arid, and 80% of farms south of the Sahara are occupied by smallholders, who work plots of 10 hectares or less. On these farms, irrigation can be spatially impractical and often does not make economic sense. 

It would be a stretch to blame Africa’s maize addiction for its most devastating hunger crises. Research by Alex de Waal, an expert in humanitarian disasters at Tufts University, has found that more than three-quarters of global famine deaths between 1870 and 2010 occurred in the context of “conflict or political repression.” That description certainly applies to today’s worst hunger crisis, in Sudan, a country being ripped apart by rival military governments. As of September, according to the UN, more than 8.5 million people in the country were facing “emergency levels of hunger,” and 755,000 were facing conditions deemed “catastrophic.”

Ground egusi seeds, rich in protein and B vitamins, are used in a popular West African soup.
ADAM DETOUR

For most African farmers, though, weather extremes pose a greater risk than conflict. The two-year drought that affected Mutua, for example, has been linked to a narrowing of the cloud belt that straddles the equator, as well as the tendency of land to lose moisture faster in higher temperatures. According to one 2023 study, by a global coalition of meteorologists, these climatic changes made that drought—which contributed to a 22% drop in Kenya’s national maize output and forced a million people from their homes across eastern Africa—100 times more likely. The UN’s Intergovernmental Panel on Climate Change expects yields of maize, wheat, and rice in tropical regions to fall by 5%, on average, for every degree Celsius that the planet heats up. Eastern Africa could be especially hard hit. A rise in global temperatures of 1.5 degrees above preindustrial levels, which scientists believe is likely to occur sometime in the 2030s, is projected to cause maize yields there to drop by roughly one-third from where they stood in 2005.  


Food demand, at the same time, will continue to rise: Sub-Saharan Africa’s population, 1.2 billion now, is expected to surpass 2 billion by 2050, and roughly half of those new people will be born and come of age in cities. Many will grow up on Westernized diets: Young, middle-class residents of Nairobi today are more likely to meet friends for burgers than to eat local dishes like nyama choma, roasted meat typically washed down with bottles of Tusker lager. KFC, seen by many as a status symbol, has franchises in a dozen Kenyan towns and cities; those looking to splurge can dine on sushi crafted from seafood flown in specially from Tokyo. Most, though, get by on simple foods like ugali, a maize porridge often accompanied by collard greens or kale. Although some urban residents consume maize grown on family farms “upcountry,” most of them buy it; when domestic harvests underperform, imports rise and prices spike, and more people go hungry. 

A solution from science?

The push to revive Africa’s indigenous crops is a matter of nutrition as well. An overreliance on maize and other starches is a big reason that nearly a third of children under five in sub-Saharan Africa are stunted—a condition that can affect cognition and immune system functioning for life. Many traditional foods are nutrient dense and have potential to combat key dietary deficiencies, says Enoch Achigan-Dako, a professor of genetics and plant breeding at the University of Abomey-Calavi in Benin. He cites egusi as a prime example. The melon seed, used in a popular West African soup, is rich in protein and the B vitamins the body needs to convert food into energy; it is already a lifeline in many places where milk is not widely available. Breeding new varieties with shorter growth cycles, he says, could make the plant more viable in drier areas. Achigan-Dako also believes that many orphan crops hold untapped commercial potential that could help farmers combat hunger indirectly. 

Increasingly, institutions are embracing similar views. In 2013, the 55-member-state African Union launched the African Orphan Crops Consortium, or AOCC—a collaboration with CGIAR, a global coalition of 15 nonprofit food research institutions, the University of California, Davis, and other partners. The AOCC has since trained more than 150 scientists from 28 African countries in plant breeding techniques through 18-month courses held in Nairobi. It’s also worked to sequence the genomes of 101 understudied crops, in part to facilitate the use of genomic selection. This technique involves correlating observed traits, like drought or pest resistance, with plant DNA, which helps breeders make better-informed crosses and develop new varieties faster. The consortium launched another course last year to train African scientists in the popular gene-editing technique CRISPR, which enables the tweaking of plant DNA directly. While regulatory and licensing hurdles remain, Leena Tripathi, a molecular biologist at CGIAR’s International Institute of Tropical Agriculture (IITA) and a CRISPR course instructor, believes gene-editing tools could eventually play a big role in accelerating breeding efforts for orphan crops. Most exciting, she says, is the promise of mimicking genes for disease resistance that are found in wild plants but not in cultivated varieties available for crossing.
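
In practice, the genomic selection described above boils down to a regression problem: fit marker effects on plants that have been both genotyped and phenotyped, then rank new candidates by predicted trait values. The sketch below uses synthetic data and ridge regression, a common choice when markers far outnumber plants; it illustrates the technique, not the AOCC’s pipeline.

```python
# Minimal sketch of genomic selection: regress an observed trait on
# genome-wide markers, then rank new candidate plants by predicted breeding
# value. All data here is synthetic; real programs use thousands of SNPs
# phenotyped in field trials.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_plants, n_markers = 200, 1000

# SNP genotypes coded 0/1/2 (copies of the alternate allele).
genotypes = rng.integers(0, 3, size=(n_plants, n_markers)).astype(float)

# Toy trait: drought tolerance driven by 20 causal markers plus noise.
effects = np.zeros(n_markers)
effects[rng.choice(n_markers, 20, replace=False)] = rng.normal(0, 1, 20)
phenotype = genotypes @ effects + rng.normal(0, 2, n_plants)

# Fit the marker-effect model on phenotyped plants...
model = Ridge(alpha=10.0).fit(genotypes, phenotype)

# ...then predict breeding values for new, unphenotyped candidates
# and pick the most promising parents for the next cross.
candidates = rng.integers(0, 3, size=(50, n_markers)).astype(float)
ranked = np.argsort(model.predict(candidates))[::-1]
print("top candidates for crossing:", ranked[:5])
```

The payoff is speed: once the model is trained, a seedling can be ranked from a DNA sample alone, without waiting a full growing season to observe the trait.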

For many orphan crops, old-fashioned breeding techniques also hold big promise. Mathews Dida, a professor of plant genetics and breeding at Kenya’s Maseno University and an alumnus of the AOCC’s course in Nairobi, has focused much of his career on the iron-rich grain finger millet. He believes yields could more than double if breeders incorporated a semi-dwarf gene—a technique first used with wheat and rice in the 1960s. That would shorten the plants so that they don’t bend and break when supplied with nitrogen-based fertilizer. Yet money for such projects, which largely comes from foreign grants, is often tight. “The effort we’re able to put in is very erratic,” he says.

VACS, the new US government initiative, was envisioned in part to help plug these sorts of gaps. Its move to champion traditional crops marks a significant pivot. The United States was a key backer of the Green Revolution that helped consolidate the global dominance of rice, wheat, and maize during the 1960s and 1970s. And in recent decades its aid dollars have tended to support programs in Africa that also emphasize the chemical-intensive farming of maize and other commercial staples. 

Change, though, was afoot: In 2021, with hunger on the rise, the African Union explicitly called for “intentional investments towards increased productivity and production in traditional and indigenous crops.” It found a sympathetic ear in Cary Fowler, a longtime biodiversity advocate who was appointed US special envoy for global food security by President Joe Biden in 2022. The 74-year-old Tennessean was a co-recipient of this year’s World Food Prize, agriculture’s equivalent of the Nobel, for his role in establishing the Svalbard Global Seed Vault, a facility in the Norwegian Arctic that holds copies of more than 1.3 million seed samples from around the world. Fowler has argued for decades that the loss of crop diversity wrought by the global expansion of large-scale farming risks fueling future hunger crises.

VACS, which complements the United States’ existing food security initiative, Feed the Future, began by working with the AOCC and other experts to develop an initial list of underutilized crops that were climate resilient and had the greatest potential to boost nutrition in Africa. It pared that list down to a group of 20 “opportunity crops” and commissioned models that assessed their future productivity under different climate-change scenarios. The models predicted net yield gains for many: Carbon dioxide, including that released by burning fossil fuels, is the key input in plant photosynthesis, and in some cases the “fertilization effect” of higher atmospheric CO2 can more than nullify the harmful impact of hotter temperatures. 

According to Fowler’s deputy, Anna Nelson, VACS will now operate as a “broad coalition,” with funds channeled through four core implementing partners. One of them, CGIAR, is spearheading R&D on an initial seven of those 20 crops—pigeon peas, Bambara groundnuts, taro, sesame, finger millet, okra, and amaranth—through partnerships with a range of research institutions and scientists. (Mabhaudhi, Achigan-Dako, and Tripathi are all involved in some capacity.) The FAO is leading an initiative that seeks to drive improvements in soil fertility, in part through tools that help farmers decide where and what to plant on the basis of soil characteristics. While Africa remains VACS’s central focus, activities have also launched or are being planned in Guatemala, Honduras, and the Pacific Community, a bloc of 22 Pacific island states and territories. The idea, Nelson tells me, is that VACS will continue to evolve as a “movement” that isn’t necessarily tied to US funding—or to the priorities of the next occupant of the White House. “The US is playing a convening and accelerating role,” she says. But the movement, she adds, is “globally owned.”

Making farm-to-table work

In some ways, the VACS concept is a unifying one. There’s long been a big and often rancorous divide between those who believe Africa needs more innovation-driven Green Revolution–style agriculture and those promoting ecological approaches, who insist that chemically intensive commercial crops aren’t fit for smallholders. In its focus on seed science as well as crop diversity and soil, VACS has something to offer both. Still, the degree to which the movement can change the direction of Africa’s food production remains an open question. VACS’s initial funding—roughly $150 million pledged by the US and $50 million pledged by other governments as of August—is more than has ever been earmarked for traditional crops and soils at a single moment. The AOCC, by comparison, spent $6.5 million on its plant breeding academy over a decade; as of 2023, its alumni had received a total of $175 million, largely from external grants, to finance crop improvement. Yet enabling orphan crops to reach their full potential, says Allen Van Deynze, the AOCC’s scientific director, who also heads the Seed Biotechnology Center at the University of California, Davis, would require an even bigger scale-up: $1 million per year, ideally, for every type of crop being prioritized in every country, or between $500 million and $1 billion per year across the continent.


Despite the African Union’s support, it remains to be seen if VACS will galvanize African governments to chip in more for crop development themselves. In Kenya, the state-run Agricultural & Livestock Research Organization, or KALRO, has R&D programs for crops such as pigeon peas, green gram, sorghum, and teff. Nonetheless, Wambugu and others say the overall government commitment to traditional crops is tepid—in part because they don’t have a big impact on politics. “If there are shortages of maize, there will be demonstrations,” she says. “But nobody’s going to demonstrate if there’s not enough millet, sorghum, or sweet potato.”

Others express concern that some participants in the VACS movement, including global institutions and private companies, could co-opt long-standing efforts by locals to support traditional crops. Sabrina Masinjila, research and advocacy officer at the African Center for Biodiversity, a Johannesburg-based organization that promotes ecological farming practices and is critical of corporate involvement in Africa’s food systems, sees red flags in VACS’s partnerships with several Western companies. Most concerning, she says, is the support of Bayer, the German biotech conglomerate, for the IITA’s work developing climate-resilient varieties of banana. In 2018 Bayer purchased Monsanto, which had become a global agrochemical giant through the sale of glyphosate, a weed killer the World Health Organization calls “probably carcinogenic,” along with seeds genetically modified to resist it. Monsanto had also long attracted scrutiny for aggressively pursuing claims of seed patent violations against farmers. Masinjila, a Tanzanian, fears that VACS could open the door to multinational companies’ use of African crops’ genetic sequences for their own private interests or to develop varieties that demand application of expensive, environmentally damaging pesticides and fertilizers.

According to Nelson, no VACS-related US funding will go to crop development that results in any private-sector patents. Seeds developed through CGIAR, VACS’s primary crop R&D partner, are considered to be public goods and are generally made available to governments, researchers, and farmers free of charge. Nonetheless, Nelson does not rule out the possibility that some improved varieties might require costlier, non-organic farming methods. “At its core, VACS is about making more options available to farmers,” she says.

While most indigenous-crop advocates I’ve spoken to are excited about VACS’s potential, several cite other likely bottlenecks, including challenges in getting improved varieties to farmers. A 2023 study by Benson Nyongesa, a professor of plant genetics at the University of Eldoret in Kenya, found that 33% of registered varieties of sorghum and 47% of registered varieties of finger millet had not made it into the fields of farmers; instead, he says, they remained “sitting on the shelves of the institutions that developed them.” The problem represents a market failure: Most traditional crops are self- or open-pollinated, which means farmers can save a portion of their harvest to plant as seeds the following year instead of buying new ones. Seed companies, he and others say, are out to make a profit and are generally not interested in commercializing them.

Farmers can access seeds in other ways, sometimes with the help of grassroots organizations. Wambugu’s Africa Harvest, which receives funding from the Mastercard Foundation, provides a “starter pack” of seeds for drought-tolerant crops like sorghum, groundnuts, pigeon peas, and green gram. It also helps its beneficiaries navigate another common challenge: finding markets for their produce. Most smallholders consume a portion of the crops they grow, but they also need cash, and commercial demand isn’t always forthcoming. Part of the reason, says Pamela Muyeshi, owner of Amaica, a Nairobi restaurant specializing in traditional Kenyan fare, is that Kenyans often consider indigenous foods to be “primitive.” This is especially true for those in urban areas who face food insecurity and could benefit from the nutrients these foods offer but often feel pressure to appear modern. Lacking economies of scale, many of these foods remain expensive. To the extent they’re catching on, she says, it’s mainly among the affluent.

""
The global research partnership CGIAR is spearheading R&D on several drought-tolerant crops, including green gram.
ADAM DETOUR

Similar “social acceptability” barriers will need to be overcome in South Africa, says Peter Johnston, a climate scientist who specializes in agricultural adaptation at the University of Cape Town. Johnston believes traditional crops have an important role to play in Africa’s climate resilience efforts, but he notes that no single crop is fully immune to the extreme droughts, floods, and heat waves that have become more frequent and more unpredictable. Crop diversification strategies, he says, will work best if paired with “anticipatory action”—pre-agreed and pre-financed responses, like the distribution of food aid or cash, when certain weather-related thresholds are breached.

Mutua, for his part, is proof that better crop varieties, coupled with a little foresight, can go a long way in the face of crisis. When the drought hit in 2021, his maize didn’t stand a chance. Yields of pigeon peas and cowpeas were well below average. Birds, notorious for feasting on sorghum, were especially ravenous. The savior turned out to be green gram, better known in Kenya by its Swahili name, ndengu. Although native to India, the crop is well suited to eastern Kenya’s sandy soils and semi-arid climate, and varieties bred by KALRO to be larger and faster maturing have helped its yields improve over time. In good years, Mutua sells much of his harvest, but after the first season with barely any rain, he hung onto it; soon, out of necessity, ndengu became a fixture of his family’s diet. On my visit to his farm, he pointed it out with particular reverence: a low-lying plant with slender green pods that radiate like spokes of a bicycle wheel. The crop, Mutua told me, has become so vital to this area that some people consider it their “gold.”

If the movement to revive “forgotten” crops lives up to its promise, other climate-stressed corners of Africa might soon discover their gold equivalent as well.

Jonathan W. Rosen is a journalist who writes about Africa. Evans Kathimbu assisted his reporting from Kenya.

Observers warn the US must do more to boost demand for carbon removal 

In 2022, the US made a massive bet on the carbon removal industry, committing $3.5 billion to build four major regional hubs in an effort to scale up the nascent sector. But industry observers fear that market demand isn’t building fast enough to support it, even with these substantial federal grants and other subsidies. 

Some are now calling for the Department of Energy to redirect a portion of the money earmarked to build direct-air-capture (DAC) plants toward purchases of greenhouse-gas removal instead. At issue is the lack of natural demand for the product that these plants ultimately generate: carbon dioxide that, in most cases, is immediately buried underground. Businesses and organizations that purchase credits representing that CO2 do so only to meet climate neutrality goals, which are mostly self-imposed. Carbon removal proponents worry that without greater government efforts to prop up ongoing demand, some of the facilities funded through the program may not survive—or even be built.

Breakthrough Energy, the Bill Gates–backed climate and clean energy organization, released a commentary today calling for more government support for demand to ensure that the industry doesn’t stall out in its infancy, MIT Technology Review can report.

“You’re essentially totally dependent on a handful of companies willing to pay a very high dollar amount as you try to drive the technology down the cost curve,” says Jack Andreasen, head of carbon management within the policy advocacy arm of Breakthrough Energy. “My fear is we’ll build a bunch of facilities and they’ll just be mothballed because they can’t sell enough credits.” 

The Regional Direct Air Capture Hubs program was funded through the Bipartisan Infrastructure Law, which President Joe Biden signed in late 2021. To date, only a few of the awardees have been selected, none of the projects have been built, and few of the funds have been disbursed, so any stumbles would still be years in the future. But if any of the DOE-backed projects did ultimately fail, it would likely chill investor interest and spark a political backlash, as the Solyndra scandal did in the early 2010s, creating fresh grounds for critics to assail federal support for climate, clean energy, and carbon removal projects. 

“It’s absolutely critical that the DAC Hubs program creates high-quality projects and that the DOE does everything they can to make sure they thrive,” says Giana Amador, executive director of the Carbon Removal Alliance, a nonprofit group that represents the industry. She says the organization has heard from numerous companies that “demand continues to be a challenge for them,” especially for larger-scale projects.

The DOE’s Office of Clean Energy Demonstrations, which oversees the DAC Hubs program, didn’t respond to an inquiry from MIT Technology Review before press time. 

One of the companies that already secured funds through the program, Heirloom, says it is seeing adequate demand for its projects. But in a prepared statement, the company did say that governments will need to step up support in the coming years, noting that according to the UN’s climate panel, the world may need to suck down billions of tons of carbon dioxide a year by 2050 to prevent temperatures from rising past 2 °C over preindustrial levels.

“Achieving that type of scale won’t happen through a voluntary market alone; it will require significant demand-side policy at home and abroad,” the company said.

The hubs

The DOE announced the first set of DAC Hubs grants last summer, revealing that it would provide more than $1 billion to two projects, each with the capacity to suck down a million tons of carbon dioxide per year: Occidental Petroleum’s proposed carbon removal factory in Kleberg County, Texas, and a collaboration between Battelle, Climeworks, and Heirloom to develop facilities in Louisiana. 

As Heatmap previously reported, Heirloom has pre-sold a “substantial” portion of the capacity for the two projects it is now planning in the state to customers including JPMorgan Chase, Klarna, Meta, Microsoft, and Stripe.

Occidental’s first industrial-scale DAC project, the Stratos plant in Ector County, Texas, is expected to come online next year. The company’s 1PointFive subsidiary is developing the project and has announced customers including AT&T, Amazon, Microsoft, and Trafigura.

The company didn’t respond to a question concerning whether it has lined up deals for the separate DAC Hubs–funded project. But Michael Avery, president of 1PointFive, said in a prepared statement: “We’re continuing to see increasing understanding and interest in the importance of highly-durable CDR solutions like direct air capture to address residual emissions across several industries.”

Last month, the DOE’s Office of Clean Energy Demonstrations said it would provide up to $1.6 billion to a variety of additional DAC facilities, as well as the infrastructure that would support them, which might include storage wells and pipelines. 

Notably, the agency significantly reduced the size of the facilities that might qualify for the second tranche of grant funding. Rather than million-ton facilities, the office said, it would likely look for “mid-scale projects” that could remove 2,000 to 25,000 tons of carbon dioxide per year and “large-scale” ones that capture at least 25,000 tons. It also stated that it plans to use some portion of the remaining funds “to support current and future awardees in addressing key barriers or major industry challenges that fall outside the original award scope and budget.” 

Industry observers interpreted that to mean the office was seriously considering the growing calls to provide more demand support for carbon dioxide removal (CDR). That could take the form of direct government procurement of tons of carbon removal that could be applied toward the nation’s goals under the Paris climate agreement or federal subsidies that help defray the cost of corporate purchases.

Andreasen and Amador both said the DOE should allocate up to $500 million from the original $3.5 billion toward such efforts. Repurposing that money may mean building fewer or smaller plants through the DAC Hubs program, but it could increase the odds of success for those that do get developed.

A public good? 

Breakthrough Energy isn’t a disinterested observer. The venture arm of the organization has made multiple investments in the carbon removal industry. For that matter, it’s not unusual for an industry organization, like the Carbon Removal Alliance, to call for governments to bestow tax breaks, subsidies, or other forms of federal assistance on its members.

The US already provides significant support for the industry on top of the DAC Hubs funding, including a subsidy of up to $180 for every ton of carbon dioxide removed by a direct-air-capture plant and then permanently stored underground. 

The DOE’s Office of Fossil Energy and Carbon Management started a pilot effort last year to directly purchase carbon removal, with $35 million in available funding. In May, it revealed a list of 24 semifinalists for the purchase contracts, including Charm Industrial, Climeworks, Ebb Carbon, Heirloom, and others. The office intends to select up to 10 companies that could receive as much as $3 million for the sale of removed carbon dioxide when those tons are delivered.

Many critics will see industry figures’ requests for still more handouts as pleas for lavish corporate welfare.

But others consider carbon removal principally a public good, and there’s wide agreement that the sector will need massive and sustained government support to reach anywhere near the scale that would meaningfully address climate change.

That’s because it’s an odd industry, fueled less by customer demand than by climate imperatives. An earlier National Academies report said the world may need to remove and store away around 10 billion tons per year by midcentury. But that doesn’t mean companies are especially eager to cover the high cost of doing it.

“Demand is a challenge for all climate technologies,” Amador says, given the often high premiums. “But it’s particularly acute for carbon removal and direct air capture, because it’s a public good. We’re producing a waste management service that no one currently has to pay for, and that makes commercializing this particularly difficult.” 

The hope and the challenge

The hope is that scaling up the sector will drive down costs, unlocking additional demand among corporations hoping to cancel out their pollution and making it cheaper for governments to make larger and larger purchases. 

The consulting firm BCG estimates that voluntary demand for carbon removal could increase to as much as 750 million tons by 2040, and that supportive government policies could drive an additional 500 million to 2.5 billion tons of “durable” demand by 2050. Among other possibilities, the European Union, Japan, and California may, for instance, incorporate carbon removal into their regulated carbon trading systems in the coming years. 

But there’s no guarantee that carbon removal costs will drop, voluntary market demand will build, or government support will rise as fast as needed to keep the industry growing before that occurs. Nor is it a given that nations or businesses will ever collectively suck up the cost of drawing billions of tons of carbon dioxide out of the air. 

Even if the industry gets costs down to $100 a ton, a standard target that could drive much more demand, removing 10 billion tons a year would add up to a $1 trillion annual expenditure. The obvious question that raises is who should pay for the bulk of that—average taxpayers who would receive the benefits in the form of lower climate risks, or the major polluters that did the most to cause the problem? 
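For readers who want to check the math, here is a minimal sketch, in Python, of the arithmetic behind that $1 trillion figure; the price target and removal volume are the ones cited above, not new estimates.

```python
# A back-of-envelope check on the $1 trillion figure cited above.
cost_per_ton_usd = 100   # the industry's commonly cited cost target, in dollars
tons_per_year = 10e9     # ~10 billion tons of CO2 removed annually by midcentury

annual_bill_usd = cost_per_ton_usd * tons_per_year
print(f"Annual expenditure: ${annual_bill_usd / 1e12:.1f} trillion")
# -> Annual expenditure: $1.0 trillion
```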

There are bubbling concerns that too many startups are already chasing too little demand and that follow-on investments are tightening amid a broader slowdown in climate-tech-focused venture capital. Several companies in the space have already gone out of business, including Running Tide and Nori.

Total purchases of carbon removal, through direct air capture and other methods, have continued to rise. A handful of companies, like Microsoft, Stripe, Shopify, and Google, have committed to paying the steep current costs of removing tons of CO2, hoping to help stand up the sector and earn credit for taking action to address climate change. In fact, the deal volume so far in 2024, at more than $1.4 billion, exceeds the total seen in all previous years combined, says Robert Höglund, cofounder of CDR.fyi, which tracks carbon removal purchases.

But in what he called “a concerning trend,” the number of buyers—and especially the number of new buyers—has ticked down in recent quarters. Microsoft’s carbon removal purchases alone made up more than 77% of this year’s total.

The problem is, “you need 10 Microsofts to finance one DAC hub,” says Julio Friedmann, chief scientist at Carbon Direct, which advises companies on carbon removal. 

There’s an added challenge for direct air capture within the voluntary carbon market: It’s one of the most expensive ways for corporations to cancel out emissions. Carbon removal purchases make up only about 3% of the voluntary carbon market today, according to a Carbon Direct report last year. And DAC purchases represent only about 18% of that fraction of the market, according to CDR.fyi.
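Taken together, those two fractions imply that direct air capture is still a sliver of a sliver of the market. Here is a minimal sketch of that multiplication, using only the percentages quoted above:

```python
# Combining the two market fractions quoted above.
removal_share_of_market = 0.03   # carbon removal's share of the voluntary market (Carbon Direct)
dac_share_of_removal = 0.18      # DAC's share of that removal slice (CDR.fyi)

dac_share_of_market = removal_share_of_market * dac_share_of_removal
print(f"DAC share of the whole voluntary market: {dac_share_of_market:.2%}")
# -> DAC share of the whole voluntary market: 0.54%
```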

Traditional carbon offsets for projects that promise to reduce or avoid emissions are still the main competition for any form of carbon removal, making up about 90% of the voluntary market. The problem is that a variety of studies and investigative stories have found that these credits, which can be earned and sold for preserving forests, building renewable-energy facilities, and similar efforts, often overstate the climate benefits. But they’re a lot cheaper than reliable carbon removal options and remain appealing to many companies looking for a way to cancel out their emissions, at least on paper.

Höglund says that corporate climate goal-setting bodies like the Science Based Targets initiative should help push along the business of high-quality carbon removal by requiring participating companies to set interim objectives for purchases that start small and rise over time. 

But he, too, stresses that the major buyers will need to be governments.

“More, and larger, such government purchase initiatives are likely to be needed to keep the permanent CDR sector on the right track,” Höglund said in an email.

Earlier this year, the US Congress approved another $20 million for a second phase of the DOE’s carbon removal purchase program.

The agency is helping to drive demand by buying carbon removal in small but likely growing amounts, says Noah Deich, a senior advisor in the DOE’s Office of Fossil Energy and Carbon Management, which oversees the pilot program. But he stresses that additional corporations will need to do their part as well, paying the high costs of carbon removal today, to ensure that more and more parties can afford to buy large amounts of it in the future.

“Unless we start to make a bigger market for CDR purchasers, we won’t achieve the commercial liftoff in the 2030s,” he says.

Everything comes back to climate tech. Here’s what to watch for next.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

We get to celebrate a very special birthday today—The Spark just turned two! 

Over the past couple of years, I’ve been bringing you all the news you need to know in climate tech and digging into some of the most fascinating and thorny topics, from energy and transportation to agriculture and policy.

In light of this milestone, I’ve been looking back at some of the most popular editions of this newsletter, as well as some of my personal favorites—and it’s all got me thinking about where climate tech will go next. So let’s look back together, and I’ll also share what I’m going to be watching out for as we go forward.

It’s prime time for batteries

It will probably be a surprise to absolutely nobody that the past two years have been filled with battery news. (In case you’re new and need a quick intro to my feelings on the topic, you can read the love letter to batteries I wrote this year for Valentine’s Day.) 

We’ve covered how abundant materials could help unlock cheaper, better batteries, and how new designs could help boost charging speeds. I’ve dug into the data to share how quickly batteries are taking over the world, and how much faster we’ll need to go to hit our climate goals.

The next few years are going to be make-or-break for a lot of the alternative batteries we’ve covered here, from sodium-ion to iron-air and even solid-state. We could see companies either fold or make it to the next stage of commercialization. I’m watching to see which technologies will win—there are many different options that could break out and succeed. 

A nuclear renaissance 

One topic I’ve been covering closely, especially in the past year, is nuclear energy. We need zero-emissions options that are able to generate electricity 24-7. Nuclear fits that bill. 

Over the past two years, we’ve seen some major ups and downs in the industry. Two new reactors have come online in the US, though they were years late and billions over budget. Germany completed its move away from nuclear energy, opting instead to go all in on intermittent renewables like solar and wind (and keep its coal plants open). 

Looking ahead, though, there are signs that we could see a nuclear energy resurgence. I’ve written about interest in keeping older reactors online for longer and opening up plants that have previously shut down. And companies are aiming to deploy new advanced reactor designs, too. 

I’m watching to see how creative the industry can get with squeezing everything it can out of existing assets. But I’m especially interested to see whether new technologies keep making progress on getting regulatory approval, and whether the new designs can actually get built. 

Material world forever

I’ll never stop talking about materials—from what we need to build all the technologies that are crucial for addressing climate change to how we can more smartly use the waste after those products reach the end of their lifetime. 

Recently, I wrote a feature story (and, of course, a related newsletter bringing you behind the scenes of my reporting) about how one rare earth metal gives us a look at some of the challenges we’ll face with sourcing and recycling materials over the next century and beyond. 

It’s fitting that the very first edition of The Spark was about my trip inside a battery recycling factory. Over the past two years, the world of climate tech has become much more tuned in to topics like mining, recycling, and critical minerals. I’m interested to see how companies continue finding new, creative ways to get what they need to build everything they’re trying to deploy. 

Milestones … and deadlines

Overall, the last couple of years have been some of the most exciting and crucial in the race to address climate change, and it’s only going to ramp up from here. 

Next year marks 10 years since the Paris Agreement, a landmark climate treaty that’s guided most of the world’s ambitions to limit warming to less than 2 °C (3.6 °F) above preindustrial levels. In the US, 2027 will mark five years since the Inflation Reduction Act was passed, ushering in a new era of climate spending for the world’s largest economy.

The last two years have been a whirlwind of new ideas, research, and technologies, all aimed at limiting the most damaging effects of our changing climate. I’m looking forward to following all the progress of the years to come with you as well. 


Now read the rest of The Spark

Another thing

If you’re reading this, I’m willing to bet that you probably eat food. So you should join us for the latest edition of our subscriber-only Roundtables virtual event series, where I’ll be speaking with my colleague James Temple about creating climate-friendly food. 

Joining us are experts from Pivot Bio and Rumin8, two of our 2024 Climate Tech Companies to Watch. It’s going to be a fascinating discussion—subscribers, register to join us here.

And one more 

The growing energy demands of artificial intelligence represent a challenge for the grid. But the technology also offers an opportunity for energy tech, according to the authors of a new op-ed out this week. Check it out for more on why they say that AI and clean energy need each other.

Keeping up with climate  

Hurricane Milton reached wind speeds of over 160 miles per hour, making it a Category 5 storm. It’s expected to hit the Gulf Coast of Florida in the coming days. See its projected path and the rainfall forecast. (Washington Post)
→ Tampa Bay has seen destructive hurricanes, but there hasn’t been a direct hit in decades. The metro area is home to over 3 million people. (Axios)

Other regions are still reeling from Hurricane Helene, which dumped torrential rainfall on western North Carolina in particular. The storm upends ideas of what a climate haven is. (Scientific American)
→ Two studies suggest that climate change significantly boosted rainfall from the storm. (NBC News)

If you have an EV, it’s best to keep it out of flood zones during hurricanes when possible. Batteries submerged in salt water can catch fire, though experts say it’s relatively rare. (New York Times)

The risk of winter blackouts in Great Britain is at the lowest in years, even though the country has shut down its last coal plant. The grid is expected to have plenty of energy, in part because of investment in renewables. (The Guardian)

Voters in Kazakhstan have approved a plan to build the country’s first nuclear power plant. The country has a complicated relationship with nuclear technology, since it was a testing ground for Soviet nuclear weapons. (Power)

Revoy wants to bring battery swapping to heavy-duty trucks. The company’s batteries can reduce the amount of diesel fuel a conventional truck needs to drive a route. (Heatmap)
→ I wrote earlier this year about another company building batteries into trailers in an effort to clean up distance trucking. (MIT Technology Review)

The weeds are winning

On a languid, damp July morning, I meet weed scientist Aaron Hager outside the old Agronomy Seed House at the University of Illinois’ South Farm. In the distance are round barns built in the early 1900s, designed to withstand Midwestern windstorms. The sky is a formless white. It’s the day after a storm system hundreds of miles wide rolled through, churning out 80-mile-per-hour gusts and prompting dozens of tornado watches and sirens reminiscent of a Cold War bomb drill.

On about 23 million acres, or roughly two-thirds of the state, farmers grow corn and soybeans, with a smattering of wheat. They generally spray virtually every acre with herbicides, says Hager, who was raised on a farm in Illinois. But these chemicals, which allow one plant species to live unbothered across inconceivably vast spaces, are no longer stopping all the weeds from growing.

Since the 1980s, more and more plants have evolved to become immune to the biochemical mechanisms that herbicides leverage to kill them. This herbicide resistance threatens to decrease yields—out-of-control weeds can reduce them by 50% or more, and extreme cases can wipe out whole fields. At worst, it can even drive farmers out of business. It’s the agricultural equivalent of antibiotic resistance, and it keeps getting worse.

As we drive east from the campus in Champaign-Urbana, the twin cities where I grew up, we spot a soybean field overgrown with dark-green, spiky plants that rise to chest height. 

“So here’s the problem,” Hager says. “That’s all water hemp right there. My guess is it’s been sprayed at least once, if not more than once.”

“With these herbicide-resistant weeds, it’s only going to get worse. It’s going to blow up.”

Water hemp (Amaranthus tuberculatus), which can infest just about any kind of crop field, grows an inch or more a day, and females of the species can easily produce hundreds of thousands of seeds. Native to the Midwest, it has burst forth in much greater abundance over the last few years, because it has become resistant to seven different classes of herbicides. Season-long competition from water hemp can reduce soybean yields by 44% and corn yields by 15%, according to Purdue University Extension.

Most farmers are still making do. Two different groups of herbicides still usually work against water hemp. But cases of resistance to both are cropping up more and more.

“We’re starting to see failures,” says Kevin Bradley, a plant scientist at the University of Missouri who studies weed management. “We could be in a dangerous situation, for sure.”

Elsewhere, the situation is even more grim.

“We really need a fundamental change in weed control, and we need it quick, ’cause the weeds have caught up to us,” says Larry Steckel, a professor of plant sciences at the University of Tennessee. “It’s come to a pretty critical point.” 

On the rise

According to Ian Heap, a weed scientist who runs the International Herbicide-Resistant Weed Database, there have been well over 500 unique cases of the phenomenon in 273 weed species and counting. Weeds have evolved resistance to 168 different herbicides and 21 of the 31 known “modes of action,” that is, the specific biochemical targets or pathways a chemical is designed to disrupt. Some modes of action are shared by many herbicides.

One of the most wicked weeds in the South, one that plagues Steckel and his colleagues, is a rhubarb-red-stemmed cousin to water hemp known as Palmer amaranth (Amaranthus palmeri). Populations of the weed have been found that are impervious to nine different classes of herbicides. The plant can grow more than two inches a day, reach eight feet in height, and dominate entire fields. Originally from the desert Southwest, it boasts a sturdy root system and can withstand droughts. If rainy weather or your daughter’s wedding prevents you from spraying it for a couple of days, you’ve probably missed your chance to control it chemically.

Palmer amaranth “will zero your yield out,” Hager says.

Several other weeds, including Italian ryegrass and a tumbleweed called kochia, are inflicting real pain on farmers in the South and the West, particularly in wheat and sugar beet fields.

Chemical birth 

Before World War II, farmers generally used cultivators such as plows and harrows to remove weeds and break up the ground. Or they did it by hand—like my mother, who remembers hoeing weeds in cornfields as a kid growing up on an Indiana farm.

That changed with the advent of synthetic pesticides and herbicides, which farmers started using in the 1950s. By the 1970s, some of the first examples of resistance appeared. By the early 1980s, Heap and his colleague Stephen Powles had discovered populations of ryegrass (Lolium rigidum) that were resistant to the most commonly used herbicides, known as ACCase inhibitors, spreading throughout southern Australia. Within a few years, this species had become resistant to yet another class, called ALS-inhibiting herbicides.  

The problem had just begun. It was about to get much worse.

In the mid to late 1990s, the agricultural giant Monsanto—now a part of Bayer Crop Science—began marketing genetically engineered crops including corn and soybeans that were resistant to the commercial weed killer Roundup, the active ingredient of which is called glyphosate. Monsanto portrayed these “Roundup-ready” crops, and the ability to spray whole fields with glyphosate, as a virtual silver bullet for weed control.

Glyphosate quickly became one of the most widely used agricultural chemicals, and it remains so today. It was so successful, in fact, that research and development on other new herbicides withered: No major new commercial herbicide that could help address resistance on a grand scale appears likely to hit the market anytime soon.

Monsanto claimed it was “highly unlikely” that glyphosate-resistant weeds would become a problem. There were, of course, those who correctly predicted that such a thing was inevitable—among them Jonathan Gressel, a professor emeritus at the Weizmann Institute of Science in Rehovot, Israel, who has been studying herbicides since the 1960s.

Stanley Culpepper, a weed scientist at the University of Georgia, confirmed the first case of Roundup resistance in Palmer amaranth in 2004. Resistance rapidly spread. Both Palmer amaranth and water hemp produce separate male and female plants; the males release pollen that can blow long distances on the wind to pollinate the females. This also gives the plant a lot of genetic diversity, which allows it to evolve faster—all the better for herbicide resistance to develop and spread. These super-weeds sowed chaos throughout the state.

“It devastated us,” Culpepper says, recalling the period from 2008 to 2012 as particularly difficult. “We were mowing fields down.”  

Staying alive

Herbicide resistance is a predictable outcome of evolution, explains Patrick Tranel, a leader in the field of molecular weed science at the University of Illinois, whose lab is a few miles from the South Farm.

“When you try to kill something, what does it do? It tries to not be killed,” Tranel says. 

Weeds have developed surprising ways to get around chemical control. One 2009 study published in the Proceedings of the National Academy of Sciences showed that a mutation in the Palmer amaranth genome allowed the plant to make more than 150 copies of the gene that glyphosate targets. That kind of gene amplification had never been reported in plants before, says Franck Dayan, a weed scientist at Colorado State University.

Another bizarre way resistance can arise in that species is via structures called extrachromosomal circular DNA: strands of genetic material that exist outside the nuclear chromosomes and can carry the gene glyphosate targets. That gene can then be transferred to other plants via wind-blown pollen.

But scientists are increasingly finding metabolic resistance in weeds, where plants have evolved mechanisms to break down just about any foreign substance—including a range of herbicides. 

Let’s say a given herbicide worked on a population of water hemp one year. If any plants “escape,” or survive, and make seeds, their offspring could possess metabolic resistance to the herbicides used. 


There’s evidence of resistance developing to both of the chemical groups that have replaced or been mixed with Roundup to kill this weed: an herbicide called glufosinate and a pair of substances known as 2,4-D and dicamba. These two would normally kill many crops, too, but there are now millions of acres of corn and soy genetically modified to be impervious. So essentially the response has been to throw more chemicals at the problem.

“If it worked last year, if you have metabolic resistance there’s no guarantee it’s going to work this year,” Hager says. 

Many of these herbicides can harm the environment and have the potential to harm human health, says Nathan Donley, the environmental health science director at the Center for Biological Diversity, which is based in Tucson, Arizona. Paraquat, for example, is a neurotoxic chemical banned in more than 60 countries (it’s been linked to conditions like Parkinson’s), Donley says, but it’s being used more and more in the United States. 2,4-D, one of the active ingredients in Agent Orange, is a potential endocrine disruptor, and exposure to it is correlated with increased risk of various cancers. Glyphosate is listed as a probable human carcinogen by an agency within the World Health Organization and has been the subject of tens of thousands of lawsuits worth tens of billions of dollars. Atrazine can stick around in groundwater for years and can shrink testicles and reduce sperm count in certain fish, amphibians, reptiles, and mammals.

Replacing glyphosate with herbicides like 2,4-D and dicamba, which are generally more toxic, “is definitely a step in the wrong direction,” Donley says. 

Looking for solutions

It’s not just chemicals. Weeds can become resistant to any type of control method. In a classic example from China, a weed called barnyard grass evolved over centuries to resemble rice and thus evade hand weeding.

Because weeds can evolve relatively quickly, researchers recommend a wide diversity of control tactics. Mixing two herbicides with different modes of action can sometimes work, though that’s not the best for the environment or the farmer’s wallet, Tranel says. Rotating the plants that are grown helps, as does installing winter cover crops and, above all, not using the same herbicide in the same way every year. 

Fundamentally, the solution is to “not focus solely on herbicides for weed management,” says Micheal Owen, a weed scientist and emeritus professor at Iowa State University. And that presents a “major, major issue for the farmer” and the current state of American farms, he adds. 


Farms have ballooned in size over the last couple of decades, as a result of rural flight, labor costs, and the advent of chemicals and genetically modified crops that allowed farmers to quickly apply herbicides over massive areas to control weeds. This has led to a kind of sinister simplification in terms of crop diversity, weed control practices, and the like. And the weeds have adjusted. 

On the one hand, it’s understandable that farmers often do the cheapest thing they can to control weeds, to get them through the year. But resistance is a medium- to long-term problem running up against a system of short-term thinking and incentives, says Katie Dentzman, a rural sociologist also at Iowa State University.

Her studies have shown that farmers are generally informed and worried about herbicide resistance but are constrained by a variety of factors that prevent them from really heading it off. Some farmers say their farms are too big to control weeds economically without blanket spraying; others lack the labor, financing, or time.

Agriculture needs to embrace a diversity of weed control practices, Owen says. But that’s much easier said than done. 

“We’re too narrow-visioned, focusing on herbicides as the solution,” says Steven Fennimore, a weed scientist with the University of California, Davis, based in Salinas, California.

Fennimore specializes in vegetables, for which there are few herbicide options, and there are fewer still for organic growers. So innovation is necessary. He developed a prototype that injects steam into the ground, killing weeds within several inches of the entry point. This has proved around 90% effective, and he’s used it in fields growing lettuce, carrots, and onions. But it is not exactly quick: It takes two or three days to treat a 10-acre block.

Many other nonchemical means of control are gaining traction in vegetables and other high-value crops. Eventually, if the economics and logistics work out, these could catch on in row crops, those planted in rows that can be tilled by machinery. 

A company called Carbon Robotics, for example, produces an AI-driven system called the LaserWeeder that, as the name implies, uses lasers to kill weeds. It is designed to pilot itself up and down crop rows, recognizing unwanted plants and vaporizing them with one of its 30 lasers. LaserWeeders are now active in at least 17 states, according to the company.  

You can also shock weeds by using electricity, and several apparatuses designed to do so are commercially available in the United States and Europe. A typical design involves the use of a height-adjustable copper boom that zaps weeds it touches. The most obvious downside with this method is that the weeds usually have to be taller than the crop. By the time the weeds have grown that high, they’ve probably already caused a decline in yield. 

Weed seed destructors are another promising option. These devices, commonly used in Australia and catching on a bit in places like the Pacific Northwest, grind up and kill the seeds of weeds as wheat is harvested.

An Israeli company called WeedOut hatched a system to irradiate and sterilize the pollen of Palmer amaranth plants and then release it into fields. This way, female plants receive the sterile pollen and fail to produce viable seeds. 

“I’m very excited about this [as] a long-term way to reduce the seed bank and to manage these weeds without having to spray an herbicide,” Owen says. 

WeedOut is currently testing its approach in corn, soybean, and sugar beet fields in the US and working to get EPA approval. It recently secured $8 million in funding to scale up. 

In general, AI-driven rigs and precision spraying are very likely to eventually reduce herbicide use, says Stephen Duke, who studies herbicides at the University of Mississippi: “Eventually I expect we’ll see robotic weeding and AI-driven spray rigs taking over.” But he expects that to take a while on crops like soybeans and corn, since it is economically difficult to invest a lot of money in tending such “low-value” agronomic crops planted across such vast areas.

A handful of startups are pursuing new types of herbicides, based on natural products found in fungi or used by plants to compete with one another. But none of these promise to be ready for market anytime soon.

Field day 

Some of the most successful tools for preventing resistance are not exactly high-tech. That much is clear from the presentations at the Aurora Farm Field Day, organized by Cornell University just north of its campus in Ithaca, New York. 

For example, one of the most important things farmers can do to prevent the spread of weed seeds is to clean out their combines after harvest, especially if they’re buying or using equipment from another state, says Lynn Sosnoskie, an assistant professor and weed scientist at Cornell. 

Combines are believed to have already introduced Palmer amaranth into the state, she says—there are now at least five populations in New York. 

Another classic approach is crop rotation—switching between crops with different life cycles, management practices, and growth patterns is a mainstay of agriculture, and it helps prevent weeds from becoming accustomed to one cropping system. Yet another option is to put in a winter cover crop that helps prevent weeds from getting established. 

“We’re not going to solve weed problems with chemicals alone,” Sosnoskie says. That means we have to start pursuing these kinds of straightforward practices.

It’s an especially important point to hammer home in places like New York state, where the problem isn’t yet top of mind. That’s in part because the state isn’t dominated by monocultures the way the Midwest is, and it has a more diverse patchwork of land use. 

But it’s not immune to the issue. Resistance has arrived and threatens to “blow up,” says Vipan Kumar, also a weed expert at Cornell.

“We have to do everything we can to prevent this,” Kumar says. “My role is to educate people that this is coming, and we have to be ready.”

Douglas Main is a journalist and former senior editor and writer at National Geographic.

Preventing Climate Change: A Team Sport

This sponsored session was presented by MEDC at MIT Technology Review’s 2024 EmTech MIT event.

Michigan is at the forefront of the clean energy transition, setting an example in mobility and automotive innovation. Other states and organizations can learn from Michigan’s approach to public-private partnerships, actionable climate plans, and business-government alignment. Progressive climate policies are not only crucial for sustainability but also for attracting talent in today’s competitive job market.

Read more from MIT Technology Review Insights & MEDC about addressing climate change impacts


About the speaker

Hilary Doe, Chief Growth & Marketing Officer, Michigan Economic Development Corporation

As Chief Growth & Marketing Officer, Hilary Doe leads the state’s efforts to grow Michigan’s population, economy, and reputation as the best place to live, work, raise a family, and start a business. Hilary works alongside the Growing Michigan Together Council on a once-in-a-generation effort to grow Michigan’s population, boost economic growth, and make Michigan the place everyone wants to call home.

Hilary is a dynamic leader in nonprofits, technology, strategy, and public policy. She served as the national director at the Roosevelt Network, where she built and led an organization engaging thousands of young people in civic engagement and social change programming at chapters nationwide, which ultimately earned the organization recognition as a recipient of the MacArthur Award for Creative and Effective Institutions. She also served as Vice President of the Roosevelt Institute, where she oversaw strategy and expanded the Institute’s Four Freedoms Center, with the goal of empowering communities and reducing inequality alongside the greatest economists of our generation. Most recently, she served as President and Chief Strategy Officer at Nationbuilder, working to equip the world’s leaders with software to grow their movements, businesses, and organizations, while spreading democracy.

Hilary is a graduate of the University of Michigan’s Honors College and Ford School of Public Policy, a Detroit resident, and proud Michigander.

Productivity Electrified: Tech That Is Supercharging Business

This sponsored session was presented by Ford Pro at MIT Technology Review’s 2024 EmTech MIT event.

A decarbonized transportation system is a prerequisite for a sustainable economy. In the transportation industry, the road to electrification and greater technology adoption can also increase business bottom lines and reduce downstream costs to taxpayers. Focusing on early adopters such as first responders, local municipalities, and small business owners, we’ll discuss common misconceptions, barriers to adoption, implementation strategies, and how these insights carry over into widespread adoption of emerging technology and electric vehicles.


About the speaker

Wanda Young, Global Chief Marketing & Experience Officer, Ford Pro

Wanda Young is a visionary brand marketer and digital transformation expert who thrives at the intersection of brand, digital, technology, and data, paired with a deep understanding of the consumer mindset. She gained her experience working for the largest brands in retail, sports and entertainment, consumer products, and electronics. She is a successful brand marketer and change agent whom organizations seek out to drive digital and data transformation – a Chief Experience Officer years before the title was invented. In her roles managing multiple notable brands, including Samsung, Disney, ESPN, Walmart, Alltel, and Acxiom, she developed knowledge of the interconnectedness of brand, digital, and data; of the importance of customer experience across all touchpoints; of the power of data and localization; and of the in-the-trenches accountability needed to drive outcomes. Now at Ford Pro, the Commercial Division of Ford Motor Company, she is focused on helping grow the newly launched division and a brand that only Ford can offer commercial customers – an integrated lineup of vehicles and services designed to meet the needs of all businesses and keep their productivity on pace to drive growth.

Young enjoyed a series of firsts in her career, including launching ESPN+, developing Walmart’s first social media presence and building 5,000 of its local Facebook pages (which are still live today and continue to scale), developing the first weather-triggered ad product with The Weather Company, designing an ad product with Google called Local Inventory Ads, being part of the team that took Alltel Wireless private (it later sold to Verizon Wireless), and launching the Acxiom.com website on her first Mother’s Day with her daughter on her lap. She serves on several boards, holds a number of industry memberships, and has been the recipient of many prestigious awards. Young received a Bachelor of Arts in English with a minor in Advertising from the University of Arkansas.

Why artificial intelligence and clean energy need each other

We are in the early stages of a geopolitical competition for the future of artificial intelligence. The winners will dominate the global economy in the 21st century.

But what’s been too often left out of the conversation is that AI’s huge demand for concentrated and consistent amounts of power represents a chance to scale the next generation of clean energy technologies. If we ignore this opportunity, the United States will find itself disadvantaged in the race for the future of both AI and energy production, ceding global economic leadership to China.

To win the race, the US is going to need access to a lot more electric power to serve data centers. AI data centers could add the equivalent of three New York Cities’ worth of load to the grid by 2026, and they could more than double their share of US electricity consumption—to 9%—by the end of the decade. Artificial intelligence will thus contribute to a spike in power demand that the US hasn’t seen in decades; according to one recent estimate, that demand—previously flat—is growing by around 2.5% per year, with data centers driving as much as 66% of the increase.

Energy-hungry advanced AI chips are behind this growth. Three watt-hours of electricity are required for a ChatGPT query, compared with just 0.3 watt-hours for a simple Google search. These computational requirements make AI data centers uniquely power dense, requiring more power per server rack and orders of magnitude more power per square foot than traditional facilities. Sam Altman, CEO of OpenAI, reportedly pitched the White House on the need for AI data centers requiring five gigawatts of capacity—enough to power over 3 million homes. And AI data centers require steady and reliable power 24 hours a day, seven days a week; they are up and running 99.999% of the year.
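A minimal sketch of those comparisons, using only the figures cited above; the per-home number is simply what the five-gigawatt, 3-million-home comparison implies on average.

```python
# Energy per query, and the average household draw implied by the 5 GW figure.
chatgpt_wh = 3.0   # watt-hours per ChatGPT query, as cited
search_wh = 0.3    # watt-hours per simple Google search, as cited
print(f"One AI query uses ~{chatgpt_wh / search_wh:.0f}x the energy of a search")

capacity_w = 5e9   # a five-gigawatt data center build-out
homes = 3e6        # "over 3 million homes"
print(f"Implied average draw: ~{capacity_w / homes / 1e3:.1f} kW per home")
# -> ~10x per query; ~1.7 kW per home
```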

The demands that these gigawatt-scale users are placing on the electricity grid are already accelerating far faster than we can expand the physical and political structures that support the development of clean electricity. There are over 1,500 gigawatts of capacity waiting to connect to the grid, and the time to build transmission lines to move that power now stretches into a decade. One illustration of the challenges involved in integrating new power sources: The biggest factor delaying Constellation’s recently announced restart of the Three Mile Island nuclear plant isn’t the facility itself but the time required to connect it to the grid.

The reflexive response to the challenge of scaling clean-electricity supply has been to pose a false choice: cede the United States’ advantage in AI or cede our commitment to clean energy. This logic argues that the only way to meet the growing power demands of the computing economy will involve the expansion of legacy energy resources like natural gas and the preservation of coal-fired power plants.

The dire ecological implications of relying on more fossil fuels are clear. But the economic and security implications are just as serious. Further investments in fossil fuels threaten our national competitiveness as other countries leap ahead in the clean technologies that present the next generation of economic opportunity—markets measured in the trillions.

The reality is that the unprecedented scale and density of power needed for AI require a novel set of generation solutions, able to deliver reliable power 24-7 in ever increasing amounts. While advocates for legacy fuels have historically pointed to the variability of renewables, power sources that require massive, distributed, and disruptable fuel supplies like natural gas are also not the answer. In Texas, natural-gas plants accounted for 70% of outages after a severe winter storm in late 2022. As climate change intensifies, weather-related disruptions are only likely to increase.   

Rather than seeing a choice between AI competitiveness and climate, we see AI’s urgent demand for power density as an opportunity to kick-start a slew of new technologies, taking advantage of new buyers and new market structures—positioning the US to not only seize the AI future but create the markets for the energy-dense technologies that will be needed to power it.

Data centers’ incessant demand for computing power is best matched to a set of novel sources of clean, reliable power that are currently undergoing rapid innovation. Those include advanced nuclear fission that can be rapidly deployed at small scale and next-generation geothermal power that can be deployed anywhere, anytime. One day, the arsenal could include nuclear fusion as a source of nearly limitless clean energy. These technologies can produce large amounts of energy in relatively small footprints, matching AI’s demand for concentrated power. They have the potential to provide stable, reliable baseload power matched to AI data centers’ 24-7 operations. While some of these technologies (like fusion) remain in development, others (like advanced fission and geothermal energy) are ready to deploy today.

AI’s power density requirements similarly necessitate a new set of electricity infrastructure enhancements—like advanced conductors for transmission lines that can move up to 10 times as much power through much smaller areas, cooling infrastructure that can address the heat of vast quantities of energy-hungry chips humming alongside one another, and next-generation transformers that enable the efficient use of higher-voltage power. These technologies offer significant economic benefits to AI data centers in the form of increased access to power and reduced latency, and they will enable the rapid expansion of our 20th-century electricity grid to serve 21st-century needs. 

Moreover, the convergence of AI and energy technologies will allow for faster development and scaling of both sectors. Across the clean-energy sector, AI serves as a method of invention, accelerating the pace of research and development for next-generation materials design. It is also a tool for manufacturing, reducing capital intensity and increasing the pace of scaling. Already, AI is helping us overcome barriers in next-generation power technologies. For instance, Princeton researchers are using it to predict and avoid plasma instabilities that have long been obstacles to sustained fusion reactions. In the geothermal and mining context, AI is accelerating the pace and driving down the cost of commercial-grade resource discovery and development. Other firms use AI to predict and optimize performance of power plants in the field, greatly reducing the capital intensity of projects.

Historically, deployment of novel clean energy technologies has had to rely on utilities, which are notoriously slow to adopt innovations and invest in first-of-a-kind commercial projects. Now, however, AI has brought in a new source of capital for power-generation technologies: large tech companies that are willing to pay a premium for 24-7 clean power and are eager to move quickly.

These “new buyers” can build additional clean capacity in their own backyards. Or they can deploy innovative market structures to encourage utilities to work in new ways to scale novel technologies. Already, we are seeing examples, such as the agreement between Google, the geothermal developer Fervo, and the Nevada utility NV Energy to secure clean, reliable power at a premium for use by data centers. The emergence of these price-insensitive but time-sensitive buyers can accelerate the deployment of clean energy technologies.

The geopolitical implications of this nexus between AI and climate are clear: The socioeconomic fruits of innovation will flow to the countries that win both the AI and the climate race. 

The country that is able to scale up access to reliable baseload power will attract AI infrastructure in the long run—and will benefit from access to the markets that AI will generate. And the country that makes these investments first will gain an early lead, one that compounds over time as technical progress and economic productivity reinforce each other.

Today, the clean-energy scoreboard tilts toward China. The country has commissioned 37 nuclear power plants over the last decade, while the United States has added two. It is outspending the US two to one on nuclear fusion, with crews working essentially around the clock on commercializing the technology. Given that the competition for AI supremacy boils down to scaling power density, building a new fleet of natural-gas plants while our primary competitor builds an arsenal of the most power-dense energy resources available is like bringing a knife to a gunfight.

The United States and the US-based technology companies at the forefront of the AI economy have the responsibility and opportunity to change this by leveraging AI’s power demand to scale the next generation of clean energy technologies. The question is, will they?

Michael Kearney is a general partner at Engine Ventures, a firm that invests in startups commercializing breakthrough science and engineering. Lisa Hansmann is a principal at Engine Ventures and previously served as special assistant to the president in the Biden administration, working on economic policy and implementation.

Why one developer won’t quit fighting to connect the US’s grids

Michael Skelly hasn’t learned to take no for an answer.

For much of the last 15 years, the Houston-based energy entrepreneur has worked to develop long-haul transmission lines to carry wind power across the Great Plains, Midwest, and Southwest, delivering clean electricity to cities like Albuquerque, Chicago, and Memphis. But so far, he has little to show for the effort. 

Skelly has long argued that building such lines and linking together the nation’s grids would accelerate the shift from coal- and natural-gas-fueled power plants to the renewables needed to cut the pollution driving climate change. But his previous business, Clean Line Energy Partners, shut down in 2019, after halting two of its projects and selling off interests in three more.

Skelly contends he was early, not wrong, about the need for such lines, and that the market and policymakers are increasingly coming around to his perspective. Indeed, the US Department of Energy just blessed his latest company’s proposed line with hundreds of millions in grants. 

The North Plains Connector would stretch about 420 miles from southeast Montana to the heart of North Dakota and create the first major connection between the US’s two largest grids, enabling system operators to draw on electricity generated by hydro, solar, wind, and other resources across much of the country. This could help keep regional power systems online during extreme weather events and boost the overall share of electricity generated by those clean sources. 

Skelly says he’s already secured the support of nine utilities around the region for the project, as well as more than 90% of the landowners along the route.

Michael Skelly founded Clean Line Energy Partners in 2009. COURTESY: GRID UNITED

He says that more and more local energy companies have come to recognize that rising electricity demands, the growing threat storms and fires pose to power systems, and the increasing reliance on renewables have hastened the need for more transmission lines to stitch together and reinforce the country’s fraying, fractured grids.

“There’s a real understanding, really, across the country of the need to invest more in the grid,” says Skelly, now chief executive of Grid United, the Houston-based transmission development firm he founded in 2021. “We need more wires in the air.” 

Still, proposals to build long transmission lines frequently stir up controversy in the communities they would cross. It remains to be seen whether this growing understanding will be enough for Skelly’s project to succeed, or to get the US building anywhere near the number of transmission lines it now desperately needs.

Linking grids

Transmission lines are the unappreciated linchpin of the clean-energy transition, arguably as essential as solar panels in cutting emissions and as important as seawalls in keeping people safe.

These long, high, thick wires are often described as the highways of our power systems. They connect the big wind farms, hydroelectric plants, solar facilities, and other power plants to the edges of cities, where substations step down the voltage before delivering electricity into homes and businesses along distribution lines that are more akin to city streets. 

There are three major grid systems in the US: the Western Interconnection, the Eastern Interconnection, and the Texas Interconnected System. Regional grid operators such as the California Independent System Operator, the Midcontinent Independent System Operator, and the New York Independent System Operator oversee smaller local grids that are connected, to a greater or lesser extent, within those larger networks.

Transmission lines that could add significant capacity for sharing electricity back and forth across the nation’s major grid systems are especially valuable for cutting emissions and improving the stability of the power system. That’s because they allow those independent system operators to draw on a far larger pool of electricity sources. So if solar power is fading in one part of the country, they could still access wind or hydropower somewhere else. The ability to balance out fluctuations in renewables across regions and seasons, in turn, reduces the need to rely on the steady output of fossil-fuel plants. 

“There’s typically excess wind or hydro or other resources somewhere,” says James Hewett, manager of the US policy lobbying group at Breakthrough Energy, the Bill Gates–backed organization focusing on clean energy and climate issues. “But today, the limiting constraint is the ability to move resources from the place where they’re excessive to where they’re needed.” 

(Breakthrough Energy Ventures, the investment arm of the firm, doesn’t hold any investments in the North Plains Connector project or Grid United.)

It also means that even if regional wildfires, floods, hurricanes, or heat waves knock out power lines and plants in one area, operators may still be able to tap into adjacent systems to keep the lights on and air-conditioning running. That can be a matter of life and death in the event of such emergencies, as we’ve witnessed in the aftermath of heat waves and hurricanes in recent years.  

Studies have shown that weaving together the nation’s grids can boost the share of electricity that renewables reliably provide, significantly cut power-sector emissions, and lower system costs. A recent study by the Lawrence Berkeley National Lab found that the lines interconnecting the US’s major grids and the regions within them offer the greatest economic value among transmission projects, potentially providing more than $100 million in cost savings per year for every additional gigawatt of added capacity. (The study presupposes that the lines are operated efficiently and to their full capacity, among other simplifying assumptions.)

Experts say that grid interconnections can more than pay for themselves over time because, among other improved efficiencies, they allow grid operators to find cheaper sources of electricity at any given time and enable regions to get by with fewer power plants by relying on the redundancy provided by their neighbors.

But as it stands, the meager links between the Eastern Interconnection and Western Interconnection amount to “tiny little soda straws connecting two Olympic swimming pools,” says Rob Gramlich, president of Grid Strategies, a consultancy in Washington, DC. 

“A win-win-win”

Grid United’s North Plains Connector, in contrast, would be a fat pipe.

The $3.2 billion, three-gigawatt project would more than double the amount of electricity that could zip back and forth between those grid systems, and it would tightly interlink a trio of grid operators that oversee regional parts of those larger systems: the Western Electricity Coordinating Council, the Midcontinent Independent System Operator, and the Southwest Power Pool. If the line is developed, each could then more easily tap into the richest, cheapest sources at any given time across a huge expanse of the nation, be it hydropower generated in the Northwest, wind turbines cranking across the Midwest, or solar power produced anywhere.
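Pairing that capacity with the Lawrence Berkeley National Lab estimate quoted earlier (more than $100 million in annual savings per gigawatt of interregional capacity) suggests how quickly such a line could, in principle, pay for itself. The sketch below is illustrative only: it ignores financing, operating costs, and how heavily the line is actually used.

```python
# A rough simple-payback sketch for the North Plains Connector.
project_cost_usd = 3.2e9          # headline price tag
capacity_gw = 3                   # project capacity
savings_per_gw_year_usd = 100e6   # Berkeley Lab's per-gigawatt floor estimate

annual_savings_usd = capacity_gw * savings_per_gw_year_usd
print(f"Simple payback: ~{project_cost_usd / annual_savings_usd:.0f} years")
# -> Simple payback: ~11 years
```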

The North Plains Connector transmission line would stretch from southeast Montana to the heart of North Dakota, connecting the nation’s two biggest grids. COURTESY: ALLETE

This would ensure that utilities could get greater economic value out of those energy plants, which are expensive to build but relatively cheap to operate, and it would improve the reliability of the system during extreme weather, Skelly says.

“If you’ve got a heat dome in the Northwest, you can send power west,” he says. “If you have a winter storm in the Midwest, you can send power to the east.”

Grid United is developing the project as a joint venture with Allete, an energy company in Duluth, Minnesota, that operates several utilities in the region. 

The Department of Energy granted $700 million to a larger regional effort, known as the North Plains Connector Interregional Innovation project, which encompasses two smaller proposals in addition to Grid United’s. The grants will be issued through a more than $10 billion program established under the Bipartisan Infrastructure Law, enacted by President Joe Biden in 2021. 

That funding will likely be distributed to regional utilities and other parties as partial matching grants, designed to incentivize investments in the project among those likely to benefit from it. That design may also help address a chicken-and-egg problem that plagues independent transmission developers like Grid United, Breakthrough’s Hewett says. 

Regional utilities can pass along the costs of projects to their electricity customers. Companies like Grid United, however, generally can’t sign up the power producers that will pay to use their lines until they’ve got project approval, but they also often can’t secure traditional financing until they’ve lined up customers.

The DOE funding could ease that issue by providing an assurance of capital that would help get the project through the lengthy permitting process, Hewett says. 

“The states are benefiting, local utilities are benefiting, and the developer will benefit,” he says. “It’s a win-win-win.”

Transmission hurdles

Over the years, developers have floated various proposals to more tightly interlink the nation’s major grid systems. But it’s proved notoriously difficult to build any new transmission lines in the US—a problem that has only worsened in recent years. 

So far in the 2020s, the nation has added new transmission capacity at only about 20% of the annual pace it achieved in the early 2010s. On average, interstate transmission lines take eight to 10 years to develop “if they succeed at all,” according to a report from the Niskanen Center.

The biggest challenge in adding connections between grids, says Gramlich of Grid Strategies, is that there are no clear processes for authorizing lines that cross multiple jurisdictions and no dedicated regional or federal agencies overseeing such proposals. The fact that numerous areas may benefit from such lines also sparks interregional squabbling over how the costs should be allocated.

In addition, communities often balk at the sight of wires and towers, particularly if the benefits of the lines mostly accrue around the end points, not necessarily in all the areas the wires cross. Any city, county, or state, or even one landowner, can hold up a project for years, if not kill it.

But energy companies themselves share much of the blame as well. Regional energy agencies, grid operators, and utilities have actively fought proposals from independent developers to erect wires passing through their territories. They often simply don’t want to forfeit control of their systems, invite added competition, or deal with the regulatory complexity of such projects. 

The long delays in adding grid capacity have become a growing impediment to the development of new energy projects.

As of last year, there were 2,600 gigawatts’ worth of proposed energy generation or storage projects waiting in the wings for transmission capacity that would carry their electricity to customers, according to a recent analysis by Lawrence Berkeley National Lab. That’s roughly the capacity of 2,600 nuclear reactors, or more than double the nation’s entire power system.

The capacity of projects in the queue has risen almost eightfold from a decade ago, and about 95% of them are solar, wind, or battery proposals.
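For a sense of scale, here is the implicit arithmetic behind those comparisons (our own rough figures: a typical nuclear reactor has about one gigawatt of capacity, and the “more than double” comparison implies roughly 1,250 gigawatts of installed US generating capacity):

$$
2{,}600\ \text{GW} \approx 2{,}600 \times 1\ \text{GW per reactor}, \qquad \frac{2{,}600\ \text{GW queued}}{\approx 1{,}250\ \text{GW installed}} \approx 2.1
$$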

“Grid interconnection remains a persistent bottleneck,” Joseph Rand, an energy policy researcher at the lab and the lead author of the study, said in a statement.

The legacy of Clean Line Energy

Skelly spent the aughts as the chief development officer of Horizon Wind Energy, a large US wind developer that the Portuguese energy giant EDP snapped up in 2007 for more than $2 billion. Skelly then made a spirited though ill-fated run for Congress in 2008 as the Democratic nominee for Texas’s 7th Congressional District. He ran on a pro-renewables, pro-education platform but lost by a sizable margin in a solidly Republican district.

The following year, he founded Clean Line Energy Partners. The company raised tens of millions of dollars and spent a decade striving to develop five long-range transmission projects that could connect the sorts of wind projects Skelly had worked to build before.

The company did secure some of the permits required for several of the lines. But it was forced to shut down or offload its projects amid pushback from landowner groups and politicians opposed to renewables, as well as from regional utilities and public utility commissions.

“He was going to play in other people’s sandboxes and they weren’t exactly keen on having him in there,” says Russell Gold, author of Superpower: One Man’s Quest to Transform American Energy, which recounted Skelly’s and Clean Line Energy’s efforts and failures.

Ultimately, those obstacles dragged the projects out beyond the patience of the company’s investors, who declined to keep pouring money into them, he says.

The company was forced to halt the Centennial West line through New Mexico and the Rock Island project across the Midwest. In addition, it sold off its stake in the Grain Belt Express, which would stretch from Kansas to Indiana, to Invenergy; the Oklahoma portion of the Plains and Eastern line to NextEra Energy; and the Western Spirit line through New Mexico, along with an associated wind farm project, to Pattern Development. 

Clean Line Energy itself wound down in 2019.

The Western Spirit transmission line was electrified in late 2021, but the other two divested projects, the Grain Belt Express and the Oklahoma portion of the Plains and Eastern line, are still slogging through planning and permitting.

“These things take a long time,” Skelly says. 

For all the challenges the company faced, Gold still credits it with raising awareness about the importance and necessity of long-distance interregional transmission. He says it helped spark conversations that led the Federal Energy Regulatory Commission to eventually enact rules to support regional transmission planning and encouraged other big players to focus more on building transmission lines.

“I do believe that there is a broader social, political, and commercial awareness now that the United States needs to interconnect its grids,” Gold says. 

Lessons learned

Skelly spent a few years as a senior advisor at Lazard, consulting with companies on renewable energy. But he was soon ready to take another shot at developing long-haul transmission lines and started Grid United in 2021.

The new company has proposed four transmission projects in addition to the North Plains Connector—one between Arizona and New Mexico, one between Colorado and Oklahoma, and one each within Texas and Wyoming.

Asked what he thinks the legacy of Clean Line Energy is, Skelly says it’s mixed. But he soon adds that the history of US infrastructure building is replete with projects that didn’t move ahead. The important thing, he says, is to draw the right lessons from those failures.

“When we’re smart about it, we look at the past to see what we can learn,” he says. “We certainly do that today in our business.”

Skelly says one of the biggest takeaways was that it’s important to do the expensive upfront work of meeting with landowners well in advance of applying for permits, and to use their feedback to guide the routing of the line.

Anne Hedges, director of policy and legislative affairs at the Montana Environmental Information Center, confirms that this is the approach Grid United has taken in the region so far.

“A lot of developers seem to be more focused on drawing a straight line on a map rather than working with communities to figure out the best placement for the transmission system,” she says. “Grid United didn’t do that. They got out on the ground and talked to people and planned a route that wasn’t linear.”

The other change that may make Grid United’s project there more likely to move forward has more to do with what the industry’s learned than what Skelly has.  

Gramlich says regional grid operators and utilities have become more receptive to collaborating with developers on transmission lines—and for self-interested reasons. They’ll need greater capacity, and soon, to stay online and meet the growing energy demands of data centers, manufacturing facilities, electric vehicles, and buildings, and address the risks to power systems from extreme weather events.

Industry observers are also hopeful that an energy permitting reform bill pending in Congress, along with the added federal funding and new rules requiring transmission providers to do more advance planning, will help accelerate development. The bipartisan bill promises to shorten the approval process for projects deemed to be in the national interest. It would also require neighboring areas to work together on interregional transmission planning.

Hundreds of environmental groups have sharply criticized the proposal, which would also streamline approvals for certain oil and gas operations.

“This legislation guts bedrock environmental protections, endangers public health, opens up tens of millions of acres of public lands and hundreds of millions of acres of offshore waters to further oil and gas leasing, gives public lands to mining companies, and would de facto rubber-stamp gas export projects that harm frontline communities and perpetuate the climate crisis,” argued a letter signed by 350.org, Earthjustice, the Center for Biological Diversity, the Union of Concerned Scientists, and hundreds of other groups.

But a recent analysis by Third Way, a center-left think tank in Washington, DC, found that the emissions benefits from accelerating transmission permitting could significantly outweigh the added climate pollution from the fossil-fuel provisions in the bill. It projects that the bill would, on balance, reduce global emissions by 400 million to 16.6 billion tons of carbon dioxide through 2050. 

“Guardedly optimistic” 

Grid United expects to begin applying for county and state permits in the next few months and for federal permits toward the end of the year. It hopes to begin construction within the next four years and switch the line on in 2032.

Since the applications haven’t yet been filed, it’s not clear which individuals or groups will oppose the project. Given the history of such efforts, though, some surely will.

Hedges says the Montana Environmental Information Center is reserving judgment until it sees the actual application. She says the organization will be particularly focused on any potential impact on water and wildlife across the region, “making sure that they’re not harming what are already struggling resources in this area.”

So if Skelly was too early with his last company, the obvious question is: Are the market, regulatory, and societal conditions now ripe for interregional transmission lines?

“We’re gonna find out if they are, right?” he says. “We don’t know yet.”

Skelly adds that he doesn’t think the US is going to build as much transmission as it needs to. But he does believe we’ll start to see more projects moving forward—including, he hopes, the North Plains Connector.

“You just can’t count on anything, and you’ve just got to keep going and push, push, push,” he says. “But we’re making good progress. There’s a lot of utility interest. We have a big grant from the DOE, which will help bring down the cost of the project. So knock on wood, we’re guardedly optimistic.”

Coming soon: Our 2024 list of 15 Climate Tech Companies to Watch

MIT Technology Review set out last year to recognize 15 companies from around the world that demonstrated they have a real shot at meaningfully driving down greenhouse-gas emissions and safeguarding society from the worst impacts of climate change.

We’re excited to announce that we took up the task again this year and will publish our 2024 list of 15 Climate Tech Companies to Watch on October 1. We’ll reveal it first on stage to attendees at our upcoming EmTech MIT event, and then share it online later that day.

The work these companies are doing is needed now more than ever. Global warming appears to be accelerating. The oceans are heating up faster than expected. And some scientists fear the planet is approaching tipping points that could trigger dramatic shifts in Earth’s ecosystems.

Nations must cut the greenhouse-gas pollution fueling that warming, and the heat waves, hurricanes, droughts, and fires it brings, as fast as possible. But we can’t simply halt emissions without plunging the global economy into a deep depression and the world into chaos. 

Any realistic plan to cut billions of tons of emissions over the next few decades requires us to develop and scale up cleaner ways of producing electricity, manufacturing goods, generating heat and cooling, and moving people and stuff around the world. 

To do that, we need competitive companies that can displace heavily polluting industries, or force them to clean up their acts. Those firms need to provide consumers with low-emissions options that, ideally, don’t feel like a sacrifice. And because climate change is underway, we also need technologies and services and infrastructure that can keep communities safe even as the world grows hotter and the weather becomes more erratic and extreme.

As we stated last year, we don’t claim to be oracles or soothsayers. The success of any one business depends on many hard-to-predict variables, including market conditions, political winds, investor sentiment, and consumer preferences. Taking aim at the business models and margins of conglomerates is especially fraught—and some of these firms may well fail.

But we did our best to select companies with solid track records that are tackling critical climate problems and have shown recent progress. 

This year’s list includes companies working to cut stubborn agricultural emissions, mine the metals needed for the energy transition in cleaner ways, and help communities tamp out wildfires before they become infernos. Others are figuring out new ways to produce fuels that can power our vehicles and industries, without adding more carbon dioxide to the atmosphere. 

A few companies from last year’s list also made the cut again because they’ve made notable strides toward their goals in the past 12 months.

We’re proud to publish the full list in the coming weeks. We hope you’ll take a look, ideally learn something new, and perhaps leave feeling encouraged that the world can make the changes needed to ease the risks of climate change and build a more sustainable future.
