What Japan’s “megaquake” warning really tells us

MIT Technology Review Explains: Let our writers untangle the complex, messy world of technology to help you understand what’s coming next. You can read more from the series here.

On August 8, at 16:42 local time, a magnitude-7.1 earthquake shook southern Japan. The temblor, originating off the shores of Kyūshū, one of Japan’s main islands, was felt by nearly a million people across the region, and for a short while a tsunami seemed to be on its way. But only a diminutive wave swept ashore, buildings remained upright, and nobody died. The crisis was over as quickly as it began.

But then something new happened: the Japan Meteorological Agency, a government organization, issued a ‘megaquake advisory’ for the first time. This pair of words may appear disquieting—and to some extent, it is. There is a ticking bomb below Japanese waters: a colossal fault where one tectonic plate dives below another. Stress has been accumulating across this boundary for quite some time, and inevitably it will do what it has repeatedly done in the past: part of it will violently rupture, generating a devastating earthquake and a potentially huge tsunami.

The advisory was issued in part because it is possible that the magnitude-7.1 quake was a foreshock – a precursory quake – to a far larger one: a tsunami-making monster that could kill a quarter of a million people.

The good news, for now, is that scientists think it very unlikely that the magnitude-7.1 quake is a prelude to a cataclysm. Nothing is certain, but “the chances that this actually is a foreshock are really quite low,” says Harold Tobin, the director of the Pacific Northwest Seismic Network.

The advisory, ultimately, isn’t prophetic. Its primary purpose is to let the public know that scientists are aware of what’s going on, that they are cognizant of the worst-case scenario—and that everyone else should be mindful of that grim possibility too. Evacuation routes should be memorized, and emergency supplies should be obtained, just in case.

“Even if the probability is low, the consequences are so high,” says Judith Hubbard, an earthquake scientist at Cornell University. “It makes sense to worry about some of these low probabilities.”

Japan, which sits atop a tectonic jigsaw, is no stranger to large earthquakes. Just this past New Year’s Day, a magnitude-7.6 temblor convulsed the Noto Peninsula, killing 230 people. But special attention is paid to certain quakes even when they cause no direct harm.

The August 8 event took place on the Nankai subduction zone: here, the Philippine Sea plate creeps below Japan, which is attached to the Eurasian plate. This type of plate boundary is the sort capable of producing ‘megaquakes’, those of magnitude 8.0 and higher. (The numerical difference may seem small, but the scale is logarithmic: a magnitude-8.0 quake unleashes roughly 32 times more energy than a magnitude-7.0 quake.)
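
That factor of roughly 32 falls out of the standard relation between magnitude and radiated seismic energy; as a quick sketch of the arithmetic:

```latex
% Gutenberg–Richter energy–magnitude relation, with E in joules
\log_{10} E \approx 1.5\,M + 4.8
\quad\Longrightarrow\quad
\frac{E(M+1)}{E(M)} = 10^{1.5} \approx 32
```

Each whole step in magnitude multiplies the radiated energy by about 32, so a magnitude-9.0 quake releases roughly 1,000 times the energy of a magnitude-7.0 one.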

Consequently, the Nankai subduction zone (or Nankai Trough) has produced several historical tragedies. A magnitude-7.9 quake in 1944 was followed by a magnitude-8.0 quake in 1946; each was caused by the rupture of part of the submarine trench. The magnitude-8.6 quake of 1707, however, involved the rupture of the entire Nankai Trough. Thousands died on each occasion.

Predicting disaster

Predicting when and where the next major quake will happen anywhere on Earth is currently impossible. Nankai is no different: as recently noted by Hubbard on her blog Earthquake Insights – co-authored with geoscientist Kyle Bradley – there isn’t a set interval between Nankai’s major quakes; the gaps between them have ranged from days to several centuries.

But because stress is continually accumulating on that plate boundary, it’s certain that, one day, the Nankai Trough will let loose another great quake, one that could push a vast volume of seawater toward a large swath of western and central Japan, generating a tsunami up to 100 feet tall. The darkest scenario suggests that 230,000 people could perish, two million buildings would be damaged or destroyed, and the country would be left with a $1.4 trillion bill.

Naturally, a magnitude-7.1 quake on that trough worries scientists. Aftershocks (a series of smaller-magnitude quakes) are a guaranteed feature of potent quakes. But there is a small chance that a large quake will be followed by an even larger one, retrospectively making the first a foreshock.

“The earthquake changes the stress in the surrounding crust a little bit,” says Hubbard. Using the energy released during the August 8 rupture and decoding the seismic waves it created, scientists can estimate how much stress was shifted to surrounding faults.
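
A standard first-order tool for those estimates is the Coulomb failure stress change on a neighboring (“receiver”) fault – a minimal statement of the quantity seismologists map after an event like this:

```latex
% Coulomb failure stress change on a receiver fault;
% positive values nudge the fault toward failure.
\Delta\mathrm{CFS} = \Delta\tau + \mu'\,\Delta\sigma_n
```

Here Δτ is the change in shear stress in the fault’s slip direction, Δσₙ is the change in normal stress (positive when the fault is unclamped), and μ′ is an effective friction coefficient.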

The worry is that some of the stress released by one quake gets transferred to a big fault that hasn’t ruptured in a very long time but is ready to fold like an explosive house of cards. “You never know which increment of stress is gonna be the one that pushes it over the edge,” she says.

Scientists cannot tell whether a large quake is a foreshock until a larger quake occurs. But the possibility remains that the August 8 temblor is a foreshock to something considerably worse. Statistically, it’s unlikely. But there is additional context to why that megaquake advisory was issued: the specter of 2011’s magnitude-9.1 Tōhoku earthquake and tsunami, which killed 18,000 people, still haunts the Japanese government and the nation’s geoscientists. 

Hubbard explains that, two days before that quake struck off Japan’s eastern seaboard, there was a magnitude-7.2 event in the same area—now known to be a foreshock to the catastrophe. Reportedly, authorities in Japan regretted not highlighting that possibility in advance, which might have left people on the eastern seaboard better prepared, and better able, to escape their fate.

A sign to get prepared

In response, Japan’s government created new protocols for signaling that foreshock possibility. Most magnitude-7.0-or-so quakes will not be followed by a ‘megaquake advisory’. Only those happening in tectonic settings capable of triggering truly gigantic quakes will—and that includes the Nankai Trough.

Crucially, this advisory is not a warning that a megaquake is imminent. It means: “be ready for when the big earthquake comes,” says Hubbard. Nobody is mandated to evacuate, but they are asked to know their escape routes. Meanwhile, local news reports that nursing homes and hospitals in the region are tallying emergency supplies while moving immobile patients to higher floors or other locations. The high-speed Shinkansen railway trains are running at a reduced maximum speed, and certain flights are carrying more fuel than usual in case they need to divert.

Earthquake advisories aren’t new. “California has something similar, and has issued advisories before,” says Wendy Bohon, an independent earthquake geologist. In September 2016, for example, a swarm of hundreds of modest quakes caused the U.S. Geological Survey to publicly advise that, for a week, there was a 0.03 to 1% chance of a magnitude-7.0-or-greater quake rocking the Southern San Andreas Fault—an outcome that fortunately didn’t come to pass.

But this megaquake advisory is Japan’s first, and it will have both pros and cons. “There are economic and social consequences to this,” says Bohon. Some confusion about how to respond has been reported, and widespread cancellations of travel to the region will come with a price tag. 

But calm reactions to the advisory seem to be the norm, and (ideally) this advisory will result in an increased understanding of the threat of the Nankai Trough. “It really is about raising awareness,” says Adam Pascale, chief scientist at the Seismology Research Centre in Melbourne, Australia. “It’s got everyone talking. And that’s the point.”

Geoscientists are also increasingly optimistic that the August 8 quake isn’t a harbinger of seismic pandemonium. “This thing is way off to the extreme margin of the actual Nankai rupture zone,” says Tobin—meaning it may not even count as being in the zone of tectonic concern.

A blog post co-authored by Shinji Toda, a seismologist at Tōhoku University in Sendai, Japan, also estimates that any stress transfer to the dangerous parts of the trough is negligible. There is no clear evidence that the plate boundary is acting weirdly. And with each day that goes by, the odds of the August 8 quake being a foreshock drop even further.
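
That day-by-day decline has a well-known statistical shape: the rate of earthquakes triggered by a mainshock decays according to the Omori–Utsu law, sketched here.

```latex
% Omori–Utsu law: rate of triggered quakes at time t after a mainshock
n(t) = \frac{K}{(c + t)^{p}}, \qquad p \approx 1
```

With p near 1, the triggering rate falls off roughly as 1/t, so most of the elevated hazard is concentrated in the first days after the initial quake.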

Tech defenses

But if a megaquake did suddenly emerge, Japan has a technological shield that may mitigate a decent portion of the disaster. 

Buildings are commonly fitted with dampers that allow them to withstand dramatic quake-triggered shaking. And like America’s West Coast, the entire archipelago has a sophisticated earthquake early-warning system: seismometers close to the quake’s origin listen to its seismic screams, and software makes a quick estimate of the magnitude and shaking intensity of the rupture before beaming it to people’s various devices, giving them invaluable seconds to take cover. Automatic countermeasures also slow trains and idle machinery in factories, hospitals, and office buildings to minimize damage from the incoming shaking.
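
Those seconds come from simple physics: alerts travel at effectively the speed of light, while the damaging S waves travel at only a few kilometers per second. A back-of-the-envelope sketch follows; the wave speed and processing delay are typical assumed values, not the specifications of Japan’s actual system.

```python
# Rough earthquake early-warning budget. Assumptions: damaging S waves
# travel ~4 km/s, and detecting the quake plus broadcasting the alert
# takes ~4 s; the alert itself is treated as arriving instantly.
S_WAVE_KM_S = 4.0
PROCESSING_S = 4.0

def warning_seconds(distance_km: float) -> float:
    """Seconds between the alert arriving and strong shaking arriving
    for someone distance_km from the epicenter."""
    return max(0.0, distance_km / S_WAVE_KM_S - PROCESSING_S)

for d in (20, 80, 200):
    print(f"{d:>3} km from the epicenter: ~{warning_seconds(d):.0f} s of warning")
```

The closer you are to the rupture, the less warning you get, which is why the automated countermeasures matter as much as the phone alerts.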

A tsunami early-warning system also kicks into gear if activated, beaming evacuation notices to phones, televisions, radios, sirens, and myriad specialized receivers in buildings in the afflicted region—giving people several minutes to flee. A megaquake advisory may be new, but for a population highly knowledgeable about earthquake and tsunami defense, it’s just another layer of protection.

The advisory has had other effects too: it’s caused those in another imperiled part of the world to take notice. The Cascadia Subduction Zone offshore from the US Pacific Northwest is also capable of producing both titanic quakes and prodigious tsunamis. Its last grand performance, in 1700, created a tsunami that not only inundated large sections of the North American coast but also swamped parts of Japan, all the way across the ocean.

Japan’s megaquake advisory has got Tobin thinking: “What would we do if our subduction zone starts acting weird?” he says. “Weird” could include a magnitude-7.0 quake in the Cascadian depths. “There is not a protocol in place the way there is in Japan.” Tobin speculates that a panel of experts would quickly assemble, and a statement – perhaps not too dissimilar to Japan’s own advisory – would emerge from the U.S. Geological Survey. Like Japan, “we would have to be very forthright about the uncertainty,” he says.

Whether it’s Japan or the US or anywhere else, such advisories aren’t meant to engender panic. “You don’t want people to live their lives in fear,” says Hubbard. But it’s no bad thing to draw attention to the fact that Earth can sometimes be an unforgiving place to live.

Robin George Andrews is an award-winning science journalist and doctor of volcanoes based in London. He regularly writes about the Earth, space, and planetary sciences, and is the author of two critically acclaimed books: Super Volcanoes (2021) and How To Kill An Asteroid (October 2024).

How to fix a Windows PC affected by the global outage

MIT Technology Review Explains: Let our writers untangle the complex, messy world of technology to help you understand what’s coming next. You can read more here.

Windows PCs have crashed in a major IT outage around the world, bringing airlines, major banks, TV broadcasters, health-care providers, and other businesses to a standstill.

Airlines including United, Delta, and American have been forced to ground and delay flights, stranding passengers in airports, while the UK broadcaster Sky News was temporarily pulled off air. Meanwhile, banking customers in Europe, Australia, and India have been unable to access their online accounts. Doctors’ offices and hospitals in the UK have lost access to patient records and appointment scheduling systems.

The problem stems from a defect in a single content update for Windows machines from the cybersecurity provider CrowdStrike. George Kurtz, CrowdStrike’s CEO, says the company is actively working with affected customers.

“This is not a security incident or cyberattack,” he said in a statement on X. “The issue has been identified, isolated and a fix has been deployed. We refer customers to the support portal for the latest updates and will continue to provide complete and continuous updates on our website.” CrowdStrike pointed MIT Technology Review to its blog with additional updates for customers.

What caused the issue?

The issue originates from a faulty update from CrowdStrike, which has knocked affected servers and PCs offline and caused some Windows workstations to display the “blue screen of death” when users attempt to boot them. Mac and Linux hosts are not affected.

The update was intended for CrowdStrike’s Falcon software, which is “endpoint detection and response” software designed to protect companies’ computer systems from cyberattacks and malware. But instead of working as expected, the update caused computers running Windows software to crash and fail to reboot. Home PCs running Windows are less likely to have been affected, because CrowdStrike is predominantly used by large organizations. Microsoft did not immediately respond to a request for comment.

“The CrowdStrike software works at the low-level operating system layer. Issues at this level make the OS not bootable,” says Lukasz Olejnik, an independent cybersecurity researcher and consultant, and author of Philosophy of Cybersecurity.

Not all computers running Windows were affected in the same way, he says, pointing out that any machine that was turned off at the time CrowdStrike pushed out the update (which has since been withdrawn) wouldn’t have received it.

For machines that received the mangled update and have been rebooted, an automated update from CrowdStrike’s server management infrastructure should suffice, he says.

“But in thousands or millions of cases, this may require manual human intervention,” he adds. “That means a really bad weekend ahead for plenty of IT staff.”

How to manually fix your affected computer

There is a known workaround for Windows computers, though it requires administrative access to the machine. If you’re affected and have that level of access, CrowdStrike has recommended the following steps:

1. Boot Windows into safe mode or the Windows Recovery Environment.

2. Navigate to the C:\Windows\System32\drivers\CrowdStrike directory.

3. Locate the file matching “C-00000291*.sys” and delete it.

4. Boot the machine normally.
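
For admins scripting the cleanup, the core of steps 2 and 3 is a single file match and delete. Here is a minimal sketch, assuming Python is available and running with administrator rights; from the Recovery Environment itself, a plain `del` at the command prompt is the more realistic route.

```python
# Minimal sketch of steps 2-3: remove the faulty CrowdStrike channel
# file(s). Run with administrator rights from safe mode. The command-
# prompt equivalent is:
#   del C:\Windows\System32\drivers\CrowdStrike\C-00000291*.sys
import glob
import os

DRIVER_DIR = r"C:\Windows\System32\drivers\CrowdStrike"

for path in glob.glob(os.path.join(DRIVER_DIR, "C-00000291*.sys")):
    os.remove(path)  # delete the defective channel file
    print(f"Deleted {path}")
```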

Sounds simple, right? But while the above fix is fairly easy to administer, it requires hands-on access to each machine, meaning IT teams will need to track down remote machines that have been affected, says Andrew Dwyer of the Department of Information Security at Royal Holloway, University of London.

“We’ve been quite lucky that this is an outage and not an exploitation by a criminal gang or another state,” he says. “It also shows how easy it is to inflict quite significant global damage if you get into the right part of the IT supply chain.”

While fixing the problem is going to cause headaches for IT teams for the next week or so, it’s highly unlikely to cause significant long-term damage to the affected systems—which would not have been the case if it had been ransomware rather than a bungled update, he says.

“If this was a piece of ransomware, there could have been significant outages for months,” he adds. “Without endpoint detection software, many organizations would be in a much more vulnerable place. But they’re critical nodes in the system that have a lot of access to the computer systems that we use.”

Google, Amazon and the problem with Big Tech’s climate claims

MIT Technology Review Explains: Let our writers untangle the complex, messy world of technology to help you understand what’s coming next. You can read more from the series here.

Last week, Amazon trumpeted that it had purchased enough clean electricity to cover the energy demands of all the offices, data centers, grocery stores, and warehouses across its global operations, seven years ahead of its sustainability target. 

That news closely followed Google’s acknowledgment that the soaring energy demands of its AI operations helped ratchet up its corporate emissions by 13% last year—and that it had backed away from claims that it was already carbon neutral.

If you were to take the announcements at face value, you’d be forgiven for believing that Google is stumbling while Amazon is speeding ahead in the race to clean up climate pollution. 

But while both companies are coming up short in their own ways, Google’s approach to driving down greenhouse-gas emissions is now arguably more defensible. 

In fact, there’s a growing consensus that how a company gets to net zero is more important than how fast it does so. And a new school of thought is emerging that moves beyond the net-zero model of corporate climate action, arguing that companies should focus on achieving broader climate impacts rather than trying to balance out every ton of carbon dioxide they emit. 

But to understand why, let’s first examine how the two tech giants’ approaches stack up, and where company climate strategies often go wrong.

Perverse incentives

The core problem is that the costs and complexity of net-zero emissions plans, which require companies to cut or cancel out every ton of climate pollution across their supply chains, can create perverse incentives. Corporate sustainability officers often end up pursuing the quickest, cheapest ways of cleaning up a company’s pollution on paper, rather than the most reliable ways of reducing its emissions in the real world. 

That may mean buying inexpensive carbon credits to offset ongoing pollution from their direct operations or those of their suppliers, rather than undertaking the tougher task of slashing those emissions at the source. Those programs can involve paying other parties to plant trees, restore coastal ecosystems, or alter agricultural practices in ways that purport to reduce emissions or pull carbon dioxide out of the air. The snag is, numerous studies and investigative stories have shown that such efforts often overstate the climate benefits, sometimes wildly.

Net-zero goals can also compel companies to buy what are known as renewable energy credits (RECs), which ostensibly support additional generation of renewable electricity but raise similar concerns that the climate gains are overstated.

The argument for RECs is that companies often can’t purchase a pure stream of clean electricity to power their operations, since grid operators rely on a mix of natural gas, coal, solar, wind, and other sources. But if those businesses provide money or an indication of demand that spurs developers to build new renewables projects and generate more clean electricity than they would have otherwise, the companies can then claim this cancels out ongoing pollution from the electricity they use.

Experts, however, are less and less convinced of the value of RECs at this stage.

The claim that clean-energy projects wouldn’t have been built without that added support is increasingly unconvincing in a world where those facilities can easily compete in the marketplace on their own, Emily Grubert, an associate professor at Notre Dame, previously told me. And if a company’s purchase of such credits doesn’t bring about changes that reduce the emissions in the atmosphere, it can’t balance out the company’s ongoing pollution. 

‘Creative accounting’

For its part, Amazon is relying on both carbon credits and RECs. 

In its sustainability report, the company says that it reached its clean-electricity targets and drove down emissions by improving energy efficiency, buying more carbon-free power, building renewables projects at its facilities, and supporting such projects around the world. It did this in part by “purchasing additional environmental attributes (such as renewable energy credits) to signal our support for renewable energy in the grids where we operate, in line with the expected generation of the projects we have contracted.”

But there’s yet another issue that can arise when a company pays for clean power that it’s not directly consuming, whether through RECs or through power purchase agreements made before a project is built: Merely paying for renewable electricity generation that occurred at some point, somewhere in the world, isn’t the same as procuring the amount of electricity that the company consumed in the specific places and times that it did so. As you may have heard, the sun stops shining and the wind stops blowing, even as Amazon workers and operations keep grinding around the world and around the clock. 

Paying a solar-farm operator some additional money for producing electricity it was already going to generate in the middle of the day doesn’t in any meaningful way reverse the emissions that an Amazon fulfillment center or server farm produces by, say, drawing electricity from a natural-gas power plant two states away in the middle of the night. 

“The reality on the ground is that its data centers are driving up demand for fossil fuels,” argued a report last week from Amazon Employees for Climate Justice, a group of workers that has been pushing the company to take more aggressive action on climate change. 

The organization said that a significant share of Amazon’s RECs aren’t driving development of new projects. It also stressed that those payments and projects often aren’t generating electricity in the same areas and at the same times that Amazon is consuming power.

The employee group estimates that 78% of Amazon’s US energy comes from nonrenewable sources and accuses the company of using “creative accounting” to claim it’s reached its clean-electricity goals.

To its credit, Amazon is investing billions of dollars in renewables, electrifying its fleet of delivery vehicles, and otherwise making real strides in reducing its waste and emissions. In addition, it’s lobbying US legislators to make it easier to permit electric transmission projects, funding more reliable forms of carbon removal, and working to diversify its mix of electricity sources. The company also insists it’s being careful and selective about the types of carbon offsets it supports, investing only in “additional, quantifiable, real, permanent, and socially beneficial” projects.

“Amazon is focused on making the grid cleaner and more reliable for everyone,” the company said in response to an inquiry from MIT Technology Review. “An emissions-first approach is the fastest, most cost-effective and scalable way to leverage corporate clean-energy procurement to help decarbonize global power grids. This includes procuring renewable energy in locations and countries that still rely heavily on fossil fuels to power their grids, and where energy projects can have the biggest impact on carbon reduction.”

The company has adopted what’s known as a “carbon matching” approach (which it lays out further here), stressing that it wants to be sure the emissions reduced through its investments in renewables equal or exceed the emissions it continues to produce. 

But a recent study led by Princeton researchers found that carbon matching had a “minimal impact” on long-term power system emissions, because it rarely helps get projects built or clean energy generated where those things wouldn’t have happened anyway.

“It’s an offsetting scheme at its core,” Wilson Ricks, an author of the study and an energy systems researcher at Princeton, said of the method, without commenting on Amazon specifically. 

(Meta, Salesforce, and General Motors have also embraced this model, the study notes.)

The problem with asserting that a company is effectively running entirely on clean electricity, when it’s not doing so directly and may not be doing so completely, is that it removes any pressure to finish the job for real.

Backing off claims of carbon neutrality

Google has made its own questionable climate claims over the years as well, and it faces growing challenges as the energy it uses for artificial intelligence soars. 

But it is striving to address its power consumption in arguably more defensible ways and now appears to be taking some notable course-correcting steps, according to its recent sustainability report.

Google says that it’s no longer buying carbon credits that purport to prevent emissions. With this change, it has also backed away from the claim that it had already achieved carbon neutrality across its operations years ago.

“We’re no longer procuring carbon avoidance credits year-over-year to compensate for our annual operational emissions,” the company told MIT Technology Review in a statement. “We’re instead focusing on accelerating an array of carbon solutions and partnerships that will help us work toward our net-zero goal, while simultaneously helping develop broader solutions to mitigate climate change.”

Notably, that includes funding the development of more expensive but possibly more reliable ways of pulling greenhouse gas out of the atmosphere through direct-air-capture machines or other methods. The company pledged $200 million to Frontier, an effort to pay startups in advance for carbon dioxide they will eventually draw down and store.

Those commitments may not allow the company to make any assertions about its own emissions today, and some of the early-stage approaches it funds might not work at all. But the hope is that these sorts of investments could help stand up a carbon removal industry, which studies find may be essential for keeping warming in check over the coming decades. 

Clean power around the clock

In addition, for several years now Google has worked to purchase or otherwise support generation of clean power in the areas where it operates and across every hour that it consumes electricity—an increasingly popular approach known as 24/7 carbon-free energy.

The idea is that this will stimulate greater development of what grid operators increasingly need: forms of carbon-free energy that can run at all hours of the day (commonly called “firm generation”), matching up with the actual hour-by-hour energy demands of corporations. That can include geothermal plants, nuclear reactors, hydroelectric plants, and more.

More than 150 organizations and governments have now signed the 24/7 Carbon-Free Energy Compact, a pledge to ensure that clean-electricity purchases match up hourly with their consumption. Those include Google, Microsoft, SAP, and Rivian.

The Princeton study notes that hourly matching is more expensive than other approaches but finds that it drives “significant reductions in system-level CO2 emissions” while “incentivizing advanced clean firm generation and long-duration storage technologies that would not otherwise see market uptake.”
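
The gap between annual matching and hourly matching is easy to see in miniature. Here is a toy sketch with made-up numbers, not any company’s actual data:

```python
# Toy comparison of annual vs 24/7 (hourly) clean-energy matching.
# All figures are illustrative only.
load      = [10, 10, 10, 10]   # MWh consumed in four sample hours
solar_gen = [ 0, 25, 15,  0]   # MWh of contracted solar in those hours

# Annual (volumetric) matching: total clean MWh vs total load.
annual_pct = 100 * min(sum(solar_gen) / sum(load), 1.0)

# 24/7 matching: clean energy only counts in the hour it is generated.
hourly_matched = sum(min(g, l) for g, l in zip(solar_gen, load))
hourly_pct = 100 * hourly_matched / sum(load)

print(f"Annual matching: {annual_pct:.0f}% 'clean'")  # 100%
print(f"Hourly matching: {hourly_pct:.0f}% clean")    # 50%
```

On the annual books this company looks 100 percent clean, yet the hours with no sun ran entirely on the grid’s mix of sources; that gap is exactly what hourly matching exposes.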

In Google’s case, pursuing 24/7 matching has steered the company to support more renewables projects in the areas where it operates and to invest in more energy storage projects. It has also entered into purchase agreements with power plants that can deliver carbon-free electricity around the clock. These include several deals with Fervo Energy, an enhanced-geothermal startup.

The company says its goal is to achieve net-zero emissions across its supply chains by 2030, with all its electricity use synced up, hour by hour, with clean sources across every grid it operates on.

Energy-hungry AI

Which brings us back to the growing problem of AI energy consumption.

Jonathan Koomey, an independent researcher studying the energy demands of computing, argues that the hue and cry over rising electricity use for AI is overblown. He notes that AI accounts for only a sliver of overall energy consumption from information technology, which produces about 1.4% of global emissions.

But major data center companies like Google, Amazon, and others will need to make significant changes to ensure that they stay ahead of rising AI-driven energy use while keeping on track with their climate goals.

They will have to improve overall energy efficiency, procure more clean energy, and use their clout as major employers to push utilities to increase carbon-free generation in the areas where they operate, he says. But the clear focus must be on directly cutting corporate climate pollution, not mucking around with RECs and offsets.

“Reduce your emissions; that’s it,” Koomey says. “We need actual, real, meaningful emissions reductions, not trading around credits that have, at best, an ambiguous effect.”

Google says it’s already making progress on its AI footprint, while stressing that it’s leveraging artificial intelligence to find ways to drive down climate pollution across sectors. Those include efforts like Tapestry, a project within the company’s X “moonshot factory” to create more efficient and reliable electricity grids, as well as a Google Research collaboration to determine airline flight paths that produce fewer heat-trapping cirrus clouds.

“AI holds immense promise to drive climate action,” the company said in its report.

The contribution model

The contrasting approaches of Google and Amazon call to mind an instructive hypothetical that a team of carbon-market researchers sketched out in a paper this January. They noted that one company could do the hard, expensive work of directly eliminating nearly every ton of its emissions, while another could simply buy cheap offsets to purportedly address all of its own. In that case the first company would have done more actual good for the climate, but only the second would be able to say it had reached its net-zero target.

Given these challenges and the perverse incentives driving companies toward cheap offsets, the authors have begun arguing for a different approach, known as the “contribution model.”

Like Koomey and others, they stress that companies should dedicate most of their money and energy to directly cutting their emissions as much as possible. But they assert that companies should adopt a new way of dealing with what’s left over (either because that remaining pollution is occurring outside their direct operations or because there are not yet affordable, emissions-free alternatives).

Instead of trying to cancel out every ongoing ton of emissions, a company might pick a percentage of its revenue or set a defensible carbon price on those tons, and then dedicate all that money toward achieving the maximum climate benefit the money can buy, says Libby Blanchard, a research scholar at the University of Cambridge. (She coauthored the paper on the contribution model with Barbara Haya of the University of California, Berkeley, and Bill Anderegg at the University of Utah.)

That could mean funding well-managed forestry projects that help trap carbon dioxide, protect biodiversity, and improve air and water quality. It could mean supporting research and development on the technologies still needed to slow global warming and efforts to scale them up, as Google seems to be doing. Or it could even mean lobbying for stricter climate laws, since few things can drive change as quickly as public policy. 

But the key difference is that the company won’t be able to claim that those actions canceled out every ton of remaining emissions—only that it took real, responsible steps to “contribute” to addressing the problem of climate change. 

The hope is that this approach frees companies to focus on the quality of the projects they fund, not the quantity of cheap offsets they buy, Blanchard says.

It could “replace this race to the bottom with a race to the top,” she says.

As with any approach put before profit-motivated companies that employ ranks of savvy accountants and attorneys, there will surely be ways to abuse this method in the absence of appropriate safeguards and oversight.

And plenty of companies may refuse to adopt it, since they won’t be able to claim they’ve achieved net-zero emissions, which has become the de facto standard for corporate climate action.

But Blanchard says there’s one obvious incentive for them to move away from that goal.

“There’s way less risk that they’ll be sued or accused of greenwashing,” she says.
