
For fame or a death wish? Kids’ TikTok challenge injuries stump psychiatrists

By: Beth Mole
6 November 2024 at 23:19

Kids and teens can make some pretty harebrained choices. But when a kid's choice is to engage in a TikTok challenge that threatens their life, psychiatrists can struggle to determine whether it was just an exasperatingly poor choice born of impulsivity and immaturity or something darker—an actual suicide attempt.

In a Viewpoint published today in JAMA Psychiatry, two psychiatrists from the University of Tennessee Health Science Center at Memphis raise the alarm about the dangers and complexities of TikTok challenges. They're an "emerging public health concern" for kids, the psychiatrists write, and they're blurring the lines between unintentional injuries and suicide attempts in children and teens.

The child and adolescent psychiatrists Onomeasike Ataga and Valerie Arnold say that their psychiatry team first saw injuries from TikTok challenges during the COVID-19 pandemic, but the trend has continued since the pandemic eased. Over recent years, they've seen children and teens hospitalized from a variety of challenges, including the "blackout challenge," in which participants attempt to choke themselves until they pass out; the "Benadryl challenge," in which participants ingest a large amount of the allergy medicine to get high and hallucinate; and the "fire challenge," in which participants pour a flammable liquid on their body and light it on fire. In these cases, the psychiatry team is sometimes called in to help assess whether the children and teens had an intent to self-harm. It's often hard to determine—and thus hard to decide on treatment recommendations.


If Trump dismantles NOAA, it will affect wildfires and food prices

As the Popo Agie River wends its way down from the glaciers atop Wyoming’s Wind River Mountains toward the city of Lander, it flows into a limestone cave and disappears. The formation, known as the Sinks, spits the river back out at another feature called the Rise a quarter of a mile east, a little more voluminous and a little warmer, with brown and rainbow trout weighing as much as 10 pounds mingling in its now smooth pools. The quarter-mile journey from the Sinks to the Rise takes the river two hours.

Scientists first discovered this quirk of the middle fork of the Popo Agie (pronounced puh-po zuh) in 1983 by pouring red dye into the river upstream and waiting for it to resurface. Geologists attribute the river’s mysterious delay to the water passing through exceedingly small crevices in the rock that slow its flow.

Like many rivers in the arid West, the Popo Agie is an important water source. Ranchers, farmers, businesses, and recreationists rely on detailed data about it—especially day-to-day streamflow measurements. That’s exactly the type of empirical information collected by the National Oceanic and Atmospheric Administration (NOAA).


ELSA Speak

31 October 2024 at 13:30

An AI-based English learning app, ELSA leverages machine learning and advanced speech recognition technology to improve users’ English pronunciation and fluency. Their suite of products includes Speech Analyzer, which provides detailed feedback on overall speaking proficiency, and ELSA AI, offering personalized conversations with AI to enhance communication skills. The ELSA Speech API allows their technology to integrate with other platforms and applications.

Millions of users have improved their skills using the app, and now there’s ELSA AI—a tool to help people learn and practice natural conversational English using Generative (Voice) AI, available right at their fingertips through a mobile device.

The platform not only offers roleplay scenarios for learners to listen and reply to; it also provides in-depth, accurate English fluency analysis and feedback to ensure progress. By using voice rather than text alone, ELSA AI helps learners master English fluency and build their confidence.

For these reasons and more, ELSA Speak earned a Cool Tool Award (finalist) for “Best AI Solution” as part of The EdTech Awards 2024 from EdTech Digest. Learn more. 


New Carrier Fluid Makes Hydrogen Way Easier to Transport



Imagine pulling up to a refueling station and filling your vehicle’s tank with liquid hydrogen, as safe and convenient to handle as gasoline or diesel, without the need for high-pressure tanks or cryogenic storage. This vision of a sustainable future could become a reality if a Calgary, Canada–based company, Ayrton Energy, can scale up its innovative method of hydrogen storage and distribution. Ayrton’s technology could make hydrogen a viable, one-to-one replacement for fossil fuels in existing infrastructure like pipelines, fuel tankers, rail cars, and trucks.

The company’s approach is to use liquid organic hydrogen carriers (LOHCs) to make it easier to transport and store hydrogen. The method chemically bonds hydrogen to carrier molecules, which absorb hydrogen molecules and make them more stable—kind of like hydrogenating cooking oil to produce margarine.

A researcher pours a sample of Ayrton’s LOHC fluid into a vial. (Ayrton Energy)

The approach would allow liquid hydrogen to be transported and stored in ambient conditions, rather than in the high-pressure, cryogenic tanks (to hold it at temperatures below -252 °C) currently required for keeping hydrogen in liquid form. It would also be a big improvement on gaseous hydrogen, which is highly volatile and difficult to keep contained.

Founded in 2021, Ayrton is one of several companies across the globe developing LOHCs, including Japan’s Chiyoda and Mitsubishi, Germany’s Covalion, and China’s Hynertech. But toxicity, energy density, and input energy issues have limited LOHCs as contenders for making liquid hydrogen feasible. Ayrton says its formulation eliminates these trade-offs.

Safe, Efficient Hydrogen Fuel for Vehicles

Conventional LOHC technologies used by most of the aforementioned companies rely on substances such as toluene, which forms methylcyclohexane when hydrogenated. These carriers pose safety risks due to their flammability and volatility. Hydrogenious LOHC Technologies in Erlangen, Germany, and other hydrogen fuel companies have shifted toward dibenzyltoluene, a more stable carrier that holds more hydrogen per unit volume than methylcyclohexane, though it requires higher temperatures (and thus more energy) to bind and release the hydrogen. Dibenzyltoluene hydrogenation occurs at between 3 and 10 megapascals (30 and 100 bar) and 200–300 °C, compared with 10 MPa (100 bar) and just under 200 °C for methylcyclohexane.

Ayrton’s proprietary oil-based hydrogen carrier not only captures and releases hydrogen with less input energy than is required for other LOHCs, but also stores more hydrogen than methylcyclohexane can—55 kilograms per cubic meter compared with methylcyclohexane’s 50 kg/m³. Dibenzyltoluene holds more hydrogen per unit volume (up to 65 kg/m³), but Ayrton’s approach to infusing the carrier with hydrogen atoms promises to cost less. Hydrogenation or dehydrogenation with Ayrton’s carrier fluid occurs at 0.1 megapascal (1 bar) and about 100 °C, says founder and CEO Natasha Kostenuk. And as with the other LOHCs, after hydrogenation it can be transported and stored at ambient temperatures and pressures.
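
To put those volumetric figures in context, here is a rough back-of-the-envelope comparison. Only the three storage densities come from the reporting above; the tanker volume and the hydrogen heating value are assumptions added for illustration.

```python
# Rough comparison of the volumetric hydrogen densities quoted above.
# The 30 m^3 tanker volume and the heating value are illustrative assumptions.

CARRIER_DENSITY_KG_PER_M3 = {
    "Ayrton LOHC": 55,        # kg of H2 per m^3 of carrier (company figure)
    "Methylcyclohexane": 50,
    "Dibenzyltoluene": 65,
}

TANKER_VOLUME_M3 = 30         # assumed tanker capacity, for illustration only
H2_LHV_MJ_PER_KG = 120        # approximate lower heating value of hydrogen

for carrier, density in CARRIER_DENSITY_KG_PER_M3.items():
    h2_kg = density * TANKER_VOLUME_M3
    energy_gj = h2_kg * H2_LHV_MJ_PER_KG / 1000
    print(f"{carrier:18s}: {h2_kg:5.0f} kg H2 per load (~{energy_gj:.0f} GJ)")
```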

“Judges described [Ayrton’s approach] as a critical technology for the deployment of hydrogen at large scale.” —Katie Richardson, National Renewable Energy Lab

Ayrton’s LOHC fluid is as safe to handle as margarine, but it’s still a chemical, says Kostenuk. “I wouldn’t drink it. If you did, you wouldn’t feel very good. But it’s not lethal,” she says.

Kostenuk and fellow Ayrton cofounder Brandy Kinkead (who serves as the company’s chief technical officer) were originally trying to bring hydrogen generators to market to fill gaps in the electrical grid. “We were looking for fuel cells and hydrogen storage. Fuel cells were easy to find, but we couldn’t find a hydrogen storage method or medium that would be safe and easy to transport to fuel our vision of what we were trying to do with hydrogen generators,” Kostenuk says. During the search, they came across LOHC technology but weren’t satisfied with the trade-offs demanded by existing liquid hydrogen carriers. “We had the idea that we could do it better,” she says. The duo pivoted, adjusting their focus from hydrogen generators to hydrogen storage solutions.

“Everybody gets excited about hydrogen production and hydrogen end use, but they forget that you have to store and manage the hydrogen,” Kostenuk says. Incompatibility with current storage and distribution has been a barrier to adoption, she says. “We’re really excited about being able to reuse existing infrastructure that’s in place all over the world.” Ayrton’s hydrogenated liquid has fuel-cell-grade (99.999 percent) hydrogen purity, so there’s no advantage in using pure liquid hydrogen with its need for subzero temperatures, according to the company.

The main challenge the company faces is the set of issues that come along with any technology scaling up from pilot-stage production to commercial manufacturing, says Kostenuk. “A crucial part of that is aligning ourselves with the right manufacturing partners along the way,” she notes.

Asked about how Ayrton is dealing with some other challenges common to LOHCs, Kostenuk says Ayrton has managed to sidestep them. “We stayed away from materials that are expensive and hard to procure, which will help us avoid any supply chain issues,” she says. By performing the reactions at such low temperatures, Ayrton can get its carrier fluid to withstand 1,000 hydrogenation-dehydrogenation cycles before it no longer holds enough hydrogen to be useful. Conventional LOHCs are limited to a couple of hundred cycles before the high temperatures required for bonding and releasing the hydrogen break down the fluid and diminish its storage capacity, Kostenuk says.

Breakthrough in Hydrogen Storage Technology

In acknowledgement of what Ayrton’s nontoxic, oil-based carrier fluid could mean for the energy and transportation sectors, the U.S. National Renewable Energy Lab (NREL) at its annual Industry Growth Forum in May named Ayrton an “outstanding early-stage venture.” A selection committee of more than 180 climate tech and cleantech investors and industry experts chose Ayrton from a pool of more than 200 initial applicants, says Katie Richardson, group manager of NREL’s Innovation and Entrepreneurship Center, which organized the forum. The committee based its decision on the company’s innovation, market positioning, business model, team, next steps for funding, technology, capital use, and quality of pitch presentation. “Judges described Ayrton’s approach as a critical technology for the deployment of hydrogen at large scale,” Richardson says.

As a next step toward enabling hydrogen to push gasoline and diesel aside, “we’re talking with hydrogen producers who are right now offering their customers cryogenic and compressed hydrogen,” says Kostenuk. “If they offered LOHC, it would enable them to deliver across longer distances, in larger volumes, in a multimodal way.” The company is also talking to some industrial site owners who could use the hydrogenated LOHC for buffer storage to hold onto some of the energy they’re getting from clean, intermittent sources like solar and wind. Another natural fit, she says, is energy service providers that are looking for a reliable method of seasonal storage beyond what batteries can offer. The goal is to eventually scale up enough to become the go-to alternative (or perhaps the standard) fuel for cars, trucks, trains, and ships.

The AI Boom Rests on Billions of Tonnes of Concrete



Along the country road that leads to ATL4, a giant data center going up east of Atlanta, dozens of parked cars and pickups lean precariously on the narrow dirt shoulders. The many out-of-state plates are typical of the phalanx of tradespeople who muster for these massive construction jobs. With tech giants, utilities, and governments budgeting upwards of US $1 trillion for capital expansion to join the global battle for AI dominance, data centers are the bunkers, factories, and skunkworks—and concrete and electricity are the fuel and ammunition.

To the casual observer, the data industry can seem incorporeal, its products conjured out of weightless bits. But as I stand beside the busy construction site for DataBank’s ATL4, what impresses me most is the gargantuan amount of material—mostly concrete—that gives shape to the goliath that will house, secure, power, and cool the hardware of AI. Big data is big concrete. And that poses a big problem.

This article is part of our special report, “Reinventing Invention: Stories from Innovation’s Edge.”

Concrete is not just a major ingredient in data centers and the power plants being built to energize them. As the world’s most widely manufactured material, concrete—and especially the cement within it—is also a major contributor to climate change, accounting for around 6 percent of global greenhouse gas emissions. Data centers use so much concrete that the construction boom is wrecking tech giants’ commitments to eliminate their carbon emissions. Even though Google, Meta, and Microsoft have touted goals to be carbon neutral or negative by 2030, and Amazon by 2040, the industry is now moving in the wrong direction.

Last year, Microsoft’s carbon emissions jumped by over 30 percent, primarily due to the materials in its new data centers. Google’s greenhouse emissions are up by nearly 50 percent over the past five years. As data centers proliferate worldwide, Morgan Stanley projects that data centers will release about 2.5 billion tonnes of CO2 each year by 2030—or about 40 percent of what the United States currently emits from all sources.

But even as innovations in AI and the big-data construction boom are boosting emissions for the tech industry’s hyperscalers, the reinvention of concrete could also play a big part in solving the problem. Over the last decade, there’s been a wave of innovation, some of it profit-driven, some of it from academic labs, aimed at fixing concrete’s carbon problem. Pilot plants are being fielded to capture CO2 from cement plants and sock it safely away. Other projects are cooking up climate-friendlier recipes for cements. And AI and other computational tools are illuminating ways to drastically cut carbon by using less cement in concrete and less concrete in data centers, power plants, and other structures.

Demand for green concrete is clearly growing. Amazon, Google, Meta, and Microsoft recently joined an initiative led by the Open Compute Project Foundation to accelerate testing and deployment of low-carbon concrete in data centers, for example. Supply is increasing, too—though it’s still minuscule compared to humanity’s enormous appetite for moldable rock. But if the green goals of big tech can jump-start innovation in low-carbon concrete and create a robust market for it as well, the boom in big data could eventually become a boon for the planet.

Hyperscaler Data Centers: So Much Concrete

At the construction site for ATL4, I’m met by Tony Qorri, the company’s big, friendly, straight-talking head of construction. He says that this giant building and four others DataBank has recently built or is planning in the Atlanta area will together add 133,000 square meters (1.44 million square feet) of floor space.

They all follow a universal template that Qorri developed to optimize the construction of the company’s ever-larger centers. At each site, trucks haul in more than a thousand prefabricated concrete pieces: wall panels, columns, and other structural elements. Workers quickly assemble the precision-measured parts. Hundreds of electricians swarm the building to wire it up in just a few days. Speed is crucial when construction delays can mean losing ground in the AI battle.

The ATL4 data center outside Atlanta is one of five being built by DataBank. Together they will add over 130,000 square meters of floor space. (DataBank)

That battle can be measured in new data centers and floor space. The United States is home to more than 5,000 data centers today, and the Department of Commerce forecasts that number to grow by around 450 a year through 2030. Worldwide, the number of data centers now exceeds 10,000, and analysts project another 26.5 million m² of floor space over the next five years. Here in metro Atlanta, developers broke ground last year on projects that will triple the region’s data-center capacity. Microsoft, for instance, is planning a 186,000-m² complex; big enough to house around 100,000 rack-mounted servers, it will consume 324 megawatts of electricity.

The velocity of the data-center boom means that no one is pausing to await greener cement. For now, the industry’s mantra is “Build, baby, build.”

“There’s no good substitute for concrete in these projects,” says Aaron Grubbs, a structural engineer at ATL4. The latest processors going on the racks are bigger, heavier, hotter, and far more power hungry than previous generations. As a result, “you add a lot of columns,” Grubbs says.

1,000 Companies Working on Green Concrete

Concrete may not seem an obvious star in the story of how electricity and electronics have permeated modern life. Other materials—copper and silicon, aluminum and lithium—get higher billing. But concrete provides the literal, indispensable foundation for the world’s electrical workings. It is the solid, stable, durable, fire-resistant stuff that makes power generation and distribution possible. It undergirds nearly all advanced manufacturing and telecommunications. What was true in the rapid build-out of the power industry a century ago remains true today for the data industry: Technological progress begets more growth—and more concrete. Although each generation of processor and memory squeezes more computing onto each chip, and advances in superconducting microcircuitry raise the tantalizing prospect of slashing the data center’s footprint, Qorri doesn’t think his buildings will shrink to the size of a shoebox anytime soon. “I’ve been through that kind of change before, and it seems the need for space just grows with it,” he says.

By weight, concrete is not a particularly carbon-intensive material. Creating a kilogram of steel, for instance, releases about 2.4 times as much CO2 as a kilogram of cement does. But the global construction industry consumes about 35 billion tonnes of concrete a year. That’s about 4 tonnes for every person on the planet and twice as much as all other building materials combined. It’s that massive scale—and the associated cost and sheer number of producers—that creates both a threat to the climate and inertia that resists change.
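
A quick sanity check of that scale, assuming a world population of roughly 8 billion (a round figure not given in the article):

```python
# Back-of-the-envelope check of the per-person figure above.
concrete_tonnes_per_year = 35e9   # global concrete consumption, from the article
world_population = 8e9            # assumed round number

print(f"~{concrete_tonnes_per_year / world_population:.1f} tonnes of concrete per person per year")
# -> roughly 4.4 tonnes, consistent with "about 4 tonnes for every person"
```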

At its Edmonton, Alberta, plant, Heidelberg Materials is adding systems to capture carbon dioxide produced by the manufacture of Portland cement. (Heidelberg Materials North America)

Yet change is afoot. When I visited the innovation center operated by the Swiss materials giant Holcim, in Lyon, France, research executives told me about the database they’ve assembled of nearly 1,000 companies working to decarbonize cement and concrete. None yet has enough traction to measurably reduce global concrete emissions. But the innovators hope that the boom in data centers—and in associated infrastructure such as new nuclear reactors and offshore wind farms, where each turbine foundation can use up to 7,500 cubic meters of concrete—may finally push green cement and concrete beyond labs, startups, and pilot plants.

Why cement production emits so much carbon

Though the terms “cement” and “concrete” are often conflated, they are not the same thing. A popular analogy in the industry is that cement is the egg in the concrete cake. Here’s the basic recipe: Blend cement with larger amounts of sand and other aggregates. Then add water, to trigger a chemical reaction with the cement. Wait a while for the cement to form a matrix that pulls all the components together. Let sit as it cures into a rock-solid mass.

Portland cement, the key binder in most of the world’s concrete, was serendipitously invented in England by William Aspdin, while he was tinkering with earlier mortars that his father, Joseph, had patented in 1824. More than a century of science has revealed the essential chemistry of how cement works in concrete, but new findings are still leading to important innovations, as well as insights into how concrete absorbs atmospheric carbon as it ages.

As in the Aspdins’ day, the process to make Portland cement still begins with limestone, a sedimentary mineral made from crystalline forms of calcium carbonate. Most of the limestone quarried for cement originated hundreds of millions of years ago, when ocean creatures mineralized calcium and carbonate in seawater to make shells, bones, corals, and other hard bits.

Cement producers often build their large plants next to limestone quarries that can supply decades’ worth of stone. The stone is crushed and then heated in stages as it is combined with lesser amounts of other minerals that typically include calcium, silicon, aluminum, and iron. What emerges from the mixing and cooking are small, hard nodules called clinker. A bit more processing, grinding, and mixing turns those pellets into powdered Portland cement, which accounts for about 90 percent of the CO2 emitted by the production of conventional concrete [see infographic, “Roads to Cleaner Concrete”].

Karen Scrivener, shown in her lab at EPFL, has developed concrete recipes that reduce emissions by 30 to 40 percent. (Stefan Wermuth/Bloomberg/Getty Images)

Decarbonizing Portland cement is often called heavy industry’s “hard problem” because of two processes fundamental to its manufacture. The first process is combustion: To coax limestone’s chemical transformation into clinker, large heaters and kilns must sustain temperatures around 1,500 °C. Currently that means burning coal, coke, fuel oil, or natural gas, often along with waste plastics and tires. The exhaust from those fires generates 35 to 50 percent of the cement industry’s emissions. Most of the remaining emissions result from gaseous CO2 liberated by the chemical transformation of the calcium carbonate (CaCO3) into calcium oxide (CaO), a process called calcination. That gas also usually heads straight into the atmosphere.
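
The process emissions from calcination follow directly from the reaction CaCO3 → CaO + CO2. Here is a minimal stoichiometry sketch, assuming pure calcium carbonate (real limestone and real kiln feeds vary):

```python
# Stoichiometry of calcination: CaCO3 -> CaO + CO2.
M_CACO3 = 100.09   # g/mol, calcium carbonate
M_CAO   = 56.08    # g/mol, calcium oxide (lime)
M_CO2   = 44.01    # g/mol, carbon dioxide

co2_per_tonne_limestone  = M_CO2 / M_CACO3   # tonnes of CO2 per tonne of CaCO3
lime_per_tonne_limestone = M_CAO / M_CACO3   # tonnes of CaO per tonne of CaCO3

print(f"~{co2_per_tonne_limestone:.2f} t CO2 and ~{lime_per_tonne_limestone:.2f} t CaO "
      f"per tonne of CaCO3 calcined")
# Roughly 0.44 t of process CO2 per tonne of pure limestone, before counting
# any of the fuel burned to heat the kiln.
```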

Concrete production, in contrast, is mainly a business of mixing cement powder with other ingredients and then delivering the slurry speedily to its destination before it sets. Most concrete in the United States is prepared to order at batch plants—souped-up materials depots where the ingredients are combined, dosed out from hoppers into special mixer trucks, and then driven to job sites. Because concrete grows too stiff to work after about 90 minutes, concrete production is highly local. There are more ready-mix batch plants in the United States than there are Burger King restaurants.

Batch plants can offer thousands of potential mixes, customized to fit the demands of different jobs. Concrete in a hundred-story building differs from that in a swimming pool. With flexibility to vary the quality of sand and the size of the stone—and to add a wide variety of chemicals—batch plants have more tricks for lowering carbon emissions than any cement plant does.

Cement plants that capture carbon

China accounts for more than half of the concrete produced and used in the world, but companies there are hard to track. Outside of China, the top three multinational cement producers—Holcim, Heidelberg Materials in Germany, and Cemex in Mexico—have launched pilot programs to snare CO2 emissions before they escape and then bury the waste deep underground. To do that, they’re taking carbon capture and storage (CCS) technology already used in the oil and gas industry and bolting it onto their cement plants.

These pilot programs will need to scale up without eating profits—something that eluded the coal industry when it tried CCS decades ago. Tough questions also remain about where exactly to store billions of tonnes of CO2 safely, year after year.

The appeal of CCS for cement producers is that they can continue using existing plants while still making progress toward carbon neutrality, which trade associations have committed to reach by 2050. But with well over 3,000 plants around the world, adding CCS to all of them would take enormous investment. Currently less than 1 percent of the global supply is low-emission cement. Accenture, a consultancy, estimates that outfitting the whole industry for carbon capture could cost up to $900 billion.

“The economics of carbon capture is a monster,” says Rick Chalaturnyk, a professor of geotechnical engineering at the University of Alberta, in Edmonton, Canada, who studies carbon capture in the petroleum and power industries. He sees incentives for the early movers on CCS, however. “If Heidelberg, for example, wins the race to the lowest carbon, it will be the first [cement] company able to supply those customers that demand low-carbon products”—customers such as hyperscalers.

Though cement companies seem unlikely to invest their own billions in CCS, generous government subsidies have enticed several to begin pilot projects. Heidelberg has announced plans to start capturing CO2 from its Edmonton operations in late 2026, transforming it into what the company claims would be “the world’s first full-scale net-zero cement plant.” Exhaust gas will run through stations that purify the CO2 and compress it into a liquid, which will then be transported to chemical plants to turn it into products or to depleted oil and gas reservoirs for injection underground, where hopefully it will stay put for an epoch or two.

Chalaturnyk says that the scale of the Edmonton plant, which aims to capture a million tonnes of CO2 a year, is big enough to give CCS technology a reasonable test. Proving the economics is another matter. Half the $1 billion cost for the Edmonton project is being paid by the governments of Canada and Alberta.

ROADS TO CLEANER CONCRETE


As the big-data construction boom boosts the tech industry’s emissions, the reinvention of concrete could play a major role in solving the problem.

• CONCRETE TODAY Most of the greenhouse emissions from concrete come from the production of Portland cement, which requires high heat and releases carbon dioxide (CO2) directly into the air.

• CONCRETE TOMORROW At each stage of cement and concrete production, advances in ingredients, energy supplies, and uses of concrete promise to reduce waste and pollution.


The U.S. Department of Energy has similarly offered Heidelberg up to $500 million to help cover the cost of attaching CCS to its Mitchell, Ind., plant and burying up to 2 million tonnes of CO2 per year below the plant. And the European Union has gone even bigger, allocating nearly €1.5 billion ($1.6 billion) from its Innovation Fund to support carbon capture at cement plants in seven of its member nations.

These tests are encouraging, but they are all happening in rich countries, where demand for concrete peaked decades ago. Even in China, concrete production has started to flatten. All the growth in global demand through 2040 is expected to come from less-affluent countries, where populations are still growing and quickly urbanizing. According to projections by the Rhodium Group, cement production in those regions is likely to rise from around 30 percent of the world’s supply today to 50 percent by 2050 and 80 percent before the end of the century.

So will rich-world CCS technology translate to the rest of the world? I asked Juan Esteban Calle Restrepo, the CEO of Cementos Argos, the leading cement producer in Colombia, about that when I sat down with him recently at his office in Medellín. He was frank. “Carbon capture may work for the U.S. or Europe, but countries like ours cannot afford that,” he said.

Better cement through chemistry

As long as cement plants run limestone through fossil-fueled kilns, they will generate excessive amounts of carbon dioxide. But there may be ways to ditch the limestone—and the kilns. Labs and startups have been finding replacements for limestone, such as calcined kaolin clay and fly ash, that don’t release CO2 when heated. Kaolin clays are abundant around the world and have been used for centuries in Chinese porcelain and more recently in cosmetics and paper. Fly ash—a messy, toxic by-product of coal-fired power plants—is cheap and still widely available, even as coal power dwindles in many regions.

At the Swiss Federal Institute of Technology Lausanne (EPFL), Karen Scrivener and colleagues developed cements that blend calcined kaolin clay and ground limestone with a small portion of clinker. Calcining clay can be done at temperatures low enough that electricity from renewable sources can do the job. Various studies have found that the blend, known as LC3, can reduce overall emissions by 30 to 40 percent compared to those of Portland cement.
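
The arithmetic behind that reduction is straightforward: most of a cement's footprint scales with its clinker fraction. The sketch below makes that explicit; the emission factors and blend proportions are assumptions chosen for illustration, not EPFL's published LC3 figures.

```python
# Illustrative estimate of how cutting the clinker fraction cuts emissions.
# All factors below are assumptions for the sketch, not measured LC3 data.

CLINKER_CO2_PER_TONNE          = 0.80  # assumed t CO2/t clinker (calcination + fuel)
CALCINED_CLAY_CO2_PER_TONNE    = 0.30  # assumed: lower-temperature calcination
LIMESTONE_FILLER_CO2_PER_TONNE = 0.02  # assumed: grinding only, no calcination

def cement_co2(clinker, clay, limestone):
    """Approximate t CO2 per t of blended cement for the given mass fractions."""
    return (clinker * CLINKER_CO2_PER_TONNE
            + clay * CALCINED_CLAY_CO2_PER_TONNE
            + limestone * LIMESTONE_FILLER_CO2_PER_TONNE)

ordinary = cement_co2(clinker=0.95, clay=0.00, limestone=0.05)
blended  = cement_co2(clinker=0.50, clay=0.30, limestone=0.20)

print(f"reduction: {(1 - blended / ordinary) * 100:.0f}%")  # ~35% with these assumptions
```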

LC3 is also cheaper to make than Portland cement and performs as well for nearly all common uses. As a result, calcined clay plants have popped up across Africa, Europe, and Latin America. In Colombia, Cementos Argos is already producing more than 2 million tonnes of the stuff annually. The World Economic Forum’s Centre for Energy and Materials counts LC3 among the best hopes for the decarbonization of concrete. Wide adoption by the cement industry, the centre reckons, “can help prevent up to 500 million tonnes of CO2 emissions by 2030.”

In a win-win for the environment, fly ash can also be used as a building block for low- and even zero-emission concrete, and the high heat of processing neutralizes many of the toxins it contains. Ancient Romans used volcanic ash to make slow-setting but durable concrete: The Pantheon, built nearly two millennia ago with ash-based cement, is still in great shape.

Coal fly ash is a cost-effective ingredient that has reactive properties similar to those of Roman cement and Portland cement. Many concrete plants already add fresh fly ash to their concrete mixes, replacing 15 to 35 percent of the cement. The ash improves the workability of the concrete, and though the resulting concrete is not as strong for the first few months, it grows stronger than regular concrete as it ages, like the Pantheon.

University labs have tested concretes made entirely with fly ash and found that some actually outperform the standard variety. More than 15 years ago, researchers at Montana State University used concrete made with 100 percent fly ash in the floors and walls of a credit union and a transportation research center. But performance depends greatly on the chemical makeup of the ash, which varies from one coal plant to the next, and on following a tricky recipe. The decommissioning of coal-fired plants has also been making fresh fly ash scarcer and more expensive.

At Sublime Systems’ pilot plant in Massachusetts, the company is using electrochemistry instead of heat to produce lime silicate cements that can replace Portland cement. (Tony Luong)

That has spurred new methods to treat and use fly ash that’s been buried in landfills or dumped into ponds. Such industrial burial grounds hold enough fly ash to make concrete for decades, even after every coal plant shuts down. Utah-based Eco Material Technologies is now producing cements that include both fresh and recovered fly ash as ingredients. The company claims it can replace up to 60 percent of the Portland cement in concrete—and that a new variety, suitable for 3D printing, can substitute entirely for Portland cement.

Hive 3D Builders, a Houston-based startup, has been feeding that low-emissions concrete into robots that are printing houses in several Texas developments. “We are 100 percent Portland cement–free,” says Timothy Lankau, Hive 3D’s CEO. “We want our homes to last 1,000 years.”

Sublime Systems, a startup spun out of MIT by battery scientists, uses electrochemistry rather than heat to make low-carbon cement from rocks that don’t contain carbon. Similar to a battery, Sublime’s process uses a voltage between an anode and a cathode to create a pH gradient that isolates silicates and reactive calcium, in the form of lime (CaO). The company mixes those ingredients together to make a cement with no fugitive carbon, no kilns or furnaces, and binding power comparable to that of Portland cement. With the help of $87 million from the U.S. Department of Energy, Sublime is building a plant in Holyoke, Mass., that will be powered almost entirely by hydroelectricity. Recently the company was tapped to provide concrete for a major offshore wind farm planned off the coast of Martha’s Vineyard.

Software takes on the hard problem of concrete

It is unlikely that any one innovation will allow the cement industry to hit its target of carbon neutrality before 2050. New technologies take time to mature, scale up, and become cost-competitive. In the meantime, says Philippe Block, a structural engineer at ETH Zurich, smart engineering can reduce carbon emissions through the leaner use of materials.

His research group has developed digital design tools that make clever use of geometry to maximize the strength of concrete structures while minimizing their mass. The team’s designs start with the soaring architectural elements of ancient temples, cathedrals, and mosques—in particular, vaults and arches—which they miniaturize and flatten and then 3D print or mold inside concrete floors and ceilings. The lightweight slabs, suitable for the upper stories of apartment and office buildings, use much less concrete and steel reinforcement and have a CO2 footprint that’s reduced by 80 percent.

There’s hidden magic in such lean design. In multistory buildings, much of the mass of concrete is needed just to hold the weight of the material above it. The carbon savings of Block’s lighter slabs thus compound, because the size, cost, and emissions of a building’s conventional-concrete elements are slashed.

Vaulted, a Swiss startup, uses digital design tools to minimize the concrete in floors and ceilings, cutting their CO2 footprint by 80 percent. (Vaulted)

In Dübendorf, Switzerland, a wildly shaped experimental building has floors, roofs, and ceilings created by Block’s structural system. Vaulted, a startup spun out of ETH, is engineering and fabricating the lighter floors of a 10-story office building under construction in Zug, Switzerland.

That country has also been a leader in smart ways to recycle and reuse concrete, rather than simply landfilling demolition rubble. This is easier said than done—concrete is tough stuff, riddled with rebar. But there’s an economic incentive: Raw materials such as sand and limestone are becoming scarcer and more costly. Some jurisdictions in Europe now require that new buildings be made from recycled and reused materials. The new addition of the Kunsthaus Zürich museum, a showcase of exquisite Modernist architecture, uses recycled material for all but 2 percent of its concrete.

As new policies goose demand for recycled materials and threaten to restrict future use of Portland cement across Europe, Holcim has begun building recycling plants that can reclaim cement clinker from old concrete. It recently turned the demolition rubble from some 1960s apartment buildings outside Paris into part of a 220-unit housing complex—touted as the first building made from 100 percent recycled concrete. The company says it plans to build concrete recycling centers in every major metro area in Europe and, by 2030, to include 30 percent recycled material in all of its cement.

Further innovations in low-carbon concrete are certain to come, particularly as the powers of machine learning are applied to the problem. Over the past decade, the number of research papers reporting on computational tools to explore the vast space of possible concrete mixes has grown exponentially. Much as AI is being used to accelerate drug discovery, the tools learn from huge databases of proven cement mixes and then apply their inferences to evaluate untested mixes.

Researchers from the University of Illinois and Chicago-based Ozinga, one of the largest private concrete producers in the United States, recently worked with Meta to feed 1,030 known concrete mixes into an AI. The project yielded a novel mix that will be used for sections of a data-center complex in DeKalb, Ill. The AI-derived concrete has a carbon footprint 40 percent lower than the conventional concrete used on the rest of the site. Ryan Cialdella, Ozinga’s vice president of innovation, smiles as he notes the virtuous circle: AI systems that live in data centers can now help cut emissions from the concrete that houses them.
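
The general workflow, training a surrogate model on known mixes and then screening candidates for adequate strength at lower embodied carbon, can be sketched in a few lines of Python. Everything below (the feature set, the synthetic data, the thresholds, and the emission factors) is invented for illustration; it is not the actual Meta, Ozinga, and University of Illinois pipeline.

```python
# Toy sketch: fit a surrogate strength model on known mixes, then screen
# candidate mixes for a strength target and rank them by an embodied-CO2 proxy.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical training set: columns are [cement, fly_ash, water, aggregate]
# in kg/m^3, with a made-up strength response standing in for lab data.
low, high = [200, 0, 140, 1700], [450, 200, 200, 2000]
X = rng.uniform(low, high, size=(1030, 4))
strength_mpa = 0.12 * X[:, 0] + 0.05 * X[:, 1] - 0.15 * X[:, 2] + rng.normal(0, 2, 1030)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, strength_mpa)

# Screen random candidates: keep mixes predicted to meet the strength target,
# then sort by a crude CO2 proxy dominated by cement content (assumed factors).
candidates = rng.uniform(low, high, size=(5000, 4))
meets_target = model.predict(candidates) >= 32.0                # MPa, illustrative
co2_proxy = 0.9 * candidates[:, 0] + 0.02 * candidates[:, 1]    # kg CO2 per m^3
ranked = candidates[meets_target][np.argsort(co2_proxy[meets_target])]
print("lowest-carbon mixes that still meet the target:\n", ranked[:5])
```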

A sustainable foundation for the information age

Cheap, durable, and abundant yet unsustainable, concrete made with Portland cement has been one of modern technology’s Faustian bargains. The built world is on track to double in floor space by 2060, adding 230,000 km², or more than half the area of California. Much of that will house the 2 billion more people we are likely to add to our numbers. As global transportation, telecom, energy, and computing networks grow, their new appendages will rest upon concrete. But if concrete doesn’t change, we will perversely be forced to produce even more concrete to protect ourselves from the coming climate chaos, with its rising seas, fires, and extreme weather.

The AI-driven boom in data centers is a strange bargain of its own. In the future, AI may help us live even more prosperously, or it may undermine our freedoms, civilities, employment opportunities, and environment. But solutions to the bad climate bargain that AI’s data centers foist on the planet are at hand, if there’s a will to deploy them. Hyperscalers and governments are among the few organizations with the clout to rapidly change what kinds of cement and concrete the world uses, and how those are made. With a pivot to sustainability, concrete’s unique scale makes it one of the few materials that could do most to protect the world’s natural systems. We can’t live without concrete—but with some ambitious reinvention, we can thrive with it.

This article was updated on 04 November 2024.

TSA silent on CrowdStrike’s claim Delta skipped required security update

29 October 2024 at 20:36

Delta and CrowdStrike have locked legal horns, threatening to drag out the aftermath of the worst IT outage in history for months or possibly years.

Each refuses to be blamed for Delta's substantial losses following a global IT outage caused by CrowdStrike suddenly pushing a flawed security update despite Delta and many other customers turning off auto-updates.

CrowdStrike has since given customers more control over updates and made other commitments to ensure an outage of that scale will never happen again, but Delta isn't satisfied. The airline has accused CrowdStrike of willfully causing losses by knowingly deceiving customers by failing to disclose an unauthorized door into their operating systems that enabled the outage.


How can you write data to DNA without changing the base sequence?

29 October 2024 at 15:32

Zettabytes—that’s 10²¹ bytes—of data are currently generated every year. All of those cat videos have to be stored somewhere, and DNA is a great storage medium; it has amazing data density and is stable over millennia.

To date, people have encoded information into DNA the same way nature has, by linking the four nucleotide bases that make up DNA—A, T, C, and G—into a particular genetic sequence. Making these sequences is time-consuming and expensive, though, and the longer your sequence, the higher the chance that errors will creep in.

But DNA has an added layer of information encoded on top of the nucleotide sequence, known as epigenetics. These are chemical modifications to the nucleotides, specifically altering a C when it comes before a G. In cells, these modifications function kind of like stage directions; they can tell the cell when to use a particular DNA sequence without altering the “text” of the sequence itself. A new paper in Nature describes using epigenetics to store information in DNA without needing to synthesize new DNA sequences every time.
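
The core idea can be sketched in a few lines: instead of writing data into the base sequence, bits are written as methylation marks on the CpG sites of a fixed template strand. The template and encoding below are invented for illustration and are not the scheme used in the Nature paper.

```python
# Minimal sketch: store bits as methylation marks at CpG sites of a fixed,
# prefabricated template strand (no new sequence synthesis needed).
# The template and the one-bit-per-site encoding are illustrative only.

TEMPLATE = "ACGTCGGATCGTTACGCGATCGAA"  # hypothetical reusable template

def cpg_sites(seq):
    """Indices of the C in each CG dinucleotide: the writable positions."""
    return [i for i in range(len(seq) - 1) if seq[i:i + 2] == "CG"]

def write_bits(seq, bits):
    """Return the set of positions to methylate so that bit 1 = methylated C."""
    sites = cpg_sites(seq)
    if len(bits) > len(sites):
        raise ValueError("template has too few CpG sites for this payload")
    return {site for site, bit in zip(sites, bits) if bit}

def read_bits(seq, methylated, n_bits):
    """Recover the bit string by checking which CpG sites carry a mark."""
    return [1 if site in methylated else 0 for site in cpg_sites(seq)[:n_bits]]

payload = [1, 0, 1, 1, 0]
marks = write_bits(TEMPLATE, payload)
assert read_bits(TEMPLATE, marks, len(payload)) == payload
print(f"{len(cpg_sites(TEMPLATE))} writable CpG sites; stored bits: {payload}")
```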


The Patent Battle That Won’t Quit



Just before this special issue on invention went to press, I got a message from IEEE senior member and patent attorney George Macdonald. Nearly two decades after I first reported on Corliss Orville “Cob” Burandt’s struggle with the U.S. Patent and Trademark Office, the 77-year-old inventor’s patent case was being revived.

From 1981 to 1990, Burandt had received a dozen U.S. patents for improvements to automotive engines, including his 1990 patent for variable valve-timing technology (U.S. Patent No. 4,961,406A). But he failed to convince any automakers to license his technology. What’s worse, he claims, some of the world’s major carmakers now use his inventions in their hybrid engines.

Shortly after reading my piece in 2005, Macdonald stepped forward to represent Burandt. By then, the inventor had already lost his patents because he hadn’t paid the US $40,000 in maintenance fees to keep them active.

Macdonald filed a petition to pay the maintenance fees late and another to revive a related child case. The maintenance fee petition was denied in 2006. While the petition to revive was still pending, Macdonald passed the maintenance fee baton to Hunton Andrews Kurth (HAK), which took the case pro bono. HAK attorneys argued that the USPTO should reinstate the 1990 parent patent.

The timing was crucial: If the parent patent was reinstated before 2008, Burandt would have had the opportunity to compel infringing corporations to pay him licensing fees. Unfortunately, for reasons that remain unclear, the patent office tried to paper Burandt’s legal team to death, Macdonald says. HAK could go no further in the maintenance-fee case after the U.S. Supreme Court declined to hear it in 2009.

Then, in 2010, the USPTO belatedly revived Burandt’s child continuation application. A continuation application lets an inventor add claims to their original patent application while maintaining the earlier filing date—1988 in this case.

However, this revival came with its own set of challenges. Macdonald was informed in 2011 that the patent examiner would issue the patent but later discovered that the application was placed into a then-secret program called the Sensitive Application Warning System (SAWS) instead. While touted as a way to quash applications for things like perpetual-motion machines, the SAWS process effectively slowed action on Burandt’s case.

After several more years of motions and rulings, Macdonald met IEEE Member Edward Pennington, who agreed to represent Burandt. Earlier this year, Pennington filed a complaint in the Eastern District of Virginia seeking the issuance of Burandt’s patent on the grounds that it was wrongfully denied.

As of this writing, Burandt still hasn’t seen a dime from his inventions. He subsists on his social security benefits. And while his case raises important questions about fairness, transparency, and the rights of individual inventors, Pennington says his client isn’t interested in becoming a poster boy for poor inventors.

“We’re not out to change policy at the patent office or to give Mr. Burandt a framed copy of the patent to say, ‘Look at me, I’m an inventor,’ ” says Pennington. “This is just to say, ‘Here’s a guy that would like to benefit from his idea.’ It just so happens that he’s pretty much in need. And even the slightest royalty would go a long ways for the guy.”

Kremlin-backed hackers have new Windows and Android malware to foist on Ukrainian foes

28 October 2024 at 18:58

Google researchers said they uncovered a Kremlin-backed operation targeting recruits for the Ukrainian military with information-stealing malware for Windows and Android devices.

The malware, spread primarily through posts on Telegram, came from a persona on that platform known as "Civil Defense." Posts on the @civildefense_com_ua Telegram channel and the accompanying civildefense[.]com.ua website claimed to provide potential conscripts with free software for finding user-sourced locations of Ukrainian military recruiters. In fact, the software, available for both Windows and Android, installed infostealers. Google tracks the Kremlin-aligned threat group as UNC5812.

Dual espionage and influence campaign

"The ultimate aim of the campaign is to have victims navigate to the UNC5812-controlled 'Civil Defense' website, which advertises several different software programs for different operating systems," Google researchers wrote. "When installed, these programs result in the download of various commodity malware families."


Swing Education

28 October 2024 at 13:30

There is a significant shortage of full-time teachers around the country. To keep classrooms staffed, schools rely on substitute teachers to fill in when teachers are sick or otherwise unable to come to school. But substitute teaching jobs are tough to recruit for, manage, and fill. One complication is that a teacher absence often isn’t discovered until the morning it happens, leaving schools scrambling to post an opening. That unavoidable reality, coupled with a dramatic shortage of qualified, ready-to-go substitutes, is a significant problem in the nation’s schools.

The process for filling short-term or even long-term teacher absences with substitutes has not changed in more than 50 years. Even modern communication methods such as emails, text messages, and robocalls have not made the anxious, uncertain process of finding substitute teachers any more effective.

Enter Swing. Swing brings a 21st-century solution to a 20th-century problem, leveraging social communities and technology not only to improve communication but to put power back into the hands of substitutes and the schools that need them. Instead of schools scrambling to find anyone who will answer the phone, and subs being left in the dark about whether there is one opening today or ten, both stakeholders now have power: schools can match the best person to a particular need, and subs can pick the best job for their day or week ahead.

For these reasons and more, Swing Education earned a Cool Tool Award (finalist) for “Best Administrative Solution” as part of The EdTech Awards 2024 from EdTech Digest. Learn more.


A how-to for ethical geoengineering research

26 October 2024 at 13:21

Over the Northern Hemisphere's summer, the world's temperatures hovered near 1.5° C above pre-industrial levels, and the catastrophic weather events that ensued provided a preview of what is expected to become the new normal before mid-century. And the warming won't stop there; on our current emissions trajectory, we will double that temperature increase by the time the century is out and keep warming beyond its end.

This frightening trajectory and its results have led many people to argue that some form of geoengineering is necessary. If we know the effects of that much warming will be catastrophic, why not try canceling some of it out? Unfortunately, the list of "why nots" includes the fact that we don't know how well some of these techniques work or fully understand their unintended consequences. This means more research is required before we put them into practice.

But how do we do that research if there's the risk of unintended consequences? To help guide the process, the American Geophysical Union (AGU) has just released guidelines for ensuring that geoengineering research is conducted ethically.


Autistic Children’s Fascination with Letters Offers Learning Clues

25 October 2024 at 21:22

A recent study highlights that many autistic children show an intense interest in letters and numbers, which may play a distinct role in their language development. By analyzing clinical records and interviewing parents, researchers found that 37% of autistic children had a strong interest in letters, in contrast to just 3% of non-autistic peers. This interest often emerges around 30 months, aligning with typical development timelines but diverging in how it's applied.

Newborns’ Brains Recognize Complex Sound Patterns

25 October 2024 at 20:16

New research shows that newborns can detect complex sound patterns that follow non-adjacent, language-like rules, suggesting that the ability to process such sequences is innate. Using near-infrared spectroscopy, researchers observed newborn brain responses to sequences of tones, finding that infants could distinguish between correct and incorrect patterns. The study found that this early ability activates language-related networks, particularly in the left hemisphere, highlighting a foundation for future language skills.

With four more years like 2023, carbon emissions will blow past 1.5° limit

24 October 2024 at 22:23

On Thursday, the United Nations Environment Programme (UNEP) released a report on what it terms the "emissions gap"—the difference between where we're heading and where we'd need to be to achieve the goals set out in the Paris Agreement. It makes for some pretty grim reading. Given last year's greenhouse gas emissions, we can afford fewer than four similar years before we would exceed the total emissions compatible with limiting the planet's warming to 1.5° C above pre-industrial conditions. Following existing policies out to the turn of the century would leave us facing over 3° C of warming.

The report ascribes this situation to two distinct emissions gaps: between the goals of the Paris Agreement and what countries have pledged to do and between their pledges and the policies they've actually put in place. There are some reasons to think that rapid progress could be made—the six largest greenhouse gas emitters accounted for nearly two-thirds of the global emissions, so it wouldn't take many policy changes to make a big difference. And the report suggests increased deployment of wind and solar could handle over a quarter of the needed emissions reductions.

But so far, progress has been far too limited to cut into global emissions.


A Picture Is Worth 4.6 Terabits



Clark Johnson says he has wanted to be a scientist ever since he was 3. At age 8, he got bored with a telegraph-building kit he received as a gift and repurposed it into a telephone. By age 12, he set his sights on studying physics because he wanted to understand how things worked at the most basic level.

“I thought, mistakenly at the time, that physicists were attuned to the left ear of God,” Johnson says.

Clark Johnson

Employer: Wave Domain
Title: CFO
Member grade: Life Fellow

After graduating at age 19 with a bachelor’s degree in physics in 1950 from the University of Minnesota Twin Cities, he was planning to go to graduate school when he got a call from the head of the physics section at 3M’s R&D laboratory with a job offer. Tempted by the promise of doing things with his own hands, Johnson accepted the role of physicist at the company’s facility in St. Paul, Minn. Thus began his more than seven-decade-long career as an electrical engineer, inventor, and entrepreneur—which continues to this day.

Johnson, an IEEE Life Fellow, is an active member of the IEEE Magnetics Society and served as its 1983–1984 president.

He was on the science committee of the U.S. House of Representatives, and then was recruited by the Advanced Research Projects Agency (ARPA) and assigned to assist in MIT’s Research Program on Communications Policy, where he contributed to the development of HDTV.

He went on to help found Wave Domain in Monson, Mass. Johnson and his Wave Domain collaborators have been granted six patents for their latest invention, a standing-wave storage (SWS) system that houses archival data in a low-energy-use, tamper-proof way using antiquated photography technology.

3M, HDTV, and a career full of color

3M turned out to be fertile ground for Johnson’s creativity.

“You could spend 15 percent of your time working on things you liked,” he says. “The president of the company believed that new ideas sort of sprung out of nothing, and if you poked around, you might come across something that could be useful.”

Johnson’s poking around led him to contribute to developing an audio tape cartridge and Scotchlite, the reflective film seen on roads, signs, and more.

In 1989 he was tapped to be an IEEE Congressional Fellow. He chose to work with Rep. George Brown Jr., a Democrat representing the 42nd district in central California. Brown was a ranking member of the House committee on science, space, and technology, which oversees almost all non-defense and non-health related research.

“It was probably the most exciting year of my entire life,” Johnson says.

While on the science committee, he met Richard Jay Solomon, who was associate director of MIT’s Research Program on Communications Policy, testifying for the committee on video and telecom issues. Solomon’s background is diverse. He studied physics and electrical engineering in the early 1960s at Brooklyn Polytechnic and general science at New York University. Before becoming a research associate at MIT in 1969, he held a variety of positions. He ran a magazine about scientific photography, and he founded a business that provided consulting on urban planning and transportation. He authored four textbooks on transportation planning, three of which were published by the American Society of Civil Engineers. At the magazine, Solomon gained insights into arcane, long-forgotten 19th-century photographic processes that turned out to be useful in future inventions.

Johnson and Solomon bonded over their shared interest in trains. Johnson’s refurbished Pullman car has traveled some 850,000 miles across the continental U.S. (Clark Johnson)

Johnson and Solomon clicked over a shared interest in trains. At the time they met, Johnson owned a railway car that was parked in the District of Columbia’s Union Station, and he used it to travel throughout North America, covering some 850,000 miles before selling the car in 2019. Johnson and Solomon shared many trips aboard the refurbished Pullman car.

Now they are collaborators on a new method to store big data in a tamperproof, zero-energy-cost medium.

Conventional storage devices such as solid-state drives and hard disks take energy to maintain, and they might degrade over time, but Johnson says the technique he, Solomon, and collaborators developed requires virtually no energy and can remain intact for centuries under most conditions.

Long before collaborating on their latest project, Johnson and Solomon teamed up on another high-profile endeavor: the development of HDTV. The project arose through their work on the congressional science committee.

In the late 1980s, engineers in Japan were working on developing an analog high-definition television system.

“My boss on the science committee said, ‘We really can’t let the Japanese do this. There’s all this digital technology and digital computers. We’ve got to do this digitally,’” Johnson says.

That spawned a collaborative project funded by NASA and ARPA (the predecessor of modern-day DARPA). After Johnson’s tenure on the science committee ended, he and Solomon joined a team at MIT that participated in the collaboration. As they developed what would become the dominant TV technology, Johnson and Solomon became experts in optics. Working with Polaroid, IBM, and Philips in 1992, the team demonstrated the world’s first digital, progressive-scan, high-definition camera at the annual National Association of Broadcasters conference.

A serendipitous discovery

Around 2000, Johnson and Solomon, along with a new colleague, Eric Rosenthal, began working as independent consultants to NASA and the U.S. Department of Defense. Rosenthal had been a vice president of research and development at Walt Disney Imagineering and general manager of audiovisual systems engineering at ABC television before joining forces with Johnson and Solomon.

While working on one DARPA-funded project, Solomon stumbled upon a page in a century-old optics textbook that caught his eye. It described a method developed by noted physicist Gabriel Lippmann for producing color photographs. Instead of using film or dyes, Lippmann created photos by using a glass plate coated with a specially formulated silver halide emulsion.

When the plate was exposed to a bright, sunlit scene, the full spectrum of light reflected off a mercury-based mirror coating on the back of the glass, creating standing waves of the detected colors inside the emulsion layer. The silver grains at the brightest parts of those standing waves became oxidized, as if remembering the precise colors they saw. (That is in stark contrast to traditional color photography and television, which store only the red, green, and blue parts of the spectrum.) Chemical processing then turned the oxidized silver halide grains black, leaving the light waves imprinted in the medium in a way that is nearly impossible to tamper with. Lippmann received the 1908 Nobel Prize in Physics for his work.
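
For readers who want a number to attach to that mechanism, the standard standing-wave relation from textbook optics offers one. This is our gloss, not a formula taken from Lippmann’s papers or Wave Domain’s patents, and the emulsion values below are assumptions:

d = \frac{\lambda}{2n}

Here d is the spacing between successive bright planes recorded in the emulsion, \lambda is the light’s wavelength in vacuum, and n is the emulsion’s refractive index. For green light at about 550 nanometers in gelatin with n of roughly 1.5, adjacent planes sit about 180 nanometers apart, which is why each color leaves its own distinctive fine layering in the silver.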

Lippmann’s photography technique never achieved commercial success, because there was no practical way to duplicate or print the images, and the emulsions of the day required extremely bright light for a scene to imprint properly in the medium.

Nevertheless, Solomon was impressed with the durability of the resulting image. He explained the process to his colleagues, who recognized the possibility of using the technique to store information for archival purposes. Johnson saw Lippmann’s old photographs at the Museum for Photography, in Lausanne, Switzerland, where he noticed that the colors appeared clear and intense despite being more than a century old.

The silver halide method stuck with Solomon, and in 2013 he and Johnson returned to Lippmann’s emulsion photography technique.

“We got to talking about how we could take all this information we knew about color and use it for something,” Johnson says.

Data in space and on land

While Rosenthal was visiting the International Space Station headquarters in Huntsville, Ala., in 2013, a top scientist said, “‘The data stored on the station gets erased every 24 hours by cosmic rays,’” Rosenthal recalls. “‘And we have to keep rewriting the data over and over and over again.’” Cosmic rays and solar flares can damage electronic components, causing errors or outright erasures on hard disks and other traditional data storage systems.

Rosenthal, Johnson, and Solomon knew that properly processed silver halide photographs would be immune to such hazards, including electromagnetic pulses from nuclear explosions. The team examined Lippmann’s photographic emulsion anew.

Solomon’s son, Brian Solomon, a professional photographer and a specialist in making photographic emulsions, also was concerned about the durability of conventional dye-based color photographs, which tend to start fading after a few decades.

The team came up with an intriguing idea: Given how durable Lippmann’s photographs appeared to be, what if they could use a similar technique—not for making analog images but for storing digital data? Thus began their newest engineering endeavor: changing how archival data—data that doesn’t need to be overwritten but simply preserved and read occasionally—is stored.

[Diagram: The standing-wave storage technique works by shining bright LEDs onto a specially formulated emulsion of silver grains in gelatin. The light reflects off the substrate layer (which could be air) and forms standing waves in the emulsion. The standing waves oxidize the silver grains at their peaks, and a chemical process turns the oxidized grains black, imprinting the pattern of colors into the medium. Credit: Wave Domain]

Conventionally stored data is sometimes protected by making multiple copies or by continuously rewriting it, Johnson says. Both techniques require energy, though, and can be labor-intensive.

The amount of data that needs to be stored on land is also growing by leaps and bounds. The market for data centers and other artificial intelligence infrastructure is growing at an annual rate of 44 percent, according to Data Bridge Market Research. Commonly used hard drives and solid-state drives consume some power even when they are not in use; standby consumption typically runs between 0.05 and 2.5 watts per drive. And data centers contain an enormous number of drives, which require tremendous amounts of electricity to keep running.
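
To put those standby figures in perspective, here is a rough, back-of-the-envelope calculation in Python. The fleet size and per-drive draw are illustrative assumptions, not numbers from Johnson’s team or Data Bridge Market Research:

drives = 1_000_000        # hypothetical archival fleet size (assumption)
standby_watts = 1.0       # assumed per-drive standby draw, within the 0.05-2.5 W range cited above
hours_per_year = 24 * 365

# Energy spent just keeping idle drives powered, in kilowatt-hours per year.
standby_kwh = drives * standby_watts * hours_per_year / 1000
print(f"{standby_kwh:,.0f} kWh per year")   # 8,760,000 kWh, or about 8.8 gigawatt-hours

Even at the low end of that range, 0.05 watts per drive, the same hypothetical fleet would still consume roughly 438,000 kilowatt-hours a year doing nothing.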

Johnson estimates that about 25 percent of the data held in today’s data centers is archival in nature, meaning it will not need to be overwritten.

The ‘write once, read forever’ technology

The technology Johnson, Solomon, and their collaborators have developed promises to overcome the energy requirements and vulnerabilities of traditional data storage for archival applications.

The design builds on Lippmann’s idea. Instead of taking an analog photograph, the team divided the medium into pixels. With the help of emulsion specialist Yves Gentet, they improved Lippmann’s emulsion chemistry, making it more sensitive and capable of storing multiple wavelengths at each pixel location. The final emulsion is a combination of silver halide and extremely hardened gelatin. Their technique can now store up to four distinct narrow-band, superimposed colors in each pixel.

[Diagram: The standing-wave storage technique can store up to four colors out of a possible 32 at each pixel location, adding up to an astounding storage capacity of 4.6 terabits (or roughly 300 movies) in the area of a single photograph. Credit: Wave Domain]

“The textbooks say that’s impossible,” Solomon says, “but we did it, so the textbooks are wrong.”

For each pixel, they can choose up to four colors out of a possible 32 to store.

That amounts to more than 40,000 possibilities. Thus, the technique can store more than 40,000 bits (although the format need not be binary) in each 10-square-micrometer pixel, or 4.6 terabits on a 10.16-by-12.7-centimeter modified Lippmann plate. That’s more than 300 movies’ worth of data stored in a single picture.
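
The counting behind those figures can be checked in a few lines of Python. The sketch below assumes each pixel may record any combination of up to four of the 32 colors, including none; it is our reading of the scheme, not code from Wave Domain:

from math import comb, log2

colors, max_per_pixel = 32, 4

# Count the subsets of at most four colors drawn from a 32-color palette.
states = sum(comb(colors, k) for k in range(max_per_pixel + 1))
print(states)                   # 41449 -> "more than 40,000 possibilities"
print(round(log2(states), 1))   # 15.3 -> binary digits needed to index one such state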

To write on the SWS medium, the plate—coated with a thin layer of the specially formulated emulsion—is exposed to light from an array of powerful color LEDs.

That way, the entire plate is written simultaneously, greatly reducing the writing time per pixel.

The plate then gets developed through a chemical process that blackens the exposed silver grains, memorizing the waves of color it was exposed to.

Finally, a small charge-coupled-device (CCD) camera array, like those used in cellphones, reads out the information. The readout occurs for the entire plate at once, so the readout rate, like the writing rate, is fast.

“The data that we read is coming off the plate at such a high bandwidth,” Solomon says. “There is no computer on the planet that can absorb it without some buffering.”

The entire memory cell is a sandwich of the LED array, the photosensitive plate, and the CCD. All the elements use off-the-shelf parts.
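
As a thought experiment, the sandwich can also be pictured in code. Wave Domain has not published its encoding scheme, so the mapping below is entirely hypothetical; it simply shows how digital symbols could, in principle, be translated to and from per-pixel color subsets:

from itertools import combinations

N_COLORS = 32       # assumed palette of narrow-band wavelengths
MAX_COLORS = 4      # at most four superimposed colors per pixel

# Enumerate every subset of up to four colors once, in a fixed order.
# The position of a subset in this list is the "symbol" a pixel stores.
SUBSETS = [frozenset(c)
           for k in range(MAX_COLORS + 1)
           for c in combinations(range(N_COLORS), k)]
INDEX = {s: i for i, s in enumerate(SUBSETS)}    # 41,449 symbols in total

def encode(symbol: int) -> frozenset:
    """Choose which LED colors to expose for a given symbol value."""
    return SUBSETS[symbol]

def decode(detected: frozenset) -> int:
    """Recover the symbol from the colors the readout camera reports."""
    return INDEX[detected]

assert decode(encode(12345)) == 12345   # round trip for an arbitrary symbol
print(len(SUBSETS))                     # 41449 distinguishable states per pixel

A real system would also need error correction and careful calibration of the detected wavelengths; none of that is shown here.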

“We took a long time to figure out how to make this in a very inexpensive, reproducible, quick way,” Johnson says. “The idea is to use readily available parts.” The entire storage medium, along with its read/write infrastructure, is relatively inexpensive and portable.

To test the durability of their storage method, the team sent their collaborators at NASA some 150 samples of their SWS devices, which astronauts hung outside the International Space Station for nine months in 2019. After the plates were returned from space, the team tested the integrity of the stored data and compared it with that of another 150 plates kept in Rosenthal’s lab on the ground.

“There was absolutely zero degradation from nine months of exposure to cosmic rays,” Solomon says. Meanwhile, the plates in Rosenthal’s lab were crawling with bacteria, while the ISS plates came back sterile. Because silver is a known bactericide, though, the recorded colors were unharmed, Solomon says.

Their most recent patent, granted earlier this year, describes a method of storing data that requires no power to maintain when not actively reading or writing data. Team members say the technique is incorruptible: It is immune to moisture, solar flares, cosmic rays, and other kinds of radiation. So, they argue, it can be used both in space and on land as a durable, low-cost archival data solution.

Passing on the torch

The new invention has many potential applications. In addition to data centers and space missions, Johnson says, scientific enterprises such as the Rubin Observatory, being built in Chile, will produce massive amounts of archival data that could benefit from SWS technology.

“It’s all reference data, and it’s an extraordinary amount of data that’s being generated every week that needs to be kept forever,” Johnson says.

Johnson says, however, that he and his team will not be the ones to bring the technology to market: “I’m 94 years old, and my two partners are in their 70s and 80s. We’re not about to start a company.”

He is ready to pass on the torch. The team is seeking a new chief executive to head up Wave Domain, which they hope will continue the development of SWS and bring it to mass adoption.

Johnson says he has learned that people rarely know which new technologies will eventually have the most impact. Perhaps, though few people are aware of it now, storing big data using old photographic technology will become an unexpected success.

The Future of Learning: Empowering the Next Generation to Lead the Digital Age

24 October 2024 at 12:30

A student’s perspective on where the future of learning—with AI—should be headed.  

GUEST COLUMN | by Conrad Ingersoll Dube and William Saulsbery

Education systems, especially K-12, are the foundation of society’s future, meant to equip students with the knowledge and skills that reflect the present, grounded in lessons from the past, to prepare them for tomorrow. Yet, as the world evolves at an unprecedented pace, we’re still clinging to outdated teaching methods from decades ago. It’s time to question whether we’re truly preparing the next generation for the challenges and opportunities of the future.

‘It’s time to question whether we’re truly preparing the next generation for the challenges and opportunities of the future.’

Modes of imparting education have improved. Digital, internet, and media technologies are frequently employed in classrooms; homework is submitted electronically; classroom discussions and chat groups are formed online. Dissemination has improved, but the content being imparted has remained fairly constant. How can we take this well-established and irreplaceable foundation and evolve it to fully prepare the students of today for the world of tomorrow?

Beyond Fundamental Programming

Students can access courses in AI or in Python programming, but the overwhelming majority of our coursework remains consistent with the curriculum of the past. A large part of education continues to focus on memorization and regurgitation. At a time when brain-computer interface technologies such as Neuralink are starting to make information from the web available to us at any moment, we must focus on evolving education to meet the challenges of modern times. With the fourth industrial revolution upon us, we are entering an algorithmic economy. For students to succeed in the coming world, they must be taught to think creatively and become expert problem solvers.

For example, take the curriculum around the United States Civil War. We commonly teach students that there was a Civil War, that the North won, that slavery ended, and that President Lincoln was assassinated. This is a crucial lesson that must be taught, but we are not doing the event and its participants justice by teaching facts and dates alone. What if, instead, educators talked through how the war was fought and how it was won? How did General Grant handle terrain, feeding his troops, morale, and delegation, and how did he grow as a leader throughout the conflict? Then, ask students to tie these lessons to current personal challenges or to the current geopolitical landscape. Walk them through questions like: How did Lincoln build a coalition in the legislature to end slavery? What was his relationship like with Grant, Sherman, and his other generals? How important is it for a great leader to listen to those he has appointed and take their counsel? The Civil War could be used to give students skills for life and for their coming careers.

Simultaneously, our educators would unleash their creativity to its fullest potential and become excited about their subject matter again. Teachers become teachers because they want to help children learn and flourish. They want to prepare their students for the new world; they want to impart wisdom that was imparted to them, and sometimes wisdom that was not. Release them from “textbook to whiteboard and back again,” quizzing kids on this date and that name. Place 30% of their lesson plans in foundational knowledge of events, and 70% in the hows and whys, and in what these can teach their students about solving the challenges facing the world today.

Free Students for Creativity and Problem Solving

The future of work is not person and machine working at odds, or even in parallel, but working directly together. We must teach our students to use machines to quickly complete repetitive tasks and to gather common facts and dates. The next generation of careers will require humans to act in tandem with machines to form a hybrid society.

Teach students how to leverage artificial intelligence as an augmenting agent that assists with tasks. How do we leverage AI to drive better decisions and improve outcomes? What are the prospective threats of AI, and how do we protect against them? How do humans introduce ethics into these digital systems? How do we best combat threats to privacy, safety, and human dignity? These are the questions we must grapple with and solve. Coursework in every dimension should include a hybrid methodology that allows human students to focus on creativity and innovation.

Who is Leading the Way?

Estonia is emerging as a leader in digital teaching. Its “Tiger Leap” initiative, implemented over 20 years ago to introduce computers to students at an early age, has been a success. For multiple years, Estonia has ranked among the top performers in the Programme for International Student Assessment (PISA), run by the Organisation for Economic Co-operation and Development (OECD).

The UK is introducing computers and algorithms to children as young as 5 years old. Research shows that learning new languages is easiest when you are young, and computer languages are no exception. Whether you are programming in logic languages like Prolog or writing object-oriented code in Java, learning the various dialects we use to converse with our digital colleagues should be as natural as learning a new language at an early age.

A Digital Assistant for Every Student

AI shouldn’t be the centerpiece of education, but effective leveraging of a digital AI assistant should be a priority. When learning history, we should not be challenged to recall dates and events; these should be furnished by our digital assistant.

Geography courses should focus not on the names of various straits and gulfs but on the geographical challenges of those areas and how best to navigate them under different situational constraints and hypotheticals. Global warming, ecological threats, and biomedical solutions would all be engrossing topics for young minds. Again, use foundational curriculum as a basis for real-world creativity and innovation. Walk students through the history of the Suez Canal and how it transformed commerce and the way of life on three continents. Then ask students: What if a climate catastrophe closed the canal for 6 to 12 months? What would the global consequences be? How many people would be adversely affected? Who would profit? Get them to think creatively about possible short-, medium-, and long-term solutions. Educators and those running these institutions will be inspired by the scenarios they could create for their students, gaining ownership of the new educational system.

An Incalculable Impact

The United States is behind much of the industrialized world in science and math education. We need top governmental focus to catapult us to the front. We need to draw the brightest minds to teaching by arming them with a rock-solid foundation honed over decades of practice, topped with a new, problem-solving-focused endgame. We must, as a society, choose to acknowledge and reward our teachers at a far higher level than we do today. They are molding the minds of our future society; should we not compensate them at top executive levels?

Strategic impact deserves more attention than the tactical quarterly impact that Wall Street seems focused on. Education and its overhaul have to become a central focus, because our future depends on it. Think of the cumulative benefit to the United States on the global stage: if every child is taught how to figure things out, solve problems creatively, and innovate, the benefit to the nation would be incalculable.

Conrad Ingersoll Dube (son of Chetan Dube, renowned futurist and founder of Amelia and Quant) is currently in high school in New York; his thoughts were the genesis of this piece.

William Saulsbery is a former teacher and tutor who co-wrote the piece with Conrad. Connect with Will on LinkedIn.


CodeGuppy

23 October 2024 at 21:36

CodeGuppy is a free coding platform for schools, coding clubs, and independent learners. Teachers can use codeguppy.com to teach students the JavaScript language by building video games with sprites and sounds. A ton of example projects are included with the platform. With CodeGuppy, students learn coding by building games and fun applications.

With CodeGuppy you’ll learn to code real games and applications directly in your browser. You don’t need to install any software on your local machine. Any Windows, Mac or Chromebook computer is perfect for CodeGuppy.

At CodeGuppy.com, they teach JavaScript – one of the most widely used programming languages today. Their multi-scene code editor empowers beginners to type their first line of code and advanced users to create multi-scene platform games.

To make coding fun and engaging, CodeGuppy provides you with a full library of animated characters, background images, and sounds that you can use in your games and applications.

Learning to code is easy and fun with the right platform. Teachers, parents, and students can use CodeGuppy in the classroom, in a coding club, or at home. The entire curriculum of lessons and projects is tailor-made for students, with activities such as interactive graphics and game creation. Students love creating programs on the platform and sharing them with their friends.

For these reasons and more, CodeGuppy earned a Cool Tool Award (finalist) for “Best Coding, Computer Science, Engineering Solution” as part of The EdTech Awards 2023. Learn more


Brain Processes Sentence Structures as Fast as Visual Scenes

23 October 2024 at 21:02
The human brain can detect the structure of a short sentence in as little as 150 milliseconds, the speed of a blink. Using brain imaging, scientists found that the brain’s language comprehension system processes sentences flashed on a screen similarly to how it perceives a visual scene.