
Why AI could eat quantum computing’s lunch

By: Edd Gent
7 November 2024 at 15:00

Tech companies have been funneling billions of dollars into quantum computers for years. The hope is that they’ll be a game changer for fields as diverse as finance, drug discovery, and logistics.

Those expectations have been especially high in physics and chemistry, where the weird effects of quantum mechanics come into play. In theory, this is where quantum computers could have a huge advantage over conventional machines.

But while the field struggles with the realities of tricky quantum hardware, another challenger is making headway in some of these most promising use cases. AI is now being applied to fundamental physics, chemistry, and materials science in a way that suggests quantum computing’s purported home turf might not be so safe after all.

The scale and complexity of quantum systems that can be simulated using AI is advancing rapidly, says Giuseppe Carleo, a professor of computational physics at the Swiss Federal Institute of Technology (EPFL). Last month, he coauthored a paper published in Science showing that neural-network-based approaches are rapidly becoming the leading technique for modeling materials with strong quantum properties. Meta also recently unveiled an AI model trained on a massive new data set of materials that has jumped to the top of a leaderboard for machine-learning approaches to material discovery.

Given the pace of recent advances, a growing number of researchers are now asking whether AI could solve a substantial chunk of the most interesting problems in chemistry and materials science before large-scale quantum computers become a reality. 

“The existence of these new contenders in machine learning is a serious hit to the potential applications of quantum computers,” says Carleo. “In my opinion, these companies will find out sooner or later that their investments are not justified.”

Exponential problems

The promise of quantum computers lies in their potential to carry out certain calculations much faster than conventional computers. Realizing this promise will require much larger quantum processors than we have today. The biggest devices have just crossed the thousand-qubit mark, but achieving an undeniable advantage over classical computers will likely require tens of thousands, if not millions. Once that hardware is available, though, a handful of quantum algorithms, like the encryption-cracking Shor’s algorithm, have the potential to solve problems exponentially faster than classical algorithms can. 

But for many quantum algorithms with more obvious commercial applications, like searching databases, solving optimization problems, or powering AI, the speed advantage is more modest. And last year, a paper coauthored by Microsoft’s head of quantum computing, Matthias Troyer, showed that these theoretical advantages disappear if you account for the fact that quantum hardware operates orders of magnitude slower than modern computer chips. The difficulty of getting large amounts of classical data in and out of a quantum computer is also a major barrier. 

So Troyer and his colleagues concluded that quantum computers should instead focus on problems in chemistry and materials science that require simulation of systems where quantum effects dominate. A computer that operates along the same quantum principles as these systems should, in theory, have a natural advantage here. In fact, this has been a driving idea behind quantum computing ever since the renowned physicist Richard Feynman first proposed it.

The rules of quantum mechanics govern many things with huge practical and commercial value, like proteins, drugs, and materials. Their properties are determined by the interactions of their constituent particles, in particular their electrons—and simulating these interactions in a computer should make it possible to predict what kinds of characteristics a molecule will exhibit. This could prove invaluable for discovering things like new medicines or more efficient battery chemistries, for example. 

But the intuition-defying rules of quantum mechanics—in particular, the phenomenon of entanglement, which allows the quantum states of distant particles to become intrinsically linked—can make these interactions incredibly complex. Precisely tracking them requires complicated math that gets exponentially tougher the more particles are involved. That can make simulating large quantum systems intractable on classical machines.
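The exponential blowup is easy to make concrete: the full wave function of n interacting spin-1/2 particles has 2^n complex amplitudes. The back-of-the-envelope arithmetic below (assuming 16 bytes per amplitude, as in double-precision complex numbers) shows why exactly storing such a state quickly becomes hopeless on classical hardware:

```python
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

def state_vector_bytes(n_particles: int) -> int:
    """Memory to store the full wave function of n spin-1/2 particles: 2**n amplitudes."""
    return 2 ** n_particles * BYTES_PER_AMPLITUDE

for n in (10, 30, 50):
    print(f"{n} particles: {state_vector_bytes(n) / 2**30:.3g} GiB")
```

At 30 particles the state vector already fills 16 GiB; at 50 it needs roughly 16 million GiB, far beyond any classical machine, and adding a single particle doubles the requirement again.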

This is where quantum computers could shine. Because they also operate on quantum principles, they are able to represent quantum states much more efficiently than is possible on classical machines. They could also take advantage of quantum effects to speed up their calculations.

But not all quantum systems are the same. Their complexity is determined by the extent to which their particles interact, or correlate, with each other. In systems where these interactions are strong, tracking all the relationships can quickly explode the number of calculations required to model the system. But in most systems of practical interest to chemists and materials scientists, correlation is weak, says Carleo. That means their particles don’t affect each other’s behavior significantly, which makes the systems far simpler to model.

The upshot, says Carleo, is that quantum computers are unlikely to provide any advantage for most problems in chemistry and materials science. Classical tools that can accurately model weakly correlated systems already exist, the most prominent being density functional theory (DFT). The insight behind DFT is that all you need to understand a system’s key properties is its electron density, a measure of how its electrons are distributed in space. This makes for much simpler computation but can still provide accurate results for weakly correlated systems.

Simulating large systems using these approaches requires considerable computing power. But in recent years there’s been an explosion of research using DFT to generate data on chemicals, biomolecules, and materials—data that can be used to train neural networks. These AI models learn patterns in the data that allow them to predict what properties a particular chemical structure is likely to have, but they are orders of magnitude cheaper to run than conventional DFT calculations. 
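The workflow described above, expensive first-principles calculations generating training data for a cheap learned surrogate, can be sketched in miniature. The snippet below is illustrative only: a Morse-like energy curve stands in for DFT, and a polynomial least-squares fit stands in for the neural network; all names and numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_dft_energy(bond_length):
    """Stand-in for a costly DFT calculation: a Morse-like energy curve."""
    return (1 - np.exp(-(bond_length - 1.5))) ** 2

# "Run DFT" on 40 geometries to build a training set.
train_x = rng.uniform(1.0, 3.0, size=40)
train_y = expensive_dft_energy(train_x)

# Fit a cheap surrogate: a degree-6 polynomial via least squares.
surrogate = np.poly1d(np.polyfit(train_x, train_y, deg=6))

# The surrogate now predicts energies at negligible cost per query.
test_x = np.linspace(1.1, 2.9, 50)
err = float(np.max(np.abs(surrogate(test_x) - expensive_dft_energy(test_x))))
print(f"max surrogate error on held-out geometries: {err:.1e}")
```

Production models replace the polynomial with deep (often graph) neural networks trained on millions of DFT results, but the economics are the same: pay for the expensive calculations once, then amortize them over unlimited cheap predictions.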

This has dramatically expanded the size of systems that can be modeled—to as many as 100,000 atoms at a time—and how long simulations can run, says Alexandre Tkatchenko, a physics professor at the University of Luxembourg. “It’s wonderful. You can really do most of chemistry,” he says.

Olexandr Isayev, a chemistry professor at Carnegie Mellon University, says these techniques are already being widely applied by companies in chemistry and life sciences. And for researchers, previously out-of-reach problems such as optimizing chemical reactions, developing new battery materials, and understanding protein binding are finally becoming tractable.

As with most AI applications, the biggest bottleneck is data, says Isayev. Meta’s recently released materials data set was made up of DFT calculations on 118 million molecules. A model trained on this data achieved state-of-the-art performance, but creating the training material took vast computing resources, well beyond what’s accessible to most research teams. That means fulfilling the full promise of this approach will require massive investment.

Modeling a weakly correlated system using DFT is not an exponentially scaling problem, though. This suggests that with more data and computing resources, AI-based classical approaches could simulate even the largest of these systems, says Tkatchenko. Given that quantum computers powerful enough to compete are likely still decades away, he adds, AI’s current trajectory suggests it could reach important milestones, such as precisely simulating how drugs bind to a protein, much sooner.

Strong correlations

When it comes to simulating strongly correlated quantum systems—ones whose particles interact a lot—methods like DFT quickly run out of steam. While more exotic, these systems include materials with potentially transformative capabilities, like high-temperature superconductivity or ultra-precise sensing. But even here, AI is making significant strides.

In 2017, EPFL’s Carleo and Microsoft’s Troyer published a seminal paper in Science showing that neural networks could model strongly correlated quantum systems. The approach doesn’t learn from data in the classical sense. Instead, Carleo says, it is similar to DeepMind’s AlphaZero model, which mastered the games of Go, chess, and shogi using nothing more than the rules of each game and the ability to play itself.

In this case, the rules of the game are provided by Schrödinger’s equation, which can precisely describe a system’s quantum state, or wave function. The model plays against itself by arranging particles in a certain configuration and then measuring the system’s energy level. The goal is to reach the lowest energy configuration (known as the ground state), which determines the system’s properties. The model repeats this process until energy levels stop falling, indicating that the ground state—or something close to it—has been reached.
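The loop Carleo describes can be sketched numerically. The toy below uses an assumed two-spin Hamiltonian and a four-parameter exponential ansatz in place of a real neural network: a parameterized trial wave function is repeatedly nudged downhill in energy until it settles near the ground state.

```python
import numpy as np

# Two spins with an Ising coupling and a transverse field (a standard toy model).
H = np.array([[-1, -1, -1,  0],
              [-1,  1,  0, -1],
              [-1,  0,  1, -1],
              [ 0, -1, -1, -1]], dtype=float)

def energy(theta):
    """Variational energy <psi|H|psi> / <psi|psi> of the trial state."""
    psi = np.exp(theta)                    # toy positive-amplitude "network"
    return float(psi @ H @ psi / (psi @ psi))

theta, lr, eps = np.zeros(4), 0.1, 1e-6
for _ in range(500):                       # propose, measure, adjust, repeat
    grad = np.array([
        (energy(theta + eps * np.eye(4)[k]) - energy(theta - eps * np.eye(4)[k]))
        / (2 * eps)
        for k in range(4)
    ])
    theta -= lr * grad                     # step toward lower energy

exact = float(np.linalg.eigvalsh(H).min())  # exact ground-state energy
print(f"variational: {energy(theta):.4f}   exact: {exact:.4f}")
```

For this Hamiltonian the exact ground-state energy is −√5 ≈ −2.236, and the variational estimate lands very close to it. Real neural-network quantum states replace the four-parameter ansatz with a deep network and estimate energies by Monte Carlo sampling rather than exact linear algebra, which is what lets them scale to systems far too large to diagonalize.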

The power of these models is their ability to compress information, says Carleo. “The wave function is a very complicated mathematical object,” he says. “What has been shown by several papers now is that [the neural network] is able to capture the complexity of this object in a way that can be handled by a classical machine.”

Since the 2017 paper, the approach has been extended to a wide range of strongly correlated systems, says Carleo, and results have been impressive. The Science paper he published with colleagues last month put leading classical simulation techniques to the test on a variety of tricky quantum simulation problems, with the goal of creating a benchmark to judge advances in both classical and quantum approaches.

Carleo says that neural-network-based techniques are now the best approach for simulating many of the most complex quantum systems they tested. “Machine learning is really taking the lead in many of these problems,” he says.

These techniques are catching the eye of some big players in the tech industry. In August, researchers at DeepMind showed in a paper in Science that they could accurately model excited states in quantum systems, which could one day help predict the behavior of things like solar cells, sensors, and lasers. Scientists at Microsoft Research have also developed an open-source software suite to help more researchers use neural networks for simulation.

One of the main advantages of the approach is that it piggybacks on massive investments in AI software and hardware, says Filippo Vicentini, a professor of AI and condensed-matter physics at École Polytechnique in France, who was also a coauthor on the Science benchmarking paper: “Being able to leverage these kinds of technological advancements gives us a huge edge.”

There is a caveat: Because the ground states are effectively found through trial and error rather than explicit calculations, they are only approximations. But this is also why the approach could make progress on what has looked like an intractable problem, says Juan Carrasquilla, a researcher at ETH Zurich, and another coauthor on the Science benchmarking paper.

If you want to precisely track all the interactions in a strongly correlated system, the number of calculations you need to do rises exponentially with the system’s size. But if you’re happy with an answer that is just good enough, there’s plenty of scope for taking shortcuts. 

“Perhaps there’s no hope to capture it exactly,” says Carrasquilla. “But there’s hope to capture enough information that we capture all the aspects that physicists care about. And if we do that, it’s basically indistinguishable from a true solution.”

And while strongly correlated systems are generally too hard to simulate classically, there are notable instances where this isn’t the case. That includes some systems that are relevant for modeling high-temperature superconductors, according to a 2023 paper in Nature Communications.

“Because of the exponential complexity, you can always find problems for which you can’t find a shortcut,” says Frank Noe, research manager at Microsoft Research, who has led much of the company’s work in this area. “But I think the number of systems for which you can’t find a good shortcut will just become much smaller.”

No magic bullets

However, Stefanie Czischek, an assistant professor of physics at the University of Ottawa, says it can be hard to predict what problems neural networks can feasibly solve. For some complex systems they do incredibly well, but then on other seemingly simple ones, computational costs balloon unexpectedly. “We don’t really know their limitations,” she says. “No one really knows yet what are the conditions that make it hard to represent systems using these neural networks.”

Meanwhile, there have also been significant advances in other classical quantum simulation techniques, says Antoine Georges, director of the Center for Computational Quantum Physics at the Flatiron Institute in New York, who also contributed to the recent Science benchmarking paper. “They are all successful in their own right, and they are also very complementary,” he says. “So I don’t think these machine-learning methods are just going to completely put all the other methods out of business.”

Quantum computers will also have their niche, says Martin Roetteler, senior director of quantum solutions at IonQ, which is developing quantum computers built from trapped ions. While he agrees that classical approaches will likely be sufficient for simulating weakly correlated systems, he’s confident that some large, strongly correlated systems will be beyond their reach. “The exponential is going to bite you,” he says. “There are cases with strongly correlated systems that we cannot treat classically. I’m strongly convinced that that’s the case.”

In contrast, he says, a future fault-tolerant quantum computer with many more qubits than today’s devices will be able to simulate such systems. This could help find new catalysts or improve understanding of metabolic processes in the body—an area of interest to the pharmaceutical industry.

Neural networks are likely to increase the scope of problems that can be solved, says Jay Gambetta, who leads IBM’s quantum computing efforts, but he’s unconvinced they’ll solve the hardest challenges businesses are interested in.

“That’s why many different companies that essentially have chemistry as their requirement are still investigating quantum—because they know exactly where these approximation methods break down,” he says.

Gambetta also rejects the idea that the technologies are rivals. He says the future of computing is likely to involve a hybrid of the two approaches, with quantum and classical subroutines working together to solve problems. “I don’t think they’re in competition. I think they actually add to each other,” he says.

But Scott Aaronson, who directs the Quantum Information Center at the University of Texas, says machine-learning approaches are directly competing against quantum computers in areas like quantum chemistry and condensed-matter physics. He predicts that a combination of machine learning and quantum simulations will outperform purely classical approaches in many cases, but that won’t become clear until larger, more reliable quantum computers are available.

“From the very beginning, I’ve treated quantum computing as first and foremost a scientific quest, with any industrial applications as icing on the cake,” he says. “So if quantum simulation turns out to beat classical machine learning only rarely, I won’t be quite as crestfallen as some of my colleagues.”

One area where quantum computers look likely to have a clear advantage is in simulating how complex quantum systems evolve over time, says EPFL’s Carleo. This could provide invaluable insights for scientists in fields like statistical mechanics and high-energy physics, but it seems unlikely to lead to practical uses in the near term. “These are more niche applications that, in my opinion, do not justify the massive investments and the massive hype,” Carleo adds.

Nonetheless, the experts MIT Technology Review spoke to said a lack of commercial applications is not a reason to stop pursuing quantum computing, which could lead to fundamental scientific breakthroughs in the long run.

“Science is like a set of nested boxes—you solve one problem and you find five other problems,” says Vicentini. “The complexity of the things we study will increase over time, so we will always need more powerful tools.”

Gandhi Inspired a New Kind of Engineering



This article is part of our special report, “Reinventing Invention: Stories from Innovation’s Edge.”

The teachings of Mahatma Gandhi were arguably India’s greatest contribution to the 20th century. Raghunath Anant Mashelkar has borrowed some of that wisdom to devise a frugal new form of innovation he calls “Gandhian engineering.” Coming from humble beginnings, Mashelkar is driven to ensure that the benefits of science and technology are shared more equally. He sums up his philosophy with the epigram “more from less for more.” This engineer has led India’s preeminent R&D organization, the Council of Scientific and Industrial Research, and he has advised successive governments.

What was the inspiration for Gandhian engineering?

Raghunath Anant Mashelkar: There are two quotes of Gandhi’s that were influential. The first was, “The world has enough for everyone’s need, but not enough for everyone’s greed.” He was saying that when resources are exhaustible, you should get more from less. He also said the benefits of science must reach all, even the poor. If you put them together, it becomes “more from less for more.”

My own life experience inspired me, too. I was born to a very poor family, and my father died when I was six. My mother was illiterate and brought me to Mumbai in search of a job. Two meals a day was a challenge, and I walked barefoot until I was 12 and studied under streetlights. So it also came from my personal experience of suffering because of a lack of resources.

How does Gandhian engineering differ from existing models of innovation?

Mashelkar: Conventional engineering is market or curiosity driven, but Gandhian engineering is application and impact driven. We look at the end user and what we want to achieve for the betterment of humanity.

Most engineering is about getting more from more. Take the iPhone: Apple keeps creating better models and charging higher prices. For the poor it is less from less: Conventional engineering looks at removing features as the only way to reduce costs.

In Gandhian engineering, the idea is not to create affordable [second-rate] products, but to make high technology work for the poor. So we reinvent the product from the ground up. While the standard approach aims for premium price and high margins, Gandhian engineering will always look at affordable price, but high volumes.

[Photo: rows of artificial feet. The Jaipur foot is a light, durable, and affordable prosthetic. Gurinder Osan/AP]

What is your favorite example of Gandhian engineering?

Mashelkar: My favorite is the Jaipur foot. Normally, a sophisticated prosthetic foot costs a few thousand dollars, but the Jaipur foot does it for [US] $20. And it’s very good technology; there is a video of a person wearing a Jaipur foot climbing a tree, and you can see the flexibility is like a normal foot. Then he runs one kilometer in 4 minutes, 30 seconds.

What is required for Gandhian engineering to become more widespread?

Mashelkar: In our young people, we see innovation and we see passion, but compassion is the key. We also need more soft funding [grants or zero-interest loans], because venture capital companies often turn out to be “vulture capital” in a way, because they want immediate returns.

We need a shift in the mindset of businesses—they can make money not just from premium products for those at the top of the pyramid, but also products with affordable excellence designed for large numbers of people.

This article appears in the November 2024 print issue as “The Gandhi Inspired Inventor.”

Erika Cruz Keeps Whirlpool’s Machines Spinning



Few devices are as crucial to people’s everyday lives as their household appliances. Electrical engineer Erika Cruz says it’s her mission to make sure they operate smoothly.

Cruz helps design washing machines and dryers for Whirlpool, the multinational appliance manufacturer.

Erika Cruz

Employer: Whirlpool
Occupation: Associate electrical engineer
Education: Bachelor’s degree in electronics engineering, Industrial University of Santander, in Bucaramanga, Colombia

As a member of the electromechanical components team at Whirlpool’s research and engineering center in Benton Harbor, Mich., she oversees the development of timers, lid locks, humidity sensors, and other components.

More engineering goes into the machines than is obvious. Because the appliances are sold around the world, she says, they must comply with different technical and safety standards and environmental conditions. And reliability is key.

“If the washer’s door lock gets stuck and your clothes are inside, your whole day is going to be a mess,” she says.

While appliances can be taken for granted, Cruz loves that her work contributes in its own small way to the quality of life of so many.

“I love knowing that every time I’m working on a new design, the lives of millions of people will be improved by using it,” she says.

From Industrial Design to Electrical Engineering

Cruz grew up in Bucaramanga, Colombia, where her father worked as an electrical engineer, designing control systems for poultry processing plants. Her childhood home was full of electronics, and Cruz says her father taught her about technology. He paid her to organize his resistors, for example, and asked her to create short videos for work presentations about items he was designing. He also took Cruz and her sister along with him to the processing plants.

“We would go and see how the big machines worked,” she says. “It was very impressive because of their complexity and impact. That’s how I got interested in technology.”

In 2010, Cruz enrolled in Colombia’s Industrial University of Santander, in Bucaramanga, to study industrial design. But she quickly became disenchanted with the course’s focus on designing objects like fancy tables and ergonomic chairs.

“I wanted to design huge machines like my father did,” she says.

A teacher suggested that she study mechanical engineering instead. But her father was concerned about discrimination she might face in that career.

“He told me it would be difficult to get a job in the industry because mechanical engineers work with heavy machinery, and they saw women as being fragile,” Cruz says.

Her father thought electrical engineers would be more receptive to women, so she switched fields.

“I am very glad I ended up studying electronics because you can apply it to so many different fields,” Cruz says. She received a bachelor’s degree in electronics engineering in 2019.

The Road to America

While at university, Cruz signed up for a program that allowed Colombian students to work summer jobs in the United States. She held a variety of summer positions in Galveston, Texas, from 2017 to 2019, including cashier, housekeeper, and hostess.

In 2018, she met her future husband, an American working at the same amusement park as she did. When she returned the following summer, they started dating, and that September they married. Since she had already received her degree, he was eager for her to move to the United States permanently, but she made the difficult decision to return to Colombia.

“With the language barrier and my lack of engineering experience, I knew if I stayed in the United States, I would have to continue working jobs like housekeeping forever,” she says. “So I told my husband he had to wait for me because I was going back home to get some engineering experience.”

“I love knowing that every time I’m working on a new design, the lives of millions of people will be improved by using it.”

Cruz applied for engineering jobs in neighboring Brazil, which had more opportunities than Colombia did. In 2021, she joined Whirlpool as an electrical engineer at its R&D site in Joinville, Brazil. There, she introduced into mass production sensors and actuators provided by new suppliers.

Meanwhile, she applied for a U.S. Green Card, which would allow her to work and live permanently in the country. She received it six months after starting her job. Cruz asked her manager about transferring to one of Whirlpool’s U.S. facilities, not expecting to have any luck. Her manager set up a phone call with the manager of the components team at the company’s Benton Harbor site to discuss the request. Cruz didn’t realize that the call was actually a job interview. She was offered a position there as an electrical engineer and moved to Michigan later that year.

Designing Appliances Is Complex

Designing a new washing machine or dryer is a complex process, Cruz says. First, feedback from customers about desirable features is used to develop a high-level design. Then the product design work is divided among small teams of engineers, each responsible for a given subsystem, including hardware, software, materials, and components.

Part of Cruz’s job is to test components from different suppliers to make sure they meet safety, reliability, and performance requirements. She also writes the documentation that explains to other engineers about the components’ function and design.

Cruz then helps select the groups of components to be used in a particular application—combining, say, three temperature sensors with two humidity sensors in an optimized location to create a system that finds the best time to stop the dryer.

Building a Supportive Environment

Cruz loves her job, but her father’s fears about her entering a male-dominated field weren’t unfounded. Discrimination was worse in Colombia, she says, where she regularly experienced inappropriate comments and behavior from university classmates and teachers.

Even in the United States, she points out, “As a female engineer, you have to actually show you are able to do your job, because occasionally at the beginning of a project men are not convinced.”

In both Brazil and Michigan, Cruz says, she’s been fortunate to often end up on teams with a majority of women, who created a supportive environment. That support was particularly important when she had her first child and struggled to balance work and home life.

“It’s easier to talk to women about these struggles,” she says. “They know how it feels because they have been through it too.”

Update Your Knowledge

Working in the consumer electronics industry is rewarding, Cruz says. She loves going into a store or visiting someone’s home and seeing the machines that she’s helped build in action.

A degree in electronics engineering is a must for the field, Cruz says, but she’s also a big advocate of developing project management and critical thinking skills. She is a certified associate in project management, granted by the Project Management Institute, and has been trained in using tools that facilitate critical thinking. She says the project management program taught her how to solve problems in a more systematic way and helped her stand out in interviews.

It’s also important to constantly update your knowledge, Cruz says, “because electronics is a discipline that doesn’t stand still. Keep learning. Electronics is a science that is constantly growing.”

The Engineer Who Pins Down the Particles at the LHC



The Large Hadron Collider has transformed our understanding of physics since it began operating in 2008, enabling researchers to investigate the fundamental building blocks of the universe. Some 100 meters below the border between France and Switzerland, particles accelerate along the LHC’s 27-kilometer circumference, nearly reaching the speed of light before smashing together.

The LHC is often described as the biggest machine ever built. And while the physicists who carry out experiments at the facility tend to garner most of the attention, it takes hundreds of engineers and technicians to keep the LHC running. One such engineer is Irene Degl’Innocenti, who works in digital electronics at the European Organization for Nuclear Research (CERN), which operates the LHC. As a member of CERN’s beam instrumentation group, Degl’Innocenti creates custom electronics that measure the position of the particle beams as they travel.

Irene Degl’Innocenti

Employer: CERN
Occupation: Digital electronics engineer
Education: Bachelor’s and master’s degrees in electrical engineering; Ph.D. in electrical, electronics, and communications engineering, University of Pisa, in Italy

“It’s a huge machine that does very challenging things, so the amount of expertise needed is vast,” Degl’Innocenti says.

The electronics she works on make up only a tiny part of the overall operation, something Degl’Innocenti is keenly aware of when she descends into the LHC’s cavernous tunnels to install or test her equipment. But she gets great satisfaction from working on such an important endeavor.

“You’re part of something that is very huge,” she says. “You feel part of this big community trying to understand what is actually going on in the universe, and that is very fascinating.”

Opportunities to Work in High-energy Physics

Growing up in Italy, Degl’Innocenti wanted to be a novelist. Throughout high school she leaned toward the humanities, but she had a natural affinity for math, thanks in part to her mother, who is a science teacher.

“I’m a very analytical person, and that has always been part of my mind-set, but I just didn’t find math charming when I was little,” Degl’Innocenti says. “It took a while to realize the opportunities it could open up.”

She started exploring electronics around age 17 because it seemed like the most direct way to translate her logical, mathematical way of thinking into a career. In 2011, she enrolled in the University of Pisa, in Italy, earning a bachelor’s degree in electrical engineering in 2014 and staying on to earn a master’s degree in the same subject.

At the time, Degl’Innocenti had no idea there were opportunities for engineers to work in high-energy physics. But she learned that a fellow student had attended a summer internship at Fermilab, the particle physics and accelerator laboratory in Batavia, Ill. So she applied for and won an internship there in 2015. Since Fermilab and CERN closely collaborate, she was able to help design a data-processing board for LHC’s Compact Muon Solenoid experiment.

Next she looked for an internship closer to home and discovered CERN’s technical student program, which allows students to work on a project over the course of a year. Working in the beam-instrumentation group, Degl’Innocenti designed a digital-acquisition system that became the basis for her master’s thesis.

Measuring the Position of Particle Beams

After receiving her master’s in 2017, Degl’Innocenti went on to pursue a Ph.D., also at the University of Pisa. She conducted her research at CERN’s beam-position section, which builds equipment to measure the position of particle beams within CERN’s accelerator complex. The LHC has roughly 1,000 monitors spaced around the accelerator ring. Each monitor typically consists of two pairs of sensors positioned on opposite sides of the accelerator pipe, and it is possible to measure the beam’s horizontal and vertical positions by comparing the strength of the signal at each sensor.

The underlying concept is simple, Degl’Innocenti says, but these measurements must be precise. Bunches of particles pass through the monitors every 25 nanoseconds, and their position must be tracked to within 50 micrometers.
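The comparison at the heart of the measurement is a difference-over-sum: each pickup’s signal grows as the beam approaches it, so the normalized difference of opposite pickups gives the displacement independently of beam intensity. The sketch below assumes a toy linear pickup response and an invented aperture scale, not CERN’s actual calibration.

```python
HALF_APERTURE_MM = 20.0  # assumed distance from pipe center to each pickup

def pickup_signals(beam_pos_mm: float, intensity: float = 1.0):
    """Toy model: each pickup's signal grows linearly as the beam nears it."""
    a = intensity * (1 + beam_pos_mm / HALF_APERTURE_MM)  # pickup on one side
    b = intensity * (1 - beam_pos_mm / HALF_APERTURE_MM)  # opposite pickup
    return a, b

def estimate_position_mm(a: float, b: float) -> float:
    """Difference-over-sum: the beam-intensity factor cancels out."""
    return HALF_APERTURE_MM * (a - b) / (a + b)

a, b = pickup_signals(beam_pos_mm=3.5, intensity=0.7)
print(round(estimate_position_mm(a, b), 6))  # recovers 3.5, whatever the intensity
```

Normalizing by the sum is what makes the estimate robust: bunch charge varies from shot to shot, but it scales both pickups equally and so drops out of the ratio.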

“We start developing a system years in advance, and then it has to work for a couple of decades.”

Most of the signal processing is normally done in analog, but during her Ph.D., she focused on shifting as much of this work as possible to the digital domain because analog circuits are finicky, she says. They need to be precisely calibrated, and their accuracy tends to drift over time or when temperatures fluctuate.

“It’s complex to maintain,” she says. “It becomes particularly tricky when you have 1,000 monitors, and they are located in an accelerator 100 meters underground.”

Information is lost when analog is converted to digital, however, so Degl’Innocenti analyzed the performance of the latest analog-to-digital converters (ADCs) and investigated their effect on position measurements.
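The loss she refers to is quantization: an N-bit ADC rounds each sample to one of 2^N discrete levels. The short sketch below (with illustrative parameters, not those of CERN’s converters) shows how the worst-case rounding error shrinks as resolution grows.

```python
import math

def quantize(x: float, bits: int, full_scale: float = 1.0) -> float:
    """Round x (in [-full_scale, +full_scale]) to the nearest ADC output level."""
    step = 2 * full_scale / 2 ** bits      # width of one quantization level
    code = round(x / step)                 # nearest integer code
    return max(-full_scale, min(full_scale - step, code * step))

# Worst-case quantization error over one period of a sampled sine wave.
samples = [math.sin(2 * math.pi * k / 1000) for k in range(1000)]
for bits in (8, 12, 16):
    err = max(abs(quantize(s, bits) - s) for s in samples)
    print(f"{bits}-bit ADC: worst-case error {err:.1e}")
```

Each extra bit halves the step size, so the choice of converter directly sets how finely the beam position can ultimately be resolved once the processing moves into the digital domain.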

Designing Beam-Monitor Electronics

After completing her Ph.D. in electrical, electronics, and communications engineering in 2021, Degl’Innocenti joined CERN as a senior postdoctoral fellow. Two years later, she became a full-time employee there, applying the results of her research to developing new hardware. She’s currently designing a new beam-position monitor for the High-Luminosity upgrade to the LHC, expected to be completed in 2028. This new system will likely use a system-on-chip to house most of the electronics, including several ADCs and a field-programmable gate array (FPGA) that Degl’Innocenti will program to run a new digital signal-processing algorithm.

She’s part of a team of just 15 who handle design, implementation, and ongoing maintenance of CERN’s beam-position monitors. So she works closely with the engineers who design sensors and software for those instruments and the physicists who operate the accelerator and set the instruments’ requirements.

“We start developing a system years in advance, and then it has to work for a couple of decades,” Degl’Innocenti says.

Opportunities in High-Energy Physics

High-energy physics has a variety of interesting opportunities for engineers, Degl’Innocenti says, including high-precision electronics, vacuum systems, and cryogenics.

“The machines are very large and very complex, but we are looking at very small things,” she says. “There are a lot of big numbers involved both at the large scale and also when it comes to precision on the small scale.”

FPGA design skills are in high demand at all kinds of research facilities, and embedded systems are also becoming more important, Degl’Innocenti says. The key is keeping an open mind about where to apply your engineering knowledge, she says. She never thought there would be opportunities for people with her skill set at CERN.

“Always check what technologies are being used,” she advises. “Don’t limit yourself by assuming that working somewhere would not be possible.”

This article appears in the August 2024 print issue as “Irene Degl’Innocenti.”
