Sneaky clocks: Uncovering Einstein’s relativity in an interacting atomic playground

For over a century, physicists have grappled with one of the most profound questions in science: How do the rules of quantum mechanics, which govern the smallest particles, fit with the laws of general relativity, which describe the universe on the largest scales?

The optical lattice clock, one of the most precise timekeeping devices ever built, is becoming a powerful tool for tackling this grand challenge. Within an optical lattice clock, atoms are trapped in a “lattice” potential formed by laser beams and are manipulated with precise control of the quantum coherence and interactions governed by quantum mechanics.

At the same time, according to Einstein’s general theory of relativity, time moves more slowly in stronger gravitational fields. This effect, known as gravitational redshift, shifts atoms’ internal energy levels by a tiny amount depending on their position in a gravitational field, causing their “ticking”—the oscillations that define time in optical lattice clocks—to change.

By measuring the tiny shifts of oscillation frequency in these ultra-precise clocks, researchers are able to explore the influences of Einstein’s theory of relativity on quantum systems.
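The size of the effect is easy to estimate from the fractional shift g·Δz/c². A minimal back-of-the-envelope sketch (illustrative numbers, not figures from the paper; the transition frequency is the strontium-87 clock standard and the 1 mm height difference is a typical in-lattice scale):

```python
# Back-of-the-envelope gravitational redshift for an optical lattice clock.
g = 9.80665             # m/s^2, standard gravity
c = 2.99792458e8        # m/s, speed of light
nu_sr = 4.292280042e14  # Hz, Sr-87 optical clock transition frequency

dz = 1e-3                    # 1 mm height difference inside the lattice
frac_shift = g * dz / c**2   # fractional frequency shift, ~1.1e-19
dnu = frac_shift * nu_sr     # absolute shift, a few tens of microhertz

print(f"fractional shift: {frac_shift:.3e}")
print(f"absolute shift:   {dnu:.3e} Hz")
```

A shift of roughly one part in 10¹⁹ per millimeter of height is exactly the regime today’s best lattice clocks can resolve.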

While relativistic effects are well-understood for individual atoms, their role in many-body quantum systems, where atoms can interact and become entangled, remains largely unexplored.

Taking a step forward in this direction, researchers led by JILA and NIST Fellows and University of Colorado Boulder physics professors Jun Ye and Ana Maria Rey—in collaboration with scientists at Leibniz University Hannover, the Austrian Academy of Sciences, and the University of Innsbruck—proposed practical protocols to explore the effects of relativity, such as the gravitational redshift, on quantum entanglement and interactions in an optical atomic clock.

Their work revealed that the interplay between gravitational effects and quantum interactions can lead to unexpected phenomena, such as atomic synchronization and quantum entanglement among particles.

The results of this study are published in Physical Review Letters.

“One of our key findings is that interactions between atoms can help to lock them together so that now they behave as a unified system instead of ticking independently due to the gravitational redshift,” explains Dr. Anjun Chu, a former JILA graduate student, now a postdoctoral researcher at the University of Chicago and the paper’s first author.

“This is really cool because it directly shows the interplay between quantum interactions and gravitational effects.”

“The interplay between general relativity [GR] and quantum entanglement has puzzled physicists for years,” Rey adds.

“The challenge lies in the fact that GR corrections in most tabletop experiments are minuscule, making them extremely difficult to detect. However, atomic clocks are now reaching unprecedented precision, bringing these elusive effects within measurable range.

“Since these clocks simultaneously interrogate many atoms, they provide a unique platform to explore the intersection of GR and many-body quantum physics. In this work, we investigated a system where atoms interact by exchanging photons within an optical cavity.

“Interestingly, we found out that while individual interactions alone can have no direct effect on the ticking of the clock, their collective influence on the redshift can significantly modify the dynamics and even generate entanglement among the atoms, which is very exciting.”

Distinguishing gravitational effects

To explore this challenge, the team devised innovative protocols to observe how gravitational redshift interferes with quantum behavior.

The first issue they focused on was to uniquely distinguish gravitational effects in an optical lattice clock from other noise sources contributing to the tiny frequency shifts.

They utilized a technique called a dressing protocol, which involves manipulating the internal states of particles with laser light. While dressing protocols are a standard tool in quantum optics, this is one of the first instances of the protocol being used to fine-tune gravitational effects.

The tunability is based on the mechanism known as mass-energy equivalence (from Einstein’s famous equation E = mc²), which means that changes in a particle’s internal energy subtly alter its mass. Because of this, an atom in the excited state has a slightly larger mass than the same atom in the ground state.

Weighed in a gravitational potential, this mass difference is precisely what produces the gravitational redshift. The dressing protocol provides a flexible way to tune the mass difference, and thus the gravitational redshift, by keeping the particles in a superposition of the two internal energy states.

Instead of being strictly in the ground or excited state, the particles can be tuned to occupy both of the states simultaneously with a continuous change of occupation probability between these two levels. This technique provides unprecedented control of internal states, enabling the researchers to fine-tune the size of gravitational effects.
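The logic above can be written compactly. In this sketch, ω₀ is the clock transition frequency and θ the mixing angle of the dressed superposition; the cos θ scaling is the natural form for such a dressing, stated here as an illustration rather than a formula quoted from the paper:

```latex
% Mass defect from internal energy (E = mc^2):
\Delta m \;=\; \frac{\hbar\,\omega_0}{c^{2}}, \qquad m_e = m_g + \Delta m .

% Resulting gravitational redshift between positions separated by \Delta z:
\frac{\delta\nu}{\nu} \;=\; \frac{g\,\Delta z}{c^{2}} .

% Dressing the atom into
% \cos(\tfrac{\theta}{2})\,|g\rangle + \sin(\tfrac{\theta}{2})\,|e\rangle
% rescales the effective mass difference, and hence the redshift,
% continuously with the superposition angle:
\Delta m_{\mathrm{eff}} \;=\; \Delta m\,\cos\theta .
```

Sweeping θ therefore dials the gravitational signal up and down while everything else stays fixed, which is what lets it be separated from ordinary noise sources.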

In this way, the researchers could distinguish genuine gravitational redshift effects from other influences, like magnetic field gradients, within the system.

“By changing the superpositions of internal levels of the particles you’re addressing, you can change how large the gravitational effects appear,” notes JILA graduate student Maya Miklos. “This is a really clever way to probe mass-energy equivalence at the quantum level.”

Seeing synchronization and entanglement

After providing a recipe to distinguish genuine gravitational effects, the researchers explored gravitational manifestations in quantum many-body dynamics. They made use of the photon-mediated interactions generated by placing the atoms in an optical cavity.

If one atom is in an excited state, it can relax back to the ground state by emitting a photon into the cavity. This photon doesn’t necessarily escape the system but can be absorbed by another atom in the ground state, exciting it in turn.

Such an exchange of energy—known as photon-mediated interactions—is key to making particles interact, even when they cannot physically touch each other.

Such types of quantum interactions can compete with gravitational effects on individual atoms inside the cavity. Typically, particles positioned at different “heights” within a gravitational field experience slight differences in how they “tick” due to gravitational redshift. Without interactions between particles, the slight difference in oscillation frequencies will cause them to fall out of sync over time.

However, when photon-mediated interactions were introduced, something remarkable happened: the particles began to synchronize, effectively “locking” their ticking together despite the differences in oscillation frequencies induced by gravity.

“It’s fascinating,” Chu says. “You can think of each particle as its own little clock. But when they interact, they start to tick in unison, even though gravity is trying to pull their timing apart.”

This synchronization showcased a fascinating interplay between gravitational effects and quantum interactions, where the latter can override the natural desynchronization caused by gravitational redshift.
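The competition between a frequency mismatch and a locking interaction already appears in a minimal classical caricature: a Kuramoto-style model of two detuned oscillators, which is a toy stand-in here, not the quantum cavity model used in the paper. The phase difference locks once the coupling exceeds half the detuning:

```python
import numpy as np

def phase_difference(delta_omega, K, t_end=200.0, dt=0.01):
    """Euler-integrate d(phi)/dt = delta_omega - 2*K*sin(phi), the phase
    difference of two coupled oscillators whose bare frequencies differ
    by delta_omega (standing in for the redshift-induced detuning)."""
    phi = 0.0
    for _ in range(int(t_end / dt)):
        phi += (delta_omega - 2.0 * K * np.sin(phi)) * dt
    return phi

drift = phase_difference(delta_omega=0.1, K=0.0)   # no coupling: phases drift apart
locked = phase_difference(delta_omega=0.1, K=0.5)  # 2K > delta_omega: phase-locked
# drift grows without bound; locked settles near arcsin(delta_omega / (2K))
```

In the quantum system the analogue of K is the photon-mediated exchange interaction, and crossing the locking threshold is accompanied by the buildup of entanglement.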

This synchronization wasn’t just an oddity—it also led to the creation of quantum entanglement, a phenomenon where particles become interconnected, with the state of one instantly affecting the other.

Remarkably, the researchers found that the speed of synchronization could also serve as an indirect measure of entanglement, offering a way to quantify the interplay between the two effects.

“Synchronization is the first phenomenon we can see that reveals this competition between gravitational redshift and quantum interactions,” adds JILA postdoctoral researcher Dr. Kyungtae Kim. “It’s a window into how these two forces balance each other.”

While this study revealed the initial interactions between these two fields of physics, the protocols developed could help refine experimental techniques, making them even more precise—with applications ranging from quantum computing to fundamental physics experiments.

“Detecting this GR-facilitated entanglement would be a groundbreaking achievement, and our theoretical calculations suggest that it is within reach of current or near-term experiments,” says Rey.

Future experiments could explore how particles behave under different conditions or how interactions can amplify gravitational effects, bringing us closer to unifying the two great pillars of modern physics.

More information: Anjun Chu et al, Exploring the Dynamical Interplay between Mass-Energy Equivalence, Interactions, and Entanglement in an Optical Lattice Clock, Physical Review Letters (2025). DOI: 10.1103/PhysRevLett.134.093201. On arXiv: DOI: 10.48550/arXiv.2406.03804

Journal information: Physical Review Letters  arXiv 

Provided by JILA 

Controlling electrons in molecules at ultrafast timescales with tailor-made terahertz light pulses

Scientists at Yokohama National University, in collaboration with RIKEN and other institutions in Japan and Korea, have made an important discovery about how electrons move and behave in molecules. This discovery could potentially lead to advances in electronics, energy transfer, and chemical reactions.

Published in Science, their study reveals a new way to control the distribution of electrons in molecules using very fast phase-controlled pulses of light in the terahertz range.

Atoms and molecules contain negatively charged electrons that usually stay in specific energy levels, like layers, around the positively charged nucleus. The way these electrons are arranged in the molecule is key to how the molecule behaves.

This arrangement affects important processes like how light is emitted, how charges move between molecules, and how chemical reactions happen. For example, when light hits an electron and gives it enough energy, the electron moves to a higher energy level, leaving behind a positively charged “hole.” This creates an exciton—a tiny energy packet in the molecule that can emit light.

This process is key to technologies like solar cells, where excitons help convert sunlight into electricity, and LEDs, where they release energy as light.

However, there are other important states that molecules can exist in, like charged states and charged excited states. Charged states occur when a molecule gains or loses an electron, while charged excited states involve both a charge change and an electron in a higher energy level.

These are important for many processes, but it has been very difficult to control these states, especially on ultrafast timescales, using traditional technology. Normally, light from the visible spectrum doesn’t provide enough energy to change the charge of the molecule and therefore cannot change the number of electrons in it.

To overcome this challenge, the researchers used terahertz light pulses, a type of light with a much lower frequency than visible light. These pulses cause electrons to move between a molecule and the metal tip of a specialized microscope that can manipulate individual molecules, allowing the team to either remove or add an electron to the molecule.
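Comparing single-photon energies makes the distinction concrete (illustrative numbers; molecular charging energies are typically on the electron-volt scale): a terahertz photon carries roughly a thousandth of the energy of a visible one, so rather than being absorbed photon by photon, the pulse acts through its strong electric field, tilting the tip-molecule junction so that electrons tunnel across.

```python
# Single-photon energy E = h*f, expressed in electron-volts.
h = 6.62607015e-34    # Planck constant, J*s
eV = 1.602176634e-19  # joules per electron-volt

E_thz = h * 1.0e12 / eV   # 1 THz photon: ~4 meV
E_vis = h * 6.0e14 / eV   # ~500 nm visible photon: ~2.5 eV

print(f"1 THz photon:   {E_thz*1e3:.1f} meV")
print(f"visible photon: {E_vis:.2f} eV")
```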

This new method offers a quick and precise way to control not only excitons but also other important molecular states that are essential for chemical reactions, energy transfer, and many other processes.

The team also demonstrated that terahertz light, which is invisible to the human eye, can be converted into visible light within a molecule, revealing a novel way to transform one type of light into another through molecular energy changes.

“While excitons typically form when light is absorbed by a material, our findings reveal they can also be created through charged states using these specially designed terahertz pulses,” says Professor Ikufumi Katayama, the study’s corresponding author from the Faculty of Engineering at Yokohama National University.

“This opens new possibilities for controlling how charge moves within molecules, which could lead to better solar cells, smaller light-based devices, and faster electronics.”

The team’s main achievement was the ability to control exciton formation at the single-molecule level. Professor Jun Takeda, another corresponding author from the Faculty of Engineering at Yokohama National University, explains, “By precisely controlling how electrons move between a single molecule and the metal tip of the specialized microscope, we were able to guide exciton formation and the chemical reactions that follow.

“These processes usually happen randomly, but with terahertz pulses, we can determine exactly when and how reactions occur at the molecular level. This could lead to breakthroughs in nanotechnology, advanced materials, and more efficient catalysts for energy and industry.”

More information: Kensuke Kimura et al, Ultrafast on-demand exciton formation in a single-molecule junction by tailored terahertz pulses, Science (2025). DOI: 10.1126/science.ads2776. www.science.org/doi/10.1126/science.ads2776

Journal information: Science

Provided by Yokohama National University

A pinch of salt can steer colloids for improved water purification and drug delivery

The ability to better steer particles suspended in liquids could lead to better water purification processes, new drug delivery systems, and other applications. The key ingredient, say Yale researchers, is a pinch of salt.

The research team, led by Prof. Amir Pahlavan, has published their results in Physical Review Letters.

The phenomenon of diffusiophoresis causes suspended particles known as colloids to move due to differing concentrations of a dissolved substance—the gradient—in the solution. Haoyu Liu, a graduate student in Pahlavan’s lab, notes that the phenomenon was discovered more than 50 years ago, yet its applications in microfluidics and implications in environmental flows have just recently been realized.
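To leading order, diffusiophoretic drift follows u = Γ·∇(ln c), where c is the solute concentration and the mobility Γ is of order hundreds of µm²/s for common salts. The numbers below are textbook-scale illustrations of that relation, not values from the Yale study:

```python
import numpy as np

# Diffusiophoretic drift: u = Gamma * d(ln c)/dx.
# Gamma and the gradient are order-of-magnitude illustrations.
Gamma = 5e-10                        # m^2/s (~500 um^2/s), typical salt mobility
c_high, c_low, L = 10.0, 1.0, 1e-3   # tenfold concentration change over 1 mm

grad_ln_c = np.log(c_high / c_low) / L   # ~2.3e3 per meter
u = Gamma * grad_ln_c                    # drift speed, ~1 um/s

print(f"drift speed: {u*1e6:.2f} um/s")
```

Micron-per-second drifts are slow on human scales but substantial for micron-sized colloids in microfluidic channels, which is why modest salt gradients can steer them.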

“Chemical gradient is actually everywhere in our natural systems and also in our industrial processes,” said Liu, lead author of the study. “So this phenomenon has drawn very much interest from scientists and engineers.”

Scientists have traditionally used electric or magnetic fields to manipulate colloids. But Pahlavan and Liu report that varying concentrations of salt can lead to the spontaneous motion of colloids. These effects can lead to unexpected outcomes, and even create a swirling vortex that reverses the particles’ usual paths.

Using gradients in salt, polymer, or other molecular solutes, they say, offers some advantages over the other processes.

“One is the simplicity of using a salt gradient,” said Pahlavan, assistant professor of mechanical engineering & materials science. “All you need is just more salty or less salty water to manipulate the colloids, as opposed to a more sophisticated setup.”

This process might be most useful in natural systems

“You don’t always have an electric field or a magnetic field, but you do always have a salt or chemical gradient, either because of human activities, or because of many other processes that might happen in nature.”

As far as applications, Pahlavan and Liu said it could have benefits for environmental cleanups.

“With contaminant remediation, where they inject polymer particles to react with a chemical plume somewhere to prevent its further spreading, we can utilize the solute gradients to make sure that the particles that are being injected end up at the right location,” Pahlavan said.

Drug delivery is another potential application.

“Here, you want to deliver particles to certain tumor cells or perhaps a biofilm,” he said. “Maybe we can use the solute gradients to guide the particles to go where we want. These are hidden targets whose location we don’t know a priori; solute gradients, however, could steer the colloids toward the right destination.”

More information: Haoyu Liu et al, Diffusioosmotic Reversal of Colloidal Focusing Direction in a Microfluidic T-Junction, Physical Review Letters (2025). DOI: 10.1103/PhysRevLett.134.098201

Journal information: Physical Review Letters

Provided by Yale University

Researchers address material challenges to make commercial fusion power a reality

Imagine if we could take the energy of the sun, put it in a container, and use it to provide green, sustainable power for the world. Creating commercial fusion power plants would essentially make this idea a reality. However, there are several scientific challenges to overcome before we can successfully harness fusion power in this way.

Researchers from the U.S. Department of Energy (DOE) Ames National Laboratory and Iowa State University are leading efforts to overcome material challenges that could make commercial fusion power a reality. The research teams are part of a DOE Advanced Research Projects Agency-Energy (ARPA-E) program called Creating Hardened And Durable fusion first Wall Incorporating Centralized Knowledge (CHADWICK). They will investigate materials for the first wall of a fusion reactor. The first wall is the structure that surrounds the fusion reaction, so it bears the brunt of the extreme environment in the fusion reactor core.

ARPA-E recently selected 13 projects under the CHADWICK program. Of those 13, Ames Lab leads one of the projects and is collaborating alongside Iowa State on another project, which is led by Pacific Northwest National Laboratory (PNNL).

According to Nicolas Argibay, a scientist at Ames Lab and lead of one project, one of the key challenges in harnessing fusion-based power is containing the plasma core that creates the energy. The plasma is like a miniature sun that needs to be contained by materials that can withstand a combination of extreme temperature, irradiation, and magnetic fields while efficiently extracting heat for conversion to electricity.

Argibay explained that in the reactor core, the plasma is contained by a strong magnetic field, and the first wall would surround this environment. The first wall has two layers of material, one that is closest to the strong magnetic and plasma environments, and one that will help move the energy along to other parts of the system.

The first layer material needs to be structurally sound, resisting cracking and erosion over time. Argibay also said that it cannot stay radioactive for very long, so that the reactor can be turned on and off for maintenance without endangering anyone working on it. The project he is leading is focused on the first layer material.

“I think one of the things we [at Ames Lab] bring is a unique capability for materials design, but also, very importantly, for processing them. It is hard to make and manage these materials,” said Argibay. “On the project I’m leading, we’re using tungsten as a major constituent, and with the exception of some forms of carbon, like diamond, that’s the highest melting temperature element on the periodic table.”

Specialized equipment is necessary to process and test refractory materials, which have extremely high melting temperatures. In Argibay’s lab, the first piece of equipment obtained is a commercial, modular, customizable, open-architecture platform for both making refractory materials and exploring advanced and smart manufacturing methods to make the process more efficient and reliable.

“Basically, we can make castings and powders of alloys up to and including pure tungsten, which is the highest melting temperature element other than diamond,” said Argibay.

By spring of 2025, Argibay said that they will have two additional systems in place for creating these refractory materials at both lab-scale and pilot-scale quantities. He explained it is easier to make small quantities (lab-scale) than larger quantities (pilot-scale), but the larger quantities are important for collecting meaningful and useful data that can translate to a real-world application.

Argibay also has capabilities for measuring the mechanical properties of refractory materials at relevant temperatures. Systems capable of making measurements well above 1,000°C (1,832°F) are rare. Ames Lab now has one of the only commercial testers in the country that can measure tensile properties of alloys at temperatures up to 1,500°C (2,732°F), which puts the lab in a unique position to both support process science and alloy design.

Jordan Tiarks, another scientist at Ames Lab who is working on the project led by PNNL, is focused on a different aspect of this reactor research. His team is relying on Ames Lab’s 35 years of experience leading the field in gas atomization, powder metallurgy, and technology transfer to industry to develop materials for the first wall structural material.

“The first wall structural material is actually the part that holds it all together,” said Tiarks. “You need to have more complexity and more structural strength. You might have things like cooling channels that need to be integrated in the structural wall so that we can extract all of that heat, and don’t just melt the first wall material.”

Tiarks’s team hopes to utilize over a decade of research focused on developing a unique way of creating oxide dispersion strengthened (ODS) steel for next generation nuclear fission. ODS steel contains very small ceramic particles (nanoparticles) that are dispersed throughout the steel. These particles improve the metal’s mechanical properties and ability to withstand high irradiation.

“What this project does is it takes all of our lessons learned on steels, and we’re going to apply them to a brand-new medium, a vanadium-based alloy that is well suited for nuclear fusion,” said Tiarks.

The major challenge Tiarks’s team faces is how vanadium behaves differently from steel. Vanadium has a much higher melting point, and it is more reactive than steel, so it cannot be contained with ceramic. Instead, his team must use a slightly different process for creating vanadium-based powders.

“We use high pressure gas to break up the molten material into tiny droplets which rapidly cool to create the powders we’re working with,” explained Tiarks. “And [in this case] we can’t use any sort of ceramic to be able to deliver the melt. So what we have to do is called ‘free fall gas atomization.’ It is essentially a big opening in a gas die where a liquid stream pours through and we use supersonic gas jets to attack that liquid stream.”

There are some challenges with the method Tiarks described. First, he said that it is less efficient than other methods that rely on ceramics. Secondly, due to the high melting point of vanadium, it is harder to add more heat during the pouring process, which would provide more time to break up the liquid into droplets. Finally, vanadium tends to be reactive.

“Powders are reactive. If you aerosolize them, they will explode. However, a fair number of metals will form a thin oxide shell on the outside layer that can help ‘passivate’ them from further reactions,” Tiarks explained. “It’s kind of like an M&M. It’s the candy coating on the outside that protects the rest of the powder particle from further oxidizing.

“A lot of the research we’ve done in the Ames lab is actually figuring out how we passivate these powders so you can handle them safely, so they won’t further react, but without degrading too much of the performance of those powders by adding too much oxygen. If you oxidize them fully, all of a sudden, now we have a ceramic particle, and it’s not a metal anymore, and so we have to be very careful to control the passivation process.”

Tiarks explained that discovering a powder processing method for vanadium-based materials will make them easier to form into the complicated geometric shapes that are necessary for the second layer to function properly. Additionally, vanadium will not interfere with the magnetic fields in the reactor core.

Sid Pathak, an assistant professor at Iowa State, is leading the group that will test the material samples for the second layer. When the material powder made by the Ames Lab group is ready, it will be formed into plates at PNNL by spraying the powder onto a surface and consolidating it with friction stir processing.

“Once you make that plate, we need to test its properties, particularly its response under the extreme radiation conditions present in a fusion reactor, and make sure that we get something better than what is currently available,” said Pathak. “That’s our claim, that our materials will be superior to what is used today.”

Pathak explained that it can take 10–20 years for radiation damage to show up on materials in a nuclear reactor. It would be impossible to recreate that timeline during a 3-year research project. Instead, his team uses ion irradiation to test how materials respond in extreme environments. For this process, his team uses a particle accelerator at the University of Michigan’s Michigan Ion Beam Laboratory to bombard a material with ions. The results simulate how a material is affected by radiation.

“Ion irradiation is a technique where you radiate [the material] with ions instead of neutrons. That can be done in a matter of hours,” said Pathak. “Also, the material does not become radioactive after ion irradiation, so you can handle it much more easily.”

Despite these benefits, there is one disadvantage to using ion irradiation. The damage only penetrates the material one or two micrometers deep, meaning that it can only be seen with a microscope. For reference, the average strand of human hair is about 70-100 micrometers thick. So, testing materials at these very small depths requires specialized tools that work at micro-length scales, which are available at Pathak’s lab at Iowa State University.

“The pathway to commercial nuclear fusion power has some of the greatest technical challenges of our day but also has the potential for one of the greatest payoffs—harnessing the power of the sun to produce abundant, clean energy,” said Tiarks. “It’s incredibly exciting to be able to have a tiny role in solving that greater problem.”

“I’m very excited at the prospect that we are kind of in uncharted water. So there is an opportunity for Ames to demonstrate why we’re here, why we should continue to fund and increase funding for national labs like ours, and why we are going to tackle some things that most companies and other national labs just can’t or aren’t,” said Argibay. “We hope to be part of this next generation of solving fusion energy for the grid.”

Provided by Ames National Laboratory 

Where’s my qubit? Scientists develop technique to detect atom loss

Quiet quitting isn’t just for burned-out employees. Atoms carrying information inside quantum computers, known as qubits, sometimes vanish silently from their posts. This problematic phenomenon, called atom loss, corrupts data and spoils calculations.

But Sandia National Laboratories and the University of New Mexico have for the first time demonstrated a practical way to detect these “leakage errors” for neutral atom platforms. This achievement removes a major roadblock for one branch of quantum computing, bringing scientists closer to realizing the technology’s full potential. Many experts believe quantum computers will help reveal truths about the universe that are impossible to glean with current technology.

“We can now detect the loss of an atom without disturbing its quantum state,” said Yuan-Yu Jau, Sandia atomic physicist and principal investigator of the experiment team.

In a paper recently published in the journal PRX Quantum, the team reports its circuit-based method achieved 93.4% accuracy. The detection method enables researchers to flag and correct errors.

Detection heads off a looming crisis

Atoms are squirrely little things. Scientists control them in some quantum computers by freezing them at just above absolute zero, about -460 degrees Fahrenheit. A thousandth of a degree too warm and they spring free. Even at the right temperature, they can escape through random chance.

If an atom slips away in the middle of a calculation, “The result can be completely useless. It’s like garbage,” Jau said.

A detection scheme can tell researchers whether they can trust the result and could lead to a way of correcting errors by filling in detected gaps.

Matthew Chow, who led the research, said atom loss is a manageable nuisance in small-scale machines because they have relatively few qubits, so the odds of losing one at any given moment are generally small.

But the future has been looking bleak. Useful quantum computers will need millions of qubits. With so many, the odds of losing qubits mid-program spike. Atoms would be silently walking off the jobsite en masse, leaving scientists with the futile task of trying to use a computer that is literally vanishing before their eyes.

“This is super important because if we don’t have a solution for this, I don’t think there’s a way to keep moving forward,” Jau said.

Researchers have found ways to detect atom loss and other kinds of leakage errors in different quantum computing platforms, like those using electrically charged atoms, called trapped ion qubits, instead of neutral ones. The New Mexico-based team is the first to non-destructively detect atom loss in neutral atom systems. By implementing simple circuit-based techniques to detect leakage errors, the team is helping avert the crisis of uncontrollable future leakage.

Just don’t look

The dilemma of detecting atom loss is that scientists cannot look at the atoms they need to preserve during computation.

“Quantum calculations are extremely fragile,” Jau said.

The operation falls apart if researchers do anything at all to observe the state of a qubit while it’s working.

Austrian physicist Erwin Schrödinger famously compared this concept to having a cat inside a box with something that will randomly kill it. According to quantum physics, Schrödinger explained, the cat can be thought of as simultaneously dead and alive until you open the box.

“It’s very easy to have a mathematical description of everything in terms of quantum computing. But to visualize entangled quantum information, it’s hard,” Jau said.

So how do you check that an atom is in the processor without observing it?

“The idea is analogous to having Schrödinger’s cat in a box, and putting that box on a scale, where the weight of the box tells you whether or not there’s a cat, but it doesn’t tell you whether the cat’s dead or alive,” Chow said.

Objective lenses on either side of the vacuum chamber are used to focus laser light into single-atom traps at Sandia National Laboratories. Credit: Craig Fritz

Surprise finding fuels breakthrough

Chow, a University of New Mexico Ph.D. student and Sandia Labs intern at the time of the research, said he never expected this breakthrough.

“This was certainly not a paper that we had planned to write,” he said.

He was debugging a small bit of quantum computing code at Sandia for his dissertation. The code diagnoses the entangling interaction—a unique quantum process that links the states of atoms—by repeatedly applying an operation and comparing the results when two atoms interact versus when only one atom is present. When the atoms interact, the repeated application of the operation makes them switch between entangled and disentangled states. In this comparison, he observed a key pattern.

Every other run, when the atoms were disentangled, the outcome for the two-atom case was markedly different from the solo-atom case.

Without trying, Chow realized, he had found a subtle signal that indicates a neighboring atom is present in a quantum computer without observing it directly. The oscillating measurement was his version of the scale: a way to tell whether the cat was still in the box without lifting the lid.
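The pattern can be illustrated with a toy two-qubit model. This is a minimal sketch, not the published circuit: the entangling operation is idealized as a controlled-Z gate, and the uncompensated phase a lone atom acquires when its partner is missing is an assumed, illustrative parameter.

```python
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)   # |+> state
CZ = np.diag([1.0, 1.0, 1.0, -1.0])        # idealized entangling gate

def rz(phi):
    # Uncompensated phase picked up by a lone atom when its partner
    # is missing (an assumed, illustrative parameter).
    return np.diag([1.0, np.exp(1j * phi)])

def p_plus(neighbor_present, n_gates, phi=np.pi / 2):
    """Probability of measuring the probe atom in |+> after n_gates."""
    if neighbor_present:
        psi = np.kron(plus, plus)          # probe (first) + neighbor
        for _ in range(n_gates):
            psi = CZ @ psi
        proj = np.kron(np.outer(plus, plus), np.eye(2))
        return float(np.real(np.conj(psi) @ proj @ psi))
    psi = plus.astype(complex)
    for _ in range(n_gates):
        psi = rz(phi) @ psi
    return float(abs(np.conj(plus) @ psi) ** 2)

# After an even number of gates the pair is disentangled again, yet the
# probe's measurement statistics reveal whether the neighbor was there:
print(p_plus(True, 2))   # 1.0  (neighbor present: the two gates cancel)
print(p_plus(False, 2))  # ~0.0 (neighbor lost: stray phase flips |+> to |->)
```

With the neighbor present, an odd number of gates leaves the pair entangled (the probe reads |+> only half the time), which matches the alternation between entangled and disentangled runs described above.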

“This was the thing that got me really excited—that made me show it to Vikas.”

Vikas Buchemmavari, another Ph.D. student at UNM and a frequent collaborator, knew more quantum theory than Chow. He works in a research group led by the director of UNM’s Center for Quantum Information and Control, Ivan Deutsch.

“I was simultaneously very impressed by the gate quality and very excited about what the idea meant: we could detect if the atom was there or not without damaging the information in it,” Buchemmavari said.

Verifying the technique

He went to work formalizing the idea into a set of code tailored to detect atom loss. It would use a second atom, not involved in any calculation, to indirectly detect whether an atom of interest is missing.

“Quantum systems are very error-prone. To build useful quantum computers, we need quantum error correction techniques that correct the errors and make the calculations reliable. Atom loss and leakage errors are some of the worst kinds of errors to deal with,” he said.

The two then developed ways to test their idea.

“You need to test not only your ability to detect an atom, but to detect an atom that starts in many different states,” Chow said. “And then the second part is to check that it doesn’t disturb that state of the first atom.”

Chow’s Sandia team jumped onboard, too, helping test the new routine and verify its results by comparing them to a method of directly observing the atoms.

“We had the capability at Sandia to verify it was working because we have this measurement where we can say the atom is in the one state or the zero state or it’s gone. A lot of people don’t have that third option,” Sandia’s Bethany Little said.

A guide for correcting atom loss

Looking ahead, Buchemmavari said, “We hope this work serves as a guide for other groups implementing these techniques to overcome these errors in their systems. We also hope this spurs deeper research into the advantages and trade-offs of these techniques in real systems.”

Chow, who has since earned his doctoral degree, said he is proud of the discovery because it shows the problem of atom loss is solvable, even if future quantum computers do not use his exact method.

“If you’re careful to keep your eyes open, you might spot something really useful.”

More information: Matthew N. H. Chow et al, Circuit-Based Leakage-to-Erasure Conversion in a Neutral-Atom Quantum Processor, PRX Quantum (2024). DOI: 10.1103/PRXQuantum.5.040343

Journal information: PRX Quantum 

Provided by Sandia National Laboratories 

Chinese detector to hunt elusive neutrinos deep underground

Underneath a granite hill in southern China, a massive detector that will sniff out the mysterious ghost particles lurking around us is nearly complete.

The Jiangmen Underground Neutrino Observatory will soon begin the difficult task of spotting neutrinos: tiny cosmic particles with a mind-bogglingly small mass.

The detector is one of three being built across the globe to study these elusive ghost particles in the finest detail yet. The other two, based in the United States and Japan, are still under construction.

Spying neutrinos is no small feat in the quest to understand how the universe came to be. The Chinese effort, set to go online next year, will push the technology to new limits, said Andre de Gouvea, a theoretical physicist at Northwestern University who is not involved with the project.

“If they can pull that off,” he said, “it would be amazing.”

What are neutrinos?

Neutrinos date back to the Big Bang, and trillions zoom through our bodies every second. They spew from stars like the sun and stream out when atomic bits collide in a particle accelerator.

Scientists have known about the existence of neutrinos for almost a century, but they’re still in the early stages of figuring out what the particles really are.

An aerial view of the Jiangmen Underground Neutrino Observatory where a cosmic detector is located 2297 feet (700 meters) underground in Kaiping, southern China’s Guangdong province on Friday, Oct. 11, 2024. Credit: AP Photo/Ng Han Guan

“It’s the least understood particle in our world,” said Cao Jun, who helps manage the detector known as JUNO. “That’s why we need to study it.”

There’s no way to spot the tiny neutrinos whizzing around on their own. Instead, scientists measure what happens when they collide with other bits of matter, producing flashes of light or charged particles.

Neutrinos bump into other particles only very rarely, so to up their chances of catching a collision, physicists have to think big.

“The solution for how we measure these neutrinos is to build very, very big detectors,” de Gouvea said.

A big detector to measure tiny particles

The $300 million detector in Kaiping, China, took over nine years to build. Its location 2,297 feet (700 meters) underground shields it from pesky cosmic rays and radiation that could throw off its neutrino-sniffing abilities.

Visitors take a train ride to visit the cosmic detector located 2297 feet (700 meters) underground at the Jiangmen Underground Neutrino Observatory in Kaiping, southern China’s Guangdong province on Friday, Oct. 11, 2024. Credit: AP Photo/Ng Han Guan

On Wednesday, workers began the final step in construction. Eventually, they’ll fill the orb-shaped detector with a liquid designed to emit light when neutrinos pass through and submerge the whole thing in purified water.

It’ll study antineutrinos—the antimatter counterparts of neutrinos, which allow scientists to understand the particles’ behavior—produced by collisions inside two nuclear power plants located over 31 miles (50 kilometers) away. When the antineutrinos come into contact with particles inside the detector, they’ll produce a flash of light.

The detector is specially designed to answer a key question about a longstanding mystery. Neutrinos switch between three flavors as they zip through space, and scientists want to rank them from lightest to heaviest.

Sensing these subtle shifts in the already evasive particles will be a challenge, said Kate Scholberg, a physicist at Duke University who is not involved with the project.

Wang Yifang, chief scientist and project manager at the Jiangmen Underground Neutrino Observatory briefs visitors on the cosmic detector located 2297 feet (700 meters) underground in Kaiping, southern China’s Guangdong province on Friday, Oct. 11, 2024. Credit: AP Photo/Ng Han Guan

“It’s actually a very daring thing to even go after it,” she said.

China’s detector is set to operate during the second half of next year. After that, it’ll take some time to collect and analyze the data—so scientists will have to keep waiting to fully unearth the secret lives of neutrinos.

Two similar neutrino detectors—Japan’s Hyper-Kamiokande and the Deep Underground Neutrino Experiment based in the United States—are under construction. They’re set to go online around 2027 and 2031, respectively, and will cross-check the Chinese detector’s results using different approaches.

“In the end, we have a better understanding of the nature of physics,” said Wang Yifang, chief scientist and project manager of the Chinese effort.

Understanding how the universe formed

Though neutrinos barely interact with other particles, they’ve been around since the dawn of time. Studying these Big Bang relics can clue scientists into how the universe evolved and expanded billions of years ago.

One question researchers hope neutrinos can help answer is why the universe is overwhelmingly made up of matter with its opposing counterpart—called antimatter—largely snuffed out.

Scientists don’t know how things got to be so out of balance, but they think neutrinos could have helped write the earliest rules of matter.

The proof, scientists say, may lie in the particles. They’ll have to catch them to find out.

© 2024 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.

Nonlinear ‘skin effect’ unveiled in antiferromagnetic materials

A team of researchers has identified a unique phenomenon, a “skin effect,” in the nonlinear optical responses of antiferromagnetic materials. The research, published in Physical Review Letters, provides new insights into the properties of these materials and their potential applications in advanced technologies.

Certain nonlinear optical effects, such as the bulk photovoltaic effect, occur only when light interacts with materials that lack inversion symmetry. It was previously thought that these effects were uniformly distributed throughout the material. However, the research team discovered that in antiferromagnets, the nonlinear optical response can be concentrated on the surfaces, similar to the “skin effect” seen in conductors, where currents flow primarily on the surface.

In this study, the team developed a custom computational method to investigate the nonlinear optical responses in antiferromagnets, using the bulk photovoltaic effect as a representative example. Their results showed that, while the global inversion symmetry was broken, the local inversion symmetry deep inside the antiferromagnet was almost untouched.

As a result, the nonlinear optical response was primarily confined to the top and bottom surfaces of the antiferromagnet, with negligible contribution from its interior.

To demonstrate the findings, the researchers conducted first-principles calculations on the two-dimensional antiferromagnet CrI3, confirming the surface-dominant behavior of the bulk photovoltaic effect. Additionally, they calculated the second-harmonic generation effect, finding consistent results with their theoretical models.

The discovery of the skin effect in nonlinear optical responses opens exciting opportunities for both fundamental science and optoelectronic technology. “It offers a new perspective on how nonlinear optical effects can be utilized in high-performance device applications,” said Prof. Shao Dingfu from the Hefei Institutes of Physical Science of the Chinese Academy of Sciences.

More information: Hang Zhou et al, Skin Effect of Nonlinear Optical Responses in Antiferromagnets, Physical Review Letters (2024). DOI: 10.1103/PhysRevLett.133.236903

Journal information: Physical Review Letters 

Provided by Chinese Academy of Sciences 

Antineutrino detection gets a boost with novel plastic scintillator

How do you find and measure nuclear particles, like antineutrinos, that travel near the speed of light?

Antineutrinos are the antimatter partners of neutrinos, among nature’s most elusive and least understood subatomic particles. They are commonly observed near nuclear reactors, which emit copious amounts of antineutrinos, but they are also found throughout the universe and are produced abundantly by Earth’s natural radioactivity, with most of the latter originating from the decay of potassium-40, thorium-232 and uranium-238 isotopes.

When an antineutrino collides with a proton, a positron and a neutron are produced—a process known as inverse beta decay (IBD). This event causes scintillating materials to light up, making it possible to detect these antineutrinos; and if they can be detected, they can be used to study the properties of a reactor’s core or Earth’s interior.
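In symbols, the IBD reaction looks like this; the approximate energy threshold is the standard textbook value (not quoted in the article), and the second line shows the delayed neutron capture on lithium-6 that the detector described below exploits:

```latex
% Inverse beta decay: prompt positron, then a delayed neutron capture
\bar{\nu}_e + p \to e^{+} + n \qquad (E_{\bar{\nu}} \gtrsim 1.8\ \mathrm{MeV})
% In a lithium-6-doped scintillator the neutron is captured locally:
n + {}^{6}\mathrm{Li} \to {}^{4}\mathrm{He} + {}^{3}\mathrm{H}
```

The prompt positron flash followed by the localized capture signal is the two-part signature that lets detectors reject background events.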

Researchers at Lawrence Livermore National Laboratory (LLNL), in partnership with Eljen Technology, are working on one possible detection solution—a plastic, lithium-6 doped scintillator for detecting reactor antineutrinos that represents over a decade of materials science research. Their research appears in the journal Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment.

Magic in plastic

In the early 2010s, LLNL materials scientist Natalia Zaitseva and her team were the first to develop a plastic scintillator capable of pulse-shape discrimination (PSD), i.e., efficiently distinguishing neutrons from gamma rays (important for detecting IBD events). Building upon this work, the new lithium-6-doped plastic scintillator formulation is also PSD-capable.

“Lithium-6 is particularly advantageous, because in addition to having a significant thermal-neutron-capture cross section, it offers a localized capture location, further enhancing the detector’s ability to effectively reject unwanted background noise,” said LLNL scientist Viacheslav “Slava” Li. This enhanced detection is made possible through the IBD process.

“While integrating lithium-6 into liquid scintillators has proven to be a challenging yet rewarding endeavor—successfully demonstrated by PROSPECT, another reactor-antineutrino experiment with fundamental LLNL contributions—achieving this in a solid, compact and easily transportable plastic scintillator has not been accomplished before, especially not at a scale suitable for effective antineutrino detection,” said Cristian Roca, LLNL scientist and corresponding author of the paper.

Compared to liquid scintillators, which have been the standard technology for reactor–antineutrino detection for decades, plastic scintillators offer superior safety and mobility with fewer of the regulatory and practicality constraints that are typically placed upon liquid scintillators and their operating environment.

Optimizing detector performance

To ready the scintillator (commercially known as EJ-299-50) for the market, researchers in LLNL’s Rare Event Detection group conducted a series of characterization measurements of the material’s performance in a large-scale detector system.

For almost six months, researchers studied the aging process of these scintillators to ensure the long-term stability of the plastic. After demonstrating the reliable optical performance and neutron identification capabilities of EJ-299-50 during this time, researchers installed 36 of the plastic scintillator “bars” in a 6 × 6 grid configuration on a detection system called the Reactor Operations Antineutrino Detection Surface Testbed Rover (ROADSTR). A follow-on study is currently underway to evaluate ROADSTR’s performance with these bars.

Alongside their scintillator work, scientists in the Rare Event Detection group are collaborating with researchers at the University of Hawai’i to improve the directional sensitivity of detectors; i.e., the ability to determine the direction of the incoming antineutrino in relation to the detector. This information can be extracted by correlating the events that take place during the IBD reaction and is especially useful in constraining the illicit production of weapons material.

The team’s research, published in Physical Review Applied and supported by the Consortium for Monitoring, Technology and Verification, explores different detector designs, finding that certain detector geometries outperform others in terms of directional resolution.

With applications in reactor safeguards and monitoring, as well as homeland security and nuclear non-proliferation, these combined research efforts are opening the door to a new era of antineutrino detection.

More information: C. Roca et al, Performance of large-scale 6Li-doped pulse-shape discriminating plastic scintillators, Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment (2024). DOI: 10.1016/j.nima.2024.169916

Mark J. Duvall et al, Directional response of several geometries for reactor-neutrino detectors, Physical Review Applied (2024). DOI: 10.1103/PhysRevApplied.22.054030

Journal information: Physical Review Applied 

Provided by Lawrence Livermore National Laboratory 

When solar probes align: Data confirms how sun’s magnetic field accelerates solar wind

When two probes orbiting the sun aligned with one another, researchers harnessed the opportunity to track the sun’s magnetic field as it traveled into the solar system. They found that the sharply oscillating magnetic field smooths out to gentle waves while accelerating the surrounding solar wind, according to a University of Michigan-led study published in The Astrophysical Journal.

The sharp S-shaped bends of the magnetic fields streaming out of the sun, called magnetic switchbacks, have long been of interest to solar scientists. Switchbacks impact the solar wind—the charged particles, or plasma, that stream from the sun and influence space weather in ways that can disrupt Earth’s electrical grids, radio waves, radar and satellites.

The new understanding of magnetic switchback changes over time will help improve solar wind forecasts to better predict space weather and its potential impacts on Earth.

“This study marks the first direct observation of switchback magnetic energy reducing with distance from the sun,” said Shirsh Soni, a research fellow of climate and space sciences engineering at the University of Michigan and corresponding author of the study.

The researchers pinpointed twelve time windows when the Parker Solar Probe and Solar Orbiter aligned. The Parker Solar Probe was positioned closest to the sun, less than 30 solar radii (Rs)—a unit of distance based on the sun’s radius—away. The Solar Orbiter was farther out, at 130 Rs from the sun, nearing the orbit of Venus, which lies around 156 Rs away.
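For scale, the solar-radius figures above convert to kilometers as follows. The solar radius and Venus's mean orbital distance used here are standard reference values, not numbers quoted in the article.

```python
# Converting the article's solar-radius distances into kilometers.
R_SUN_KM = 6.957e5        # one solar radius (Rs), IAU nominal value, in km
VENUS_ORBIT_KM = 1.082e8  # Venus's mean distance from the sun, in km

parker_km = 30 * R_SUN_KM    # Parker Solar Probe: within ~30 Rs
orbiter_km = 130 * R_SUN_KM  # Solar Orbiter: at 130 Rs
venus_rs = VENUS_ORBIT_KM / R_SUN_KM

print(f"Parker Solar Probe: within {parker_km:.2e} km of the sun")
print(f"Solar Orbiter:      about  {orbiter_km:.2e} km")
print(f"Venus orbit:        about  {venus_rs:.0f} Rs")  # ~156 Rs
```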

Comparing magnetic field and plasma moment measurements collected from both spacecraft during these windows, the researchers traced the changes in magnetic switchbacks from one point to the next.

They found that switchback patches—bundles of sharp magnetic switchbacks—smoothed out into microstreams with 30% fewer magnetic reversals, while the background proton velocity increased by 10%, indicating acceleration of the surrounding solar wind.

The research team points to magnetic relaxation as the driving force of these changes. Essentially, as magnetic switchbacks travel outwards, the highly energetic switchbacks “relax,” transferring magnetic energy into kinetic energy to accelerate the surrounding plasma.

The next step is to track where and how the magnetic energy transfer occurs and whether it converts to thermal energy alongside kinetic energy. Magnetic switchbacks have been ruled out as a cause of the sun’s curiously hot corona, but they could help solve another long-standing mystery: how the solar wind heats up as it travels through space.

“The collaboration between Parker Solar Probe and Solar Orbiter allows us to piece together this complex puzzle, marking a significant step forward in solar physics,” said Soni.

“Magnetic switchbacks are the fingerprints of the sun’s dynamic energy processes, revealing how it shapes the solar wind and, in turn, the entire solar system,” said Mojtaba Akhavan-Tafti, a U-M associate research scientist of climate and space sciences and engineering and co-corresponding author of the study.

Additional co-authors include Gabriel Ho Hin Suen and Christopher Owen of University College London; Justin Kasper of the University of Michigan; Marco Velli of the University of California, Los Angeles; and Rossana De Marco of the National Institute for Astrophysics and Institute for Space Astrophysics and Planetology in Rome, Italy.

More information: Shirsh Lata Soni et al, Switchback Patches Evolve into Microstreams via Magnetic Relaxation, The Astrophysical Journal (2024). DOI: 10.3847/1538-4357/ad94da

Journal information: Astrophysical Journal 

Provided by University of Michigan College of Engineering 

Physicists magnetize a material with light: Terahertz technique could improve memory chip design

MIT physicists have created a new and long-lasting magnetic state in a material, using only light.

In a study that appears in Nature, the researchers report using a terahertz laser—a light source that oscillates more than a trillion times per second—to directly stimulate atoms in an antiferromagnetic material. The laser’s oscillations are tuned to the natural vibrations among the material’s atoms, in a way that shifts the balance of atomic spins toward a new magnetic state.

The results provide a new way to control and switch antiferromagnetic materials, which are of interest for their potential to advance information processing and memory chip technology.

In common magnets, known as ferromagnets, the spins of atoms point in the same direction, so the whole can be easily influenced and pulled in the direction of an external magnetic field.

In contrast, antiferromagnets are composed of atoms with alternating spins, each pointing in the opposite direction from its neighbor. This up, down, up, down order essentially cancels the spins out, giving antiferromagnets a net zero magnetization that is impervious to any magnetic pull.

If a memory chip could be made from antiferromagnetic material, data could be “written” into microscopic regions of the material, called domains. A certain configuration of spin orientations (for example, up-down) in a given domain would represent the classical bit “0,” and a different configuration (down-up) would mean “1.” Data written on such a chip would be robust against outside magnetic influence.

For this and other reasons, scientists believe antiferromagnetic materials could be a more robust alternative to existing magnetic-based storage technologies. A major hurdle, however, has been in how to control antiferromagnets in a way that reliably switches the material from one magnetic state to another.

“Antiferromagnetic materials are robust and not influenced by unwanted stray magnetic fields,” says Nuh Gedik, the Donner Professor of Physics at MIT. “However, this robustness is a double-edged sword; their insensitivity to weak magnetic fields makes these materials difficult to control.”

Using carefully tuned terahertz light, the MIT team was able to controllably switch an antiferromagnet to a new magnetic state. Antiferromagnets could be incorporated into future memory chips that store and process more data while using less energy and taking up a fraction of the space of existing devices, owing to the stability of magnetic domains.

“Generally, such antiferromagnetic materials are not easy to control,” Gedik says. “Now we have some knobs to be able to tune and tweak them.”

Gedik is the senior author of the new study, which also includes MIT co-authors Batyr Ilyas, Tianchuang Luo, Alexander von Hoegen, Zhuquan Zhang and Keith Nelson, along with collaborators at the Max Planck Institute for the Structure and Dynamics of Matter in Germany, University of the Basque Country in Spain, Seoul National University, and the Flatiron Institute in New York.

Off balance

Gedik’s group at MIT develops techniques to manipulate quantum materials in which interactions among atoms can give rise to exotic phenomena.

“In general, we excite materials with light to learn more about what holds them together fundamentally,” Gedik says. “For instance, why is this material an antiferromagnet, and is there a way to perturb microscopic interactions such that it turns into a ferromagnet?”

“Generally, such antiferromagnetic materials are not easy to control,” Nuh Gedik says, pictured in between Tianchuang Luo, left, and Alexander von Hoegen. Additional MIT co-authors include Batyr Ilyas, Zhuquan Zhang and Keith Nelson. Credit: Adam Glanzman

In their new study, the team worked with FePS3—a material that transitions to an antiferromagnetic phase at a critical temperature of around 118 Kelvin (-247 degrees Fahrenheit).

The team suspected they might control the material’s transition by tuning into its atomic vibrations.

“In any solid, you can picture it as different atoms that are periodically arranged, and between atoms are tiny springs,” von Hoegen explains. “If you were to pull one atom, it would vibrate at a characteristic frequency which typically occurs in the terahertz range.”
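A quick order-of-magnitude check of this ball-and-spring picture: the spring constant and the choice of an iron atom below are illustrative assumptions, not values from the study, but the resulting frequency lands in the terahertz range von Hoegen describes.

```python
import math

# Ball-and-spring model of an atom in a crystal: f = sqrt(k/m) / (2*pi).
k = 20.0                      # interatomic "spring" stiffness, N/m (typical order)
m_fe = 55.85 * 1.66054e-27    # mass of one iron atom, kg

f_hz = math.sqrt(k / m_fe) / (2 * math.pi)
print(f"vibration frequency ~ {f_hz / 1e12:.1f} THz")  # a few THz
```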

The way in which atoms vibrate also relates to how their spins interact with each other. The team reasoned that if they could stimulate the atoms with a terahertz source that oscillates at the same frequency as the atoms’ collective vibrations, called phonons, the effect could also nudge the atoms’ spins out of their perfectly balanced, magnetically alternating alignment.

Once knocked out of balance, atoms should have larger spins in one direction than the other, creating a preferred orientation that would shift the inherently nonmagnetized material into a new magnetic state with finite magnetization.

“The idea is that you can kill two birds with one stone: You excite the atoms’ terahertz vibrations, which also couples to the spins,” Gedik says.

Shake and write

To test this idea, the team worked with a sample of FePS3 that was synthesized by colleagues at Seoul National University. They placed the sample in a vacuum chamber and cooled it down to temperatures at and below 118 K.

They then generated a terahertz pulse by aiming a beam of near-infrared light through an organic crystal, which transformed the light into terahertz frequencies, and directed this terahertz light toward the sample.

“This terahertz pulse is what we use to create a change in the sample,” Luo says. “It’s like ‘writing’ a new state into the sample.”

To confirm that the pulse triggered a change in the material’s magnetism, the team also aimed two near-infrared lasers at the sample, each with an opposite circular polarization. If the terahertz pulse had no effect, the researchers should see no difference in the intensity of the transmitted infrared lasers.

“Just seeing a difference tells us the material is no longer the original antiferromagnet, and that we are inducing a new magnetic state, by essentially using terahertz light to shake the atoms,” Ilyas says.

Over repeated experiments, the team observed that a terahertz pulse successfully switched the previously antiferromagnetic material to a new magnetic state—a transition that persisted for a surprisingly long time, over several milliseconds, even after the laser was turned off.

“People have seen these light-induced phase transitions before in other systems, but typically they live for very short times on the order of a picosecond, which is a trillionth of a second,” Gedik says.

A few milliseconds gives scientists a decent window of time during which they can probe the properties of the temporary new state before it settles back into its inherent antiferromagnetism.

Then, they might be able to identify new knobs to tweak antiferromagnets and optimize their use in next-generation memory storage technologies.

More information: Nuh Gedik et al, Terahertz field-induced metastable magnetization near criticality in FePS3, Nature (2024). DOI: 10.1038/s41586-024-08226-x

Journal information: Nature 

Provided by Massachusetts Institute of Technology