A spinning neutron disintegrates into a proton, electron, and antineutrino when a down quark in the neutron emits a W boson and converts into an up quark. The exchange of quanta of light (γ) among charged particles changes the strength of this transition. Credit: Vincenzo Cirigliano, Institute for Nuclear Theory
Outside atomic nuclei, neutrons are unstable particles, with a lifetime of about fifteen minutes. The neutron disintegrates due to the weak nuclear force, leaving behind a proton, an electron, and an antineutrino. The weak nuclear force is one of the four fundamental forces in the universe, along with the strong force, the electromagnetic force, and the gravitational force.
Comparing experimental measurements of neutron decay with theoretical predictions based on the weak nuclear force can reveal as-yet undiscovered interactions. To do so, researchers must achieve extremely high levels of precision. A team of nuclear theorists has uncovered a new, relatively large effect in neutron decay that arises from the interplay of the weak and electromagnetic forces.
This research identified a shift in the strength with which a spinning neutron experiences the weak nuclear force. This has two major implications. First, scientists have known since 1956 that due to the weak force, a system and one built like its mirror image do not behave in the same way. In other words, mirror reflection symmetry is broken. This research affects the search for new interactions, technically known as “right-handed currents,” that, at very short distances of less than one hundred quadrillionths of a centimeter, restore the universe’s mirror-reflection symmetry. Second, this research points to the need to compute electromagnetic effects with higher precision. Doing so will require the use of future high-performance computers.
A team of researchers computed the impact of electromagnetic interactions on neutron decay due to the emission and absorption of photons, the quanta of light. The team included nuclear theorists from the Institute for Nuclear Theory at the University of Washington, North Carolina State University, the University of Amsterdam, Los Alamos National Laboratory, and Lawrence Berkeley National Laboratory and their results have been published in Physical Review Letters.
The calculation was performed with a modern method, known as “effective field theory,” that efficiently organizes the importance of fundamental interactions in phenomena involving strongly interacting particles. The team identified a new percent-level shift to the nucleon axial coupling, gA, which governs the strength of decay of a spinning neutron. The new correction originates from the emission and absorption of electrically charged pions, which are mediators of the strong nuclear force. While effective field theory provides an estimate of the uncertainties, improving on the current precision will require advanced calculations on Department of Energy supercomputers.
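For orientation, the neutron lifetime depends on gA through the standard leading-order relation (schematic; phase-space factors and radiative corrections omitted):

\[
\tau_n^{-1} \propto G_F^2\,|V_{ud}|^2\,\bigl(1 + 3\lambda^2\bigr), \qquad \lambda \equiv g_A/g_V ,
\]

where G_F is the Fermi constant and V_ud a quark-mixing matrix element. With λ ≈ 1.27, a one-percent shift in gA moves the predicted decay rate by roughly 1.7 percent, which is why a percent-level correction matters for precision comparisons.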
The researchers also assessed the impact on searches for right-handed currents. They found that after including the new correction, experimental data and theory are in good agreement, and current uncertainties still allow for new physics at a relatively low mass scale.
More information: Vincenzo Cirigliano et al, Pion-Induced Radiative Corrections to Neutron β Decay, Physical Review Letters (2022). DOI: 10.1103/PhysRevLett.129.121801
Two-time correlation functions of the Ce-based MG measured by HP-XPCS at different pressures during compression. At each pressure, the width of the reddish diagonal contour is proportional to the relaxation time, which broadens below 2.9 GPa and then narrows during further compression. Credit: Dr. Qiaoshi Zeng of HPSTAR
A major stumbling block in our understanding of glass and glass phenomena is the elusive relationship between relaxation dynamics and glass structure. A team led by Dr. Qiaoshi Zeng from HPSTAR recently developed a new in situ high-pressure wide-angle X-ray photon correlation spectroscopy method to enable atomic-scale relaxation dynamics studies in metallic glass systems under extreme pressures. The study is published in Proceedings of the National Academy of Sciences (PNAS).
Metallic glasses (MGs), with many properties superior to those of both conventional metals and glasses, have been the focus of worldwide research. As thermodynamically metastable materials, like typical glasses, MGs continually and spontaneously evolve toward more stable states through various relaxation processes.
These relaxation behaviors have significant effects on the physical properties of MGs. Still, until now, scientists’ ability to deepen the understanding of glass relaxation dynamics and especially its relationships with atomic structures has been limited by the available techniques.
“Thanks to the recent improvements in synchrotron X-ray photon correlation spectroscopy (XPCS), measuring the collective particle motions of glassy samples with a high resolution and broad coverage in the time scale is possible, and thus, various microscopic dynamic processes otherwise inaccessible have been explored in glasses,” said Dr. Zeng.
“However, the change in atomic structures is subtle in previous relaxation measurements, which makes it difficult to probe the relationship between structure and relaxation behavior. To overcome this problem, we decided to employ high pressure, because it can effectively alter the structure of various materials, including MGs.”
To this end, the team developed in situ high-pressure synchrotron wide-angle XPCS to probe a cerium-based MG during compression. The measurements revealed that the collective atomic motion initially slows down with increasing density, as generally expected. Then, counter-intuitively, it accelerates with further compression, revealing an unusual non-monotonic, pressure-induced crossover in the steady-state relaxation dynamics at ~3 GPa.
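For context, XPCS extracts relaxation times from intensity correlations of coherent X-ray speckle; a minimal sketch of the standard relations (textbook expressions, not specific to this study) is

\[
g_2(q,t) = \frac{\langle I(q,t_0)\,I(q,t_0+t)\rangle}{\langle I(q)\rangle^2} = 1 + \beta\,\bigl|F(q,t)\bigr|^2 , \qquad F(q,t) \approx e^{-(t/\tau)^{\gamma}} ,
\]

where β is the setup's coherence contrast, F(q,t) the intermediate scattering function, and τ the structural relaxation time. The two-time correlation maps shown in the figure generalize g₂ to track how τ evolves during compression.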
Furthermore, by combining these results with in situ high-pressure synchrotron X-ray diffraction, the team showed that the relaxation dynamics anomaly closely correlates with dramatic changes in local atomic structure during compression, rather than scaling monotonically with either the sample density or the overall stress level.
“As density increases, it generally becomes more difficult for the atoms in a glass to move or diffuse, which slows down its relaxation dynamics. This is what we normally expect from hydrostatic compression,” Dr. Zeng explained.
“So the non-monotonic relaxation behavior observed here in the cerium-based MG under pressure is quite unusual. It indicates that, besides density, structural details could also play an important role in glass relaxation dynamics,” he added.
These findings demonstrate a close relationship between glass relaxation dynamics and atomic structure in MGs. The technique Dr. Qiaoshi Zeng’s group developed can also be extended to explore the relationship between relaxation dynamics and atomic structures in various glasses, especially those significantly tunable by compression, offering new opportunities for glass relaxation dynamics studies under extreme conditions.
More information: Qiaoshi Zeng et al, Pressure-induced nonmonotonic cross-over of steady relaxation dynamics in a metallic glass, Proceedings of the National Academy of Sciences (2023). DOI: 10.1073/pnas.230228112
Bosonic correlated insulator.(A) Illustration of a bosonic correlated insulator consisting of interlayer excitons. Magenta spheres indicate holes and cyan spheres, electrons. (Inset) Type II band alignment of WSe2/WS2 heterostructure. (B) Schematics of continuous-wave pump probe spectroscopy. The exciton and electron density are independently controlled by pump light and electrostatic gate. Red and green shading correspond to wide-field pump light and focused probe light, respectively. (C and E) Gate-dependent PL (C) and absorption (E) spectra of a 60°-aligned WSe2/WS2 moiré bilayer (device D1) at zero pump intensity. The PL peak shows a sudden blue shift at electron filling νe= 1 and 2 (yellow arrows), where the absorption spectrum shows kinks and splitting. (D and F) Pump intensity–dependent PL (D) and absorption (F) spectra of device D1 at charge neutrality. Right axes show dipolar-interaction–induced interlayer exciton energy shift Δdipole, which is approximately proportional to νex. The dominant PL peak in (D) at low and high pump intensity are labeled as peak I and II, respectively. All measurements are performed at a base temperature of 1.65 K. Credit: Science (2023). DOI: 10.1126/science.add5574
Take a lattice—a flat section of a grid of uniform cells, like a window screen or a honeycomb—and lay another, similar lattice above it. But instead of trying to line up the edges or the cells of both lattices, give the top grid a twist so that you can see portions of the lower one through it. This new, third pattern is a moiré, and it’s between this type of overlapping arrangement of lattices of tungsten diselenide and tungsten disulfide where UC Santa Barbara physicists found some interesting material behaviors.
“We discovered a new state of matter—a bosonic correlated insulator,” said Richen Xiong, a graduate student researcher in the group of UCSB condensed matter physicist Chenhao Jin, and the lead author of a paper that appears in the journal Science.
According to Xiong, Jin and collaborators from UCSB, Arizona State University and the National Institute for Materials Science in Japan, this is the first time such a material—a highly ordered crystal of bosonic particles called excitons—has been created in a “real” (as opposed to synthetic) matter system.
“Conventionally, people have spent most of their efforts to understand what happens when you put many fermions together,” Jin said. “The main thrust of our work is that we basically made a new material out of interacting bosons.”
Bosonic, correlated, insulator
Subatomic particles come in one of two broad types: fermions and bosons. One of the biggest distinctions is in their behavior, Jin said.
“Bosons can occupy the same energy level; fermions don’t like to stay together,” he said. “Together, these behaviors construct the universe as we know it.”
Fermions, such as electrons, underlie the matter with which we are most familiar, as they are stable and interact through the electrostatic force. Meanwhile, bosons such as photons (particles of light) tend to be more difficult to create or manipulate, as they are either fleeting or do not interact with each other.
A clue to their distinct behaviors is in their different quantum mechanical characteristics, Xiong explained. Fermions have half-integer “spins,” such as 1/2 or 3/2, while bosons have whole-integer spins (1, 2, etc.). An exciton is a state in which a negatively charged electron (a fermion) is bound to its positively charged opposite, a “hole” (another fermion), with the two half-integer spins together adding up to a whole integer, creating a bosonic particle.
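In the language of angular-momentum addition, combining two spin-1/2 particles can only yield integer total spin:

\[
\tfrac{1}{2} \otimes \tfrac{1}{2} = 0 \oplus 1 ,
\]

so an exciton carries total spin 0 or 1 and obeys Bose statistics.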
To create and identify excitons in their system, the researchers layered the two lattices and shone strong lights on them in a method they call “pump-probe spectroscopy.” The combination of particles from each of the lattices (electrons from the tungsten disulfide and the holes from the tungsten diselenide) and the light created a favorable environment for the formation of and interactions between the excitons while allowing the researchers to probe these particles’ behaviors.
“And when these excitons reached a certain density, they could not move anymore,” Jin said. Thanks to strong interactions, the collective behaviors of these particles at a certain density forced them into a crystalline state, and created an insulating effect due to their immobility.
“What happened here is that we discovered the correlation that drove the bosons into a highly ordered state,” Xiong added. Generally, a loose collection of bosons at ultracold temperatures will form a condensate; but in this system, with light, increased density, and stronger interactions at relatively higher temperatures, the bosons organized themselves into a symmetric, charge-neutral insulating solid.
The creation of this exotic state of matter proves that the researchers’ moiré platform and pump-probe spectroscopy could become an important means for creating and investigating bosonic materials.
“There are many-body phases with fermions that result in things like superconductivity,” Xiong said. “There are also many-body counterparts with bosons that are also exotic phases. So what we’ve done is create a platform, because we did not really have a great way to study bosons in real materials.” While excitons are well studied, he added, there hadn’t until this project been a way to coax them into interacting strongly with one another.
With their method, according to Jin, it could be possible to not only study well-known bosonic particles like excitons but also open more windows into the world of condensed matter with new bosonic materials.
“We know that some materials have very bizarre properties,” he said. “And one goal of condensed matter physics is to understand why they have these rich properties and find ways to make these behaviors come out more reliably.”
More information: Richen Xiong et al, Correlated insulator of excitons in WSe2/WS2 moiré superlattices, Science (2023). DOI: 10.1126/science.add5574
Illustration of a unidirectional flow as investigated in the new study using a Lennard-Jones fluid as an example. The three-dimensional nonequilibrium system is set in motion (red arrows) by a force field (blue arrows) acting along the x-axis. Credit: Matthias Schmidt
Living organisms, ecosystems and the planet Earth are, from a physics point of view, examples of extraordinarily large and complex systems that are not in thermal equilibrium. To physically describe non-equilibrium systems, dynamic density functional theory has been used to date.
However, this theory has weaknesses, as physicists from the University of Bayreuth have now shown in an article published in the Journal of Physics: Condensed Matter. Power functional theory proves to perform substantially better—in combination with artificial intelligence methods, it enables more reliable descriptions and predictions of the dynamics of non-equilibrium systems over time.
Many-particle systems are systems composed of atoms, electrons, molecules, or other particles invisible to the eye. They are in thermal equilibrium when the temperature is balanced and no heat flows. A system in thermal equilibrium changes its state only when external conditions change. Density functional theory is tailor-made for the study of such systems.
For more than half a century, it has proven its value in chemistry and materials science. Based on a powerful classical variant of this theory, states of equilibrium systems can be described and predicted with high accuracy. Dynamic density functional theory (DDFT) extends the scope of this theory to non-equilibrium systems: the physical understanding of systems whose states are not fixed by their external boundary conditions.
These systems have dynamics of their own: they are able to change their states without external influences acting on them. Findings and methods of DDFT are therefore of great interest, for example, for the study of models of living organisms or of microscopic flows.
The error potential of dynamic density functional theory
However, DDFT uses an auxiliary construction to make non-equilibrium systems accessible to physical description. It translates the continuous dynamics of these systems into a temporal sequence of equilibrium states. This results in a potential for errors that should not be underestimated, as the Bayreuth team led by Prof. Dr. Matthias Schmidt shows in the new study.
The investigations focused on a comparatively simple example: the unidirectional flow of a model system known in physics as a “Lennard-Jones fluid.” If this nonequilibrium system is interpreted as a chain of successive equilibrium states, one aspect of its time-dependent dynamics is neglected, namely the flow field. As a result, DDFT may provide inaccurate descriptions and predictions.
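Schematically, DDFT evolves the one-body density ρ(r, t) with a purely diffusive equation built on the equilibrium free-energy functional F[ρ]:

\[
\frac{\partial \rho(\mathbf{r},t)}{\partial t} = \Gamma\, \nabla \cdot \Bigl[ \rho(\mathbf{r},t)\, \nabla \frac{\delta F[\rho]}{\delta \rho(\mathbf{r},t)} \Bigr] ,
\]

where Γ is a mobility coefficient. Because the right-hand side is fixed entirely by the instantaneous density profile, as it would be in equilibrium, genuinely nonequilibrium contributions such as the flow field drop out; this adiabatic approximation is precisely what the Bayreuth study scrutinizes.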
“We do not deny that dynamic density functional theory can provide valuable insights and suggestions when applied to nonequilibrium systems under certain conditions. The problem, however, and we want to draw attention to this in our study using fluid flow as an example, is that it is not possible to determine with sufficient certainty whether these conditions are met in any particular case. The DDFT does not provide any control over whether the restricted framework conditions are given under which it enables reliable calculations. This makes it all the more worthwhile to develop alternative theoretical concepts for understanding nonequilibrium systems,” says Prof. Dr. Daniel de las Heras, first author of the study.
Power functional theory proves to perform substantially better
For ten years, the research team around Prof. Dr. Matthias Schmidt has been making significant contributions to the development of a still young physical theory, which has so far proven to be very successful in the physical study of many-particle systems: power functional theory (PFT). The physicists from Bayreuth are pursuing the goal of being able to describe the dynamics of non-equilibrium systems with the same precision and elegance with which classical density functional theory enables the analysis of equilibrium systems.
In their new study, they now use the example of a fluid flow to show that power functional theory is significantly superior to DDFT when it comes to understanding non-equilibrium systems. PFT allows the dynamics of these systems to be described without having to take a detour via a chain of successive equilibrium states in time. The decisive factor here is the use of artificial intelligence. Machine learning opens up the time-dependent behavior of the fluid flow by including all factors relevant to the system’s inherent dynamics—including the flow field. In this way, the team has even succeeded in controlling the flow of the Lennard-Jones fluid with high precision.
“Our investigation provides further evidence that power functional theory is a very promising concept that can be used to describe and explain the dynamics of many-particle systems. In Bayreuth, we intend to further elaborate this theory in the coming years, applying it to nonequilibrium systems that have a much higher degree of complexity than the fluid flow we studied. In this way, PFT will be able to supersede dynamic density functional theory, whose systemic weaknesses it avoids according to our findings so far. The original density functional theory, which is tailored to equilibrium systems and has proven its worth, is retained as an elegant special case of PFT,” says Prof. Dr. Matthias Schmidt, who is chair of theoretical physics II at the University of Bayreuth.
More information: Daniel de las Heras et al, Perspective: How to overcome dynamical density functional theory, Journal of Physics: Condensed Matter (2023). DOI: 10.1088/1361-648X/accb33
Schematic top-view layout of the NA61/SHINE experiment in the configuration used during the 2017 proton data taking. In 2016 the forward time projection chambers were not present. The S5 scintillator was not used in this trigger configuration. Credit: Physical Review D (2023). DOI: 10.1103/PhysRevD.107.072004
At the time of the Big Bang, 13.8 billion years ago, every particle of matter is thought to have been produced together with an antimatter equivalent of opposite electrical charge. But in the present-day universe, there is much more matter than antimatter. Why this is the case is one of physics’ greatest questions.
The answer may lie, at least partly, in particles called neutrinos, which lack electrical charge, are almost massless and change their identity—or “oscillate”—from one of three types to another as they travel through space. If neutrinos oscillated in a different way to their antimatter equivalents, antineutrinos, they could help explain the matter–antimatter imbalance in the universe.
Experiments across the world, such as the NOvA experiment in the US, are investigating this possibility, as will next-generation experiments including DUNE. In these long-baseline neutrino-oscillation experiments, a beam of neutrinos is measured after it has traveled a long distance—the long baseline. The experiment is then run with a beam of antineutrinos, and the outcome is compared with that of the neutrino beam to see whether neutrinos and antineutrinos oscillate in the same way or differently.
This comparison depends on an estimation of the numbers of neutrinos in the neutrino and antineutrino beams before they travel. These beams are produced by firing beams of protons onto fixed targets. The interactions with the target create other hadrons, which are focused using magnetic “horns” and directed into long tunnels in which they transform into neutrinos and other particles. But in this multi-step process, it isn’t easy to work out the particle content of the resulting beams—including the number of neutrinos they contain—which depends directly on the proton–target interactions.
Enter the NA61 experiment at CERN, also known as SHINE. Using high-energy proton beams from the Super Proton Synchrotron and appropriate targets, the experiment can recreate the relevant proton–target interactions. NA61/SHINE has previously made measurements of electrically charged hadrons that are produced in the interactions and yield neutrinos. These measurements helped improve estimations of the content of neutrino beams used at existing long-baseline experiments.
The NA61/SHINE collaboration has now released new hadron measurements that will help improve these estimations further. This time around, using a proton beam with an energy of 120 GeV and a carbon target, the collaboration measured three kinds of electrically neutral hadrons that decay into neutrino-yielding charged hadrons.
This 120-GeV proton–carbon interaction is used to produce NOvA’s neutrino beam, and it will probably also be used to create DUNE’s beam. Estimations of the numbers of the different neutrino-yielding neutral hadrons that the interaction produces rely on computer simulations, the output of which varies significantly depending on the underlying physics details.
“Up to now, simulations for neutrino experiments that use this interaction have relied on uncertain extrapolations from older measurements with different energies and target nuclei. This new direct measurement of particle production from 120-GeV protons on carbon reduces the need for these extrapolations,” explains NA61/SHINE deputy spokesperson Eric Zimmerman.
The paper is published in the journal Physical Review D.
More information: H. Adhikary et al, Measurements of K0S, Λ, and Λ̄ production in 120 GeV/c p+C interactions, Physical Review D (2023). DOI: 10.1103/PhysRevD.107.072004
An international research team led by the Paul Scherrer Institute PSI has measured the radius of the nucleus of muonic helium-3 with unprecedented precision. The results are an important stress test for theories and future experiments in atomic physics.
1.97007 femtometers (quadrillionths of a meter): That’s how unimaginably tiny the radius of the atomic nucleus of helium-3 is. This is the result of an experiment at PSI that has now been published in the journal Science.
More than 40 researchers from international institutes collaborated to develop and implement a method that enables measurements with unprecedented precision. This sets new standards for theories and further experiments in nuclear and atomic physics.
This demanding experiment is only possible with the help of PSI’s proton accelerator facility. There Aldo Antognini’s team generates so-called muonic helium-3, in which the two electrons of the helium atom are replaced by an elementary particle called a muon. This allows the nuclear radius to be determined with high precision.
With the measurement of helium-3, the experiments on light muonic atoms have now been completed for the time being. The researchers had previously measured muonic helium-4 and, a few years ago, the atomic nucleus of muonic hydrogen and deuterium.
Muonic helium-3: Twice as slimmed-down
Helium-3 is the lighter cousin of ordinary helium, helium-4. Its atomic nucleus has two protons and two neutrons (hence the 4 after the abbreviation for the element); in helium-3, one of the neutrons is missing. The simplicity of this slimmed-down atomic nucleus is very interesting to Aldo Antognini and other physicists.
The helium-3 that PSI physicist and ETH Zurich professor Antognini is using in the current experiment lacks not only a neutron in the nucleus, but also both electrons that orbit this nucleus. The physicists replace the electrons with a negatively charged muon—hence the name muonic helium-3.
The muon is around 200 times heavier than the electron and comes much closer to the nucleus. Thus, the nucleus and the muon “sense” each other much more intensely, and their wave functions overlap more strongly, as physicists say.
That makes the muon the perfect probe for measuring the nucleus and its charge radius. This indicates the area over which the positive charge of the nucleus is distributed. Ideal for the researchers, this charge radius of the nucleus does not change when the electrons are replaced by a muon.
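The gain in sensitivity can be estimated from the Bohr radius, which scales inversely with the mass of the orbiting particle (a rough estimate that ignores reduced-mass corrections):

\[
a = \frac{\hbar^2}{Z m e^2} , \qquad \frac{a_\mu}{a_e} = \frac{m_e}{m_\mu} \approx \frac{1}{207} ,
\]

so the muon orbits about 200 times closer to the nucleus, and its probability density at the nucleus, which scales as 1/a³, is enhanced by a factor of order 10⁷.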
Antognini has experience in measuring muonic atoms. A few years ago, he carried out the same experiment with muonic hydrogen, which contains only one proton in the nucleus and whose one electron was replaced by a negatively charged muon. The results caused quite a commotion at the time, because the deviation from measurements based on other methods was surprisingly large. Some critics even considered them wrong. It has now been confirmed many times over: The results were correct.
Worldwide-unique facility enables experiments
This time Antognini will not need to exercise as much persuasive power. For one thing, he has established himself as the leading expert in this area of research. Another factor is that there was no big surprise this time. The current results from muonic helium-3 fit well with those from previous experiments in which other methods were used. However, the PSI team’s measurements are around 15 times more precise.
Negatively charged muons, and plenty of them, are the most important ingredient for the experiment. These must have a very low energy—that is, they must be very slow, at least by the standards of particle physics.
At PSI, around 500 muons per second with energies of one kiloelectron-volt can be generated. This makes the PSI proton accelerator facility, with its beamline developed in-house, the only one in the world that can deliver such slow negative muons in such large numbers.
PSI physicist Aldo Antognini is pleased that he and his team, within an international collaboration, have achieved yet another fundamental result in atomic physics. Credit: Scanderbeg Sauer Photography
Laser developed in-house was crucial for success
A crucial share of the success is due to a laser system that the researchers themselves developed. There the challenge is that the laser must fire immediately when a muon flies into the experimental setup.
To make this possible, Antognini and his team installed an extremely thin foil detector in front of the airless experimental chamber. This detects when a muon passes through the foil and signals the laser to emit a pulse of light immediately and at full power.
The researchers determine the charge radius indirectly by measuring the frequency of the laser light. When the laser frequency precisely matches the resonance of a specific atomic transition, the muon is briefly excited to a higher energy state before decaying to the ground state within picoseconds; at that point it will emit a photon in the form of an X-ray.
Finding the resonance frequency at which this transition occurs requires a lot of patience, but the reward is an extremely accurate value for the charge radius of the nucleus.
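The link between the measured transition frequency and the charge radius enters through the leading-order finite-nuclear-size energy shift, quoted here in its textbook form for orientation:

\[
\Delta E_{\text{fs}} = \frac{2\pi}{3}\, Z\alpha\, |\psi(0)|^2\, \langle r_{\text{ch}}^2 \rangle ,
\]

which displaces S states, whose wave functions ψ are nonzero at the origin, in proportion to the mean-square charge radius. Since |ψ(0)|² grows with the cube of the orbiting particle’s mass, the muonic shift is millions of times larger than the electronic one.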
New benchmark for theoretical modeling
The charge radii obtained from muonic helium-3 and helium-4 serve as important reference values for modern ab initio theories—that is, physical models that calculate the properties of complex physical systems directly from the fundamental laws of physics, without resorting to experimental data.
In the context of nuclear physics, these models offer detailed insights into the structure of light atomic nuclei and the forces between their building blocks, the protons and neutrons.
Precise knowledge of these nuclear radii is also crucial for comparisons with ongoing experiments on conventional helium ions with one electron and on neutral helium atoms with two electrons. Such comparisons provide stringent tests of quantum electrodynamics (QED) in few-body systems—the fundamental theory that describes how charged particles interact through the exchange of photons. They allow researchers to test the predictive power of our most fundamental understanding of atomic structure.
These efforts could lead to new insights into QED for bound systems—that is, systems such as atoms, in which particles are not free but bound to each other by forces—or perhaps even to indications of physical effects beyond the Standard Model of particle physics.
Follow-up experiments are currently being conducted by research teams in Amsterdam, Garching, and China, as well as in Switzerland by the Molecular Physics and Spectroscopy group led by Frédéric Merkt at ETH Zurich.
Antognini also has additional ideas for future experiments aimed at testing the theories of atomic and nuclear physics with even greater precision. One idea is to measure hyperfine splitting in muonic atoms. This refers to energy transitions between split energy levels that reveal deeper details about effects in the atomic nucleus that involve spin and magnetism.
An experiment with muonic hydrogen is currently being prepared, and an experiment with muonic helium is planned. “Many people who work in nuclear physics are very interested in it and are eagerly awaiting our results,” Antognini says.
But the energy density of the laser must be increased significantly, which will require an enormous advance in laser technology. This development is currently under way at PSI and ETH Zurich.
Scientists at Paderborn University have made a further step forward in the field of quantum research: for the first time ever, they have demonstrated a cryogenic circuit (i.e. one that operates in extremely cold conditions) that allows light quanta—also known as photons—to be controlled more quickly than ever before.
Specifically, these scientists have discovered a way of using circuits to actively manipulate light pulses made up of individual photons. This milestone could substantially contribute to developing modern technologies in quantum information science, communication and simulation. The results have now been published in the journal Optica.
Photons, the smallest units of light, are vital for processing quantum information. This often requires measuring a photon’s state in real time and using this information to actively control the luminous flux—a method known as a “feedforward operation.”
However, thus far this has butted up against technical limitations: light was measured, processed and controlled at a delay, limiting its use for complex applications. With their new method, these scientists have managed to significantly reduce the delay—to less than a quarter of a billionth of a second.
“We have managed to actively interconnect light pulses with detectors, adapted electronics and optical circuits at cryogenic temperatures. This enabled us to manipulate individual photons significantly more quickly than other research groups. The ability allows us to create new active circuits for quantum optics that can be used for a variety of applications,” explains Dr. Frederik Thiele, who is spearheading the project with Niklas Lamberty, both members of the “Mesoscopic Quantum Optics” research group at Paderborn’s Department of Physics.
The researchers used state-of-the-art technologies such as superconducting detectors for this development. These devices measure individual light quanta with extremely high precision.
The electronics were deployed in a cryogenic environment: the amplifier and modulators were operated at temperatures of around -270 degrees Celsius in order to process signals without any significant delay. Integrated modulators are optical components that control the light based on measurement data—virtually loss-free and at high speeds.
The process is based on measuring light pairs, or “correlated photons.” Based on the number of particles measured, the electronic circuit decides in a fraction of a second whether the light should be forwarded or blocked. What makes the integrated design special is that physical losses and delays can be reduced to a minimum.
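A minimal sketch of that decision logic, written here as Python pseudocode rather than the group’s actual cryogenic electronics (all names and numbers are illustrative), might look like this:

```python
from dataclasses import dataclass

@dataclass
class HeraldEvent:
    n_photons: int       # photon number resolved by the superconducting detector
    timestamp_ps: float  # detection time in picoseconds

def feedforward_gate(event: HeraldEvent, target_n: int = 2,
                     latency_ps: float = 250.0) -> tuple[bool, float]:
    """Decide whether the modulator forwards or blocks the light pulse.

    Returns (open_modulator, actuation_time_ps); the ~250 ps budget mirrors
    the sub-quarter-nanosecond feedforward delay reported in the article.
    """
    open_modulator = (event.n_photons == target_n)  # forward only on the desired count
    return open_modulator, event.timestamp_ps + latency_ps

# Example: a two-photon herald at t = 0 opens the gate about 250 ps later.
print(feedforward_gate(HeraldEvent(n_photons=2, timestamp_ps=0.0)))
```

In the real device this branch is implemented directly in hardware, since no software loop could act within a quarter of a nanosecond.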
As well as a fast response, the circuit also produces less heat, which is vital when working in cryostats (extreme cooling systems) in very small spaces.
“Our demonstration shows that we can use superconducting and semiconducting technology to achieve a new level of photonic quantum control. This opens up opportunities for fast and complex quantum circuits, which could be vitally important for quantum information science and communication,” Thiele summarizes.
A new study reveals a fresh way to control and track the motion of skyrmions—tiny, tornado-like magnetic swirls that could power future electronics. Using electric currents in a special magnetic material called Fe₃Sn₂, the team got these skyrmions to “vibrate” in specific ways, unlocking clues about how invisible spin currents flow through complex materials.
The discovery not only confirms what theory had predicted but also points to a powerful new method for detecting spin currents—a discovery that could one day lead to more efficient memory and sensing devices in future electronics. The findings are published in the journal Nature Communications.
Led by Assistant Prof. Amir Capua and Ph.D. Candidate Nirel Bernstein from the Institute of Applied Physics and Nano Center at Hebrew University in collaboration with Prof. Wenhong Wang and Dr. Hang Li from Tiangong University, the team explored how skyrmions behave in a special magnetic material called Fe₃Sn₂ (iron tin).
This material is already known to be promising for use in advanced technologies because it keeps skyrmions stable even at extreme temperatures—a key requirement for practical devices.
What are skyrmions—and why do they matter?
Skyrmions are ultra-small, stable magnetic swirls that can exist in certain materials. Because they can be manipulated with very little energy, they are being studied as building blocks for future low-power memory and computing systems.
The team discovered that by sending electrical currents through Fe₃Sn₂, they could excite certain kinds of “resonances” in the skyrmions—essentially making them vibrate in very specific ways. These vibrations, or “modes,” were detected using advanced optical techniques that observe changes in real time.
Interestingly, only two types of motion were triggered: a “breathing” mode (expanding and contracting like lungs) and a rotating motion. This confirmed earlier scientific predictions and suggests that Fe₃Sn₂ behaves differently than other magnetic materials.
A new type of spin current detected
The researchers also noticed something unexpected: The width of the resonance signal changed when they applied a steady current. Using computer simulations, they showed that this effect was caused by a “damping-like torque,” which indicates the presence of spin-polarized currents. Furthermore, they realized that the resonances of the magnetic swirls were excited by “spin-orbit torque” rather than the more familiar “spin-transfer torque.”
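In micromagnetic language, a damping-like spin-orbit torque enters the magnetization dynamics as an extra term in the Landau-Lifshitz-Gilbert equation; a generic schematic form (not the paper’s specific model) is

\[
\frac{d\mathbf{m}}{dt} = -\gamma\, \mathbf{m}\times\mathbf{H}_{\text{eff}} + \alpha\, \mathbf{m}\times\frac{d\mathbf{m}}{dt} + \tau_{\text{DL}}\, \mathbf{m}\times(\boldsymbol{\sigma}\times\mathbf{m}) ,
\]

where σ is the current-induced spin polarization. The damping-like term effectively adds to or subtracts from the Gilbert damping α, which is how a steady current can change the resonance linewidth.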
“This gives us a deeper understanding of how spin currents interact with magnetic materials, especially in systems where the internal magnetic structure is frustrated or disordered,” said Assistant Prof. Capua.
They also found signs that both real-space and momentum-space spin structures play a role in how electrons and spins move through the material, offering new clues about how to control electrical signals in future devices.
This research not only reveals new physics behind spin-torque effects but also opens up possibilities for using skyrmion resonances as highly sensitive detectors of spin currents—something that could benefit data storage, neuromorphic computing, and sensor technologies.
The study highlights how fundamental research in magnetism can lead to new tools for the electronics of tomorrow.
University of Illinois Physics Professor Paul Kwiat and members of his research group have developed a new tool for precision measurement at the nanometer scale in scenarios where background noise and optical loss from the sample are present.
This new optical interferometry technology leverages the quantum properties of light—specifically, extreme color entanglement—to enable faster and more precise measurements than widely used classical and quantum techniques can achieve.
Colin Lualdi, Illinois Physics graduate student and lead author of the study, emphasizes, “By taking advantage of both quantum interference and quantum entanglement, we can make measurements that would otherwise be difficult with existing methods.”
Lualdi says this tool has ready applications in medical diagnostics, remote system monitoring, and material characterization. The quantum properties of the new technology give it many advantages over current high-precision measurement tools used in these fields. It has increased sensitivity in cases where background noise is present, for example, when trying to measure a distant target that reflects light faintly—making it capable of taking outdoor ranging measurements in broad daylight.
It is also better for measuring samples that transmit light poorly or are sensitive to light, such as metallic thin films or biological tissues. Unlike some of the alternatives for measuring delicate samples, this technology does not require placing a physical probe in close proximity or contact with the material being measured, allowing for more versatile measurement configurations. It also takes faster measurements than some classical and quantum technologies, which will allow researchers to study dynamic systems such as vibrating surfaces—difficult with current techniques.
This technology represents a rare example of an instrument’s quantum advantages enabling immediate applications across many fields.
Kwiat explains, “It is a practical application of some very fundamental quantum mechanical effects that have been known for quite a long time and underpin a lot of quantum information processing. Our measurement hits the quantum limit of how much information can be extracted from a system.”
This research is published in Science Advances.
Interferometry: Classical versus quantum
Optical interferometry is today’s gold standard in precision measurement. It uses the interference properties of light described by classical physics to measure tiny distances. Here’s how it works: when two light waves meet and their peaks and troughs are aligned, they add to each other, interfering constructively to produce a higher-amplitude resultant wave. If, on the other hand, the peaks of one wave are aligned with the troughs of the other, they cancel one another out, interfering destructively to produce a lower-amplitude resultant wave.
The classical optical interferometer setup comprises a laser that shines a beam of light through a beam splitter. One light wave travels down the vertical arm, and the other travels down the horizontal arm. A mirror at the end of each arm reflects the light waves, which travel back to meet at the beam splitter. The lengths of the vertical and horizontal arms are arranged such that the two waves interfere destructively, canceling each other out so that no interference signal is detected.
But if the optical path length of one of the arms changes, for example, when a material of some thickness is inserted into that arm, the waves will no longer cancel completely when they meet back up at the beam splitter, creating an interference signal. Changes in the interference signal are then used to calculate the thickness of the material.
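Quantitatively, the detected intensity in a two-arm interferometer follows the familiar two-beam interference formula, stated here for reference:

\[
I = I_1 + I_2 + 2\sqrt{I_1 I_2}\,\cos\Delta\phi , \qquad \Delta\phi = \frac{2\pi\,\Delta L}{\lambda} ,
\]

where ΔL is the optical path difference between the arms. Inserting a slab of thickness d and refractive index n changes ΔL by (n − 1)d, so tracking the fringe shift yields d.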
Classical interferometry has many successful applications. It has been used to detect gravitational waves—tiny ripples in the spacetime fabric that are less than the width of a proton. It is also used in medical diagnostic tools; for example, measuring retinal thickness to detect early signs of diseases. However, classical interferometers have limitations. They struggle to measure thin samples that transmit light poorly.
Background light can also leak in, weakening the interference signals and decreasing the sensitivity of the device in the same way an overexposed photo’s saturated light makes it hard to distinguish details.
Quantum two-photon interferometry addresses these shortcomings and adds new capabilities. In quantum physics, light is treated as discrete particles called photons. These particles maintain some wave-like qualities, including interference. In the quantum interferometer, a single photon is sent down each interferometer arm. Just like in the classical case, one goes through a sample, and one is a reference. They meet up, and their relative delay produces an interference signal at the detector.
Highly nondegenerate energy-entangled two-photon interferometer. Credit: Science Advances (2025). DOI: 10.1126/sciadv.adw4938
The quantum nature of this measurement overcomes the issue of measuring low-transmission materials—the strength of the interference signature is unchanged because the low-transmission loss affects both photons equally.
Lualdi explains, “As long as you detect two photons as a part of the interference measurement, the contrast of your interference signature will remain perfectly fine, which is a huge quantum advantage.”
Furthermore, the quantum interferometer’s sensitivity is much less impacted by background light. The measurement of the interference signal is taken in a narrow time window of the photons’ arrival, around 100 picoseconds. Nearly all background light can be filtered out because it does not arrive within that narrow window, meaning the quantum measurement remains highly sensitive.
Still, there are challenges to achieving nanometer sensitivity with quantum two-photon interferometry. Typically, to reach this level of precision, the measurement needs to run for hours or employ photons having a broad color bandwidth. In the same way that white light contains all the colors of the rainbow in its spectrum, photons can also have a particular color bandwidth. These broad-bandwidth photons are very difficult to work with in the lab, and hours-long measurements have limited applicability.
Extreme color entanglement is an advantage
Quantum interferometry measurement capabilities can be increased by entangling the two photons. Entanglement is a quantum phenomenon in which the states of two particles are linked, regardless of the distance separating them. By entangling a property of the photons—in this case, their color—the interferometer sensitivity increases. The Kwiat group has bypassed the technical issues stemming from the use of broad-bandwidth photons by employing two narrow-bandwidth entangled photons prepared to have very different colors.
The greater the difference in the entangled photons’ colors, the greater the interferometer sensitivity. For example, entanglement between a strawberry-red photon and a raspberry-red photon (a wavelength difference of tens of nanometers) will produce a less sensitive interference signal than between a raspberry-red photon and a blueberry-blue photon. The latter is an example of extreme color entanglement.
“With entanglement, we only need to work with a little bit of blue and a little bit of red, instead of the whole span of colors between them,” explains Lualdi. The actual colors utilized, however, are invisible to the human eye, with wavelengths of 810 and 1,550 nanometers.
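One way to see the advantage is through the standard quantum-beat behavior of two-photon interference with nondegenerate pairs, in which the coincidence rate oscillates at the photons’ difference frequency (a textbook form, not taken from the paper):

\[
P_c(\tau) \propto 1 - V\cos(\Delta\omega\,\tau) , \qquad \Delta\omega = 2\pi c\left(\frac{1}{\lambda_1} - \frac{1}{\lambda_2}\right) ,
\]

so for λ₁ = 810 nm and λ₂ = 1550 nm one fringe corresponds to only a few femtoseconds of delay, and resolving a small fraction of a fringe translates into nanometer-scale path sensitivity.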
The team experimented with various light sources for generating extreme-color-entangled photons at the onset of this project. Their ultimate design enables a high entangled pair rate of hundreds of thousands per second, allowing faster measurements.
Having developed these advances, the team turned their attention to measuring real samples. The group collaborated with Illinois Electrical and Computer Engineering Professor Simeon Bogdanov and graduate student Swetapadma Sahoo to create a metallic thin film sample with low optical transmission—the type of sample that would show their technology’s advantages.
After measuring this sample with the new quantum interferometer, the researchers brought the sample to the Materials Research Laboratory for independent validation by atomic force microscopy. The results agreed. The new interferometer had made an accurate nanometer-scale measurement in a matter of seconds.
Future applications
The new interferometric tool holds strong implications for applications across many fields. The Kwiat team is now focused on these potential applications and possible integrations with other measurement tools.
Kwiat elaborates, “We are trying to understand how we can further tailor this technology to be useful for other measurements… looking at thin films of biological samples, for example in microscopy, and being able to combine this with other sensing modalities, like atomic force microscopy.”
The lower light intensity of the Kwiat method—their source generates two photons at a time—opens exciting avenues for biological study. One could imagine imaging a sensitive biological tissue, such as the brain or retina, faster and over a larger area than current state-of-the-art techniques such as atomic force microscopy.
In addition, the lower light intensity allows for studying the behavior of photo-sensitive microorganisms, such as algae, in the dark. Current imaging methods require a bright spotlight to be shone on these organisms, making this kind of observation impossible.
The group is also currently exploring the technology’s capability to measure vibrations, which is much more difficult to do with existing technologies.
Lualdi says, “Compared to other quantum interferometers, our system measures faster and at a higher precision, and so now we have the opportunity to study time-varying signals, such as nanometer-scale vibrations, for example.”
Manuel Endres, professor of physics at Caltech, specializes in finely controlling single atoms using devices known as optical tweezers. He and his colleagues use the tweezers, made of laser light, to manipulate individual atoms within an array of atoms to study fundamental properties of quantum systems. Their experiments have led to, among other advances, new techniques for erasing errors in simple quantum machines; a new device that could lead to the world’s most precise clocks; and a record-breaking quantum system controlling more than 6,000 individual atoms.
One nagging factor in this line of work has been the normal jiggling motion of atoms, which makes the systems harder to control. Now, reporting in the journal Science, the team has flipped the problem on its head and used this atomic motion to encode quantum information.
“We show that atomic motion, which is typically treated as a source of unwanted noise in quantum systems, can be turned into a strength,” says Adam Shaw, a co-lead author on the study along with Pascal Scholl and Ran Finkelstein.
Shaw was formerly a graduate student at Caltech during these experiments and is now a postdoctoral scholar at Stanford University. Scholl served as a postdoc at Caltech and is now working at the quantum computing company Pasqal. Finkelstein held the Troesh Postdoctoral Prize Fellowship at Caltech and is now a professor at Tel Aviv University.
Ultimately, the experiment not only encoded quantum information in the motion of the atoms but also led to a state known as hyper-entanglement. In basic entanglement, two particles remain connected even when separated by vast distances. When researchers measure the particles’ states, they observe this correlation: For example, if one particle is in a state known as spin up (in which the orientation of the angular momentum is pointing up), the other will always be spin down.
In hyper-entanglement, two characteristics of a particle pair are correlated. As a simple analogy, this would be like a set of twins separated at birth having both the same names and same types of cars: The two traits are correlated between the twins.
In the new study, Endres and his team were able to hyper-entangle pairs of atoms such that their individual states of motion and their individual electronic states—their internal energy levels—were correlated among the atoms. What is more, this experimental demonstration implies that even more traits could be entangled at the same time.
“This allows us to encode more quantum information per atom,” Endres explains. “You get more entanglement with fewer resources.”
The experiment is the first demonstration of hyper-entanglement in massive particles, such as neutral atoms or ions (earlier demonstrations used photons).
Adam Shaw, Ivaylo Madjarov and Manuel Endres work on their laser-based apparatus at Caltech. Credit: Caltech
For these experiments, the team cooled down an array of individual alkaline-earth neutral atoms confined inside optical tweezers. They demonstrated a novel form of cooling via “detection and subsequent active correction of thermal motional excitations,” says Endres, which he compares to James Clerk Maxwell’s famous 1867 thought experiment invoking a demon that measures and sorts particles in a chamber. “We essentially measure the motion of each atom and apply an operation depending on the outcome, atom-by-atom, similar to Maxwell’s demon.”
The method, which outperformed the best-known laser cooling techniques, caused the atoms to come to nearly a complete standstill.
From there, the researchers induced the atoms to oscillate like a swinging pendulum, but with an amplitude of approximately 100 nanometers, much smaller than the width of a human hair. They were able to excite the atoms into two distinct oscillations simultaneously, putting the motion in a state of superposition. Superposition is a quantum state in which a particle exhibits two opposite traits simultaneously, like a particle’s spin being both up and down at the same time.
“You can think of an atom moving in this superposition state like a kid on a swing who starts getting pushed by two parents on opposite sides, but simultaneously,” Endres says. “In our everyday world, this would certainly lead to a parental conflict; in the quantum world, we can remarkably make use of this.”
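In harmonic-oscillator language, being pushed from both sides at once can be written, schematically, as a superposition of two coherent states of motion with opposite phases (a simplified notation, not necessarily the paper’s):

\[
|\psi\rangle \propto |\alpha\rangle + |{-\alpha}\rangle ,
\]

where ±α label oscillations of equal amplitude but opposite direction.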
They then entangled the individual, swinging atoms to partner atoms, creating a correlated state of motion over several micrometers of distance. After the atoms were entangled, the team then hyper-entangled them in such a way that both the motion and the electronic states of the atoms were correlated.
“Basically, the goal here was to push the boundaries on how much we could control these atoms,” Endres says. “We are essentially building a toolbox: We knew how to control the electrons within an atom, and we now learned how to control the external motion of the atom as a whole. It’s like an atom toy that you have fully mastered.”
The findings could lead to new ways to perform quantum computing as well as quantum simulations designed to probe fundamental questions in physics. “Motional states could become a powerful resource for quantum technology, from computing to simulation to precision measurements,” Endres says.