Study finds cell cytoskeleton mimics critical phenomena seen in earthquakes and metals

Prof. Michael Murrell’s group (with lead author Zachary Gao Sun, a graduate student in physics), in collaboration with Prof. Garegin Papoian’s group at the University of Maryland, College Park, has found critical phenomena (self-organized criticality) reminiscent of earthquakes and avalanches inside the cell cytoskeleton, arising through the self-organization of purified protein components.

In a groundbreaking discovery, researchers have found that the cell’s cytoskeleton—the mechanical machinery of the cell—behaves much like Earth’s crust, constantly regulating how it dissipates energy and transmits information. This self-regulating behavior enables cells to carry out complex processes such as migration and division with remarkable precision.

Even more striking, the study draws parallels between the behavior of microscopic cellular structures and massive celestial bodies, suggesting that the principles of criticality—where systems naturally tune themselves to the brink of transformation—may be universal across vastly different scales of nature.

The results also suggest that a metal-to-insulator-like transition in information and energy propagation can be tuned via autofeedback between geometry and active stress inside the cytoskeleton, reminiscent of a phenomenon called Anderson localization that is commonly seen in condensed matter physics.

This further indicates that the cell, as a living machinery, uses energetic and mechanical principles commonly seen in non-living systems to process information through self-tuning. The work is published in the journal Nature Physics.

“Whether the cell as machinery is poised at a critical state, and further how, has been a central topic for biophysicists over the past two decades. Here, we have observed these phenomena in a well-controlled experimental setting and proposed a mechanism. Isn’t it amazing to see similarities across scales, from objects under the microscope to those seen through the telescope?” Sun commented.

Sun and colleagues have discovered that cells may regulate information and energy flow using a mechanism strikingly similar to a well-known physics phenomenon called Anderson localization—a process typically observed in non-living systems like disordered metals and insulators. The research shows that the cytoskeleton, the cell’s internal scaffolding, can undergo a metal-to-insulator–like transition in how it transmits signals and energy.

This transition appears to be finely tuned by the cell itself through feedback between its geometry and internal stress. The findings suggest that cells, like finely engineered machines, harness physical laws from condensed matter to adapt and process information—blurring the line between the living and the inanimate.
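
For readers unfamiliar with the condensed-matter analogy, the textbook setting for Anderson localization is a lattice with random on-site energies; the sketch below is standard background, not part of the new study, and the symbols (hopping amplitude t, disorder strength W, localization length ξ) are the usual textbook ones.

```latex
% Standard tight-binding (Anderson) model with on-site disorder:
H = \sum_i \varepsilon_i\, c_i^\dagger c_i \;-\; t \sum_{\langle i,j \rangle} \left( c_i^\dagger c_j + \mathrm{h.c.} \right),
\qquad \varepsilon_i \in \left[-\tfrac{W}{2}, \tfrac{W}{2}\right] \ \text{(random)}.
% For sufficiently strong disorder, eigenstates become exponentially localized,
% |\psi(\mathbf{r})| \sim e^{-|\mathbf{r}-\mathbf{r}_0|/\xi},
% and the transport of energy and information is suppressed: the "insulating" side
% of the metal-to-insulator transition invoked above.
```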

This work motivates scientists across disciplines to ask whether a scale-free, universal law of criticality truly exists, and whether each cell is its own “universe.”

Laser pulses and nanoscale changes yield stable skyrmion bags for advanced spintronics

A team of researchers at the Max Born Institute and collaborating institutions has developed a reliable method to create complex magnetic textures, known as skyrmion bags, in thin ferromagnetic films. Skyrmion bags are donut-like, topologically rich spin textures that go beyond the widely studied single skyrmions.

Magnetic skyrmions are nanometer-sized, stable magnetization vortices with promising applications in spintronics and data storage. Their simplest forms have been explored extensively and take on a circular shape where the spins rotate by 180° from the outside to the inside in a thin magnetic film. The spins in the center of the skyrmion therefore point in the opposite direction to those outside the skyrmion.

More intricate configurations include the so-called skyrmionium where the spins rotate by 360° and the spins in the skyrmionium’s center have the same orientation as those outside, resulting in a ring-like structure. Remarkably, this ring can then be filled with skyrmions again, leading to the target skyrmion for one skyrmion inside the ring and skyrmion bags for multiple skyrmions inside.
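
For reference, the standard way to classify these textures is by their topological charge, which counts how many times the magnetization wraps the unit sphere (a textbook definition, not a formula from this paper):

```latex
Q = \frac{1}{4\pi} \int \mathbf{m} \cdot \left( \partial_x \mathbf{m} \times \partial_y \mathbf{m} \right) \, dx\, dy .
```

A single skyrmion carries Q = ±1; the skyrmionium’s oppositely wound ring cancels its core, giving Q = 0; and the outer ring and the interior skyrmions of a bag contribute with opposite signs, so filling the bag changes the net charge one unit at a time.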

While theory has already predicted such higher-order configurations, they have remained difficult to produce in real materials in a controlled way.

In a new study published in Advanced Materials (see fig. 1), the research team demonstrates how nanoscale modifications of the magnetic properties of the material, introduced via focused helium-ion beams, can foster the generation of these higher-order textures.

These local anisotropy modifications are designed such that the desired structures can be formed selectively using single ultrafast laser pulses. The resulting magnetic textures with features on the sub-100-nm scale were directly imaged using a high-resolution X-ray microscope equipped with a tailored laser system developed at the Max Born Institute.

Fig. 2. X-ray microscopy images of the magnetization textures showing the different orders of skyrmion bags, ranging from the empty skyrmionium to a bag filled with four skyrmions. The scale bar is 500 nm. Credit: Advanced Materials (2025). DOI: 10.1002/adma.202501250
The researchers demonstrate the generation of a variety of skyrmion bags, from the empty skyrmionium up to bags filled with four skyrmions (see fig. 2). They showed that the skyrmion-bag generation triggered by single laser pulses has a significantly higher success rate compared to a purely magnetic-field-driven approach.

The repeatable, consistent generation of such textures is the key prerequisite for studying the dynamics of higher-order skyrmions in future time-resolved experiments. This work offers a practical route to investigate and utilize complex skyrmion states in thin-film materials, which is an important step toward future spintronic devices that leverage topological control at the nanoscale.

Physicists show tensor mesons play important role in light-on-light scattering

Usually, light waves can pass through each other without any resistance. According to the laws of electrodynamics, two light beams can exist in the same place without influencing each other; they simply overlap. Light saber battles, as seen in science fiction films, would therefore be rather boring in reality.

Nevertheless, quantum physics predicts the effect of “light-on-light scattering.” Ordinary lasers are not powerful enough to detect it, but it has been observed at the CERN particle accelerator. Virtual particles can literally emerge from nothing for a short time, interact with the photons and change their direction. The effect is extremely small, but it must be understood precisely in order to verify particle physics theories through current high-precision experiments on muons.

A team at TU Wien (Vienna) has now been able to show that a previously underestimated aspect plays an important role in this: the contribution of so-called tensor mesons. The new results have been published in the journal Physical Review Letters.

Virtual particles from nothing
When photons interact with photons, virtual particles can be created. They cannot be measured directly, as they disappear immediately. In a sense, they are constantly there and not there at the same time—quantum physics allows such superpositions of states that would be mutually exclusive according to our classical everyday understanding.

“Even though these virtual particles cannot be observed directly, they have a measurable effect on other particles,” says Jonas Mager from the Institute of Theoretical Physics at TU Wien, lead author of the study. “If you want to calculate precisely how real particles behave, you have to take all conceivable virtual particles into account correctly. That’s what makes this task so difficult—but also so interesting.”

When light scatters off light, a photon may transform, for example, into an electron-positron pair. Other photons can then interact with these two particles before the electron and positron annihilate each other and become a new photon. Things become more complicated when heavier particles are created that are also subject to strong nuclear forces—for example, mesons, which consist of a quark and an antiquark.

“There are different types of these mesons,” says Mager. “We have now been able to show that one of them, the tensor mesons, has been significantly underestimated. Through the effect of light-light scattering, they influence the magnetic properties of muons, which can be used to test the Standard Model of particle physics with extreme accuracy.”

Tensor mesons did appear in earlier calculations, but with very rough simplifications. In the new evaluation, not only does their contribution turn out to be much stronger than previously assumed, but it also has a different sign than previously thought, thus influencing the results in the opposite direction.

Unusual theoretical methods
This result also resolves a discrepancy that arose last year between the latest analytical calculations and alternative computer simulations. “The problem is that conventional analytical calculations can describe the strong interaction of quarks well only in limiting cases,” says Anton Rebhan (TU Wien).

The TU Wien team, on the other hand, used an unconventional method—holographic quantum chromodynamics. This involves mapping processes in four dimensions (i.e., three spatial dimensions and one time dimension) onto a five-dimensional space with gravity. Some problems can then be solved more easily in this other space, and the results are then transformed back again.

“The tensor mesons can be mapped onto five-dimensional gravitons, for which Einstein’s theory of gravity makes clear predictions,” explains Rebhan. “We now have computer simulations and analytical results that fit well together but deviate from certain previous assumptions. We hope that this will also provide new impetus to accelerate already planned specific experiments on tensor mesons.”

The Standard Model put to the test
These analyses are important for one of the biggest questions in physics: How reliable is the Standard Model of particle physics? This is the generally accepted quantum physical theory that describes all known types of particles and all forces of nature—except gravity.

The accuracy of the Standard Model can be investigated particularly well in a few special test cases, for example by measuring the magnetic moment of muons. For many years, scientists have been puzzling over whether certain discrepancies between theory and experiment point to “new physics” beyond the Standard Model, or whether they are simply inaccuracies or errors.

The discrepancy in the muon magnetic moment has recently become much smaller—but in order to really search for new physics, the remaining theoretical uncertainties must also be understood as precisely as possible. This is exactly what the new work contributes to.

Hybrid crystal-glass materials from meteorites transform heat control

Crystals and glasses have opposite heat-conduction properties, which play a pivotal role in a variety of technologies. These range from the miniaturization and efficiency of electronic devices to waste-heat recovery systems, as well as the lifespan of thermal shields for aerospace applications.

The problem of optimizing the performance and durability of materials used in these different applications essentially boils down to fundamentally understanding how their chemical composition and atomic structure (e.g., crystalline, glassy, nanostructured) determine their capability to conduct heat.

Michele Simoncelli, assistant professor of applied physics and applied mathematics at Columbia Engineering, tackles this problem from first principles—i.e., in Aristotle’s words, in terms of “the first basis from which a thing is known”—starting from the fundamental equations of quantum mechanics and leveraging machine-learning techniques to solve them with quantitative accuracy.

In research published on July 11 in the Proceedings of the National Academy of Sciences, Simoncelli and his collaborators, Nicola Marzari from the Swiss Federal Institute of Technology in Lausanne (EPFL) and Francesco Mauri from Sapienza University of Rome, predicted the existence of a material with hybrid crystal-glass thermal properties, and a team of experimentalists led by Etienne Balan, Daniele Fournier, and Massimiliano Marangolo from the Sorbonne University in Paris confirmed it with measurements.

The first of its kind, this material was discovered in meteorites and has also been identified on Mars. The fundamental physics driving this behavior could advance our understanding and design of materials that manage heat under extreme temperature differences—and, more broadly, provide insight into the thermal history of planets.

A unified theory of thermal transport in atomically ordered crystals and disordered glasses
Thermal conduction depends on whether a material is crystalline, with an ordered lattice of atoms, or glassy, with a disordered, amorphous structure, which influences how heat flows at the quantum level. Broadly speaking, thermal conduction in crystals typically decreases with increasing temperature, while in glasses it increases upon heating.
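
As a purely illustrative sketch of those opposing trends, and of the flat, hybrid behavior discussed below, the plot script here uses toy functional forms chosen only to mimic the qualitative behavior just described; it is not the authors’ first-principles calculation, and all numbers are made up.

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy functional forms chosen only to reproduce the qualitative trends described
# in the article; NOT the first-principles Wigner-transport calculation.
T = np.linspace(80, 380, 200)                       # temperature range probed experimentally (K)

kappa_crystal = 300.0 * (100.0 / T)                 # crystal-like: falls with heating (~1/T)
kappa_glass = 1.5 * (1 - np.exp(-T / 150.0))        # glass-like: rises, then saturates
kappa_hybrid = np.full_like(T, 5.0)                 # hybrid crystal-glass: essentially flat

plt.plot(T, kappa_crystal, label="crystal-like")
plt.plot(T, kappa_glass, label="glass-like")
plt.plot(T, kappa_hybrid, label="hybrid (tridymite-like)")
plt.xlabel("Temperature (K)")
plt.ylabel("Thermal conductivity (arb. units)")
plt.yscale("log")
plt.legend()
plt.show()
```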

In 2019, Simoncelli, Nicola Marzari, and Francesco Mauri derived a single equation that captures the opposite thermal-conductivity trends observed in crystals and glasses—and, most importantly, also describes the intermediate behavior of defective or partially disordered materials, such as those used in thermoelectrics for waste-heat recovery, perovskite solar cells, and thermal barrier coatings for heat shields.

Using this equation, they investigated the relationship between atomic structure and thermal conductivity in materials made from silicon dioxide, one of the main components of sand. They predicted that a particular “tridymite” form of silicon dioxide, described in the 1960s as typical of meteorites, would exhibit the hallmarks of a hybrid crystal-glass material with a thermal conductivity that remains unchanged with temperature. This unusual thermal-transport behavior bears analogies to the Invar effect in thermal expansion, for which the Nobel Prize in Physics was awarded in 1920.

That led the team to the experimental groups of Etienne Balan, Daniele Fournier, and Massimiliano Marangolo in France, who obtained special permission from the National Museum of Natural History in Paris to perform experiments on a sample of silica tridymite carved from a meteorite that landed in Steinbach, Germany, in 1724.

Their experiments confirmed their predictions: Meteoric tridymite has an atomic structure that falls between an orderly crystal and disordered glass, and its thermal conductivity remains essentially constant over the experimentally accessible temperature range of 80 K to 380 K.

Upon further investigation, the team also predicted that this material could form from decade-long thermal aging in refractory bricks used in furnaces for steel production. Steel is one of the most essential materials in modern society, but producing it is carbon-intensive: just 1 kg of steel emits approximately 1.3 kg of carbon dioxide, with the nearly 1 billion tons produced each year accounting for about 7% of carbon emissions in the U.S. Materials derived from tridymite could be used to more efficiently control the intense heat involved in steel production, helping to reduce the steel industry’s carbon footprint.

Future: From AI-driven solutions of first-principles theories to real-world technologies
In this new PNAS paper, Simoncelli and the team employed machine-learning methods to overcome the computational bottlenecks of traditional first-principles methods and simulate atomic properties that influence heat transport with quantum-level accuracy. The quantum mechanisms that govern heat flow through hybrid crystal-glass materials may also help us understand the behavior of other excitations in solids, such as charge-carrying electrons and spin-carrying magnons.

Research on these topics is shaping emerging technologies, including wearable devices powered by thermoelectrics, neuromorphic computing, and spintronic devices that exploit magnetic excitations for information processing.

Simoncelli’s group at Columbia is exploring these topics, structured around three core pillars: the formulation of first-principles theories to predict experimental observables, the development of AI simulation methods for quantitatively accurate predictions of material properties, and the application of theory and methods to design and discover materials to overcome targeted industrial or engineering challenges.

New microscope creates 3D ghost images of nanoparticles using entangled photons

Ghost imaging is like a game of Battleship. Instead of seeing an object directly, scientists use entangled photons to remove the background and reveal its silhouette. This method can be used to study microscopic environments without much light, which is helpful for avoiding photodamage to biological samples.

So far, quantum ghost imaging has been limited to two dimensions, or to two planes at fixed z positions. In a new study, published in Optica, scientists at Lawrence Livermore National Laboratory (LLNL) developed a 3D quantum ghost imaging microscope—the first of its kind.

“This is a new way of 3D imaging that can do things with more sensitivity and gather more information without having to scan a sample,” said LLNL scientist and author Audrey Eshun.

The method works based on the quantum phenomenon of entanglement. A laser illuminates a crystal that generates photon pairs that are entangled, or linked together in space and time. These pairs hit a mirror that separates them: one, called the “signal” photon, turns left toward the sample, while the other, the “idler” photon, continues straight to a camera-like detector.

The idler photons, which do not interact with the sample, form a uniform, featureless image on the detector.

Meanwhile, the signal photons move through a microscope objective that collects and focuses them onto a sample. In this case, the authors looked at metallic nanoparticle clusters.

3D volumetric reconstruction of silver nanoclusters. Signal perspective is the (Z,Y) plane, idler perspective is the (X,Y) plane, and the (X,Z) plane shows the coverslip orientation. The side of each cube is 40 µm. Credit: Optica (2025). DOI: 10.1364/OPTICA.565248

The sample is tilted at a 45-degree angle relative to the incoming photons. When the photons hit it, they scatter in every direction.

Another microscope objective, positioned at a right angle to the incoming light, collects the scattered photons and directs them to a second detector. This camera captures a standard snapshot of the y-z plane of the nanoparticles.

Both detectors measure the exact arrival time of each photon. By matching the timestamps of photon pairs detected by both cameras, researchers can determine which idler photons correspond to signal photons that interacted with the sample. Removing all other photons from the featureless idler image reveals a ghost image of the x-y plane of the sample.

“The standard image has y and z coordinates and a time for each pixel, and the ghost image has x and y coordinates and a time for each pixel,” said Eshun. “By grouping all the photons that have the same timestamp, we can figure out the x, y and z position for each photon. These coordinates can then be plotted to form a 3D image.”
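
A minimal sketch of that pairing logic is shown below; the detection lists, column layout and coincidence window are invented for illustration, and the paper’s actual data handling will differ.

```python
import numpy as np

# Hypothetical detection lists: one row per detected photon.
# Signal camera sees the (y, z) plane; idler ("ghost") camera sees the (x, y) plane.
# Columns: [spatial coord 1, spatial coord 2, timestamp in ns]
signal_hits = np.array([[1.0, 4.0, 100.2],
                        [2.5, 3.1, 250.7],
                        [0.8, 5.2, 401.3]])   # (y, z, t)
idler_hits = np.array([[7.2, 1.0, 100.3],
                       [6.0, 2.6, 250.6],
                       [5.5, 0.8, 380.0]])    # (x, y, t)

COINCIDENCE_WINDOW_NS = 1.0  # illustrative pairing tolerance

points_3d = []
for y_s, z_s, t_s in signal_hits:
    # Find the idler photon whose arrival time matches within the window.
    dt = np.abs(idler_hits[:, 2] - t_s)
    j = np.argmin(dt)
    if dt[j] < COINCIDENCE_WINDOW_NS:
        x_i, _, _ = idler_hits[j]
        # Merge the two views into a single 3D coordinate: x from idler, y and z from signal.
        points_3d.append((float(x_i), float(y_s), float(z_s)))

print(points_3d)   # -> [(7.2, 1.0, 4.0), (6.0, 2.5, 3.1)]; the unmatched photon is dropped
```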

In comparison to other techniques, 3D quantum ghost imaging doesn’t require scanning the sample—it can happen all at once. It uses extremely low-light intensities, so it could be useful for imaging light-sensitive materials.

“This microscope is the first of its kind,” said LLNL scientist and author Ted Laurence. “There was another 3D quantum ghost image, but in that case the resolution was about 3 centimeters. This is microns. We are getting three spatial dimensions of information at the micron scale.”

Next, the team aims to use this method for high-speed tracking of the movement of cells in relation to each other.

Researchers develop flexible fiber material for self-powered health-monitoring sensors

Could clothing monitor a person’s health in real time, because the clothing itself would be a self-powered sensor? A new material created through electrospinning, which is a process that draws out fibers using electricity, brings this possibility one step closer.

A team led by researchers at Penn State has developed a new fabrication approach that optimizes the internal structure of electrospun fibers to improve their performance in electronic applications. The team has published its findings in the Journal of Applied Physics.

This novel electrospinning approach could open the door to more efficient, flexible and scalable electronics for wearable sensors, health monitoring and sustainable energy harvesting, according to Guanchun Rui, a visiting postdoctoral student in the Department of Electrical Engineering and the Materials Research Institute and co-lead author of the study.

The material is based on poly(vinylidene fluoride-trifluoroethylene), or PVDF-TrFE, a lightweight, flexible polymer known for its ability to generate an electric charge when pressed or bent. That quality, called piezoelectricity, makes it a strong candidate for use in electronics that convert motion into energy or signals.

“PVDF-TrFE has strong ferroelectric, piezoelectric and pyroelectric properties,” Rui said, explaining that, like piezoelectricity, pyroelectricity generates electric charges, in this case when temperature changes influence the material. “It’s thermally stable, lightweight and flexible, which makes it ideal for things like wearable electronics and energy harvesters.”
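
In textbook form, the two effects Rui names have simple constitutive relations (standard relations, not results of this study): the direct piezoelectric effect relates induced charge to applied stress, and pyroelectricity relates a polarization change to a temperature change.

```latex
% Direct piezoelectric effect: electric displacement from mechanical stress
D_i = d_{ijk}\, \sigma_{jk},
% Pyroelectric effect: polarization change from a temperature change
\Delta P_i = p_i\, \Delta T,
```

where d is the piezoelectric tensor, σ the applied stress, and p the pyroelectric coefficient.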

Electrospinning is a technique that uses electric force to stretch a polymer solution into extremely thin fibers. As the fibers dry, the way the polymer chains pack together determines their performance. The researchers hypothesized that altering the concentration and molecular weight of the polymer solution could lead to more organized molecular structures.

“Crystallinity means the molecules are more ordered,” Rui said, noting that the team also theorized the structure could have higher polar phase content. “And when we talk about polar phase content, we mean that the positive and negative charges in the molecules are aligned in specific directions. That alignment is what allows the material to generate electricity from motion.”

The researchers explained that electrospinning plays a key role in enabling this alignment.

“The process stretches the fibers in a highly mobile state, which predisposes the polymer chains to crystallize into the form we want,” said Patrick Mather, a co-author of the study and professor of chemical engineering and dean of the Schreyer Honors College. “You start with a liquid, and it dries over a split second as it travels to the collector. All the packing happens during that brief flight.”

One surprising discovery, Mather said, came from experimenting with unusually high concentrations of polymer in the solution.

Can the Large Hadron Collider snap string theory?

In physics, there are two great pillars of thought that don’t quite fit together. The Standard Model of particle physics describes all known fundamental particles and three forces: electromagnetism, the strong nuclear force, and the weak nuclear force. Meanwhile, Einstein’s general relativity describes gravity and the fabric of spacetime.

However, these frameworks are fundamentally incompatible in many ways, says Jonathan Heckman, a theoretical physicist at the University of Pennsylvania. The Standard Model treats forces as dynamic fields of particles, while general relativity treats gravity as the smooth geometry of spacetime, so gravity “doesn’t fit into physics’s Standard Model,” he explains.

In a recent paper in Physical Review Research, Heckman, Rebecca Hicks, a Ph.D. student at Penn’s School of Arts & Sciences, and their collaborators turn that critique on its head. Instead of asking what string theory predicts, the authors ask what it definitively cannot create. Their answer points to a single exotic particle that could show up at the Large Hadron Collider (LHC). If that particle appears, the entire string-theory edifice would be, in Heckman’s words, “in enormous trouble.”

String theory: The good, the bad, the energy-hungry
For decades, physicists have sought a unified theory that can reconcile quantum mechanics, and by extension, the behavior of subatomic particles, with gravity—which is described as a dynamic force in general relativity but is not fully understood within quantum contexts, Heckman says.

A good contender for marrying gravity and quantum phenomena is string theory, which posits that all particles, including a hypothetical one representing gravity, are tiny vibrating strings and which promises a single framework encompassing all forces and matter.

“But one of the drawbacks of string theory is that it operates in high-dimensional math and a vast ‘landscape’ of possible universes, making it fiendishly difficult to test experimentally,” Heckman says, pointing to how string theory necessitates more than the familiar four dimensions—x, y, z, and time—to be mathematically consistent.

“Most versions of string theory require a total of 10 or 11 spacetime dimensions, with the extra dimensions being sort of ‘curled up’ or folding in on one another to extremely small scales,” Hicks says.

To make matters even trickier, string theory’s distinctive behaviors only clearly reveal themselves at enormous energies, “those far beyond what we typically encounter or even generate in current colliders,” Heckman says.

Hicks likens it to zooming in on a distant object: At everyday, lower energies, strings look like regular point-like particles, just as a faraway rope might appear to be a single line.

“But when you crank the energy way up, you start seeing the interactions as they truly are—strings vibrating and colliding,” she explains. “At lower energies, the details get lost, and we just see the familiar particles again. It’s like how from far away, you can’t make out the individual fibers in the rope. You just see a single, smooth line.”

That’s why physicists hunting for signatures of string theory must push their colliders—like the LHC—to ever-higher energies, hoping to catch glimpses of fundamental strings rather than just their lower-energy disguises as ordinary particles.

Why serve string theory a particle it likely won’t be able to return?
Testing a theory often means searching for predictions that confirm its validity. But a more powerful test, Heckman says, is finding exactly where a theory fails. If scientists discover that something a theory forbids actually exists, the theory is fundamentally incomplete or flawed.

Because string theory’s predictions are vast and varied, the researchers instead asked if there’s a simple particle scenario that string theory just can’t accommodate.

They zeroed in on how string theory deals with particle “families,” groups of related particles bound together by the rules of the weak nuclear force, which is responsible for radioactive decay. Typically, these families are small: the electron and its neutrino sibling, for example, form a tidy two-member package called a doublet. String theory handles such modest particle families without issue.

However, Heckman and Hicks identified a family that is conspicuously absent from any known string-based calculation: a five-member particle package, or a 5-plet. Heckman likens this to trying to order a Whopper meal from McDonald’s: “No matter how creatively you search the menu, it never materializes.”

“We scoured every toolbox we have, and this five-member package just never shows up,” Heckman says.

But what exactly is this elusive 5-plet?
Hicks explains it as an expanded version of the doublet: “The 5-plet is its supersized cousin, packing five related particles together.”

Physicists encapsulate this particle family in a concise mathematical formula known as the Lagrangian, essentially the particle-physics cookbook. The particle itself is called a Majorana fermion, meaning it acts as its own antiparticle, akin to a coin that has heads on both sides.

Identifying such a particle would directly contradict what current string theory models predict is possible, making the detection of this specific particle family at the LHC a high-stakes test, one that could potentially snap string theory.

Why a 5-plet hasn’t been spotted and the vanishing-track clue
Hicks cites two major hurdles for spotting these 5-plet structures: “production and subtlety.”

In a collider, energy can literally turn into mass; Einstein’s E = mc² says that enough kinetic oomph (E) can be converted into the heft (m) of brand-new particles, so the heavier the quarry the rarer the creation event.

“The LHC has to slam protons together hard enough to conjure these hefty particles out of pure energy,” Hicks explains, citing Einstein’s E = mc², which directly links energy (E) to mass (m). “As the masses of these particles climb toward a trillion electron volts, the chance of creating them drops dramatically.”
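
As a quick sanity check of the scales involved (my own arithmetic using standard constants, not numbers from the paper):

```latex
m = \frac{E}{c^{2}}, \qquad
1~\mathrm{TeV}/c^{2} \approx 1.8 \times 10^{-24}\ \mathrm{kg}
\approx 1.1 \times 10^{3}\, m_p ,
```

so a particle near 1 TeV weighs roughly a thousand protons (m_p ≈ 0.938 GeV/c²), and the 10 TeV mass quoted further down corresponds to the “about 10,000 proton masses” figure.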

Even if produced, detection is challenging. The charged particles in the 5-plet decay very quickly into nearly invisible products.

“The heavier states decay into a soft pion and an invisible neutral particle (X0),” Hicks says. “The pion is so low-energy it’s basically invisible, and X0 passes straight through. The result is a track that vanishes mid-detector, like footprints in snow suddenly stopping.”

Those signature tracks get picked up by LHC’s ATLAS (short for A Toroidal LHC ApparatuS) and CMS (Compact Muon Solenoid), house-sized “digital cameras” wrapped around the collision center. They sit at opposite collision points and operate independently, giving the physics community two sets of eyes on every big discovery. Penn physicists like Hicks are part of the ATLAS Collaboration, helping perform the searches that look for quirky signals like disappearing tracks.

ATLAS’s wheel-like end-cap reveals the maze of sensors primed to catch proton smash-ups at the LHC. Researchers comb through billions of events in search of fleeting “ghost” tracks that might expose cracks in string theory. Credit: CERN
Why a 5-plet matters for dark matter
Hicks says finding the 5-plet isn’t only important for testing string theory, pointing to another exciting possibility: “The neutral member of the 5-plet could explain dark matter, the mysterious mass making up most of our universe’s matter.”

Dark matter constitutes roughly 85% of all matter in the universe, yet scientists still don’t know what exactly it is.

“If the 5-plet weighs around 10 TeV—about 10,000 proton masses—it neatly fits theories about dark matter’s formation after the Big Bang,” Hicks says. “Even lighter 5-plets could still play a role as part of a broader dark matter landscape.”

“If we detect a 5-plet, it’s a double win,” says Hicks. “We’d have disproven key predictions of string theory and simultaneously uncovered new clues about dark matter.”

What the LHC has already ruled out
Using existing ATLAS data from collider runs, the team searched specifically for 5-plet signals. “We reinterpreted searches originally designed for ‘charginos’—hypothetical charged particles predicted by supersymmetry—and looked for 5-plet signatures,” Hicks says of the team’s search through the repurposed ATLAS disappearing-track data. “We have found no evidence yet, which means any 5-plet particle must weigh at least 650–700 GeV, five times heavier than the Higgs boson.”

For context, Heckman says, “this early result is already a strong statement; it means lighter 5-plets don’t exist. But heavier ones are still very much on the table.”

Future searches with upgraded LHC experiments promise even sharper tests. “We’re not rooting for string theory to fail,” Hicks says. “We’re stress-testing it, applying more pressure to see if it holds up.”

“If string theory survives, fantastic,” Heckman says. “If it snaps, we’ll learn something profound about nature.”

Compact setup successfully detects elusive antineutrinos from nuclear reactor

Neutrinos are extremely elusive elementary particles. Day and night, 60 billion of them stream from the sun through every square centimeter of Earth every second; the planet is essentially transparent to them. After their existence was first predicted theoretically, decades passed before they were actually detected. The experiments involved are usually extremely large to compensate for the very weak interaction of neutrinos with matter.

Scientists at the Max Planck Institute for Nuclear Physics (MPIK) in Heidelberg have now succeeded in detecting antineutrinos from the reactor of a nuclear power plant using the CONUS+ experiment, with a detector mass of just 3 kg. The work is published in Nature.

Originally based at the Brokdorf nuclear power plant, the CONUS experiment was relocated to the Leibstadt nuclear power plant (KKL) in Switzerland in the summer of 2023. Improvements to the 1 kg germanium semiconductor detectors, as well as the excellent measurement conditions at KKL, made it possible for the first time to measure what is known as Coherent Elastic Neutrino-Nucleus Scattering (CEvNS).

In this process, neutrinos do not scatter off the individual constituents of the atomic nuclei in the detector, but rather coherently off the nucleus as a whole. This significantly increases the probability of a very small but observable nuclear recoil. The recoil caused by neutrino scattering is comparable to a ping-pong ball bouncing off a car, where detection relies on registering the car’s tiny change in motion.
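
The enhancement from coherence has a compact textbook expression (the standard low-energy CEvNS cross-section, not a formula quoted from the CONUS+ paper):

```latex
\sigma_{\mathrm{CE\nu NS}} \;\simeq\; \frac{G_F^{2}\, E_\nu^{2}}{4\pi}\, Q_W^{2},
\qquad Q_W = N - \left(1 - 4\sin^{2}\theta_W\right) Z \;\approx\; N ,
```

so the rate grows roughly as the square of the neutron number N, which for germanium (N ≈ 40) makes coherent scattering far more likely than scattering off individual nucleons.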

In the case of CONUS+, the scattering partners are the atomic nuclei of the germanium. Observing this effect requires low-energy neutrinos, such as those produced in large numbers in nuclear reactors.

The results of the recent measurement of antineutrinos with the CONUS+ setup. Credit: MPIK

The effect was predicted as early as 1974, but was first confirmed in 2017 by the COHERENT experiment at a particle accelerator. The CONUS+ experiment has now successfully observed the effect at full coherence and lower energies in a reactor for the first time. The compact CONUS+ setup is located 20.7 m from the reactor core. At this position, more than 10 trillion neutrinos flow through every square centimeter of surface every second.

After approximately 119 days of measurement between autumn 2023 and summer 2024, the researchers were able to extract an excess of 395±106 neutrino signals from the CONUS+ data, after subtracting all background and interfering signals. This value is in very good agreement with theoretical calculations, within the measurement uncertainty.
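
Reading the quoted numbers at face value (a back-of-the-envelope estimate assuming roughly Gaussian uncertainties, not a figure stated by the authors):

```latex
\frac{395}{106} \;\approx\; 3.7 ,
```

i.e. the extracted antineutrino signal stands a bit under four standard deviations above the background-only expectation.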

“We have thus successfully confirmed the sensitivity of the CONUS+ experiment and its ability to detect antineutrino scattering from atomic nuclei,” explains Dr. Christian Buck, one of the authors of the study. He also emphasizes the potential development of small, mobile neutrino detectors to monitor reactor heat output or isotope concentration as possible future applications of the CEvNS technique presented here.

The CEvNS measurement provides unique insights into fundamental physical processes within the Standard Model of particle physics, the current theory describing the structure of our universe. Compared to other experiments, the measurements with CONUS+ allow for a reduced dependence on nuclear physics aspects, thereby improving the sensitivity to new physics beyond the Standard Model.

For this reason, CONUS+ was already equipped with improved and larger detectors in autumn 2024. With the resulting measurement accuracy, even better results are expected.

“The techniques and methods used in CONUS+ have excellent potential for fundamental new discoveries,” emphasizes Prof. Lindner, initiator of the project and also an author of the study. “The groundbreaking CONUS+ results could therefore mark the starting point for a new field in neutrino research.”

Plastic-based spectrometers offer low-cost, compact solution for broadband spectral imaging

A multinational research team, including engineers from the University of Cambridge and Zhejiang University, has developed a breakthrough in miniaturized spectrometer technology that could dramatically expand the accessibility and functionality of spectral imaging in everyday devices.

The study, titled “Stress-engineered ultra-broadband spectrometer,” published in the journal Science Advances, describes a novel, low-cost spectrometer platform built from programmable plastic materials rather than conventional glass.

These innovative devices operate across the full visible and short-wave infrared (SWIR) range—spanning 400 to 1,600 nanometers—which opens up a wealth of possibilities for real-world applications.

Traditionally, spectrometers—the tools that analyze the composition of light to detect materials or environmental conditions—have been bulky, expensive and difficult to mass-produce. Most are also limited to narrow spectral bands or rely on multiple specialized components to cover a broader range.

The new approach sidesteps these issues with a lightweight, scalable alternative that leverages recent advances in polymer science and computational optics.

A plastic revolution in optics
The team was inspired by the evolution of smartphone cameras, which now rely heavily on plastic optical components to achieve high performance in ultra-compact formats. Applying the same principle to spectrometer design, the researchers used transparent shape-memory epoxy polymers (SMPs) to engineer dispersive optical elements—components that separate light into its spectral components.

What makes this approach truly innovative is the use of internal stress to tailor the optical properties of the plastic. Normally, stress patterns that develop during the manufacture of plastic objects are uncontrolled and unstable. However, SMPs can be mechanically stretched at elevated temperatures to “program” precise and stable stress distributions into the material. These stresses create birefringence—an optical effect where light is split according to its wavelength.
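
The wavelength dependence that turns programmed stress into a spectral filter follows from standard birefringent-film optics (generic textbook relations, not the study’s specific design equations): a film of thickness d with stress-induced birefringence Δn, placed between crossed polarizers with its axes at 45°, transmits

```latex
\Gamma(\lambda) = \frac{2\pi\, \Delta n(\lambda)\, d}{\lambda},
\qquad
T(\lambda) = \sin^{2}\!\left(\frac{\Gamma(\lambda)}{2}\right),
```

so different stress levels imprint different oscillatory transmission spectra onto the underlying sensor pixels.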

“By shaping the internal stress within the polymer, we are able to engineer spectral behavior with high repeatability and tunability, something that’s incredibly difficult to achieve with conventional optics,” said Gongyuan Zhang from Zhejiang University, the lead author of the study.

The resulting films act as spectral filters, encoding information that can be read by standard CMOS image sensors. With the aid of computational spectral reconstruction algorithms, these planar components can be turned into powerful, compact spectrometers.
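
A minimal sketch of how such filter-plus-algorithm spectrometers are typically read out is given below. It is a generic filter-array reconstruction with invented filter responses and spectra, not the authors’ algorithm: each pixel measures the unknown spectrum through a known filter transmission, and a regularized least-squares inversion recovers the spectrum.

```python
import numpy as np

rng = np.random.default_rng(0)

n_wavelengths = 200   # unknown spectrum sampled at 200 points (illustrative)
n_filters = 64        # number of distinct stress-programmed filter responses (illustrative)

# A[i, j]: transmission of filter i at wavelength j. Random smooth curves stand in
# for the measured, oscillatory transmission spectra of the real films.
A = np.abs(np.cumsum(rng.standard_normal((n_filters, n_wavelengths)), axis=1))
A /= A.max(axis=1, keepdims=True)

# A toy "true" spectrum with two peaks, and the corresponding noisy sensor readings.
wl = np.linspace(400, 1600, n_wavelengths)   # nm, matching the article's range
x_true = np.exp(-((wl - 700) / 30) ** 2) + 0.5 * np.exp(-((wl - 1300) / 60) ** 2)
y = A @ x_true + 0.01 * rng.standard_normal(n_filters)

# Tikhonov-regularized least squares: x = argmin ||Ax - y||^2 + lam * ||x||^2
lam = 1e-2
x_rec = np.linalg.solve(A.T @ A + lam * np.eye(n_wavelengths), A.T @ y)

print("relative reconstruction error:",
      np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```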

From lab bench to consumer tech
One of the major achievements of this work is demonstrating that these stress-engineered films can be fabricated in a single step, without the need for lithography or expensive nanofabrication. This makes the devices ideal for mass production and integration into consumer electronics, such as mobile phones, wearable health monitors, and even food quality testers.

“We’ve shown that you can use programmable plastics to cover a much broader range of the spectrum than typical miniaturized systems—right into the SWIR,” said Professor Zongyin Yang, lead author from Zhejiang University. “That’s really important for applications like agricultural monitoring, mineral exploration, and medical diagnostics.”

The spectrometers are also highly compact, and the team successfully integrated them into a line-scanning spectral imaging system—suggesting their suitability for hyperspectral imaging in portable form. By linearly varying the stress across the length of the film, the team could create gradient filters capable of scanning a scene one slice at a time, collecting rich spectral data in the process.

A platform for the future
This work represents more than a technical breakthrough; it lays the foundation for a new class of ultra-portable, broadband sensing devices that could transform industrial and consumer markets.

The researchers point to several likely applications: detecting pollutants in water or air, verifying the authenticity of drugs, monitoring blood sugar non-invasively, and even sorting recyclable materials in real-time.

By eliminating the traditional trade-offs between size, cost and spectral range, the platform could democratize access to high-quality spectral data. It also aligns with broader research efforts at Cambridge’s Department of Engineering in computational photonics and sustainable sensing technologies, areas aiming to push intelligence and functionality into smaller, more accessible formats.

“This work shows how mechanical design principles can be used to reshape photonic functionality,” said co-author Professor Tawfique Hasan from Cambridge’s Department of Engineering.

“By embedding stress into transparent polymers, we have created a new class of dispersive optics that are not only lightweight and scalable but also adaptable across a wide spectral range. This level of flexibility is very difficult to achieve with traditional optics relying on static, lithographically defined structures.”

Speed test of ‘tunneling’ electrons challenges alternative interpretation of quantum mechanics

Quantum mechanics describes the unconventional properties of subatomic particles, like their ability to exist in a superposition of multiple states, as popularized by the Schrödinger’s cat analogy, and their ability to slip through barriers, a phenomenon known as quantum tunneling.

Reporting in the journal Nature, a team of researchers tested a unique aspect of Bohmian mechanics, an alternative interpretation of quantum theory. This twist on standard quantum theory predicts that a tunneling quantum particle would remain “at rest” inside an infinitely long barrier. The time it spends inside the barrier, called dwell time, would therefore be infinite.

In the classic “Copenhagen” interpretation of quantum physics, photons and other subatomic particles exist as waves of probability with no defined location until they are observed. At that point, a particle’s wave function collapses into a discrete particle with a definite location, as demonstrated by the famous double-slit experiment.

The alternate Bohmian interpretation posits that particles remain point-like objects. In this model, the positions of particles are determined by some unmeasured “hidden” variables and their trajectories are guided by a pilot wave, which gives the appearance of wave-particle duality.

Experimental setup for measuring the speed of particles. Credit: Nature (2025). DOI: 10.1038/s41586-025-09099-4
Both interpretations make many of the same predictions, but they differ greatly in the way they describe the fundamental nature of particles.

To test the unique prediction of Bohmian mechanics that photons can, in effect, remain frozen in time when tunneling through a barrier of infinite length, the researchers designed an experiment that, to a photon, would simulate an infinitely long barrier.

The setup was constructed by sandwiching together a pair of specially designed mirrors. The lower mirror was etched with a nanoscale ramp and a pair of parallel waveguides. By shining a laser on the ramp, the researchers could produce photons and control their momentum.

As the photons traveled along the waveguide and tunneled into the barrier, they also tunneled into the secondary waveguide, jumping back and forth between the two at a consistent rate, allowing the research team to calculate their speed.

By combining this element of time with measurements of the photon’s rate of decay inside the barrier, the researchers were able to calculate dwell time, which was found to be finite.
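
As a toy picture of the two quantities described above, the sketch below uses an illustrative two-mode coupled-waveguide model with loss; the rates, and the simple read-off at the end, are assumptions for illustration, not the experiment’s parameters or analysis. It shows how an oscillation between the waveguides supplies a clock while the decay inside the barrier supplies a rate, both of them finite.

```python
import numpy as np

# Toy coupled-mode model: two waveguide powers exchange energy at coupling rate g
# while both decay at rate gamma inside the barrier. Illustrative numbers only.
g = 2.0        # coupling (oscillation) rate, arbitrary units
gamma = 0.3    # decay rate inside the barrier, arbitrary units

t = np.linspace(0, 10, 2000)
p1 = np.exp(-gamma * t) * np.cos(g * t) ** 2   # power in the primary waveguide
p2 = np.exp(-gamma * t) * np.sin(g * t) ** 2   # power in the secondary waveguide

# The hopping period read off the oscillation gives the propagation "clock" ...
period = np.pi / g
# ... while the decay rate sets a finite characteristic time inside the barrier.
characteristic_time = 1.0 / gamma

print(f"oscillation period: {period:.2f}, characteristic decay time: {characteristic_time:.2f}")
```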

The researchers write, “Our findings contribute to the ongoing tunneling time debate and can be viewed as a test of Bohmian trajectories in quantum mechanics. Regarding the latter, we find that the measured energy–speed relationship does not align with the particle dynamics postulated by the guiding equation in Bohmian mechanics.”

This result challenges but does not rule out the Bohmian prediction. Since the researchers’ experiment was an analog that relied on various assumptions, its findings are not conclusive and may themselves be challenged.