Plastic-based spectrometers offer a low-cost, compact solution for broadband spectral imaging

A multinational research team, including engineers from the University of Cambridge and Zhejiang University, has developed a breakthrough in miniaturized spectrometer technology that could dramatically expand the accessibility and functionality of spectral imaging in everyday devices.

The study, titled “Stress-engineered ultra-broadband spectrometer,” published in the journal Science Advances, describes a novel, low-cost spectrometer platform built from programmable plastic materials rather than conventional glass.

These innovative devices operate across the full visible and short-wave infrared (SWIR) range—spanning 400 to 1,600 nanometers—which opens up a wealth of possibilities for real-world applications.

Traditionally, spectrometers—the tools that analyze the composition of light to detect materials or environmental conditions—have been bulky, expensive and difficult to mass-produce. Most are also limited to narrow spectral bands or rely on multiple specialized components to cover a broader range.

The new approach sidesteps these issues with a lightweight, scalable alternative that leverages recent advances in polymer science and computational optics.

A plastic revolution in optics
The team was inspired by the evolution of smartphone cameras, which now rely heavily on plastic optical components to achieve high performance in ultra-compact formats. Applying the same principle to spectrometer design, the researchers used transparent shape-memory epoxies, a class of shape-memory polymers (SMPs), to engineer dispersive optical elements—components that separate light into its spectral components.

What makes this approach truly innovative is the use of internal stress to tailor the optical properties of the plastic. Normally, stress patterns that develop during the manufacture of plastic objects are uncontrolled and unstable. However, SMPs can be mechanically stretched at elevated temperatures to “program” precise and stable stress distributions into the material. These stresses create birefringence—an optical effect where light is split according to its wavelength.
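The mechanism can be illustrated with the standard stress-optic relation, in which the induced birefringence is proportional to stress (Δn = C·σ), together with the transmission of a birefringent film between crossed polarizers. The sketch below uses illustrative values for the coefficient, stress, and thickness, not the parameters of the actual devices:

```python
import numpy as np

# Stress-optic law: induced birefringence Dn = C * sigma.
# A film of thickness d between crossed polarizers transmits
# T(lam) = sin^2(pi * Dn * d / lam): a wavelength-dependent filter.

C = 50e-12      # stress-optic coefficient, 1/Pa (illustrative)
sigma = 20e6    # programmed internal stress, Pa (illustrative)
d = 1e-3        # film thickness, m (illustrative)

delta_n = C * sigma        # stress-induced birefringence
retardance = delta_n * d   # optical path difference, m

wavelengths = np.linspace(400e-9, 1600e-9, 1201)  # visible through SWIR
T = np.sin(np.pi * retardance / wavelengths) ** 2

# A different programmed stress shifts the whole transmission pattern,
# giving each film a distinct, stable spectral fingerprint.
print(f"Dn = {delta_n:.1e}; T at 800 nm = "
      f"{np.sin(np.pi * retardance / 800e-9) ** 2:.3f}")
```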

“By shaping the internal stress within the polymer, we are able to engineer spectral behavior with high repeatability and tunability, something that’s incredibly difficult to achieve with conventional optics,” said Gongyuan Zhang from Zhejiang University, the lead author of the study.

The resulting films act as spectral filters, encoding information that can be read by standard CMOS image sensors. With the aid of computational spectral reconstruction algorithms, these planar components can be turned into powerful, compact spectrometers.
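In outline, each filter contributes one row of a response matrix A, the sensor records y = A·s for an unknown spectrum s, and a regularized least-squares solve recovers s. The following sketch uses random stand-in filter responses, not measured data from the devices:

```python
import numpy as np

rng = np.random.default_rng(0)

n_channels = 64   # spectral bins spanning, say, 400-1600 nm
n_filters = 96    # distinct stress-engineered filter responses

# A: each row is one filter's transmission spectrum (random stand-ins here).
A = rng.uniform(0.0, 1.0, size=(n_filters, n_channels))

# Ground-truth spectrum: two smooth peaks (illustrative).
x = np.linspace(0.0, 1.0, n_channels)
s_true = np.exp(-((x - 0.3) / 0.05) ** 2) + 0.6 * np.exp(-((x - 0.7) / 0.08) ** 2)

# Sensor readings: filtered intensities plus a little noise.
y = A @ s_true + 0.001 * rng.normal(size=n_filters)

# Ridge-regularized least squares: s_hat = (A^T A + lam*I)^-1 A^T y
lam = 1e-3
s_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_channels), A.T @ y)

err = np.linalg.norm(s_hat - s_true) / np.linalg.norm(s_true)
print(f"relative reconstruction error: {err:.4f}")
```

Real systems use more sophisticated reconstruction (sparsity priors, learned models), but the core inverse problem has this shape.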

From lab bench to consumer tech
One of the major achievements of this work is demonstrating that these stress-engineered films can be fabricated in a single step, without the need for lithography or expensive nanofabrication. This makes the devices ideal for mass production and integration into consumer electronics, such as mobile phones, wearable health monitors, and even food quality testers.

“We’ve shown that you can use programmable plastics to cover a much broader range of the spectrum than typical miniaturized systems—right into the SWIR,” said Professor Zongyin Yang of Zhejiang University, who co-led the study. “That’s really important for applications like agricultural monitoring, mineral exploration, and medical diagnostics.”

The spectrometers are also highly compact, and the team successfully integrated them into a line-scanning spectral imaging system—suggesting their suitability for hyperspectral imaging in portable form. By linearly varying the stress across the length of the film, the team could create gradient filters capable of scanning a scene one slice at a time, collecting rich spectral data in the process.

A platform for the future
This work represents more than a technical breakthrough; it lays the foundation for a new class of ultra-portable, broadband sensing devices that could transform industrial and consumer markets.

The researchers point to several likely applications: detecting pollutants in water or air, verifying the authenticity of drugs, monitoring blood sugar non-invasively, and even sorting recyclable materials in real time.

By eliminating the traditional trade-offs between size, cost and spectral range, the platform could democratize access to high-quality spectral data. It also aligns with broader research efforts at Cambridge’s Department of Engineering in computational photonics and sustainable sensing technologies, areas aiming to push intelligence and functionality into smaller, more accessible formats.

“This work shows how mechanical design principles can be used to reshape photonic functionality,” said co-author Professor Tawfique Hasan from Cambridge’s Department of Engineering.

“By embedding stress into transparent polymers, we have created a new class of dispersive optics that are not only lightweight and scalable but also adaptable across a wide spectral range. This level of flexibility is very difficult to achieve with traditional optics relying on static, lithographically defined structures.”

Speed test of ‘tunneling’ electrons challenges alternative interpretation of quantum mechanics

Quantum mechanics describes the unconventional properties of subatomic particles, like their ability to exist in a superposition of multiple states, as popularized by the Schrödinger’s cat analogy, and their ability to slip through barriers, a phenomenon known as quantum tunneling.

Reporting in the journal Nature, a team of researchers tested a unique aspect of Bohmian mechanics, an alternative interpretation of quantum theory. This twist on standard quantum theory predicts that a tunneling quantum particle would remain “at rest” inside an infinitely long barrier. The time it spends inside the barrier, called the dwell time, would therefore be infinite.

In the classic “Copenhagen” interpretation of quantum physics, photons and other subatomic particles exist as waves of probabilities with no defined location until they are observed. At that point, a particle’s wave function collapses into a discrete particle with a definite location, as demonstrated by the famous double-slit experiment.

The alternate Bohmian interpretation posits that particles remain point-like objects. In this model, the positions of particles are determined by some unmeasured “hidden” variables and their trajectories are guided by a pilot wave, which gives the appearance of wave-particle duality.

Experimental setup for measuring the speed of particles. Credit: Nature (2025). DOI: 10.1038/s41586-025-09099-4
Both interpretations make many of the same predictions, but they differ greatly in the way they describe the fundamental nature of particles.

To test the unique prediction of Bohmian mechanics that photons can, in effect, remain frozen in time when tunneling through a barrier of infinite length, the researchers designed an experiment that, to a photon, would simulate an infinitely long barrier.

The setup was constructed by sandwiching together a pair of specially designed mirrors. The lower mirror was etched with a nanoscale ramp and a pair of parallel waveguides. By shining a laser on the ramp, the researchers could produce photons and control their momentum.

As the photons traveled along one waveguide and tunneled into the barrier, they also tunneled into the second waveguide, jumping back and forth between the two at a consistent rate, which allowed the research team to calculate their speed.

By combining this element of time with measurements of the photon’s rate of decay inside the barrier, the researchers were able to calculate dwell time, which was found to be finite.

The researchers write, “Our findings contribute to the ongoing tunneling time debate and can be viewed as a test of Bohmian trajectories in quantum mechanics. Regarding the latter, we find that the measured energy–speed relationship does not align with the particle dynamics postulated by the guiding equation in Bohmian mechanics.”

This result challenges but does not rule out the Bohmian prediction. Since the researchers’ experiment was an analog that relied on various assumptions, its findings are not conclusive and may themselves be challenged.

Quantum battery model achieves theoretical speed limit, demonstrates genuine advantage

Over the past few years, researchers have developed various quantum technologies, alternatives to classical devices that operate by leveraging the principles of quantum mechanics. These technologies have the potential to outperform their classical counterparts in specific settings or scenarios.

Among the many quantum technologies proposed and devised so far are quantum batteries, energy storage devices that could theoretically store energy more efficiently than classical batteries, while also charging more rapidly. Despite their predicted potential, most quantum battery solutions proposed to date have not yet proven to exhibit a genuine quantum advantage, or in other words, to perform better than their classical counterparts.

Researchers at PSL Research University and the University of Pisa recently introduced a new, deceptively simple quantum battery model that could exhibit a genuine quantum advantage over a classical analog battery. The new model, outlined in a paper published in Physical Review Letters, was found to reach the so-called quantum speed limit, the maximum speed at which a quantum system can theoretically evolve.

“Quantum batteries are microscopic devices that can exhibit quantum advantages over their classical counterparts in energy-related tasks,” Vittoria Stanzione and Gian Marcello Andolina, co-authors of the paper, told Phys.org. “This research area originated from quantum information theory, which predicts that quantum resources—such as entanglement—can dramatically enhance the charging power of quantum systems.

“In recent years, some of the authors of the present work proposed a model displaying such a quantum advantage: the Sachdev–Ye–Kitaev (SYK) model. However, this model is highly complex, both experimentally—due to its many-body interactions—and theoretically, as it is analytically challenging.”

Earlier works demonstrating a quantum advantage of batteries based on the SYK model only did so using numerical simulations, without performing any further analyses. Building on their earlier efforts, Stanzione, Andolina and their colleagues tried to identify the simplest possible quantum battery model that could display a quantum advantage in terms of charging power.

“Our model consists of two coupled harmonic oscillators: one acts as the ‘charger,’ and the other serves as the ‘battery,'” explained Stanzione and Andolina. “The key ingredient enabling the quantum advantage is an anharmonic interaction between the two oscillators during the charging process. This anharmonic coupling allows the system to access non-classical, entangled states that effectively create a ‘shortcut’ in Hilbert space, enabling faster energy transfer than in classical dynamics.”
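The charging dynamics of such a two-oscillator setup can be sketched numerically in a truncated Fock space. The anharmonic term below, which exchanges one charger quantum for two battery quanta, is an illustrative stand-in chosen so the example stays simple, not the Hamiltonian from the paper:

```python
import numpy as np
from scipy.linalg import expm

d = 8  # Fock-space truncation for each oscillator

ann = np.diag(np.sqrt(np.arange(1, d)), 1)  # single-mode annihilation operator
num = ann.conj().T @ ann                    # number operator
I = np.eye(d)

a, b = np.kron(ann, I), np.kron(I, ann)     # charger (a) and battery (b)
na, nb = np.kron(num, I), np.kron(I, num)

g = 0.2  # coupling strength (illustrative; hbar = 1)

# Charger at frequency 2, battery at frequency 1, plus an anharmonic
# exchange: one charger quantum <-> two battery quanta (resonant).
H = 2.0 * na + 1.0 * nb + g * (a @ b.conj().T @ b.conj().T + a.conj().T @ b @ b)

# Start with two quanta in the charger, battery empty: |n_a=2, n_b=0>
psi0 = np.zeros(d * d, dtype=complex)
psi0[2 * d] = 1.0

Eb = []  # mean battery energy <n_b> over time
for t in np.linspace(0.0, 20.0, 101):
    psi = expm(-1j * H * t) @ psi0
    Eb.append(float(np.real(psi.conj() @ (nb @ psi))))

print(f"peak battery energy: {max(Eb):.2f} quanta (4 available)")
```

The anharmonic coupling moves population through entangled charger-battery states, which is the kind of pathway the authors identify as the source of the speedup.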

To rigorously certify their model’s quantum advantage, the researchers compared it to a suitable classical battery model, while also implementing a formal bound that was outlined by Maciej Lewenstein and other researchers at the Institute of Photonic Sciences (ICFO) in Barcelona. Overall, their findings suggest that their quantum battery model does outperform its classical counterpart.

“To the best of our knowledge, this work provides the first rigorous certification of a genuine quantum advantage in a solvable model,” said Stanzione and Andolina. “Furthermore, the proposed setup can be realized with current experimental technologies.”

So far, the researchers’ model is merely theoretical, and much work still needs to be done before it can be realized experimentally. In their paper, the team briefly explores the possibility of realizing the proposed battery using superconducting circuits, electrical circuits made from materials that exhibit zero resistance at low temperatures.

“We now plan to collaborate with experimental groups in the future to pursue a proof-of-principle realization,” added Stanzione and Andolina. “At the same time, the development of a fully functional quantum battery—integrated with other quantum technologies—remains a distant goal. We hope that our work will stimulate further research on this exciting topic, fostering progress on both the theoretical and experimental fronts.”

Physicists take step toward a holy grail for electron spins

For decades, ferromagnetic materials have driven technologies like magnetic hard drives, magnetic random access memories and oscillators. But antiferromagnetic materials, if only they could be harnessed, hold out even greater promise: ultra-fast information transfer and communications at much higher frequencies—a “holy grail” for physicists.

Now, researchers have taken a meaningful step towards utilizing antiferromagnets for new technologies. In “Spin-filter tunneling detection of antiferromagnetic resonance with electrically-tunable damping,” published in Science, they describe their innovative approach for both detecting and controlling the motion of spins within antiferromagnets using 2D antiferromagnetic materials and tunnel junctions.

Both types of materials contain atoms that act like tiny individual magnets, each having “spin.” In a ferromagnet, all of these atomic spins are aligned, producing an external magnetic field. In an antiferromagnet, the atomic spins cancel when they are added up, so no external magnetic field is produced. That’s why it is difficult not only to detect the motion of spins within antiferromagnets but also to control it.

Previously, detections of the spin dynamics in antiferromagnets occurred with millimeter or larger samples, “not something that really scales down to any kind of useful device scale,” said co-corresponding author Dan Ralph, F.R. Newman Professor of Physics in the College of Arts and Sciences and a member of the Kavli Institute at Cornell.

“What we’ve done is make micrometer-scale devices where we can see strong signals, using tunnel junctions to be able to detect the spin motions electrically—and that’s nearly a factor of 1,000 smaller than what’s been done before.”

Tunneling is a sort of quantum mechanical leaking of an electron through a barrier that a classical particle wouldn’t be able to get through; it’s not a direct flow of electrons across, but a penetration of an electron wave function as it goes through a barrier, Ralph said.

“Electrons can do funny things,” he said, adding that tunneling is commonly exploited in all kinds of technologies.

When the spins in the antiferromagnet change their directions inside a tunnel junction, this changes the electrical resistance associated with the tunneling electrons, providing a way to measure the spin dynamics.

Simulated antiferromagnetic dynamics from numerical integration of the coupled LLGS equations. Credit: Science (2025). DOI: 10.1126/science.adq8590

This electrical detection works at very high speeds, at frequencies that most existing technologies are not equipped to measure.

“This is one of our breakthroughs: that we’re using this tunneling behavior, which is this quantum mechanical electron behavior, to really read out these extremely fast oscillations,” said co-corresponding author Kelly Luo, a former Presidential Postdoc/Kavli Institute at Cornell Experimental Fellow at Cornell, now an assistant professor at the University of Southern California.

Their breakthroughs came in part by interweaving two fields: 2D materials and spintronics, also known as spin electronics, said lead author Thow Min Jerald Cham, Ph.D.

To help control the spins within the 2D antiferromagnet, the researchers used a mechanism known as spin-orbit torque: they passed a charge current through a material to generate a spin current, which interacts with the magnet and applies a torque that sets its spins in motion.

“We were mainly searching for a way to manipulate the spins so that we could detect the 2D layers separately, and we couldn’t really distinguish which layer was doing what. Then we came up with this idea, where we could break the symmetry by twisting the layers,” said Cham, who is now a postdoctoral scholar at California Institute of Technology.

“With this geometry, we can use applied currents with spin-orbit torque to apply a force to just one of the spin layers and not the other, a first step for controlling the spin dynamics,” Ralph said.

“Our studies show that antiferromagnetic materials have great potential,” the researchers wrote, “for realizing nano-oscillators for high-frequency applications.” This is an avenue they continue to explore.

Other co-authors are Xiaoxi Huang, postdoctoral associate in Ralph’s lab; Daniel G. Chica and Xavier Roy, Columbia University; and Kenji Watanabe and Takashi Taniguchi, National Institute for Materials Science, Japan.

Adding up Feynman diagrams to make predictions about real materials

Caltech scientists have found a fast and efficient way to add up large numbers of Feynman diagrams, the simple drawings physicists use to represent particle interactions. The new method has already enabled the researchers to solve a longstanding problem in the materials science and physics worlds known as the polaron problem, giving scientists and engineers a way to predict how electrons will flow in certain materials, both conventional and quantum.

In the 1940s, physicist Richard Feynman first proposed a way to represent the various interactions that take place between electrons, photons, and other fundamental particles using 2D drawings that involve straight and wavy lines intersecting at vertices. Though they look simple, these Feynman diagrams allow scientists to calculate the probability that a particular collision, or scattering, will take place between particles.

Since particles can interact in many ways, many different diagrams are needed to depict every possible interaction. And each diagram represents a mathematical expression. Therefore, by summing all the possible diagrams, scientists can arrive at quantitative values related to particular interactions and scattering probabilities.

“Summing all Feynman diagrams with quantitative accuracy is a holy grail in theoretical physics,” says Marco Bernardi, professor of applied physics, physics, and materials science at Caltech.

“We have attacked the polaron problem by adding up all the diagrams for the so-called electron-phonon interaction, essentially up to an infinite order.”

In a paper published in Nature Physics, the Caltech team uses its new method to precisely compute the strength of electron-phonon interactions and to predict associated effects quantitatively. The lead author of the paper is graduate student Yao Luo, a member of Bernardi’s group.

For some materials, such as simple metals, the electrons moving inside the crystal structure will interact only weakly with its atomic vibrations. For such materials, scientists can use a method called perturbation theory to describe the interactions that occur between electrons and phonons, which can be thought of as “units” of atomic vibration.

Perturbation theory is a good approximation in these systems because each successive order of interaction becomes less important. That means that computing only one or a few Feynman diagrams—a calculation that can be done routinely—is sufficient to obtain accurate electron-phonon interactions in these materials.
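The contrast between the weak- and strong-coupling regimes can be made concrete with a toy series in which the order-n contribution scales as g^n (a drastic simplification; real diagram counts and weights are far more complicated):

```python
# Toy "diagram series": the order-n contribution scales like g**n.
# Weak coupling (g << 1): the first few orders capture nearly everything,
# which is why one or a few diagrams suffice for simple metals.
# Strong coupling (g > 1): each order matters more than the last,
# and truncating the series fails -- the polaron situation.

def partial_sum(g, max_order):
    return sum(g ** n for n in range(max_order + 1))

exact_weak = 1.0 / (1.0 - 0.2)  # geometric-series limit for g = 0.2

for order in (1, 2, 4):
    print(f"g = 0.2, truncated at order {order}: "
          f"{partial_sum(0.2, order):.4f} (exact: {exact_weak:.4f})")

for order in (1, 2, 4):
    print(f"g = 1.5, truncated at order {order}: "
          f"{partial_sum(1.5, order):.2f} (no convergence)")
```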

Introducing polarons
But for many other materials, electrons interact much more strongly with the atomic lattice, forming entangled electron-phonon states known as polarons. Polarons are electrons accompanied by the lattice distortion they induce. They form in a wide range of materials including insulators, semiconductors, materials used in electronics or energy devices, as well as many quantum materials.

For example, an electron placed in a material with ionic bonds will distort the surrounding lattice and form a localized polaron state, resulting in decreased mobility due to the strong electron-phonon interaction. Scientists can study these polaron states by measuring how conductive the electrons are or how they distort the atomic lattice around them.

Perturbation theory does not work for these materials because each successive order is more important than the last. “It’s basically a nightmare in terms of scaling,” says Bernardi.

“If you can calculate the lowest order, it’s very likely that you cannot do the second order, and the third order will just be impossible. The computational cost typically scales prohibitively with interaction order. There are too many diagrams to compute, and the higher-order diagrams are too computationally expensive.”

Polaron energy vs. momentum curves. Credit: Nature Physics (2025). DOI: 10.1038/s41567-025-02954-1
Summing Feynman diagrams
Scientists have long searched for a way to add up all the Feynman diagrams that describe the many ways the electrons in such a material can interact with atomic vibrations. Thus far, such calculations have been dominated by methods in which scientists tune certain parameters to match an experiment.

“But when you do that, you don’t know whether you’ve actually understood the mechanism or not,” says Bernardi. Instead, his group focuses on solving problems from “first principles,” meaning beginning with nothing more than the positions of atoms within a material and using the equations of quantum mechanics.

When thinking about the scope of this problem, Luo says to imagine trying to predict how the stock market might behave tomorrow. To attempt this, one would need to consider every interaction between every trader over some period to get precise predictions of the market’s dynamics.

Luo wants to understand all the interactions between electrons and phonons in materials where that coupling is strong. But as with predicting the stock market, the number of possible interactions is prohibitively large. “It is actually impossible to calculate directly,” he says. “The only thing we can do is use a smart way of sampling all these scattering processes.”

Betting on Monte Carlo
Caltech researchers are addressing this problem by applying a technique called diagrammatic Monte Carlo (DMC), in which an algorithm randomly samples spots within the space of all Feynman diagrams for a system, but with some guidance in terms of the most important places to sample.

“We set up some rules to move effectively, with high agility, within the space of Feynman diagrams,” explains Bernardi.
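A minimal caricature of guided sampling in diagram space, assuming toy weights w(n) = g^n/n! rather than genuine first-principles diagram values:

```python
import math, random

random.seed(1)

# Toy diagram weights: w(n) = g**n / n!, so the sum over all orders is e**g.
# DMC sketch: a Metropolis random walk over diagram order n (proposing to
# add or remove one "vertex") samples orders in proportion to their weight;
# the full sum is then recovered from how often the walk visits order 0.

g = 1.3

def w(n):
    return g ** n / math.factorial(n)

n = 0
visits_zero = 0
steps = 200_000
for _ in range(steps):
    proposal = n + random.choice((-1, 1))
    if proposal >= 0 and random.random() < min(1.0, w(proposal) / w(n)):
        n = proposal
    if n == 0:
        visits_zero += 1

S_est = w(0) * steps / visits_zero  # since P(n = 0) = w(0) / S
print(f"estimated sum: {S_est:.3f}   exact e**g: {math.exp(g):.3f}")
```

In the real method the moves add or remove phonon lines in actual diagrams and the weights come from first-principles electron-phonon matrix elements, but the random walk and acceptance rule play the same role as in this sketch.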

The Caltech team overcame the enormous amount of computing that would normally be required to apply DMC to real materials with first-principles methods by relying on a technique they reported last year that compresses the matrices representing electron-phonon interactions.

Another major advance is nearly removing the so-called “sign problem” in electron-phonon DMC using a clever technique that views diagrams as products of tensors, mathematical objects expressed as multi-dimensional matrices.

“The clever diagram sampling, sign-problem removal, and electron-phonon matrix compression are the three key pieces of the puzzle that have enabled this paradigm shift in the polaron problem,” says Bernardi.

In the new paper, the researchers applied DMC calculations to diverse systems that contain polarons, including lithium fluoride, titanium dioxide, and strontium titanate. The scientists say their work opens up a wide range of predictions relevant to experiments on both conventional and quantum materials—including electrical transport, spectroscopy, superconductivity, and other properties of materials with strong electron-phonon coupling.

“We have successfully described polarons in materials using DMC, but the method we developed could also help study strong interactions between light and matter, or even provide the blueprint to efficiently add up Feynman diagrams in entirely different physical theories,” says Bernardi.

Tunable laser light: Ring design could be used in telecom, medicine and more

Researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and Technical University of Vienna (TU Wien) have invented a new type of tunable semiconductor laser that combines the best attributes of today’s most advanced laser products, demonstrating smooth, reliable, wide-range wavelength tuning in a simple, chip-sized design.

Tunable lasers, or lasers whose light output wavelengths can be changed and controlled, are integral to many technologies, from high-speed telecommunications to medical diagnostics to safety inspections of gas pipelines.

Yet laser technology faces trade-offs: lasers that emit across a wide range of wavelengths, or colors, sacrifice the accuracy of each color, while lasers that can precisely tune to many colors become complicated and expensive because they commonly require moving parts.

The new Harvard device could one day replace many types of tunable lasers in a smaller, more cost-effective package.

The research is published in Optica and was co-led by Federico Capasso, the Robert L. Wallace Professor of Applied Physics and Vinton Hayes Senior Research Fellow in Electrical Engineering at SEAS, and Professor Benedikt Schwarz at TU Wien, with whom Capasso’s group has maintained a longstanding research partnership.

The researchers initially demonstrated a laser that emits light in the mid-infrared wavelength range because that’s where quantum cascade lasers, upon which their architecture is based, typically emit.

“The versatility of this new platform means that similar lasers can be fabricated at more commercially relevant wavelengths, such as for telecommunications applications, for medical diagnostics, or for any laser that emits in the visible spectrum of light,” said Capasso, who co-invented the quantum cascade laser in 1994.

The new laser consists of multiple tiny ring-shaped lasers, each a slightly different size, and all connected to the same waveguide. Each ring emits light of a different wavelength, and by adjusting electric current input, the laser can smoothly tune between different wavelengths.

The clever and compact design ensures the laser emits only one wavelength at a time, remains stable even in harsh environments, and can be easily scaled. The rings function either one at a time or all together to make a stronger beam.

“By adjusting the size of the ring, we can effectively target any line we want, and any lasing frequency we want,” said co-lead author Theodore Letsou, an MIT graduate student and research fellow in Capasso’s lab at Harvard.

“All the light from every single laser gets coupled through the same waveguide and is formed into the same beam. This is quite powerful, because we can extend the tuning range of typical semiconductor lasers, and we can target individual wavelengths using a different ring radius.”
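The wavelength targeting described here follows the standard ring-resonance condition m·λ = 2πR·n_eff, with m an integer. A short sketch, using an assumed effective index and illustrative radii rather than the device’s actual parameters:

```python
import numpy as np

n_eff = 3.2      # assumed effective refractive index of the waveguide
target = 8.0e-6  # mid-infrared design wavelength, in meters

def resonances(R, lam_min, lam_max):
    """Resonant wavelengths of a ring of radius R: m * lam = 2*pi*R*n_eff."""
    path = 2 * np.pi * R * n_eff
    m_lo = max(1, int(path / lam_max))
    m_hi = int(path / lam_min) + 1
    lams = [path / m for m in range(m_lo, m_hi + 1)]
    return [lam for lam in lams if lam_min <= lam <= lam_max]

# Rings with slightly different radii: each ring's comb of resonances is
# shifted, so each ring can "own" a slightly different emission wavelength.
base_R = 100e-6
for k in range(4):
    R = base_R * (1 + 0.002 * k)  # 0.2% radius increments (illustrative)
    lams = resonances(R, 7.9e-6, 8.1e-6)
    nearest = min(lams, key=lambda lam: abs(lam - target))
    print(f"ring {k}: R = {R * 1e6:.2f} um, "
          f"mode nearest 8 um = {nearest * 1e6:.4f} um")
```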

“What’s really nice about our laser is the simplicity of fabrication,” added co-lead author Johannes Fuchsberger, a graduate student at TU Wien, where the team fabricated the devices in the cleanroom facilities of the university’s Center for Micro and Nanostructures. “We have no mechanically movable parts and an easy fabrication scheme that results in a small footprint.”

The KATRIN experiment sets new constraints on general neutrino interactions

Neutrinos are elementary particles that the standard model of particle physics predicts to be massless, yet their observed oscillations imply that they do in fact have a mass, albeit a very small one. A further characteristic of these particles is that they interact only weakly with other matter, which makes them very difficult to detect using conventional experimental methods.

The KATRIN (Karlsruhe Tritium Neutrino) experiment is a large-scale research effort aimed at precisely measuring the effective mass of the electron anti-neutrino using advanced instruments located at the Karlsruhe Institute of Technology (KIT) in Germany.

The researchers involved in this experiment recently published in Physical Review Letters a new analysis of data from the second measurement campaign, setting new constraints on interactions involving neutrinos that could arise from physics not explained by the standard model, known as general neutrino interactions.

“We know that beyond standard model (BSM) physics is hiding in the neutrino sector, but we don’t know what it looks like yet,” Caroline Fengler, lead analyst for this search, told Phys.org. “That’s what has motivated us already in the past to look for various BSM physics phenomena with KATRIN, such as light and heavy sterile neutrinos and Lorentz invariance violations.

“The theory work by the group of Werner Rodejohann then gave us the incentive to broaden our search to any possible new neutrino interactions that might contribute to the weak interaction of the beta decay.”

The new interactions the researchers searched for could hint at various exciting physical phenomena that are not predicted by the standard model of particle physics but have been widely explored by theorists. For instance, they could indicate the presence of hypothetical particles such as right-handed W bosons, charged Higgs bosons, and leptoquarks.

Inside the KATRIN spectrometer. Credit: Michael Zacher
“The main purpose of the KATRIN experiment is to measure the mass of the neutrino,” explained Fengler. “This is done through a highly precise measurement of the energy spectrum of the electrons originating from tritium beta decay, using a high-activity tritium source and a one-of-a-kind electron spectrometer. The shape of the recorded beta spectrum then contains information about the neutrino mass and other BSM physics contributions.”

Notably, general neutrino interactions are predicted to prompt characteristic shape deformations of the so-called beta spectrum, which is the distribution of electron energies emitted during a type of radioactive decay known as beta decay. The KATRIN collaboration thus set out to search for these beta spectrum deformations in the data collected as part of the experiment.

“With only a small part (5%) of the final KATRIN dataset, we were already able to set competitive constraints on some of the investigated new neutrino interactions compared to the global constraints from other low-energy experiments,” said Fengler. “This shows that the KATRIN experiment is sensitive to these new interactions.”

While the KATRIN experiment has not yet detected signs of general neutrino interactions, it has set competitive constraints on the strength of these elusive interactions using a new experimental approach. The KATRIN collaboration hopes that these constraints will contribute to the future search for physics beyond the standard model.

“We are already working on further improving our sensitivity on the general neutrino interactions with KATRIN by extending the data set and fine-tuning our analysis approach,” added Fengler. “With the beginning of the upcoming TRISTAN phase at KATRIN in 2026, which is set to search for keV sterile neutrinos with the help of an upgraded detector, we will gain access to another powerful data set, which promises to greatly improve our sensitivity in the future.”

Can the Large Hadron Collider snap string theory?

In physics, there are two great pillars of thought that don’t quite fit together. The Standard Model of particle physics describes all known fundamental particles and three forces: electromagnetism, the strong nuclear force, and the weak nuclear force. Meanwhile, Einstein’s general relativity describes gravity and the fabric of spacetime.

However, these frameworks are fundamentally incompatible in many ways, says Jonathan Heckman, a theoretical physicist at the University of Pennsylvania. The Standard Model treats forces as dynamic fields of particles, while general relativity treats gravity as the smooth geometry of spacetime, so gravity “doesn’t fit into physics’s Standard Model,” he explains.

In a recent paper in Physical Review Research, Heckman, Rebecca Hicks, a Ph.D. student at Penn’s School of Arts & Sciences, and their collaborators turn that critique on its head. Instead of asking what string theory predicts, the authors ask what it definitively cannot create. Their answer points to a single exotic particle that could show up at the Large Hadron Collider (LHC). If that particle appears, the entire string-theory edifice would be, in Heckman’s words, “in enormous trouble.”

String theory: The good, the bad, the energy-hungry
For decades, physicists have sought a unified theory that can reconcile quantum mechanics, and by extension the behavior of subatomic particles, with gravity—which general relativity describes geometrically but which is not fully understood within quantum contexts, Heckman says.

A good contender for marrying gravity and quantum phenomena is string theory, which posits that all particles, including a hypothetical one representing gravity, are tiny vibrating strings and which promises a single framework encompassing all forces and matter.

“But one of the drawbacks of string theory is that it operates in high-dimensional math and a vast ‘landscape’ of possible universes, making it fiendishly difficult to test experimentally,” Heckman says, pointing to how string theory necessitates more than the familiar four dimensions—x, y, z, and time—to be mathematically consistent.

“Most versions of string theory require a total of 10 or 11 spacetime dimensions, with the extra dimensions being sort of ‘curled up’ or folding in on one another to extremely small scales,” Hicks says.

To make matters even trickier, string theory’s distinctive behaviors only clearly reveal themselves at enormous energies, “those far beyond what we typically encounter or even generate in current colliders,” Heckman says.

Hicks likens it to zooming in on a distant object: At everyday, lower energies, strings look like regular point-like particles, just as a faraway rope might appear to be a single line.

“But when you crank the energy way up, you start seeing the interactions as they truly are—strings vibrating and colliding,” she explains. “At lower energies, the details get lost, and we just see the familiar particles again. It’s like how from far away, you can’t make out the individual fibers in the rope. You just see a single, smooth line.”

That’s why physicists hunting for signatures of string theory must push their colliders—like the LHC—to ever-higher energies, hoping to catch glimpses of fundamental strings rather than just their lower-energy disguises as ordinary particles.

Why serve string theory a particle it likely won’t be able to return?
Testing a theory often means searching for predictions that confirm its validity. But a more powerful test, Heckman says, is finding exactly where a theory fails. If scientists discover that something a theory forbids actually exists, the theory is fundamentally incomplete or flawed.

Because string theory’s predictions are vast and varied, the researchers instead asked if there’s a simple particle scenario that string theory just can’t accommodate.

They zeroed in on how string theory deals with particle “families,” groups of related particles bound together by the rules of the weak nuclear force, which is responsible for radioactive decay. Typically these families are small: the electron and its neutrino sibling, for instance, form a tidy two-member package called a doublet. String theory handles such modest families without issue.

However, Heckman and Hicks identified a family that is conspicuously absent from any known string-based calculation: a five-member particle package, or a 5-plet. Heckman likens this to trying to order a Whopper meal from McDonald’s: “No matter how creatively you search the menu, it never materializes.”

“We scoured every toolbox we have, and this five-member package just never shows up,” Heckman says.

But what exactly is this elusive 5-plet?
Hicks explains it as an expanded version of the doublet: “The 5-plet is its supersized cousin, packing five related particles together.”

Physicists encapsulate this particle family in a concise mathematical formula known as the Lagrangian, essentially the particle-physics cookbook. The particle itself is called a Majorana fermion, meaning it acts as its own antiparticle, akin to a coin that has heads on both sides.

Identifying such a particle would directly contradict what current string theory models predict is possible, making the detection of this specific particle family at the LHC a high-stakes test, one that could potentially snap string theory.

Why a 5-plet hasn’t been spotted and the vanishing-track clue
Hicks cites two major hurdles for spotting these 5-plet structures: “production and subtlety.”

In a collider, energy can literally turn into mass; Einstein’s E = mc² says that enough kinetic oomph (E) can be converted into the heft (m) of brand-new particles, so the heavier the quarry the rarer the creation event.

“The LHC has to slam protons together hard enough to conjure these hefty particles out of pure energy,” Hicks explains. “As the masses of these particles climb toward a trillion electron volts, the chance of creating them drops dramatically.”
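As a back-of-the-envelope illustration of the scale involved (using standard physical constants, not figures from the paper), the mass–energy conversion can be made concrete:

```python
# E = m c^2 at collider scales: how much energy must a collision
# concentrate to create a 1 TeV (trillion electron volt) particle?
c = 299_792_458.0           # speed of light, m/s
eV = 1.602_176_634e-19      # one electron volt in joules

energy_J = 1e12 * eV        # 1 TeV expressed in joules
m_tev = energy_J / c**2     # equivalent rest mass, kg

# About 1.6e-7 J -- tiny in everyday terms, but it must be packed
# into a single proton-proton collision to materialize the particle.
print(f"1 TeV = {energy_J:.2e} J, equivalent mass {m_tev:.2e} kg")
```

The energy itself is modest by macroscopic standards; the difficulty is concentrating it into one subatomic collision, which is why heavier particles are produced ever more rarely.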

Even if produced, detection is challenging. The charged particles in the 5-plet decay very quickly into nearly invisible products.

“The heavier states decay into a soft pion and an invisible neutral particle, X0,” Hicks says. “The pion is so low-energy it’s basically invisible, and X0 passes straight through. The result is a track that vanishes mid-detector, like footprints in snow suddenly stopping.”

Those signature tracks get picked up by LHC’s ATLAS (short for A Toroidal LHC ApparatuS) and CMS (Compact Muon Solenoid), house-sized “digital cameras” wrapped around the collision center. They sit at opposite collision points and operate independently, giving the physics community two sets of eyes on every big discovery. Penn physicists like Hicks are part of the ATLAS Collaboration, helping perform the searches that look for quirky signals like disappearing tracks.

ATLAS’s wheel-like end-cap reveals the maze of sensors primed to catch proton smash-ups at the LHC. Researchers comb through billions of events in search of fleeting “ghost” tracks that might expose cracks in string theory. Credit: CERN
Why a 5-plet matters for dark matter
Hicks says finding the 5-plet isn’t only important for testing string theory, pointing to another exciting possibility: “The neutral member of the 5-plet could explain dark matter, the mysterious substance making up most of our universe’s matter.”

Dark matter constitutes roughly 85% of all matter in the universe, yet scientists still don’t know what exactly it is.

“If the 5-plet weighs around 10 TeV—about 10,000 proton masses—it neatly fits theories about dark matter’s formation after the Big Bang,” Hicks says. “Even lighter 5-plets could still play a role as part of a broader dark matter landscape.”
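The “about 10,000 proton masses” figure is easy to sanity-check with the standard proton rest mass (a textbook value, not from the article):

```python
# Express a 10 TeV particle mass in units of proton masses.
proton_mass_gev = 0.938272   # proton rest mass, GeV/c^2 (standard value)
mass_gev = 10_000.0          # 10 TeV, the 5-plet mass quoted in the article

ratio = mass_gev / proton_mass_gev
print(f"10 TeV is about {ratio:,.0f} proton masses")
```

The result, roughly 10,700, matches the article’s round figure of 10,000.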

“If we detect a 5-plet, it’s a double win,” says Hicks. “We’d have disproven key predictions of string theory and simultaneously uncovered new clues about dark matter.”

What the LHC has already ruled out
Using existing ATLAS data from collider runs, the team searched specifically for 5-plet signals. “We reinterpreted searches originally designed for ‘charginos’—hypothetical charged particles predicted by supersymmetry—and looked for 5-plet signatures,” Hicks says of the team’s search through the repurposed ATLAS disappearing-track data. “We have found no evidence yet, which means any 5-plet particle must weigh at least 650–700 GeV, five times heavier than the Higgs boson.”
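The “five times heavier than the Higgs boson” comparison checks out against the measured Higgs mass of about 125 GeV (a standard value, not stated in the article):

```python
# Compare the 5-plet lower bound to the Higgs boson mass.
higgs_gev = 125.0      # Higgs boson mass, GeV (standard value)
lower_bound = 650.0    # 5-plet mass floor from the reinterpreted ATLAS data

print(f"{lower_bound / higgs_gev:.1f}x the Higgs mass")
```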

For context, Heckman says, “this early result is already a strong statement; it means lighter 5-plets don’t exist. But heavier ones are still very much on the table.”

Future searches with upgraded LHC experiments promise even sharper tests. “We’re not rooting for string theory to fail,” Hicks says. “We’re stress-testing it, applying more pressure to see if it holds up.”

“If string theory survives, fantastic,” Heckman says. “If it snaps, we’ll learn something profound about nature.”

Polymer-protected DNA sensors enable two-month storage for 50-cent disease diagnostics

Using an inexpensive electrode coated with DNA, MIT researchers have designed disposable diagnostics that could be adapted to detect a variety of diseases, including cancer or infectious diseases such as influenza and HIV.

These electrochemical sensors make use of a DNA-chopping enzyme found in the CRISPR gene-editing system. When a target such as a cancerous gene is detected by the enzyme, it begins shearing DNA from the electrode nonspecifically, like a lawnmower cutting grass, altering the electrical signal produced.

One of the main limitations of this type of sensing technology is that the DNA that coats the electrode breaks down quickly, so the sensors can’t be stored for very long and their storage conditions must be tightly controlled, limiting where they can be used. In a new study, MIT researchers stabilized the DNA with a polymer coating, allowing the sensors to be stored for up to two months, even at high temperatures. After storage, the sensors were able to detect a prostate cancer gene that is often used to diagnose the disease.

The DNA-based sensors, which cost only about 50 cents to make, could offer a cheaper way to diagnose many diseases in low-resource regions, says Ariel Furst, the Paul M. Cook Career Development Assistant Professor of Chemical Engineering at MIT and the senior author of the study.

“Our focus is on diagnostics that many people have limited access to, and our goal is to create a point-of-use sensor. People wouldn’t even need to be in a clinic to use it. You could do it at home,” Furst says.

MIT graduate student Xingcheng Zhou is the lead author of the paper published in the journal ACS Sensors. Other authors of the paper are MIT undergraduate Jessica Slaughter, Smah Riki ’24, and graduate student Chao Chi Kuo.

An inexpensive sensor
Electrochemical sensors work by measuring changes in the flow of an electric current when a target molecule interacts with an enzyme. This is the same technology that glucose meters use to detect concentrations of glucose in a blood sample.

The electrochemical sensors developed in Furst’s lab consist of DNA adhered to an inexpensive gold leaf electrode, which is laminated onto a sheet of plastic. The DNA is attached to the electrode using a sulfur-containing molecule known as a thiol.

In a 2021 study, Furst’s lab showed that they could use these sensors to detect genetic material from HIV and human papillomavirus (HPV). The sensors detect their targets using a guide RNA strand, which can be designed to bind to nearly any DNA or RNA sequence. The guide RNA is linked to an enzyme called Cas12, which cleaves DNA nonspecifically when it is turned on and is in the same family of proteins as the Cas9 enzyme used for CRISPR genome editing.

If the target is present, it binds to the guide RNA and activates Cas12, which then cuts the DNA adhered to the electrode. That alters the current produced by the electrode, which can be measured using a potentiostat (the same technology used in handheld glucose meters).

“If Cas12 is on, it’s like a lawnmower that cuts off all the DNA on your electrode, and that turns off your signal,” Furst says.
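The detection logic described above (active Cas12 shears the electrode’s DNA and collapses the signal) can be sketched as a simple threshold test. This is an illustrative sketch only; the function name and the decision threshold are hypothetical, and the paper’s actual signal analysis is not described at this level of detail.

```python
def target_detected(baseline_current, measured_current, drop_fraction=0.5):
    """Flag a positive result when the electrode signal collapses.

    A large drop relative to baseline suggests Cas12 was activated by
    the target and cleaved the surface DNA. `drop_fraction` is a
    hypothetical decision threshold, not a value from the study.
    """
    return measured_current < baseline_current * (1.0 - drop_fraction)

# Signal fell from 10 uA to 3 uA: consistent with an activated sensor.
print(target_detected(10.0, 3.0))   # True
print(target_detected(10.0, 9.0))   # False: signal largely intact
```

Note the inverted readout: the presence of the target turns the signal *off*, the opposite of sensors where binding generates a new signal.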

In previous versions of the device, the DNA had to be added to the electrode just before it was used, because DNA doesn’t remain stable for very long. In the new study, the researchers found that they could increase the stability of the DNA by coating it with a polymer called polyvinyl alcohol (PVA).

This polymer, which costs less than 1 cent per coating, acts like a tarp that protects the DNA below it. Once deposited onto the electrode, the polymer dries to form a protective thin film.

“Once it’s dried, it seems to make a very strong barrier against the main things that can harm DNA, such as reactive oxygen species that can either damage the DNA itself or break the thiol bond with the gold and strip your DNA off the electrode,” Furst says.

New fluorescent probe enables rapid, visible detection of harmful pesticide residues

A team of researchers led by Prof. Jiang Changlong from the Hefei Institutes of Physical Science of the Chinese Academy of Sciences has developed a fast and simple way to detect harmful pesticide residues, with results visible to the naked eye in just 10 seconds. The study was published in Analytical Chemistry.

While pyrethroids are effective against pests, overexposure to them can cause health issues such as dizziness and breathing problems. Detecting them usually requires lab equipment and time-consuming procedures, but the new probe offers a much faster and easier solution.

“We created a new fluorescent probe that lights up in different colors when it comes into contact with pyrethroids,” said Liu Anqi, a member of the team. “It’s a widely used type of insecticide found in many household and agricultural products.”

The probe consists of a special fluorescent dye combined with a common protein to form a molecular complex. Under ultraviolet light, the complex emits a yellow glow. When it encounters pyrethroid molecules, the fluorescence shifts to green—an easily distinguishable change visible to the unaided eye within 10 seconds.

Tests showed that the probe is capable of detecting a broad range of pyrethroid concentrations with high sensitivity. For practical use, the researchers also developed a portable system: users photograph the glowing sample with a smartphone and estimate pesticide levels through image color analysis, enabling rapid field testing in kitchens, markets, or agricultural settings.
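The article does not detail the image-analysis step; a minimal sketch of the general idea, assuming the yellow-to-green shift is tracked via a green-to-red channel ratio and a hypothetical linear calibration, might look like:

```python
def estimate_concentration(pixel_rgb, slope=50.0, intercept=-10.0):
    """Map an average fluorescence color to a pesticide-level estimate.

    `slope` and `intercept` are hypothetical calibration constants; a
    real device would fit them against samples of known concentration.
    """
    r, g, b = pixel_rgb
    # A yellow glow has both red and green high; a green glow has green
    # dominating red, so the green/red ratio tracks the color shift.
    ratio = g / max(r, 1)
    return max(0.0, slope * ratio + intercept)

# Averaged bright-region pixels from two smartphone photos:
print(estimate_concentration((180, 200, 40)))  # greener sample
print(estimate_concentration((200, 180, 40)))  # yellower -> lower reading
```

In practice the phone image would first be cropped to the glowing region and averaged, and the calibration would be nonlinear near the detection limits.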

To detect pesticide vapors—such as those emitted by mosquito coils—the researchers went a step further. They embedded the fluorescent probe into a lightweight, sponge-like aerogel material. This aerogel traps airborne pesticides and shows a visible color change from orange to green. It’s one of the first tools of its kind designed for gas-phase pesticide detection.

This breakthrough opens new doors for fast, accessible pesticide monitoring, helping improve food safety and reduce health risks.