The superconducting (SC) diode effect is an intriguing nonreciprocal phenomenon in which a material carries current without resistance in one direction but is resistive in the other. This effect has been the focus of numerous physics studies, as its observation and reliable control in different materials could enable the future development of new integrated circuits.
Researchers at RIKEN and other institutes in Japan and the United States recently observed the SC diode effect in a newly developed device consisting of two coherently coupled Josephson junctions. Their paper, published in Nature Physics, could guide the engineering of promising technologies based on coupled Josephson junctions.
“We experimentally studied the nonlocal Josephson effect, which is a characteristic form of SC transport in coherently coupled Josephson junctions (JJs), inspired by a previous theoretical paper published in Nano Letters,” Sadashige Matsuo, one of the researchers who carried out the study, told Phys.org.
The recent work by Matsuo and his colleagues builds on their previous research efforts focusing on SC transport in coherently coupled JJs. To conduct their experiments, the team used a device that consists of two JJs sharing a single SC lead.
“When the shared SC lead is narrow, the two JJs are coherently coupled and interact with each other,” Matsuo explained. “By embedding one JJ in an SC loop and measuring the other JJ, we can study the SC transport of one JJ as it is affected by the other through the coherent coupling.”
By modulating the phase of the coupled JJs in their device, Matsuo and his colleagues were ultimately able to produce the SC diode effect. Their work thus unveils a promising and reliable strategy for realizing this effect in devices based on coupled JJs, while also shedding further light on the physics underpinning the effect in these devices.
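For orientation, the strength of an SC diode effect is commonly quantified (in the literature generally, not specifically in this paper) by a rectification efficiency built from the critical currents in the two directions:

$$ \eta = \frac{I_c^{+} - |I_c^{-}|}{I_c^{+} + |I_c^{-}|}, $$

where $I_c^{+}$ and $I_c^{-}$ are the critical currents for positive and negative bias; any nonzero $\eta$ signals nonreciprocal supercurrent flow.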
“The SC diode effect itself is important because the phenomenon can be applied to dissipationless rectification in future SC circuits,” Matsuo said. “Additionally, the SC diode effect emerges when SC devices do not have time-reversal and spatial-inversion symmetries. Therefore, our results suggest that the phase control of the coupled JJs can break such symmetries. This means that other exotic SC phenomena expected when these symmetries are broken may be realized in the coupled JJs.”
In the future, this recent paper could open new opportunities for the field of electronics engineering. For instance, the methods the team used could be applied to the development of new high-performance superconducting electronic components. Concurrently, the work by Matsuo and his colleagues could inspire other research teams worldwide to carry out similar studies using coupled JJs.
“We now plan to seek exotic SC phenomena other than the SC diode effect by controlling the coherent coupling of the JJs,” Matsuo added.
More information: Sadashige Matsuo et al, Josephson diode effect derived from short-range coherent coupling, Nature Physics (2023). DOI: 10.1038/s41567-023-02144-x
Electro-optic modulators (EOMs) are cardinal elements in optical communication networks, controlling the amplitude, phase and polarization of light via external electric signals. Aiming to realize ultracompact and high-performance EOMs, most investigations nowadays target on-chip devices that combine semiconductor technologies with state-of-the-art tunable materials. Nevertheless, integrated EOMs, as independent on-chip elements, are commonly separated from light sources.
Thus, extra interfaces that couple light from the light sources into the waveguides of on-chip devices are indispensable. Although state-of-the-art coupling schemes such as edge coupling and grating coupling have been employed, they still suffer from limited integration densities and narrow-band operation, respectively.
Moreover, both coupling schemes require ultra-accurate alignments and complex encapsulations, making on-chip devices expensive for customers. Therefore, an EOM device that circumvents coupling complexity and further reduces coupling losses is needed.
In a new paper published in Light: Science & Applications, a team of scientists has developed methodologies that directly integrate EOM devices on the facet of single-mode optical fiber jumpers, connecting EOM devices with light sources using standard interfaces of optical fibers.
“Embracing the standard nanofabrication methodologies developed in our previous work, the EOM block can be directly integrated on the tips of single-mode optical fibers, so the metafiber EOMs intrinsically avoid the coupling treatment,” Prof. Min Qiu said.
Such plasmonic metafiber EOMs feature a well-defined plasmonic-organic hybrid configuration. Benefiting from ultrathin, high-quality-factor plasmonic metasurfaces and nanofabrication-friendly, highly efficient EO polymers, the spectral amplitude and quality factor of the transmitted light are well controlled, promoting the resonance sensitivity needed for EO modulation.
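For context, the electro-optic response of poled polymers like these is conventionally described by the Pockels effect (a textbook relation, not a result of this paper): an applied field $E$ shifts the refractive index by approximately

$$ \Delta n \approx -\tfrac{1}{2}\, n^{3}\, r_{33}\, E, $$

where $n$ is the unperturbed index and $r_{33}$ is the dominant electro-optic coefficient. Even a small $\Delta n$ detunes a high-quality-factor resonance appreciably, which is why sharp plasmonic resonances boost modulation efficiency.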
“More interestingly, by rationally designing the plasmonic modes, resonant waveguided modes and Fabry-Perot modes, tunable dual-band operation can be achieved in the telecom O and S bands,” co-first authors Lei Zhang and Xinyu Sun added.
The metafiber EOMs were further driven by direct- and alternating-current electrical signals. The modulation speed of the metafiber EOM can reach as high as 1000 MHz at a bias voltage of ±9 V, the best performance yet reported for lumped fiber-integrated EOMs.
“Such metafiber EOMs provide new perspectives on designing ultracompact and high-performance EO devices for applications where a compact configuration, high integration capability and low coupling loss are required, such as active mode-locking fiber lasers and tunable broadband fiber polarizers. This work also offers an avenue to ‘plug-and-play’ implementations of EO devices and ultracompact ‘all-in-fiber’ optical systems for communications, imaging, sensing and many others,” Prof. Jiyong Wang added.
More information: Lei Zhang et al, Plasmonic metafibers electro-optic modulators, Light: Science & Applications (2023). DOI: 10.1038/s41377-023-01255-7
Triplons are tricky little things. Experimentally, they’re exceedingly difficult to observe. And even then, researchers usually conduct the tests on macroscopic materials, in which measurements are expressed as an average across the whole sample.
That’s where designer quantum materials offer a unique advantage, says Academy Research Fellow Robert Drost, the first author of a paper published in Physical Review Letters. These designer quantum materials let researchers create phenomena not found in natural compounds, ultimately enabling the realization of exotic quantum excitations.
“These materials are very complex. They give you very exciting physics, but the most exotic ones are also challenging to find and study. So, we are trying a different approach here by building an artificial material using individual components,” says Professor Peter Liljeroth, head of the Atomic Scale physics research group at Aalto University.
Quantum materials are governed by the interactions between electrons at the microscopic level. These electronic correlations lead to unusual phenomena like high-temperature superconductivity or complex magnetic states, and quantum correlations give rise to new electronic states.
In the case of two electrons, the spins can combine into entangled states known as singlet and triplet states. Supplying energy to the electron system can excite it from the singlet to the triplet state. In some cases, this excitation can propagate through a material as an entanglement wave known as a triplon. These excitations are not present in conventional magnetic materials, and measuring them has remained an open challenge in quantum materials.
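For readers who want the states spelled out, a standard textbook way (not specific to this paper) to write the relevant entangled combinations of two spin-1/2 electrons is

$$ |S\rangle = \tfrac{1}{\sqrt{2}}\left(|\!\uparrow\downarrow\rangle - |\!\downarrow\uparrow\rangle\right), \qquad |T_0\rangle = \tfrac{1}{\sqrt{2}}\left(|\!\uparrow\downarrow\rangle + |\!\downarrow\uparrow\rangle\right), $$

together with the two further triplet states $|\!\uparrow\uparrow\rangle$ and $|\!\downarrow\downarrow\rangle$. A triplon is, loosely, a singlet-to-triplet excitation hopping from site to site through the material.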
The team’s triplon experiments
In the new study, the team used small organic molecules to create an artificial quantum material with unusual magnetic properties. Each of the cobalt-phthalocyanine molecules used in the experiment contains two frontier electrons.
“Using very simple molecular building blocks, we are able to engineer and probe this complex quantum magnet in a way that has never been done before, revealing phenomena not found in its independent parts,” Drost says. “While magnetic excitations in isolated atoms have long been observed using scanning tunneling spectroscopy, it has never been accomplished with propagating triplons.”
“We use these molecules to bundle electrons together, we pack them into a tight space and force them to interact,” continues Drost. “Looking into such a molecule from the outside, we will see the joint physics of both electrons. Because our fundamental building block now contains two electrons, rather than one, we see a very different kind of physics.”
The team monitored magnetic excitations first in individual cobalt-phthalocyanine molecules and later in larger structures like molecular chains and islands. By starting with the very simple and working towards increasing complexity, the researchers hope to understand emergent behavior in quantum materials. In the present study, the team could demonstrate that the singlet-triplet excitations of their building blocks can traverse molecular networks as exotic magnetic quasiparticles known as triplons.
“We show that we can create an exotic quantum magnetic excitation in an artificial material. This strategy shows that we can rationally design material platforms that open up new possibilities in quantum technologies,” says Assistant Professor Jose Lado, one of the study’s co-authors, who heads the Correlated Quantum Materials research group at Aalto University.
The team plans to extend their approach towards more complex building blocks to design other exotic magnetic excitations and ordering in quantum materials. Rational design from simple ingredients will not only help understand the complex physics of correlated electron systems but also establish new platforms for designer quantum materials.
More information: Robert Drost et al, Real-Space Imaging of Triplon Excitations in Engineered Quantum Magnets, Physical Review Letters (2023). DOI: 10.1103/PhysRevLett.131.086701
Researchers from Queen Mary University of London have made a discovery that could change our understanding of the universe. In their study published in Science Advances, they reveal, for the first time, that there is a range in which fundamental constants can vary, allowing for the viscosity needed for life processes to occur within and between living cells. This is an important piece of the puzzle in determining where these constants come from and how they impact life as we know it.
In 2020, the same team found that the viscosity of liquids is determined by fundamental physical constants, setting a limit on how runny a liquid can be. Now the researchers have taken this result into the realm of life sciences.
Fundamental physical constants shape the fabric of the universe we live in. Physical constants are quantities with a value that is generally believed to be both universal in nature and to remain unchanged over time—for example the mass of the electron. They govern nuclear reactions and can lead to the formation of molecular structures essential to life, but their origin is unknown. This research might bring scientists one step closer to determining where these constants come from.
“Understanding how water flows in a cup turns out to be closely related to the grand challenge of figuring out fundamental constants. Life processes in and between living cells require motion, and it is viscosity that sets the properties of this motion. If fundamental constants changed, viscosity would change too, impacting life as we know it. For example, if water were as viscous as tar, life would not exist in its current form, or would not exist at all. This applies beyond water, so all life forms that use the liquid state to function would be affected.”
“Any change in fundamental constants, whether an increase or a decrease, would be equally bad news for flow and for liquid-based life. We expect the window to be quite narrow: for example, our blood would become too thick or too thin for the body to function with only a few percent change in some fundamental constants, such as the Planck constant or the electron charge,” Professor of Physics Kostya Trachenko said.
Remarkably, fundamental constants are thought to have been tuned billions of years ago to produce heavy nuclei in stars, at a time when life as we know it today did not exist. There was no need for these constants to be fine-tuned at that point to also enable cellular life billions of years later, and yet they turn out to be bio-friendly to flow in and between living cells.
An accompanying conjecture is that multiple tunings may have been involved, which suggests a similarity to biological evolution, where traits were acquired independently. Through evolution-like mechanisms, fundamental constants may be the result of nature arriving at sustainable physical structures. It remains to be seen how the principles of evolution can help us understand the origin of fundamental constants.
The U.S. particle physics community is preparing for a major research program with the Deep Underground Neutrino Experiment (DUNE). DUNE will study neutrino oscillations, quantum mechanical oscillations that are only possible because neutrinos have mass, albeit very small masses.
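To see why mass is essential, consider the standard two-flavor oscillation probability (textbook physics, not a result specific to DUNE):

$$ P(\nu_\alpha \to \nu_\beta) = \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^2 \, L}{4E}\right), $$

where $\theta$ is the mixing angle, $L$ the distance traveled, $E$ the neutrino energy and $\Delta m^2$ the difference of the squared masses. If neutrinos were massless, $\Delta m^2 = 0$ and the oscillation probability would vanish.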
Research at DUNE will address key questions about neutrinos, such as whether they and their antineutrino counterparts behave differently. Answering these questions could help explain why the universe is composed of matter and not antimatter.
These studies require a detailed understanding of how neutrinos interact with atomic nuclei and the nucleons (protons and neutrons) that make up nuclei.
By providing new data, DUNE will help scientists advance beyond the current understanding of neutrino-nucleon interactions, which relies upon data from experiments in the 1970s and ’80s.
Scientists use a nuclear theory method called lattice quantum chromodynamics (LQCD) to predict neutrino-nucleon interactions. The LQCD results predict a stronger neutrino-nucleon interaction than predictions determined from older, less precise experimental data. The work was published in the Annual Review of Nuclear and Particle Science.
This research demonstrated the important implications of LQCD results for how scientists interpret neutrino oscillation signals. It also identified the next targets to tackle with LQCD. These findings, combined with modern many-body nuclear theory methods, will reduce potential biases due to incorrect modeling. The findings will also improve scientists’ predictions of these interactions for DUNE and other neutrino experiments.
A recent project by researchers at the University of California, Berkeley and Lawrence Berkeley National Laboratory demonstrated the importance of incorporating state-of-the-art theoretical predictions of the “nucleon axial form factor” into simulations of neutrino-nucleus reactions (a form factor is a measure of the “squishiness” of a particle—the smaller the value, the squishier).
Scientists need these form factors to determine the oscillation properties of the elusive neutrinos that will be explored by DUNE and other leading neutrino oscillation experiments. The most advanced LQCD predictions conflict with the older phenomenological models of the axial form factor, leading to a 30% larger neutrino-nucleon cross-section. This has important implications for the interpretation of the oscillation experiments. These LQCD calculations are made possible by the Department of Energy’s Leadership Class Computing Facilities, which house the fastest supercomputers in the world.
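For reference, the older phenomenological fits mentioned here typically assumed the dipole parameterization of the axial form factor (a standard ansatz in the field, not something introduced by this work):

$$ F_A(Q^2) = \frac{g_A}{\left(1 + Q^2/M_A^2\right)^2}, $$

where $Q^2$ is the squared momentum transfer, $g_A \approx 1.27$ is the nucleon axial coupling and $M_A$ is a fitted “axial mass” of roughly 1 GeV. The LQCD results indicate a form factor larger than this simple shape at higher $Q^2$, which is one way of seeing where the roughly 30% cross-section difference comes from.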
In the exascale computing era, scientists will further refine the LQCD results and tackle additional, more complicated processes. The results will be combined with modern many-body nuclear theory methods to provide more robust predictions of the neutrino-nucleus reactions. These predictions are essential ingredients for interpreting the next-generation neutrino oscillation experiments, such as DUNE, and inferring properties of neutrinos.
More information: Aaron S. Meyer et al, Status of Lattice QCD Determination of Nucleon Form Factors and Their Relevance for the Few-GeV Neutrino Program, Annual Review of Nuclear and Particle Science (2022). DOI: 10.1146/annurev-nucl-010622-120608
The past few weeks have seen a huge surge of interest among scientists and the public in a material called LK-99 after it was claimed to be a superconductor at room temperature and ambient pressure.
LK-99 garnered attention after South Korean researchers posted two papers about it on arXiv, a non-peer-reviewed repository for scientific reports, on July 22. The researchers reported possible indicators of superconductivity in LK-99, including unexpectedly low electrical resistance and partial levitation in a magnetic field.
The potential discovery drew enthusiasm on social media and was widely reported in traditional media too. As a physicist working on quantum phenomena in materials, I was gratified to see the interest in superconductivity, and I shared in the excitement about the report. But I also approached the results with skepticism, especially since many previous reports of room-temperature superconductivity have failed to be reproduced.
Now, after follow-up experiments by scientists around the world, it seems LK-99 is not so special after all. However, while this particular avenue of research may be a dead end, the dream of a room-temperature superconductor is still very much alive.
What are superconductors, and why are they useful?
You’re probably familiar with ordinary conductors, like metals, in which electrons can move fairly easily through the “crystal lattice” of atoms that makes up the material. This means an electric current can flow—but the electrons are jostled around a bit as they move, so they lose energy as they travel. (This jostling is called electrical resistance.)
In a superconductor, there is zero resistance and an electrical current can flow perfectly smoothly without losing any energy. Many metals become superconductors at very low temperatures.
Superconductivity occurs when the electrons slightly distort the crystal lattice of the metal in a way that makes them team up into “Cooper pairs.” These pairs of electrons then “condense” into a superfluid, a state of matter that can flow without friction.
Superconductors are very useful. They can be used to create extremely powerful electromagnets, such as those in MRI scanners, particle accelerators, fusion reactors and maglev trains.
Current superconductors work only at ultra-cold temperatures, so they require expensive refrigeration. A material that superconducts at everyday temperature and pressure could be used much more widely.
Currently, the highest superconducting temperatures at ambient pressure are around −138°C (135 K), found in “cuprate” superconductors, a family of copper-containing compounds discovered unexpectedly in 1986. Electron pairing in the cuprates appears to involve a different mechanism from interaction with the lattice.
However, while our understanding of such exotic superconductors has improved, we still can’t predict with any certainty new materials that could superconduct at even higher temperatures. Still, there is no reason to think this can’t be achieved. Moreover, many if not most superconducting materials are discovered serendipitously—so a claimed discovery of an unexpected room-temperature superconductor can’t be dismissed out of hand.
So what about LK-99?
LK-99 is a compound containing oxygen, phosphorus, lead and copper. Little was known about the material when the papers claiming superconductivity emerged. For example, it wasn’t even known whether it should conduct electricity at all.
The report of superconductivity at ambient conditions sparked a crash effort from researchers around the world to understand the material and reproduce the results. While it is still early days, and neither the initial report nor the follow-ups have been peer-reviewed, a picture has started to emerge that the LK-99 compound described by the authors is not a superconductor, and not even a metal.
So if it’s not a superconductor, why did the original researchers think it was? One study has pointed out that an impurity in the initial LK-99 samples, cuprous sulfide, could explain some of what they saw.
Cuprous sulfide undergoes a sudden, large change in resistance at a temperature of around 127°C (400 K). The first researchers saw this drop in resistance and attributed it to superconductivity in LK-99, but it is more likely explained by very low (but not zero) resistance in the cuprous sulfide impurity.
The partial levitation of LK-99, which might have indicated a property of superconductors called “magnetic flux pinning,” seems to be caused by ferromagnetism, a familiar effect that occurs in iron and many other materials.
So while nobody has proven the LK-99 samples studied in the original reports don’t superconduct, the balance of evidence right now is strongly in favor of other explanations. Most scientists studying superconductivity don’t see much reason to continue looking at LK-99.
Excitons and beyond
What’s next for superconductivity research? Well, we can cross LK-99 off the list of materials to study, but the search goes on.
In fact, there has been a lot of progress in the past few years towards creating zero resistance under ordinary conditions.
Making electrons pair together is the key to superconductivity, but this is hard to do because they naturally repel each other. However, it’s possible to make an electron pair up with a “hole” in a material—a gap where an electron should be. Such an electron-hole pair is called an exciton, and like Cooper pairs, excitons can condense into a superfluid.
An alternate route to zero resistance at room temperature has been found in so-called topological insulators. These are materials that only allow electrons to move along their edges or surfaces, in some cases with no resistance.
Graphene, a material made of sheets of carbon only a single atom thick, can be turned into a topological insulator in a strong magnetic field. But the required magnetic field is so extreme it can only be realized in a few laboratories in the world.
There are also other types of topological insulators that work without an externally applied magnetic field. Current versions of these materials show zero resistance only at very low temperatures, but there appears to be no reason they couldn’t work at room temperature.
Unfortunately, superfluid excitons and topological insulators can only carry a limited amount of current, so they are probably not useful for creating powerful magnets. But they could still be useful for transmitting the tiny electrical signals used in computer chips, and my colleagues and I are using them to create low-power electronic and computing technologies.
A collaborative research team led by Interim Head of Physics Professor Shuang Zhang from The University of Hong Kong (HKU), along with the National Center for Nanoscience and Technology, Imperial College London and the University of California, Berkeley, has proposed a new synthetic complex frequency wave (CFW) approach to address optical loss in superimaging. The research findings were recently published in the journal Science.
Imaging plays an important role in many fields, including biology, medicine and materials science. Optical microscopes use light to image minuscule objects. However, conventional microscopes can only resolve feature sizes on the order of the optical wavelength at best, a constraint known as the diffraction limit.
To overcome the diffraction limit, Sir John Pendry from Imperial College London introduced the concept of superlenses, which can be constructed from negative index media or noble metals like silver. Subsequently, Professor Xiang Zhang, the current President and Vice-Chancellor of HKU, along with his then team at the University of California, Berkeley, experimentally demonstrated superimaging using both a silver thin film and a silver/dielectric multilayer stack.
These works have extensively promoted the development and application of superlens technology. Unfortunately, all superlenses suffer from inevitable optical loss, which converts optical energy into heat. This significantly affects the performance of optical devices, such as superimaging lenses, which rely on the faithful delivery of information carried by light waves.
Optical loss has been the main limiting factor that has constrained the development of nanophotonics for the past three decades. Many applications, including sensing, superimaging, and nanophotonic circuits, would greatly benefit if this problem could be solved.
Professor Shuang Zhang, corresponding author of the paper, explained the research focus, “To solve the optical loss problem in some important applications, we have proposed a practical solution—using a novel synthetic complex wave excitation to obtain virtual gain, and then offset the intrinsic loss of the optical system. As a verification, we applied this approach to the superlens imaging mechanism and theoretically improved imaging resolution significantly.”
“We further demonstrated our theory by conducting experiments using hyperlenses made of hyperbolic metamaterials in the microwave frequency range and polariton metamaterials in the optical frequency range. As expected, we obtained excellent imaging results consistent with our theoretical predictions,” added Dr. Fuxin Guan, the paper’s first author and a Postdoctoral Fellow at HKU.
Multi-frequency approach to overcome optical loss
In this study, the researchers introduced a novel multiple-frequency approach to overcome the negative impact of loss on superimaging. Complex frequency waves can be used to provide virtual gain that compensates for the loss in an optical system.
What does complex frequency mean? The frequency of a wave refers to how fast it oscillates in time, as shown in Figure 2a. It is natural to consider frequency a real number. Interestingly, the concept of frequency can be extended into the complex domain, where the imaginary part of the frequency also has a well-defined physical meaning: how fast a wave amplifies or decays in time. Hence, for a complex frequency wave, oscillation and amplification (or decay) occur simultaneously.
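Written out (a standard definition, not specific to this paper), a complex frequency wave has the form

$$ E(t) = e^{-i\tilde{\omega} t}, \qquad \tilde{\omega} = \omega_0 + i\,\omega_i, \qquad |E(t)| = e^{\omega_i t}, $$

so the real part $\omega_0$ sets the oscillation rate while the imaginary part $\omega_i$ sets the exponential growth ($\omega_i > 0$) or decay ($\omega_i < 0$) of the amplitude.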
For a complex frequency with a negative (positive) imaginary part, the wave decays (amplifies) in time, as shown in Figure 2b. Of course, an ideal complex frequency wave is not physical, because it would diverge as time goes to either positive or negative infinity, depending on the sign of its imaginary part. Therefore, any realistic implementation of complex frequency waves needs to be truncated in time to avoid the divergence (see Figure 2c). Optical measurements based directly on complex frequency waves would have to be performed in the time domain and would involve complicated time-gated measurements; they have therefore not been experimentally realized thus far.
The team instead used the Fourier transform to break a truncated CFW down into components at many different real frequencies (see Figure 2d), greatly facilitating the implementation of CFWs for various applications, such as superimaging. By carrying out optical measurements at multiple real frequencies at a fixed interval, it is possible to construct the optical response of the system at a complex frequency by mathematically combining the responses at the real frequencies.
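The following short numerical sketch illustrates this synthesis on a toy model (the parameter values and the Lorentzian response are illustrative assumptions, not taken from the paper): the response of a lossy resonance at a complex frequency is reconstructed from its steady-state responses at a comb of real frequencies, weighted by the Fourier coefficients of the truncated CFW.

```python
import numpy as np

# Toy sketch of the multi-frequency synthesis (illustrative values, not
# from the paper): reconstruct a lossy system's response at a complex
# frequency from its responses at a comb of real frequencies.

T = 100.0              # truncation window of the complex frequency wave
w0, wi = 1.0, -0.1     # complex frequency w~ = w0 + i*wi (wi < 0: decay)
wc = w0 + 1j * wi

# Fourier-series coefficients of exp(-i*wc*t) on [0, T], expanded over
# the real frequencies w_n = w0 + 2*pi*n/T.
n = np.arange(-400, 401)
wn = w0 + 2 * np.pi * n / T
cn = (np.exp(-1j * (wc - wn) * T) - 1) / (-1j * (wc - wn) * T)

# A lossy Lorentzian resonance stands in for the imaging system; evaluating
# it at the complex frequency wc is the "virtual gain" result to mimic.
def H(w, w_res=1.05, gamma=0.2):
    return 1.0 / (w_res - w - 1j * gamma)

# Compare the synthesized field with the exact complex-frequency response
# at a time well inside the window, where truncation transients have decayed.
t = 60.0
synth = np.sum(cn * H(wn) * np.exp(-1j * wn * t))
exact = H(wc) * np.exp(-1j * wc * t)
print(abs(synth - exact) / abs(exact))  # small relative error expected
```

The design choice mirrors the experiment: each term cn * H(wn) corresponds to one ordinary, real-frequency measurement, and only the final weighted sum invokes the complex frequency, so no physically diverging wave ever has to be generated.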
As a proof of concept, the team started with superimaging at microwave frequencies using a hyperbolic metamaterial. The hyperbolic metamaterial can carry waves with very large wavevectors (or, equivalently, very small wavelengths) that are capable of transmitting the information of very small feature sizes. However, the larger the wavevector, the more sensitive the waves are to optical loss.
Therefore, in the presence of loss, the information of those small feature sizes gets lost during the propagation inside the hyperbolic metamaterial. The researchers showed that, by appropriately combining the blurred images measured at different real frequencies, a clear image at a complex frequency was formed with a deep-subwavelength resolution in Figure 3.
The team further extended the principle to optical frequencies, employing an optical superlens made of a phononic crystal called silicon carbide, which operates at the far-infrared wavelength of around 10 micrometers. In a phononic crystal, the lattice vibration can couple with light to create the superimaging effect. However, the loss is still a limiting factor in the spatial resolution.
Although the spatial resolutions of imaging at all the real frequencies were limited by the loss, as shown by the blurred images of the nano-scale holes, ultrahigh-resolution imaging can be obtained with synthesized CFWs that consist of multiple frequency components (see Figure 4).
“The work has provided a solution to overcome optical loss in optical systems, a long-standing problem in nanophotonics. The synthesized complex-frequency method can be readily extended to other applications, including molecular sensing and nanophotonic integrated circuits,” said Professor Xiang Zhang, another corresponding author of the paper, and also chair of physics and engineering.
He hailed this as a remarkable and universally applicable method: “This can be leveraged to tackle loss in other wave systems, including sound waves, elastic waves, and quantum waves, elevating imaging quality to a new height.”
More information: Fuxin Guan et al, Overcoming losses in superlenses with synthetic waves of complex frequency, Science (2023). DOI: 10.1126/science.adi1267
In the middle of the last century, physicists found that protons can resonate, much like a ringing bell. Advances over the last three decades have led to 3D pictures of the proton and significant insight into its structure in its ground state. But little is known about the 3D structure of the resonating proton.
Now, an experiment to explore the 3D structures of resonances of protons and neutrons at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility has added one more puzzle piece to the vast picture of the chaotic, nascent universe that existed just after the Big Bang.
Studying the fundamental properties and behaviors of nucleons offers critical insights into the basic building blocks of matter. Nucleons are the protons and neutrons that make up the nuclei of atoms. Each nucleon consists of three quarks tightly bound together by gluons via the strong interaction—the strongest force in nature.
The most stable, lowest-energy state of a nucleon is called its ground state. But when a nucleon is forcibly excited into a higher-energy state, its quarks rotate and vibrate against each other, exhibiting what’s known as a nucleon resonance.
A group of physicists from Justus Liebig Universität (JLU) Giessen in Germany and the University of Connecticut led the CLAS Collaboration effort to conduct an experiment exploring these nucleon resonances. The experiment was carried out at Jefferson Lab’s world-class Continuous Electron Beam Accelerator Facility (CEBAF). CEBAF is a DOE Office of Science user facility that supports the research of more than 1,800 nuclear physicists worldwide. Results of the research were published in the journal Physical Review Letters.
Analysis leader Stefan Diehl said the team’s work sheds light on the basic properties of nucleon resonances. Diehl is a postdoctoral researcher and project leader at the 2nd Physics Institute at JLU Giessen and a research professor at the University of Connecticut. He said the work is also inspiring fresh investigations of the 3D structure of the resonating proton and the excitation process.
“This is the first time we have some measurement, some observation, which is sensitive to the 3D characteristics of such an excited state,” said Diehl. “In principle, this is just the beginning, and this measurement is opening a new field of research.”
The mystery of how matter formed
The experiment was conducted in Experimental Hall B in 2018-2019 using Jefferson Lab’s CLAS12 detector. A high-energy electron beam was sent into a chamber of cooled hydrogen gas. The electrons impacted the target’s protons to excite the quarks within and produce a nucleon resonance in combination with a quark-antiquark state—a so-called meson.
The excitations are fleeting, but they leave behind evidence of their existence in the form of new particles that are made from the excited particles’ energy as it dissipates. These new particles live long enough for the detector to pick them up, so the team could reconstruct the resonance.
Diehl and others will discuss their results as part of a joint workshop on “Exploring resonance structure with transition GPDs” August 21–25 in Trento, Italy. The research has already inspired two theory groups to publish papers on the work.
The team also plans more experiments at Jefferson Lab using different targets and polarizations. By scattering electrons from polarized protons, they can access different characteristics of the scattering process. In addition, the study of similar processes, such as the production of a resonance in combination with an energetic photon, can provide further important information.
Through such experiments, Diehl said, physicists can tease out the properties of the early cosmos after the Big Bang.
“In the beginning, the early cosmos only had some plasma consisting of quarks and gluons, which were all spinning around because the energy was so high,” said Diehl. “Then, at some point, matter started to form, and the first things that formed were the excited nucleon states. When the universe expanded further, it cooled down and the ground state nucleons manifested.
“With these studies, we can learn about the characteristics of these resonances. And this will tell us things about how matter was formed in the universe and why the universe exists in its present form.”
More information: S. Diehl et al, First Measurement of Hard Exclusive π−Δ++ Electroproduction Beam-Spin Asymmetries off the Proton, Physical Review Letters (2023). DOI: 10.1103/PhysRevLett.131.021901
Since the 17th century, when Isaac Newton and Christiaan Huygens first debated the nature of light, scientists have been puzzling over whether light is best viewed as a wave or a particle—or perhaps, at the quantum level, even both at once. Now, researchers at Stevens Institute of Technology have revealed a new connection between the two perspectives, using a 350-year-old mechanical theorem—ordinarily used to describe the movement of large, physical objects like pendulums and planets—to explain some of the most complex behaviors of light waves.
The work, led by Xiaofeng Qian, assistant professor of physics at Stevens and reported in the August 17 online issue of Physical Review Research, also proves for the first time that a light wave’s degree of non-quantum entanglement exists in a direct and complementary relationship with its degree of polarization. As one rises, the other falls, enabling the level of entanglement to be inferred directly from the level of polarization, and vice versa. This means that hard-to-measure optical properties such as amplitudes, phases and correlations—perhaps even those of quantum wave systems—can be deduced from something a lot easier to measure: light intensity.
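Schematically, complementarity relations of this kind (in the spirit of earlier classical-entanglement work by Qian and collaborators; the exact form in the new paper may differ) can be written as

$$ P^2 + C^2 = 1, $$

where $P$ is the degree of polarization and $C$ quantifies the classical entanglement between the light field’s polarization and its other degrees of freedom, so that measuring one fixes the other.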
“We’ve known for over a century that light sometimes behaves like a wave, and sometimes like a particle, but reconciling those two frameworks has proven extremely difficult,” said Qian. “Our work doesn’t solve that problem—but it does show that there are profound connections between wave and particle concepts, not just at the quantum level but at the level of classical light waves and point-mass systems.”
Qian’s team used a mechanical theorem, originally developed by Huygens in a 1673 book on pendulums, that explains how the energy required to rotate an object varies depending on the object’s mass and the axis around which it turns. “This is a well-established mechanical theorem that explains the workings of physical systems like clocks or prosthetic limbs,” Qian explained. “But we were able to show that it can offer new insights into how light works, too.”
This 350-year-old theorem describes relationships between masses and their rotational momentum, so how could it be applied to light, where there is no mass to measure? Qian’s team interpreted the intensity of a light wave as the equivalent of a physical object’s mass, then mapped those measurements onto a coordinate system that could be interpreted using Huygens’ mechanical theorem. “Essentially, we found a way to translate an optical system so we could visualize it as a mechanical system, then describe it using well-established physical equations,” explained Qian.
Once the team visualized a light wave as part of a mechanical system, new connections between the wave’s properties immediately became apparent—including the fact that entanglement and polarization stood in a clear relationship with one another.
“This was something that hadn’t been shown before, but that becomes very clear once you map light’s properties onto a mechanical system,” said Qian. “What was once abstract becomes concrete: using mechanical equations, you can literally measure the distance between ‘center of mass’ and other mechanical points to show how different properties of light relate to one another.”
Clarifying these relationships could have important practical implications, allowing subtle and hard-to-measure properties of optical systems—or even quantum systems—to be deduced from simpler and more robust measurements of light intensity, Qian explained. More speculatively, the team’s findings suggest the possibility of using mechanical systems to simulate and better understand the strange and complex behaviors of quantum wave systems.
“That still lies ahead of us, but with this first study we’ve shown clearly that by applying mechanical concepts, it’s possible to understand optical systems in an entirely new way,” Qian said. “Ultimately, this research is helping to simplify the way we understand the world, by allowing us to recognize the intrinsic underlying connections between apparently unrelated physical laws.”
More information: Xiao-Feng Qian et al, Bridging coherence optics and classical mechanics: A generic light polarization-entanglement complementary relation, Physical Review Research (2023). DOI: 10.1103/PhysRevResearch.5.033110
Located at CERN’s North Area and receiving beams from the Super Proton Synchrotron (SPS), the NA64 and NA62 experiments search for dark matter, complementing searches at the LHC, as they cover a different energy range. Both experiments have recently published new results. The research is published on the arXiv preprint server.
Dark matter does not seem to interact with our visible world, yet it makes up most of the matter in our universe. Researchers assume that the dark sector interacts with the Standard Model via so-called mediators. These mediators could be—for instance—a dark photon, a dark scalar boson or an axion, which could be distinguished by how they interact with Standard Model particles.
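In the dark photon benchmark, for example, the interaction with ordinary matter is usually written as a kinetic-mixing term (a standard parameterization in the literature, not something derived in these papers):

$$ \mathcal{L} \supset -\frac{\epsilon}{2}\, F_{\mu\nu} F'^{\mu\nu}, $$

where $F_{\mu\nu}$ and $F'_{\mu\nu}$ are the field strengths of the ordinary and dark photon, and the small parameter $\epsilon$ controls how feebly the two sectors couple; experiments such as NA64 set limits in the plane of $\epsilon$ versus the mediator mass.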
The NA62 experiment, which was designed to study the ultra-rare kaon decay into a charged pion and two neutrinos, has now searched for possible contributions from dark-matter particles to another rare kaon decay. The researchers used a beam consisting mainly of pions and kaons, produced by firing the 400 GeV/c SPS proton beam onto a beryllium target.
The rare kaon decay into a pion and a pair of photons, subsequently decaying into two electron-positron pairs, is particularly interesting, as hypothetically, dark bosons would decay into the same final states as Standard Model photons.
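Written as a decay chain, the process searched for is

$$ K^+ \to \pi^+ \gamma^* \gamma^*, \qquad \gamma^* \to e^+ e^-, $$

yielding the $\pi^+ e^+ e^- e^+ e^-$ final state, in which a hypothetical dark boson replacing a photon would appear as a resonance in the $e^+e^-$ invariant-mass spectrum.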
Although the experimentalists found no evidence for such a rare decay, nor for a dark boson, they placed the most stringent upper limits to date by analyzing data recorded in 2017 and 2018. In addition, they excluded the axion as a possible explanation for the 17 MeV/c² ATOMKI anomaly, confirming previous findings by the NA64 experiment.
The NA64 collaboration hunts for invisible light dark-matter particles that interact with Standard Model particles through a possible dark photon. Using electron collision data collected between 2016 and 2022, corresponding to 9.4 × 10¹¹ electrons on target, NA64 has started to probe, for the first time, the very exciting region of parameter space predicted by two benchmark dark-matter models.
Their data set excludes leading sub-GeV dark-matter candidates with the predicted coupling between the dark-matter particle and the dark photon over a range of dark-matter particle masses, ruling out both benchmark models. To obtain these results, the NA64 experiment used a 100 GeV/c electron beam generated from protons interacting with a fixed target. The collaboration utilized an active beam dump and attempted to reconstruct the hypothetical dark photon via both visible electron-positron pairs and missing energy for invisible decays.
More information: NA62 collaboration, Search for K⁺ decays into the π⁺e⁺e⁻e⁺e⁻ final state, arXiv (2023). DOI: 10.48550/arxiv.2307.04579