New hybrid structures could pave the way to more stable quantum computers

RHEED patterns during MBE growth. (a) Bilayer graphene terminated 6H-SiC(0001) substrate. (b) Monolayer NbSe2 film grown on bilayer graphene. (c) 5 QL Bi2Se3/monolayer NbSe2 heterostructure grown on bilayer graphene. Credit: Nature Materials (2022). DOI: 10.1038/s41563-022-01386-z

A new way to combine two materials with special electrical properties—a monolayer superconductor and a topological insulator—provides the best platform to date to explore an unusual form of superconductivity called topological superconductivity. The combination could provide the basis for topological quantum computers that are more stable than their traditional counterparts.

Superconductors—used in powerful magnets, digital circuits, and imaging devices—allow electric current to pass without resistance, while topological insulators are thin films only a few atoms thick that restrict the movement of electrons to their edges, which can result in unique properties. A team led by researchers at Penn State describes how it paired the two materials in a paper appearing Oct. 27 in the journal Nature Materials.

“The future of quantum computing depends on a kind of material that we call a topological superconductor, which can be formed by combining a topological insulator with a superconductor, but the actual process of combining these two materials is challenging,” said Cui-Zu Chang, Henry W. Knerr Early Career Professor and Associate Professor of Physics at Penn State and leader of the research team.

“In this study, we used a technique called molecular beam epitaxy to synthesize both topological insulator and superconductor films and create a two-dimensional heterostructure that is an excellent platform to explore the phenomenon of topological superconductivity.”

In previous experiments to combine the two materials, the superconductivity in thin films usually disappeared once a topological insulator layer was grown on top. Physicists have been able to add a topological insulator film onto a three-dimensional “bulk” superconductor and retain the properties of both materials.

However, applications for topological superconductors, such as chips with low power consumption inside quantum computers or smartphones, would need to be two-dimensional.

In this paper, the research team stacked a topological insulator film made of bismuth selenide (Bi2Se3) with different thicknesses on a superconductor film made of monolayer niobium diselenide (NbSe2), resulting in a two-dimensional end product. By synthesizing the heterostructures at very low temperature, the team was able to retain both the topological and superconducting properties.

“In superconductors, electrons form ‘Cooper pairs’ and can flow with zero resistance, but a strong magnetic field can break those pairs,” said Hemian Yi, a postdoctoral scholar in the Chang Research Group at Penn State and the first author of the paper.

“The monolayer superconductor film we used is known for its ‘Ising-type superconductivity,’ which means that the Cooper pairs are very robust against the in-plane magnetic fields. We would also expect the topological superconducting phase formed in our heterostructures to be robust in this way.”

By subtly adjusting the thickness of the topological insulator, the researchers found that the heterostructure shifted from Ising-type superconductivity—where the electron spin is perpendicular to the film—to another kind of superconductivity called “Rashba-type superconductivity”—where the electron spin is parallel to the film.

This phenomenon is also borne out by the researchers’ theoretical calculations and simulations.

This heterostructure could also be a good platform for the exploration of Majorana fermions, elusive particles that would be a major contributor to making a topological quantum computer more stable than its predecessors.

“This is an excellent platform for the exploration of topological superconductors, and we are hopeful that we will find evidence of topological superconductivity in our continuing work,” said Chang. “Once we have solid evidence of topological superconductivity and demonstrate Majorana physics, then this type of system could be adapted for quantum computing and other applications.”

A faster way to find and study topological materials

Data structure and model architecture. (a) A schematic of the full XANES spectrum for a representative sample in the dataset, showing the signatures from different absorbing elements on an absolute energy scale. For a given material, the inputs to the NN classifier consist of one-hot encoded atom types (left) and XANES spectra (right) for all absorbing atoms. (b) Schematic of the neural network architecture predicting the (binary) topological class using spectral and atom-type inputs. Spectral and atom-type inputs are individually embedded by fully-connected layers before performing a direct product between corresponding spectral and atomic channels. These composite features are aggregated for a given material and passed to a final fully-connected block to predict the topological class. Credit: Advanced Materials (2022). DOI: 10.1002/adma.202204113
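
The architecture in the caption is straightforward to sketch. A minimal PyTorch rendering, with all dimensions and layer sizes chosen arbitrarily for illustration rather than taken from the paper:

```python
import torch
import torch.nn as nn

class TopologyClassifier(nn.Module):
    """Sketch of the captioned model: embed spectra and atom types separately,
    combine them by an elementwise (direct) product, pool over absorbing atoms,
    and classify with a final fully-connected block. Sizes are hypothetical."""
    def __init__(self, n_elements=100, spec_len=200, d_embed=64):
        super().__init__()
        self.spec_embed = nn.Sequential(nn.Linear(spec_len, d_embed), nn.ReLU())
        self.type_embed = nn.Sequential(nn.Linear(n_elements, d_embed), nn.ReLU())
        self.head = nn.Sequential(nn.Linear(d_embed, 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, spectra, types):
        # spectra: (batch, n_absorbers, spec_len); types: (batch, n_absorbers, n_elements)
        z = self.spec_embed(spectra) * self.type_embed(types)  # direct product of channels
        z = z.mean(dim=1)                    # aggregate features over absorbing atoms
        return torch.sigmoid(self.head(z)).squeeze(-1)  # probability of "topological"
```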

Topological materials, an exotic class of materials whose surfaces exhibit different electrical or functional properties than their interiors, have been a hot area of research since their experimental realization in 2007—a finding that sparked further research and precipitated a Nobel Prize in Physics in 2016. These materials are thought to have great potential in a variety of fields, and might someday be used in ultraefficient electronic or optical devices, or key components of quantum computers.

But there are many thousands of compounds that may theoretically have topological characteristics, and synthesizing and testing even one such material to determine its topological properties can take months of experiments and analysis. Now a team of researchers at MIT and elsewhere has come up with a new approach that can rapidly screen candidate materials and determine with more than 90 percent accuracy whether they are topological.

Using this new method, the researchers have produced a list of candidate materials. A few of these were already known to have topological properties, but the rest are newly predicted by this approach.

The findings are reported in the journal Advanced Materials in a paper by Mingda Li, the Class of 1947 Career Development Professor at MIT, graduate students (and twin sisters) Nina Andrejevic at MIT and Jovana Andrejevic at Harvard University, and seven others at MIT, Harvard, Princeton University, and Argonne National Laboratory.

Topological materials are named after a branch of mathematics that describes shapes based on their invariant characteristics, which persist no matter how much an object is continuously stretched or squeezed out of its original shape. Topological materials, similarly, have properties that remain constant despite changes in their conditions, such as external perturbations or impurities.

There are several varieties of topological materials, including semiconductors, conductors, and semimetals, among others. Initially, it was thought that there were only a handful of such materials, but recent theory and calculations have predicted that in fact thousands of different compounds may have at least some topological characteristics. The hard part is figuring out experimentally which compounds may be topological.

Applications for such materials span a wide range, including devices that could perform computational and data storage functions similarly to silicon-based devices but with far less energy loss, or devices to harvest electricity efficiently from waste heat, for example in thermal power plants or in electronic devices. Topological materials can also have superconducting properties, which could potentially be used to build the quantum bits for topological quantum computers.

But all of this relies on developing or discovering the right materials. “To study a topological material, you first have to confirm whether the material is topological or not,” Li says, “and that part is a hard problem to solve in the traditional way.”

A method called density functional theory is used to perform initial calculations, which then need to be followed by complex experiments that require cleaving a piece of the material to atomic-level flatness and probing it with instruments under high-vacuum conditions.

“Most materials cannot even be measured due to various technical difficulties,” Nina Andrejevic says. But for those that can, the process can take a long time. “It’s a really painstaking procedure,” she says.

Sensitivity to spectral energy resolution. The overall recall, precision, and F1 scores for (a) topological and (b) trivial examples as a function of the energy interval ΔE between sampled points of the XANES spectra. Scores are presented for both the SVM and NN models, with scores from the atom-type-only models (SVM-type and NN-type) shown as a reference by the dotted lines. Spectra were resampled at lower resolutions by computing their average values over length-ΔE intervals along the energy axis for varied ΔE. To maintain the same number of neurons across all resolutions, the averaged values were copied by the number of original samples within each interval such that all spectral inputs have length 200. Credit: Advanced Materials (2022). DOI: 10.1002/adma.202204113
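
The resampling scheme in the caption amounts to block-averaging followed by repetition to preserve the input length. A sketch, with the interval expressed in samples rather than energy units:

```python
import numpy as np

def resample_spectrum(spectrum, bin_len):
    """Average the spectrum over consecutive windows of bin_len samples, then
    hold each average across its window so the output keeps the original length."""
    out = np.empty(len(spectrum))
    for start in range(0, len(spectrum), bin_len):
        out[start:start + bin_len] = spectrum[start:start + bin_len].mean()
    return out

coarse = resample_spectrum(np.random.rand(200), bin_len=5)  # 200-point spectra, as above
```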


Whereas the traditional approach relies on measuring the material’s photoemissions or tunneling electrons, Li explains, the new technique he and his team developed relies on absorption, specifically, the way the material absorbs X-rays.

Unlike the expensive apparatus needed for the conventional tests, X-ray absorption spectrometers are readily available and can operate at room temperature and atmospheric pressure, with no vacuum needed. Such measurements are widely conducted in biology, chemistry, battery research, and many other applications, but they had not previously been applied to identifying topological quantum materials.

X-ray absorption spectroscopy provides characteristic spectral data from a given sample of material. The next challenge is to interpret that data and determine how it relates to the topological properties. For that, the team turned to a machine-learning model, feeding in a collection of X-ray absorption spectra of known topological and nontopological materials and training the model to find the patterns that relate the two. And it did indeed find such correlations.

“Surprisingly, this approach was over 90 percent accurate when tested on more than 1500 known materials,” Nina Andrejevic says, adding that the predictions take only seconds. “This is an exciting result given the complexity of the conventional process.”

Though the model works, as with many results from machine learning, researchers don’t yet know exactly why it works or what the underlying mechanism is that links the X-ray absorption to the topological properties.

“While the learned function relating X-ray spectra to topology is complex, the result may suggest that certain attributes the measurement is sensitive to, such as local atomic structures, are key topological indicators,” Jovana Andrejevic says.

The team has used the model to construct a periodic table that displays the model’s overall accuracy on compounds made from each of the elements. It serves as a tool to help researchers home in on families of compounds that may offer the right characteristics for a given application.

The researchers have also produced a preliminary study of compounds that they have used this X-ray method on, without advance knowledge of their topological status, and compiled a list of 100 promising candidate materials—a few of which were already known to be topological.

“This work represents one of the first uses of machine learning to understand what experiments are trying to tell us about complex materials,” says Joel Moore, the Chern-Simons Professor of Physics at the University of California at Berkeley, who was not associated with this research.

“Many kinds of topological materials are well-understood theoretically in principle, but finding material candidates and verifying that they have the right topology of their bands can be a challenge. Machine learning seems to offer a new way to address this challenge: Even experimental data whose meaning is not immediately obvious to a human can be analyzed by the algorithm, and I am excited to see what new materials will result from this way of looking.”

Anatoly Frenkel, a professor in the Department of Materials Science and Chemical Engineering at Stony Brook University and a senior chemist at Brookhaven National Laboratory, further commented that “it was a really nice idea to consider that the X-ray absorption spectrum may hold a key to the topological character in the measured sample.”

How do you solve a problem like a proton? Smash it, then build it back with machine learning

Looking into the HERA tunnel: Berkeley Lab scientists have developed new machine learning algorithms to accelerate the analysis of data collected decades ago by HERA, the world’s most powerful electron-proton collider that ran at the DESY national research center in Germany from 1992 to 2007. Credit: DESY

Protons are tiny yet they carry a lot of heft. They inhabit the center of every atom in the universe and play a critical role in one of the strongest forces in nature.

And yet, protons have a down-to-earth side, too.

Like most particles, protons have spin and act like tiny magnets. Flipping a proton’s spin or polarity may sound like science fiction, but it is the basis of technological breakthroughs that have become essential to our daily lives, such as magnetic resonance imaging (MRI), the invaluable medical diagnostics tool.

Despite such advancements, the proton’s inner workings remain a mystery.

“Basically everything around you exists because of protons—and yet we still don’t understand everything about them. One huge puzzle that physicists want to solve is the proton’s spin,” said Ben Nachman, a physicist who leads the Machine Learning Group in the Physics Division at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab).

Understanding how and why protons spin could lead to technological advancements we can’t even imagine today, and help us understand the strong force, the fundamental interaction that gives protons, and therefore atoms, their mass.

But it’s not such an easy problem to solve. For one, you can’t exactly pick up a proton and place it in a petri dish: Protons are unfathomably small—their radius is a hair shy of one quadrillionth of a meter, and visible light passes right through them. What’s more, you can’t even observe their insides with the world’s most powerful electron microscopes.

Recent work by Nachman and his team could bring us closer to solving this perplexing proton puzzle.

As a member of the H1 Collaboration—an international group that now includes 150 scientists from 50 institutes and 15 countries, and is based at the DESY national research center in Germany—Nachman has been developing new machine learning algorithms to accelerate the analysis of data collected decades ago by HERA, the world’s most powerful electron-proton collider that ran at DESY from 1992 to 2007.

HERA—a ring 4 miles in circumference—worked like a giant microscope that accelerated both electrons and protons to nearly the speed of light. The particles were collided head-on, which could scatter a proton into its constituent parts: quarks and gluons.

Scientists at HERA took measurements of the particle debris cascading from these electron-proton collisions, what physicists call “deep inelastic scattering,” through sophisticated cameras called particle detectors, one of which was the H1 detector.

Unfolding secrets of the strong force

The H1 detector stopped collecting data in 2007, the year HERA was decommissioned. Today, the H1 Collaboration is still analyzing the data and publishing results in scientific journals.

The HERA electron-proton collider accelerated both electrons and protons to nearly the speed of light. The particles were collided head-on, which could scatter a proton into its constituent parts: quarks (shown as green and purple balls in the illustration above) and gluons (illustrated as black coils). Credit: DESY


Using conventional computational techniques, it can take a year or more to measure quantities related to proton structure and the strong force, such as how many particles are produced when a proton collides with an electron.

And if a researcher wants to examine a different quantity, such as how fast particles are flying in the wake of a quark-gluon jet stream, they would have to start the long computational process all over again, and wait yet another year.

A new machine learning tool called OmniFold—which Nachman co-developed—can simultaneously measure many quantities at once, thereby reducing the amount of time to run an analysis from years to minutes.

OmniFold does this by using neural networks to combine computer simulations with data. (A neural network is a machine learning tool that can process complex data in ways that would be impossible for scientists to do manually.)
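
The reweighting idea at the heart of such unfolding methods can be sketched compactly. This is a schematic stand-in, with an off-the-shelf classifier and made-up inputs, not the collaboration’s actual pipeline:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def reweight(source, target, w_source):
    """Train a classifier to separate source from target events, then turn its
    output p into likelihood-ratio weights p/(1-p) that morph source into target."""
    X = np.vstack([source, target])
    y = np.concatenate([np.zeros(len(source)), np.ones(len(target))])
    w = np.concatenate([w_source, np.ones(len(target))])
    clf = GradientBoostingClassifier().fit(X, y, sample_weight=w)
    p = clf.predict_proba(source)[:, 1]
    return w_source * p / (1.0 - p)

# OmniFold-style iteration (schematic): reweight simulated detector-level events
# to match data, carry the weights over to the paired particle-level events,
# and repeat until the weights stabilize; all observables come along for free.
```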

Nachman and his team applied OmniFold to H1 experimental data for the first time in a June issue of the journal Physical Review Letters and more recently at the 2022 Deep Inelastic Scattering (DIS) Conference.

To develop OmniFold and test its robustness against H1 data, Nachman and Vinicius Mikuni, a postdoctoral researcher in the Data and Analytics Services (DAS) group at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) and a NERSC Exascale Science Applications Program for Learning fellow, needed a supercomputer with a lot of powerful GPUs (graphics processing units), Nachman said.

Coincidentally, Perlmutter, a new supercomputer designed to support simulation, data analytics, and artificial intelligence experiments requiring multiple GPUs at a time, had just opened up in the summer of 2021 for an “early science phase,” allowing scientists to test the system on real data. (The Perlmutter supercomputer is named for the Berkeley Lab cosmologist and Nobel laureate Saul Perlmutter.)

“Because the Perlmutter supercomputer allowed us to use 128 GPUs simultaneously, we were able to run all the steps of the analysis, from data processing to the derivation of the results, in less than a week instead of months. This improvement allows us to quickly optimize the neural networks we trained and to achieve a more precise result for the observables we measured,” said Mikuni, who is also a member of the H1 Collaboration.

A central task in these measurements is accounting for detector distortions. The H1 detector, like a watchful guard standing sentry at the entrance of a sold-out concert arena, monitors particles as they fly through it. One source of measurement errors happens when particles fly around the detector rather than through it, for example—sort of like a ticketless concert goer jumping over an unmonitored fence rather than entering through the ticketed security gate.

Correcting for all distortions simultaneously had not been possible due to limited computational methods available at the time. “Our understanding of subatomic physics and data analysis techniques have advanced significantly since 2007, and so today, scientists can use new insights to analyze the H1 data,” Nachman said.

Scientists today have a renewed interest in HERA’s particle experiments, as they hope to use the data—and more precise computer simulations informed by tools like OmniFold—to aid in the analysis of results from future electron-proton experiments, such as at the Department of Energy’s next-generation Electron-Ion Collider (EIC).

The EIC—to be built at Brookhaven National Laboratory in partnership with the Thomas Jefferson National Accelerator Facility—will be a powerful and versatile new machine capable of colliding high-energy beams of polarized electrons with a wide range of ions (or charged atoms) across many energies, including polarized protons and some polarized ions.

“It’s exciting to think that our method could one day help scientists answer questions that still remain about the strong force,” Nachman said.

“Even though this work might not lead to practical applications in the near term, understanding the building blocks of nature is why we’re here—to seek the ultimate truth. These are steps to understanding at the most basic level what everything is made of. That is what drives me. If we don’t do the research now, we will never know what exciting new technological advances we’ll get to benefit future societies.”

With scanning ultrafast electron microscopy, researchers unveil hot photocarrier transport properties of cubic boron arsenide

In a study that confirms its promise as the next-generation semiconductor material, UC Santa Barbara researchers have directly visualized the photocarrier transport properties of cubic boron arsenide single crystals.

“We were able to visualize how the charge moves in our sample,” said Bolin Liao, an assistant professor of mechanical engineering in the College of Engineering. Using the only scanning ultrafast electron microscopy (SUEM) setup in operation at a U.S. university, he and his team were able to make “movies” of the generation and transport processes of a photoexcited charge in this relatively little-studied III-V semiconductor material, which has recently been recognized as having extraordinary electrical and thermal properties. In the process, they found another beneficial property that adds to the material’s potential as the next great semiconductor.

Their research, conducted in collaboration with physics professor Zhifeng Ren’s group at the University of Houston, which specializes in fabricating high-quality single crystals of cubic boron arsenide, appears in the journal Matter.

‘Ringing the bell’

Boron arsenide is being eyed as a potential candidate to replace silicon, the computer world’s staple semiconductor material, due to its promising performance. For one thing, with an improved charge mobility over silicon, it easily conducts current (electrons and their positively charged counterpart, “holes”). However, unlike silicon, it also conducts heat with ease.

“This material actually has 10 times higher thermal conductivity than silicon,” Liao said. This heat conducting—and releasing—ability is particularly important as electronic components become smaller and more densely packed, and pooled heat threatens the devices’ performance, he explained.

“As your cellphones become more powerful, you want to be able to dissipate the heat, otherwise you have efficiency and safety issues,” he said. “Thermal management has been a challenge for a lot of microelectronic devices.”

What gives rise to the high thermal conductivity of this material, it turns out, can also lead to interesting transport properties of photocarriers, which are the charges excited by light, for example, in a solar cell. If experimentally verified, this would indicate that cubic boron arsenide can also be a promising material for photovoltaic and light detection applications. Direct measurement of photocarrier transport in cubic boron arsenide, however, has been challenging due to the small size of available high-quality samples.

The research team’s study combines two feats: the crystal growth skills of the University of Houston team and the imaging prowess at UC Santa Barbara. Combining the abilities of the scanning electron microscope and femtosecond ultrafast lasers, the UCSB team built what is essentially an extremely fast, exceptionally high-resolution camera.

“Electron microscopes have very good spatial resolution—they can resolve single atoms with their sub-nanometer spatial resolution—but they’re typically very slow,” Liao said, noting this makes them excellent for capturing static images.

“With our technique, we couple this very high spatial resolution with an ultrafast laser, which acts as a very fast shutter, for extremely high time resolution,” Liao continued. “We’re talking about one picosecond—a millionth of a millionth of a second. So we can make movies of these microscopic energy and charge transport processes.” Originally invented at Caltech, the method was built from scratch and further improved at UCSB, which now hosts the only operational SUEM setup at an American university.

“What happens is that we have one pulse of this laser that excites the sample,” explained graduate student researcher Usama Choudhry, the lead author of the Matter paper. “You can think of it like ringing a bell; it’s a loud noise that slowly diminishes over time.” As they “ring the bell,” he explained, a second laser pulse is focused onto a photocathode (“electron gun”) to generate a short electron pulse to image the sample. They then scan the electron pulse over time to gain a full picture of the ringing. “Just by taking a lot of these scans, you can get a movie of how the electrons and holes get excited and eventually go back to normal,” he said.
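
In essence, each scan steps a pump-probe delay and records a frame. A toy numerical stand-in (a single-exponential ring-down with an assumed time constant, far simpler than the real dynamics):

```python
import numpy as np

tau_ps = 200.0                                               # assumed decay constant, ps
delays = np.array([0, 1, 5, 20, 50, 100, 200, 500, 1000.0])  # pump-probe delays, ps
contrast = np.exp(-delays / tau_ps)                          # toy SUEM contrast per frame
for d, c in zip(delays, contrast):
    print(f"{d:7.1f} ps -> {c:.3f}")                         # the "movie" fading over time
```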

Among the things they observed while exciting their sample and watching the electrons return to their original state is how long the “hot” electrons persist.

“We found, surprisingly, that the ‘hot’ electrons excited by light in this material can persist for much longer times than in conventional semiconductors,” Liao said. These “hot” carriers were seen to persist for more than 200 picoseconds, a property related to the same feature that is responsible for the material’s high thermal conductivity. This ability to host “hot” electrons for significantly longer amounts of time has important implications.

“For example, when you excite the electrons in a typical solar cell with light, not every electron has the same amount of energy,” Choudhry explained. “The high-energy electrons have a very short lifetime, and the low-energy electrons have a very long lifetime.” When it comes to harvesting the energy from a typical solar cell, he continued, only the low-energy electrons are collected efficiently; the high-energy ones tend to lose their energy rapidly as heat. Because of the persistence of the high-energy carriers, if this material were used as a solar cell, more energy could be harvested from it efficiently.

With boron arsenide beating silicon in three relevant areas—charge mobility, thermal conductivity and hot photocarrier transport time—it has the potential to become the electronics world’s next state-of-the-art material. However, it still faces significant hurdles—fabrication of high-quality crystals in large quantities—before it can compete with silicon, enormous amounts of which can be manufactured relatively cheaply and with high quality. But Liao doesn’t see too much of a problem.

“Silicon is now routinely available because of years of investment; people started developing silicon around the 1930s and ’40s,” he said. “I think once people recognize the potential of this material, there will be more effort put into finding ways to grow and use it. UCSB is actually uniquely positioned for this challenge with strong expertise in semiconductor development.”

New data transmission record set using a single laser and a single optical chip


An international group of researchers from the Technical University of Denmark (DTU) and Chalmers University of Technology in Gothenburg, Sweden, has achieved dizzying data transmission speeds, becoming the first in the world to transmit more than 1 petabit per second (Pbit/s) using only a single laser and a single optical chip.

1 petabit corresponds to 1 million gigabits.

In the experiment, the researchers succeeded in transmitting 1.8 Pbit/s, which corresponds to twice the total global Internet traffic, carried by the light from a single optical source. The light source is a custom-designed optical chip, which can use the light from a single infrared laser to create a rainbow spectrum of many colors, i.e., many frequencies. Thus, the one frequency (color) of a single laser can be multiplied into hundreds of frequencies (colors) in a single chip.

All the colors are fixed at a specific frequency distance from each other—just like the teeth on a comb—which is why it is called a frequency comb. Each color (or frequency) can then be isolated and used to imprint data. The frequencies can then be reassembled and sent over an optical fiber, thus transmitting data. Even a huge volume of data, as the researchers have discovered.

One single laser can replace thousands

The experimental demonstration showed that a single chip could easily carry 1.8 Pbit/s, which—with contemporary state-of-the-art commercial equipment—would otherwise require more than 1,000 lasers.
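
The aggregate rate is, at bottom, a product of channel counts. With purely illustrative numbers (not the paper’s actual channel plan):

```python
lines = 200              # comb colors used as data carriers (assumed)
spatial_paths = 37       # parallel spatial channels, e.g. fiber cores (assumed)
gbit_per_channel = 250   # net rate per color per path in Gbit/s (assumed)

total_gbit = lines * spatial_paths * gbit_per_channel
print(total_gbit / 1e6, "Pbit/s")  # 1.85 Pbit/s under these assumptions
```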

Victor Torres Company, professor at Chalmers University of Technology, is head of the research group that has developed and manufactured the chip.

“What is special about this chip is that it produces a frequency comb with ideal characteristics for fiber-optical communications—it has high optical power and covers a broad bandwidth within the spectral region that is interesting for advanced optical communications,” says Victor Torres Company.

Interestingly enough, the chip was not optimized for this particular application.

“In fact, some of the characteristic parameters were achieved by coincidence and not by design,” says Victor Torres Company. “However, with efforts in my team, we are now able to reverse engineer the process and achieve microcombs with high reproducibility for target applications in telecommunications.”

Enormous potential for scaling

In addition, the researchers created a computational model to examine theoretically the fundamental potential for data transmission with a single chip identical to the one used in the experiment. The calculations showed enormous potential for scaling up the solution.

Professor Leif Katsuo Oxenløwe, Head of the Center of Excellence for Silicon Photonics for Optical Communications (SPOC) at DTU, says:

“Our calculations show that—with the single chip made by Chalmers University of Technology, and a single laser—we will be able to transmit up to 100 Pbit/s. The reason for this is that our solution is scalable—both in terms of creating many frequencies and in terms of splitting the frequency comb into many spatial copies and then optically amplifying them, and using them as parallel sources with which we can transmit data. Although the comb copies must be amplified, we do not lose the qualities of the comb, which we utilize for spectrally efficient data transmission.”

This is how you pack light with data

Packing light with data is known as modulation. Here, the wave properties of light are utilized, such as:

  • Amplitude (the height/strength of the waves)
  • Phase (the “rhythm” of the waves, where it is possible to make a shift so that a wave arrives either a little earlier or a little later than expected)
  • Polarization (the orientation of the light wave’s oscillation).

By changing these properties, you create signals. The signals can be translated into either ones or zeros—and thus utilized as data signals.
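
A minimal sketch of how bits become such signals, using a 16-QAM constellation (four bits per symbol, encoded in amplitude and phase; the specific format is our illustration, not the paper’s):

```python
import numpy as np

def bits_to_16qam(bits):
    """Map each group of 4 bits to a complex symbol whose magnitude and angle
    set the amplitude and phase of the light on one comb line."""
    levels = np.array([-3, -1, 1, 3])
    b = np.asarray(bits).reshape(-1, 4)
    i = levels[2 * b[:, 0] + b[:, 1]]   # in-phase component
    q = levels[2 * b[:, 2] + b[:, 3]]   # quadrature component
    return (i + 1j * q) / np.sqrt(10)   # normalize average symbol power to 1

symbols = bits_to_16qam(np.random.randint(0, 2, 4000))
print(np.abs(symbols[:3]), np.angle(symbols[:3]))  # amplitude and phase per symbol
```

Using both polarizations of the light would double the bit count again, which is one reason polarization appears in the list above.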

Reduces Internet power consumption

The researchers’ solution bodes well for the future power consumption of the Internet.

“In other words, our solution provides a potential for replacing hundreds of thousands of the lasers located at Internet hubs and data centers, all of which guzzle power and generate heat. We have an opportunity to contribute to achieving an Internet that leaves a smaller climate footprint,” says Leif Katsuo Oxenløwe.

Even though the researchers have broken the petabit barrier for a single laser source and a single chip in their demonstration, there is still some development work ahead before the solution can be implemented in our current communication systems, according to Leif Katsuo Oxenløwe.

“All over the world, work is being done to integrate the laser source in the optical chip, and we’re working on that as well. The more components we can integrate in the chip, the more efficient the whole transmitter will be, i.e., laser, comb-creating chip, data modulators, and any amplifier elements. It will be an extremely efficient optical transmitter of data signals,” says Leif Katsuo Oxenløwe.

The research is published in Nature Photonics.

Navigating when GPS goes dark

Cross-sectional renderings of the LPAI sensor head. (a) Horizontal cross-section showing the cooling-beam and atom-detection channels with fixed optical components. The cooling-channel light is delivered to the sensor head via a polarization-maintaining (PM) fiber, from which a large collimated Gaussian beam (1/e² diameter ≈ 28 mm) is used for cooling. The beam is truncated to ≈ 19 mm diameter through the fused silica viewport in the compact LPAI sensor head. The light then passes through a polarizer and a λ/4 waveplate before illuminating the grating chip. The GMOT atoms (solid red circle) form ≈ 3.5 mm from the grating surface. The atom-detection channel was designed to measure atomic fluorescence through a multimode-fiber-coupled avalanche photodiode (APD) module. (b) Vertical cross-section of the sensor head showing the designed beam paths for Doppler-sensitive Raman transitions. Cross-linearly-polarized Raman beams are launched from the same PM fiber and the two components are split by a polarizing beam splitter (PBS). Fixed optics route the Raman beams to the GMOT atoms (solid red circle) in opposite directions. Credit: Nature Communications (2022). DOI: 10.1038/s41467-022-31410-4

Words like “tough” or “rugged” are rarely associated with a quantum inertial sensor. The remarkable scientific instrument can measure motion a thousand times more accurately than the devices that help navigate today’s missiles, aircraft and drones. But its delicate, table-sized array of components that includes a complex laser and vacuum system has largely kept the technology grounded and confined to the controlled settings of a lab.

Jongmin Lee wants to change that.

The atomic physicist is part of a team at Sandia that envisions quantum inertial sensors as revolutionary, onboard navigational aids. If the team can reengineer the sensor into a compact, rugged device, the technology could safely guide vehicles where GPS signals are jammed or lost.

In a major milestone toward realizing their vision, the team has successfully built a cold-atom interferometer, a core component of quantum sensors, designed to be much smaller and tougher than typical lab setups. The team describes their prototype in the academic journal Nature Communications, showing how to integrate several normally separated components into a single monolithic structure. In doing so, they reduced the key components of a system that existed on a large optical table down to a sturdy package roughly the size of a shoebox.
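
For orientation, the textbook phase response of such a light-pulse interferometer (standard atom-interferometry physics, not a result specific to this paper) ties the measured phase shift to acceleration via

Δφ = k_eff · a · T²,

where k_eff is the effective wave vector of the two Raman beams and T is the time between the interferometer pulses; the quadratic scaling in T is part of why even a shoebox-sized instrument can be exquisitely sensitive to acceleration.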

“Very high sensitivity has been demonstrated in the lab, but the practical matters are, for real-world application, that people need to shrink down the size, weight and power, and then overcome various issues in a dynamic environment,” Jongmin said.

The paper also describes a roadmap for further miniaturizing the system using technologies under development.

The prototype, funded by Sandia’s Laboratory Directed Research and Development program, demonstrates significant strides toward moving advanced navigation tech out of the lab and into vehicles on the ground, underground, in the air and even in space.

Concept of the compact light-pulse atom interferometer (LPAI) for high-dynamic conditions. (a) 3D rendering of the compact LPAI sensor head with fixed optical components and reliable optomechanical design. (b) Picture of the steady-state GMOT atoms in the sensor head. Credit: Nature Communications (2022). DOI: 10.1038/s41467-022-31410-4

Ultrasensitive measurements drive navigational power

As a jet does a barrel roll through the sky, current onboard navigation tech can measure the aircraft’s tilts and turns and accelerations to calculate its position without GPS, for a time. Small measurement errors gradually push a vehicle off course unless it periodically syncs with the satellites, Jongmin said.

Quantum sensing would operate in the same way, but the much better accuracy would mean onboard navigation wouldn’t need to cross-check its calculations as often, reducing reliance on satellite systems.
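
The payoff is easiest to see in the error budget: a constant, uncorrected accelerometer bias b grows into a position error of b·t²/2 after time t. With illustrative numbers (not Sandia’s specifications):

```python
# Position error from an uncorrected acceleration bias b after time t: 0.5 * b * t^2
for bias, label in [(1e-3, "conventional (assumed)"), (1e-6, "1000x better (assumed)")]:
    for t in (60.0, 600.0, 3600.0):        # seconds of GPS-denied navigation
        print(f"{label:22s} t = {t:6.0f} s -> error = {0.5 * bias * t**2:10.1f} m")
```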

Roger Ding, a postdoctoral researcher who worked on the project, said, “In principle, there are no manufacturing variations and calibrations,” compared to conventional sensors that can change over time and need to be recalibrated.

Aaron Ison, the lead engineer on the project, said to prepare the atom interferometer for a dynamic environment, he and his team used materials proven in extreme environments. Additionally, parts that are normally separate and freestanding were integrated together and fixed in place or were built with manual lockout mechanisms.

“A monolithic structure having as few bolted interfaces as possible was key to creating a more rugged atom interferometer structure,” Aaron said.

Furthermore, the team used industry-standard calculations called finite element analysis to predict that any deformation of the system in conventional environments would fall within required allowances. Sandia has not conducted mechanical stress tests or field tests on the new design, so further research is needed to measure the device’s strength.

“The overall small, compact design naturally leads towards a stiffer more robust structure,” Aaron said.

Sandia atomic physicist Jongmin Lee examines the sensor head of a cold-atom interferometer that could help vehicles stay on course where GPS is unavailable. Credit: Bret Latter

Photonics light the way to a more miniaturized system

Most modern atom interferometry experiments use a system of lasers mounted to a large optical table for stability reasons, Roger said. Sandia’s device is comparatively compact, but the team has already come up with further design improvements to make the quantum sensors much smaller using integrated photonic technologies.

“There are tens to hundreds of elements that can be placed on a chip smaller than a penny,” said Peter Schwindt, the principal investigator on the project and an expert in quantum sensing.

Photonic devices, such as a laser or optical fiber, use light to perform useful work and integrated devices include many different elements. Photonics are used widely in telecommunications, and ongoing research is making them smaller and more versatile.

With further improvements, Peter thinks the space an interferometer needs could be as little as a few liters. His dream is to make one the size of a soda can.

In their paper, the Sandia team outlines a future design in which most of their laser setup is replaced by a single photonic integrated circuit, about eight millimeters on each side. Integrating the optical components into a circuit would not only make an atom interferometer smaller, it would also make it more rugged by fixing the components in place.

While the team can’t do this yet, many of the photonic technologies they need are currently in development at Sandia.

“This is a viable path to highly miniaturized systems,” Roger said.

Meanwhile, Jongmin said integrated photonic circuits would likely lower costs and improve scalability for future manufacturing.

“Sandia has shown an ambitious vision for the future of quantum sensing in navigation,” Jongmin said.

Exploring the hidden charm of quark-gluon plasma

Illustration of the effect of quark–gluon plasma on the formation of charmonia in lead-nuclei collisions. When the plasma temperature increases, the more weakly bound ψ(2S) state is more likely to be “screened”, and thus not form, due to the larger number of quarks and gluons in the plasma (the colored circles). The increase in the number of charm quarks and antiquarks (c and c̄) can lead to the formation of additional charmonia by quark recombination. Credit: ALICE collaboration

Quark–gluon plasma is an extremely hot and dense state of matter in which the elementary constituents—quarks and gluons—are not confined inside composite particles called hadrons, as they are in the protons and neutrons that make up the nuclei of atoms. Thought to have existed in the early universe, this special phase of matter can be recreated at the Large Hadron Collider (LHC) in collisions between lead nuclei.

A new analysis from the international ALICE collaboration at the LHC investigates how different bound states of a charm quark and its antimatter counterpart, also produced in these collisions, are affected by quark–gluon plasma. The results open new avenues for studying the strong interaction—one of the four fundamental forces of nature—in the extreme temperature and density conditions of quark–gluon plasma.

Bound states of a charm quark and a charm antiquark, known as charmonia or hidden-charm particles, are held together by the strong interaction and are excellent probes of quark–gluon plasma. In the plasma, their production is suppressed due to “screening” by the large number of quarks and gluons present in this form of matter.

The screening, and thus the suppression, increases with the temperature of the plasma and is expected to affect different charmonia to varying degrees. For example, the production of the ψ(2S) state, which is ten times more weakly bound and 20% more massive than the J/ψ state, is expected to be more suppressed than that of the J/ψ state.

This hierarchical suppression is not the only fate of charmonia in quark–gluon plasma. The large number of charm quarks and antiquarks in the plasma—up to about a hundred in head-on collisions—also gives rise to a mechanism, called recombination, that forms new charmonia and counters the suppression to a certain extent.

This process is expected to depend on the type and momentum of the charmonia, with the more weakly bound charmonia possibly being produced through recombination later in the evolution of the plasma, and charmonia with the lowest (transverse) momentum having the highest recombination rate.

A lead–lead collision event recorded by ALICE in 2015. Credit: ALICE collaboration

Previous studies, which used data from CERN’s Super Proton Synchrotron and subsequently from the LHC, have shown that the production of the ψ(2S) state is indeed more suppressed than that of the J/ψ. ALICE has also previously provided evidence of the recombination mechanism in J/ψ production. But, until now, no studies of ψ(2S) production at low particle momentum had been precise enough to provide conclusive results in this momentum regime, preventing a complete picture of ψ(2S) production from being obtained.

The ALICE collaboration has now reported the first measurements of ψ(2S) production down to zero transverse momentum, based on lead–lead collision data from the LHC collected in 2015 and 2018.

The measurements show that, regardless of particle momentum, the ψ(2S) state is suppressed about two times more than the J/ψ. This is the first time that a clear hierarchy in suppression has been observed for the total production of charmonia at the LHC. A similar observation was previously reported by the LHC collaborations for bound states of a bottom quark and its antiquark.

When further studied as a function of particle momentum, the ψ(2S) suppression is seen to be reduced towards lower momentum. This feature, which was previously observed by ALICE for the J/ψ state, is a signature of the recombination process.

Future higher-precision studies of these and other charmonia using data from LHC Run 3, which started in July, may lead to a definitive understanding of the modification of hidden-charm particles and, as a result, of the strong interaction that holds them together, in the extreme environment of quark–gluon plasma.

Tapping hidden visual information: An all-in-one detector for thousands of colors


Spectrometers are widely used throughout industry and research to detect and analyze light. Spectrometers measure the spectrum of light—its strength at different wavelengths, like the colors in a rainbow—and are an essential tool for identifying and analyzing specimens and materials. Integrated on-chip spectrometers would be of great benefit to a variety of technologies, including quality inspection platforms, security sensors, biomedical analyzers, health care systems, environmental monitoring tools, and space telescopes.

An international research team led by researchers at Aalto University has developed high-sensitivity spectrometers with high wavelength accuracy, high spectral resolution, and broad operation bandwidth, using only a single microchip-sized detector. The research behind this new ultra-miniaturized spectrometer was published today in the journal Science.

“Our single-detector spectrometer is an all-in-one device. We designed this optoelectronic-lab-on-a-chip with artificial intelligence replacing conventional hardware, such as optical and mechanical components. Therefore, our computational spectrometer does not require separate bulky components or array designs to disperse and filter light. It can achieve a high resolution comparable to benchtop systems but in a much smaller package,” says Postdoctoral Researcher Hoon Hahn Yoon.

“With our spectrometer, we can measure light intensity at each wavelength beyond the visible spectrum using a device at our fingertips. The device is entirely electrically controllable, so it has enormous potential for scalability and integration. Integrating it directly into portable devices such as smartphones and drones could advance our daily lives. Imagine that the next generation of our smartphone cameras could be fitted with hyperspectral cameras that outperform color cameras,” he adds.
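
One hedged way to picture the computational approach (our own toy formulation, not the paper’s algorithm): if each electrically tunable state of the detector has a known, pre-calibrated spectral response, the spectrum can be recovered numerically from a modest number of photocurrent readings.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_wavelengths = 64, 200
R = rng.random((n_states, n_wavelengths))  # calibrated response of each detector state (assumed)
S_true = np.exp(-0.5 * ((np.arange(n_wavelengths) - 80) / 6.0) ** 2)  # toy input spectrum
I = R @ S_true + 1e-3 * rng.standard_normal(n_states)                 # measured photocurrents

lam = 1e-2  # Tikhonov regularization strength, chosen by hand
S_est = np.linalg.solve(R.T @ R + lam * np.eye(n_wavelengths), R.T @ I)
print(np.corrcoef(S_true, S_est)[0, 1])    # correlation between true and recovered spectrum
```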

Shrinking computational spectrometers is essential for their use in chips and implantable applications. Professor Zhipei Sun, the head of the research team, says, “Conventional spectrometers are bulky because they need optical and mechanical components, so their on-chip applications are limited. There is an emerging demand in this field to improve the performance and usability of spectrometers. From this point of view, miniaturized spectrometers are very important to offer high performance and new functions in all fields of science and industry.”

Professor Pertti Hakonen adds that “Finland and Aalto have invested in photonics research in recent years. For example, there has been great support from the Academy of Finland’s Center of Excellence on quantum technology, Flagship on Photonics Research and Innovation, InstituteQ, and the Otanano Infrastructure. Our new spectrometer is a clear demonstration of the success of these collaborative efforts. I believe that with further improvements in resolution and efficiency, these spectrometers could provide new tools for quantum information processing.”

Exploring the decay processes of a quantum state weakly coupled to a finite-size reservoir

Sketch of a quantum state (white dot) weakly coupled to the discrete levels of a chaotic quantum dot (black dots connected by lines). Credit: Micklitz et al.

In quantum physics, Fermi’s golden rule, also known as the golden rule of time-dependent perturbation theory, is a formula for calculating the rate at which an initial quantum state transitions into a continuum of final states (a so-called “bath”). This valuable equation has been applied to numerous physics problems, particularly those for which it is important to consider how systems respond to imposed perturbations and settle into stationary states over time.
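
In standard notation, the rule states that an initial state |i⟩, coupled by a perturbation V to final states |f⟩ with density of states ρ(E_f) at the relevant energy, decays at the rate

Γ_{i→f} = (2π/ħ) |⟨f|V|i⟩|² ρ(E_f).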

Fermi’s golden rule specifically applies to instances in which an initial quantum state is weakly coupled to a continuum of other final states that overlap it in energy. Researchers at the Centro Brasileiro de Pesquisas Físicas, Princeton University, and Universität zu Köln have recently set out to investigate what happens when a quantum state is instead coupled to a set of discrete final states with a nonzero mean level spacing, as observed in recent many-body physics studies.

“The decay of a quantum state into some continuum of final states (i.e., a ‘bath’) is commonly associated with incoherent decay processes, as described by Fermi’s golden rule,” Tobias Micklitz, one of the researchers who carried out the study, told Phys.org. “A standard example of this is an excited atom emitting a photon into an infinite vacuum. Present-day experiments, on the other hand, routinely realize composite systems involving quantum states coupled to effectively finite-size reservoirs that are composed of discrete sets of final states, rather than a continuum.”

While several past studies have identified systems in which quantum states are coupled to finite-size reservoirs, understanding the conditions under which those reservoirs can effectively act as “baths” is a challenging task. The key objective of the recent work by Micklitz and his colleagues was to better understand the process through which a quantum state decays when coupled to a finite-size reservoir.

“Our starting point was to consider generic finite size reservoirs lacking any specific symmetries,” Micklitz explained. “Such systems usually show quantum chaotic behavior and can be modeled by random matrices for which powerful analytical tools are available.”

To carry out their analyses, Micklitz and his colleagues used effective matrix-integral techniques, which are commonly employed in studies applying random matrix theory, a theory that summarizes the properties of matrices with entries drawn randomly from given probability distributions. To benchmark the results of their analyses, they then used exact diagonalization, a powerful numerical technique often used by physicists to study individual quantum many-body systems.
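
A minimal numerical stand-in for that benchmark (our own toy model, with arbitrary size and coupling): a single level weakly coupled to a reservoir modeled by a random matrix, diagonalized exactly to track the probability of remaining in the level.

```python
import numpy as np

rng = np.random.default_rng(1)
N, v = 400, 0.05                                  # reservoir size and coupling (assumed)
A = rng.standard_normal((N, N))
H = np.zeros((N + 1, N + 1))
H[1:, 1:] = (A + A.T) / np.sqrt(2 * N)            # chaotic reservoir: GOE random matrix
H[0, 1:] = H[1:, 0] = v * rng.standard_normal(N)  # weak coupling to the initial level
E, U = np.linalg.eigh(H)

for t in (0.0, 50.0, 500.0, 5000.0):
    amp = U[0] @ (np.exp(-1j * E * t) * U[0])     # <0| exp(-iHt) |0>
    print(t, round(abs(amp) ** 2, 4))             # residence probability at time t
```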

“Initially we hadn’t expected the decay into a finite-size reservoir to be described by such a complex time-dependence,” Micklitz said. “We found that the probability of residing in the weakly coupled level shows a non-monotonic time-dependence: an initial decay, followed by a rise, before saturating to a constant value. The temporal profile follows (in a large regime of parameters) the ‘spectral form factor,’ a well-studied object in the quantum chaos community which encodes information on energy level correlations in the reservoir. This makes much sense in retrospect.”

Now published in Physical Review Letters, the recent study by this team of researchers offers a fully analytic description of a crucial and fundamental physics problem. More specifically, it connects the problem of how a quantum state decays into a set of discrete final states to the statistics of energy levels and wave functions in chaotic quantum systems.

“We relate the temporal profile of the probability of residence to the spectral form factor, and the ratio of the probability’s minimum and saturation values to the statistics of reservoir-eigenfunctions,” Micklitz added. “Our work focuses on a fundamental but also rather elementary example of relaxation into a finite size reservoir. We are now trying to address more complex systems, such as ensembles of spins coupled to a quantum dot. Hopefully, progress can be made using similar methods as those employed in our recent paper.”

Exploring light-driven molecular swing


Calculated coherence and energy transfer ratios. Calculated vibrational molecular coherence in the symmetric stretching vibrational mode of the DMSO2 molecule in solution, displayed in a frame rotating with the vibrational eigenfrequency. The calculations are done with (a) the compressed pulse using the RWA, (b) the compressed pulse without the RWA, (c) the chirped pulse using the RWA, and (d) the chirped pulse without the RWA. The red curves include relaxation; the blue curves do not. Each bump of the cycloid in (b) corresponds to one half-cycle of the electric field, showing that energy transfer from the excitation field to the molecular system is completed after 3 to 4 field cycles in the case of the FCE. In (d), the time points that are maxima of C_ET(t) caused by the symmetric stretching vibration are marked with black dots. (e) Re-emitted (orange) and maximum absorbed (green) fraction of the impinging pulse energy versus concentration. The results were obtained from the impulsive-regime model (see Supplementary Information) with the FCE and ab initio Lorentz parameters. The solid lines include direct interactions with the surrounding water and the screening effect of the polarizable continuum, the dashed lines only the latter, and the dotted lines neither. For small concentrations, the maximum absorbed energy scales linearly with concentration. In contrast, the coherent re-emission, containing the spectroscopic information, scales quadratically with concentration. Credit: Nature Communications (2022). DOI: 10.1038/s41467-022-33477-5

When light impinges on molecules, it is absorbed and re-emitted. Advances in ultrafast laser technology have steadily improved the level of detail in studies of such light-matter interactions.

Field-resolved spectroscopy (FRS), a laser spectroscopy method in which the electric field of laser pulses repeating millions of times per second is recorded with time resolution after passing through the sample, now provides even deeper insights: Scientists led by Prof. Dr. Regina de Vivie-Riedle (LMU/Department of Chemistry) and PD Dr. Ioachim Pupeza (LMU/Department of Physics, MPQ) show for the first time, in theory and experiment, how molecules gradually absorb the energy of the ultrashort light pulse in each individual optical cycle and then release it again over a longer period of time, thereby converting it into spectroscopically meaningful light.

The study elucidates the mechanisms that fundamentally determine this energy transfer. It also develops and verifies a detailed quantum chemical model that can be used in the future to quantitatively predict even the smallest deviations from linear behavior.

A child on a swing sets it in motion with tilting movements of the body, which must be synchronized with the swing movement. This gradually adds energy to the swing, so that the deflection of the swing increases over time. Something similar happens when the alternating electromagnetic field of a short laser pulse interacts with a molecule, only about 100 trillion times faster: When the alternating field is synchronized with the vibrations between the atoms of the molecule, these vibration modes absorb more and more energy from the light pulse, and the vibration amplitude increases.
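
In the simplest textbook picture (our gloss on the analogy, not the paper’s full quantum model), each vibrational mode behaves like a driven, damped oscillator,

ẍ(t) + γ ẋ(t) + ω₀² x(t) = (q/m) E(t),

where E(t) is the laser’s electric field: the mode soaks up energy most efficiently when E(t) oscillates in step with the eigenfrequency ω₀, just as the swing gains height only when pushed in rhythm.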

When the exciting field oscillations are over, the molecule continues to vibrate for a while—just like a swing after the person stops the tilting movements. Like an antenna, the slightly electrically charged atoms in motion then radiate a light field. Here, the frequency of the light field oscillation is determined by properties of the molecule such as atomic masses and bond strengths, which allows for an identification of the molecule.

Researchers from the attoworld team at MPQ and LMU, in collaboration with LMU researchers from the Department of Chemistry (Division of Theoretical Femtochemistry), have now distinguished these two constituent parts of the light field—on the one hand, the exciting light pulses, and on the other, the decaying light field oscillations—using time-resolved spectroscopy. In doing so, they investigated the behavior of organic molecules dissolved in water.

“While established laser spectroscopy methods usually only measure the spectrum and thus do not allow any information about the temporal distribution of the energy, our method can precisely track how the molecule absorbs a little more energy with each subsequent oscillation of the light field,” says Ioachim Pupeza, head of the experiment.

That the measurement method allows this temporal distinction is best illustrated by the fact that the scientists repeated the experiment, changing the duration of the exciting pulse but without changing its spectrum. This makes a big difference for the dynamic energy transfer between light and the vibrating molecule: Depending on the temporal structure of the laser pulse, the molecule can then absorb and release energy several times during the excitation.

In order to understand exactly which contributions are decisive for the energy transfer, the researchers have developed a supercomputer-based quantum chemical model. This can explain the results of the measurements without the aid of measured values. “This allows us to artificially switch off individual effects such as the collisions of the vibrating molecules with their environment, or even the dielectric properties of the environment, and thus elucidate their influence on the energy transfer,” explains Martin Peschel, one of the first authors of the study.

In the end, the energy re-emitted during the decaying light field oscillations is decisive for how much information can be obtained from a spectroscopic measurement. The work thus makes a valuable contribution to better understanding the efficiency of optical spectroscopies, for example with regard to molecular compositions of fluids or gases, with the objective of improving it further.

The research is published in Nature Communications.