A new approach for high-throughput quantitative phase microscopy

A hybrid bright/darkfield transport of intensity (HBDTI) approach for high-throughput quantitative phase microscopy significantly expands the space-bandwidth product of a conventional microscope, extending the accessible sample spatial frequencies in Fourier space well beyond the traditional coherent diffraction limit. Credit: Linpeng Lu, NJUST.

Cell organelles are involved in a variety of cellular life activities. Their dysfunction is closely related to the development and metastasis of cancer. Exploration of subcellular structures and their abnormal states facilitates insights into the mechanisms of pathologies, which may enable early diagnosis for more effective treatment.

The optical microscope, invented more than 400 years ago, has become an indispensable and ubiquitous instrument for the investigation of microscale objects in many areas of science and technology. In particular, fluorescence microscopy has achieved several leaps—from 2D wide-field, to 3D confocal, and then to super-resolution fluorescence microscopy, greatly promoting the development of modern life sciences.

Using conventional microscopes, researchers currently struggle to generate sufficient intrinsic contrast for unstained cells, due to their low absorption or weak scattering properties. Specific dyes or fluorescent labels can help with visualization, but long-term observation of live cells remains difficult to achieve.

Recently, quantitative phase imaging (QPI) has shown promise with its unique ability to quantify the phase delay of unlabeled specimens in a nondestructive way. Yet the throughput of an imaging platform is fundamentally limited by the space-bandwidth product (SBP) of its optical system, and increasing a microscope's SBP is confounded by the scale-dependent geometric aberrations of its optical elements. The result is a tradeoff between achievable image resolution and field of view (FOV).

Lead author Linpeng Lu, a PhD student in the SCILab, provides a vivid hand-painted animation as a helpful summary of the report. Credit: Lu et al., doi 10.1117/1.AP.4.5.056002.

An approach to achieving label-free, high-resolution, and large FOV microscopic imaging is needed to enable precise detection and quantitative analysis of subcellular features and events. To this end, researchers from Nanjing University of Science and Technology (NJUST) and the University of Hong Kong recently developed a label-free high-throughput microscopy method based on hybrid bright/darkfield illuminations.

As reported in Advanced Photonics, the “hybrid brightfield-darkfield transport of intensity” (HBDTI) approach for high-throughput quantitative phase microscopy significantly expands the accessible sample spatial frequencies in the Fourier space, extending the maximum achievable resolution by approximately fivefold over the coherent imaging diffraction limit.

Based on the principles of illumination multiplexing and synthetic aperture, the researchers established a forward imaging model of nonlinear brightfield and darkfield intensity transport. This model endows HBDTI with the ability to recover sample features beyond the coherent diffraction limit.
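The forward model generalizes the classical transport-of-intensity equation (TIE), which relates defocus-induced intensity changes to phase and serves as the comparison baseline in the paper's figures. As background, here is a minimal numpy sketch of the conventional FFT-based TIE solver for the uniform-intensity case; this is our illustration, not the authors' HBDTI code, and the function and parameter names are ours:

```python
import numpy as np

def tie_phase(dIdz, I0, wavelength, pixel_size, reg=1e-9):
    """Conventional FFT-based TIE phase retrieval, uniform-intensity case.

    Solves  laplacian(phi) = -(k / I0) * dI/dz  in Fourier space,
    where k = 2*pi/wavelength. `reg` regularizes the zero frequency.
    """
    k = 2 * np.pi / wavelength
    ny, nx = dIdz.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    lap = -4 * np.pi**2 * (FX**2 + FY**2)      # Fourier symbol of the Laplacian
    rhs = -(k / I0) * dIdz                     # right-hand side of the TIE
    phi = np.fft.ifft2(np.fft.fft2(rhs) / (lap - reg)).real
    return phi

# Usage sketch: estimate dI/dz from two defocused images taken at +/- dz
# dIdz = (I_plus - I_minus) / (2 * dz)
# phi = tie_phase(dIdz, I0=I_focus.mean(), wavelength=550e-9, pixel_size=1.6e-6)
```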

High-throughput computational microscopy imaging
QPI results of unlabeled HeLa cells. (a) Approximately 4000 HeLa cells on a ∼7.19 mm² FOV. (b1) and (c1) Low-resolution brightfield (BF) in-focus intensity images of areas 1 and 2 in (a), respectively. (b2) and (c2) Low-resolution darkfield (DF) in-focus intensity images of (b1) and (c1), respectively. (b3) and (c3) Retrieved phase results of (b1) and (c1) using the FFT-based traditional transport of intensity equation (TIE) phase retrieval method, respectively. (b4) and (c4) Retrieved phase results of (b1) and (c1) utilizing the novel HBDTI method, respectively. Credit: Lu et al., doi 10.1117/1.AP.4.5.056002.

Using a commercial microscope with a 4×, 0.16 NA objective lens, the team demonstrated HBDTI high-throughput imaging, attaining 488 nm half-width imaging resolution within an FOV of approximately 7.19 mm², a 25× increase in SBP over the coherent-illumination case.
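The quoted 25× gain is consistent with the roughly fivefold resolution extension, since the SBP counts resolvable pixels across the field of view; as rough bookkeeping (our gloss, not a formula quoted in the paper):

$$\mathrm{SBP} \propto \frac{\mathrm{FOV}}{\delta^{2}}, \qquad \frac{\mathrm{SBP}_{\mathrm{HBDTI}}}{\mathrm{SBP}_{\mathrm{coh}}} \approx \left(\frac{\delta_{\mathrm{coh}}}{\delta_{\mathrm{HBDTI}}}\right)^{2} \approx 5^{2} = 25,$$

where δ is the achievable resolution at fixed FOV.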

Noninvasive high-throughput imaging enables delineation of subcellular structures in large-scale cell studies. According to corresponding author Chao Zuo, principal investigator of the Smart Computational Imaging Laboratory (SCILab) at NJUST, “HBDTI offers a simple, high-performance, low-cost, and universal imaging tool for quantitative analysis in life sciences and biomedical research. Given its capability for high-throughput QPI, HBDTI is expected to provide a powerful solution for cross-scale detection and analysis of subcellular structures in a large number of cell clusters.”

Zuo notes that further efforts are needed to promote the high-speed implementation of HBDTI in large-group live cell analysis.

More information: Linpeng Lu et al, Hybrid brightfield and darkfield transport of intensity approach for high-throughput quantitative phase microscopy, Advanced Photonics (2022). DOI: 10.1117/1.AP.4.5.056002

Provided by SPIE 

World’s first optical atomic clock with highly charged ions

Illustration of the laser interrogation of a highly charged ion clock (artwork). Credit: PTB

Highly charged ions are a common form of matter in the cosmos, where they are found, for example, in the sun or other stars. They are so called because they have lost many electrons and therefore have a high positive charge. This is why the outermost electrons are more strongly bound to the atomic nucleus than in neutral or weakly charged atoms.

For this reason, highly charged ions react less strongly to interference from external electromagnetic fields, but become more sensitive probes of fundamental effects of special relativity, quantum electrodynamics and the atomic nucleus.

“Therefore, we expected that an optical atomic clock with highly charged ions would help us to better test these fundamental theories,” explains PTB physicist Lukas Spieß. This hope has already been fulfilled: “We were able to detect the quantum electrodynamic nuclear recoil, an important theoretical prediction, in a five-electron system, which had not been achieved in any other experiment before.”

Beforehand, the team had to solve some fundamental problems, such as detection and cooling, in years of work. For atomic clocks, the particles must be cooled to extremely low temperatures so that they are brought as close to rest as possible and their frequency can be read out unperturbed. Highly charged ions, however, are produced in an extremely hot plasma.

Because of their extreme atomic structure, highly charged ions can’t be cooled directly with laser light, and standard detection methods can’t be used either. This was solved by a collaboration between MPIK in Heidelberg and the QUEST Institute at PTB by isolating a single highly charged argon ion from a hot plasma and storing it in an ion trap together with a singly charged beryllium ion.

This allows the highly charged ion to be cooled indirectly and studied by means of the beryllium ion. An advanced cryogenic trap system was then built at MPIK and finalized at PTB for the following experiments, which were carried out in part by students switching between the institutions.

Subsequently, a quantum algorithm developed at PTB succeeded in cooling the highly charged ion even further, namely close to the quantum mechanical ground state. This corresponded to a temperature of 200 millionths of a Kelvin above absolute zero. These results were already published in Nature in 2020 and in Physical Review X in 2021.

Now the researchers have successfully taken the next step: They have realized an optical atomic clock based on thirteen-fold charged argon ions and compared the ticking with the existing ytterbium ion clock at PTB. To do this, they had to analyze the system in great detail in order to understand, for example, the movement of the highly charged ion and the effects of external interference fields.

They achieved a measurement uncertainty of 2 parts in 10¹⁷—comparable to many currently operated optical atomic clocks. “We expect a further reduction of the uncertainty through technical improvements, which should bring us into the range of the best atomic clocks,” says research group leader Piet Schmidt.

The researchers have thus created a serious competitor to existing optical atomic clocks based on, for example, individual ytterbium ions or neutral strontium atoms. The methods used are universally applicable and allow many different highly charged ions to be studied. These include atomic systems that can be used to search for extensions of the Standard Model of particle physics.

Other highly charged ions are particularly sensitive to changes in the fine structure constant and to certain dark matter candidates that are required in models beyond the Standard Model but could not be detected with previous methods.

More information: Lukas Spieß, An optical atomic clock based on a highly charged ion, Nature (2022). DOI: 10.1038/s41586-022-05245-4. www.nature.com/articles/s41586-022-05245-4

Journal information: Physical Review X , Nature

Provided by Physikalisch-Technische Bundesanstalt

Researchers collaborate to better understand the weak nuclear force

Radial-plane cross-sectional view of the BPT showing a typical triple event. Credit: Physical Review Letters (2022). DOI: 10.1103/PhysRevLett.128.202502

The weak nuclear force is currently not entirely understood, despite being one of the four fundamental forces of nature. In a pair of Physical Review Letters articles, a multi-institutional team, including theorists and experimentalists from Louisiana State University, Lawrence Livermore National Laboratory, Argonne National Laboratory and other institutions worked closely together to test physics beyond the “Standard Model” through high-precision measurements of nuclear beta decay.

By loading lithium-8 ions, an exotic heavy isotope of lithium with a half-life of less than one second, into an ion trap, the experimental team was able to detect the energies and directions of the particles emitted in its beta decay. The ions were produced with the ATLAS accelerator at Argonne National Laboratory. Different underlying mechanisms for the weak nuclear force would give rise to distinct energy and angular distributions, which the team determined to unrivaled precision.

State-of-the-art calculations with the ab initio symmetry-adapted no-core shell model, developed at Louisiana State University, had to be performed to precisely account for typically neglected effects that are 100 times smaller than the dominant decay contributions. Now that the experiments have achieved such remarkable precision, the systematic uncertainties of these corrections, which are difficult to measure, must be confronted.

In their paper, “Impact of Clustering on the ⁸Li Beta Decay and Recoil Form Factors,” the LSU-led collaboration places unprecedented constraints on recoil corrections in the β decay of ⁸Li by identifying a strong correlation between them and the ⁸Li ground-state quadrupole moment in large-scale ab initio calculations.

The results are essential for improving the sensitivity of high-precision experiments that probe the weak interaction theory and test physics beyond the Standard Model. Dr. Grigor Sargsyan led the theoretical developments while he was a Ph.D. student at LSU, and is currently a postdoctoral researcher at Lawrence Livermore National Laboratory (LLNL).

In “Improved Limit on Tensor Currents in the Weak Interaction from ⁸Li β Decay,” researchers present the most precise measurement of tensor currents in the low-energy regime by examining the β–ν̄ angular correlation of trapped ⁸Li ions with the Beta-decay Paul Trap (BPT). The results are consistent with the Standard Model prediction, ruling out certain possible sources of “new” physics and setting the bar for precision measurements of this kind.
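For context, such measurements are commonly parametrized by the standard beta-decay angular distribution (the Jackson–Treiman–Wyld form; our addition, not reproduced in the article), in which exotic tensor currents would shift the β–ν̄ correlation coefficient a and generate a nonzero Fierz interference term b:

$$ dW \propto 1 + a\,\frac{\vec{p}_{e}\cdot\vec{p}_{\nu}}{E_{e}E_{\nu}} + b\,\frac{m_{e}}{E_{e}}, $$

where $\vec{p}_{e}, E_{e}$ and $\vec{p}_{\nu}, E_{\nu}$ are the momenta and energies of the electron and antineutrino.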

“This has important implications for understanding the physics of the tensor-current contribution to the weak interaction,” said LSU Assistant Professor Alexis Mercenne. “Heretofore, the data have favored only vector and axial-vector couplings in the electroweak Lagrangian, but it has been suggested that other Lorentz-invariant interactions, such as tensor, scalar, and pseudoscalar, can arise in extensions of the Standard Model.”

“These are remarkable findings—the level of theoretical precision reached in ab initio theory beyond the lightest nuclei is unprecedented, and opens the path to novel high-precision predictions in atomic nuclei rooted in first principles,” said LSU Associate Professor Kristina Launey.

“In addition, no one expected that these theoretical developments would unveil a new state in the ⁸Be nucleus that has not been measured yet. This nucleus is notoriously difficult to model due to its cluster structure and collective correlations, but such calculations become feasible in the ab initio symmetry-adapted no-core shell-model framework.”

The excitement of modern nuclear physics is its interdisciplinary nature and the use of a wide range of techniques and tools. LSU has both experimental and theoretical research groups in nuclear physics, with strong connections to the high-energy physics and astrophysics/space science groups. The principal focus of the experimental and theoretical groups is in the area of low-energy nuclear structure and reactions, including the study of nuclei far from stability and applications to astrophysics.

More information: G. H. Sargsyan et al, Impact of Clustering on the ⁸Li β Decay and Recoil Form Factors, Physical Review Letters (2022). DOI: 10.1103/PhysRevLett.128.202503

M. T. Burkey et al, Improved Limit on Tensor Currents in the Weak Interaction from ⁸Li β Decay, Physical Review Letters (2022). DOI: 10.1103/PhysRevLett.128.202502

Journal information: Physical Review Letters

Provided by Louisiana State University

Study proves a generalization of Bell’s theorem: Quantum correlations are genuinely tripartite and nonlocal

Credit: Marya Kuderska

Quantum theory predicts the existence of so-called tripartite-entangled states, in which three quantum particles are related in a way that has no counterpart in classical physics. Theoretical physicists would like to understand how well new theories, alternatives to quantum theory, might be able to reproduce the behavior of these states.

John Clauser, Alain Aspect and Anton Zeilinger, whose work was recently recognized by the Nobel Committee, experimentally demonstrated the violation of Bell inequalities, showing that no local hidden-variable alternative to quantum theory can reproduce the behavior of entangled particles. In other words, they showed that quantum correlations are nonlocal.

Researchers at the University of Science and Technology of China, the Institute of Photonic Sciences, Università della Svizzera Italiana and the Perimeter Institute for Theoretical Physics have recently carried out an experimental study generalizing these findings by considering new potential theories. Their findings, published in Physical Review Letters, suggest that the correlations achieved by the tripartite-entangled state used in their experiment cannot be explained by a hypothetical theory involving a generalization of bipartite entanglement, called “exotic sources of two particles,” in addition to a local hidden-variable theory.

“The main objective of our study was to prove that the behavior of a three-particle quantum source (e.g., a source of three photons) cannot be reproduced by any new hypothetical theory (replacing quantum theory, yet to be discovered) which only involves exotic pairs of two particles described by new physical laws and a local hidden-variable model,” Marc-Olivier Renou, one of the authors of the paper, told Phys.org.

Gaël Massé, a second author, explains: “To do this, we used the idea contained in the ‘inflation technique,’ invented by Elie Wolfe, one of our coauthors. If we imagine a pair of two particles described by new physical laws, then even if we have no idea how to describe them, we can still create a copy of this pair and make all the particles interact together in a new way. While this technique seems elementary, it has often proved to be a very powerful tool for addressing abstract theoretical concepts.”


In their paper, the researchers first derived a new device-independent witness that could falsify causal theories with bipartite nonclassical resources. Then, through a lab experiment performed by Huan Cao and Chao Zhang, they showed that a tripartite-entangled state (the “GHZ state”) can, in practice, produce correlations that violate this witness.

“Using a high-performance photonic GHZ3 state with fidelities of 0.9741±0.002, we provide a clear experimental violation of that witness by more than 26.3 standard deviations, under the locality and fair sampling assumption,” the team explained in their paper. “We generalize our Letter to the |GHZ4⟩ state, obtaining correlations that cannot be explained by any causal theory limited to tripartite nonclassical common causes assisted with unlimited shared randomness.”
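To make the state concrete, here is a minimal numpy construction of the n-qubit GHZ state vector (an illustration only; the experiment itself uses photonic qubits, and the function name is ours):

```python
import numpy as np

def ghz(n):
    """|GHZ_n> = (|00...0> + |11...1>) / sqrt(2) as a length-2**n state vector."""
    psi = np.zeros(2**n, dtype=complex)
    psi[0] = psi[-1] = 1 / np.sqrt(2)   # equal amplitudes on |00...0> and |11...1>
    return psi

psi3 = ghz(3)                                   # the tripartite |GHZ3> state
psi4 = ghz(4)                                   # the |GHZ4> generalization in the paper
assert np.isclose(np.linalg.norm(psi3), 1.0)    # normalization check
```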

The recent work is a generalization of Bell’s theorem. Its most remarkable achievement is that it reaches beyond what physicists previously thought was possible in constraining potential alternative theories to quantum theory.


“Bell ruled out the possibility that quantum correlations can be explained by a local hidden-variable model (i.e., shared randomness),” Xavier Coiteux-Roy, a coauthor of the study, explains. “We went a bit further, by proving that even if you add ‘bipartite exotic sources’ to your theory, it still doesn’t work. In fact, we generalized the result, showing that if you add tripartite, quadripartite, and other exotic sources, it still doesn’t work. You really need to involve N-partite exotic sources for any N, however high, as quantum theory does.” He concludes, “Note that any experiment has imperfections, called loopholes. Realizing an experiment without these loopholes, in particular the post-selection loophole, is a great challenge for experimentalists in the coming years.”

Based on their findings, the team concluded that nature’s correlations are genuinely multipartite nonlocal. The experiments they carried out so far allowed them to definitively exclude theories of bipartite and tripartite exotic sources, but they are now thinking of evaluating other alternatives to quantum theory.

“We are now trying to understand how far this idea can go, and how far we can exclude potential alternatives to quantum theory by just looking at concrete experimental results, without assuming that they are explained by quantum theory,” Renou added. “This might eventually allow us to exclude all potential alternatives to quantum theory.”

More information: Huan Cao et al, Experimental Demonstration that No Tripartite-Nonlocal Causal Theory Explains Nature’s Correlations, Physical Review Letters (2022). DOI: 10.1103/PhysRevLett.129.150402

Journal information: Physical Review Letters 

© 2022 Science X Network

Light-analyzing ‘lab on a chip’ opens door to widespread use of portable spectrometers

Spectrometer on a chip. Credit: Oregon State

Scientists including an Oregon State University materials researcher have developed a better tool to measure light, contributing to a field known as optical spectrometry in a way that could improve everything from smartphone cameras to environmental monitoring.

The study, published today in Science, was led by Finland’s Aalto University and resulted in a powerful, ultra-tiny spectrometer that fits on a microchip and is operated using artificial intelligence.

The research involved a comparatively new class of super-thin materials known as two-dimensional semiconductors, and the upshot is a proof of concept for a spectrometer that could be readily incorporated into a variety of technologies—including quality inspection platforms, security sensors, biomedical analyzers and space telescopes.

“We’ve demonstrated a way of building spectrometers that are far more miniature than what is typically used today,” said Ethan Minot, a professor of physics in the OSU College of Science. “Spectrometers measure the strength of light at different wavelengths and are super useful in lots of industries and all fields of science for identifying samples and characterizing materials.”

Traditional spectrometers require bulky optical and mechanical components, whereas the new device could fit on the end of a human hair, Minot said. The new research suggests those components can be replaced with novel semiconductor materials and AI, allowing spectrometers to be dramatically scaled down in size from the current smallest ones, which are about the size of a grape.

“Our spectrometer does not require assembling separate optical and mechanical components or array designs to disperse and filter light,” said Hoon Hahn Yoon, who led the study with Aalto University colleague Zhipei Sun. “Moreover, it can achieve a high resolution comparable to benchtop systems but in a much smaller package.”

The device is 100% electrically controllable regarding the colors of light it absorbs, which gives it massive potential for scalability and widespread usability, the researchers say.

“Integrating it directly into portable devices such as smartphones and drones could advance our daily lives,” Yoon said. “Imagine that the next generation of our smartphone cameras could be hyperspectral cameras.”

Those hyperspectral cameras could capture and analyze information not just at visible wavelengths but also in the infrared.

“It’s exciting that our spectrometer opens up possibilities for all sorts of new everyday gadgets, and instruments to do new science as well,” Minot said.

In medicine, for example, spectrometers are already being tested for their ability to identify subtle changes in human tissue such as the difference between tumors and healthy tissue.

For environmental monitoring, Minot added, spectrometers can detect exactly what kind of pollution is in the air, water or ground, and how much of it is there.

“It would be nice to have low-cost, portable spectrometers doing this work for us,” he said. “And in the educational setting, the hands-on teaching of science concepts would be more effective with inexpensive, compact spectrometers.”

Applications abound as well for science-oriented hobbyists, Minot said.

“If you’re into astronomy, you might be interested in measuring the spectrum of light that you collect with your telescope and having that information identify a star or planet,” he said. “If geology is your hobby, you could identify gemstones by measuring the spectrum of light they absorb.”

Minot thinks that as work with two-dimensional semiconductors progresses, “we’ll be rapidly discovering new ways to use their novel optical and electronic properties.” Research into 2D semiconductors has been going on in earnest for only a dozen years, starting with the study of graphene, carbon arranged in a honeycomb lattice with a thickness of one atom.

“It’s really exciting,” Minot said. “I believe we’ll continue to have interesting breakthroughs by studying two-dimensional semiconductors.”

In addition to Minot, Yoon and Sun, the collaboration included scientists from Shanghai Jiao Tong University, Zhejiang University, Sichuan University, Yonsei University and University of Cambridge, as well as other researchers from Aalto University.

Universal parity quantum computing, a new architecture that overcomes performance limitations

Illustration of the modified LHZ architecture with logical lines. Three- and four-body constraints are represented by light gray triangles and squares between corresponding qubits. Data qubits with single logical indices are added as an additional row at the bottom of the architecture to allow direct access to logical Rz rotations. Colored lines connect all qubits whose labels contain the same logical index. Logical Rx rotations can be realized with chains of CNOT gates along the corresponding line. Credit: Physical Review Letters (2022). DOI: 10.1103/PhysRevLett.129.180503

The computing power of quantum machines is currently still very low. Increasing performance is a major challenge. Physicists at the University of Innsbruck, Austria, now present a new architecture for a universal quantum computer that overcomes such limitations and could soon form the basis of the next generation of quantum computers.

Quantum bits (qubits) in a quantum computer serve as a computing unit and memory at the same time. Because quantum information cannot be copied, it cannot be stored in memory as in a classical computer. Due to this limitation, all qubits in a quantum computer must be able to interact with each other.

This is currently still a major challenge for building powerful quantum computers. In 2015, theoretical physicist Wolfgang Lechner, together with Philipp Hauke and Peter Zoller, addressed this difficulty and proposed a new architecture for a quantum computer, now named LHZ architecture after the authors.

“This architecture was originally designed for optimization problems,” says Wolfgang Lechner of the Department of Theoretical Physics at the University of Innsbruck, Austria. “In the process, we reduced the architecture to a minimum in order to solve these optimization problems as efficiently as possible.”

The physical qubits in this architecture do not represent individual bits, but encode the relative coordination between the bits. “This means that not all qubits have to interact with each other anymore,” explains Wolfgang Lechner. With his team, he has now shown that this parity concept is also suitable for a universal quantum computer.
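The encoding can be made concrete with a toy example. In the LHZ parity picture, each physical qubit stores the parity of one pair of logical bits, so N logical bits map onto N(N−1)/2 physical qubits whose values are kept consistent by the local three- and four-body constraints shown in the figure above. A classical sketch of the map (our illustration, not ParityQC code):

```python
from itertools import combinations

def parity_encode(bits):
    """LHZ-style parity map: one physical qubit per logical pair (i, j),
    storing the parity b_i XOR b_j of the two logical bits."""
    return {(i, j): bits[i] ^ bits[j] for i, j in combinations(range(len(bits)), 2)}

# 4 logical bits -> 6 physical parity qubits
print(parity_encode([1, 0, 1, 1]))
# {(0, 1): 1, (0, 2): 0, (0, 3): 0, (1, 2): 1, (1, 3): 1, (2, 3): 0}
```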

Complex operations are simplified

Parity computers can perform operations between two or more qubits on a single qubit. “Existing quantum computers already implement such operations very well on a small scale,” Michael Fellner from Wolfgang Lechner’s team explains. “However, as the number of qubits increases, it becomes more and more complex to implement these gate operations.”

In two publications in Physical Review Letters and Physical Review A, the Innsbruck scientists now show that parity computers can, for example, perform quantum Fourier transformations—a fundamental building block of many quantum algorithms—with significantly fewer computation steps and thus more quickly. “The high parallelism of our architecture means that, for example, the well-known Shor algorithm for factoring numbers can be executed very efficiently,” Fellner explains.
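Mathematically, the quantum Fourier transform is the discrete Fourier transform applied to the 2^n amplitudes of an n-qubit register; a minimal numpy sketch of the unitary follows (for reference only; it says nothing about the parity-mapped circuit depth, and the function name is ours):

```python
import numpy as np

def qft_matrix(n):
    """Unitary of the n-qubit quantum Fourier transform: F[j, k] = w**(j*k) / sqrt(N)."""
    N = 2**n
    w = np.exp(2j * np.pi / N)                    # primitive N-th root of unity
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return w**(j * k) / np.sqrt(N)

F = qft_matrix(3)
assert np.allclose(F.conj().T @ F, np.eye(8))     # unitarity check
```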

Two-stage error correction

The new concept also offers hardware-efficient error correction. Because quantum systems are very sensitive to disturbances, quantum computers must correct errors continuously. Significant resources must be devoted to protecting quantum information, which greatly increases the number of qubits required. “Our model operates with a two-stage error correction: one type of error (bit-flip or phase error) is prevented by the hardware used,” write Anette Messinger and Kilian Ender, also members of the Innsbruck research team.

There are already initial experimental approaches for this on different platforms. “The other type of error can be detected and corrected via the software,” Messinger and Ender say. This would allow a next generation of universal quantum computers to be realized with manageable effort.

The spin-off company ParityQC, co-founded by Wolfgang Lechner and Magdalena Hauser, is already working in Innsbruck with partners from science and industry on possible implementations of the new model.

Electrons with Planckian scattering in strange metals follow standard rules of orbital motion in a magnet

The 100-tesla magnet system at the National Laboratory for Intense Magnetic Fields in Toulouse, France. Credit: Nanda Gonzague.

Strange metals, or non-Fermi liquids, are distinct states of matter that have been observed in different quantum materials, including cuprate superconductors. These states are characterized by unusual conductive properties, such as a resistivity that scales linearly with temperature (T-linear).

In the strange metal phase of matter, electrons undergo what is known as “Planckian dissipation,” a high scattering rate that increases linearly with temperature. This T-linear, strong electron scattering is anomalous for metals, which typically exhibit a quadratic temperature dependence (T²), as predicted by the standard theory of metals.
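In standard notation (the article does not spell this out), Planckian dissipation means the scattering time τ sits near a bound set by temperature and fundamental constants alone:

$$\frac{1}{\tau} \approx \alpha\,\frac{k_{B} T}{\hbar}, \qquad \alpha \sim 1,$$

where k_B is Boltzmann's constant and ħ is the reduced Planck constant.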

Researchers at Université de Sherbrooke in Canada, Laboratoire National des Champs Magnétiques Intenses in France, and other institutes worldwide have recently carried out a study exploring the possibility that the resistivity of strange metals is not only associated with temperature, but also with an applied magnetic field. This magnetic field linearity had been previously observed in some cuprates and pnictides, with some physicists suggesting that it could also be linked to Planckian dissipation.

The researchers carried out their experiments on two specific cuprate strange metals, namely Nd0.4La1.6−xSrxCuO4 and La2−xSrxCuO4. Their findings, published in Nature Physics, suggest that the resistivity of these two strange metals is consistent with the predictions of the standard Boltzmann theory of electron motion in a magnetic field in all ways, highlighting no anomaly associated with Planckian dissipation.

“We wanted to investigate the field dependence of the Planckian scattering rate in the strange metal phase of cuprate superconductors, in particular in NdLSCO, whose scattering rate was previously measured with angle-dependent magnetoresistance (ADMR) experiments,” Amirreza Ataei, one of the researchers who carried out the study, told Phys.org. “In this material, due to a relatively low critical temperature, Tc, we had access to one of the largest measured ranges of B-linear resistivity and were able to reproduce the magnetoresistance over this magnetic field range using the standard Boltzmann theory.”

The sample holder used for high-field measurements at Toulouse. The black single-crystal sample is less than 2 mm long; the contacts were made with silver epoxy and 25-micrometer wires, and the sample is mounted on a sapphire plate. Credit: Ataei M.Sc. thesis, https://savoirs.usherbrooke.ca/handle/11143/15285

A key objective of the recent work by Ataei and his colleagues was to determine whether the in-plane magnetoresistance in the strange metal phase of Nd0.4La1.6−xSrxCuO4 and La2−xSrxCuO4 was anomalous in instances where the magnetic field and electric current were parallel. Ultimately, the measurements they collected suggest that it was not.

“We expect our findings to have a big impact in the field of Planckian dissipation, a major mystery in condensed-matter physics with intriguing connections to the physics of black holes,” Ataei explained. “We show that this enigmatic phenomenon is insensitive to magnetic field, up to 85 T, one of the highest achievable magnetic fields in the world.”

Louis Taillefer, Cyril Proust and Seyed Amirreza Ataei. Credit: Michel Caron – UdeS.

Overall, the results gathered by this team of researchers would seem to challenge the hypothesis that the linear dependence of resistivity on a magnetic field observed in some strange metals is associated with Planckian dissipation. In contrast, their experimental data suggests that Planckian dissipation is only anomalous in its temperature dependence, while its field dependence is aligned with standard theoretical predictions.

“We now plan to extend the scope of this research to different quantum materials in the strange metal phase or in its proximity,” Ataei added.

New hybrid structures could pave the way to more stable quantum computers

RHEED patterns during MBE growth. (a) Bilayer graphene terminated 6H-SiC(0001) substrate. (b) Monolayer NbSe2 film grown on bilayer graphene. (c) 5 QL Bi2Se3/monolayer NbSe2 heterostructure grown on bilayer graphene. Credit: Nature Materials (2022). DOI: 10.1038/s41563-022-01386-z

A new way to combine two materials with special electrical properties—a monolayer superconductor and a topological insulator—provides the best platform to date to explore an unusual form of superconductivity called topological superconductivity. The combination could provide the basis for topological quantum computers that are more stable than their traditional counterparts.

Superconductors—used in powerful magnets, digital circuits, and imaging devices—allow electric current to pass without resistance, while topological insulators are thin films only a few atoms thick that restrict the movement of electrons to their edges, which can result in unique properties. A team led by researchers at Penn State describes how they paired the two materials in a paper appearing Oct. 27 in the journal Nature Materials.

“The future of quantum computing depends on a kind of material that we call a topological superconductor, which can be formed by combining a topological insulator with a superconductor, but the actual process of combining these two materials is challenging,” said Cui-Zu Chang, Henry W. Knerr Early Career Professor and Associate Professor of Physics at Penn State and leader of the research team.

“In this study, we used a technique called molecular beam epitaxy to synthesize both topological insulator and superconductor films and create a two-dimensional heterostructure that is an excellent platform to explore the phenomenon of topological superconductivity.”

In previous experiments combining the two materials, the superconductivity in thin films usually disappeared once a topological insulator layer was grown on top. Physicists had been able to add a topological insulator film onto a three-dimensional “bulk” superconductor and retain the properties of both materials.

However, applications for topological superconductors, such as chips with low power consumption inside quantum computers or smartphones, would need to be two-dimensional.

In this paper, the research team stacked a topological insulator film made of bismuth selenide (Bi2Se3), with different thicknesses, on a superconductor film made of monolayer niobium diselenide (NbSe2), resulting in a two-dimensional end product. By synthesizing the heterostructures at very low temperature, the team was able to retain both the topological and superconducting properties.

“In superconductors, electrons form ‘Cooper pairs’ and can flow with zero resistance, but a strong magnetic field can break those pairs,” said Hemian Yi, a postdoctoral scholar in the Chang Research Group at Penn State and the first author of the paper.

“The monolayer superconductor film we used is known for its ‘Ising-type superconductivity,’ which means that the Cooper pairs are very robust against the in-plane magnetic fields. We would also expect the topological superconducting phase formed in our heterostructures to be robust in this way.”

By subtly adjusting the thickness of the topological insulator, the researchers found that the heterostructure shifted from Ising-type superconductivity—where the electron spin is perpendicular to the film—to another kind of superconductivity called “Rashba-type superconductivity”—where the electron spin is parallel to the film.

This phenomenon is also observed in the researchers’ theoretical calculations and simulations.

This heterostructure could also be a good platform for the exploration of Majorana fermions, elusive particles that would be major contributors to making topological quantum computers more stable than their predecessors.

“This is an excellent platform for the exploration of topological superconductors, and we are hopeful that we will find evidence of topological superconductivity in our continuing work,” said Chang. “Once we have solid evidence of topological superconductivity and demonstrate Majorana physics, then this type of system could be adapted for quantum computing and other applications.”

A faster way to find and study topological materials

Data structure and model architecture. (a) A schematic of the full XANES spectrum for a representative sample in the dataset, showing the signatures from different absorbing elements on an absolute energy scale. For a given material, the inputs to the NN classifier consist of one-hot encoded atom types (left) and XANES spectra (right) for all absorbing atoms. (b) Schematic of the neural network architecture predicting the (binary) topological class using spectral and atom-type inputs. Spectral and atom-type inputs are individually embedded by fully-connected layers before performing a direct product between corresponding spectral and atomic channels. These composite features are aggregated for a given material and passed to a final fully-connected block to predict the topological class. Credit: Advanced Materials (2022). DOI: 10.1002/adma.202204113

Topological materials, an exotic class of materials whose surfaces exhibit different electrical or functional properties than their interiors, have been a hot area of research since their experimental realization in 2007—a finding that sparked further research and precipitated a Nobel Prize in Physics in 2016. These materials are thought to have great potential in a variety of fields, and might someday be used in ultraefficient electronic or optical devices, or key components of quantum computers.

But there are many thousands of compounds that may theoretically have topological characteristics, and synthesizing and testing even one such material to determine its topological properties can take months of experiments and analysis. Now a team of researchers at MIT and elsewhere has come up with a new approach that can rapidly screen candidate materials and determine with more than 90 percent accuracy whether they are topological.

Using this new method, the researchers have produced a list of candidate materials. A few of these were already known to have topological properties; the rest are newly predicted by this approach.

The findings are reported in the journal Advanced Materials in a paper by Mingda Li, the Class of 1947 Career Development Professor at MIT; graduate students (and twin sisters) Nina Andrejevic at MIT and Jovana Andrejevic at Harvard University; and seven others at MIT, Harvard, Princeton University, and Argonne National Laboratory.

Topological materials are named after a branch of mathematics that describes shapes based on their invariant characteristics, which persist no matter how much an object is continuously stretched or squeezed out of its original shape. Topological materials, similarly, have properties that remain constant despite changes in their conditions, such as external perturbations or impurities.

There are several varieties of topological materials, including semiconductors, conductors, and semimetals, among others. Initially, it was thought that there were only a handful of such materials, but recent theory and calculations have predicted that in fact thousands of different compounds may have at least some topological characteristics. The hard part is figuring out experimentally which compounds may be topological.

Applications for such materials span a wide range, including devices that could perform computational and data storage functions similarly to silicon-based devices but with far less energy loss, or devices to harvest electricity efficiently from waste heat, for example in thermal power plants or in electronic devices. Topological materials can also have superconducting properties, which could potentially be used to build the quantum bits for topological quantum computers.

But all of this relies on developing or discovering the right materials. “To study a topological material, you first have to confirm whether the material is topological or not,” Li says, “and that part is a hard problem to solve in the traditional way.”

A method called density functional theory is used to perform initial calculations, which then need to be followed with complex experiments that require cleaving a piece of the material to atomic-level flatness and probing it with instruments under high-vacuum conditions.

“Most materials cannot even be measured due to various technical difficulties,” Nina Andrejevic says. But for those that can, the process can take a long time. “It’s a really painstaking procedure,” she says.

Sensitivity to spectral energy resolution. The overall recall, precision, and F1 scores for (a) topological and (b) trivial examples as a function of the energy interval ΔE between sampled points of the XANES spectra. Scores are presented for both the SVM and NN models, with scores from the atom-type-only models (SVM-type and NN-type) shown as a reference by the dotted lines. Spectra were resampled at lower resolutions by computing their average values over length-ΔE intervals along the energy axis for varied ΔE. To maintain the same number of neurons across all resolutions, the averaged values were copied by the number of original samples within each interval such that all spectral inputs have length 200. Credit: Advanced Materials (2022). DOI: 10.1002/adma.202204113


Whereas the traditional approach relies on measuring the material’s photoemissions or tunneling electrons, Li explains, the new technique he and his team developed relies on absorption, specifically, the way the material absorbs X-rays.

Unlike the expensive apparatus needed for the conventional tests, X-ray absorption spectrometers are readily available and can operate at room temperature and atmospheric pressure, with no vacuum needed. Such measurements are widely conducted in biology, chemistry, battery research, and many other applications, but they had not previously been applied to identifying topological quantum materials.

X-ray absorption spectroscopy provides characteristic spectral data from a given sample of material. The next challenge is to interpret that data and determine how it relates to the topological properties. For that, the team turned to a machine-learning model, feeding in a collection of data on the X-ray absorption spectra of known topological and nontopological materials, and training the model to find the patterns that relate the two. And it did indeed find such correlations.
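The architecture sketched in the figure caption above (separate embeddings for XANES spectra and one-hot atom types, a direct product between the channels, aggregation over absorbing atoms, and a final fully-connected classifier) can be written down compactly. Here is a PyTorch sketch of our reading of that description, with illustrative layer sizes; this is not the authors' released code:

```python
import torch
import torch.nn as nn

class XASTopologyClassifier(nn.Module):
    """Sketch of the captioned architecture: per-absorber spectra and one-hot
    atom types are embedded separately, combined by an elementwise product,
    summed over absorbers, and classified. Sizes are illustrative guesses,
    except spec_len=200, which matches the caption."""
    def __init__(self, n_types=80, spec_len=200, d=64):
        super().__init__()
        self.embed_spec = nn.Sequential(nn.Linear(spec_len, d), nn.ReLU())
        self.embed_type = nn.Sequential(nn.Linear(n_types, d), nn.ReLU())
        self.head = nn.Sequential(nn.Linear(d, d), nn.ReLU(), nn.Linear(d, 1))

    def forward(self, spectra, types, mask):
        # spectra: (batch, atoms, spec_len); types: (batch, atoms, n_types)
        # mask: (batch, atoms), flags real absorbers vs. padding
        z = self.embed_spec(spectra) * self.embed_type(types)  # "direct product"
        z = (z * mask.unsqueeze(-1)).sum(dim=1)                # aggregate per material
        return torch.sigmoid(self.head(z)).squeeze(-1)         # P(topological)
```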

“Surprisingly, this approach was over 90 percent accurate when tested on more than 1500 known materials,” Nina Andrejevic says, adding that the predictions take only seconds. “This is an exciting result given the complexity of the conventional process.”

Though the model works, as with many results from machine learning, researchers don’t yet know exactly why it works or what the underlying mechanism is that links the X-ray absorption to the topological properties.

“While the learned function relating X-ray spectra to topology is complex, the result may suggest that certain attributes the measurement is sensitive to, such as local atomic structures, are key topological indicators,” Jovana Andrejevic says.

The team has used the model to construct a periodic table that displays the model’s overall accuracy on compounds made from each of the elements. It serves as a tool to help researchers home in on families of compounds that may offer the right characteristics for a given application.

The researchers have also carried out a preliminary study in which they applied the X-ray method to compounds without advance knowledge of their topological status, compiling a list of 100 promising candidate materials, a few of which were already known to be topological.

“This work represents one of the first uses of machine learning to understand what experiments are trying to tell us about complex materials,” says Joel Moore, the Chern-Simons Professor of Physics at the University of California at Berkeley, who was not associated with this research.

“Many kinds of topological materials are well-understood theoretically in principle, but finding material candidates and verifying that they have the right topology of their bands can be a challenge. Machine learning seems to offer a new way to address this challenge: Even experimental data whose meaning is not immediately obvious to a human can be analyzed by the algorithm, and I am excited to see what new materials will result from this way of looking.”

Anatoly Frenkel, a professor in the Department of Materials Science and Chemical Engineering at Stony Brook University and a senior chemist at Brookhaven National Laboratory, further commented that “it was a really nice idea to consider that the X-ray absorption spectrum may hold a key to the topological character in the measured sample.”

How do you solve a problem like a proton? Smash it, then build it back with machine learning

Looking into the HERA tunnel: Berkeley Lab scientists have developed new machine learning algorithms to accelerate the analysis of data collected decades ago by HERA, the world’s most powerful electron-proton collider that ran at the DESY national research center in Germany from 1992 to 2007. Credit: DESY

Protons are tiny yet they carry a lot of heft. They inhabit the center of every atom in the universe and play a critical role in one of the strongest forces in nature.

And yet, protons have a down-to-earth side, too.

Like most particles, protons have spin and act like tiny magnets. Flipping a proton's spin, or polarity, may sound like science fiction, but it is the basis of technological breakthroughs that have become essential to our daily lives, such as magnetic resonance imaging (MRI), the invaluable medical diagnostic tool.

Despite such advancements, the proton’s inner workings remain a mystery.

“Basically everything around you exists because of protons—and yet we still don’t understand everything about them. One huge puzzle that physicists want to solve is the proton’s spin,” said Ben Nachman, a physicist who leads the Machine Learning Group in the Physics Division at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab).

Understanding how and why protons spin could lead to technological advancements we can't even imagine today, and help us understand the strong force, the fundamental interaction that gives protons, and therefore atoms, their mass.

But it’s not such an easy problem to solve. For one, you can’t exactly pick up a proton and place it in a petri dish: Protons are unfathomably small—their radius is a hair shy of one quadrillionth of a meter, and visible light passes right through them. What’s more, you can’t even observe their insides with the world’s most powerful electron microscopes.

Recent work by Nachman and his team could bring us closer to solving this perplexing proton puzzle.

As a member of the H1 Collaboration—an international group that now includes 150 scientists from 50 institutes and 15 countries, and is based at the DESY national research center in Germany—Nachman has been developing new machine learning algorithms to accelerate the analysis of data collected decades ago by HERA, the world’s most powerful electron-proton collider that ran at DESY from 1992 to 2007.

HERA—a ring 4 miles in circumference—worked like a giant microscope that accelerated both electrons and protons to nearly the speed of light. The particles were collided head-on, which could scatter a proton into its constituent parts: quarks and gluons.

Scientists at HERA took measurements of the particle debris cascading from these electron-proton collisions, what physicists call “deep inelastic scattering,” through sophisticated cameras called particle detectors, one of which was the H1 detector.

Unfolding secrets of the strong force

The H1 detector stopped collecting data in 2007, the year HERA was decommissioned. Today, the H1 Collaboration is still analyzing the data and publishing results in scientific journals.

The HERA electron-proton collider accelerated both electrons and protons to nearly the speed of light. The particles were collided head-on, which could scatter a proton into its constituent parts: quarks (shown as green and purple balls in the illustration above) and gluons (illustrated as black coils). Credit: DESY


Using conventional computational techniques, it can take a year or more to measure quantities related to proton structure and the strong force, such as how many particles are produced when a proton collides with an electron.

And if a researcher wants to examine a different quantity, such as how fast particles are flying in the wake of a quark-gluon jet stream, they would have to start the long computational process all over again, and wait yet another year.

A new machine learning tool called OmniFold—which Nachman co-developed—can simultaneously measure many quantities at once, thereby reducing the amount of time to run an analysis from years to minutes.

OmniFold does this by using neural networks to combine computer simulations with data, correcting all observables at once. (A neural network is a machine learning tool that can process data too complex for scientists to analyze manually.)
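In the published OmniFold method, the neural networks act as classifiers whose outputs are turned into per-event weights, iterating between detector level and particle level. Below is a simplified scikit-learn sketch of one such loop, our condensed illustration based on the method's public description, not the H1 production code; all names are ours:

```python
import numpy as np
from sklearn.ensemble import HistGradientBoostingClassifier

def reweight(source, target, w_source=None, w_target=None):
    """Classifier-based likelihood-ratio trick: learn w(x) ~ p_target(x) / p_source(x)."""
    X = np.vstack([source, target])
    y = np.concatenate([np.zeros(len(source)), np.ones(len(target))])
    w = np.concatenate([
        np.ones(len(source)) if w_source is None else w_source,
        np.ones(len(target)) if w_target is None else w_target,
    ])
    clf = HistGradientBoostingClassifier().fit(X, y, sample_weight=w)
    p = np.clip(clf.predict_proba(source)[:, 1], 1e-6, 1 - 1e-6)
    return p / (1.0 - p)                     # odds ratio estimates the density ratio

def omnifold(sim_gen, sim_det, data_det, n_iter=3):
    """Iterative two-step unfolding in the spirit of OmniFold (simplified)."""
    w = np.ones(len(sim_gen))                # weights live on simulated events
    for _ in range(n_iter):
        # Step 1: reweight the simulation to match data at detector level
        w_pulled = w * reweight(sim_det, data_det, w_source=w)
        # Step 2: express the pulled-back weights as a function of particle level
        w = w * reweight(sim_gen, sim_gen, w_source=w, w_target=w_pulled)
    return w                                 # unfolded particle-level weights
```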

Nachman and his team applied OmniFold to H1 experimental data for the first time in a June issue of the journal Physical Review Letters and more recently at the 2022 Deep Inelastic Scattering (DIS) Conference.

To develop OmniFold and test its robustness against H1 data, Nachman and Vinicius Mikuni, a postdoctoral researcher in the Data and Analytics Services (DAS) group at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) and a NERSC Exascale Science Applications Program for Learning fellow, needed a supercomputer with a lot of powerful GPUs (graphics processing units), Nachman said.

Coincidentally, Perlmutter, a new supercomputer designed to support simulation, data analytics, and artificial intelligence experiments requiring multiple GPUs at a time, had just opened up in the summer of 2021 for an “early science phase,” allowing scientists to test the system on real data. (The Perlmutter supercomputer is named for the Berkeley Lab cosmologist and Nobel laureate Saul Perlmutter.)

“Because the Perlmutter supercomputer allowed us to use 128 GPUs simultaneously, we were able to run all the steps of the analysis, from data processing to the derivation of the results, in less than a week instead of months. This improvement allows us to quickly optimize the neural networks we trained and to achieve a more precise result for the observables we measured,” said Mikuni, who is also a member of the H1 Collaboration.

A central task in these measurements is accounting for detector distortions. The H1 detector, like a watchful guard standing sentry at the entrance of a sold-out concert arena, monitors particles as they fly through it. One source of measurement errors happens when particles fly around the detector rather than through it, for example—sort of like a ticketless concert goer jumping over an unmonitored fence rather than entering through the ticketed security gate.

Correcting for all distortions simultaneously had not been possible due to limited computational methods available at the time. “Our understanding of subatomic physics and data analysis techniques have advanced significantly since 2007, and so today, scientists can use new insights to analyze the H1 data,” Nachman said.

Scientists today have a renewed interest in HERA’s particle experiments, as they hope to use the data—and more precise computer simulations informed by tools like OmniFold—to aid in the analysis of results from future electron-proton experiments, such as at the Department of Energy’s next-generation Electron-Ion Collider (EIC).

The EIC—to be built at Brookhaven National Laboratory in partnership with the Thomas Jefferson National Accelerator Facility—will be a powerful and versatile new machine capable of colliding high-energy beams of polarized electrons with a wide range of ions (or charged atoms) across many energies, including polarized protons and some polarized ions.

“It’s exciting to think that our method could one day help scientists answer questions that still remain about the strong force,” Nachman said.

“Even though this work might not lead to practical applications in the near term, understanding the building blocks of nature is why we’re here—to seek the ultimate truth. These are steps to understanding at the most basic level what everything is made of. That is what drives me. If we don’t do the research now, we will never know what exciting new technological advances we’ll get to benefit future societies.”