Water has puzzled scientists for decades. For the last 30 years or so, they have theorized that when cooled to a very low temperature, around -100 °C, water might separate into two liquid phases of different densities. Like oil and water, these phases don’t mix, and their existence could help explain some of water’s other strange behavior, such as how it becomes less dense as it cools.
It’s almost impossible to study this phenomenon in the lab, though, because water crystallizes into ice so quickly at such low temperatures. Now, new research from the Georgia Institute of Technology uses machine learning models to better understand water’s phase changes, opening more avenues toward a better theoretical understanding of a range of substances. With this technique, the researchers found strong computational evidence in support of water’s liquid-liquid transition, knowledge that can be applied to real-world systems that rely on water to operate.
“We are doing this with very detailed quantum chemistry calculations that are trying to be as close as possible to the real physics and physical chemistry of water,” said Thomas Gartner, an assistant professor in the School of Chemical and Biomolecular Engineering at Georgia Tech. “This is the first time anyone has been able to study this transition with this level of accuracy.”
To better understand how water interacts, the researchers ran molecular simulations on supercomputers, which Gartner compared to a virtual microscope.
“If you had an infinitely powerful microscope, you could zoom in all the way down to the level of the individual molecules and watch them move and interact in real time,” he said. “This is what we’re doing by creating almost a computational movie.”
The researchers analyzed how the molecules move and characterized the liquid structure at different water temperatures and pressures, mimicking the phase separation between the high and low-density liquids. They collected extensive data—running some simulations for up to a year—and continued to fine-tune their algorithms for more accurate results.
Even a decade ago, running such long and detailed simulations wouldn’t have been possible, but today machine learning offers a shortcut. The researchers used a machine learning algorithm to calculate the energy of the interactions between water molecules. This model performed the calculation significantly faster than traditional techniques, allowing the simulations to progress much more efficiently.
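As a rough illustration of that idea (a generic sketch, not the authors’ code; the class and function names are hypothetical), a machine-learned potential can stand in for the expensive quantum-chemistry energy evaluation inside an ordinary molecular dynamics loop:

```python
# Minimal sketch: a machine-learned surrogate supplies energies and forces
# inside a velocity-Verlet molecular dynamics step. "MLPotential" and its
# zero-valued placeholder model are hypothetical stand-ins, not the authors' code.
import numpy as np

class MLPotential:
    """Stand-in for a trained model mapping atomic positions to potential
    energy and forces (e.g., a neural-network interatomic potential)."""
    def energy_and_forces(self, positions):
        # A real model would evaluate a fitted function here; the placeholder
        # returns zero energy and zero forces just to keep the sketch runnable.
        return 0.0, np.zeros_like(positions)

def velocity_verlet_step(positions, velocities, forces, masses, dt, potential):
    """One velocity-Verlet integration step driven by the surrogate potential."""
    velocities = velocities + 0.5 * dt * forces / masses[:, None]
    positions = positions + dt * velocities
    energy, new_forces = potential.energy_and_forces(positions)
    velocities = velocities + 0.5 * dt * new_forces / masses[:, None]
    return positions, velocities, new_forces, energy

# Toy usage: nine atoms (three water molecules), arbitrary units.
pot = MLPotential()
pos, vel, m = np.random.rand(9, 3), np.zeros((9, 3)), np.ones(9)
_, f = pot.energy_and_forces(pos)
pos, vel, f, e = velocity_verlet_step(pos, vel, f, m, dt=1e-3, potential=pot)
```

The speed-up in practice comes from the fact that evaluating such a fitted model costs far less than recomputing the quantum-chemical energy from scratch at every step.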
Machine learning isn’t perfect, so these long simulations also improved the accuracy of the predictions. The researchers were careful to test their predictions with different types of simulation algorithms. If multiple simulations gave similar results, then it validated their accuracy.
“One of the challenges with this work is that there’s not a lot of data that we can compare to because it’s a problem that’s almost impossible to study experimentally,” Gartner said. “We’re really pushing the boundaries here, so that’s another reason why it’s so important that we try to do this using multiple different computational techniques.”
Beyond Water
Some of the conditions the researchers tested were extremes that probably don’t exist on Earth directly, but potentially could be present in various water environments of the solar system, from the oceans of Europa to water in the center of comets. Yet these findings could also help researchers better explain and predict water’s strange and complex physical chemistry, informing water’s use in industrial processes, developing better climate models, and more.
The work is even more generalizable, according to Gartner. Water is a well-studied research area, but this methodology could be expanded to other difficult-to-simulate materials like polymers, or complex phenomena like chemical reactions.
“Water is so central to life and industry, so this particular question of whether water can undergo this phase transition has been a longstanding problem, and if we can move toward an answer, that’s important,” he said. “But now we have this really powerful new computational technique, but we don’t yet know what the boundaries are and there’s a lot of room to move the field forward.”
More information: Thomas E. Gartner et al, Liquid-Liquid Transition in Water from First Principles, Physical Review Letters (2022). DOI: 10.1103/PhysRevLett.129.255702
Revealing the impulsive response of a bound electron to an electromagnetic field is of fundamental importance for understanding various nonlinear processes in matter, and it is one of the ultimate goals of attosecond science.
However, it remains a very challenging task. Recently, a group from Huazhong University of Science and Technology demonstrated a new photoelectron spectroscopy technique for measuring the attosecond response of bound electrons exposed to intense XUV pulses.
In this work, they combined the attosecond temporal resolution of tunneling ionization with the subatomic spatial resolution of photoelectron holography. With this scheme, they successfully filmed an attosecond-scale movie of the impulsive response of the bound electron to an intense XUV pulse.
Their work not only revealed how atomic stabilization is established in an intense laser field, but also established a novel photoelectron spectroscopy for time-resolved imaging of ultrafast bound-state electron processes in intense laser fields. Extending this method to more complex molecules is promising and will be an exciting direction in attosecond science.
The paper is published in the journal Ultrafast Science.
More information: Jintai Liang et al, Direct Visualization of Deforming Atomic Wavefunction in Ultraintense High-Frequency Laser Pulses, Ultrafast Science (2022). DOI: 10.34133/2022/9842716
The cosmic optical background (COB) is the visible light emitted by all sources outside of the Milky Way. This faint glow, which can only be observed using very precise and sophisticated telescopes, could help astrophysicists learn more about the origins of the universe and what lies beyond our galaxy.
Last year, physicists working at different institutes across the United States published the most precise COB measurements collected so far, gathered by the New Horizons spacecraft, an interplanetary space probe launched by NASA over a decade ago. These measurements suggested that the COB is two times brighter than theoretical predictions.
Researchers at Johns Hopkins University have recently carried out a theoretical study exploring the possibility that this observed excess light could be caused by the decay of a hypothesized type of dark matter particles, known as axions. In their paper, published in Physical Review Letters, they showed that axions with masses between 8 and 20 eV could potentially account for the excess COB flux measured by the New Horizons team.
“Marc Postman is a colleague across the street who is an incredible observational cosmologist, and so when his paper with Todd Lauer and the New Horizons team appeared, I noticed it and read it,” Marc Kamionkowski, one of the researchers who carried out the study, told Phys.org.
“The measurement they collected is a great example of cleverly repurposing a powerful astronomical observatory to different ends than those it was designed for. We sent this incredible little spacecraft out toward Pluto years ago, and it did everything it was supposed to, but it had no brakes, and is still speeding further and further from the sun with not much to do. Marc and Todd realized that it could be used to detect—for the very first time—the cosmic background of optical photons from all the unresolved galaxies in the universe, and it did.”
After reading the paper by Lauer and his colleagues, Kamionkowski realized that if the excess they measured was in fact due to the decay of axions, this could potentially be confirmed using cosmological survey data. Specifically, the excess would be detected with a high signal-to-noise ratio by SPHEREx, a planned two-year NASA mission that will put a near-infrared observatory in orbit to collect new and potentially valuable measurements.
“Our calculations are embarrassingly simple, as they are the types of calculations that we and tons of other people have been doing for years,” Kamionkowski explained. “The idea that two-photon decay of an axion could lead to a cosmic signal was already around when I was a graduate student over 30 years ago. Our work simply involves summing the photons from all those produced by axion decay over time, a simple integral. We had to get some factors of cosmic redshift in there correctly, but that’s a homework problem in a typical cosmology class.”
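For illustration, a schematic version of that “homework-style” integral (not necessarily the exact expression used in the paper) sums the bolometric intensity of decay photons over cosmic history:

\[
I \;\approx\; \frac{c}{4\pi}\int_0^{\infty} \frac{\Gamma_a\, \rho_{a,0}\, c^2}{H(z)\,(1+z)^{2}}\, dz ,
\]

where \(\Gamma_a\) is the axion decay rate, \(\rho_{a,0}\) the present-day comoving axion mass density and \(H(z)\) the Hubble rate; each decay emits two photons of rest-frame energy \(m_a c^2/2\), one factor of \(1+z\) accounts for the photon redshift, and the other comes from the time-redshift relation \(dt = dz/[H(z)(1+z)]\).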
Overall, the calculations performed by Kamionkowski and his colleagues highlight the possibility of confirming or disproving the connection between axion dark matter decay and the recently observed excess COB using future line-intensity-mapping (LIM) measurements set to be collected by NASA’s SPHEREx satellite.
SPHEREx is expected to launch in 2025, measuring near-infrared signals originating from approximately 450 million galaxies. The researchers at Johns Hopkins have already published a follow-up paper, in which they explored the consistency of the axion decay scenario with existing constraints on the COB from gamma rays.
“NASA’s Fermi telescope has obtained gamma-ray energy spectra from over 800 blazars, and the highest-energy gamma rays can be attenuated by production of electron-positron pairs via scattering with COB photons,” Kamionkowski added.
“In our new study, we modeled the attenuation expected from this COB scattering and by comparing with Fermi data were able to place an upper limit to the COB background from dark-matter decay, which was still consistent with the COB excess inferred from New Horizons.
“My student Gabriela Sato-Polito has also been working with Dan Grin (Haverford College) looking in deep VLT images of several high-redshift clusters for dark-matter decay lines. These measurements should allow us to probe some, but not all, of the parameter space for dark-matter decays consistent with the New Horizons excess.”
More information: José Luis Bernal et al, Cosmic Optical Background Excess, Dark Matter, and Line-Intensity Mapping, Physical Review Letters (2022). DOI: 10.1103/PhysRevLett.129.231301
Tod R. Lauer et al, New Horizons Observations of the Cosmic Optical Background, The Astrophysical Journal (2021). DOI: 10.3847/1538-4357/abc881
A new computational analysis by theorists at the U.S. Department of Energy’s Brookhaven National Laboratory and Wayne State University supports the idea that photons (a.k.a. particles of light) colliding with heavy ions can create a fluid of “strongly interacting” particles. In a paper just published in Physical Review Letters, they show that calculations describing such a system match up with data collected by the ATLAS detector at Europe’s Large Hadron Collider (LHC).
As the paper explains, the calculations are based on the hydrodynamic particle flow seen in head-on collisions of various types of ions at both the LHC and the Relativistic Heavy Ion Collider (RHIC), a DOE Office of Science user facility for nuclear physics research at Brookhaven Lab. With only modest changes, these calculations also describe flow patterns seen in near-miss collisions, where photons that form a cloud around the speeding ions collide with the ions in the opposite beam.
“The upshot is that using the same framework we use to describe lead-lead and proton-lead collisions, we can describe the data of these ultra-peripheral collisions where we have a photon colliding with a lead nucleus,” said Brookhaven Lab theorist Bjoern Schenke, a co-author of the paper. “That tells you there’s a possibility that in these photon-ion collisions, we create a small dense strongly interacting medium that is well described by hydrodynamics—just like in the larger systems.”
Fluid signatures
Observations of particles flowing in characteristic ways have been key evidence that the larger collision systems (lead-lead and proton-lead collisions at the LHC; and gold-gold and proton-gold collisions at RHIC) create a nearly perfect fluid. The flow patterns were thought to stem from the enormous pressure gradients created by the large number of strongly interacting particles produced where the colliding ions overlap.
“By smashing these high-energy nuclei together we’re creating such high energy density—compressing the kinetic energy of these guys into such a small space—that this stuff essentially behaves like a fluid,” Schenke said.
Spherical particles (including protons and nuclei) colliding head-on are expected to generate a uniform pressure gradient. But partially overlapping collisions generate an oblong, almond-shaped pressure gradient that pushes more high-energy particles out along the short axis than perpendicular to it.
This “elliptic flow” pattern was one of the earliest hints that particle collisions at RHIC could create a quark-gluon plasma, or QGP—a hot soup of the fundamental building blocks that make up the protons and neutrons of nuclei/ions. Scientists were at first surprised by the QGP’s liquid-like behavior. But they later established elliptic flow as a defining feature of QGP, and evidence that the quarks and gluons were still interacting strongly, even when free from confinement within individual protons and neutrons. Later observations of similar flow patterns in collisions of protons with large nuclei intriguingly suggest that these proton-nucleus collision systems can also create tiny specks of quark-gluon soup.
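For reference, and as a textbook definition rather than anything specific to the new paper, flow is conventionally quantified by the Fourier coefficients of the particles’ azimuthal distribution,

\[
\frac{dN}{d\phi} \;\propto\; 1 + 2\sum_{n=1}^{\infty} v_n \cos\!\big[n\,(\phi - \Psi_n)\big],
\]

where \(\phi\) is the azimuthal emission angle and \(\Psi_n\) the orientation of the n-th symmetry plane; a sizeable second coefficient \(v_2\), the “elliptic flow,” is the signature of the almond-shaped pressure gradient described above.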
“Our new paper is about pushing this to even further extremes, looking at collisions between photons and nuclei,” Schenke said.
Changing the projectile
It has long been known that ultra-peripheral collisions could create photon-nucleus interactions, using the nuclei themselves as the source of the photons. That’s because charged particles accelerated to high energies, like the lead nuclei/ions accelerated at the LHC (and gold ions at RHIC), emit electromagnetic waves—particles of light. So, each accelerated lead ion at the LHC is essentially surrounded by a cloud of photons.
“When two of these ions pass each other very closely without colliding, you can think of one as emitting a photon, which then hits the lead ion going the other way,” Schenke said. “Those events happen a lot; it’s easier for the ions to barely miss than to precisely hit one another.”
ATLAS scientists recently published data on intriguing flow-like signals from these photon-nucleus collisions.
“We had to set up special data collection techniques to pick out these unique collisions,” said Blair Seidlitz, a Columbia University physicist who helped set up the ATLAS trigger system for the analysis when he was a graduate student at the University of Colorado, Boulder. “After collecting enough data, we were surprised to find flow-like signals that were similar to those observed in lead-lead and proton-lead collisions, although they were a little smaller.”
Schenke and his collaborators set out to see whether their theoretical calculations could accurately describe the particle flow patterns.
They used the same hydrodynamic calculations that describe the behavior of particles produced in lead-lead and proton-lead collision systems. But they made a few adjustments to account for the “projectile” striking the lead nucleus changing from a proton to a photon.
According to the laws of physics (specifically, quantum electrodynamics), a photon can undergo quantum fluctuations to become another particle with the same quantum numbers. A rho meson, a particle made of a particular combination of a quark and antiquark held together by gluons, is one of the most likely results of those photon fluctuations.
If you think back to the proton—made of three quarks—this two-quark rho particle is just a step down the complexity ladder.
“Instead of having a gluon distribution around three quarks inside a proton, we have the two quarks (quark-antiquark) with a gluon distribution around those to collide with the nucleus,” Schenke said.
Accounting for energy
The calculations also had to account for the big difference in energy in these photon-nucleus collision systems, compared to proton-lead and especially lead-lead.
“The emitted photon that’s colliding with the lead won’t carry the entire momentum of the lead nucleus it came from, but only a tiny fraction of that. So, the collision energy will be much lower,” Schenke said.
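A back-of-the-envelope kinematic relation (not taken from the paper) makes the point: a photon of energy \(\omega\) hitting a nucleon of energy \(E_N\) head-on gives a photon-nucleon center-of-mass energy of roughly

\[
W_{\gamma N} \;\approx\; \sqrt{m_N^2 c^4 + 4\,\omega E_N}\, ,
\]

so a photon carrying only a small fraction of the beam energy produces a collision at far lower energy than a full nucleon-nucleon collision.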
That energy difference turned out to be even more important than the change of projectile.
In the most energetic lead-lead or gold-gold heavy ion collisions, the pattern of particles emerging in the plane transverse to the colliding beams generally persists no matter how far you look from the collision point along the beamline (in the longitudinal direction). But when Schenke and collaborators modeled the patterns of particles expected to emerge from lower-energy photon-lead collisions, it became apparent that including the 3D details of the longitudinal direction made a difference. The model showed that the geometry of the particle distributions changes rapidly with increasing longitudinal distance; the particles become “decorrelated.”
“The particles see different pressure gradients depending on their longitudinal position,” Schenke explained.
“So, for these low energy photon-lead collisions, it is important to run a full 3D hydrodynamic model (which is more computationally demanding) because the particle distribution changes more rapidly as you go out in the longitudinal direction,” he said.
When the theorists compared their predictions using this lower-energy, full 3D, hydrodynamic model with the particle flow patterns observed in photon-lead collisions by the ATLAS detector, the data and theory matched up nicely, at least for the most obvious elliptic flow pattern, Schenke said.
Implications and the future
“From this result, it looks like it’s conceivable that even in photon-heavy ion collisions, we have a strongly interacting fluid that responds to the initial collision geometry, as described by hydrodynamics,” Schenke said. “If the energies and temperatures are high enough,” he added, “there will be a quark-gluon plasma.”
Seidlitz, the ATLAS physicist, commented, “It was very interesting to see these results suggesting the formation of a small droplet of quark-gluon plasma, as well as how this theoretical analysis offers concrete explanations as to why the flow signatures are a bit smaller in photon-lead collisions.”
Additional data to be collected by ATLAS and other experiments at RHIC and the LHC over the next several years will enable more detailed analyses of particles flowing from photon-nucleus collisions. These analyses will help distinguish the hydrodynamic calculation from another possible explanation, in which the flow patterns are not a result of the system’s response to the initial geometry.
In the longer-term future, experiments at an Electron-Ion Collider (EIC), a facility planned to replace RHIC sometime in the next decade at Brookhaven Lab, could provide more definitive conclusions.
The Large Hadron Collider Beauty (LHCb) experiment at CERN is the world’s leading experiment in quark flavor physics with a broad particle physics program. Its data from Runs 1 and 2 of the Large Hadron Collider (LHC) has so far been used for over 600 scientific publications, including a number of significant discoveries.
While all scientific results from the LHCb collaboration are already publicly available through open access papers, the data used by the researchers to produce these results is now accessible to anyone in the world through the CERN open data portal. The data release is made in the context of CERN’s Open Science Policy, reflecting the values of transparency and international collaboration enshrined in the CERN Convention for more than 60 years.
“The data collected at LHCb is a unique legacy to humanity, especially since no other experiment covers the region LHCb looks at,” says Sebastian Neubert, leader of the LHCb open data project. “It has been obtained through a huge international collaborative effort, which was funded by the public. Therefore the data belongs to society.”
The data sample made available amounts to 20% of the total data set collected by the LHCb experiment in 2011 and 2012 during LHC Run 1. It comprises 200 terabytes containing information obtained from proton–proton collision events filtered and recorded with the detector.
The LHCb collaboration has preprocessed the data by reconstructing experimental signatures, such as the trajectories of charged particles, from the raw information delivered by its complex detector system. The data is filtered, classified according to approximately 300 processes and decays, and made available in the same format as that used by LHCb physicists.
The analysis of LHC data is a complex and time-consuming exercise. Therefore, to facilitate the analysis, the samples are accompanied by extensive documentation and metadata, as well as a glossary explaining several hundred special terms used in the preprocessing. The data can be analyzed using dedicated LHCb algorithms, which are available as open source software.
The data is suitable for different types of physics studies and can be directly downloaded by anyone. “It is intended to be used by professional scientists and its interpretation needs some knowledge of particle physics, but everybody is invited to give it a try,” continues Neubert. “It would be great if the data inspires new research directions and is used by researchers in other fields, such as data science and artificial intelligence. We are eager to hear from users of the data what they find.”
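As a purely illustrative sketch of getting started (the file path, tree name and branch names below are placeholders, and real LHCb samples may be handled more conveniently with the dedicated LHCb software mentioned above), a downloaded ROOT file from the open data portal could be inspected in Python with the open-source uproot library:

```python
# Illustrative only: peek inside a ROOT file downloaded from the CERN open
# data portal. File path, tree and branch names are hypothetical placeholders.
import uproot

with uproot.open("lhcb_open_data_sample.root") as f:
    print(f.keys())                # objects stored in the file
    tree = f["DecayTree"]          # hypothetical tree name
    print(tree.keys())             # available branches
    # Read two hypothetical branches into NumPy arrays for further analysis
    data = tree.arrays(["B_M", "B_PT"], library="np")
    print(data["B_M"][:10])
```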
Further data releases from the LHCb collaboration are planned in the future.
How are galaxies born, and what holds them together? Astronomers assume that dark matter plays an essential role. However, as yet it has not been possible to prove directly that dark matter exists. A research team including Technical University of Munich (TUM) scientists has now measured for the first time the survival rate of antihelium nuclei from the depths of the galaxy—a necessary prerequisite for the indirect search for dark matter.
Many things point to the existence of dark matter. The way galaxies move within galactic clusters, and how fast stars circle the center of a galaxy, lead to calculations indicating that there must be far more mass present than what we can see. Approximately 85 percent of our Milky Way, for example, consists of a substance that is not visible and can only be detected through its gravitational effects. To this day, it has still not been possible to directly prove the existence of this material.
Several theoretical models of dark matter predict that it could be composed of particles that interact weakly with one another. When these particles interact with one another, antihelium-3 nuclei can be produced, consisting of two antiprotons and one antineutron. These nuclei are also generated in high-energy collisions between cosmic radiation and ordinary matter such as hydrogen and helium—however, with energies different from those expected from the interaction of dark matter particles.
In both processes, the antiparticles originate in the depths of the galaxy, several tens of thousands of light-years away from us. After their creation, a portion of them makes its way in our direction. How many of these particles survive the journey unscathed and reach the vicinity of Earth as messengers of their formation process determines the transparency of the Milky Way to antihelium nuclei.
Until now, scientists have only been able to roughly estimate this value. A better determination of the transparency, which sets how many antinuclei arrive and with what energies, will be important for interpreting future antihelium measurements.
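Schematically (a simplification that ignores propagation effects such as diffusion and energy loss), the transparency can be thought of as a survival probability along the particle’s path through the interstellar gas,

\[
P_{\mathrm{surv}} \;\approx\; \exp\!\Big(-\!\int \sigma_{\mathrm{inel}}(p)\, n(\ell)\, d\ell\Big),
\]

where \(\sigma_{\mathrm{inel}}\) is the inelastic (absorption) cross section of antihelium-3 on the interstellar medium, the quantity that accelerator measurements like the one described below help to pin down, and \(n(\ell)\) is the gas number density along the trajectory.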
LHC particle accelerator as antimatter factory
Researchers from the ALICE collaboration have now carried out measurements that have enabled them to determine the transparency more precisely for the first time. ALICE stands for A Large Ion Collider Experiment and is one of the largest experiments in the world to explore physics on the smallest length scales. ALICE is part of the Large Hadron Collider (LHC) at CERN.
The LHC can generate large amounts of light antinuclei such as antihelium. To do so, protons and lead atoms are each put on a collision course. The collisions produce particle showers which are then recorded by the detector of the ALICE experiment. Thanks to several subsystems of the detector, the researchers can then detect the antihelium-3 nuclei that have formed and follow their trails in the detector material.
This makes it possible to quantify the probability that an antihelium-3 nucleus will interact with the detector material and disappear. Scientists from TUM and the Excellence Cluster ORIGINS have contributed significantly to the analysis of the experimental data.
Galaxy transparent for antinuclei
Using simulations, the researchers were able to transfer the findings from the ALICE experiment to the entire galaxy. The result: About half of the antihelium-3 nuclei which were expected to be generated in the interaction of dark matter particles would reach the vicinity of the Earth. Our Milky Way is thus 50 percent permeable for these antinuclei.
For antinuclei generated in collisions between cosmic radiation and the interstellar medium, the resulting transparency varies from 25 to 90 percent with increasing antihelium-3 momentum. However, these antinuclei can be distinguished from those generated from dark matter based on their higher energy.
This means that antihelium nuclei can not only travel long distances in the Milky Way, but can also serve as important informants in future experiments: depending on how many antinuclei arrive at Earth and with which energies, the origin of these well-traveled messengers, whether cosmic rays or dark matter, can be determined thanks to the new calculations.
Reference for future antinuclei measurements in space
“This is an excellent example of an interdisciplinary analysis that illustrates how measurements at particle accelerators can be directly linked with the study of cosmic rays in space,” says ORIGINS scientist Prof. Laura Fabbietti of the TUM School of Natural Sciences.
The results from the ALICE experiment at the LHC are of great importance for the search for antimatter in space with the AMS-02 module (Alpha Magnetic Spectrometer) on the International Space Station (ISS). Starting in 2025, the GAPS balloon experiment over the Antarctic will also examine incoming cosmic rays for antihelium-3.
The work is published in the journal Nature Physics.
Due to the ever-increasing growth of our data consumption, researchers are looking for faster, more efficient, and more energy-conscious data storage techniques. TU/e researcher Youri van Hees uses ultrashort light pulses to write information to magnetic material, combining the advantages of both optical and magnetic storage.
His thesis cover looks like an old-fashioned British tabloid newspaper: under the large headline “Quick as Flick,” a playful article in the Spintronic Chronicle covers femtomagnetism, photonics and spin transport.
A smart way to create some enthusiasm among a wider audience for complicated matters of technology? “More of a nice extra,” says Youri van Hees, who defended his thesis at the department of Applied Physics on December 7th. Apart from being a researcher down to the last nanometer, he’s also a passionate music lover and longtime fan of progressive rock band Jethro Tull.
“My favorite album, Thick as a Brick, was released exactly fifty years ago. The cover of my thesis is a reference to their album, which came wrapped in a satirical edition of a non-existent local newspaper. Nevertheless, it was quite instructive to write about my research in a way that was appealing to non-physicists. Only then do you realize how inextricably linked you’ve become to certain technology.”
Perhaps it’s the cover of his thesis that will draw the most attention, but its content is equally interesting. Van Hees spent the past four years working with femtosecond lasers, small mirrors, and thin magnetic layers. And successfully so: his research has brought the technology of writing information to a magnetic medium using light one step closer. This is important for data centers, which would like to start using light as the most energy-efficient means of information transmission.
Van Hees said, “People have been using magnets to store data for a long time now. This is done in the form of bits—the familiar zeros and ones—which are like tiny magnetic domains with a north and a south pole. To write data, those poles need to switch. Today, we make them switch by creating a magnetic field: your laptop’s hard disk, for example, contains a small coil with which you can write small magnetic domains. Until now, we have always needed an electronic intermediate step for that, because the coil needs to be actuated. That process costs extra time and energy.”
Nano-sandwich
When he shot an ultrashort laser pulse—“even shorter than one billionth of a second”—at a magnetic material, Van Hees didn’t just notice something happening locally; he also realized that the pulse was able to move electrons, which carry magnetic information. He lined up a few short light pulses like railway wagons and, with the help of various small mirrors, used this setup to influence magnetic materials.
“These ultrafast light pulses allowed us to shift the north and south poles of the magnetic domains, which made it possible to skip the electronic intermediate step. It’s also possible now to know in advance with certainty whether you’ll write a ‘0’ or a ‘1,’ and all without having to know the bit’s initial state. That makes it even more efficient.
“We also investigated which magnetic materials we need to use to make the bits stable with this new light method. We can’t use the standard sandwich formula of layers of cobalt and gadolinium for stable data storage. However, it turns out that an extra atomic layer of the ferromagnetic metal terbium works really well. We are still looking for the right balance because even though we have stable bits now, the magnetization switching is not as efficient as we would like it to be.”
Broken mirror
Does all this mean that we can expect a photonic data storage breakthrough any time soon? Van Hees says that the technology of writing data with light at the laboratory level is becoming mainstream, and that the chip industry is following this kind of research closely. But, he adds with a smile: “The size of our current laser pulse generator is 60x30x30 cm. It will take some time before it can fit in your pocket.”
There’s a short message on the back page of his newspaper cover, on the right beneath the non-collinear puzzle, which illustrates the fact that miniaturization can sometimes be a rough business. A true story, Van Hees says. “We had ordered a special mirror for our setup, but it was too large and didn’t fit. Physics can be frustrating sometimes too, so what do you come up with? Together with a colleague, I went to work on the mirror with a screwdriver. We managed to make it smaller all right, but we couldn’t exactly use it any more either.”
The National Institute for Materials Science (NIMS), the Rutherford Appleton Laboratory and the University of Oxford in the U.K. have experimentally confirmed that cupric oxide exhibits a multiferroic state (i.e., both magnetic and ferroelectric properties) at room temperature under high pressure.
The theoretical model constructed in this research is expected to facilitate the development of next-generation memory devices and optical modulators.
Multiferroic materials are potentially applicable to the development of next-generation memory devices and energy-efficient optical modulators. However, because most of these materials are functional only at temperatures below 100 K, scientists had worked for years to make them exhibit multiferroic properties at room temperature—a requirement for devices that need to operate at ambient temperatures.
This research team focused on cupric oxide—a multiferroic material—because when it is subjected to high pressure, the copper and oxide ions constituting it change their positions relative to each other, significantly increasing the magnetic interactions between them. Because of this, it had been theoretically suggested that cupric oxide could exhibit multiferroic properties at room temperature. However, this had not been experimentally confirmed, owing to the inability to directly measure atomic spin (i.e., atomic-level magnetism) under high pressure.
The research team developed a high-pressure generator that also enables the measurement of atomic spin under high pressure. Using this apparatus, the team confirmed through neutron diffraction experiments that cupric oxide exhibits a multiferroic state at room temperature under high pressure.
In addition, NIMS developed a new calculation method and used it to build a theoretical model which is expected to facilitate the development of room-temperature multiferroic materials. This calculation method was designed to operate effectively without requiring a large number of predetermined assumptions related to the strength of the magnetic interactions taking place between specific copper ions under high pressure.
The cupric oxide compound is able to exhibit its room-temperature multiferroic state only when subjected to a high pressure of 18.5 GPa (185,000 atm). Thin films composed of precisely distorted crystals grown in accordance with the theoretical model may potentially be able to exhibit such properties at ambient atmospheric pressure.
This research was published in the online version of Physical Review Letters on November 15, 2022.
More information: Noriki Terada et al, Room-Temperature Type-II Multiferroic Phase Induced by Pressure in Cupric Oxide, Physical Review Letters (2022). DOI: 10.1103/PhysRevLett.129.217601
What are these microscopic zebra-striped squares, and why did scientists painstakingly arrange more than 6 million of them on a silicon plate just half a centimeter wide?
The squares are a type of diffraction grating, which bends waves as they pass through (similar to the way ocean waves bend when they hit a breakwater). That’s beneficial to neutron scientists, who want to make beams of neutrons more effective at exploring the interiors of objects.
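As a textbook point of reference (not specific to the NIST device), the directions into which a grating bends incoming waves are set by the spacing of its lines,

\[
d\,\sin\theta_m = m\,\lambda, \qquad m = 0, \pm 1, \pm 2, \ldots
\]

where \(d\) is the grating period, \(\lambda\) the wavelength of the incident wave (for neutrons, their de Broglie wavelength) and \(\theta_m\) the angle of the m-th diffraction order.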
Neutrons can penetrate solid objects and reveal internal details that X-rays cannot. While X-rays are absorbed by heavier elements such as the calcium in our bones, neutrons are useful for examining materials that contain the light element hydrogen, which doesn’t stop X-rays but reflects neutrons.
However, scanning beams disperse quickly after emerging from the beam’s aperture. That limits what scientists can see. Researchers at the NIST Center for Neutron Research (NCNR) have been working on a solution, developing an idea that guest scientists from the University of Waterloo recently built.
Here’s how it works: A scanning beam of neutrons passes through the small silicon plate. The neutrons fly through the square grating, twisting between the zebra-striped structures. Their movements cause twisting waves to form on either side of the neutrons. Those waves extend our researchers’ vision farther away from the source than usual.
The groundbreaking accomplishment provides a new avenue for researchers to study next-generation quantum materials.
Very small exhaled droplets, so-called aerosol particles, play an important role in the airborne transmission of pathogens such as the coronavirus. Researchers in the field of fluid mechanics used a model to investigate how exactly the small droplets are formed in the larynx when speaking or singing. The team now reports its results in the current issue of Physics of Fluids. The findings can now help to develop targeted measures to stop chains of infection.
“Every person spreads not only gases but also aerosol particles with the exhaled air. The connection between an increased risk of infection and coughing, singing or speaking loudly suggests that particles are emitted more frequently during these activities,” says Prof. Rüdiger Schwarze, an expert in fluid mechanics at TU Bergakademie Freiberg (Germany).
The team has now investigated for the first time how the particles are created in the larynx, using a model of the human vocal folds. “For protection, the vocal folds are covered with a thin, gel-like layer of liquid called mucus. When speaking, they are adducted by the laryngeal muscles and induced to oscillate by the exhaled airflow. Depending on oscillation frequency and airflow, different sounds are produced,” explains first author Lisa Fritzsche, who developed the model used for the experiments. The model is made of Perspex, with artificial vocal folds made of silicone.
To obtain realistic properties of the silicone vocal folds, they were surface-modified at the Freiberg Research Institute for Leather and Plastic Sheeting (FILK Freiberg Institute). The model shows how the mucus forms a liquid film between the oscillating vocal folds. Then the exhaled air inflates the film, creating a bubble. When this bubble bursts, a large number of small droplets are formed, which are transported into the mouth with the airflow and then exhaled as an aerosol.
Detailed measurements using a model experiment
Employing high-resolution cameras and a special optical setup, the researchers were able to measure how different oscillations of the vocal folds affect the size distribution of the aerosol particles.
“If high-pitched tones are produced by fast oscillations, mainly smaller aerosol particles, about the size of a grain of dust, are emitted. Exhaling more air with louder tones, however, increases the proportion of larger aerosol particles that are about the size of a grain of sand,” Lisa Fritzsche summarizes.
Basis for targeted measures to break infection chains
The results show which mechanisms are responsible for the formation of the aerosols at the vocal folds and how speech volume and pitch influence the droplet sizes. “What we now need to investigate further are the properties of the mucus and how these relate to the size of the exhaled aerosol particles,” says Prof. Rüdiger Schwarze.
If, in the future, the mucus of an infected person could, for example, be specifically influenced by drugs, the risk of infection for contact persons could be reduced. In further studies, the team also wants to investigate the further path of the aerosols in the pharynx in more detail.
More information: L. Fritzsche et al, Toward unraveling the mechanisms of aerosol generation during phonation, Physics of Fluids (2022). DOI: 10.1063/5.0124944