Researchers create edible, transparent composite packaging with biocellulose

by Society of Chemical Industry

By incorporating soy protein into the structure and coating it with an oil-resistant composite, the CUHK team successfully created an edible, transparent, and robust BC-based composite packaging. Credit: To Ngai

Plastic food packaging accounts for a significant proportion of plastic waste in landfills. In the face of escalating environmental concerns, researchers are looking to bio-derived alternatives.

Now, scientists at The Chinese University of Hong Kong (CUHK) have developed an edible, transparent and biodegradable material with considerable potential for application in food packaging. Their work is published in the Journal of the Science of Food and Agriculture.

The heavy reliance on petrochemicals and the inherent non-biodegradability of plastic packaging mean that it has long been a significant contributor to environmental contamination. A team at CUHK has turned its attention to bacterial cellulose (BC), an organic compound derived from certain types of bacteria that has garnered attention as a sustainable, readily available, and non-toxic alternative to the pervasive use of plastics.

Professor To Ngai from the Department of Chemistry, CUHK and corresponding author of the study, explained that the impressive tensile strength and high versatility of BC are the key to its potential.

He said, “Extensive research has been conducted on BC, including its use in intelligent packaging, smart films, and functionalized materials created through blending, coating, and other techniques. These studies demonstrate the potential of BC as a replacement for single-use plastic packaging materials, making it a logical starting point for our research.”

Unlike the cellulose found in the cell walls of plants, BC can be produced through microbial fermentation, which eliminates the need for harvesting trees or crops. Ngai noted that as a result, “…this production method does not contribute to deforestation or habitat loss, making BC a more sustainable and environmentally friendly material alternative to plant cellulose.”

Up until now, the widespread adoption of BC has been limited by its unfavorable sensitivity to moisture in the air (hygroscopicity), which detrimentally impacts its physical properties.

In the paper, the researchers laid out a novel approach to address the limitations of BC-based materials. By incorporating certain soy proteins into the structure and coating it with an oil-resistant composite, they successfully created an edible, transparent, and robust BC-based composite packaging.

Bacterial cellulose (BC) – an organic compound derived from certain types of bacteria which has garnered attention as a sustainable, easily available, and non-toxic solution to the pervasive use of plastics. Credit: To Ngai

Ngai noted that this approach has a high feasibility for scale-up: “It does not require specific reaction conditions like chemical reactions, but rather a simple and practical method with mixing and coating. This approach offers a promising solution to the challenge of developing sustainable and environmentally friendly packaging materials that can replace single-use plastics on a large scale.”

The study demonstrated that the plastic alternative could be completely degraded within 1-2 months. Unlike other bio-derived plastics such as polylactic acid, the BC-based composite does not require specific industrial composting conditions to degrade.

Ngai explained, “The material developed in this research is completely edible, making it safe for turtles and other sea animals to consume without causing aquatic toxicity in the ocean.”

The researchers at CUHK are now exploring the directions for future research. They hope to enhance the versatility of modified BC films, making them suitable for a wider range of applications. Specifically, they are focused on developing a thermosetting glue that can create strong bonds between bacterial cellulose, allowing it to be easily molded into various shapes when heated.

“One of the main challenges with bacterial cellulose films is that they are not thermoplastic, which limits their potential for use in certain applications. By addressing this issue, we hope to make bacterial cellulose films more competitive with traditional plastics while maintaining their eco-friendliness,” explained Ngai.

Ngai hopes that the current study will help to combat the excessive use of single-use plastics, which can persist for hundreds of years after only a few days of being displayed on supermarket shelves.

“This research serves as a reminder that natural raw materials may already possess the necessary characteristics to perform beyond the functions of plastic packaging,” he concluded.

More information: Ka Man Cheung et al, Edible, strong, and low‐hygroscopic bacterial cellulose derived from biosynthesis and physical modification for food packaging, Journal of the Science of Food and Agriculture (2023). DOI: 10.1002/jsfa.12758

Journal information: Journal of the Science of Food and Agriculture 

Provided by Society of Chemical Industry

Physicists discover an exotic material made of bosons

Bosonic correlated insulator. (A) Illustration of a bosonic correlated insulator consisting of interlayer excitons. Magenta spheres indicate holes and cyan spheres, electrons. (Inset) Type II band alignment of the WSe2/WS2 heterostructure. (B) Schematics of continuous-wave pump-probe spectroscopy. The exciton and electron densities are independently controlled by pump light and an electrostatic gate. Red and green shading correspond to wide-field pump light and focused probe light, respectively. (C and E) Gate-dependent PL (C) and absorption (E) spectra of a 60°-aligned WSe2/WS2 moiré bilayer (device D1) at zero pump intensity. The PL peak shows a sudden blue shift at electron fillings νe = 1 and 2 (yellow arrows), where the absorption spectrum shows kinks and splitting. (D and F) Pump intensity–dependent PL (D) and absorption (F) spectra of device D1 at charge neutrality. Right axes show the dipolar-interaction–induced interlayer exciton energy shift Δdipole, which is approximately proportional to νex. The dominant PL peaks in (D) at low and high pump intensity are labeled as peak I and II, respectively. All measurements are performed at a base temperature of 1.65 K. Credit: Science (2023). DOI: 10.1126/science.add5574

Take a lattice—a flat section of a grid of uniform cells, like a window screen or a honeycomb—and lay another, similar lattice above it. But instead of trying to line up the edges or the cells of both lattices, give the top grid a twist so that you can see portions of the lower one through it. This new, third pattern is a moiré, and it is in this kind of overlapping arrangement of tungsten diselenide and tungsten disulfide lattices that UC Santa Barbara physicists found some interesting material behaviors.

“We discovered a new state of matter—a bosonic correlated insulator,” said Richen Xiong, a graduate student researcher in the group of UCSB condensed matter physicist Chenhao Jin, and the lead author of a paper that appears in the journal Science.

According to Xiong, Jin and collaborators from UCSB, Arizona State University and the National Institute for Materials Science in Japan, this is the first time such a material—a highly ordered crystal of bosonic particles called excitons—has been created in a “real” (as opposed to synthetic) matter system.

“Conventionally, people have spent most of their efforts to understand what happens when you put many fermions together,” Jin said. “The main thrust of our work is that we basically made a new material out of interacting bosons.”

Bosonic, correlated, insulator

Subatomic particles come in one of two broad types: fermions and bosons. One of the biggest distinctions is in their behavior, Jin said.

“Bosons can occupy the same energy level; fermions don’t like to stay together,” he said. “Together, these behaviors construct the universe as we know it.”

Fermions, such as electrons, underlie the matter with which we are most familiar as they are stable and interact through the electrostatic force. Meanwhile bosons, such as photons (particles of light), tend to be more difficult to create or manipulate as they are either fleeting or do not interact with each other.

A clue to their distinct behaviors is in their different quantum mechanical characteristics, Xiong explained. Fermions have half-integer “spins” such as 1/2 or 3/2 et cetera, while bosons have whole integer spins (1, 2, etc.). An exciton is a state in which a negatively charged electron (a fermion) is bound to its positively charged opposite “hole” (another fermion), with the two half-integer spins together becoming a whole integer, creating a bosonic particle.
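As a minimal sketch of that spin arithmetic (the standard quantum-mechanical rule for adding two spin-1/2 particles, not a calculation from the paper):

$$ s_e = \tfrac{1}{2}, \qquad s_h = \tfrac{1}{2} \quad \Rightarrow \quad S_{\mathrm{exciton}} \in \{0,\, 1\}, $$

an integer either way, which is why the bound electron–hole pair obeys bosonic statistics.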

To create and identify excitons in their system, the researchers layered the two lattices and shone strong lights on them in a method they call “pump-probe spectroscopy.” The combination of particles from each of the lattices (electrons from the tungsten disulfide and the holes from the tungsten diselenide) and the light created a favorable environment for the formation of and interactions between the excitons while allowing the researchers to probe these particles’ behaviors.

“And when these excitons reached a certain density, they could not move anymore,” Jin said. Thanks to strong interactions, the collective behaviors of these particles at a certain density forced them into a crystalline state, and created an insulating effect due to their immobility.

“What happened here is that we discovered the correlation that drove the bosons into a highly ordered state,” Xiong added. Generally, a loose collection of bosons under ultracold temperatures will form a condensate, but in this system, with both light and increased density and interaction at relatively higher temperatures, they organized themselves into a symmetric solid and charge-neutral insulator.

The creation of this exotic state of matter proves that the researchers’ moiré platform and pump-probe spectroscopy could become an important means for creating and investigating bosonic materials.

“There are many-body phases with fermions that result in things like superconductivity,” Xiong said. “There are also many-body counterparts with bosons that are also exotic phases. So what we’ve done is create a platform, because we did not really have a great way to study bosons in real materials.” While excitons are well studied, he added, until this project there hadn’t been a way to coax them into interacting strongly with one another.

With their method, according to Jin, it could be possible to not only study well-known bosonic particles like excitons but also open more windows into the world of condensed matter with new bosonic materials.

“We know that some materials have very bizarre properties,” he said. “And one goal of condensed matter physics is to understand why they have these rich properties and find ways to make these behaviors come out more reliably.”

More information: Richen Xiong et al, Correlated insulator of excitons in WSe2/WS2 moiré superlattices, Science (2023). DOI: 10.1126/science.add5574

Journal information: Science 

Provided by University of California – Santa Barbara 

Unraveling the role of the NiO electrocatalyst in alcohol electrooxidation reactions

The synergy between EOM-HAT and the hydration of aldehyde results in the electrooxidation of R-CH2OH to R-COOH. The synergy between EOM-HAT and EOM-Cleavage of the C-C bond causes the electrooxidation of R-CHOH-CH2OH to R-COOH and HCOOH. Credit: Science China Press

A study led by Dr. Wei Chen, Prof. Yuqin Zou, and Prof. Shuangyin Wang (State Key Laboratory of Chemo/Bio-Sensing and Chemometrics, College of Chemistry and Chemical Engineering, Advanced Catalytic Engineering Research Center of the Ministry of Education, Hunan University) unravels the reaction mechanism of the primary alcohol/vicinal diol electrooxidation reaction on NiO, especially for the synergy between the electrochemical and non-electrochemical steps.

The alcohol electrooxidation on NiO is an indirect electrooxidation reaction with electrophilic oxygen species as the redox mediator. The electrochemical step is therefore the electrochemical generation of electrophilic oxygen species. The research team identified two mechanisms by which the NiO electrocatalyst functions: the electrophilic oxygen-mediated mechanism involving hydrogen atom transfer (EOM-HAT) and the electrophilic oxygen-mediated mechanism involving C-C bond cleavage (EOM-Cleavage of the C-C bond).

The synergy between EOM-HAT and the hydration of aldehyde results in the electrooxidation of primary alcohol (R-CH2OH) to carboxylic acid (R-COOH) on NiO. On the other hand, the synergy between EOM-HAT and EOM-Cleavage of the C-C bond causes the electrooxidation of vicinal diol (R-CHOH-CH2OH) to R-COOH and formic acid (HCOOH) on NiO.

This study, published in the journal National Science Review, highlights the synergy between the electrochemical and non-electrochemical steps in the alcohol electrooxidation reaction. It establishes a unified reaction mechanism of alcohol electrooxidation reaction based on nickel-based electrocatalysts.

More information: Wei Chen et al, Unraveling the electrophilic oxygen-mediated mechanism for alcohol electrooxidation on NiO, National Science Review (2023). DOI: 10.1093/nsr/nwad099

Provided by Science China Press 

Physicists develop powerful alternative to dynamic density functional theory

Illustration of a unidirectional flow as investigated in the new study using a Lennard-Jones fluid as an example. The three-dimensional nonequilibrium system is set in motion (red arrows) by a force field (blue arrows) acting along the x-axis. Credit: Matthias Schmidt

Living organisms, ecosystems and the planet Earth are, from a physics point of view, examples of extraordinarily large and complex systems that are not in thermal equilibrium. To physically describe non-equilibrium systems, dynamic density functional theory has been used to date.

However, this theory has weaknesses, as physicists from the University of Bayreuth have now shown in an article published in the Journal of Physics: Condensed Matter. Power functional theory proves to perform substantially better—in combination with artificial intelligence methods, it enables more reliable descriptions and predictions of the dynamics of non-equilibrium systems over time.

Many-particle systems are systems composed of atoms, electrons, molecules, or other particles invisible to the eye. They are in thermal equilibrium when the temperature is balanced and no heat flow occurs. A system in thermal equilibrium changes its state only when external conditions change. Density functional theory is tailor-made for the study of such systems.

For more than half a century, it has proven its unrestricted value in chemistry and materials science. Based on a powerful classical variant of this theory, states of equilibrium systems can be described and predicted with high accuracy. Dynamic density functional theory (DDFT) extends the scope of this theory to non-equilibrium systems. This involves the physical understanding of systems whose states are not fixed by their external boundary conditions.

These systems have a dynamic of their own: they have the ability to change their states without external influences acting on them. Findings and application methods of DDFT are therefore of great interest, for example, for the study of models for living organisms or microscopic flows.

The error potential of dynamic density functional theory

However, DDFT uses an auxiliary construction to make non-equilibrium systems accessible to physical description. It translates the continuous dynamics of these systems into a temporal sequence of equilibrium states. This results in a potential for errors that should not be underestimated, as the Bayreuth team led by Prof. Dr. Matthias Schmidt shows in the new study.
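For context, the equation of motion used in standard (overdamped) DDFT can be written schematically as follows; this is the textbook form, not a formula quoted from the Bayreuth paper:

$$ \frac{\partial \rho(\mathbf{r},t)}{\partial t} = \nabla \cdot \left[ \Gamma \, \rho(\mathbf{r},t) \, \nabla \frac{\delta F[\rho]}{\delta \rho(\mathbf{r},t)} \right], $$

where ρ(r, t) is the one-body density, Γ a mobility constant, and F[ρ] the equilibrium free-energy functional. Because the driving force is derived from an equilibrium functional, the time evolution is in effect a chain of instantaneous equilibrium states, which is exactly the auxiliary construction described above.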

The investigations focused on a comparatively simple example—the unidirectional flow of a gas known in physics as a “Lennard-Jones fluid.” If this nonequilibrium system is interpreted as a chain of successive equilibrium states, one aspect involved in the time-dependent dynamics of the system is neglected, namely the flow field. As a result, DDFT may provide inaccurate descriptions and predictions.

“We do not deny that dynamic density functional theory can provide valuable insights and suggestions when applied to nonequilibrium systems under certain conditions. The problem, however, and we want to draw attention to this in our study using fluid flow as an example, is that it is not possible to determine with sufficient certainty whether these conditions are met in any particular case. The DDFT does not provide any control over whether the restricted framework conditions are given under which it enables reliable calculations. This makes it all the more worthwhile to develop alternative theoretical concepts for understanding nonequilibrium systems,” says Prof. Dr. Daniel de las Heras, first author of the study.

Power functional theory proves to perform substantially better

For ten years, the research team around Prof. Dr. Matthias Schmidt has been making significant contributions to the development of a still young physical theory, which has so far proven to be very successful in the physical study of many-particle systems: power functional theory (PFT). The physicists from Bayreuth are pursuing the goal of being able to describe the dynamics of non-equilibrium systems with the same precision and elegance with which classical density functional theory enables the analysis of equilibrium systems.

In their new study, they now use the example of a fluid flow to show that power functional theory is significantly superior to DDFT when it comes to understanding non-equilibrium systems. PFT allows the dynamics of these systems to be described without having to take a detour via a chain of successive equilibrium states in time. The decisive factor here is the use of artificial intelligence. Machine learning opens up the time-dependent behavior of the fluid flow by including all factors relevant to the system’s inherent dynamics—including the flow field. In this way, the team has even succeeded in controlling the flow of the Lennard-Jones fluid with high precision.

“Our investigation provides further evidence that power functional theory is a very promising concept that can be used to describe and explain the dynamics of many-particle systems. In Bayreuth, we intend to further elaborate this theory in the coming years, applying it to nonequilibrium systems that have a much higher degree of complexity than the fluid flow we studied. In this way, the PFT will be able to supersede the dynamic density functional theory, whose systemic weaknesses it avoids according to our findings so far. The original density functional theory, which is tailored to equilibrium systems and has proven its worth, is retained as an elegant special case of PFT,” says Prof. Dr. Matthias Schmidt, who is chair of theoretical physics II at the University of Bayreuth.

More information: Daniel de las Heras et al, Perspective: How to overcome dynamical density functional theory, Journal of Physics: Condensed Matter (2023). DOI: 10.1088/1361-648X/accb33

Provided by Bayreuth University

Multifunctional self-healing liquid metal hydrogel developed for human-computer interaction

Schematic structure and application of the liquid metal hydrogel. Credit: Li Xiaofei

Recently, researchers from the Hefei Institutes of Physical Science (HFIPS) of the Chinese Academy of Sciences, led by Prof. Tian Xingyou and Prof. Zhang Xian, along with Associate Prof. Yang Yanyu from the College of Materials Science and Engineering at Zhengzhou University, used gallium indium alloy (EGaIn) to initiate the polymerization and serve as flexible fillers to construct a liquid metal/polyvinyl alcohol (PVA)/P(AAm-co-SMA) double network hydrogel.

“The resulting material was super-stretchable and self-healing,” said Li Xiaofei, first author of the paper. “It will promote the research and practical application of hydrogels and liquid metal in intelligent devices and military fields.”

The study was published in Materials Horizons.

Most conductive hydrogels suffer from subpar mechanical qualities and lack desirable self-recovery and self-healing abilities, severely limiting hydrogels’ potential uses. Liquid metals like gallium indium alloy (EGaIn) can toughen polymers by conforming to their changing shapes. Also, gallium (Ga) in EGaIn can initiate vinyl monomer’s free radical polymerization.

In this research, the team built a liquid metal/PVA/P(AAm-co-SMA) double network hydrogel (LM hydrogel) with EGaIn serving as both the polymerization initiator and the flexible fillers.

The PVA network used PVA microcrystals and coordination interaction of Ga3+ and PVA as cross-links, while the P(AAm-co-SMA) network used hydrophobic association and the EGaIn microspheres. The LM hydrogel was endowed with excellent super-stretchability (2000%), toughness (3.00 MJ/m3), notch resistance, and self-healing property (> 99% at 25 °C after 24 h) due to the multiple physical cross-links and the synergistic effect of the rigid PVA microcrystal network and the ductile P(AAm-co-SMA) hydrophobic network.

“The sensors developed for it can be used in health monitoring and motion identification through human-computer interaction,” said Li Xiaofei, “thanks to the LM hydrogel’s sensitive strain sensing capability.”

As a result of EGaIn’s low infrared emissivity and remarkable photothermal properties, the LM hydrogel shows considerable promise for infrared camouflage.

More information: Xiaofei Li et al, Self-healing liquid metal hydrogel for human–computer interaction and infrared camouflage, Materials Horizons (2023). DOI: 10.1039/D3MH00341H

Journal information: Materials Horizons 

Provided by Chinese Academy of Sciences 

CERN experiment may help physicists work out the content of neutrino beams

SHINE shines a light on neutrino beams
Schematic top-view layout of the NA61/SHINE experiment in the configuration used during the 2017 proton data taking. In 2016 the forward time projection chambers were not present. The S5 scintillator was not used in this trigger configuration. Credit: Physical Review D (2023). DOI: 10.1103/PhysRevD.107.072004

At the time of the Big Bang, 13.8 billion years ago, every particle of matter is thought to have been produced together with an antimatter equivalent of opposite electrical charge. But in the present-day universe, there is much more matter than antimatter. Why this is the case is one of physics’ greatest questions.

The answer may lie, at least partly, in particles called neutrinos, which lack electrical charge, are almost massless and change their identity—or “oscillate”—from one of three types to another as they travel through space. If neutrinos oscillated in a different way to their antimatter equivalents, antineutrinos, they could help explain the matter–antimatter imbalance in the universe.

Experiments across the world, such as the NOvA experiment in the US, are investigating this possibility, as will next-generation experiments including DUNE. In these long-baseline neutrino-oscillation experiments, a beam of neutrinos is measured after it has traveled a long distance—the long baseline. The experiment is then run with a beam of antineutrinos, and the outcome is compared with that of the neutrino beam to see if the two twin particles oscillate in a similar or different way.

This comparison depends on an estimation of the numbers of neutrinos in the neutrino and antineutrino beams before they travel. These beams are produced by firing beams of protons onto fixed targets. The interactions with the target create other hadrons, which are focused using magnetic “horns” and directed into long tunnels in which they transform into neutrinos and other particles. But in this multi-step process, it isn’t easy to work out the particle content of the resulting beams—including the number of neutrinos they contain—which depends directly on the proton–target interactions.
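For example (a standard textbook channel rather than a detail spelled out in this article), most of the muon-neutrino flux in such beams comes from charged pions decaying in flight:

$$ \pi^{+} \to \mu^{+} + \nu_{\mu}, \qquad \pi^{-} \to \mu^{-} + \bar{\nu}_{\mu}. $$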

Enter the NA61 experiment at CERN, also known as SHINE. Using high-energy proton beams from the Super Proton Synchrotron and appropriate targets, the experiment can recreate the relevant proton–target interactions. NA61/SHINE has previously made measurements of electrically charged hadrons that are produced in the interactions and yield neutrinos. These measurements helped improve estimations of the content of neutrino beams used at existing long-baseline experiments.

The NA61/SHINE collaboration has now released new hadron measurements that will help improve these estimations further. This time around, using a proton beam with an energy of 120 GeV and a carbon target, the collaboration measured three kinds of electrically neutral hadrons that decay into neutrino-yielding charged hadrons.

This 120-GeV proton–carbon interaction is used to produce NOvA’s neutrino beam, and it will probably also be used to create DUNE’s beam. Estimations of the numbers of the different neutrino-yielding neutral hadrons that the interaction produces rely on computer simulations, the output of which varies significantly depending on the underlying physics details.

“Up to now, simulations for neutrino experiments that use this interaction have relied on uncertain extrapolations from older measurements with different energies and target nuclei. This new direct measurement of particle production from 120-GeV protons on carbon reduces the need for these extrapolations,” explains NA61/SHINE deputy spokesperson Eric Zimmerman.

The paper is published in the journal Physical Review D.

More information: H. Adhikary et al, Measurements of K0S, Λ, and Λ̄ production in 120 GeV/c p+C interactions, Physical Review D (2023). DOI: 10.1103/PhysRevD.107.072004

Provided by CERN 

Researchers discover a new way to develop drugs without side effects

An artist’s impression of the GPCR activation from inside the cell and the resulting targeted response. Credit: Kobayashi and Kawakami et al., 2023

Have you ever wondered how drugs reach their targets and achieve their function within our bodies? If a drug molecule or a ligand is a message, its inbox is typically a receptor in the cell membrane. One such receptor involved in relaying molecular signals is the G protein-coupled receptor (GPCR). About one-third of existing drugs work by controlling the activation of this type of receptor. Japanese researchers now reveal a new way of activating GPCRs by triggering shape changes in the intracellular region of the receptor. This new process can help researchers design drugs with fewer or no side effects.

If the cell membrane is like an Oreo cookie sandwich, a GPCR is like a snake with seven segments traversing in and out of the cookie sandwich surface. The extracellular loops are the inbox for messages. When a message molecule binds to the extracellular side of the receptor, it triggers a shape change that activates G proteins and the β-arrestin protein attached to the intracellular side of the receptor. Like a molecular relay, the information passes downstream and affects various bodily processes. That is how we see, smell, and taste: light, odor, and taste molecules all act as messages.

Adverse side effects ensue if drugs acting on GPCRs activate multiple signaling pathways rather than a specific target pathway. That is why drug development focuses on activating specific molecular signaling pathways within cells. Activating the GPCR from inside the cell rather than outside the cell could be one way to achieve specificity. But until now, there was no evidence of direct activation of only the intracellular side of GPCRs without initiation from the extracellular side.

A team of researchers headed by Osamu Nureki, a professor at the University of Tokyo, and his lab, discovered a new receptor activation mode of a bone metabolism-related GPCR called human parathyroid hormone type 1 receptor (PTH1R) without signal transduction from the extracellular side.

“Understanding the molecular mechanism will enable us to design optimal drugs,” says Kazuhiro Kobayashi, a doctoral student and an author of the study. Such a drug offers “a promising treatment for osteoporosis.”

Kobayashi has been conducting research on bone formation in animal models since he was an undergrad. “Treatments for osteoporosis that target PTH1R require strict dosage, have administrative restrictions, and there aren’t yet any better alternatives,” he says. That motivated their team to look for better drug design strategies targeting the parathyroid hormone receptor.

To understand function through structure, they used cryo-electron microscopy and revealed the 3D structure of the PTH1R and G protein bound to a message molecule. The team synthesized a non-peptide message molecule called PCO371 which binds to the intracellular region of the receptor and interacts directly with G protein subunits. In other words, PCO371 activates the receptor after entering the cell.

The PCO371-bound PTH1R structure shows that the compound can directly and stably modulate the intracellular side of PTH1R. And because PCO371 activates only the G protein and not β-arrestin, it does not cause side effects. This specificity of its binding and receptor activation mode makes it a suitable candidate for potential small-molecule-based drugs for class B1 GPCRs, like PTH1R, which currently lack orally administered drug ligands. Such drugs would have reduced adverse effects and burdens on patients as they act on specific molecular pathways.

The findings from this study will help “develop new drugs for disorders such as obesity, pain, osteoporosis, and neurological disorders.”

The study appears in the journal Nature.

More information: Kazuhiro Kobayashi et al, Class B1 GPCR activation by an intracellular agonist, Nature (2023). DOI: 10.1038/s41586-023-06169-3

Journal information: Nature 

Provided by University of Tokyo 

Record precision achieved in measuring muonic helium-3 nucleus radius

An international research team led by the Paul Scherrer Institute PSI has measured the radius of the nucleus of muonic helium-3 with unprecedented precision. The results are an important stress test for theories and future experiments in atomic physics.

1.97007 femtometers (quadrillionths of a meter): That’s how unimaginably tiny the radius of the atomic nucleus of helium-3 is. This is the result of an experiment at PSI that has now been published in the journal Science.

More than 40 researchers from international institutes collaborated to develop and implement a method that enables measurements with unprecedented precision. This sets new standards for theories and further experiments in nuclear and atomic physics.

This demanding experiment is only possible with the help of PSI’s proton accelerator facility. There Aldo Antognini’s team generates so-called muonic helium-3, in which the two electrons of the helium atom are replaced by an elementary particle called a muon. This allows the nuclear radius to be determined with high precision.

With the measurement of helium-3, the experiments on light muonic atoms have now been completed for the time being. The researchers had previously measured muonic helium-4 and, a few years ago, the atomic nucleus of muonic hydrogen and deuterium.

Muonic helium-3: Twice as slimmed-down

Helium-3 is the lighter cousin of ordinary helium, helium-4. The helium-4 nucleus has two protons and two neutrons (hence the 4 after the element’s abbreviation); in helium-3, one of the neutrons is missing. The simplicity of this slimmed-down atomic nucleus is very interesting to Aldo Antognini and other physicists.

The helium-3 that PSI physicist and ETH Zurich professor Antognini is using in the current experiment lacks not only a neutron in the nucleus, but also both electrons that orbit this nucleus. The physicists replace the electrons with a negatively charged muon—hence the name muonic helium-3.

The muon is around 200 times heavier than the electron and therefore sits much closer to the nucleus. Thus, the nucleus and the muon “sense” each other much more intensely, and the wave functions overlap more strongly, as they say in physics.
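A rough way to see this (a textbook Bohr-model estimate, ignoring reduced-mass corrections) is that the orbital radius scales inversely with the mass of the orbiting particle:

$$ a \propto \frac{1}{m} \quad \Rightarrow \quad \frac{a_{\mu}}{a_{e}} \approx \frac{m_{e}}{m_{\mu}} \approx \frac{1}{200}, $$

so the muon orbits roughly 200 times closer to the nucleus than an electron would.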

That makes the muon the perfect probe for measuring the nucleus and its charge radius, which indicates the region over which the positive charge of the nucleus is distributed. Conveniently for the researchers, this charge radius does not change when the electrons are replaced by a muon.

Antognini has experience in measuring muonic atoms. A few years ago, he carried out the same experiment with muonic hydrogen, which contains only one proton in the nucleus and whose one electron was replaced by a negatively charged muon. The results caused quite a commotion at the time, because the deviation from measurements based on other methods was surprisingly large. Some critics even considered them wrong. It has now been confirmed many times over: The results were correct.

Worldwide-unique facility enables experiments

This time Antognini will not need to exercise as much persuasive power. For one thing, he has established himself as the leading expert in this area of research. Another factor is that there was no big surprise this time. The current results from muonic helium-3 fit well with those from previous experiments in which other methods were used. However, the PSI team’s measurements are around 15 times more precise.

Negatively charged muons, and plenty of them, are the most important ingredient for the experiment. These must have a very low energy—that is, they must be very slow, at least by the standards of particle physics.

At PSI, around 500 muons per second with energies of one kiloelectron-volt can be generated. This makes the PSI proton accelerator facility, with its beamline developed in-house, the only one in the world that can deliver such slow negative muons in such large numbers.

New standards in nuclear physics
PSI physicist Aldo Antognini is pleased that he and his team, within an international collaboration, have achieved yet another fundamental result in atomic physics. Credit: Scanderbeg Sauer Photography

Laser developed in-house was crucial for success

A crucial share of the success is due to a laser system that the researchers themselves developed. There the challenge is that the laser must fire immediately when a muon flies into the experimental setup.

To make this possible, Antognini and his team installed an extremely thin foil detector in front of the airless experimental chamber. This detects when a muon passes through the foil and signals the laser to emit a pulse of light immediately and at full power.

The researchers determine the charge radius indirectly by measuring the frequency of the laser light. When the laser frequency precisely matches the resonance of a specific atomic transition, the muon is briefly excited to a higher energy state before decaying to the ground state within picoseconds; at that point it will emit a photon in the form of an X-ray.

Finding the resonance frequency at which this transition occurs requires a lot of patience, but the reward is an extremely accurate value for the charge radius of the nucleus.

New benchmark for theoretical modeling

The charge radii obtained from muonic helium-3 and helium-4 serve as important reference values for modern ab initio theories—that is, physical models that calculate the properties of complex physical systems directly from the fundamental laws of physics, without resorting to experimental data.

In the context of nuclear physics, these models offer detailed insights into the structure of light atomic nuclei and the forces between their building blocks, the protons and neutrons.

Precise knowledge of these nuclear radii is also crucial for comparisons with ongoing experiments on conventional helium ions with one electron and on neutral helium atoms with two electrons. Such comparisons provide stringent tests of quantum electrodynamics (QED) in few-body systems—the fundamental theory that describes how charged particles interact through the exchange of photons. They allow researchers to test the predictive power of our most fundamental understanding of atomic structure.

These efforts could lead to new insights into QED for bound systems—that is, systems such as atoms, in which particles are not free but bound to each other by forces—or perhaps even to indications of physical effects beyond the Standard Model of particle physics.

Follow-up experiments are currently being conducted by research teams in Amsterdam, Garching, and China, as well as in Switzerland by the Molecular Physics and Spectroscopy group led by Frédéric Merkt at ETH Zurich.

Antognini also has additional ideas for future experiments aimed at testing the theories of atomic and nuclear physics with even greater precision. One idea is to measure hyperfine splitting in muonic atoms. This refers to energy transitions between split energy levels that reveal deeper details about effects in the atomic nucleus that involve spin and magnetism.

An experiment with muonic hydrogen is currently being prepared, and an experiment with muonic helium is planned. “Many people who work in nuclear physics are very interested in it and are eagerly awaiting our results,” Antognini says.

But the energy density of the laser must be increased significantly, which will require an enormous advance in laser technology. This development is currently under way at PSI and ETH Zurich.

Functional analyses of RNA-related enzymes using a next-generation DNA sequencer

Overview of tRNA-MaP. Credit: Ehime University, Ryota Yamagami, Hiroyuki Hori

Genetic information encoded in genomic DNA is transcribed to mRNAs and then the codons on mRNA are decoded by transfer RNAs (tRNAs) during protein synthesis. tRNAs deliver amino acids to ribosomes and proteins are synthesized from the amino acids on the ribosomes according to the decoded genetic information. Therefore, tRNA plays a key role during the translation of genetic information.

tRNAs contain numerous modified nucleosides, which regulate the accuracy and efficiency of protein synthesis. Modified nucleosides in tRNA are synthesized by tRNA modification enzymes. Therefore, unveiling the mechanisms by which tRNA modification enzymes selectively distinguish substrate tRNAs from non-substrate RNAs, as well as when, where, and how many tRNAs are modified by these enzymes, is of crucial importance for understanding the protein synthesis machinery.

Addressing these key questions is, however, challenging due to the lack of a high-throughput technique that identifies the characteristic properties of tRNA modification enzymes.

To overcome this issue, Drs. Yamagami and Hori at Ehime University applied next-generation DNA sequencing technology to functional analyses of tRNA modification enzymes and developed a new high-throughput assay method, “tRNA-MaP.”

The tRNA-MaP technique can rapidly screen an RNA pool consisting of more than 5,000 RNA species and identify the substrate tRNAs of the target tRNA modification enzyme(s) with sensitivity comparable to already-established methods. Using tRNA-MaP in combination with protein orthology analyses, the researchers predicted numerous natural modifications in Geobacillus stearothermophilus tRNAs.

Furthermore, they analyzed the substrate recognition mechanism of G. stearothermophilus tRNA m1A22 methyltransferase (TrmK), which methylates adenosine at position 22 to 1-methyladenosine (m1A22) in tRNA, using tRNA-MaP. Mutational profiling revealed that TrmK selects a subset of tRNAs as its substrates.
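To illustrate the general idea behind mutational profiling (a conceptual sketch, not the authors' actual pipeline: in MaP-style assays a modification such as m1A causes the reverse transcriptase to misincorporate bases, so the mismatch rate at a position in the sequencing reads serves as a proxy for how strongly that position is modified), per-variant signals could be computed along the following lines. All names, positions, and input formats below are illustrative assumptions.

def mismatch_rates(reference: str, aligned_reads: list[str]) -> list[float]:
    """Fraction of reads differing from the reference at each position.

    Assumes every read is already aligned, gapless, and the same length
    as the reference (a toy simplification).
    """
    counts = [0] * len(reference)
    for read in aligned_reads:
        for i, (ref_base, read_base) in enumerate(zip(reference, read)):
            if read_base != ref_base:
                counts[i] += 1
    n = len(aligned_reads)
    return [c / n for c in counts] if n else [0.0] * len(reference)


def modification_signal(variants: dict[str, tuple[str, list[str]]],
                        target_pos: int) -> dict[str, float]:
    """Mismatch rate at one target position (0-based) for each tRNA variant."""
    signal = {}
    for name, (reference, reads) in variants.items():
        rates = mismatch_rates(reference, reads)
        signal[name] = rates[target_pos] if target_pos < len(rates) else 0.0
    return signal


# Toy usage with made-up sequences: a high rate at the target position would
# suggest that this variant is a substrate of the enzyme being assayed.
toy_data = {"tRNA-Leu_wildtype": ("GGAUACGU",
                                  ["GGAUACGU", "GGACACGU", "GGACACGU"])}
print(modification_signal(toy_data, target_pos=3))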

Using 240 variants of G. stearothermophilus tRNALeu transcripts, the researchers found that U8, A14, G15, G18, G19, U55, Purine57 and A58 are important for the methylation by TrmK. In addition, based on the recognition sites in tRNA and the crystal structure of TrmK, a docking model between TrmK and tRNA has been constructed.

This study, now published in the Journal of Biological Chemistry, shows that tRNA-MaP is applicable to the analysis of tRNA modification enzymes. Notably, because tRNA-MaP can analyze any RNA species from any organism, and even DNA molecules, it can be used to analyze nucleic acid-related proteins well beyond tRNA modification enzymes. Thus, tRNA-MaP can accelerate the integrative understanding of the flow of genetic information.

More information: Ryota Yamagami et al, Application of mutational profiling: New functional analyses reveal the tRNA recognition mechanism of tRNA m1A22 methyltransferase, Journal of Biological Chemistry (2022). DOI: 10.1016/j.jbc.2022.102759

Journal information: Journal of Biological Chemistry 

Provided by Ehime University

Photon manipulation near absolute zero: New record for processing individual light particles

Scientists at Paderborn University have made a further step forward in the field of quantum research: for the first time ever, they have demonstrated a cryogenic circuit (i.e. one that operates in extremely cold conditions) that allows light quanta—also known as photons—to be controlled more quickly than ever before.

Specifically, these scientists have discovered a way of using circuits to actively manipulate light pulses made up of individual photons. This milestone could substantially contribute to developing modern technologies in quantum information science, communication and simulation. The results have now been published in the journal Optica.

Photons, the smallest units of light, are vital for processing quantum information. This often requires measuring a photon’s state in real time and using this information to actively control the luminous flux—a method known as a “feedforward operation.”

However, thus far this has run up against technical limitations: light was measured, processed and controlled with a delay, limiting its use for complex applications. With their new method, these scientists have managed to significantly reduce the delay—to less than a quarter of a billionth of a second.

“We have managed to actively interconnect light pulses with detectors, adapted electronics and optical circuits at cryogenic temperatures. This enabled us to manipulate individual photons significantly more quickly than other research groups. The ability allows us to create new active circuits for quantum optics that can be used for a variety of applications,” explains Dr. Frederik Thiele, who is spearheading the project with Niklas Lamberty, both members of the “Mesoscopic Quantum Optics” research group at Paderborn’s Department of Physics.

The researchers used state-of-the-art technologies such as superconducting detectors for this development. These devices measure individual light quanta with extremely high precision.

The electronics were deployed in a cryogenic environment: the amplifier and modulators were operated at temperatures of around -270 degrees Celsius in order to process signals without any significant delay. Integrated modulators are optical components that control the light based on measurement data—virtually loss-free and at high speeds.

The process is based on measuring light pairs, or “correlated photons.” Based on the number of particles measured, the electronic circuit decides in a fraction of a second whether the light should be forwarded or blocked. What makes the integrated design special is that physical losses and delays can be reduced to a minimum.
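As a purely conceptual sketch of such a feedforward decision (not the group's cryogenic electronics; the threshold, timing value, and names are illustrative assumptions), the logic amounts to comparing a detector count against a threshold and setting the modulator accordingly:

from dataclasses import dataclass


@dataclass
class FeedforwardGate:
    herald_threshold: int = 1    # photons that must be detected to pass the pulse
    latency_s: float = 0.25e-9   # illustrative latency budget (about a quarter of a nanosecond)

    def decide(self, detected_photons: int) -> str:
        """Return the modulator setting for the incoming light pulse."""
        return "forward" if detected_photons >= self.herald_threshold else "block"


gate = FeedforwardGate()
for counts in (0, 1, 2):
    print(counts, "detected ->", gate.decide(counts))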

As well as a fast response, the circuit also produces less heat, which is vital when working in cryostats (extreme cooling systems) in very small spaces.

“Our demonstration shows that we can use superconducting and semiconducting technology to achieve a new level of photonic quantum control. This opens up opportunities for fast and complex quantum circuits, which could be vitally important for quantum information science and communication,” Thiele summarizes.