Bone regeneration is a complex process, and existing methods to aid it, including transplants and growth factor delivery, face limitations such as high cost. Recently, however, a piezoelectric material that can promote the growth of bone tissue has been developed.
A KAIST research team led by Professor Seungbum Hong from the Department of Materials Science and Engineering (DMSE) has developed a biomimetic scaffold that generates electrical signals upon the application of pressure by utilizing the unique osteogenic ability of hydroxyapatite (HAp). HAp is a basic calcium phosphate material found in bones and teeth. This biocompatible mineral substance is also known to prevent tooth decay and is often used in toothpaste.
This research was conducted in collaboration with a team led by Professor Jangho Kim from the Department of Convergence Biosystems Engineering at Chonnam National University. The results are published in the journal ACS Applied Materials & Interfaces.
Previous studies on piezoelectric scaffolds confirmed the effects of piezoelectricity on promoting bone regeneration and improving bone fusion in various polymer-based materials, but were limited in simulating the complex cellular environment required for optimal bone tissue regeneration. However, this research suggests a new method for utilizing the unique osteogenic abilities of HAp to develop a material that mimics the environment for bone tissue in a living body.
The research team developed a manufacturing process that fuses HAp with a polymer film. The flexible and free-standing scaffold developed through this process demonstrated its remarkable potential for promoting bone regeneration through in-vitro and in-vivo experiments in rats.
The team also identified the principles of bone regeneration that their scaffold is based on. Using atomic force microscopy (AFM), they analyzed the electrical properties of the scaffold and evaluated the detailed surface properties related to cell shape and cell skeletal protein formation. They also investigated the effects of piezoelectricity and surface properties on the expression of growth factors.
Professor Hong from KAIST’s DMSE said, “We have developed a HAp-based piezoelectric composite material that can act like a ‘bone bandage’ through its ability to accelerate bone regeneration.” He added, “This research not only suggests a new direction for designing biomaterials, but is also significant in having explored the effects of piezoelectricity and surface properties on bone regeneration.”
Lipids are a class of biomolecules that play an important role in many cellular processes. Analyses that seek to characterize all lipids in a sample—called lipidomics—are crucial to studying complex biological systems.
An important challenge in lipidomics is connecting the variety of lipid structures with their biological functions. The positions of the double bonds within fatty acid chains are particularly important, because they can affect the physical properties of cellular membranes and modulate cell signaling pathways.
This information is not routinely measured in lipidomics studies because it requires a complicated experimental setup that produces complex data. Thus, scientists at Pacific Northwest National Laboratory (PNNL) developed a streamlined workflow to determine the positions of double bonds. This workflow uses both automation and machine learning approaches.
Their new method, LipidOz, streamlines the data analysis to determine the positions of double bonds. By addressing this key part of the analysis of lipids, LipidOz offers researchers a more efficient and accurate method for lipid characterization. The study is published in the journal Communications Chemistry.
The unambiguous identification of lipids is complicated by the presence of molecular parts that have the same chemical formula but different physical configurations. Specifically, the differences in these molecular parts include the fatty acyl chain length, stereospecifically numbered (sn) position, and position/stereochemistry of double bonds.
Conventional analyses can determine the fatty acyl chain lengths, the number of double bonds, and—in some cases—the sn position but not the positions of carbon–carbon double bonds. The positions of these double bonds can be determined with greater confidence using a gas-phase oxidation reaction called ozone-induced dissociation (OzID), which produces characteristic fragments.
However, the analysis of the data obtained from this reaction is complex and repetitive, and there has been a lack of supporting software tools. The open-source Python tool LipidOz automatically determines and assigns the double bond positions of lipids using a combination of traditional automation and deep learning approaches. New research demonstrates this ability for standard lipid mixtures and complex lipid extracts, enabling practical application of OzID in future lipidomics studies.
The “ten electron rule” provides guidance for the design of single-atom alloy catalysts for targeted chemical reactions.
A collaborative team across four universities has discovered a very simple rule for designing single-atom alloy catalysts for chemical reactions. The “ten electron rule” helps scientists identify promising catalysts very rapidly. Instead of extensive trial-and-error experiments or computationally demanding computer simulations, a catalyst's composition can be proposed simply by looking at the periodic table.
Single-atom alloys are a class of catalysts made of two metals: a few atoms of a reactive metal, called the dopant, are diluted in an inert metal (copper, silver, or gold). This recent technology is extremely efficient at speeding up chemical reactions, but traditional models don't explain how these catalysts work.
The team, which worked across the University of Cambridge, University College London, the University of Oxford and the Humboldt-University of Berlin, has published their research in Nature Chemistry. The scientists made computer simulations to unravel the underlying laws that control how single-atom alloy catalysts work.
The rule showed a simple connection: chemicals bind the most strongly to single-atom alloy catalysts when the dopant is surrounded by ten electrons. This means that scientists designing experiments can now simply use the columns on the periodic table to find which catalysts will have the desired properties for their reactions.
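As an illustration of how such a periodic-table screen might work, the sketch below counts electrons for a set of transition-metal dopants. The bookkeeping convention here (the dopant contributes its group number of electrons, the adsorbate a fixed extra count) is an assumption for illustration only, not the paper's exact electron-counting scheme:

```python
# Illustrative "ten electron rule" screen.
# Assumption: the dopant contributes its periodic-table group number of
# electrons, and the adsorbate contributes a fixed number of electrons
# to the dopant-adsorbate complex.

GROUP = {  # group numbers of some transition-metal dopants
    "Sc": 3, "Ti": 4, "V": 5, "Cr": 6, "Mn": 7,
    "Fe": 8, "Co": 9, "Ni": 10, "Cu": 11,
}

def electron_count(dopant: str, adsorbate_electrons: int) -> int:
    """Total electrons around the dopant-adsorbate complex."""
    return GROUP[dopant] + adsorbate_electrons

def screen(dopants, adsorbate_electrons, target=10):
    """Return dopants whose complex hits the target electron count."""
    return [d for d in dopants
            if electron_count(d, adsorbate_electrons) == target]

# A hypothetical adsorbate contributing 2 electrons points to group 8.
print(screen(GROUP, 2))  # ['Fe']
```

Changing `adsorbate_electrons` shifts the optimal column, which is the sense in which the periodic table alone can guide the choice of dopant.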
Dr. Romain Réocreux, a postdoctoral researcher in the group of Prof. Angelos Michaelides, who led this research, says, “When you have a difficult chemical reaction, you need a catalyst with optimal properties. On the one hand, a strong-binding catalyst may poison and stop accelerating your reaction; on the other hand, a weakly-binding catalyst may just do nothing.”
“Now we can identify the optimal catalyst just by looking at a column on the periodic table. This is very powerful since the rule is simple and can speed up the discovery of new catalysts for particularly difficult chemical reactions.”
Prof. Stamatakis, Professor of Computational Inorganic Chemistry at the University of Oxford, who contributed to the research, says, “After a decade of intense research on single-atom alloys, we now have an elegant, simple but powerful theoretical framework that explains binding energy trends and enables us to make predictions about catalytic activity.”
Using this rule, the team proposed a promising catalyst for an electrochemical version of the Haber-Bosch process, a key reaction for the synthesis of fertilizers that has been using the same catalyst since it was first discovered in 1909.
Dr. Julia Schumann, who started the project at the University of Cambridge and is now at the Humboldt-Universität zu Berlin, explains, “Many catalysts used in the chemical industry today were discovered in the laboratory using trial and error methods. With a better understanding of the materials’ properties, we can propose new catalysts with improved energy efficiency and reduced CO2 emissions for industrial processes.”
It is easy to be optimistic about hydrogen as an ideal fuel. It is much more difficult to come up with a solution to an absolutely fundamental problem: How to store this fuel efficiently? A Swiss-Polish team of experimental and theoretical physicists has found the answer to the question of why previous attempts to use the promising magnesium hydride for this purpose have proved unsatisfactory, and why they may succeed in the future.
Hydrogen has long been seen as the energy carrier of the future. However, before it becomes a reality in the energy sector, efficient methods of storing it must be developed. Materials—selected in such a way that at low energy cost, hydrogen can first be injected into them and then recovered on demand, preferably under conditions similar to those typical of our everyday environment—appear to be the optimal solution.
A promising candidate for hydrogen storage appears to be magnesium. Converting it into magnesium hydride, however, requires a suitably efficient catalyst, which has not yet been found.
The work of a team of scientists from Empa (the Swiss Federal Laboratories for Materials Science and Technology in Dübendorf), the Department of Chemistry at the University of Zurich, and the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ PAN) in Cracow has shown that the reason for the many years of failure up to this point lies in an incomplete understanding of the phenomena occurring in magnesium during hydrogen injection.
The main obstacle to the uptake of hydrogen as an energy source is the difficulty of storing it. In still-rare hydrogen-powered cars, it is stored compressed at a pressure of around 700 atmospheres. This is neither the cheapest nor the safest method, nor is it particularly efficient: one cubic meter holds only 45 kg of hydrogen. The same volume can hold 70 kg if the hydrogen is liquefied beforehand.
Unfortunately, the liquefaction process requires large amounts of energy, and the extremely low temperature, at around 20 Kelvin, must then be maintained throughout storage. An alternative could be suitable materials; for example, magnesium hydride, which can hold up to 106 kg of hydrogen in a cubic meter.
Magnesium hydride is among the simplest of the materials tested for hydrogen storage capacity. Its hydrogen content can reach 7.6% by weight. Magnesium hydride devices are therefore quite heavy and so mainly suitable for stationary applications. However, it is important to note that magnesium hydride is a very safe substance that can be stored without risk, for example in a basement, and magnesium itself is a readily available and cheap metal.
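The 7.6 wt% figure follows directly from standard atomic masses; a quick back-of-the-envelope check:

```python
# Gravimetric hydrogen capacity of magnesium hydride (MgH2)
# from standard atomic masses.
M_MG = 24.305  # g/mol, magnesium
M_H = 1.008    # g/mol, hydrogen

m_mgh2 = M_MG + 2 * M_H          # molar mass of MgH2
h_fraction = (2 * M_H) / m_mgh2  # hydrogen mass fraction

print(f"{100 * h_fraction:.2f} wt% hydrogen")  # 7.66 wt% hydrogen
```

This is the theoretical maximum, consistent with the roughly 7.6 wt% quoted above; practical storage systems come in lower once tanks and heat management are included.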
“Research on the incorporation of hydrogen into magnesium has been going on for decades, yet it has not resulted in solutions that can count on wider use,” says Prof. Zbigniew Lodziana (IFJ PAN), a theoretical physicist who has co-authored an article in Advanced Science, where the latest discovery is presented.
“One source of problems is hydrogen itself. This element can effectively penetrate the crystal structure of magnesium, but only when it is present in the form of single atoms. To obtain it from typical molecular hydrogen, a catalyst efficient enough to make the process of hydrogen migration in the material fast and energetically viable is required. So everyone has been looking for a catalyst that meets the above conditions, unfortunately without much success. Today, we finally know why these attempts were doomed to failure.”
Prof. Lodziana has developed a new model of the thermodynamic and electronic processes occurring in magnesium in contact with hydrogen atoms. The model predicts that during the migration of hydrogen atoms, local, thermodynamically stable magnesium hydride clusters are formed in the material. At the boundaries between the metallic magnesium and its hydride, changes in the electronic structure of the material then occur, and it is these that have a significant role in reducing the mobility of hydrogen ions.
In other words, the kinetics of magnesium hydride formation is primarily determined by phenomena at its interface with magnesium. This effect had so far not been taken into account in the search for efficient catalysts.
Prof. Lodziana’s theoretical work complements experiments performed in the Swiss laboratory in Dübendorf. Here, the migration of atomic hydrogen in a layer of pure magnesium sputtered onto palladium was studied in an ultra-high vacuum chamber. The measuring apparatus was capable of recording changes in the state of several outer atomic layers of the sample under study, caused by the formation of a new chemical compound and the associated transformations of the material’s electronic structure. The model proposed by the researchers from the IFJ PAN allows us to fully understand the experimental results.
The achievements of the Swiss-Polish group of physicists not only pave the way for a new search for an optimal catalyst for magnesium hydride, but also explain why some of the previously found catalysts showed higher efficiency than expected.
“There is much to suggest that the lack of significant progress in hydrogen storage in magnesium and its compounds was simply due to our incomplete understanding of the processes involved in hydrogen transport in these materials. For decades, we have all been looking for better catalysts, only not where we should be looking. Now, new theoretical and experimental results make it possible to think again with optimism about further improvements in methods of introducing hydrogen into magnesium,” concludes Prof. Lodziana.
More information: Selim Kazaz et al, Why Hydrogen Dissociation Catalysts do not Work for Hydrogenation of Magnesium, Advanced Science (2023). DOI: 10.1002/advs.202304603
Chemists of the University of Amsterdam (UvA) have developed an autonomous chemical synthesis robot with an integrated AI-driven machine learning unit. Dubbed “RoboChem,” the benchtop device can outperform a human chemist in terms of speed and accuracy while also displaying a high level of ingenuity.
As the first of its kind, it could significantly accelerate chemical discovery of molecules for pharmaceutical and many other applications. RoboChem’s first results are published in the journal Science.
RoboChem was developed by the group of Prof. Timothy Noël at the UvA’s Van ‘t Hoff Institute for Molecular Sciences. Their paper shows that RoboChem is a precise and reliable chemist that can perform a variety of reactions while producing minimal amounts of waste.
Working autonomously around the clock, the system delivers results quickly and tirelessly. Noël said, “In a week, we can optimize the synthesis of about ten to twenty molecules. This would take a Ph.D. student several months.” The robot not only yields the best reaction conditions, but also provides the settings for scale-up.
“This means we can produce quantities that are directly relevant for suppliers to the pharmaceutical industry, for example.”
RoboChem’s ‘brain’
The expertise of the Noël group is in “flow chemistry,” a novel way of performing chemistry where a system of small, flexible tubes replaces beakers, flasks and other traditional chemistry tools.
In RoboChem, a robotic needle carefully collects starting materials and mixes these together in small volumes of just over half a milliliter. These then flow through the tubing system towards the reactor. There, the light from powerful LEDs triggers the molecular conversion by activating a photocatalyst included in the reaction mixture.
The flow then continues towards an automated NMR spectrometer that identifies the transformed molecules. These data are fed back in real time to the computer that controls RoboChem.
“This is the brain behind RoboChem,” says Noël. “It processes the information using artificial intelligence. We use a machine learning algorithm that autonomously determines which reactions to perform. It always aims for the optimal outcome and constantly refines its understanding of the chemistry.”
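The closed loop Noël describes (propose conditions, run the reaction, measure, update) can be sketched in a few lines. The toy loop below uses random search with a best-so-far update as a stand-in for RoboChem's actual machine learning algorithm, which the article does not detail, and the yield function is a hypothetical stand-in for the NMR readout:

```python
import random

random.seed(0)

def measure_yield(temp_c, residence_s):
    """Stand-in for the NMR readout: a hidden response surface
    with its optimum near 60 C and 120 s residence time."""
    return max(0.0, 1.0 - ((temp_c - 60) / 40) ** 2
                        - ((residence_s - 120) / 90) ** 2)

best_conditions, best_yield = None, -1.0
for _ in range(200):  # propose -> run -> measure -> update
    conditions = (random.uniform(20, 100), random.uniform(10, 300))
    y = measure_yield(*conditions)
    if y > best_yield:
        best_conditions, best_yield = conditions, y

temp, res = best_conditions
print(f"best: {temp:.0f} C, {res:.0f} s -> simulated yield {best_yield:.2f}")
```

In the real system, each "measurement" is an actual flow reaction, so an algorithm that needs fewer experiments to find the optimum translates directly into time and material saved.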
Impressive ingenuity
The group put a lot of effort into substantiating RoboChem’s results. All of the molecules now included in the Science paper were isolated and checked manually. Noël says the system has impressed him with its ingenuity.
“I have been working on photocatalysis for more than a decade now. Still, RoboChem has shown results that I would not have been able to predict. For instance, it has identified reactions that require only very little light. At times I had to scratch my head to fathom what it had done. You then wonder: would we have done it the same way? In retrospect, you see RoboChem’s logic. But I doubt if we would have obtained the same results ourselves. Or not as quickly, at least.”
The researchers also used RoboChem to replicate previous research published in four randomly selected papers. They then determined whether RoboChem produced the same—or better—results.
“In about 80% of the cases, the system produced better yields. For the other 20%, the results were similar,” Noël says. “This leaves me with no doubt that an AI-assisted approach will be beneficial to chemical discovery in the broadest possible sense.”
Breakthroughs in chemistry using AI
According to Noël, the relevance of RoboChem and other “computerized” chemistry also lies in the generation of high-quality data, which will benefit the future use of AI.
“In traditional chemical discovery only a few molecules are thoroughly researched. Results are then extrapolated to seemingly similar molecules. RoboChem produces a complete and comprehensive dataset where all relevant parameters are obtained for each individual molecule. That provides much more insight.”
Another feature is that the system also records “negative” data. In current scientific practice, most published data only reflects successful experiments. “A failed experiment also provides relevant data,” says Noël.
“But this can only be found in the researchers’ handwritten lab notes. These are not published and thus unavailable for AI-powered chemistry. RoboChem will change that, too. I have no doubt that if you want to make breakthroughs in chemistry with AI, you will need these kinds of robots.”
A University of Massachusetts Amherst team has made a major advance toward modeling and understanding how intrinsically disordered proteins (IDPs) undergo spontaneous phase separation, an important mechanism of subcellular organization that underlies numerous biological functions and human diseases.
IDPs play crucial roles in cancer, neurodegenerative disorders and infectious diseases. They make up about one-third of proteins that human bodies produce, and two-thirds of cancer-associated proteins contain large, disordered segments or domains. Identifying the hidden features crucial to the functioning and self-assembly of IDPs will help researchers understand what goes awry with these features when diseases occur.
In a paper published in the Journal of the American Chemical Society, senior author Jianhan Chen, professor of chemistry, describes a novel way to simulate phase separations mediated by IDPs, an important process that has been difficult to study and describe.
“Phase separation is a really well-known phenomenon in polymer physics, but what people did not know until about 15 years ago was that this is also a really common phenomenon in biology,” Chen explains. “You can look at phase separation with a microscope, but to understand this phenomenon at the molecular level is very difficult.
“In the past five or 10 years, people have started to discover that many of these disordered proteins can drive phase separation, including numerous important ones involved in cancer and neurodegenerative disorders.”
The new paper, based on research in Chen’s computational biophysics and biomaterials lab, constitutes one chapter of lead author Yumeng Zhang’s Ph.D. dissertation. Zhang will start work as a postdoctoral researcher at Massachusetts Institute of Technology (MIT) in February. Another key contributor is Shanlong Li, a postdoctoral research associate in Chen’s lab.
Chen’s lab developed an accurate, GPU-accelerated hybrid resolution (HyRes) force field for simulating phase separations mediated by IDPs. This model is unique in its ability to accurately describe peptide backbone interactions and transient secondary structures, while being computationally efficient enough to model liquid-liquid phase separation. The new model fills a critical gap in existing capabilities for the computer simulation of IDP phase separation.
Chen and team created HyRes simulations to demonstrate for the first time what governs the condensate stability of two important IDPs.
“I actually did not anticipate that it could do such a good job at describing phase separation because it’s a really difficult phenomenon to simulate,” Chen says. “We demonstrated that this model is accurate enough to start looking at the impacts of even a single mutation or residual structures in the phase separation.”
The researchers’ HyRes-GPU provides an innovative simulation tool for studying the molecular mechanisms of phase separation. The ultimate goal is to develop therapeutic strategies in the treatment of diseases associated with disordered proteins.
“This is really the significance of this work,” Chen says. “Important biological processes are believed to occur through phase separation. So, if we can understand better what controls this process, that knowledge will be really powerful, if not essential, for us to think about controlling phase separation for various scientific and engineering purposes. This will help us understand the type of intervention that will be required to achieve therapeutic effects.”
Chen says the next step is to apply what his team has learned to larger-scale simulations of more complex biomolecular mixtures.
“Shanlong is now working on constructing a similar model for nucleic acids because phase separation often involves both disordered proteins and nucleic acids,” he says. “We want to be able to describe both key components, and that would allow us to look at many more systems.”
More information: Yumeng Zhang et al, Toward Accurate Simulation of Coupling between Protein Secondary Structure and Phase Separation, Journal of the American Chemical Society (2023). DOI: 10.1021/jacs.3c09195
The water in the air originates from both natural and forced evaporation, with condensation being the final and crucial step in water harvesting. Condensation involves nucleation, growth, and shedding of water droplets, which are then collected.
However, uncontrollable growth of condensed droplets leading to surface flooding is a pressing challenge due to insufficient driving forces, posing a threat to sustainable condensation.
A study, led by Prof. Jiuhui Qu, Dr. Qinghua Ji, and Dr. Wei Zhang from Tsinghua University, focuses on addressing water scarcity by exploring atmospheric water harvesting. The work is published in the journal National Science Review.
To expedite this process and achieve orderly, rapid droplet shedding from the condensing surface, the team took inspiration from nature. They observed that the Australian thorny devil efficiently spreads droplets of rain, dew, and pond water from its scales into the capillary channels between the scales, which eventually connect to its mouth.
This natural mechanism makes water easier to store and consume. Additionally, the team drew inspiration from fish, particularly catfish, which possess an epidermal mucus layer that reduces swimming drag and enhances adaptability to aqueous environments. These insights from nature address the challenges of orderly droplet navigation and low-drag droplet shedding, respectively.
The research team employed hydrogel fibers to create an engineered pattern on glass, incorporating the advantageous features of both lizards and catfish.
The hydrogel fiber is an interpenetrated network of sodium alginate and polyvinyl alcohol with a partially polymerized surface and arch structure. The surface, adorned with branched –OH and –COOH chains, exhibits a strong affinity for water molecules.
This affinity, coupled with the arch structure, provides sufficient driving force for droplets to move from the condensing substrate to the hydrogel fiber. Simultaneously, the branched –OH and –COOH chains can retain water molecules even after droplets leave the surface, aiding in the formation of a precursor water film that lubricates droplet sliding.
To observe droplet movement, fluorescent molecules were utilized as probes. The captured trajectories revealed an impressive migration rate, with droplets formed on the glass swiftly pumped to the hydrogel fiber, thereby regenerating the condensing sites.
The success lies in the concurrent application of chemical wetting gradients and the Laplace pressure difference across the hydrogel fiber and the glass. The pumping effect resulted in a reduction of over 40% in the energy of the droplet-condensing surface system, acting as the driving force source. “This is similar to the directional water dispersion over the integuments of lizards,” Professor Qu notes.
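The Laplace-pressure contribution can be made concrete with the Young-Laplace relation ΔP = 2γ/r: the smaller a droplet's radius of curvature, the higher the pressure inside it, so water is pushed from small condensate droplets on the glass toward the larger water body on the fiber. The radii below are illustrative assumptions, not values measured in the study:

```python
GAMMA = 0.072  # N/m, surface tension of water near room temperature

def laplace_pressure(radius_m):
    """Young-Laplace excess pressure inside a spherical droplet: 2*gamma/r."""
    return 2 * GAMMA / radius_m

on_glass = laplace_pressure(50e-6)   # hypothetical 50-um condensate droplet
on_fiber = laplace_pressure(500e-6)  # hypothetical 500-um water body on fiber
print(f"{on_glass:.0f} Pa on glass vs {on_fiber:.0f} Pa on fiber")
```

The order-of-magnitude pressure gap between the two curvatures, acting together with the chemical wetting gradient, is what pumps droplets off the condensing sites without any external energy input.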
The researchers also observed distinctions in the movement of water on the hydrogel fiber surface compared to that on glass. On the glass, droplets advanced as a cohesive unit with successive formation of new advancing angles, resulting in complete mixing of fluorescent probes within the droplet during advancement.
In contrast, droplet sliding on the hydrogel fiber surface exhibited a layered behavior. The inner layer of water bonded to the hydrogel surface, while the outer layer slid without direct contact with the hydrogel surface.
“The dangling chains over the hydrogel surface act like the mucus layer of the catfish, lubricating the friction between the droplets and the condensing surface,” explains Dr. Ji.
This engineered hydrogel fiber pattern increased the condensation rate by 85.9% without requiring external energy input. Moreover, it was successfully applied to enhance the water collection rate of solar evaporative water purification by 109%.
This study not only provides insights into natural phenomena but also marks a novel attempt to manipulate droplet movement for condensation. The findings lay the foundation for future endeavors in discovering phenomena and translating theories into practical applications.
More information: Wei Zhang et al, Pumping and sliding of droplets steered by a hydrogel pattern for atmospheric water harvesting, National Science Review (2023). DOI: 10.1093/nsr/nwad334
What if organ damage could be repaired by simply growing a new organ in the lab? Improving researchers’ ability to print live cells on demand into geometrically well-defined, soft complex 3D architectures is essential to such work, as well as for animal-free toxicological testing.
In a study recently published in ACS Biomaterials Science and Engineering, researchers from Osaka University have overcome prior limitations that have hindered cell growth and the geometrical fidelity of bioprinted architectures. This work might help bring 3D-printed cell constructs closer to mimicking biological tissue and organs.
Ever since bioprinting was first reported in 1988 using a standard inkjet printer, researchers have explored the potential of this layer-by-layer tissue assembly procedure to regrow damaged body parts and test medical hypotheses. Bioprinting works by ejecting a cell-containing “ink” from a printing nozzle to form 3D structures. It is usually easier to print hard rather than soft structures. However, soft structures are preferable for cell growth within the printed structures.
When printing soft structures, printing into a support material is effective; however, when the ink solidifies in a support-filled vessel, the printed structure can be contaminated with unwanted substances from the support. The goal of this work was to solidify the ink into a soft matrix within a printing support, without contamination and while retaining cell viability.
“In our approach, a 3D printer alternately dispenses the cell-containing ink and a printing support,” explains Takashi Kotani, lead author of the study. “The interesting point is that the support also plays a role in facilitating the solidification of the ink. All that’s necessary for ink solidification is in the support, and after removing the support, the geometry of the soft printed cell structures remains intact.”
Hydrogen peroxide from the support enabled an enzyme in the ink to initiate gelation of the ink, resulting in a gel-enclosed cell assembly within a few seconds. This rapid gelation prevented contamination of the assembly during formation. After removing the support, straightforward 3D constructs such as inverted trapezium geometries as well as human nose shapes—including bridges, holes, and overhangs—were readily obtained.
“We largely retain mouse fibroblast cell geometry and growth, and the cells remain viable for at least two weeks,” says Shinji Sakai, senior author. “These cells also adhere to and proliferate on our constructs, which highlights our work’s potential in tissue engineering.”
This new technique is an important step forward to engineering human cell assemblies and tissues. Further work might involve further optimizing the ink and support, as well as incorporating blood vessels into the artificial tissue to improve its resemblance to physiological architectures. Regenerative medicine, pharmaceutical toxicology, and other fields will all benefit from this work and further improvements in the precise fidelity of bioprinting.
More information: Takashi Kotani et al, Horseradish Peroxidase-Mediated Bioprinting via Bioink Gelation by Alternately Extruded Support Material, ACS Biomaterials Science & Engineering (2023). DOI: 10.1021/acsbiomaterials.3c00996
Using a small bird’s nest-making process as a model, researchers from North Carolina State University have developed a nontoxic process for making cellulose gels. The freeze-thaw process is simple, cost-effective, and can create cellulose gels that are useful in a number of applications, including tunable gels for timed drug delivery. The process also works with bamboo and potentially other lignin-containing plant fibers.
The work appears in Advanced Composites and Hybrid Materials. Noureddine Abidi of Texas Tech University is a co-corresponding author of the work.
Cellulose is a wonderful material for making hydrogels—which are used in applications ranging from contact lenses to wound care and drug delivery. But creating hydrogels from cellulose is tricky, and often the processes used to create the hydrogels are themselves toxic.
“Normally, you have to first dissolve the cellulose and then induce it to crosslink or form the structure of interest, which often requires the use of difficult to handle, unstable, or toxic solvents,” says Lucian Lucia, professor of forest biomaterials and chemistry at NC State and co-corresponding author of the work.
Enter the swift family of birds—small birds that use their saliva to hold twigs in place when building their nests.
“My then-Ph.D. student Zhen Zhang noted that when birds do this, the saliva acts like a natural resin that holds the nest together and encourages the fibers within the nest to interconnect or crosslink,” Lucia says. “Which is exactly what we want the dissolved cellulose to do when making hydrogels. So we asked ourselves, ‘What if we mimic the birds?'”
Zhang, currently a postdoc at Texas Tech University, is a co-corresponding author.
The researchers dissolved a water-soluble cellulose derivative called carboxymethyl cellulose (CMC) in an acid solution. Then they added powdered cellulose fiber to the solution and subjected it to four rounds of freezing and thawing. The result was a cellulose gel.
“Think of it as adding a thickener to water, like you would a pie filling,” Lucia says. “By changing the pH of the CMC, the water essentially becomes sticky. Freezing and thawing the solution causes the cellulose to compact and interweave itself into the sticky network, giving you a more organized structure, just as swifts do when they create their nests. Only we don’t have to use beaks and saliva to do it.”
Freeze-drying the gels resulted in cellulose foam. The researchers repeated the process with bamboo fibers as well, which suggests that it could be useful with many other lignin- and cellulose-containing fibers.
“The cellulose gels are robust, stable at room temperature, and can be tuned to degrade on a schedule, so they would be useful in drug delivery applications, among others,” Lucia says. “This opens a promising new window for using biomimicry to process these insoluble cellulosic materials in a greener way.”
More information: Zhen Zhang et al, A “bird nest” bioinspired strategy deployed for inducing cellulose gelation without concomitant dissolution, Advanced Composites and Hybrid Materials (2023). DOI: 10.1007/s42114-023-00745-x
A new kind of polymer membrane created by researchers at Georgia Tech could reshape how refineries process crude oil, dramatically reducing the energy and water required while extracting even more useful materials.
The so-called DUCKY polymers—more on the unusual name in a minute—are reported in Nature Materials. And they’re just the beginning for the team of Georgia Tech chemists, chemical engineers, and materials scientists. They also have created artificial intelligence tools to predict the performance of these kinds of polymer membranes, which could accelerate development of new ones.
The implications are stark: the initial separation of crude oil components is responsible for roughly 1% of energy used across the globe. What’s more, the membrane separation technology the researchers are developing could have several uses, from biofuels and biodegradable plastics to pulp and paper products.
“We’re establishing concepts here that we can then use with different molecules or polymers, but we apply them to crude oil because that’s the most challenging target right now,” said M.G. Finn, professor and James A. Carlos Family Chair in the School of Chemistry and Biochemistry.
Crude oil in its raw state includes thousands of compounds that have to be processed and refined to produce useful materials—gas and other fuels, as well as plastics, textiles, food additives, medical products, and more. Squeezing out the valuable stuff involves dozens of steps, but it starts with distillation, a water- and energy-intensive process.
Researchers have been trying to develop membranes to do that work instead, filtering out the desirable molecules and skipping all the boiling and cooling.
“Crude oil is an enormously important feedstock for almost all aspects of life, and most people don’t think about how it’s processed,” said Ryan Lively, Thomas C. DeLoach Jr. Professor in the School of Chemical and Biomolecular Engineering. “These distillation systems are massive water consumers, and the membranes simply are not. They’re not using heat or combustion. They just use electricity. You could ostensibly run it off of a wind turbine, if you wanted. It’s just a fundamentally different way of doing a separation.”
What makes the team’s new membrane formula so powerful is a new family of polymers. The researchers used building blocks called spirocyclic monomers that assemble together in chains with lots of 90-degree turns, forming a kinked material that doesn’t compress easily and forms pores that selectively bind and permit desirable molecules to pass through. The polymers are not rigid, which means they’re easier to make in large quantities. They also have a well-controlled flexibility or mobility that allows pores of the right filtering structure to come and go over time.
The DUCKY polymers are created through a chemical reaction that’s easy to produce at a scale that would be useful for industrial purposes. It’s a flavor of a Nobel Prize-winning family of reactions called click chemistry, and that’s what gives the polymers their name. The reaction is called copper-catalyzed azide-alkyne cycloaddition—abbreviated CuAAC and pronounced “quack.” Thus: DUCKY polymers.
In isolation, the three key characteristics of the polymer membranes aren’t new; it’s their unique combination that makes them novel and effective, Finn said.
The research team included scientists at ExxonMobil, who discovered just how effective the membranes could be. The company’s scientists took the crudest of the crude oil components—the sludge left at the bottom after the distillation process—and pushed it through one of the membranes. The process extracted even more valuable materials.
“That’s actually the business case for a lot of the people who process crude oils. They want to know what they can do that’s new. Can a membrane make something new that the distillation column can’t?” Lively said. “Of course, our secret motivation is to reduce energy, carbon, and water footprints, but if we can help them make new products at the same time, that’s a win-win.”
Predicting such outcomes is one way the team’s AI models can come into play. In a related study recently published in Nature Communications, Lively, Finn, and researchers in Rampi Ramprasad’s Georgia Tech lab described using machine learning algorithms and mass transport simulations to predict the performance of polymer membranes in complex separations.
“This entire pipeline, I think, is a significant development. And it’s also the first step toward actual materials design,” said Ramprasad, professor and Michael E. Tennenbaum Family Chair in the School of Materials Science and Engineering. “We call this a ‘forward problem,’ meaning you have a material and a mixture that goes in—what comes out? That’s a prediction problem. What we want to do eventually is to design new polymers that achieve a certain target permeation performance.”
Complex mixtures like crude oil might have hundreds or thousands of components, so accurately describing each compound in mathematical terms, how it interacts with the membrane, and extrapolating the outcome is “non-trivial,” as Ramprasad put it.
Training the algorithms also involved combing through all the experimental literature on solvent diffusion through polymers to build an enormous dataset. But, like the potential of membranes themselves to reshape refining, knowing ahead of time how a proposed polymer membrane might work would accelerate a materials design process that’s basically trial-and-error now, Ramprasad said.
“The default approach is to make the material and test it, and that takes time. This data-driven or machine learning-based approach uses past knowledge in a very efficient manner,” he said. “It’s a digital partner: You’re not guaranteed an exact prediction, because the model is limited by the space spanned by the data you use to train it. But it can extrapolate a little bit and it can take you in new directions, potentially. You can do an initial screening by searching through vast chemical spaces and make go, no-go decisions up front.”
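The "forward problem" Ramprasad describes can be pictured as a plain regression task: featurize a polymer/solvent pair, fit a model on known diffusion data, then predict for an unseen pair. The sketch below is purely illustrative and is not the authors' actual pipeline; the descriptors, coefficients, and synthetic dataset are all invented for the demo, with a simple closed-form ridge regression standing in for their machine learning models.

```python
import numpy as np

# Hypothetical illustration (not the published pipeline): predict a
# log-diffusivity from simple polymer/solvent descriptors via ridge
# regression on a synthetic dataset.
rng = np.random.default_rng(0)

# Synthetic training data: each row = [polymer free-volume fraction,
# solvent molar volume, polymer-solvent solubility mismatch].
X = rng.uniform([0.05, 50.0, 0.0], [0.25, 300.0, 5.0], size=(200, 3))

# Invented ground-truth relation for the demo: diffusivity rises with
# free volume, falls with penetrant size and mismatch (plus noise).
true_w = np.array([12.0, -0.01, -0.4])
y = X @ true_w + 1.5 + rng.normal(0.0, 0.1, size=200)  # "log10 D"

# Closed-form ridge fit: w = (X'X + lam*I)^-1 X'y, with a bias column.
Xb = np.hstack([X, np.ones((200, 1))])
lam = 1e-3
w = np.linalg.solve(Xb.T @ Xb + lam * np.eye(4), Xb.T @ y)

# "Forward" prediction for a new polymer/solvent pair (bias term = 1.0).
x_new = np.array([0.15, 120.0, 1.0, 1.0])
log_d_pred = x_new @ w
```

The caveat Ramprasad raises applies directly here: the fit can only be trusted inside the descriptor ranges spanned by the training data, which is why screening, not exact prediction, is the stated use.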
Lively said he’d long been a skeptic about the ability of machine learning tools to tackle the kinds of complex separations he works with.
“I always said, ‘I don’t think you can predict the complexity of transport through polymer membranes. The systems are too big; the physics are too complicated. Can’t do it.'”
But then he met Ramprasad: “Rather than just be a naysayer, Rampi and I took a stab at it with a couple of undergrads, built this big database, and dang. Actually, you can do it,” Lively said.
Developing the AI tools also involved comparing the algorithms’ predictions to actual results, including with the DUCKY polymer membranes. The experiments showed the AI models’ predictions were within 6% to 7% of actual measurements.
“It’s astonishing,” Finn said. “My career has been spent trying to predict what molecules are going to do. The machine learning approach, and Rampi’s execution of it, is just completely revolutionary.”
More information: Nicholas C. Bruno et al, Solution-processable polytriazoles from spirocyclic monomers for membrane-based hydrocarbon separations, Nature Materials (2023). DOI: 10.1038/s41563-023-01682-2