Research finds drones can deliver blood safely

A project finds drone delivery does not influence the blood’s quality or how long it lasts.

Sharing risk to avoid power outages in an era of extreme weather

This summer’s Western heat waves raise the specter of recent years’ rotating power outages and record-breaking electricity demand in the region. If utilities across the area expanded current schemes to share electricity, they could cut outage risks by as much as 40%, according to new research by the Climate and Energy Policy Program at the Stanford Woods Institute for the Environment. The study highlights how such a change could also help ensure public opinion and policy remain favorable for renewable energy growth. It comes amid debate over initiatives like the West-Wide Governance Pathways Initiative, an effort led by Western regulators to create a multi-state grid operations and planning organization.

“Extreme weather events disregard state and electric utilities’ boundaries, and so will the solution needed to mitigate the impact,” said study co-author Mareldi Ahumada-Paras, a postdoctoral scholar in energy science and engineering in the Stanford Doerr School of Sustainability. “Greater regional cooperation can benefit reliability under widespread stress conditions.”

The new abnormal

Across the West, electricity providers are struggling with three new realities. Demand and resource availability are becoming harder to predict because of factors ranging from more frequent and widespread weather extremes to the proliferation of rooftop solar installations. Rapid growth of renewable energy, such as wind and solar, along with energy storage options, requires new operating and planning strategies for meeting demand. On top of these trends, a patchwork of state and federal clean energy goals creates divergent incentives that shape how utilities operate and plan.

“New grid management approaches can capitalize on the opportunities created by our rapidly changing electricity system and address increasing stress from extreme heat, drought, and other climate-related events,” said study co-author Michael Mastrandrea, research director of the Climate and Energy Policy Program.

The study focuses on the power grid that stretches from the West Coast to the Great Plains and from western Canada to Baja California. In recent years, extreme heat events and severe droughts have put major demand stresses on the grid and reduced hydropower availability.

The researchers used power system optimization models to simulate grid operations under stress conditions based on those experienced during a 2022 California heat wave that saw record-breaking energy demand. Their simulations demonstrated that expanding the area of cooperation could reduce the risk of power outages by as much as 40%, reduce the amount of unserved energy — when electricity demand exceeds supply — by more than half, and increase reliability.
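
To make the “unserved energy” arithmetic concrete, here is a deliberately tiny sketch in Python. It is not the study’s power system optimization model: it imagines two hypothetical balancing areas during one stressed hour, one short on supply and one holding surplus, and compares balancing them in isolation with pooling them (ignoring transmission limits). Every number is invented for illustration.

    # Toy illustration (not the study's model): unserved energy with and without
    # regional sharing for two hypothetical balancing areas in one heat-wave hour.
    # All figures are made up for demonstration.

    def unserved(demand_mwh: float, supply_mwh: float) -> float:
        """Demand that cannot be met (MWh); zero when supply covers demand."""
        return max(demand_mwh - supply_mwh, 0.0)

    # Hypothetical hourly demand and available supply (MWh) for two regions.
    region_a = {"demand": 52_000, "supply": 48_000}   # stressed by extreme heat
    region_b = {"demand": 30_000, "supply": 36_000}   # holding surplus this hour

    # Case 1: each region balances alone.
    isolated = (unserved(region_a["demand"], region_a["supply"])
                + unserved(region_b["demand"], region_b["supply"]))

    # Case 2: the regions pool supply and demand across a wider footprint.
    pooled = unserved(region_a["demand"] + region_b["demand"],
                      region_a["supply"] + region_b["supply"])

    print(f"Unserved energy, isolated: {isolated:,.0f} MWh")  # 4,000 MWh
    print(f"Unserved energy, pooled:   {pooled:,.0f} MWh")    # 0 MWh

In this toy case, region B’s surplus fully covers region A’s shortfall once the two cooperate, which is the intuition behind the study’s far more detailed simulations.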

Policy and public opinion

The researchers refer to these estimates as “illustrative and directional” because incomplete information makes it hard to precisely simulate how those responsible for ensuring power system reliability within specific service territories will respond to stress conditions. Still, the results highlight how expanded cooperation among utilities can improve responses to local shortages and excesses, offer greater flexibility in managing unexpected disruptions and balancing supply and demand, and ensure reliable electricity supply during extreme weather events.

Expanded cooperation among utilities could also maximize the value of the region’s growing renewable energy portfolio, according to the researchers. Renewable power generation, such as wind and solar, can be variable since the wind doesn’t always blow and the sun only shines so many hours per day. Expanding cooperation across a larger geographic area can ensure that renewable power generation is used (or stored for later) when it is available. Critics of these sources are also likely to blame them for major power outages, according to the researchers, feeding a narrative that could sour public opinion and lead to policies slowing the adoption or expansion of clean energy.

“Our work shows how greater cooperation isn’t just about dollars and cents for utilities and their customers,” said study co-author Michael Wara, director of the Climate and Energy Policy Program at the Stanford Woods Institute for the Environment. “It’s about keeping the lights on as we confront the challenge of the energy transition and the growing impacts of climate change.”

Wara and Mastrandrea are also senior director for policy and director for policy, respectively, in the Stanford Doerr School of Sustainability’s Sustainability Accelerator.

Compound in rosemary extract can reduce cocaine sensitivity

A team of researchers led by the University of California, Irvine has discovered that an antioxidant found in rosemary extract can reduce voluntary cocaine intake by moderating the brain’s reward response, offering a new therapeutic target for treating addiction.

The study, recently published online in the journal Neuron, describes the team’s focus on a region of the brain called the globus pallidus externus (GPe), which acts as a gatekeeper that regulates how we react to cocaine. They discovered that within the GPe, parvalbumin-positive neurons are crucial in controlling the response to cocaine by changing the activity of the neurons that release dopamine, the brain’s pleasure molecule.

“There are currently no effective therapeutics for dependence on psychostimulants such as cocaine, which, along with opioids, represent a substantial health burden,” said corresponding author Kevin Beier, UC Irvine associate professor of physiology and biophysics. “Our study deepens our understanding of the basic brain mechanisms that increase vulnerability to substance use disorder-related outcomes and provides a foundation for the development of new interventions.”

Findings in mice revealed that globus pallidus externus parvalbumin-positive cells, which indirectly influence the release of dopamine, become more excitable after exposure to cocaine. This caused a drop in the expression of certain membrane channel proteins that usually help keep globus pallidus cell activity in check. The researchers found that carnosic acid, an isolate of rosemary extract, selectively binds to the affected channels, providing an avenue to reduce the response to the drug in a relatively specific fashion.

“Only a subset of individuals are vulnerable to developing a substance use disorder, but we cannot yet identify who they are. If globus pallidus cell activity can effectively predict response to cocaine, it could be used to measure likely responses and thus serve as a biomarker for the most vulnerable,” Beier said. “Furthermore, it’s possible that carnosic acid could be given to those at high risk to reduce the response to cocaine.”

The next steps in this research include thoroughly assessing negative side effects of carnosic acid and determining the ideal dosage and timing. The team is also interested in testing its efficacy in reducing the desire for other drugs and in developing more potent and targeted variants.

In addition to UC Irvine researchers, scientists from West Virginia University and the University of Colorado participated in the study.

This work was supported by grants from the National Institutes of Health, One Mind, the Alzheimer’s Association, New Vision Research, BrightFocus Foundation, and the Brain & Behavior Research Foundation.

Researchers use AI tools to uncover connections between radiotherapy for lung cancer and heart complications

Researchers from Brigham and Women’s Hospital, a founding member of the Mass General Brigham healthcare system, have used artificial intelligence tools to accelerate the understanding of the risk of specific cardiac arrhythmias when various parts of the heart are exposed to different thresholds of radiation as part of a treatment plan for lung cancer. Their results are published in JACC: CardioOncology.

“Radiation exposure to the heart during lung cancer treatment can have very serious and immediate effects on a patient’s cardiovascular health,” said corresponding author Raymond Mak, MD, of the Department of Radiation Oncology at Brigham and Women’s Hospital. “We are hoping to inform not only oncologists and cardiologists, but also patients receiving radiation treatment, about the risks to the heart when treating lung cancer tumors with radiation.”

The emergence of artificial intelligence tools in health care has been groundbreaking and has the potential to positively reshape the continuum of care, including informing treatment plans for patients with cancer. Mass General Brigham, as one of the nation’s top integrated academic health systems and largest innovation enterprises, is leading the way in conducting rigorous research on new and emerging technologies to inform the responsible incorporation of AI into care delivery.

For patients receiving radiation therapy to treat non-small cell lung cancer (NSCLC), arrhythmias, or irregular heart rhythms, can be common. Because the heart sits close to the lungs, and NSCLC tumors often lie near or around it, the heart can sustain collateral damage from radiation dose spillage intended for the tumors. Prior studies have found that this type of exposure is associated with general cardiac issues. However, this more nuanced study demonstrated that the risk for different types of arrhythmias can vary significantly based on the pathophysiology and the cardiac structures exposed to different levels of radiation.

To classify the types of arrhythmias associated with irradiated cardiac substructures, researchers conducted a retrospective analysis of 748 patients in Massachusetts who were treated with radiation for locally advanced NSCLC. The arrhythmia subtypes cataloged included atrial fibrillation, atrial flutter, other supraventricular tachycardia, bradyarrhythmia, and ventricular tachyarrhythmia or asystole.

The team’s statistical analyses indicated that about one out of every six patients experienced at least one grade 3 arrhythmia with a median time of 2.0 years until the first arrhythmia. Grade 3 classifications are considered serious events that likely need intervention or require hospitalization. They also found that almost one-third of patients who experienced arrhythmias also suffered from major adverse cardiac events.

The arrhythmia classes outlined in the study did not entirely encompass the range of heart rhythm issues that are possible, but the authors note that these observations still create a better understanding of the possible pathophysiology pathways and potential avenues for minimizing cardiac toxicity after receiving radiation treatment. Their work also offers a predictive model for dose exposure and the type of expected arrhythmia.

For the future, the researchers believe that radiation oncologists should collaborate with cardiology experts to better understand the mechanisms of heart injuries and their connection to radiation treatment. In addition, they should take advantage of modern radiation treatment to actively sculpt radiation exposure away from the specific cardiac regions that are at high risk for causing arrhythmias. According to Mak, this study, alongside previous research, will help with surveillance, screening, and informing radiation oncologists on which parts of the heart to limit radiation exposure to, and in turn, mitigate complications.

“An interesting part of what we did was leverage artificial intelligence algorithms to segment structures like the pulmonary vein and parts of the conduction system to measure the radiation dose exposure in over 700 patients. This saved us many months of manual work,” said Mak. “So, not only does this work have potential clinical impact, but it also opens the door for using AI in radiation oncology research to streamline discovery and create larger datasets.”
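
As a rough illustration of the dose-measurement step Mak describes, the sketch below shows how dose-volume metrics for an AI-segmented cardiac substructure could be read off a planning dose grid with NumPy. This is not the team’s pipeline; the dose grid, the segmentation mask, and the 30 Gy threshold are hypothetical placeholders.

    # Illustrative sketch only: given a dose grid and a binary mask produced by a
    # segmentation model for a cardiac substructure (e.g., the pulmonary vein),
    # summary dose metrics reduce to a few lines of array math.
    import numpy as np

    rng = np.random.default_rng(0)
    dose_gy = rng.uniform(0, 60, size=(64, 64, 64))        # placeholder dose grid (Gy)
    substructure_mask = rng.random((64, 64, 64)) > 0.995   # placeholder AI segmentation

    voxel_doses = dose_gy[substructure_mask]                # doses inside the structure
    mean_dose = voxel_doses.mean()                          # mean dose to the substructure
    v30 = (voxel_doses >= 30).mean() * 100                  # % of its volume receiving >= 30 Gy

    print(f"Mean dose: {mean_dose:.1f} Gy, V30: {v30:.1f}%")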

Mpox is not the new Covid and can be stopped, expert says

The world must act now to ensure vaccines reach the areas most in need, the WHO’s Dr Hans Kluge says.

‘I was addicted to smoking Spice vapes at school’

A teenager shares how he became addicted after trying the drug in a vape with school friends.

Heart data unlocks sleep secrets

We know that quality sleep is as essential to survival as food and water. Yet, even though we spend a third of our lives in slumber, sleep largely remains a scientific mystery.

Not that experts haven’t tried.

Sleep analysis, also known as polysomnography, is used to diagnose sleep disorders by recording multiple types of data, including brain activity (electroencephalogram, or EEG) and heart activity (electrocardiogram, or ECG). Typically, patients are hooked up to dozens of sensors and wires in a clinic, tracking brain, eye, muscle, breathing, and heart activity while sleeping. Not exactly Zzz-inducing.

But what if you could perform the same test at home, just as accurately and in real time?

For the first time, computer science researchers at the University of Southern California have developed an approach that matches the performance of expert-scored polysomnography using just a single-lead electrocardiogram (ECG). The software, which is open-source, allows anyone with basic coding experience to create their own low-cost, DIY sleep-tracking device.

“Researchers have been trying for decades to find simpler and cheaper methods to monitor sleep, especially without the awkward cap,” said lead author Adam Jones, who recently earned his PhD from USC. “But so far, the poor performance, even in ideal conditions, has led to the conclusion that it won’t be possible and that measuring brain activity is necessary. Our research shows that this assumption is no longer true.”

The model, which scores sleep at the most detailed staging level, also significantly outperformed other EEG-less models, including commercial sleep-tracking devices, the researchers said. “We wanted to develop a system that addresses the limitations of current methods and the need for more accessibility and affordability in sleep analysis,” said Jones.

The study, published June 2024 in the journal Computers in Biology and Medicine, was co-authored by Laurent Itti, a professor of computer science and Jones’ advisor, and Jones’ longtime collaborator, Bhavin R. Sheth, a USC alumnus and electrical engineer at the University of Houston.

Could the heart be leading the band?

Sleep, a key cognitive decline predictor, becomes shorter and more fragmented with age — a finding validated by both previous studies and the researchers’ neural network. But this decline happens earlier than you might expect. A recent study in Neurology found that people who have more interrupted sleep in their 30s and 40s are more than twice as likely to have memory problems a decade later.

Chronic poor sleep can also contribute to the accumulation of beta-amyloid plaques, a hallmark of Alzheimer’s disease.

“It’s a little scary,” said Jones, who admits he was formerly in the “sleep when I’m dead” camp before embarking on this research as a hobby project in 2010. “That’s why I want these interventions to come quickly and to make them accessible to as many people as possible. This software could help tease apart what’s happening when we sleep every night.”

The researchers trained their model on a large, diverse dataset of 4,000 recordings from subjects ranging from 5 to 90 years old, using only heart data and a deep-learning neural network. Through trial and error, spanning hundreds of iterations, they found that the automated ECG-only network could score sleep just as well as the “gold standard” polysomnography. It successfully categorized sleep into all five stages, including rapid eye movement (REM), which is essential for memory consolidation and emotional stability, and non-REM sleep, including deep sleep, which is crucial for physical and mental restoration.
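
For readers curious what an ECG-only sleep stager might look like in code, here is a minimal sketch, not the team’s open-source network: a small 1-D convolutional classifier in PyTorch that maps 30-second single-lead ECG epochs to the five standard stages. The architecture, sampling rate, and layer sizes are assumptions made for illustration.

    # Minimal sketch of an ECG-based sleep-stage classifier (illustrative only).
    import torch
    import torch.nn as nn

    class ECGSleepStager(nn.Module):
        def __init__(self, n_stages: int = 5):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(1, 32, kernel_size=15, stride=2, padding=7), nn.ReLU(),
                nn.Conv1d(32, 64, kernel_size=15, stride=2, padding=7), nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),        # collapse the time axis to one value per channel
            )
            self.classifier = nn.Linear(64, n_stages)

        def forward(self, ecg: torch.Tensor) -> torch.Tensor:
            # ecg: (batch, 1, samples), e.g. 30 s at an assumed 256 Hz = 7680 samples
            return self.classifier(self.features(ecg).squeeze(-1))

    model = ECGSleepStager()
    epochs = torch.randn(8, 1, 7680)            # a batch of dummy ECG epochs
    stage_logits = model(epochs)                # shape (8, 5); argmax picks Wake/N1/N2/N3/REM
    print(stage_logits.shape)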

In addition to simplifying a typically expensive and cumbersome process, this insight highlights a deeper connection between the heart and the brain than previously understood. It also underscores the role of the autonomic nervous system, which links the brain and heart.

“The heart and the brain are connected in ways that are not well-understood, and this research aims to bridge that gap,” said Jones. “There is a lot of evidence in my paper that, in fact, the heart may be leading the band, as it were.”

The work could also help improve sleep studies in remote populations, helping to shed light on the origins and functions of sleep.

In a follow-up paper currently being prepared, Jones aims to explore further what the network focuses on in the ECG data. “I think there is a lot of information hidden in the heart that we don’t know about yet,” he said.

Development of a model capable of predicting the cycle lives of high-energy-density lithium-metal batteries

NIMS and SoftBank Corp. have jointly developed a model capable of predicting the cycle lives of high-energy-density lithium-metal batteries by applying machine learning methods to battery performance data. The model proved able to accurately estimate batteries’ longevity by analyzing their charge, discharge and voltage relaxation process data without relying on any assumption about specific battery degradation mechanisms. The technique is expected to be useful in improving the safety and reliability of devices powered by lithium-metal batteries.

Lithium-metal batteries have the potential to achieve energy densities per unit mass higher than those of the lithium-ion batteries currently in use. For this reason, expectations are high for their use in a wide range of technologies, including drones, electric vehicles and household electricity storage systems. In 2018, NIMS and SoftBank established the NIMS-SoftBank Advanced Technologies Development Center. Together they have since carried out research on high-energy-density rechargeable batteries for use in various systems, such as mobile phone base stations, the Internet of Things (IoT) and high altitude platform stations (HAPS).

A lithium-metal battery with an energy density higher than 300 Wh/kg and a life of more than 200 charge/discharge cycles has previously been reported. Putting high-performance lithium-metal batteries like this into practical use while ensuring their safety will require the development of techniques capable of accurately estimating the cycle lives of these batteries. However, degradation mechanisms are more complex in lithium-metal batteries than in conventional lithium-ion batteries and are not yet fully understood, making the development of models capable of predicting the cycle lives of lithium-metal batteries a great challenge.

This research team fabricated a large number of high-energy-density lithium-metal battery cells — each composed of a lithium-metal anode and a nickel-rich cathode — using advanced battery fabrication techniques the team had previously developed. The team then evaluated the charge/discharge performance of these cells. Finally, the team constructed a model able to predict the cycle lives of lithium-metal batteries by applying machine learning methods to the charge/discharge data. The model proved able to make accurate predictions by analyzing charge, discharge and voltage relaxation process data without relying on any assumption about specific battery degradation mechanisms.
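
A hedged sketch of that general recipe, not the NIMS-SoftBank model itself: summarize each cell’s voltage-relaxation curve with a few hand-crafted features and regress cycle life on them with an off-the-shelf machine learning model. The synthetic curves, feature choices, and labels below are invented purely for illustration.

    # Illustrative sketch: featurize voltage-relaxation curves and predict cycle life.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n_cells, n_points = 200, 100

    # Synthetic stand-in for per-cell voltage-relaxation curves (V vs. time).
    decay = 1 - np.exp(-np.linspace(0, 1, n_points))         # shape (n_points,)
    scale = rng.uniform(0.5, 1.5, size=(n_cells, 1))         # per-cell variation
    relaxation = 4.2 - 0.3 * decay * scale                   # (n_cells, n_points) voltages

    # Simple summary features: total drop, curve spread, and early drop.
    features = np.column_stack([
        relaxation[:, 0] - relaxation[:, -1],
        relaxation.var(axis=1),
        relaxation[:, 0] - relaxation[:, 5],
    ])

    # Synthetic "cycle life" label loosely tied to the relaxation behavior.
    cycle_life = 400 - 600 * features[:, 0] + rng.normal(0, 10, n_cells)

    X_train, X_test, y_train, y_test = train_test_split(features, cycle_life, random_state=0)
    model = GradientBoostingRegressor().fit(X_train, y_train)
    print(f"Held-out R^2: {model.score(X_test, y_test):.2f}")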

The team intends to further improve the cycle life prediction accuracy of the model and expedite efforts to put high-energy-density lithium-metal batteries into practical use by leveraging the model in the development of new lithium-metal anode materials.

Using AI to find the polymers of the future

Nylon, Teflon, Kevlar. These are just a few familiar polymers — large-molecule chemical compounds — that have changed the world. From Teflon-coated frying pans to 3D printing, polymers are vital to creating the systems that make the world function better.

Finding the next groundbreaking polymer is always a challenge, but now Georgia Tech researchers are using artificial intelligence (AI) to shape and transform the future of the field. Rampi Ramprasad’s group develops and adapts AI algorithms to accelerate materials discovery.

This summer, two papers published in the Nature family of journals highlight the significant advancements and success stories emerging from years of AI-driven polymer informatics research. The first, featured in Nature Reviews Materials, showcases recent breakthroughs in polymer design across critical and contemporary application domains: energy storage, filtration technologies, and recyclable plastics. The second, published in Nature Communications, focuses on the use of AI algorithms to discover a subclass of polymers for electrostatic energy storage, with the designed materials undergoing successful laboratory synthesis and testing.

“In the early days of AI in materials science, propelled by the White House’s Materials Genome Initiative over a decade ago, research in this field was largely curiosity-driven,” said Ramprasad, a professor in the School of Materials Science and Engineering. “Only in recent years have we begun to see tangible, real-world success stories in AI-driven accelerated polymer discovery. These successes are now inspiring significant transformations in the industrial materials R&D landscape. That’s what makes this review so significant and timely.”

AI Opportunities

Ramprasad’s team has developed groundbreaking algorithms that can instantly predict polymer properties and formulations before they are physically created. The process begins by defining application-specific target properties or performance criteria. Machine learning (ML) models train on existing material-property data to predict these desired outcomes. The team can also generate new candidate polymers whose properties are forecast with the ML models. The top candidates that meet the target criteria are then selected for real-world validation through laboratory synthesis and testing. The results from these new experiments are integrated with the original data, further refining the predictive models in a continuous, iterative process.
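
That iterative loop can be sketched in a few lines of Python. This is an illustrative stand-in, not the Ramprasad group’s code: the random descriptors, the random-forest surrogate model, and the lab_validation() oracle below are hypothetical placeholders for real polymer fingerprints, trained property predictors, and laboratory experiments.

    # Toy design loop: predict, screen candidates, "validate" top picks, retrain.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(42)

    def lab_validation(fingerprints):
        """Stand-in for synthesis and testing: returns a 'measured' property value."""
        weights = np.array([2.0, -1.0, 0.5, 0.0, 1.5])
        return fingerprints @ weights + rng.normal(0, 0.1, len(fingerprints))

    # Initial dataset: polymer "fingerprints" (5 random descriptors) and measured property.
    X_known = rng.random((50, 5))
    y_known = lab_validation(X_known)

    for round_number in range(3):
        surrogate = RandomForestRegressor(random_state=0).fit(X_known, y_known)

        candidates = rng.random((1000, 5))                   # stand-in for generated polymers
        predicted = surrogate.predict(candidates)
        top_picks = candidates[np.argsort(predicted)[-5:]]   # best 5 by predicted property

        measured = lab_validation(top_picks)                 # "experiments" on the top picks
        X_known = np.vstack([X_known, top_picks])
        y_known = np.concatenate([y_known, measured])
        print(f"Round {round_number + 1}: best measured value so far = {y_known.max():.2f}")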

While AI can accelerate the discovery of new polymers, it also presents unique challenges. The accuracy of AI predictions depends on the availability of rich, diverse, extensive initial data sets, making quality data paramount. Additionally, designing algorithms capable of generating chemically realistic and synthesizable polymers is a complex task.

The real challenge begins after the algorithms make their predictions: proving that the designed materials can be made in the lab and function as expected and then demonstrating their scalability beyond the lab for real-world use. Ramprasad’s group designs these materials, while their fabrication, processing, and testing are carried out by collaborators at various institutions, including Georgia Tech. Professor Ryan Lively from the School of Chemical and Biomolecular Engineering frequently collaborates with Ramprasad’s group and is a co-author of the paper published in Nature Reviews Materials.

“In our day-to-day research, we extensively use the machine learning models Rampi’s team has developed,” Lively said. “These tools accelerate our work and allow us to rapidly explore new ideas. This embodies the promise of ML and AI because we can make model-guided decisions before we commit time and resources to explore the concepts in the laboratory.”

Using AI, Ramprasad’s team and their collaborators have made significant advancements in diverse fields, including energy storage, filtration technologies, additive manufacturing, and recyclable materials.

Polymer Progress

One notable success, described in the Nature Communications paper, involves the design of new polymers for capacitors, which store electrostatic energy. These devices are vital components in electric and hybrid vehicles, among other applications. Ramprasad’s group worked with researchers from the University of Connecticut.

Current capacitor polymers offer either high energy density or thermal stability, but not both. By leveraging AI tools, the researchers determined that insulating materials made from polynorbornene and polyimide polymers can simultaneously achieve high energy density and high thermal stability. The polymers can be further enhanced to function in demanding environments, such as aerospace applications, while maintaining environmental sustainability.

“The new class of polymers with high energy density and high thermal stability is one of the most concrete examples of how AI can guide materials discovery,” said Ramprasad. “It is also the result of years of multidisciplinary collaborative work with Greg Sotzing and Yang Cao at the University of Connecticut and sustained sponsorship by the Office of Naval Research.”

Industry Potential

The potential for real-world translation of AI-assisted materials development is underscored by industry participation in the Nature Reviews Materials article. Co-authors of this paper also include scientists from Toyota Research Institute and General Electric. To further accelerate the adoption of AI-driven materials development in industry, Ramprasad co-founded Matmerize Inc., a software startup company recently spun out of Georgia Tech. Their cloud-based polymer informatics software is already being used by companies across various sectors, including energy, electronics, consumer products, chemical processing, and sustainable materials.

“Matmerize has transformed our research into a robust, versatile, and industry-ready solution, enabling users to design materials virtually with enhanced efficiency and reduced cost,” Ramprasad said. “What began as a curiosity has gained significant momentum, and we are entering an exciting new era of materials by design.”

Women harmed by vaginal mesh in England get payout

More than 100 women who experienced complications have received payouts from manufacturers.
