When allocating scarce resources with AI, randomization can improve fairness

Organizations are increasingly utilizing machine-learning models to allocate scarce resources or opportunities. For instance, such models can help companies screen resumes to choose job interview candidates or aid hospitals in ranking kidney transplant patients based on their likelihood of survival.

When deploying a model, users typically strive to ensure its predictions are fair by reducing bias. This often involves techniques like adjusting the features a model uses to make decisions or calibrating the scores it generates.

However, researchers from MIT and Northeastern University argue that these fairness methods are not sufficient to address structural injustices and inherent uncertainties. In a new paper, they show how randomizing a model’s decisions in a structured way can improve fairness in certain situations.

For example, if multiple companies use the same machine-learning model to rank job interview candidates deterministically — without any randomization — then one deserving individual could be the bottom-ranked candidate for every job, perhaps due to how the model weighs answers provided in an online form. Introducing randomization into a model’s decisions could prevent one worthy person or group from always being denied a scarce resource, like a job interview.

Through their analysis, the researchers found that randomization can be especially beneficial when a model’s decisions involve uncertainty or when the same group consistently receives negative decisions.

They present a framework one could use to introduce a specific amount of randomization into a model’s decisions by allocating resources through a weighted lottery. This method, which an individual can tailor to fit their situation, can improve fairness without hurting the efficiency or accuracy of a model.
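As a rough sketch of that weighted-lottery idea (the function below, its strength parameter, and the example scores are illustrative assumptions, not the authors' implementation), selection probabilities can be made proportional to a power of each candidate's score, so stronger claims are favored without anyone's chance dropping to zero:

```python
import numpy as np

def weighted_lottery(scores, num_slots, strength=1.0, rng=None):
    """Allocate num_slots opportunities by a weighted lottery over model scores.

    strength = 0 gives a uniform lottery; larger values approach the
    deterministic top-ranked allocation.
    """
    rng = np.random.default_rng() if rng is None else rng
    weights = np.power(np.clip(scores, 1e-9, None), strength)
    probs = weights / weights.sum()
    # Draw winners without replacement, with probability proportional to weight.
    return rng.choice(len(scores), size=num_slots, replace=False, p=probs)

# Example: three interview slots among six candidates scored by a model.
scores = np.array([0.9, 0.8, 0.75, 0.6, 0.5, 0.4])
print(weighted_lottery(scores, num_slots=3, strength=2.0))
```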

“Even if you could make fair predictions, should you be deciding these social allocations of scarce resources or opportunities strictly off scores or rankings? As things scale, and we see more and more opportunities being decided by these algorithms, the inherent uncertainties in these scores can be amplified. We show that fairness may require some sort of randomization,” says Shomik Jain, a graduate student in the Institute for Data, Systems, and Society (IDSS) and lead author of the paper.

Jain is joined on the paper by Kathleen Creel, assistant professor of philosophy and computer science at Northeastern University; and senior author Ashia Wilson, the Lister Brothers Career Development Professor in the Department of Electrical Engineering and Computer Science and a principal investigator in the Laboratory for Information and Decision Systems (LIDS). The research will be presented at the International Conference on Machine Learning.

Considering claims

This work builds off a previous paper in which the researchers explored harms that can occur when one uses deterministic systems at scale. They found that using a machine-learning model to deterministically allocate resources can amplify inequalities that exist in training data, which can reinforce bias and systemic inequality.

“Randomization is a very useful concept in statistics, and to our delight, satisfies the fairness demands coming from both a systemic and individual point of view,” Wilson says.

In this paper, they explored the question of when randomization can improve fairness. They framed their analysis around the ideas of philosopher John Broome, who wrote about the value of using lotteries to award scarce resources in a way that honors all claims of individuals.

A person’s claim to a scarce resource, like a kidney transplant, can stem from merit, deservingness, or need. For instance, everyone has a right to life, and their claims on a kidney transplant may stem from that right, Wilson explains.

“When you acknowledge that people have different claims to these scarce resources, fairness is going to require that we respect all claims of individuals. If we always give someone with a stronger claim the resource, is that fair?” Jain says.

That sort of deterministic allocation could cause systemic exclusion or exacerbate patterned inequality, which occurs when receiving one allocation increases an individual’s likelihood of receiving future allocations. In addition, machine-learning models can make mistakes, and a deterministic approach could cause the same mistake to be repeated.

Randomization can overcome these problems, but that doesn’t mean all decisions a model makes should be randomized equally.

Structured randomization

The researchers use a weighted lottery to adjust the level of randomization based on the amount of uncertainty involved in the model’s decision-making. A decision that is less certain should incorporate more randomization.

“In kidney allocation, usually the planning is around projected lifespan, and that is deeply uncertain. If two patients are only five years apart, it becomes a lot harder to measure. We want to leverage that level of uncertainty to tailor the randomization,” Wilson says.

The researchers used statistical uncertainty quantification methods to determine how much randomization is needed in different situations. They show that calibrated randomization can lead to fairer outcomes for individuals without significantly affecting the utility, or effectiveness, of the model.
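One hedged way to picture that calibration (the Gumbel-noise mechanism and the per-candidate uncertainty inputs below are assumptions for illustration, not the paper's procedure) is to perturb each score with noise scaled by its estimated uncertainty before ranking, so confident decisions stay nearly deterministic while uncertain ones behave more like a lottery:

```python
import numpy as np

def uncertainty_calibrated_topk(scores, uncertainties, k, rng=None):
    """Select k recipients after adding noise whose scale tracks each score's
    uncertainty: low-uncertainty scores stay close to a deterministic ranking,
    high-uncertainty scores are shuffled more heavily."""
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.gumbel(loc=0.0, scale=np.asarray(uncertainties, dtype=float))
    perturbed = np.asarray(scores, dtype=float) + noise
    return np.argsort(perturbed)[::-1][:k]

scores = [0.82, 0.80, 0.79, 0.55]          # model scores
uncertainties = [0.01, 0.15, 0.15, 0.02]   # e.g., widths of prediction intervals
print(uncertainty_calibrated_topk(scores, uncertainties, k=2))
```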

“There is a balance to be had between overall utility and respecting the rights of the individuals who are receiving a scarce resource, but oftentimes the tradeoff is relatively small,” says Wilson.

However, the researchers emphasize there are situations where randomizing decisions would not improve fairness and could harm individuals, such as in criminal justice contexts.

But there could be other areas where randomization can improve fairness, such as college admissions, and the researchers plan to study other use-cases in future work. They also want to explore how randomization can affect other factors, such as competition or prices, and how it could be used to improve the robustness of machine-learning models.

“We are hoping our paper is a first move toward illustrating that there might be a benefit to randomization. We are offering randomization as a tool. How much you are going to want to do it is going to be up to all the stakeholders in the allocation to decide. And, of course, how they decide is another research question all together,” says Wilson.

New additive process can make better — and greener — high-value chemicals

Researchers at the Center for Advanced Bioenergy and Bioproducts Innovation (CABBI) have achieved a significant breakthrough that could lead to better — and greener — agricultural chemicals and everyday products.

Using a process that combines natural enzymes and light, the team from the University of Illinois Urbana-Champaign developed an eco-friendly way to precisely mix fluorine, an important additive, into chemicals called olefins — hydrocarbons used in a vast array of products, from detergents to fuels to medicines. This groundbreaking method offers an efficient new strategy for creating high-value chemicals with potential applications in agrochemicals, pharmaceuticals, renewable fuels, and more.

The study, published in Science, was led by CABBI Conversion Theme Leader Huimin Zhao, Professor of Chemical and Biomolecular Engineering (ChBE), Biosystems Design Theme Leader at the Carl R. Woese Institute for Genomic Biology (IGB), and Director of the NSF Molecule Maker Lab Institute at Illinois; and lead author Maolin Li, a Postdoctoral Research Associate with CABBI, ChBE, and IGB.

As an additive, fluorine can make agrochemicals and medicines work better and last longer. Its small size, electronic properties, and ability to dissolve easily in fats and oils all have a profound impact on the function of organic molecules, augmenting their absorption, metabolic stability, and protein interactions. However, adding fluorine is tricky and usually requires complex chemical processes that are not always friendly to the environment.

The scientists in this study used a “photoenzyme” — a repurposed enzyme that works under light — to help bring fluorine into these chemicals. By using light and photoenzymes, they were able to precisely attach fluorine to olefins, controlling exactly where and how it is added. Because this method is not only environmentally friendly but very specific, it allows for more efficient creation of useful new compounds that were difficult to make before.

This approach fills a large gap in molecular chemistry, as previous methods to add fluorine were limited and inefficient. It also opens up new possibilities for creating better medicines and agricultural products, as fluorinated compounds are often more effective, stable, and longer-lasting than their non-fluorinated counterparts. That means fertilizers and herbicides could be more effective in protecting crops, and medicines could be more potent or have fewer side effects.

“This breakthrough represents a significant shift in how we approach the synthesis of fluorinated compounds, crucial in numerous applications from medicine to agriculture,” Zhao said. “By harnessing the power of light-activated enzymes, we’ve developed a method that improves the efficiency of these syntheses and aligns with environmental sustainability. This work could pave the way for new, greener technologies in chemical production, which is a win not just for science, but for society at large.”

The research advances CABBI’s bioenergy mission by pioneering innovative methods in biocatalysis that can enhance the production of bio-based chemicals — those derived from renewable resources such as plants or microorganisms rather than petroleum. The development of more efficient and environmentally friendly biochemical processes aligns with CABBI’s focus on creating sustainable bioenergy solutions that minimize environmental impact and reduce reliance on fossil fuels.

It also contributes to the broader U.S. Department of Energy (DOE) mission of driving advances in bioenergy and bioproducts. The methods developed in this study can lead to more sustainable industrial processes that are less energy-intensive and reduce chemical waste and pollution, supporting DOE’s goals of fostering clean energy technologies. The ability to efficiently create high-value fluorinated compounds could lead to enhancements in various fields, including renewable energy sources and bioproducts that support economic growth and environmental sustainability.

“Our research opens up fascinating possibilities for the future of pharmaceutical and agrochemical development,” Li said. “By integrating fluorine into organic molecules through a photoenzymatic process, we are not only enhancing the beneficial properties of these compounds but also doing so in a manner that’s more environmentally responsible. It’s thrilling to think about the potential applications of our work in creating more effective and sustainable products for everyday use.”

CABBI researchers Yujie Yuan, Wesley Harrison, and Zhengyi Zhang of ChBE and IGB at Illinois were co-authors on this study.

‘Dancing molecules’ heal cartilage damage

In November 2021, Northwestern University researchers introduced an injectable new therapy, which harnessed fast-moving “dancing molecules,” to repair tissues and reverse paralysis after severe spinal cord injuries.

Now, the same research group has applied the therapeutic strategy to damaged human cartilage cells. In the new study, the treatment activated the gene expression necessary to regenerate cartilage within just four hours. And, after only three days, the human cells produced protein components needed for cartilage regeneration.

The researchers also found that, as the molecular motion increased, the treatment’s effectiveness also increased. In other words, the molecules’ “dancing” motions were crucial for triggering the cartilage growth process.

The study was published today (July 26) in the Journal of the American Chemical Society.

“When we first observed therapeutic effects of dancing molecules, we did not see any reason why it should only apply to the spinal cord,” said Northwestern’s Samuel I. Stupp, who led the study. “Now, we observe the effects in two cell types that are completely disconnected from one another — cartilage cells in our joints and neurons in our brain and spinal cord. This makes me more confident that we might have discovered a universal phenomenon. It could apply to many other tissues.”

An expert in regenerative nanomedicine, Stupp is Board of Trustees Professor of Materials Science and Engineering, Chemistry, Medicine and Biomedical Engineering at Northwestern, where he is founding director of the Simpson Querrey Institute for BioNanotechnology and its affiliated center, the Center for Regenerative Nanomedicine. Stupp has appointments in the McCormick School of Engineering, Weinberg College of Arts and Sciences and Feinberg School of Medicine. Shelby Yuan, a graduate student in the Stupp laboratory, was primary author of the study.

Big problem, few solutions

As of 2019, nearly 530 million people around the globe were living with osteoarthritis, according to the World Health Organization. A degenerative disease in which tissues in joints break down over time, osteoarthritis is a common health problem and leading cause of disability.

In patients with severe osteoarthritis, cartilage can wear so thin that joints essentially transform into bone on bone — without a cushion between. Not only is this incredibly painful, but patients’ joints can also no longer function properly. At that point, the only effective treatment is joint replacement surgery, which is expensive and invasive.

“Current treatments aim to slow disease progression or postpone inevitable joint replacement,” Stupp said. “There are no regenerative options because humans do not have an inherent capacity to regenerate cartilage in adulthood.”

What are ‘dancing molecules’?

Stupp and his team posited that “dancing molecules” might encourage the stubborn tissue to regenerate. Previously invented in Stupp’s laboratory, dancing molecules are assemblies that form synthetic nanofibers comprising tens to hundreds of thousands of molecules with potent signals for cells. By tuning their collective motions through their chemical structure, Stupp discovered the moving molecules could rapidly find and properly engage with cellular receptors, which also are in constant motion and extremely crowded on cell membranes.

Once inside the body, the nanofibers mimic the extracellular matrix of the surrounding tissue. By matching the matrix’s structure, mimicking the motion of biological molecules and incorporating bioactive signals for the receptors, the synthetic materials are able to communicate with cells.

“Cellular receptors constantly move around,” Stupp said. “By making our molecules move, ‘dance’ or even leap temporarily out of these structures, known as supramolecular polymers, they are able to connect more effectively with receptors.”

Motion matters

In the new study, Stupp and his team looked to the receptors for a specific protein critical for cartilage formation and maintenance. To target this receptor, the team developed a new circular peptide that mimics the bioactive signal of the protein, which is called transforming growth factor beta-1 (TGFb-1).

Then, the researchers incorporated this peptide into two different molecules that interact to form supramolecular polymers in water, each with the same ability to mimic TGFb-1. The researchers designed one supramolecular polymer with a special structure that enabled its molecules to move more freely within the large assemblies. The other supramolecular polymer, however, restricted molecular movement.

“We wanted to modify the structure in order to compare two systems that differ in the extent of their motion,” Stupp said. “The intensity of supramolecular motion in one is much greater than the motion in the other one.”

Although both polymers mimicked the signal to activate the TGFb-1 receptor, the polymer with rapidly moving molecules was much more effective. In some ways, it was even more effective than the protein that activates the TGFb-1 receptor in nature.

“After three days, the human cells exposed to the long assemblies of more mobile molecules produced greater amounts of the protein components necessary for cartilage regeneration,” Stupp said. “For the production of one of the components in cartilage matrix, known as collagen II, the dancing molecules containing the cyclic peptide that activates the TGF-beta1 receptor were even more effective than the natural protein that has this function in biological systems.”

What’s next?

Stupp’s team is currently testing these systems in animal studies and adding additional signals to create highly bioactive therapies.

“With the success of the study in human cartilage cells, we predict that cartilage regeneration will be greatly enhanced when used in highly translational pre-clinical models,” Stupp said. “It should develop into a novel bioactive material for regeneration of cartilage tissue in joints.”

Stupp’s lab is also testing the ability of dancing molecules to regenerate bone — and already has promising early results, which likely will be published later this year. Simultaneously, he is testing the molecules in human organoids to accelerate the process of discovering and optimizing therapeutic materials.

Stupp’s team also continues to build its case to the Food and Drug Administration, aiming to gain approval for clinical trials to test the therapy for spinal cord repair.

“We are beginning to see the tremendous breadth of conditions that this fundamental discovery on ‘dancing molecules’ could apply to,” Stupp said. “Controlling supramolecular motion through chemical design appears to be a powerful tool to increase efficacy for a range of regenerative therapies.”

Two shark species documented in Puget Sound for first time

Oregon State University researchers have made the first scientific confirmation in Puget Sound of two distinct shark species, one of them critically endangered.

The presence of the broadnose sevengill shark and endangered soupfin shark in the sound, the southern portion of the Salish Sea, may indicate changes in what biologists in OSU’s Big Fish Lab describe as an economically, culturally and ecologically valuable inland waterway.

The Salish Sea separates northwest Washington from British Columbia’s Vancouver Island. The 6,500-square-mile body of water stretches into Washington as Puget Sound, and the sharks were caught close to Olympia near the sound’s southernmost point.

Taylor Chapple, an assistant professor in Oregon State’s College of Agricultural Sciences, and graduate students Jessica Schulte and Ethan Personius documented the broadnose sevengill and soupfin sharks in papers published in Frontiers in Marine Science.

The authors collaborated with partners at NOAA’s National Marine Fisheries Service and the Washington Department of Fish and Wildlife to confirm that the broadnose sevengill, an apex predator that can grow to nearly 10 feet, is now inhabiting heavily urbanized South Puget Sound.

“Understanding the sevengill presence in this new habitat is crucial for understanding the food webs of the Salish Sea, and it highlights the need for continued monitoring and research — including their relationship with other species of conservation concern, such as salmon,” said Schulte, the lead author on the sevengill paper.

Broadnose sevengill sharks — so named because they have two more gill slits than most shark species — eat a wide variety of prey: fishes (including rays and other sharks), crustaceans and marine mammals. They live in temperate waters worldwide, and off the west coast of North America they range from southern Alaska to Baja California.

Prior to 2021, only one sevengill shark had ever been confirmed in the Salish Sea, at Point Roberts, Washington, near the Canadian border. In August 2021, however, anecdotal reports indicated several of them had been caught in South Puget Sound.

During 10 days of field work in 2022 and 2023, the scientists caught nine sevengills, more than 190 miles away from their previously documented range. Eight of them were males — the largest measured just under 7 feet — and the female was about 4 feet, 6 inches.

“Our continued research on this species in Oregon and Washington waters will allow us to have a better handle on its role in our valuable marine ecosystems,” Schulte said.

The same holds for the soupfin shark, said Personius, the lead author on that paper. The largest species of hound shark, it can reach about 6 1/2 feet and got its name from its use as the key ingredient in shark fin soup.

“Soupfin sharks were relentlessly exploited during the 1930s and 1940s, including for their livers, which are rich in vitamin A,” Personius said. “Despite lower fishing pressure the species has not been able to recover and is currently under consideration for federal protection under the Endangered Species Act.”

Like the broadnose sevengill shark, the soupfin shark is found in temperate waters around the globe and is a top predator in any ecosystem it inhabits, eating cephalopods as well as a variety of fishes. Soupfin sharks are known as strong swimmers whose migrations can exceed 1,000 miles.

In field work concurrent with the sevengill project, the scientists caught one soupfin shark, a male that measured just over 5 feet.

“The Salish Sea has experienced pervasive shifts in species abundance and composition along with industrialization and significant habitat degradation,” Personius said. “The appearance of soupfin sharks may be a result of climate change and changes in prey availability.”

Following the 2014-15 extreme marine heat wave event known as “The Blob,” he explained, anchovies emerged as a dominant forage fish species in the Salish Sea after having been historically uncommon there. Soupfin sharks are a known predator of anchovies.

Graduate student Maddie English is a co-author of the soupfin shark paper, along with scientists from the NOAA Marine Fisheries Service and the Washington Department of Fish and Wildlife. Research associate Alexandra McInturf contributed to the sevengill study.

AI method radically speeds predictions of materials’ thermal properties

It is estimated that about 70 percent of the energy generated worldwide ends up as waste heat.

If scientists could better predict how heat moves through semiconductors and insulators, they could design more efficient power generation systems. However, the thermal properties of materials can be exceedingly difficult to model.

The trouble comes from phonons, the quasiparticles of lattice vibration that carry heat through a material. Some of a material’s thermal properties depend on a measurement called the phonon dispersion relation, which can be incredibly hard to obtain, let alone utilize in the design of a system.

A team of researchers from MIT and elsewhere tackled this challenge by rethinking the problem from the ground up. The result of their work is a new machine-learning framework that can predict phonon dispersion relations up to 1,000 times faster than other AI-based techniques, with comparable or even better accuracy. Compared to more traditional, non-AI-based approaches, it could be 1 million times faster.

This method could help engineers design energy generation systems that produce more power, more efficiently. It could also be used to develop more efficient microelectronics, since managing heat remains a major bottleneck to speeding up electronics.

“Phonons are the culprit for the thermal loss, yet obtaining their properties is notoriously challenging, either computationally or experimentally,” says Mingda Li, associate professor of nuclear science and engineering and senior author of a paper on this technique.

Li is joined on the paper by co-lead authors Ryotaro Okabe, a chemistry graduate student; and Abhijatmedhi Chotrattanapituk, an electrical engineering and computer science graduate student; Tommi Jaakkola, the Thomas Siebel Professor of Electrical Engineering and Computer Science at MIT; as well as others at MIT, Argonne National Laboratory, Harvard University, the University of South Carolina, Emory University, the University of California at Santa Barbara, and Oak Ridge National Laboratory. The research appears in Nature Computational Science.

Predicting phonons

Heat-carrying phonons are tricky to predict because they have an extremely wide frequency range, and the particles interact and travel at different speeds.

A material’s phonon dispersion relation is the relationship between energy and momentum of phonons in its crystal structure. For years, researchers have tried to predict phonon dispersion relations using machine learning, but there are so many high-precision calculations involved that models get bogged down.

“If you have 100 CPUs and a few weeks, you could probably calculate the phonon dispersion relation for one material. The whole community really wants a more efficient way to do this,” says Okabe.

The machine-learning models scientists often use for these calculations are known as graph neural networks (GNNs). A GNN converts a material’s atomic structure into a crystal graph comprising multiple nodes, which represent atoms, connected by edges, which represent the interatomic bonds between them.
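As a toy illustration of that conversion (the distance cutoff standing in for bonds, the coordinates, and the output format are assumptions; periodic boundary conditions are ignored for brevity):

```python
import numpy as np

def build_crystal_graph(positions, species, cutoff=3.0):
    """Toy crystal graph: nodes are atoms, edges connect pairs of atoms
    whose separation falls below a cutoff, standing in for bonds."""
    positions = np.asarray(positions, dtype=float)
    edges = []
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            if np.linalg.norm(positions[i] - positions[j]) < cutoff:
                edges.append((i, j))
    return {"node_features": list(species), "edges": edges}

# Two-atom toy motif, coordinates in angstroms.
print(build_crystal_graph([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]], ["Ga", "As"]))
```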

While GNNs work well for calculating many quantities, like magnetization or electrical polarization, they are not flexible enough to efficiently predict an extremely high-dimensional quantity like the phonon dispersion relation. Because phonons can travel around atoms on X, Y, and Z axes, their momentum space is hard to model with a fixed graph structure.

To gain the flexibility they needed, Li and his collaborators devised virtual nodes.

They create what they call a virtual node graph neural network (VGNN) by adding a series of flexible virtual nodes to the fixed crystal structure to represent phonons. The virtual nodes enable the output of the neural network to vary in size, so it is not restricted by the fixed crystal structure.

Virtual nodes are connected to the graph in such a way that they can only receive messages from real nodes. While virtual nodes will be updated as the model updates real nodes during computation, they do not affect the accuracy of the model.

“The way we do this is very efficient in coding. You just generate a few more nodes in your GNN. The physical location doesn’t matter, and the real nodes don’t even know the virtual nodes are there,” says Chotrattanapituk.
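A minimal sketch of that receive-only arrangement (the pooling step and feature sizes are placeholder choices, not the published VGNN architecture): virtual nodes aggregate information from the real nodes, but nothing flows back, so the real-node computation is unchanged no matter how many virtual nodes are added.

```python
import numpy as np

def message_passing_with_virtual_nodes(real_feats, edges, num_virtual, rng=None):
    """One toy round of message passing with receive-only virtual nodes."""
    rng = np.random.default_rng(0) if rng is None else rng
    real_feats = np.asarray(real_feats, dtype=float)
    n, _ = real_feats.shape

    # Real-to-real update: sum neighbor features over the crystal-graph edges
    # (a stand-in for a learned message function).
    updated = real_feats.copy()
    for i, j in edges:
        updated[i] += real_feats[j]
        updated[j] += real_feats[i]

    # Virtual nodes: receive-only pooling of all real-node features; they
    # never send messages, so they cannot alter the real-node results above.
    virtual_feats = rng.random((num_virtual, n)) @ real_feats / n

    return updated, virtual_feats

real = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
updated_real, virtual = message_passing_with_virtual_nodes(real, [(0, 1), (1, 2)], num_virtual=4)
print(updated_real.shape, virtual.shape)   # (3, 2) (4, 2)
```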

Cutting out complexity

Since it has virtual nodes to represent phonons, the VGNN can skip many complex calculations when estimating phonon dispersion relations, which makes the method more efficient than a standard GNN.

The researchers proposed three different versions of VGNNs with increasing complexity. Each can be used to predict phonons directly from a material’s atomic coordinates.

Because their approach has the flexibility to rapidly model high-dimensional properties, they can use it to estimate phonon dispersion relations in alloy systems. These complex combinations of metals and nonmetals are especially challenging for traditional approaches to model.

The researchers also found that VGNNs offered slightly greater accuracy when predicting a material’s heat capacity. In some instances, prediction errors were two orders of magnitude lower with their technique.

A VGNN could be used to calculate phonon dispersion relations for a few thousand materials in just a few seconds with a personal computer, Li says.

This efficiency could enable scientists to search a larger space when seeking materials with certain thermal properties, such as superior thermal storage, energy conversion, or superconductivity.

Moreover, the virtual node technique is not exclusive to phonons, and could also be used to predict challenging optical and magnetic properties.

In the future, the researchers want to refine the technique so virtual nodes have greater sensitivity to capture small changes that can affect phonon structure.

“Researchers got too comfortable using graph nodes to represent atoms, but we can rethink that. Graph nodes can be anything. And virtual nodes are a very generic approach you could use to predict a lot of high-dimensional quantities,” Li says.

This work is supported by the U.S. Department of Energy, National Science Foundation, a Mathworks Fellowship, a Sow-Hsin Chen Fellowship, the Harvard Quantum Initiative, and the Oak Ridge National Laboratory.

Win-win potential of grass-powered energy production

Strategically planting perennial grass throughout corn and soybean fields helps address the unintended environmental consequences of growing the dominant row crops, including soil erosion, fertilizer runoff and greenhouse gas emissions.

But converting portions of farmland back to prairie has to make financial sense for farmers, which is why a research team led by Iowa State University landscape ecologist Lisa Schulte Moore has spent the past six years studying how to efficiently turn harvested grass into lucrative renewable natural gas.

“We’re looking at existing markets where there is already a demand, use existing infrastructure to reduce costs of the energy transition and create wins in multiple categories. We want wins for farmers, wins for businesses, wins for municipalities and wins for society,” said Schulte Moore, professor of natural resource ecology and management and director of the Consortium for Cultivating Human And Naturally reGenerative Enterprises (C-CHANGE). “We can have great conversations about what could be, but unless it benefits everyone along these supply chains, it won’t happen.”

A pair of recently published peer-reviewed articles by Schulte Moore’s research group modeled the economic feasibility of grass-to-gas production in different settings and from varying perspectives, analysis that helps flesh out the system’s win-win potential.

“To replace natural gas with resources that revitalize sustainable agriculture, we have to be able to quantify how much energy we can produce and show it can be cost effective and environmentally friendly,” said associate professor of mechanical engineering Mark Mba-Wright, co-author of the studies.

City-based scenarios

The ongoing research is funded in part by a $10 million federal grant in 2020, another $10 million in federal support in 2022 and about $650,000 from the Walton Family Foundation. The work centers on optimizing and expanding the use of anaerobic digesters. Biogas is released in anaerobic digestion, the natural process of organic matter biodegrading without oxygen. Captured in tank-like digesters, biogas can be processed into a fuel that easily swaps in for petroleum-based natural gas. It also can power electrical generators and produce fertilizer.

In a study published in BioEnergy Research, the Iowa State researchers modeled how a network of digesters in and around Ames could supply the city’s heat and power demands. Livestock manure, biofuel byproducts, food waste and wastewater would join grassy biomass as the feedstock supplies for up to 10 digesters. The locations, size and number of facilities depended on whether the network was designed primarily to produce natural gas or power.
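As a hedged illustration of the kind of siting-and-sizing question such a network model answers (the site count, costs, capacities, and demand below are invented placeholders, not values from the study), one can pose a tiny linear program that meets an annual gas demand at minimum cost:

```python
from scipy.optimize import linprog

# Choose annual production (GJ) at three candidate digester sites to meet a
# city's gas demand at minimum cost. All numbers are placeholders.
cost_per_gj = [4.0, 5.5, 6.0]            # per-site production cost
max_capacity = [30000, 50000, 40000]     # per-site feedstock-limited capacity
annual_demand_gj = 70000

result = linprog(
    c=cost_per_gj,
    A_ub=[[-1, -1, -1]],                 # -(x1 + x2 + x3) <= -demand, i.e. total >= demand
    b_ub=[-annual_demand_gj],
    bounds=list(zip([0, 0, 0], max_capacity)),
    method="highs",
)
print(result.x, result.fun)              # chosen site outputs and total cost
```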

The analysis found renewable natural gas was the most economically practical focus, with a levelized cost roughly twice the historical average price of traditional natural gas. Incentives supporting clean energy production could provide a boost to make pricing competitive. Regardless, seeing how digester supply chains would work to serve municipal needs helps city leaders envision possibilities, Mba-Wright said.

“We wanted to consider the seasonality of the supply and demand over a year to give a mayor, for instance, scenarios to look at and strategize around,” he said.

Researchers have discussed anaerobic digestion with municipal wastewater officials in several cities in Iowa, and generally they’ve been curious, said Schulte Moore, co-director of the Bioeconomy Institute and a 2021 MacArthur Fellow.

“Their immediate need is to provide a service to their customers 24-7. But they work on 15- to 30-year planning horizons, so they’re also thinking about the future,” she said.

A grass-to-gas road map

A study published in Global Change Biology Bioenergy modeled the economic and environmental impact of two hypothetical digesters processing grassy biomass in the Grand River Basin in northwest Missouri and southwest Iowa.

Over their expected 20-year lifespan, the digesters would produce a combined profit of more than $400 million under the best conditions, based on the researchers’ analysis. The 45 million gigajoules of renewable natural gas created over two decades — equal to about 12.5 billion kilowatt hours — would have a carbon footprint 83% lower than that of natural gas derived from fossil fuels. Emissions are also projected to be lower than those from corn-based ethanol or soybean-based biodiesel.
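As a quick unit check on those figures (one gigajoule is about 277.8 kilowatt-hours):

```python
GJ_TO_KWH = 1e9 / 3.6e6       # joules per gigajoule / joules per kilowatt-hour
energy_gj = 45e6              # 45 million gigajoules over two decades
print(energy_gj * GJ_TO_KWH)  # about 1.25e10, i.e. roughly 12.5 billion kWh
```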

Most existing anaerobic digesters that produce renewable natural gas have run on dairy manure, so it’s essential to pencil out how they would perform on a grass diet, Mba-Wright said.

“This is dotting our ‘i’s and crossing our ‘t’s to confirm the benefits are what we’d expect. We’re providing a road map to help build infrastructure, which will in turn reduce future costs,” he said.

The profitable scenarios examined in the study rely on existing carbon credit programs, including the California Low Carbon Fuel Standard and federal Renewable Fuel Standard. The most valuable outcomes also require high-yield grass and prairie restoration on some of the least-productive farmland.

Researchers aimed to be as realistic as possible in both studies, accounting for all known costs — including capital expenses. But they’ll be even more accurate in the coming years, as methods improve and new research results roll in, Schulte Moore said.

“In the future, we will refine our models by plugging in data our research teams have collected right here in Iowa,” she said.

Autumn date to fix blood transfusion services

The systems were affected by a hack on the NHS, which caused significant disruption.

Study finds targeting inflammation may not help reduce liver fibrosis in MAFLD

Researchers at UCLA Health have uncovered new information about the role inflammation plays in liver fibrosis, which is associated with metabolic-associated fatty liver disease (MAFLD), one of the most common diseases in the world, affecting up to 40 percent of U.S. adults. Inflammation in the liver has long been considered a prerequisite to developing liver fibrosis, the scarring and thickening of tissue that can impair the liver’s ability to function. This new research, however, suggests that reducing inflammation may not influence the extent of fibrosis.

“Liver fibrosis is the critical feature that creates chronic liver disease and liver cancer. If we can keep fibrosis in check then we can meaningfully impact liver disease,” said Tamer Sallam, MD, corresponding author of the study and vice chair and associate professor in the department of medicine at the David Geffen School of Medicine at UCLA.

“For decades we have believed that targeting inflammation is one of the most important ways to reduce MAFLD. But this new research indicates that inflammation, while still important, may not be the main driver of fibrosis.”

The study, published in the Journal of Clinical Investigation, looked specifically at a protein called lipopolysaccharide binding protein (LBP), which is involved in the body’s immune response, and how LBP functions in mice. Findings showed that mice without LBP in their liver cells had lower levels of liver inflammation and better liver function but no change in fibrosis.

In addition to mouse models, the researchers studied genetic analyses from large human datasets and human tissue samples from MAFLD patients at different stages of the disease to examine the consequences of loss of LBP function. Combined, the evidence showed that LBP does not alter scar tissue markers.

Sallam indicated a need to further explore how LBP influences inflammation and whether other factors can offer a more potent reduction in inflammation and have an impact on reducing fibrosis.

“Reducing scar burden is one of the holy grails in the treatment of advanced liver diseases,” Sallam said. “These results suggest that certain ways of targeting inflammation may not be a viable option and that more directed therapies against other pathways could help us better target fibrosis and improve outcomes for patients.”

New study disputes Hunga Tonga volcano’s role in 2023-24 global warm-up

New research from a collaborative team featuring Texas A&M University atmospheric scientist Dr. Andrew Dessler is exploring the climate impact of the 2022 Hunga Tonga volcano eruption and challenging existing assumptions about its effects in the process.

The remarkable two-day event, which occurred in mid-January 2022, injected vast amounts of volcanic aerosols and water vapor into the atmosphere. Historically, large volcanic eruptions like Tambora in 1815 and Mt. Pinatubo in 1991 have led to significant cooling effects on the global climate by blocking sunlight with their aerosols. However, Hunga Tonga’s eruption presented a unique scenario: As a submarine volcano, it introduced an unprecedented amount of water vapor into the stratosphere, increasing total stratospheric water content by about 10%.

Because water vapor is a powerful greenhouse gas, Dessler says there was initial speculation that it might account for the extreme global warmth in 2023 and 2024. Instead, the results of the team’s research, published Wednesday (July 24) in the Journal of Geophysical Research: Atmospheres, reveal the opposite: The eruption actually contributed to cooling the Earth, similar to other major volcanic events.

A Volcanic Eruption’s Cooling Effect

The team’s paper, titled “Evolution of the Climate Forcing During the Two Years after the Hunga Tonga-Hunga Ha’apai Eruption,” includes insight and analysis from Dessler, a professor in the Texas A&M Department of Atmospheric Sciences and the director of the Texas Center for Climate Studies; first author Dr. Mark Schoeberl, chief scientist at the Science and Technology Corporation in Hampton, Virginia; and multiple scientists from the National Aeronautics and Space Administration (NASA).

Their methodology involved analyzing NASA and National Oceanic and Atmospheric Administration (NOAA) satellite data observations of aerosols and water vapor, among other variables, to estimate the energy balance of the Earth’s climate system. Their analysis revealed that the eruption resulted in more energy leaving the climate system than entering it, thereby inducing the slight cooling effect.
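Conceptually, the sign of the result comes down to whether the aerosols’ reflective (negative) forcing outweighs the added water vapor’s greenhouse (positive) forcing; the sketch below only illustrates that bookkeeping, with placeholder values rather than the study’s estimates.

```python
def net_forcing(aerosol_wm2, water_vapor_wm2):
    """Sum a negative aerosol term and a positive water-vapor term (W/m^2);
    a negative total means more energy leaves the climate system than enters."""
    return aerosol_wm2 + water_vapor_wm2

# Placeholder magnitudes for illustration only, not values from the paper.
net = net_forcing(aerosol_wm2=-0.5, water_vapor_wm2=0.1)
print(net, "W/m^2 ->", "cooling" if net < 0 else "warming")
```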

“Our paper pours cold water on the explanation that the eruption caused the extreme warmth of 2023 and 2024,” Dessler explained. “Instead, we need to focus primarily on greenhouse gases from human activities as the main cause of the warming, with a big assist from the ongoing El Niño.”

Implications And Future Research

According to Dessler, this research has important implications for both scientists and the general public. By dismissing the volcanic eruption as a major factor in the recent warming, the team’s study reinforces his point that human-induced greenhouse gas emissions are the primary driver of climate change. This focus is particularly relevant, given the ongoing debate and misinformation about the causes of global warming.

Moreover, Schoeberl says the study underscores the importance of continued investment in satellite-based stratospheric measurements.

“Our understanding of the Hunga Tonga eruption is largely thanks to the investment in stratospheric satellite measurements by NOAA and NASA over the past two decades,” Schoeberl added. “However, we need to be cautious about a potential ‘stratospheric data desert,’ as some of the most critical instruments are not being replaced.”

The Challenging Path Ahead

While this paper answers several important questions, Dessler acknowledges that it simultaneously introduces new ones. For instance, the researchers highlighted some unresolved issues related to the Hunga Tonga eruption, such as the unexpectedly low levels of sulfur dioxide produced by such a violent eruption and the minimal impact the eruption had on the 2023 ozone hole. The 2023 ozone hole refers to a significant thinning of the ozone layer over Antarctica, which allows more harmful UV radiation to reach the Earth’s surface. Additionally, the persistence of water vapor in the stratosphere beyond what was predicted by models suggests that there is still much to learn about stratospheric circulation processes.

As scientists work to resolve ongoing questions and deepen our understanding of the stratosphere, Schoeberl says the team’s work highlights the critical need for continued research and precise data to tackle the challenges of climate change.

Climate is most important factor in where mammals choose to live, study finds

While human activity has had a massive effect on the natural world, a new study from North Carolina State University finds that climate is still the most influential factor in determining where mammals can thrive. The work sheds light on how climate change will affect wildlife populations.

Roland Kays, lead author of a paper on the work, said the study’s goal was to compare the importance of climate versus human factors in where mammals chose to live. To do so, researchers collected data on 25 mammal species from 6,645 locations across the United States. The study is one of the largest camera trap data analyses ever done. The data came largely from Snapshot USA, which is a national mammal camera trap survey conducted with collaborators across the country.

“One of our ideas was that humans may have changed our landscape so much that we have become the primary determinants of which animals live where,” said Kays, who is a research professor at NC State and scientist at the N.C. Museum of Natural Sciences. “What we found was that in fact humans were not the most important. Climate, including temperature and the amount of rainfall, was the most important factor across most of the species we observed.”
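A generic way to run that kind of comparison (not the study’s actual analysis; the covariate names, the synthetic data, and the random-forest model are assumptions for illustration) is to fit a model on site-level detections and compare grouped permutation importances for climate versus human covariates:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

# Hypothetical site-level table: detection rate for one species plus climate
# and human-footprint covariates. Data are synthetic, for illustration only.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "temperature": rng.normal(12, 6, n),
    "rainfall": rng.normal(900, 250, n),
    "human_density": rng.lognormal(3, 1, n),
    "agriculture_frac": rng.uniform(0, 1, n),
})
df["detection_rate"] = (0.0004 * df["rainfall"] - 0.02 * df["temperature"]
                        - 0.001 * df["human_density"] + rng.normal(0, 0.5, n))

X, y = df.drop(columns="detection_rate"), df["detection_rate"]
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
imp = permutation_importance(model, X, y, n_repeats=10, random_state=0)

by_name = dict(zip(X.columns, imp.importances_mean))
print("climate importance:", sum(by_name[c] for c in ["temperature", "rainfall"]))
print("human importance:  ", sum(by_name[c] for c in ["human_density", "agriculture_frac"]))
```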

However, human activity in the form of large population centers and agriculture was still a significant factor in where mammals chose to live. Some species struggled in the presence of cities and farms, Kays said, but many thrived.

“There are a lot of species that do well when humans are around. The Eastern gray squirrel for instance is the most common squirrel in Raleigh, and it does great around people. But there’s another species called the Eastern fox squirrel, and that one does well around agriculture but not as well around people,” he said. “We can see those differences in many other species. The snowshoe hare does poorly around both people and around agriculture. This study allows us to see the species that are sensitive to our impacts, and which ones benefit.”

This information helped researchers create maps that predict how common various mammals are across the contiguous U.S., which allowed them to separate the country into regions based on what kinds of mammals were common in each. These regions, known as ecoregions, are commonly used when studying plants but have never before been applied to mammal populations.

“When you look at something like the Eastern deciduous forest, that is an ecoregion classified by how common a type of tree is,” Kays said. “We’re now able to do that with mammal species and then compare that to the plant ecoregions. What we found was a striking similarity between the two. For instance, in the east where there is more rainfall, you have more plants growing. That lined up with a greater abundance of mammals that we saw in that region as well, because more plants mean more food for those animals to eat.”
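A minimal sketch of that community-clustering step (the grid size, species count, and number of regions are made-up placeholders): group map cells by their predicted mammal community so that cells with similar communities form an ecoregion.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical grid: each row is a map cell, each column a species' predicted
# relative abundance. Values are random placeholders for illustration.
rng = np.random.default_rng(1)
predicted_abundance = rng.random((1000, 25))      # 1,000 cells x 25 species

# Cluster cells with similar predicted communities into "mammal ecoregions."
labels = KMeans(n_clusters=6, n_init=10, random_state=1).fit_predict(predicted_abundance)
print(np.bincount(labels))                        # cells assigned to each ecoregion
```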

The open access paper, “Climate, food and humans predict communities of mammals in the United States” is available to read in Diversity and Distributions. In identifying climate as the number one influence on mammal habitat choice, the study presents a new tool for predicting the impacts of climate change on mammal populations. Rising global temperatures will cause shifts in where animals are able to live, as well as influence precipitation levels and plant growth. Understanding these factors will be important to making sustainable decisions about mammal population management in the future.
