When allocating scarce resources with AI, randomization can improve fairness

Organizations increasingly use machine-learning models to allocate scarce resources or opportunities. For instance, such models can help companies screen resumes to choose job interview candidates or aid hospitals in ranking kidney transplant patients based on their likelihood of survival.

When deploying a model, users typically strive to ensure its predictions are fair by reducing bias. This often involves techniques like adjusting the features a model uses to make decisions or calibrating the scores it generates.

However, researchers from MIT and Northeastern University argue that these fairness methods are not sufficient to address structural injustices and inherent uncertainties. In a new paper, they show how randomizing a model’s decisions in a structured way can improve fairness in certain situations.

For example, if multiple companies use the same machine-learning model to rank job interview candidates deterministically — without any randomization — then one deserving individual could be the bottom-ranked candidate for every job, perhaps due to how the model weighs answers provided in an online form. Introducing randomization into a model’s decisions could prevent one worthy person or group from always being denied a scarce resource, like a job interview.

Through their analysis, the researchers found that randomization can be especially beneficial when a model’s decisions involve uncertainty or when the same group consistently receives negative decisions.

They present a framework one could use to introduce a specific amount of randomization into a model’s decisions by allocating resources through a weighted lottery. This method, which an individual can tailor to fit their situation, can improve fairness without hurting the efficiency or accuracy of a model.
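
To make the framework concrete, here is a minimal sketch of a weighted lottery in Python. The candidates, the claim scores, and the two-slot allocation are hypothetical, and the paper's actual method calibrates the weights far more carefully; this only illustrates how lottery odds can track claim strength instead of a hard top-k cutoff.

```python
import random

def weighted_lottery(scores, k, seed=None):
    """Fill k slots by lottery, with each candidate's odds of selection
    proportional to their claim score rather than a hard top-k cutoff."""
    rng = random.Random(seed)
    pool = list(scores)
    chosen = []
    for _ in range(k):
        weights = [scores[c] for c in pool]
        pick = rng.choices(pool, weights=weights, k=1)[0]
        chosen.append(pick)
        pool.remove(pick)  # winners cannot be drawn twice
    return chosen

# Hypothetical claim scores from a screening model.
scores = {"ana": 0.92, "ben": 0.88, "cam": 0.55, "dev": 0.40}

# A deterministic top-2 cutoff would exclude cam and dev every single time;
# the lottery still favors strong claims but leaves no one permanently shut out.
print(weighted_lottery(scores, k=2, seed=7))
```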

“Even if you could make fair predictions, should you be deciding these social allocations of scarce resources or opportunities strictly off scores or rankings? As things scale, and we see more and more opportunities being decided by these algorithms, the inherent uncertainties in these scores can be amplified. We show that fairness may require some sort of randomization,” says Shomik Jain, a graduate student in the Institute for Data, Systems, and Society (IDSS) and lead author of the paper.

Jain is joined on the paper by Kathleen Creel, assistant professor of philosophy and computer science at Northeastern University; and senior author Ashia Wilson, the Lister Brothers Career Development Professor in the Department of Electrical Engineering and Computer Science and a principal investigator in the Laboratory for Information and Decision Systems (LIDS). The research will be presented at the International Conference on Machine Learning.

Considering claims

This work builds on a previous paper in which the researchers explored the harms that can occur when deterministic systems are used at scale. They found that using a machine-learning model to deterministically allocate resources can amplify inequalities that exist in the training data, reinforcing bias and systemic inequality.

“Randomization is a very useful concept in statistics, and to our delight, satisfies the fairness demands coming from both a systemic and individual point of view,” Wilson says.

In this paper, they explored the question of when randomization can improve fairness. They framed their analysis around the ideas of philosopher John Broome, who wrote about the value of using lotteries to award scarce resources in a way that honors all claims of individuals.

A person’s claim to a scarce resource, like a kidney transplant, can stem from merit, deservingness, or need. For instance, everyone has a right to life, and their claims on a kidney transplant may stem from that right, Wilson explains.

“When you acknowledge that people have different claims to these scarce resources, fairness is going to require that we respect all claims of individuals. If we always give someone with a stronger claim the resource, is that fair?” Jain says.

That sort of deterministic allocation could cause systemic exclusion or exacerbate patterned inequality, which occurs when receiving one allocation increases an individual’s likelihood of receiving future allocations. In addition, machine-learning models can make mistakes, and a deterministic approach could cause the same mistake to be repeated.

Randomization can overcome these problems, but that doesn’t mean all decisions a model makes should be randomized equally.

Structured randomization

The researchers use a weighted lottery to adjust the level of randomization based on the amount of uncertainty involved in the model’s decision-making. A decision that is less certain should incorporate more randomization.

“In kidney allocation, usually the planning is around projected lifespan, and that is deeply uncertain. If two patients are only five years apart, it becomes a lot harder to measure. We want to leverage that level of uncertainty to tailor the randomization,” Wilson says.

The researchers used statistical uncertainty quantification methods to determine how much randomization is needed in different situations. They show that calibrated randomization can lead to fairer outcomes for individuals without significantly affecting the utility, or effectiveness, of the model.
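
One hedged way to implement "less certainty, more randomization" is to temper the lottery weights with the model's uncertainty, as in the sketch below. The softmax-temperature trick is an illustration of the idea, not the calibration method used in the paper.

```python
import numpy as np

def lottery_weights(scores, uncertainty):
    """Convert model scores into lottery weights whose randomness grows with
    uncertainty: low uncertainty gives a near-deterministic ranking, while
    high uncertainty flattens the weights toward a uniform lottery."""
    temperature = max(uncertainty, 1e-6)   # avoid division by zero
    z = np.asarray(scores, dtype=float) / temperature
    z -= z.max()                           # for numerical stability
    w = np.exp(z)
    return w / w.sum()

scores = [0.92, 0.88, 0.55, 0.40]
print(lottery_weights(scores, uncertainty=0.05))  # strongly favors top scores
print(lottery_weights(scores, uncertainty=5.0))   # close to uniform
```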

“There is a balance to be had between overall utility and respecting the rights of the individuals who are receiving a scarce resource, but oftentimes the tradeoff is relatively small,” says Wilson.

However, the researchers emphasize there are situations where randomizing decisions would not improve fairness and could harm individuals, such as in criminal justice contexts.

But there could be other areas where randomization can improve fairness, such as college admissions, and the researchers plan to study other use-cases in future work. They also want to explore how randomization can affect other factors, such as competition or prices, and how it could be used to improve the robustness of machine-learning models.

“We are hoping our paper is a first move toward illustrating that there might be a benefit to randomization. We are offering randomization as a tool. How much you are going to want to do it is going to be up to all the stakeholders in the allocation to decide. And, of course, how they decide is another research question altogether,” says Wilson.

New additive process can make better — and greener — high-value chemicals

Researchers at the Center for Advanced Bioenergy and Bioproducts Innovation (CABBI) have achieved a significant breakthrough that could lead to better — and greener — agricultural chemicals and everyday products.

Using a process that combines natural enzymes and light, the team from the University of Illinois Urbana-Champaign developed an eco-friendly way to precisely mix fluorine, an important additive, into chemicals called olefins — hydrocarbons used in a vast array of products, from detergents to fuels to medicines. This groundbreaking method offers an efficient new strategy for creating high-value chemicals with potential applications in agrochemicals, pharmaceuticals, renewable fuels, and more.

The study, published in Science, was led by CABBI Conversion Theme Leader Huimin Zhao, Professor of Chemical and Biomolecular Engineering (ChBE), Biosystems Design Theme Leader at the Carl R. Woese Institute for Genomic Biology (IGB), and Director of the NSF Molecule Maker Lab Institute at Illinois; and lead author Maolin Li, a Postdoctoral Research Associate with CABBI, ChBE, and IGB.

As an additive, fluorine can make agrochemicals and medicines work better and last longer. Its small size, electronic properties, and ability to dissolve easily in fats and oils all have a profound impact on the function of organic molecules, augmenting their absorption, metabolic stability, and protein interactions. However, adding fluorine is tricky and usually requires complex chemical processes that are not always friendly to the environment.

The scientists in this study used a “photoenzyme” — a repurposed enzyme that works under light — to help bring fluorine into these chemicals. By using light and photoenzymes, they were able to precisely attach fluorine to olefins, controlling exactly where and how it is added. Because this method is not only environmentally friendly but also highly selective, it allows for more efficient creation of useful new compounds that were previously difficult to make.

This approach fills a large gap in molecular chemistry, as previous methods to add fluorine were limited and inefficient. It also opens up new possibilities for creating better medicines and agricultural products, as fluorinated compounds are often more effective, stable, and longer-lasting than their non-fluorinated counterparts. That means fertilizers and herbicides could be more effective in protecting crops, and medicines could be more potent or have fewer side effects.

“This breakthrough represents a significant shift in how we approach the synthesis of fluorinated compounds, crucial in numerous applications from medicine to agriculture,” Zhao said. “By harnessing the power of light-activated enzymes, we’ve developed a method that improves the efficiency of these syntheses and aligns with environmental sustainability. This work could pave the way for new, greener technologies in chemical production, which is a win not just for science, but for society at large.”

The research advances CABBI’s bioenergy mission by pioneering innovative methods in biocatalysis that can enhance the production of bio-based chemicals — those derived from renewable resources such as plants or microorganisms rather than petroleum. The development of more efficient and environmentally friendly biochemical processes aligns with CABBI’s focus on creating sustainable bioenergy solutions that minimize environmental impact and reduce reliance on fossil fuels.

It also contributes to the broader U.S. Department of Energy (DOE) mission of driving advances in bioenergy and bioproducts. The methods developed in this study can lead to more sustainable industrial processes that are less energy-intensive and reduce chemical waste and pollution, supporting DOE’s goals of fostering clean energy technologies. The ability to efficiently create high-value fluorinated compounds could lead to enhancements in various fields, including renewable energy sources and bioproducts that support economic growth and environmental sustainability.

“Our research opens up fascinating possibilities for the future of pharmaceutical and agrochemical development,” Li said. “By integrating fluorine into organic molecules through a photoenzymatic process, we are not only enhancing the beneficial properties of these compounds but also doing so in a manner that’s more environmentally responsible. It’s thrilling to think about the potential applications of our work in creating more effective and sustainable products for everyday use.”

CABBI researchers Yujie Yuan, Wesley Harrison, and Zhengyi Zhang of ChBE and IGB at Illinois were co-authors on this study.

‘Dancing molecules’ heal cartilage damage

In November 2021, Northwestern University researchers introduced an injectable new therapy, which harnessed fast-moving “dancing molecules,” to repair tissues and reverse paralysis after severe spinal cord injuries.

Now, the same research group has applied the therapeutic strategy to damaged human cartilage cells. In the new study, the treatment activated the gene expression necessary to regenerate cartilage within just four hours. And, after only three days, the human cells produced protein components needed for cartilage regeneration.

The researchers also found that the treatment became more effective as the molecular motion increased. In other words, the molecules’ “dancing” motions were crucial for triggering the cartilage growth process.

The study was published today (July 26) in the Journal of the American Chemical Society.

“When we first observed therapeutic effects of dancing molecules, we did not see any reason why it should only apply to the spinal cord,” said Northwestern’s Samuel I. Stupp, who led the study. “Now, we observe the effects in two cell types that are completely disconnected from one another — cartilage cells in our joints and neurons in our brain and spinal cord. This makes me more confident that we might have discovered a universal phenomenon. It could apply to many other tissues.”

An expert in regenerative nanomedicine, Stupp is Board of Trustees Professor of Materials Science and Engineering, Chemistry, Medicine and Biomedical Engineering at Northwestern, where he is founding director of the Simpson Querrey Institute for BioNanotechnology and its affiliated center, the Center for Regenerative Nanomedicine. Stupp has appointments in the McCormick School of Engineering, Weinberg College of Arts and Sciences and Feinberg School of Medicine. Shelby Yuan, a graduate student in the Stupp laboratory, was primary author of the study.

Big problem, few solutions

As of 2019, nearly 530 million people around the globe were living with osteoarthritis, according to the World Health Organization. A degenerative disease in which the tissues in joints break down over time, osteoarthritis is a common health problem and a leading cause of disability.

In patients with severe osteoarthritis, cartilage can wear so thin that joints essentially transform into bone on bone — without a cushion between. Not only is this incredibly painful, but patients’ joints also can no longer function properly. At that point, the only effective treatment is joint replacement surgery, which is expensive and invasive.

“Current treatments aim to slow disease progression or postpone inevitable joint replacement,” Stupp said. “There are no regenerative options because humans do not have an inherent capacity to regenerate cartilage in adulthood.”

What are ‘dancing molecules’?

Stupp and his team posited that “dancing molecules” might encourage the stubborn tissue to regenerate. Previously invented in Stupp’s laboratory, dancing molecules are assemblies that form synthetic nanofibers comprising tens to hundreds of thousands of molecules with potent signals for cells. By tuning their collective motions through their chemical structure, Stupp discovered the moving molecules could rapidly find and properly engage with cellular receptors, which also are in constant motion and extremely crowded on cell membranes.

Once inside the body, the nanofibers mimic the extracellular matrix of the surrounding tissue. By matching the matrix’s structure, mimicking the motion of biological molecules and incorporating bioactive signals for the receptors, the synthetic materials are able to communicate with cells.

“Cellular receptors constantly move around,” Stupp said. “By making our molecules move, ‘dance’ or even leap temporarily out of these structures, known as supramolecular polymers, they are able to connect more effectively with receptors.”

Motion matters

In the new study, Stupp and his team looked to the receptors for a specific protein critical for cartilage formation and maintenance. To target this receptor, the team developed a new circular peptide that mimics the bioactive signal of the protein, which is called transforming growth factor beta-1 (TGFb-1).

Then, the researchers incorporated this peptide into two different molecules that interact to form supramolecular polymers in water, each with the same ability to mimic TGFb-1. The researchers designed one supramolecular polymer with a special structure that enabled its molecules to move more freely within the large assemblies. The other supramolecular polymer, however, restricted molecular movement.

“We wanted to modify the structure in order to compare two systems that differ in the extent of their motion,” Stupp said. “The intensity of supramolecular motion in one is much greater than the motion in the other one.”

Although both polymers mimicked the signal to activate the TGFb-1 receptor, the polymer with rapidly moving molecules was much more effective. In some ways, it was even more effective than the protein that activates the same receptor in nature.

“After three days, the human cells exposed to the long assemblies of more mobile molecules produced greater amounts of the protein components necessary for cartilage regeneration,” Stupp said. “For the production of one of the components in cartilage matrix, known as collagen II, the dancing molecules containing the cyclic peptide that activates the TGF-beta1 receptor were even more effective than the natural protein that has this function in biological systems.”

What’s next?

Stupp’s team is currently testing these systems in animal studies and incorporating additional signals to create highly bioactive therapies.

“With the success of the study in human cartilage cells, we predict that cartilage regeneration will be greatly enhanced when used in highly translational pre-clinical models,” Stupp said. “It should develop into a novel bioactive material for regeneration of cartilage tissue in joints.”

Stupp’s lab is also testing the ability of dancing molecules to regenerate bone — and already has promising early results, which likely will be published later this year. Simultaneously, he is testing the molecules in human organoids to accelerate the process of discovering and optimizing therapeutic materials.

Stupp’s team also continues to build its case to the Food and Drug Administration, aiming to gain approval for clinical trials to test the therapy for spinal cord repair.

“We are beginning to see the tremendous breadth of conditions that this fundamental discovery on ‘dancing molecules’ could apply to,” Stupp said. “Controlling supramolecular motion through chemical design appears to be a powerful tool to increase efficacy for a range of regenerative therapies.”

Two shark species documented in Puget Sound for first time

Oregon State University researchers have made the first scientific confirmation in Puget Sound of two distinct shark species, one of them critically endangered.

The presence of the broadnose sevengill shark and endangered soupfin shark in the sound, the southern portion of the Salish Sea, may indicate changes in what biologists in OSU’s Big Fish Lab describe as an economically, culturally and ecologically valuable inland waterway.

The Salish Sea separates northwest Washington from British Columbia’s Vancouver Island. The 6,500-square-mile body of water stretches into Washington as Puget Sound, and the sharks were caught close to Olympia near the sound’s southernmost point.

Taylor Chapple, an assistant professor in Oregon State’s College of Agricultural Sciences, and graduate students Jessica Schulte and Ethan Personius report the broadnose sevengill and soupfin findings in two papers published in Frontiers in Marine Science.

The authors collaborated with partners at NOAA’s National Marine Fisheries Service and the Washington Department of Fish and Wildlife to confirm that the broadnose sevengill, an apex predator that can grow to nearly 10 feet, is now inhabiting heavily urbanized South Puget Sound.

“Understanding the sevengill presence in this new habitat is crucial for understanding the food webs of the Salish Sea, and it highlights the need for continued monitoring and research — including their relationship with other species of conservation concern, such as salmon,” said Schulte, the lead author on the sevengill paper.

Broadnose sevengill sharks — so named because they have two more gill slits than most shark species — eat a wide variety of prey: fishes (including rays and other sharks), crustaceans and marine mammals. They live in temperate waters worldwide, and off the west coast of North America they range from southern Alaska to Baja California.

Prior to 2021, only one sevengill shark had ever been confirmed in the Salish Sea, at Point Roberts, Washington, near the Canadian border. In August 2021, however, anecdotal reports indicated several of them had been caught in South Puget Sound.

During 10 days of field work in 2022 and 2023, the scientists caught nine sevengills, more than 190 miles away from their previously documented range. Eight of them were males — the largest measured just under 7 feet — and the female was about 4 feet, 6 inches.

“Our continued research on this species in Oregon and Washington waters will allow us to have a better handle on its role in our valuable marine ecosystems,” Schulte said.

The same holds for the soupfin shark, said Personius, the lead author on that paper. The largest species of hound shark, it can reach 6 1/2 feet and got its name from its use as the key ingredient in shark fin soup.

“Soupfin sharks were relentlessly exploited during the 1930s and 1940s, including for their livers, which are rich in vitamin A,” Personius said. “Despite lower fishing pressure, the species has not been able to recover and is currently under consideration for federal protection under the Endangered Species Act.”

Like the broadnose sevengill shark, the soupfin shark is found in temperate waters around the globe and is a top predator in any ecosystem it inhabits, eating cephalopods as well as a variety of fishes. Soupfin sharks are known as strong swimmers whose migrations can exceed 1,000 miles.

In field work concurrent with the sevengill project, the scientists caught one soupfin shark, a male that measured just over 5 feet.

“The Salish Sea has experienced pervasive shifts in species abundance and composition along with industrialization and significant habitat degradation,” Personius said. “The appearance of soupfin sharks may be a result of climate change and changes in prey availability.”

Following the 2014-15 extreme marine heat wave event known as “The Blob,” he explained, anchovies emerged as a dominant forage fish species in the Salish Sea after having been historically uncommon there. Soupfin sharks are a known predator of anchovies.

Graduate student Maddie English is a co-author of the soupfin shark paper, along with scientists from the NOAA Marine Fisheries Service and the Washington Department of Fish and Wildlife. Research associate Alexandra McInturf contributed to the sevengill study.

AI method radically speeds predictions of materials’ thermal properties

It is estimated that about 70 percent of the energy generated worldwide ends up as waste heat.

If scientists could better predict how heat moves through semiconductors and insulators, they could design more efficient power generation systems. However, the thermal properties of materials can be exceedingly difficult to model.

The trouble comes from phonons, the quasiparticles that carry heat. Some of a material’s thermal properties depend on a quantity called the phonon dispersion relation, which can be incredibly hard to obtain, let alone put to use in the design of a system.

A team of researchers from MIT and elsewhere tackled this challenge by rethinking the problem from the ground up. The result of their work is a new machine-learning framework that can predict phonon dispersion relations up to 1,000 times faster than other AI-based techniques, with comparable or even better accuracy. Compared to more traditional, non-AI-based approaches, it could be 1 million times faster.

This method could help engineers design energy generation systems that produce more power, more efficiently. It could also be used to develop more efficient microelectronics, since managing heat remains a major bottleneck to speeding up electronics.

“Phonons are the culprit for the thermal loss, yet obtaining their properties is notoriously challenging, either computationally or experimentally,” says Mingda Li, associate professor of nuclear science and engineering and senior author of a paper on this technique.

Li is joined on the paper by co-lead authors Ryotaro Okabe, a chemistry graduate student; and Abhijatmedhi Chotrattanapituk, an electrical engineering and computer science graduate student; Tommi Jaakkola, the Thomas Siebel Professor of Electrical Engineering and Computer Science at MIT; as well as others at MIT, Argonne National Laboratory, Harvard University, the University of South Carolina, Emory University, the University of California at Santa Barbara, and Oak Ridge National Laboratory. The research appears in Nature Computational Science.

Predicting phonons

Heat-carrying phonons are tricky to predict because they have an extremely wide frequency range, and the particles interact and travel at different speeds.

A material’s phonon dispersion relation is the relationship between energy and momentum of phonons in its crystal structure. For years, researchers have tried to predict phonon dispersion relations using machine learning, but there are so many high-precision calculations involved that models get bogged down.
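
For reference, and independent of this paper, the dispersion relation in standard lattice dynamics comes from diagonalizing the dynamical matrix built out of interatomic force constants, which is what makes it so expensive: every entry requires high-precision force calculations.

```latex
% Phonon frequencies \omega(\mathbf{k}) are obtained from the dynamical
% matrix D(\mathbf{k}), assembled from interatomic force constants \Phi:
\[
D_{\alpha\beta}^{\kappa\kappa'}(\mathbf{k})
  = \frac{1}{\sqrt{m_\kappa m_{\kappa'}}}
    \sum_{l} \Phi_{\alpha\beta}(0\kappa;\, l\kappa')\,
    e^{\, i\,\mathbf{k}\cdot\mathbf{R}_l},
\qquad
\det\!\bigl[\, D(\mathbf{k}) - \omega^2(\mathbf{k})\, I \,\bigr] = 0,
\]
% where \kappa, \kappa' index atoms in the unit cell, m are atomic masses,
% and \mathbf{R}_l are lattice vectors.
```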

“If you have 100 CPUs and a few weeks, you could probably calculate the phonon dispersion relation for one material. The whole community really wants a more efficient way to do this,” says Okabe.

The machine-learning models scientists often use for these calculations are known as graph neural networks (GNNs). A GNN converts a material’s atomic structure into a crystal graph comprising multiple nodes, which represent atoms, connected by edges, which represent the bonds between them.

While GNNs work well for calculating many quantities, like magnetization or electrical polarization, they are not flexible enough to efficiently predict an extremely high-dimensional quantity like the phonon dispersion relation. Because phonons can travel around atoms on X, Y, and Z axes, their momentum space is hard to model with a fixed graph structure.

To gain the flexibility they needed, Li and his collaborators devised virtual nodes.

They create what they call a virtual node graph neural network (VGNN) by adding a series of flexible virtual nodes to the fixed crystal structure to represent phonons. The virtual nodes enable the output of the neural network to vary in size, so it is not restricted by the fixed crystal structure.

Virtual nodes are connected to the graph in such a way that they can only receive messages from real nodes. The virtual nodes are updated as the model updates the real nodes during computation, but because no messages flow back to the real nodes, they do not affect the model’s accuracy.

“The way we do this is very efficient in coding. You just generate a few more nodes in your GNN. The physical location doesn’t matter, and the real nodes don’t even know the virtual nodes are there,” says Chotrattanapituk.
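
The numpy sketch below illustrates the one-way message flow described above. The layer sizes, the single round of updates, and the mean-aggregation readout are all invented for illustration; this is not the authors' architecture, only the structural idea that virtual nodes listen to real nodes without talking back.

```python
import numpy as np

rng = np.random.default_rng(0)

n_real, n_virtual, d = 5, 3, 8  # atoms, phonon-band nodes, feature width
real = rng.normal(size=(n_real, d))        # real (atom) node features
virtual = rng.normal(size=(n_virtual, d))  # virtual node embeddings (learned in practice)
adj = (rng.random((n_real, n_real)) < 0.4).astype(float)  # crystal-graph edges
np.fill_diagonal(adj, 0)

W_real = rng.normal(size=(d, d)) / np.sqrt(d)
W_virt = rng.normal(size=(d, d)) / np.sqrt(d)
w_out = rng.normal(size=d) / np.sqrt(d)

# One round of message passing. Real nodes exchange messages only among
# themselves over the crystal-graph edges ...
real = np.tanh(adj @ real @ W_real + real)

# ... while each virtual node aggregates from all real nodes. No messages
# flow back, so the virtual nodes never perturb the atoms' representations.
virtual = np.tanh(real.mean(axis=0, keepdims=True) @ W_virt + virtual)

# Per-virtual-node readout: one scalar per virtual node, so the output size
# tracks the number of virtual nodes rather than the fixed crystal structure.
print(virtual @ w_out)
```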

Cutting out complexity

Since it has virtual nodes to represent phonons, the VGNN can skip many complex calculations when estimating phonon dispersion relations, which makes the method more efficient than a standard GNN.

The researchers proposed three different versions of VGNNs with increasing complexity. Each can be used to predict phonons directly from a material’s atomic coordinates.

Because their approach has the flexibility to rapidly model high-dimensional properties, they can use it to estimate phonon dispersion relations in alloy systems. These complex combinations of metals and nonmetals are especially challenging for traditional approaches to model.

The researchers also found that VGNNs offered slightly greater accuracy when predicting a material’s heat capacity. In some instances, prediction errors were two orders of magnitude lower with their technique.

A VGNN could be used to calculate phonon dispersion relations for a few thousand materials in just a few seconds with a personal computer, Li says.

This efficiency could enable scientists to search a larger space when seeking materials with certain thermal properties, such as superior thermal storage, energy conversion, or superconductivity.

Moreover, the virtual node technique is not exclusive to phonons and could also be used to predict challenging optical and magnetic properties.

In the future, the researchers want to refine the technique so virtual nodes have greater sensitivity to capture small changes that can affect phonon structure.

“Researchers got too comfortable using graph nodes to represent atoms, but we can rethink that. Graph nodes can be anything. And virtual nodes are a very generic approach you could use to predict a lot of high-dimensional quantities,” Li says.

This work is supported by the U.S. Department of Energy, National Science Foundation, a Mathworks Fellowship, a Sow-Hsin Chen Fellowship, the Harvard Quantum Initiative, and the Oak Ridge National Laboratory.

Win-win potential of grass-powered energy production

Strategically planting perennial grass throughout corn and soybean fields helps address the unintended environmental consequences of growing the dominant row crops, including soil erosion, fertilizer runoff and greenhouse gas emissions.

But converting portions of farmland back to prairie has to make financial sense for farmers, which is why a research team led by Iowa State University landscape ecologist Lisa Schulte Moore has spent the past six years studying how to efficiently turn harvested grass into lucrative renewable natural gas.

“We’re looking at existing markets where there is already a demand, use existing infrastructure to reduce costs of the energy transition and create wins in multiple categories. We want wins for farmers, wins for businesses, wins for municipalities and wins for society,” said Schulte Moore, professor of natural resource ecology and management and director of the Consortium for Cultivating Human And Naturally reGenerative Enterprises (C-CHANGE). “We can have great conversations about what could be, but unless it benefits everyone along these supply chains, it won’t happen.”

A pair of recently published peer-reviewed articles by Schulte Moore’s research group modeled the economic feasibility of grass-to-gas production in different settings and from varying perspectives, an analysis that helps flesh out the system’s win-win potential.

“To replace natural gas with resources that revitalize sustainable agriculture, we have to be able to quantify how much energy we can produce and show it can be cost effective and environmentally friendly,” said associate professor of mechanical engineering Mark Mba-Wright, co-author of the studies.

City-based scenarios

The ongoing research is funded in part by a $10 million federal grant in 2020, another $10 million in federal support in 2022 and about $650,000 from the Walton Family Foundation. The work centers on optimizing and expanding the use of anaerobic digesters. Biogas is released in anaerobic digestion, the natural process of organic matter biodegrading without oxygen. Captured in tank-like digesters, biogas can be processed into a fuel that easily swaps in for petroleum-based natural gas. It also can power electrical generators and produce fertilizer.

In a study published in BioEnergy Research, the Iowa State researchers modeled how a network of digesters in and around Ames could supply the city’s heat and power demands. Livestock manure, biofuel byproducts, food waste and wastewater would join grassy biomass as the feedstock supplies for up to 10 digesters. The locations, size and number of facilities depended on whether the network was designed primarily to produce natural gas or power.

The analysis found renewable natural gas was the most economically practical focus, with a levelized cost roughly twice the historical average price of traditional natural gas. Incentives supporting clean energy production could provide a boost to make pricing competitive. Regardless, seeing how digester supply chains would work to serve municipal needs helps city leaders envision possibilities, Mba-Wright said.
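
"Levelized cost" here is the standard metric: total discounted lifetime costs divided by total discounted energy delivered. The symbols below are generic, not values from the study.

```latex
\[
\mathrm{LCOE}
  = \frac{\sum_{t=0}^{T} C_t \,/\, (1+r)^{t}}
         {\sum_{t=0}^{T} E_t \,/\, (1+r)^{t}}
\]
% C_t: costs in year t (capital, operations, feedstock)
% E_t: energy delivered in year t;  r: discount rate;  T: project lifetime
```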

“We wanted to consider the seasonality of the supply and demand over a year to give a mayor, for instance, scenarios to look at and strategize around,” he said.

Researchers have discussed anaerobic digestion with municipal wastewater officials in several cities in Iowa, and generally they’ve been curious, said Schulte Moore, co-director of the Bioeconomy Institute and a 2021 MacArthur Fellow.

“Their immediate need is to provide a service to their customers 24-7. But they work on 15- to 30-year planning horizons, so they’re also thinking about the future,” she said.

A grass-to-gas road map

A study published in Global Change Biology Bioenergy modeled the economic and environmental impact of two hypothetical digesters processing grassy biomass in the Grand River Basin in northwest Missouri and southwest Iowa.

Over their expected 20-year lifespan, the digesters would produce a combined profit of more than $400 million under the best conditions, based on the researchers’ analysis. The 45 million gigajoules of renewable natural gas created over two decades — equal to about 12.5 billion kilowatt hours — would have a carbon footprint 83% lower than natural gas derived from fossil fuels. Emissions are also projected to be lower than those from corn-based ethanol or soybean-based biodiesel.
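
The gas-to-electricity equivalence quoted above is a direct unit conversion (1 gigajoule is about 277.8 kilowatt hours):

```latex
\[
45\times10^{6}\ \mathrm{GJ}
  \times \frac{10^{9}\ \mathrm{J}}{\mathrm{GJ}}
  \times \frac{1\ \mathrm{kWh}}{3.6\times10^{6}\ \mathrm{J}}
  \approx 1.25\times10^{10}\ \mathrm{kWh}
  = 12.5\ \text{billion kWh}.
\]
```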

Most existing anaerobic digesters that produce renewable natural gas have run on dairy manure, so it’s essential to pencil out how they would perform on a grass diet, Mba-Wright said.

“This is dotting our ‘i’s and crossing our ‘t’s to confirm the benefits are what we’d expect. We’re providing a road map to help build infrastructure, which will in turn reduce future costs,” he said.

The profitable scenarios examined in the study rely on existing carbon credit programs, including the California Low Carbon Fuel Standard and federal Renewable Fuel Standard. The most valuable outcomes also require high-yield grass and prairie restoration on some of the least-productive farmland.

Researchers aimed to be as realistic as possible in both studies, accounting for all known costs — including capital expenses. But they’ll be even more accurate in the coming years, as methods improve and new research results roll in, Schulte Moore said.

“In the future, we will refine our models by plugging in data our research teams have collected right here in Iowa,” she said.
