‘Doctors told my mum poison symptoms were anxiety’

Ellena Baxter says watching her mother’s health deteriorate was devastating.

2024 sea level ‘report cards’ map futures of US coastal communities

William & Mary’s Batten School & VIMS have released their 2024 U.S. sea level “report cards,” providing updated analyses of sea level trends and projections for 36 coastal communities. Encompassing 55 years of historical data, the report cards aid planning and adaptation efforts by analyzing rates of sea level rise and acceleration at each locality and forecasting 2050 water levels.

This year, the report cards are consolidated in an interactive dashboard and add data from tide gauge stations in Annapolis, MD; Solomons Island, MD; Yorktown, VA; and Fort Myers, FL.

Most sea level projections are based on an understanding of average global sea level rise. However, sea levels do not rise uniformly across the world. Factors such as geological uplift, land subsidence, ocean currents and other processes all impact regional sea level trends.

“Many people who live near the coast want to know what they can reasonably expect over the next few decades, giving them time to make actionable plans and decisions,” says Molly Mitchell, an assistant professor at the Batten School of Coastal & Marine Sciences & VIMS. “Compared to other predictions based on satellite data and global computer models, our reports are created using observed tide gauge data from the past 55 years and reflect the exact experience at the location of the gauge. The annual release of the report cards allows coastal regions to examine if past trends are changing and alter their planning accordingly.”

The reports group localities into East Coast, Gulf Coast, West Coast and Alaskan Coast regions. Each report card shows values for monthly sea level averages along with high- and low-water levels caused by storms and other transient events, as well as a decadal signal showing the influence of longer-term climate patterns such as El Niño. Observed rates of acceleration are factored into future projections and are displayed in comparison to a linear trendline that does not account for acceleration.

The projections also show the range of sea level rise within the 95% confidence interval, which allows individuals and municipalities to plan adequately for the highest predicted rates of sea level rise caused by things like storm surge and tidal flooding.
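To illustrate the difference between a linear trend and one that accounts for acceleration, here is a minimal Python sketch. The station record, rate, acceleration and statistical choices below are illustrative assumptions, not the Batten School & VIMS methodology; a real report card would use observed tide gauge data and propagate uncertainty far more carefully.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly mean sea level anomalies (mm), roughly 1969-2024 (55 years).
# All parameters are hypothetical; a real station record would be used instead.
years = np.arange(1969, 2024, 1 / 12)
t = years - years[0]
true_rate, true_accel = 4.5, 0.08               # mm/yr and mm/yr^2 (made up)
sea_level = true_rate * t + 0.5 * true_accel * t ** 2 + rng.normal(0, 40, t.size)

# Linear trend (no acceleration) versus quadratic trend (captures acceleration).
lin_coef = np.polyfit(t, sea_level, 1)
quad_coef = np.polyfit(t, sea_level, 2)

t_2050 = 2050 - years[0]
lin_2050 = np.polyval(lin_coef, t_2050)
quad_2050 = np.polyval(quad_coef, t_2050)

# Crude 95% band from the residual scatter of the quadratic fit; this ignores
# parameter uncertainty, which a real projection would handle properly.
resid = sea_level - np.polyval(quad_coef, t)
band = 1.96 * resid.std()

print(f"Linear-only 2050 projection : {lin_2050:6.0f} mm")
print(f"Quadratic 2050 projection   : {quad_2050:6.0f} mm")
print(f"Approximate 95% range       : {quad_2050 - band:6.0f} to {quad_2050 + band:6.0f} mm")
```

Because the quadratic term compounds over time, even a small observed acceleration pulls the 2050 projection well above the linear extrapolation, which is why the report cards display both trends side by side.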

Overall, most locations continue a trend of accelerating sea level rise. However, Mitchell notes that projections have remained mostly uniform since reporting began in 2018, apart from a few notable exceptions.

“One interesting new trend is the acceleration occurring in southeastern states such as South Carolina and Georgia,” said Mitchell. “We continue to see the fastest rates of sea level rise in Gulf states like Texas and Louisiana, but many of the East Coast stations are accelerating quite quickly, likely due to patterns of water distribution related to glacial melt from the Greenland ice sheet.”

Mitchell also notes that most West Coast localities have been fairly stable, despite past predictions that they would increase rapidly. “This has led to some questions about why,” she said.

Information about the processes most affecting regional sea levels is listed on the Batten School & VIMS website: List of cities, states and processes.

Emeritus Professor John Boon launched the sea level report cards in 2018 following the publication of the study Anthropocene Sea Level Change: A History of Recent Trends Observed in the U.S. East, Gulf and West Coast Regions, which showed a notable increase in sea level acceleration rates beginning in 2013-2014.

Accelerating drug discovery with a single carbon atom

A research team from the University of Oklahoma has pioneered a groundbreaking method that could accelerate drug discovery and reduce pharmaceutical development costs. Their work, published in the Journal of the American Chemical Society, introduces a safe, sustainable way to insert a single carbon atom into drug molecules at room temperature. The inserted atoms carry versatile diversification handles for further modification, allowing researchers to enhance chemical diversity without compromising sensitive structures.

Nitrogen atoms and nitrogen-containing rings, known as heterocycles, play crucial roles in the development of medicines. A research team led by OU Presidential Professor Indrajeet Sharma has found a way to change these rings by adding just one carbon atom using a fast-reacting chemical called sulfenylcarbene. This method, called skeletal editing, transforms existing molecules into new drug candidates.

“By selectively adding one carbon atom to these existing drug heterocycles in the later stages of development, we can change the molecule’s biological and pharmacological properties without changing its functionalities,” he said. “This could open uncharted regions of chemical space in drug discovery.”

Previous studies have demonstrated a similar concept but relied on potentially explosive reagents, exhibited limited functional group compatibility, and posed significant safety concerns for industrial-scale applications.

Sharma’s team has developed a bench-stable reagent that generates sulfenylcarbenes under metal-free conditions at room temperature, achieving yields as high as 98%. Avoiding metal-based carbenes helps reduce environmental and health risks because many metals are known to have some level of human toxicity.

The researchers are also exploring how this chemistry could revolutionize a fast-growing area in pharmaceutical science known as DNA-encoded library (DEL) technology. DEL platforms allow researchers to rapidly screen billions of small molecules for their potential to bind to disease-relevant proteins.

The metal-free, room-temperature conditions of the team’s new carbon insertion strategy make it a compelling candidate for use in DNA-encoded libraries. Unlike other reactions that need harsh chemicals or high heat, this new method works in water-friendly liquids and is gentle enough to use with molecules attached to DNA.

By enabling precise skeletal editing, Sharma's approach, developed in collaboration with the Damian Young group at Baylor College of Medicine, could significantly enhance the chemical diversity and biological relevance of DELs, addressing two key bottlenecks in drug discovery.

“The cost of many drugs depends on the number of steps involved in making them, and drug companies are interested in finding ways to reduce these steps. Adding a carbon atom in the late stages of development can make new drugs cheaper. It’s like renovating a building rather than building it from scratch,” Sharma said. “By making these drugs easier to produce at large scale, we could reduce the cost of healthcare for populations around the world.”

Breakthrough in fuel cell recycling turns ‘forever chemicals’ into renewable resources

A new technique that uses soundwaves to separate materials for recycling could help prevent potentially harmful chemicals leaching into the environment.

Researchers at the University of Leicester have achieved a major milestone in fuel cell recycling, advancing techniques to efficiently separate valuable catalyst materials and fluorinated polymer membranes (PFAS) from catalyst-coated membranes (CCMs).

This development addresses critical environmental challenges posed by PFAS — often referred to as ‘forever chemicals’ — which are known to contaminate drinking water and have serious health implications. The Royal Society of Chemistry has urged government intervention to reduce PFAS levels in UK water supplies.

Fuel cells and water electrolysers, essential components of hydrogen-powered energy systems that power cars, trains and buses, depend on CCMs containing precious platinum group metals. However, the strong adhesion between catalyst layers and PFAS membranes has made recycling difficult. Researchers at Leicester have developed a scalable method using organic solvent soaking and water ultrasonication to effectively separate these materials, revolutionising the recycling process.

Dr Jake Yang from the University of Leicester School of Chemistry said: “This method is simple and scalable. We can now separate PFAS membranes from precious metals without harsh chemicals — revolutionising how we recycle fuel cells. Fuel cells have been heralded for a long time as the breakthrough technology for clean energy, but the high cost of platinum group metals has been seen as a limitation. A circular economy in these metals will bring this breakthrough technology one step closer to reality.”

Building on this success, a follow-up study introduced a continuous delamination process that uses a bespoke blade sonotrode to apply high-frequency ultrasound, splitting the membranes and accelerating recycling. The process creates bubbles that collapse under high pressure, meaning the precious catalysts can be separated in seconds at room temperature. The innovative process is both sustainable and economically viable, paving the way for widespread adoption.

This groundbreaking research was carried out in collaboration with Johnson Matthey, a global leader in sustainable technologies. Industry-academia partnerships such as this underscore the importance of collective efforts in driving technological progress.

Ross Gordon, Principal Research Scientist at Johnson Matthey, said: “The development of high-intensity ultrasound to separate catalyst-loaded membranes is a game-changer in how we approach fuel cell recycling. At Johnson Matthey, we are proud to collaborate on pioneering solutions that accelerate the adoption of hydrogen-powered energy while making it more sustainable and economically viable.”

As fuel cell demand continues to grow, this breakthrough contributes to the circular economy by enabling efficient recycling of essential clean energy components. The researchers’ efforts support a greener and more affordable future for fuel cell technology while addressing pressing environmental challenges.

‘The NHS at its worst’, ex-ombudsman tells inquiry

Sir Rob Behrens says it was a “disgrace” how mental health services failed two vulnerable men.

Plan to modernise 1,000 GP practice buildings

The Department of Health and Social Care said it is the biggest public investment in facilities in England in five years.

Saving the Asian ‘unicorn’ — if it still exists

Is it extinct, or does it still roam somewhere deep in the misty highland forests of Vietnam and Laos? It has been nicknamed the Asian unicorn due to its almost mythical rarity, and it is the most recently discovered large land mammal, becoming known to science as late as 1992. Even then, it was already endangered. Today, even the most optimistic estimates say fewer than 100 saola individuals (Pseudoryx nghetinhensis) remain, but the species could also be extinct by now. The last confirmed sighting in the wild was in 2013.

Researchers have been searching for it ever since, but so far without success. The task is made even more difficult by the fact that the saola lives only in the remote, rugged forests of the Annamite Mountains in Vietnam and Laos.

“Right now, the existence of live saolas can neither be proven nor disproven. The last evidence we have was from 2013, when one was captured on a camera trap. But given the remoteness of its habitat, it is extremely difficult to say for sure whether there are still a few out there. There are some signs and indications that still give us hope,” says Nguyen Quoc Dung from the Forest Inventory and Planning Institute in Vietnam.

He is one of the authors of a new international study, in which researchers from Denmark, Vietnam and many other countries have mapped the saola’s genome for the first time ever. Up until now, almost no genetic data on the saola have been generated. The study is published in the scientific journal Cell.

By analyzing fragments from saola remains collected from hunters’ households, the researchers generated complete genomes for 26 saolas. This has provided brand new insights into the history of the enigmatic bovine — and its future prospects.

How It Might Survive

“We were quite surprised to find that the saola is split into two populations with considerable genetic differences. The split happened between 5,000 and 20,000 years ago. That was completely unknown before, and there was also no way we could have known without genetic data. It is an important result because it affects how the genetic variation in the species is distributed,” says lead author Genís Garcia Erill, a former PhD student at the Department of Biology.

The genetic analyses also show that both populations have been in decline since the last Ice Age. According to the researchers’ estimates, the total saola population never exceeded 5,000 individuals in the last 10,000 years. And this long-term decline means that both populations began losing genetic diversity. But crucially, they did not lose the same genetic diversity.

“This means that the genetic variation lost in each population complements the other. So, if you mix them, they could compensate for what the other is missing,” says Genís Garcia Erill.

And that could potentially be the solution to saving the saola from extinction. The researchers have calculated the probability of the species surviving under various conservation scenarios. Their models show that the best survival chances occur if the two populations are mixed in a captive breeding program.

“If we can bring together at least a dozen saolas — ideally a mix from both populations — to form the foundation of a future population, our models show the species would have a decent chance of long-term survival. But it hinges on actually locating some individuals and starting a breeding program. That has worked before when species were on the brink of extinction,” says Rasmus Heller, senior author of the study and Associate Professor from the Department of Biology at UCPH.
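The study's viability models are of course far more detailed, but a toy simulation can illustrate why mixing founders from both populations helps: loci that have drifted to fixation in one population may still be variable in the other, so a mixed founder group retains more of the species' total variation. Everything below (the locus count, allele frequencies and founder numbers) is hypothetical, a sketch of the idea rather than the study's actual model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_loci = 1_000   # hypothetical number of variable sites tracked

def drifted_frequencies(fixed_fraction=0.4):
    """Allele frequencies after drift: some loci fixed (0 or 1), the rest still variable."""
    fixed = rng.random(n_loci) < fixed_fraction
    freqs = rng.uniform(0.2, 0.8, n_loci)
    freqs[fixed] = rng.choice([0.0, 1.0], fixed.sum())
    return freqs

# Each population has independently lost variation at a different set of loci.
p_north = drifted_frequencies()
p_south = drifted_frequencies()

def fraction_polymorphic(copy_counts, total_copies):
    """Fraction of loci where the founders carry both alleles (variation is retained)."""
    return np.mean((copy_counts > 0) & (copy_counts < total_copies))

n_founders = 12                                   # 12 founders = 24 gene copies per locus
one_pop = rng.binomial(2 * n_founders, p_north)   # all 12 founders from one population
mixed = rng.binomial(n_founders, p_north) + rng.binomial(n_founders, p_south)  # 6 + 6 founders

print(f"Variation retained, 12 founders from one population : {fraction_polymorphic(one_pop, 2 * n_founders):.2f}")
print(f"Variation retained, 6 + 6 founders from both        : {fraction_polymorphic(mixed, 2 * n_founders):.2f}")
```

In this toy setup the mixed founder group keeps noticeably more polymorphic loci than a group drawn from a single population, which mirrors the intuition behind mixing the two populations in a captive breeding program.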

But Does It Even Still Exist?

Finding 12 saolas, however, is no simple task. But the new research might help solve that problem. The genetic mapping opens up new possibilities for using various technologies to locate the last remaining saolas.

“Many researchers have unsuccessfully tried to find traces of saola through methods like environmental DNA in water and even in leeches, the blood suckers inhabiting the same habitat. These techniques all rely on detecting tiny DNA fragments, and now that we know the complete saola genome, we have a much larger toolkit for detecting those fragments,” says Minh Duc Le, co-author on the study from Vietnam National University.

But even if it turns out the saola is extinct, the new research findings might still be useful: “Our results could in theory be used if we were ever to succeed in bringing the saola back through genetic de-extinction technologies, which are a hot topic right now. In that case, our new insights into saola genetic variation could make a huge difference in creating a viable population,” says Rasmus Heller.

Still, he has his doubts about the chances of finding living saolas.

“Scientists have been searching for saolas since the 1990s, and it’s only gotten harder since then, because there were more of them back then. I’m not overly optimistic, I have to admit — but I really hope the saola is still out there,” Rasmus Heller concludes.

ABOUT THE SAOLA

  • The saola (Pseudoryx nghetinhensis) was discovered by science in 1992, making it the most recently discovered large mammal. The second-most recent was the kouprey, discovered in 1937.
  • Danish and Vietnamese biologists have been working together on studying the secretive saola right from the very beginning — starting with the scientific description of the saola in the early 1990s.
  • According to the IUCN, fewer than 100 individuals likely remain, making the saola one of the most endangered mammals in the world.
  • The saola is evolutionarily unique — it sits on a 12-15 million-year-old branch of the tree of life and is the only surviving descendant on that branch.

ABOUT THE STUDY

  • An international team of researchers from many countries and institutions contributed to the study, which is published in the journal Cell [DOI: 10.1016/j.cell.2025.03.040]
  • The study was supported by the Vietnamese Ministry of Science and Technology, the European Research Council, the Carlsberg Foundation and Independent Research Fund Denmark, among other sources.

Are at-home water tests worth it? New study shows quality can vary widely

For the cautious — or simply curious — homeowner, an at-home water testing kit may seem reassuring. But there are high levels of variability between test kits’ abilities to detect potential contaminants in water, a new study from the University of Massachusetts Amherst has found.

“People might be concerned about their drinking water, whether they’ve heard things in the news, or they notice it tastes different, or the color is different,” says Emily Kumpel, associate professor of civil and environmental engineering at UMass Amherst and senior study author on the new paper.

While water quality reports are widely available from utilities, they only pertain to people on city water, not well water. Also, sometimes the water source isn’t the problem. “Some of these issues, like brown water, can come from home plumbing, and that’s something that the utility doesn’t always know,” says Kumpel. “A test to understand more about your home plumbing can be very helpful.”

However, finding a kit that actually works is easier said than done. The researchers found that there are hundreds of kits on the market — the availability of which can change daily — and it’s an unregulated field.

The researchers selected eight kits that evaluated levels of iron, copper, manganese and fluoride. Overall, they found high variability between the kits — some worked well, while others didn’t.

Despite the mixed results, there were some general takeaways from the research. First, the type of kit matters, and there are essentially two types available: one that measures a particular element, and one that supposedly can measure a dozen parameters at one time, Kumpel explains.

Generally, single-parameter tests gave more consistently accurate results than multi-parameter ones when compared to laboratory measurements. None of the multi-parameter tests detected low levels of iron, while three of the four single-parameter tests did (though the results often over- or underestimated the amount of iron present).

Many tests advertise that they can detect high levels of iron (20-100 mg/L). While the multi-parameter tests performed better when measuring high levels of iron than low levels, most tests still underestimated the actual concentration present. Kumpel advises kit users to interpret the results with a healthy dose of skepticism, especially when testing for concerningly high levels of metals in their water. “[These tests] might be a good first cut on things, but it’s not necessarily telling you all the information you need,” she says.
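As a concrete illustration of the kind of comparison the researchers made, the short Python sketch below checks kit readings for iron against laboratory reference values for the same samples and reports the percentage error. The sample names and numbers are invented for illustration, not data from the study.

```python
# Hypothetical example, not data from the UMass Amherst study: comparing at-home
# kit readings for iron against laboratory reference values for the same samples.
lab_iron_mg_per_L = {"sample_A": 0.3, "sample_B": 2.0, "sample_C": 25.0}
kit_iron_mg_per_L = {"sample_A": 0.0, "sample_B": 1.0, "sample_C": 12.0}

for sample, lab_value in lab_iron_mg_per_L.items():
    kit_value = kit_iron_mg_per_L[sample]
    error_pct = 100 * (kit_value - lab_value) / lab_value
    verdict = "underestimates" if error_pct < 0 else "overestimates" if error_pct > 0 else "matches"
    print(f"{sample}: lab {lab_value} mg/L, kit {kit_value} mg/L "
          f"({error_pct:+.0f}%, kit {verdict} the lab value)")
```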

Kumpel says that single tests often have a preprocessing step for water samples that improves result accuracy. When testing for iron, changing the pH of the water makes it easier for the metal to be detected by kits.

The study also found that testing instructions and result interpretation guidance were inconsistent. For instance, for iron, one kit informed users that 0-0.3 parts per million (ppm) was “OK” and 0.5-5 ppm was “high,” while other test kits said that 0 ppm was “ideal.”

“It really points to the fact that this is an unregulated space,” says Kumpel. “This shouldn’t just be on the homeowner. These tests should be better checked for how well they actually perform, particularly under real-world conditions. A lot of them perform perfectly fine in deionized water, but not so much once you get into real water, [as] normal levels of background minerals or organics can interfere with your testing.”

One piece of advice Kumpel offers that you likely won’t see in kit instructions is to think about when you test. Metals like copper or lead (not assessed in this study) likely come from a home’s pipes, not the upstream distribution system. “You want to do what’s called a first draw sample, where you’re getting the very first water that comes out of your system that’s been sitting there overnight.” If metal is leaching from the pipes into the water, this first draw will have the highest concentration, giving you the best opportunity to detect it.

“If you want to see what’s going on directly from your well or directly from the distribution system, then you want to do what’s called a flush, which is, you let the water run for a few minutes before collecting your water sample.”

For consumers who want something more reliable than an at-home kit, Kumpel says that state departments of environmental protection or public health websites list locally or nationally certified labs.

Those who should consider testing their water include well owners, especially after flooding; people living in older houses whose plumbing hasn’t been updated or replaced in the last two decades; and anyone whose home has been through a disaster such as a wildfire or flood.

“There’s widespread mistrust in tap water across the U.S.,” says Kumpel. “Having access to be able to test your own water and confirm that it is okay — which is the most common result that people would get by testing their water — is a really good thing. I think this could be a positive tool if we can get these to work [reliably] and get people to really understand more about their water.”

Targeting gluten: researchers delete proteins in wheat harmful to people with celiac disease

Wheat is a major source of calories, carbohydrates and protein worldwide, and its distinctive gluten proteins are what give bread and pasta dough texture and elasticity. But gluten can also cause autoimmune reactions such as celiac disease, which is growing in prevalence worldwide.

Researchers at the University of California, Davis, have deleted a cluster of genes in wheat that generates gluten proteins capable of triggering immune reactions, without harming the breadmaking quality of this globally important crop.

The findings, published this month in the journal Theoretical and Applied Genetics, won’t produce a celiac-safe form of wheat but represent a critical step forward in celiac disease research, said Maria Rottersman, a lead author on the paper and a doctoral student in plant biology working in the lab of wheat geneticist Jorge Dubcovsky.

“The gluten proteins we eliminated are the ones that trigger the strongest response in people with celiac disease, and their elimination can reduce the risk of triggering the disease in people without celiac disease,” Dubcovsky said.

Gluten is composed of two classes of proteins — glutenins and gliadins — and deleting them all would lower the quality of bread. The research team used gamma radiation to target and delete alpha-gliadins, which can cause severe reactions in people with celiac disease.

“Wheat is a staple crop, and many people are reliant on it for calories,” Rottersman said. “It becomes a barrier when people are not able to safely eat wheat. Alpha-gliadins are definitely candidates for removal in terms of trying to create a less allergenic wheat.”

On the market

The team produced seeds from these edited varieties and tested the quality of the wheat and dough at the California Wheat Commission quality lab. Once the value of these breeding lines was established, they were deposited in the Germplasm Resources Information Network, or GRIN, operated by the Agricultural Research Service in the U.S. Department of Agriculture to make them widely available.

“The exciting thing that we found is that the quality of the flour produced by this wheat is actually, in some cases, improved,” Rottersman said. “Growers can not only grow it but can expect to have a higher quality product, which I think is a huge incentive for folks to widely adopt this variety. They can be planted in the same way that normal wheat is planted.”

Artisanal bakers, millers and farm-to-fork operations have expressed interest in the new varieties. The seeds are planted like any other crop and don’t require special handling. The varieties are conventionally bred and suitable for California, Rottersman said.

“It was previously assumed that the elimination of gliadins would have a negative effect on breadmaking quality,” Dubcovsky said. “Our study shows that this is not always the case and that we can reduce wheat allergenicity and improve quality at the same time.”

German Burguener, Joshua Hegarty, Junli Zhang, Wenjun Zhang and Xiaoqin Zhang in the Department of Plant Sciences contributed to the research, as did scientists from the UC Davis Proteomics Core Facility, Howard Hughes Medical Institute, California Wheat Commission and the USDA’s Agricultural Research Service.

Funding for the research came from the Celiac Disease Foundation, USDA’s National Institute of Food and Agriculture, Howard Hughes Medical Institute and the Foundation for Food and Agriculture Research.

Doing nothing on social care ‘untenable’, MPs warn

The report says failure to fix England’s social care system carries an unknown human and financial cost.
