European bird declines linked to range of climatic conditions experienced

New research suggests conservation efforts could more effectively identify and protect bird species at greatest risk from climate change by better understanding the range of specific conditions they need to thrive.

The study, led by the University of East Anglia (UEA), examined the relationship between the range of climatic conditions that a species can tolerate and in which its populations can survive — known as its climatic niche breadth — and its likelihood of declining in response to climate change.

Among species occupying a similar area of geographic space, those able to tolerate a wider range of climatic conditions are less likely to experience population declines, and more likely to be increasing, than those with narrower climatic preferences.

The authors say their findings, published in the Journal of Biogeography, provide valuable insights into how climatic niche breadth can act as an important factor in predicting bird species’ vulnerability to climate change.

They suggest incorporating species’ climatic niches into climate change risk assessments to better inform conservation strategies, arguing that variation in climatic conditions within a species’ range offers greater nuance in understanding its resilience to factors that affect its population.

“Deciding which species are more at risk isn’t straightforward, and species may be declining for a range of reasons,” said lead author Karolina Zalewska, a postgraduate researcher in UEA’s School of Environmental Sciences.

“Rare species and those that have smaller distributions are more likely to be vulnerable to climate change. This study has shown that birds that are more widespread, such as the house sparrow and the common starling, can also face threats to their populations.

“Species, whether rare or widespread, with narrower climatic niches may be more susceptible to the rapid changes brought on by climate change compared to those with broader niches, and this may be one of the underlying reasons behind the population declines observed.

“Our results emphasise the importance of understanding and incorporating the level of exposure to climatic variability when assessing vulnerability to climate change and long-term population declines.”

Human-induced climate change has increasingly been identified as a major threat to global biodiversity. However, the extent of this threat is likely to be uneven across species, owing to differences in life histories and in exposure to environmental change; some climatic conditions, such as particular combinations of temperature and rainfall, are more widespread across geographic space than others.

Species with broad geographic distributions would be expected to experience a wider range of climatic conditions, and so to be more resilient to environmental change. Yet recent population declines of many widespread species suggest other factors may be involved, and the breadth of climatic conditions a species experiences may be an indicator of its resilience to climate change.

Co-author Prof Aldina Franco, also from the School of Environmental Sciences, said: “Faced with the challenges of the global biodiversity crisis and climate change, the rapid assessment of species vulnerability to environmental change has become of paramount importance to address priorities for conservation. As climate change accelerates, our study highlights the need to prioritize species that are most at risk due to their more restricted environmental requirements.”

The researchers drew on data for the population trends of 159 European breeding bird species across 29 European countries from the Pan-European Common Bird Monitoring Scheme. These 40-year population trends were related to the climate conditions species experience and their distribution area.

They used 30 years of climate data for the species’ breeding ranges to construct representations of their climatic niches and produced a new index of climatic niche breadth that accounts for species’ distribution area.
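The published index isn’t reproduced in this summary, but the idea can be sketched in a few lines: measure the spread of climate values across a species’ breeding range and discount it by range size. Everything below (the variable names, the standardization and the log-area normalization) is an illustrative assumption rather than the study’s actual formula.

```python
import numpy as np

def niche_breadth_index(range_climate, global_mean, global_std, area_km2):
    """Illustrative niche-breadth-to-range-area index (not the published one).

    range_climate : (n_cells, n_vars) climate values (e.g. temperature,
        rainfall) for the grid cells of one species' breeding range.
    global_mean, global_std : (n_vars,) continent-wide statistics, so that
        variables measured on different scales contribute comparably.
    area_km2 : total breeding-range area of the species.
    """
    z = (range_climate - global_mean) / global_std  # common scale for all variables
    breadth = z.std(axis=0).mean()                  # mean within-range climatic spread
    return breadth / np.log(area_km2)               # discount for range size

rng = np.random.default_rng(0)
clim = rng.normal([10.0, 600.0], [3.0, 150.0], size=(500, 2))  # temp (C), rain (mm)
idx = niche_breadth_index(clim, np.array([9.0, 550.0]),
                          np.array([5.0, 200.0]), area_km2=1.2e6)
```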

This analysis, along with additional factors such as the species’ diet, primary habitat type, migratory status, and average body mass, was then used to explore how these variables influence the long-term population trends.
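In spirit, that analysis amounts to regressing each species’ long-term trend on the niche index and its traits. A minimal sketch with ordinary least squares follows; the numbers and predictor set are invented for illustration and do not reflect the study’s actual model.

```python
import numpy as np

# Hypothetical species-level table: niche-breadth index, log10 body mass (g)
# and a 0/1 migratory flag; the response is the 40-year population trend
# (% change per year). All values are invented.
X = np.array([[0.82, 1.3, 1],
              [0.41, 1.9, 0],
              [0.65, 1.1, 1],
              [0.30, 2.4, 0]])
y = np.array([0.4, -1.2, 0.1, -2.0])

X1 = np.column_stack([np.ones(len(X)), X])     # add an intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)  # ordinary least squares fit
print(dict(zip(["intercept", "niche_index", "log_mass", "migrant"], coef)))
```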

“These findings can help us understand the threats associated with climate change and allow for rapid assessment of the importance of climatic factors on population trends, providing an invaluable tool for targeting species conservation,” added Miss Zalewska. “In particular, we show that the climatic niche breadth to range area index can help predict which species may be more vulnerable to population declines.”

Of the 159 species included in the study, 58 had a decreasing population trend, 68 were stable and 33 were increasing.

As in previous studies, the team found that species associated with farmland habitats, both in the UK and across the wider European area, such as the corn bunting or skylark, were more likely to be declining, while those able to tolerate human-modified environments, including blackbirds and blue tits, were more likely to show increasing population trends.

The work was supported by funding from the UK’s Natural Environment Research Council and the ARIES Doctoral Training Partnership.


‘Hidden galaxies’: Key to unlocking some of universe’s secrets

Astronomers have peered back in time to find what looks like a population of ‘hidden’ galaxies that could hold the key to unlocking some of the universe’s secrets.

If their existence is confirmed, it would “effectively break current models of galaxy numbers and evolution.”

The possible galaxies may also provide the missing piece of the puzzle in accounting for the universe’s energy output in infrared light.

That’s because their combined light would be enough to top up the energy budget of the universe to the maximum we observe, effectively accounting for all remaining energy emission at these long wavelengths.

Possible evidence of the galaxies’ existence was detected in the deepest-ever image of the universe at long far-infrared wavelengths, which features almost 2,000 distant galaxies and was created by a team of researchers led by STFC RAL Space and Imperial College London.

Dr Chris Pearson, from STFC RAL Space, is lead author on one of two papers published today in Monthly Notices of the Royal Astronomical Society.

He said: “This work has pushed the science with Herschel to its absolute limit, probing far below what we can normally discernibly see and potentially revealing a completely new population of galaxies that are contributing to the very faintest light we can observe in the universe.”

The team behind the research created their deep view of the universe by stacking 141 images on top of each other using data from the SPIRE instrument on the Herschel Space Observatory, a European Space Agency mission which ran from 2009 to 2013.

The resulting Herschel-SPIRE Dark Field is the deepest-ever image of the far-infrared sky — five times deeper than the previous single deepest Herschel observation and at least twice as deep as any other area of the sky observed by the telescope.
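Stacking gains depth because real sources add up coherently while uncorrelated instrument noise averages down roughly as the square root of the number of frames (about a factor of 12 for 141 images). The toy example below illustrates the principle on synthetic data; it is not Herschel’s actual pipeline, which must also align frames and handle correlated noise.

```python
import numpy as np

rng = np.random.default_rng(42)
truth = np.zeros((64, 64))
truth[20, 30] = 5.0                   # one faint point source

# 141 noisy observations of the same patch of sky
frames = [truth + rng.normal(0, 10, truth.shape) for _ in range(141)]
stacked = np.mean(frames, axis=0)     # co-add them

print("single-frame noise:", np.std(frames[0] - truth))   # ~10
print("stacked noise:     ", np.std(stacked - truth))     # ~10/sqrt(141), ~0.84
```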

Placing the images on top of each other allowed astronomers to see the dustiest galaxies, where most new stars are formed in the cosmos.

It also enabled them to track how the number of galaxies changes with brightness and to measure the contribution each one makes to the total energy budget of the universe.

However, the image was so deep and detected so many galaxies that the individual objects began to merge and become indistinguishable from each other.

This made extracting information challenging, according to Thomas Varnish, a PhD student at the Massachusetts Institute of Technology (MIT) and lead author on the second paper.

“We employed statistical techniques to get around this overcrowding, analysing the blurriest parts of the image to probe and model the underlying distribution of galaxies not individually discernible in the original image,” said Mr Varnish, who carried out most of his research as a summer intern at Imperial College London and RAL Space.

“What we found was possible evidence of a completely new, undiscovered population of faint galaxies hidden in the blur of the image, too faint to be detected by conventional methods in the original analysis.

“If confirmed, this new population would effectively break all of our current models of galaxy numbers and evolution.”
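The papers give the details, but one common way to mine a confusion-limited map is to compare the histogram of its pixel values with histograms of simulated maps populated from candidate galaxy counts (a “P(D)”-style analysis). The toy sketch below shows the simulation half of that loop; the power-law count, beam size and noise level are all assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)

def simulate_map(n_sources, flux_slope, shape=(512, 512), beam_sigma=3.0):
    """Toy confusion-limited sky: power-law fluxes at random positions,
    smoothed by a Gaussian 'beam', plus white instrument noise."""
    sky = np.zeros(shape)
    xs = rng.integers(0, shape[0], n_sources)
    ys = rng.integers(0, shape[1], n_sources)
    fluxes = rng.pareto(flux_slope, n_sources) + 1.0
    np.add.at(sky, (xs, ys), fluxes)    # drop each source into its pixel
    return gaussian_filter(sky, beam_sigma) + rng.normal(0, 0.05, shape)

# Vary the assumed source counts until the simulated pixel histogram matches
# the observed one; the best fit describes galaxies too faint to see singly.
model = simulate_map(n_sources=20_000, flux_slope=2.5)
hist, edges = np.histogram(model, bins=200)
```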

The researchers are now hoping to confirm the existence of the potential new group of galaxies using telescopes at other wavelengths.

Their aim is to decipher the nature of these faint, dusty objects and their importance in the grand scheme of the evolution of our universe.

Dr Pearson said: “When we look at starlight through normal telescopes, we are only able to read half of the story of our universe, the other half is hidden, obscured by the intervening dust.

“In fact, roughly half of the energy output of the universe is from starlight that has been absorbed by dust and re-emitted as cooler infrared radiation. To fully understand the evolution of our universe we need to observe the sky in both optical and longer wavelength infrared light.”

The Herschel Space Observatory was tasked with observing the universe in the infrared, with its SPIRE instrument covering the very longest wavelengths.

Like any scientific instrument in space, SPIRE required regular calibration observations, routinely staring at a single patch of ‘dark sky’ every month or so over the duration of its four-year mission.

Herschel held the record for the largest-ever infrared space telescope until it was eclipsed by the James Webb Space Telescope in 2021.

Imperial College London astrophysicist Dr David Clements, who was also involved in the research, added: “These results show just how valuable the Herschel archive is.

“We’re still getting great new results more than 10 years after the satellite stopped operating.

“What we can’t get, though, is more data at these wavelengths to follow up these fascinating new results. For that we need the next generation far-IR mission, PRIMA, currently being proposed to NASA.”

The Probe far-Infrared Mission for Astrophysics (PRIMA) is being supported by a UK consortium including RAL Space, the University of Sussex, Imperial College London and Cardiff University.

It would involve the use of a 1.8-metre telescope optimised for far-infrared imaging and spectroscopy, bridging the gap between existing observatories such as the James Webb Space Telescope and radio telescopes.

PRIMA is one of two proposals shortlisted for NASA’s next $1 billion (£772 million) probe mission. The US space agency will confirm its final mission selection in 2026.


Coral reefs exude myriad chemicals, fueling dynamic microbial recycling of nutrients

New research revealed the remarkable chemical diversity of substances exuded by coral reefs and demonstrated that thousands of different chemicals derived from tropical corals and seaweeds are available for microbes to decompose and utilize. The study, published recently in Environmental Microbiology by an international team led by Scripps Institution of Oceanography (SIO) and University of Hawai’i (UH) at Manoa scientists, provides crucial insights into the intricate relationships between coral reefs, marine microorganisms, and the carbon cycle.

In dynamic ecosystems, and especially in the nutrient-limited environments where coral reefs grow, not much will go to waste. Microbes dominate when it comes to decomposing, recycling, and transforming what other organisms discard.

“We’ve known that some of the substances exuded on coral reefs, termed exometabolites, are available for microbial metabolism,” said Craig Nelson, professor in the UH Manoa School of Ocean and Earth Science and Technology. “However, in this study, we discovered that the number and variety of exometabolites that microbes find useful is much higher than previously considered, and includes hundreds of compounds spanning most of the broad chemical classifications.”

“We were especially surprised to discover that exometabolites belonging to chemical families traditionally thought to be harder for microbes to break down, such as benzene rings, terpenoids, and steroids, were among those that are able to be utilized,” said Zachary Quinlan, lead author, postdoctoral researcher at the Hawai’i Institute of Marine Biology in SOEST, and former graduate student at SIO. “Our results paint a highly dynamic picture of ecosystem production of bioavailable substrates and their effects on microbial metabolism relevant to carbon cycling in coastal marine environments.”

Carbon cycle and reef resilience

Combined, all of the dissolved organic material in the ocean, including the chemicals exuded by coral reefs, contains an amount of carbon comparable to the amount of carbon dioxide in the atmosphere. So, the study authors point out, how microbes utilize this organic material has a major influence on the global carbon cycle.

When the types of organisms living on a reef shift, for instance from stony corals to fleshy seaweeds, the chemistry of the seawater also changes. In addition to their detailed study of what chemicals are being exuded on the reef, the research team also conducted experiments to determine whether microbes preferred to use substances from stony corals or from seaweed.

“We observed that coral and algae can selectively facilitate the growth of specific microbial communities by exuding distinct chemicals that can be used by specific types of microbes,” said Linda Wegley Kelly, senior author on the study and associate researcher at SIO. “Our results highlight how shifting from coral-dominated to algae-dominated reefs can alter reef ecosystem function and impact resilience of the system, potentially making it more susceptible to disease or bleaching.”

In the future, the team aims to continue discovering how chemical features can inform coral reef management and be used to advance coral restoration success.


Certain nasal bacteria may boost the risk for COVID-19 infection, study finds

A new study from researchers at the George Washington University has found that certain bacteria living in the nose may influence how likely someone is to get a COVID-19 infection. Published in EBioMedicine, the research reveals that certain types of nasal bacteria can affect the levels of key proteins the virus needs to enter human cells, offering new insight into why some people are more vulnerable to COVID-19 than others.

“We’ve known that the virus SARS-CoV-2 enters the body through the respiratory tract, with the nose being a key entry point. What’s new — and surprising — is that bacteria in our noses can influence the levels of proteins that the virus uses to infect cells,” said Cindy Liu, associate professor of environmental and occupational health at the GW Milken Institute School of Public Health.

Higher Gene Expression of Viral Entry Proteins Increases COVID-19 Infection Risk

In the study, Liu and her team analyzed nasal swab samples from over 450 people, including some who later tested positive for COVID-19. They found that those who became infected had higher levels of gene expression for two key proteins — ACE2 and TMPRSS2. ACE2 allows the virus to enter nasal cells, while TMPRSS2 helps activate the virus by cleaving its spike protein.

Those with high expression of these proteins were more than three times as likely to test positive for COVID-19, while those with moderate levels had double the risk. The study also found that people who became infected had more unstable levels of gene expression, with the sharpest increases coming just days before testing positive, suggesting that rising expression levels may signal increased vulnerability to the virus.
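For readers unfamiliar with odds ratios, multiples like these can be reproduced from a simple 2×2 table of exposure (high expression) against outcome (later infection). The counts below are invented for illustration; they are not the study’s data.

```python
def odds_ratio(exposed_pos, exposed_neg, unexposed_pos, unexposed_neg):
    """Odds ratio from a 2x2 table: (a/b) / (c/d)."""
    return (exposed_pos / exposed_neg) / (unexposed_pos / unexposed_neg)

# Hypothetical counts: 30 of 80 high-expression participants later tested
# positive, versus 20 of 120 in the low-expression group.
print(odds_ratio(30, 50, 20, 100))  # 3.0, i.e. three times the odds
```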

Notably, while women generally had higher gene expression levels of these proteins — consistent with previous studies showing higher COVID-19 infection rates in women — men with higher levels were more likely to get infected, indicating elevated protein levels may present a greater risk for men.

Nasal Bacteria May Play a Role in COVID-19 Risk

To understand what could impact the expression levels of these viral entry proteins, the researchers turned to the nasal microbiome — the diverse community of bacteria that naturally reside in the nose. They found that certain nasal bacteria may affect the expression levels of ACE2 and TMPRSS2, influencing the respiratory tract’s susceptibility to COVID-19.

The study identified three common nasal bacteria — Staphylococcus aureus, Haemophilus influenzae, and Moraxella catarrhalis/nonliquefaciens — that were linked to higher expression levels of ACE2 and TMPRSS2 and increased COVID-19 risk. On the other hand, Dolosigranulum pigrum, another common type of nasal bacteria, was connected to lower levels of these key proteins and may offer some protection against the virus.

“Some bacteria in your nose may be setting the stage — or even holding the door open — for viruses like SARS-CoV-2 to get in,” said Daniel Park, a senior research scientist at GW and the first author of the study.

While some of the high-risk bacteria were less common, 20% of participants carried enough S. aureus to nearly double their odds of elevated ACE2 and TMPRSS2 expression, making it a major nasal microbiome risk factor for COVID-19 infection.

Why This Matters

The findings offer new potential ways to predict and prevent COVID-19 infection. The study suggests that monitoring ACE2 and TMPRSS2 gene expression could help identify individuals at higher risk for infection. The research also highlights the potential of targeting the nasal microbiome to help prevent viral infections.

“We’re only beginning to understand the complex relationship between the nasal microbiome and our health,” said Liu. “This study suggests that the bacteria in our nose — and how they interact with the cells and immune system in our nasal cavity — could play an important role in determining our risk for respiratory infections like COVID-19.”

The team plans to explore whether modifying the nasal microbiome, such as through nasal sprays or live biotherapeutics, could reduce the risk of infection — potentially paving the way for new ways to prevent respiratory viral infections in future pandemics.

The study, “The Nasal Microbiome Modulates Risk for SARS-CoV-2 Infection,” was published April 9 in the journal EBioMedicine. The research was supported by the GW Milken Institute School of Public Health and by the National Institutes of Health.


AI models of the brain could serve as ‘digital twins’ in research

Much as a pilot might practice maneuvers in a flight simulator, scientists might soon be able to perform experiments on a realistic simulation of the mouse brain. In a new study, Stanford Medicine researchers and collaborators used an artificial intelligence model to build a “digital twin” of the part of the mouse brain that processes visual information.

The digital twin was trained on large datasets of brain activity collected from the visual cortex of real mice as they watched movie clips. It could then predict the response of tens of thousands of neurons to new videos and images.

Digital twins could make studying the inner workings of the brain easier and more efficient.

“If you build a model of the brain and it’s very accurate, that means you can do a lot more experiments,” said Andreas Tolias, PhD, Stanford Medicine professor of ophthalmology and senior author of the study published April 10 in Nature. “The ones that are the most promising you can then test in the real brain.”

The lead author of the study is Eric Wang, PhD, a medical student at Baylor College of Medicine.

Beyond the training distribution

Unlike previous AI models of the visual cortex, which could simulate the brain’s response to only the type of stimuli they saw in the training data, the new model can predict the brain’s response to a wide range of new visual input. It can even surmise anatomical features of each neuron.

The new model is an example of a foundation model, a relatively new class of AI models capable of learning from large datasets, then applying that knowledge to new tasks and new types of data — or what researchers call “generalizing outside the training distribution.”

(ChatGPT is a familiar example of a foundation model that can learn from vast amounts of text to then understand and generate new text.)

“In many ways, the seed of intelligence is the ability to generalize robustly,” Tolias said. “The ultimate goal — the holy grail — is to generalize to scenarios outside your training distribution.”

Mouse movies

To train the new AI model, the researchers first recorded the brain activity of real mice as they watched movies — made-for-people movies. The films ideally would approximate what the mice might see in natural settings.

“It’s very hard to sample a realistic movie for mice, because nobody makes Hollywood movies for mice,” Tolias said. But action movies came close enough.

Mice have low-resolution vision — similar to our peripheral vision — meaning they mainly see movement rather than details or color. “Mice like movement, which strongly activates their visual system, so we showed them movies that have a lot of action,” Tolias said.

Over many short viewing sessions, the researchers recorded more than 900 minutes of brain activity from eight mice watching clips of action-packed movies, such as Mad Max. Cameras monitored their eye movements and behavior.

The researchers used the aggregated data to train a core model, which could then be customized into a digital twin of any individual mouse with a bit of additional training.
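The study’s architecture isn’t spelled out in this summary, but the core-plus-twin idea can be sketched as one shared feature model plus a lightweight per-mouse readout fitted to that animal’s recorded neurons. In the sketch below the “core” is a frozen random projection and the readout is ridge regression; both are stand-in assumptions, not the published model.

```python
import numpy as np

rng = np.random.default_rng(0)
W_core = rng.normal(0, 0.01, (32 * 32, 128))   # stand-in for the trained core

def shared_core(frames):
    """Maps video frames to feature vectors; in the real study this is a
    deep network trained on data aggregated across all eight mice."""
    return frames.reshape(len(frames), -1) @ W_core

def fit_readout(features, responses, alpha=1.0):
    """Per-mouse 'digital twin' step: ridge regression from the shared
    features to one mouse's recorded neural responses."""
    n = features.shape[1]
    return np.linalg.solve(features.T @ features + alpha * np.eye(n),
                           features.T @ responses)

frames = rng.random((200, 32, 32))                     # 200 video frames
responses = rng.random((200, 5000))                    # activity of 5,000 neurons
W_mouse = fit_readout(shared_core(frames), responses)  # customize to one mouse
predicted = shared_core(frames) @ W_mouse              # twin's predicted activity
```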

Accurate predictions

These digital twins were able to closely simulate the neural activity of their biological counterparts in response to a variety of new visual stimuli, including videos and static images. The large quantity of aggregated training data was key to the digital twins’ success, Tolias said. “They were impressively accurate because they were trained on such large datasets.”

Though trained only on neural activity, the new models could generalize to other types of data.

The digital twin of one particular mouse was able to predict the anatomical locations and cell types of thousands of neurons in the visual cortex, as well as the connections between them.

The researchers verified these predictions against high-resolution electron microscope imaging of that mouse’s visual cortex, which was part of a larger project to map the structure and function of the mouse visual cortex in unprecedented detail. The results of that project, known as MICrONS, were published simultaneously in Nature.

Opening the black box

Because a digital twin can function long past the lifespan of a mouse, scientists could perform a virtually unlimited number of experiments on essentially the same animal. Experiments that would take years could be completed in hours, and millions of experiments could run simultaneously, speeding up research into how the brain processes information and the principles of intelligence.

“We’re trying to open the black box, so to speak, to understand the brain at the level of individual neurons or populations of neurons and how they work together to encode information,” Tolias said.

In fact, the new models are already yielding new insights. In another related study, also simultaneously published in Nature, researchers used a digital twin to discover how neurons in the visual cortex choose other neurons with which to form connections.

Scientists had known that similar neurons tend to form connections, like people forming friendships. The digital twin revealed which similarities mattered the most. Neurons prefer to connect with neurons that respond to the same stimulus — the color blue, for example — over neurons that respond to the same area of visual space.

“It’s like someone selecting friends based on what they like and not where they are,” Tolias said. “We learned this more precise rule of how the brain is organized.”

The researchers plan to extend their modeling into other brain areas and to animals, including primates, with more advanced cognitive capabilities.

“Eventually, I believe it will be possible to build digital twins of at least parts of the human brain,” Tolias said. “This is just the tip of the iceberg.”

Researchers from the University of Göttingen and the Allen Institute for Brain Science contributed to the work.

The study received funding from the Intelligence Advanced Research Projects Activity, a National Science Foundation NeuroNex grant, the National Institute of Mental Health, the National Institute of Neurological Disorders and Stroke (grant U19MH114830), the National Eye Institute (grant R01 EY026927 and Core Grant for Vision Research T32-EY-002520-37), the European Research Council and the Deutsche Forschungsgemeinschaft.


Eight or more drinks per week linked to signs of injury in the brain

Heavy drinkers who have eight or more alcoholic drinks per week have an increased risk of brain lesions called hyaline arteriolosclerosis, signs of brain injury that are associated with memory and thinking problems, according to a study published on April 9, 2025, online in Neurology®, the medical journal of the American Academy of Neurology. The study does not prove that heavy drinking causes brain injury; it only shows an association.

Hyaline arteriolosclerosis is a condition that causes the small blood vessels to narrow, becoming thick and stiff. This makes it harder for blood to flow, which can damage the brain over time. It appears as lesions, areas of damaged tissue in the brain.

“Heavy alcohol consumption is a major global health concern linked to increased health problems and death,” said study author Alberto Fernando Oliveira Justo, PhD, of University of Sao Paulo Medical School in Brazil. “We looked at how alcohol affects the brain as people get older. Our research shows that heavy alcohol consumption is damaging to the brain, which can lead to memory and thinking problems.”

The study included 1,781 people who had an average age of 75 at death. All had brain autopsies.

Researchers examined brain tissue to look for signs of brain injury including tau tangles and hyaline arteriolosclerosis. They also measured brain weight and the height of each participant.

Family members answered questions about participants’ alcohol consumption.

Researchers then divided the participants into four groups: 965 people who never drank; 319 moderate drinkers who had seven or fewer drinks per week; 129 heavy drinkers who had eight or more drinks per week; and 368 former heavy drinkers. Researchers defined one drink as containing 14 grams of alcohol, which is about 350 milliliters (ml) of beer, 150 ml of wine or 45 ml of distilled spirits.
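Those volumes follow from the 14-gram definition once typical strengths are assumed; pure ethanol has a density of about 0.789 g/ml. A quick check, assuming 5% alcohol for beer, 12% for wine and 40% for spirits:

```python
ETHANOL_DENSITY = 0.789  # grams per milliliter of pure ethanol

def grams_of_alcohol(volume_ml, abv):
    """Grams of pure alcohol in a drink of a given volume and strength."""
    return volume_ml * abv * ETHANOL_DENSITY

for name, ml, abv in [("beer", 350, 0.05), ("wine", 150, 0.12),
                      ("spirits", 45, 0.40)]:
    print(f"{name}: {grams_of_alcohol(ml, abv):.1f} g")
# beer: 13.8 g, wine: 14.2 g, spirits: 14.2 g -- each roughly one 14 g drink
```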

Of those who never drank, 40% had vascular brain lesions. Of the moderate drinkers, 45% had vascular brain lesions. Of the heavy drinkers, 44% had vascular brain lesions. Of the former heavy drinkers, 50% had vascular brain lesions.

After adjusting for factors that could affect brain health such as age at death, smoking and physical activity, heavy drinkers had 133% higher odds of having vascular brain lesions compared to those who never drank, former heavy drinkers had 89% higher odds and moderate drinkers, 60%.

Researchers also found heavy and former heavy drinkers had higher odds of developing tau tangles, a biomarker associated with Alzheimer’s disease, with 41% and 31% higher odds, respectively.

Former heavy drinking was associated with a lower brain mass ratio, a smaller proportion of brain mass compared to body mass, and worse cognitive abilities. No link was found between moderate or heavy drinking and brain mass ratio or cognitive abilities.

Justo noted that, in addition to brain injuries, impaired cognitive abilities were observed only in former drinkers.

Researchers also found that heavy drinkers died an average of 13 years earlier than those who never drank.

“We found heavy drinking is directly linked to signs of injury in the brain, and this can cause long-term effects on brain health, which may impact memory and thinking abilities,” said Justo. “Understanding these effects is crucial for public health awareness and continuing to implement preventive measures to reduce heavy drinking.”

A limitation of the study was that it did not assess participants before death, and it lacked information on the duration of alcohol consumption and on cognitive abilities earlier in life.

The study was supported by The São Paulo Research Foundation.


Saliva test may turn tide on prostate cancer, claim scientists

Analysing DNA in saliva can identify men at the greatest risk of prostate cancer


How much food can the world grow? International team calls for new yield potential estimates

An international team of agronomists is calling for a new approach to estimate crop yield potential and gaps — information that is critical in planning how to meet growing food demand.

University of Nebraska-Lincoln researchers made major contributions to the study, published online April 8 in the journal Nature Food.

“We are in a race to feed the world and to try to feed the population with the available agricultural land that we have,” said Patricio Grassini, Sunkist Distinguished Professor of Agronomy and one of the paper’s authors.

Doing so requires estimates of both yield potential, as determined by weather and soil properties, and yield gaps, the difference between yield potential and current farm yields, which indicates how much room exists to increase food production on existing cropland. Those estimates are essential in guiding public and private investment in agricultural research and development.
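The arithmetic itself is simple; the hard part the paper tackles is estimating the potential term. A one-line sketch with invented numbers:

```python
def yield_gap(yield_potential, actual_yield):
    """Room to grow on existing cropland: potential minus realized yield."""
    return yield_potential - actual_yield

# Illustrative (invented) numbers for a rainfed corn region, in tonnes/ha
potential, actual = 12.5, 9.0
gap = yield_gap(potential, actual)
print(f"gap = {gap:.1f} t/ha ({gap / potential:.0%} of potential)")  # 3.5 t/ha (28%)
```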

At issue is how best to compile those estimates.

In the Nature Food paper, a team that includes scientists from Nebraska and three other institutions calls into question the statistical methods now widely used. In addition to Grassini, Husker authors of the study included Fatima Tenorio, Fernando Aramburu Merlos and Juan Rattalino Edreira, research assistant professors of agronomy.

In the United States, for example, current statistical models tend to rely too heavily on best-case scenarios — the most productive counties with the most fertile soils in a year with the most favorable weather, Grassini said. The methods also extrapolate a single yield potential across large regions with a wide diversity of climates and soils that likely would produce a similarly wide range in yield potential.

“Therefore, if you use that year as a reference, you are going to be overestimating your production potential because the best county with the best soils in the best year doesn’t really represent your average climate or your most typical soil across the state,” Grassini said.

But in other parts of the world — Africa, for example — these models might underestimate crop yield. There, farmers may have limited access to inputs compared to producers in other areas, thus attaining yields far below what the climate can support.

This statistical approach also leads to conflicting results, with production potential estimates almost doubling from one method to another. Grassini said this approach — driven mostly by geographers and statisticians, not agronomists — has been largely accepted, and more rigorous analysis is needed.

The research team’s conclusions are explained in the paper, titled “Statistical approaches are inadequate for accurate estimation of yield potential and gaps at regional level.”

The study compared estimates of yield potential and yield gaps of major U.S. rainfed crops — corn, soybeans and wheat — derived from four statistical models against those derived from a “bottom-up” spatial scaling approach based on robust crop modeling and local weather and soil data, such as the Global Yield Gap and Water Productivity Atlas developed at Nebraska.

Process-based crop models used in this study have been rigorously validated for their capacity to estimate yield potential based on experimental data from well-managed crops grown across a wide range of environments. This bottom-up approach, which better incorporates long-term data and regional variations, is clearly superior, the team found.

“I expect some controversy,” Grassini said of the team’s conclusions challenging the conventional wisdom.

The approach recommended by the team should better capture yield gaps, which “can help identify regions with largest room to increase crop production, which, in turn provides a basis to orient agricultural research and development programs.”

“This is a call to set the record straight because if we are going to use this information to inform policy and our investments, we better make sure that the information is sound and has been validated,” Grassini said.

Additional team members included Romulo Lollato, associate professor of agronomy, Kansas State University; Sotirios Archontoulis, professor of agronomy, Iowa State University; and Antoine Couëdel, who completed his postdoctoral research at Nebraska and is a researcher at the French Agricultural Research Centre for International Development, France.


Trump threatens to end pharmaceuticals tariff exemption

The US president vows “major” tariffs on imported medicines – raising fears of an increase in costs for Americans.


Infected blood victims losing faith as inquiry hearings restart

Inquiry chair is acting amid grave concerns over payouts to victims after final report was published last year.
