Could vaccines end the winter vomiting bug?

The highly contagious virus can affect people of all ages and have serious consequences during winter.


How the coronavirus defeats the innate immune response

The novel coronavirus SARS-CoV-2 has an enzyme that can counteract a cell’s innate defense mechanism against viruses, which explains why it is more infectious than the viruses that caused the earlier SARS and MERS outbreaks. The Kobe University discovery may point the way to the development of more effective drugs against this and possibly similar future diseases.

When a virus attacks, the body’s immune response has two basic layers of defense: the innate and the adaptive immune systems. While the adaptive immune system grows stronger against a specific pathogen through repeated exposure (the principle behind vaccination), the innate immune system is an assortment of molecular mechanisms that work against a broad range of pathogens at a basic level. The Kobe University virologist SHOJI Ikuo says, “The new coronavirus, however, is so infectious that we wondered what clever mechanisms the virus employs to evade the innate immune system so effectively.”

Shoji’s team previously worked on the immune response to hepatitis viruses and investigated the role of a molecular tag called “ISG15” that the innate immune system attaches to the virus’s building blocks. Having learned that the novel coronavirus has an enzyme that is especially effective at removing this tag, he decided to use his team’s expertise to elucidate the effect of the ISG15 tag on the coronavirus and the mechanism of the virus’s countermeasures.

In a paper in the Journal of Virology, the Kobe University-led team is now the first to report that the ISG15 tag gets attached to a specific location on the virus’s nucleocapsid protein, the scaffold that packages the pathogen’s genetic material. For the virus to assemble, many copies of the nucleocapsid protein need to attach to each other, but the ISG15 tag prevents this, which is the mechanism behind the tag’s antiviral action. “However, the novel coronavirus also has an enzyme that can remove the tags from its nucleocapsid, recovering its ability to assemble new viruses and thus overcoming the innate immune response,” explains Shoji.

The novel coronavirus shares many traits with the SARS and MERS viruses, all of which belong to the same family of viruses, and those viruses, too, have an enzyme that can remove the ISG15 tag. However, Shoji’s team found that their versions are less efficient at it than the one in the novel coronavirus. In fact, it has recently been reported that the earlier viruses’ enzymes have a different primary target. “These results suggest that the novel coronavirus is simply better at evading this aspect of the innate immune system’s defense mechanism, which explains why it is so infectious,” says Shoji.

But understanding just why the novel coronavirus is so effective also points the way to developing more effective treatments. The Kobe University researcher explains: “We may be able to develop new antiviral drugs if we can inhibit the function of the viral enzyme that removes the ISG15 tag. Future therapeutic strategies may also include antiviral agents that directly target the nucleocapsid protein, or a combination of these two approaches.”

This research was funded by the Kansai Economic Federation, the Hyogo Science and Technology Association (grant 3501) and the Ministry of Education, Culture, Sports, Science and Technology Japan (grant 18042-203556). It was conducted in collaboration with researchers from Universitas Gadjah Mada, Niigata University, the University of Yamanashi, Hokkaido University and Osaka University.


Full-bodied cheese flavor quickly and efficiently

Peptides formed during cheese ripening are crucial for the full-bodied flavor of aged cheeses, known as kokumi. A research team led by the Leibniz-Institute for Food Systems Biology at the Technical University of Munich has now developed a new method to analyze these flavor-relevant peptides precisely, quickly, and efficiently. Based on more than 120 cheese samples, the team has also created a database that can be used in the future to predict flavor development during cheese ripening.

The term kokumi derives from Japanese and refers to a full-bodied and long-lasting taste experience. The taste impression is particularly pronounced in aged cheeses, mainly due to the increasing concentration of gamma-glutamyl dipeptides. These are small molecules that consist of a link between glutamic acid and another amino acid.

Depending on how the two amino acids are linked, researchers distinguish between gamma-, alpha-, and X-glutamyl dipeptides, with the latter two not contributing to the kokumi effect. The high polarity of the glutamyl dipeptides, together with the great structural similarity between variants that make different flavor contributions, represents a major challenge for food analysis.

Efficient analysis method developed

Nevertheless, the team led by principal investigator Andreas Dunkel of the Leibniz Institute has succeeded in developing a new efficient analysis method based on ultra-high performance liquid chromatography-mass spectrometry. For the first time, it can precisely and selectively determine the concentrations of all 56 gamma-glutamyl dipeptide variants in just 22 minutes. Optimized sample preparation makes it possible to analyze 60 cheese samples per day.
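
As a rough plausibility check on those throughput figures, the arithmetic below is illustrative only; round-the-clock automated operation is an assumption, not something the article states.

```python
# Quick throughput check (illustrative arithmetic, not from the paper):
# 22-minute runs, assuming continuous automated injections.
runs_per_day = 24 * 60 / 22
print(f"~{runs_per_day:.0f} runs per day")
# -> ~65 runs/day, so 60 cheese samples per day is feasible, with margin
#    left over for calibration standards and blanks.
```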

“This is a significant improvement compared to other methods. Our tests have shown that our method is faster, more efficient, and yet reliable — it delivers reproducible results and detects even the smallest concentrations,” says first author Sonja Maria Fröhlich, a doctoral student at the Leibniz Institute. To further investigate the influence of ripening time on gamma-glutamyl dipeptide concentrations, the researchers applied the method to 122 cheese samples from Europe and the USA after the test phase. The ripening times of the cheese ranged from two weeks to 15 years.

Mold cultures accelerate flavor development

The results show that, as expected, the concentrations of glutamyl dipeptides increase with increasing ripeness. “Interestingly, the addition of blue and white mold cultures led to significantly higher gamma-glutamyl dipeptide concentrations, even at shorter ripening times,” says Andreas Dunkel, who heads the Integrative Food Systems Analysis research group at the Leibniz Institute.

The food chemist adds: “The concentration profiles we have determined for different stages of ripening and different types of cheese can be used in the future as a database for prediction models. The latter could, for example, be used to objectively monitor flavor development during cheese ripening, to shorten ripening times, or to develop new plant-based cheese products with high consumer acceptance.”

“In the sense of an interdisciplinary food systems biology research approach, one of our goals is to combine analytical research results with bioinformatic methods to develop predictive models suited to supporting sustainable food production. This is also the starting point of the project led by Andreas Dunkel,” concludes Veronika Somoza, director of the Leibniz Institute.


Crucial role of cerebellum in social and cognitive functioning

“People with cerebellar abnormalities often experience motor issues,” Van Overwalle explains. “For example, they struggle to smoothly touch their nose with a finger. These difficulties highlight the cerebellum’s essential role in refining motor movements.”

However, Van Overwalle’s research extends beyond motor functions, exploring the cerebellum’s involvement in social and cognitive abilities. His findings reveal that abnormalities in the cerebellum not only lead to motor deficits but are also linked to emotional and behavioral disorders. He references research on individuals with autism, demonstrating how non-invasive brain stimulation techniques like magnetic stimulation can improve social task performance.

“We’ve seen improvements in the sequence of cognitive tasks in people with autism through magnetic stimulation,” says Van Overwalle. “We’re now testing more complex tasks to see if these effects can be further enhanced, with the ultimate goal of developing practical treatments for people with autism.”

A notable breakthrough is the use of transcranial electrical stimulation (tES), a more affordable and accessible technique compared to magnetic stimulation. While the effects of tES are still limited, the research group is committed to further development, seeing its potential for wide-scale application in the future.

This research offers a fresh perspective on the cerebellum’s role and paves the way for new treatments for psychiatric and neurological conditions, such as autism spectrum disorders. “Our hope is to refine these techniques further to improve social and cognitive functions in people with autism,” concludes Van Overwalle.


Infertility made me feel guilty, says TV newsreader

Andrea Byrne says she felt her husband would be “better off” without her during fertility treatment.


AI to help doctors spot broken bones on X-rays

It is safe and could speed up diagnosis and relieve NHS pressure, the health assessment body says.


Plant CO2 uptake rises by nearly one third in new global estimates

Plants the world over are absorbing about 31% more carbon dioxide than previously thought, according to a new assessment developed by scientists. The research, detailed in the journal Nature, is expected to improve Earth system simulations that scientists use to predict the future climate, and spotlights the importance of natural carbon sequestration for greenhouse gas mitigation.

The amount of CO2 removed from the atmosphere via photosynthesis by land plants is known as Terrestrial Gross Primary Production, or GPP. It represents the largest carbon exchange between land and atmosphere on the planet. GPP is typically cited in petagrams of carbon per year. One petagram equals 1 billion metric tons, roughly the amount of CO2 emitted each year by 238 million gas-powered passenger vehicles.
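
For readers who want to verify the vehicle comparison, here is a minimal sketch; the per-vehicle emission rate of roughly 4.2 metric tons of CO2 per year is inferred from the article’s own numbers, not taken from the study.

```python
# Back-of-envelope check of the unit conversion (illustrative only).
PETAGRAM_IN_METRIC_TONS = 1e9      # 1 Pg = 1 billion metric tons
CO2_TONS_PER_VEHICLE_YEAR = 4.2    # assumed average passenger vehicle

vehicles = PETAGRAM_IN_METRIC_TONS / CO2_TONS_PER_VEHICLE_YEAR
print(f"1 Pg of CO2 ~ annual emissions of {vehicles / 1e6:.0f} million vehicles")
# -> roughly 238 million gas-powered passenger vehicles
```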

A team of scientists led by Cornell University, with support from the Department of Energy’s Oak Ridge National Laboratory, used new models and measurements to assess GPP from the land at 157 petagrams of carbon per year, up from an estimate of 120 petagrams established 40 years ago and currently used in most estimates of Earth’s carbon cycle. The results are described in the paper, “Terrestrial Photosynthesis Inferred from Plant Carbonyl Sulfide Uptake.”

Researchers developed an integrated model that traces the movement of the chemical compound carbonyl sulfide, or OCS, from the air into leaf chloroplasts, the factories inside plant cells that carry out photosynthesis. The research team quantified photosynthetic activity by tracking OCS. The compound largely follows the same path through a leaf as CO2, is closely related to photosynthesis and is easier to track and measure than CO2 diffusion. For these reasons, OCS has been used as a photosynthesis proxy at the plant and leaf levels. This study showed that OCS is well suited to estimate photosynthesis at large scales and over long periods of time, making it a reliable indicator of worldwide GPP.
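
In broad strokes, OCS-based estimation rests on a simple proportionality known as the leaf relative uptake (LRU) approach, which relates plant OCS uptake to CO2 uptake. The sketch below is a simplified illustration with hypothetical values, not the study’s integrated model.

```python
# Minimal sketch of OCS as a photosynthesis proxy (simplified; the study's
# integrated model is far more detailed). GPP is inferred from plant OCS
# uptake via the leaf relative uptake (LRU) ratio:
#
#   GPP ~ F_OCS / (LRU * [OCS]/[CO2])
#
# All values below are hypothetical, chosen for typical magnitudes.
F_OCS = 30.0      # canopy OCS uptake, pmol m^-2 s^-1 (hypothetical)
LRU = 1.7         # leaf relative uptake ratio (typical literature value)
OCS_PPT = 500.0   # ambient OCS, parts per trillion
CO2_PPM = 420.0   # ambient CO2, parts per million

mixing_ratio = (OCS_PPT * 1e-12) / (CO2_PPM * 1e-6)  # [OCS]/[CO2]
gpp_pmol = F_OCS / (LRU * mixing_ratio)              # pmol CO2 m^-2 s^-1
print(f"Inferred GPP: {gpp_pmol / 1e6:.1f} umol CO2 m^-2 s^-1")
# -> ~14.8 umol CO2 m^-2 s^-1, a plausible canopy-scale rate
```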

The team used plant data from a variety of sources to inform model development. One of the sources was the LeafWeb database, established at ORNL in support of the DOE Terrestrial Ecosystem Science Scientific Focus Area, or TES-SFA. LeafWeb collects data about photosynthetic traits from scientists around the world to support carbon cycle modeling. The scientists verified the model results by comparing them with high-resolution data from environmental monitoring towers instead of satellite observations, which can be hindered by clouds, particularly in the tropics.

Key to the new estimate is better representation of a process called mesophyll diffusion — how OCS and CO2 move from leaves into chloroplasts where carbon fixation occurs. Understanding mesophyll diffusion is essential to figuring out how efficiently plants are conducting photosynthesis, and even how they might adapt to changing environments.

Lianhong Gu, co-author, photosynthesis expert and distinguished staff scientist in ORNL’s Environmental Sciences Division, helped develop the project’s mesophyll conductance model, which represents numerically the diffusion of OCS in leaves, as well as the linkage between OCS diffusion and photosynthesis.
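
The bookkeeping behind a conductance model of this kind can be sketched with Fick’s law. The following is a simplified illustration with typical-magnitude values, not the project’s actual formulation.

```python
# Fick's-law sketch of mesophyll conductance (simplified illustration).
# The flux into the chloroplast is proportional to the concentration
# drop across the mesophyll:
#
#   A = g_m * (C_i - C_c)   =>   C_c = C_i - A / g_m
#
# All numbers are hypothetical, typical-magnitude values.
A = 15.0     # net CO2 assimilation rate, umol m^-2 s^-1
g_m = 0.3    # mesophyll conductance to CO2, mol m^-2 s^-1 bar^-1
C_i = 280.0  # substomatal CO2 partial pressure, ubar

C_c = C_i - A / g_m  # CO2 partial pressure inside the chloroplast, ubar
print(f"Chloroplast CO2: {C_c:.0f} ubar (drawdown of {C_i - C_c:.0f} ubar)")
# A low g_m means a large drawdown: the leaf fixes less carbon than its
# stomatal opening alone would suggest.
```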

“Figuring out how much CO2 plants fix each year is a conundrum that scientists have been working on for a while,” Gu said. “The original estimate of 120 petagrams per year was established in the 1980s, and it stuck as we tried to figure out a new approach. It’s important that we get a good handle on global GPP since that initial land carbon uptake affects the rest of our representations of Earth’s carbon cycle.”

“We have to make sure the fundamental processes in the carbon cycle are properly represented in our larger-scale models,” Gu added. “For those Earth-scale simulations to work well, they need to represent the best understanding of the processes at work. This work represents a major step forward in terms of providing a definitive number.”

Pan-tropical rainforests accounted for the biggest difference between previous estimates and the new figures, a finding that was corroborated by ground measurements, Gu said. The discovery suggests that rainforests are a more important natural carbon sink than previously estimated using satellite data.

Understanding how much carbon can be stored in land ecosystems, especially in forests with their large accumulations of biomass in wood, is essential to making predictions of future climate change.

“Nailing down our estimates of GPP with reliable global-scale observations is a critical step in improving our predictions of future CO2 in the atmosphere, and the consequences for global climate,” said Peter Thornton, Corporate Fellow and lead for the Earth Systems Science Section at ORNL.

The results of this study point to the importance of including key processes, such as mesophyll conductance, in model representations of photosynthesis. DOE’s Next Generation Ecosystem Experiments in the Tropics has the goal of advancing model predictions of tropical forest carbon cycle response to climate change. These results can inform new model development that will reduce uncertainty in predictions of tropical forest GPP.

In addition to Cornell’s School of Integrative Plant Sciences, other collaborators on the project were Wageningen University and Research of the Netherlands, the Carnegie Institution for Science, Colorado State University, the University of California, Santa Cruz and the NASA Jet Propulsion Laboratory.

Support came from Cornell, the National Science Foundation and the ORNL TES-SFA, sponsored by DOE’s Office of Science Biological and Environmental Research program.


Shaking from April’s sizable New Jersey earthquake traveled strangely far

When a magnitude 4.8 earthquake struck northern New Jersey’s Tewksbury township on April 5, it triggered widespread alarm. Small tremors occur sporadically in the region, but this was the biggest since 1884, when a quake of approximately magnitude 5 struck under the seabed off Brooklyn, cracking walls and toppling chimneys.

Based on existing models, the earthquake should have done substantial damage at its epicenter, but that didn’t happen. Meanwhile, relatively distant New York City shook much harder than expected, causing damage, albeit minor. Outsize shaking extended all the way to Virginia and Maine. A new study suggests why this happened, calling into question some assumptions about regional earthquake hazard.

“There was some peculiar behavior,” said study coauthor Won-Young Kim of the Columbia Climate School’s Lamont-Doherty Earth Observatory.

While 4.8 is not a major quake in global terms, people in the highly populous U.S. Northeast are not used to anything that big. The U.S. Geological Survey (USGS) estimates it was felt by some 42 million people; a USGS online portal that crowd-sources first-person reports of shaking received nearly 184,000 entries — the most ever from any U.S. quake, according to a companion paper about the event. Both papers just appeared in the journal The Seismic Record.

Hours after the quake, Kim and colleagues headed to the epicenter to survey the situation. “We expected some property damage — chimneys knocked down, walls cracked or plaster fallen, but there were no obvious signs,” said Kim. “We talked to police officers, but they were not very excited about it. Like nothing happened. It was a surprising response for a magnitude 4.8 earthquake.”

Surface motion generated by earthquakes is measured on the Modified Mercalli Intensity Scale. Based on the magnitude, the quake’s depth (a fairly shallow 5 kilometers, or about 3.1 miles) and the area’s geology, existing models posit that a 10-kilometer area around the epicenter should have seen intensity VII shaking on this scale, described as “very strong.” Most well-designed and well-built structures would probably escape without much damage, but others of lesser design or materials could collapse, especially those with unreinforced masonry walls and chimneys.

However, no one at or around the epicenter reported intensity VII shaking or anything close to it. Damage was limited to minor cracking in some drywall and a few items knocked off shelves. The only exception was a grist mill built in the 1760s of unreinforced stone, already crumbling and largely a wreck. About 3.5 miles from the epicenter, part of the mill’s facade toppled.

Usually, earthquake shaking fades out in a more or less symmetrical bull’s-eye pattern around the source. But that did not happen either; stronger-than-expected shaking extended far out, mainly to the northeast, and to a lesser extent in other directions.

In Newark, N.J., some 20 miles from the epicenter, three row houses were partly toppled, and dozens of people had to be evacuated. Residents of New York City, 40 or 50 miles away, reported intensity IV motion, with sustained vibrations of windows, doors and walls. More than 150 buildings reported minor damage, mainly superficial cracks in masonry. However, inspectors ordered two Bronx buildings to erect protective sidewalk sheds when cracks appeared in their facades, and a Brooklyn public school had to close its gym for repairs because of vertical step-shaped cracks along an interior wall. Gas and water lines developed leaks as far off as the lower Hudson Valley, and on Long Island, the front of someone’s Jeep slumped into a suddenly opened sinkhole. Even people in parts of New Hampshire, some 280 miles away, reported intensity III shaking, similar to a big truck passing by.

To understand what happened, Kim and colleagues at South Korea’s Seoul National University analyzed so-called Lg waves. These are a type of low-frequency wave of energy that bounces back and forth between the Earth’s surface and the Moho — the boundary between the Earth’s crust and the mantle, which in this area lies about 35 kilometers down. The analysis suggested that the quake took place on a previously unmapped fault that runs south to north. The fault is not vertical, but rather dips eastward into the Earth at about a 45-degree angle.

According to the analysis, the movement was rapid and complex: a combination of the two sides of the fault sliding horizontally past each other (known as strike-slip motion) and one side also shoving itself up and over the other (known as thrust motion). Once the rupture started, it spread horizontally to the north. Usually much of the energy from such a quake takes the path of least resistance, heading straight up to the surface, where pressure on the rock is least. That is what makes the epicenter the most dangerous place to be.

That was not the case here, the researchers say. Instead, much of the energy headed downward along the fault’s dip and continued until it hit the Moho. Then it bounced back up, emerging, among other places, under New York City, which was right in the way. Then the wave bounced back down and re-emerged farther away in New England, somewhat weaker, and so on, until it petered out. The long-distance echoes were likely strengthened by the fact that most rocks underlying this region are hard and dense, and conduct energy efficiently, like the ringing of a bell.
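
The article’s own numbers make the geometry easy to check. Assuming a single straight ray traveling down at the fault’s roughly 45-degree dip, which is a deliberate simplification:

```python
# Rough geometric check of the Moho-bounce explanation (a simplification:
# one straight ray descending at the fault's ~45-degree dip).
import math

MOHO_DEPTH_KM = 35.0  # crust-mantle boundary depth in this region
DIP_DEG = 45.0        # dip of the newly identified fault

# Horizontal distance covered going down to the Moho, then back up:
leg_km = MOHO_DEPTH_KM / math.tan(math.radians(DIP_DEG))
round_trip_km = 2 * leg_km
print(f"Energy resurfaces ~{round_trip_km:.0f} km (~{round_trip_km * 0.621:.0f} mi) away")
# -> ~70 km (~43 miles) from the epicenter, consistent with the strong
#    shaking reported in New York City, 40 to 50 miles away.
```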

The area from Philadelphia to southwestern Connecticut has seen some 500 known quakes from the 1600s to the present, but many others almost certainly went unnoted before modern seismic instruments came along. Most are so faint that few if any people feel them, and the vast majority of the rest have been harmless. But the threat could be greater than previously thought, according to an earlier paper led by Lamont-Doherty seismologist Lynn Sykes.

These quakes are not caused by ongoing movements of giant tectonic plates like those in much more hazardous places like California. Rather, they emanate from ancient fault zones dating as far back as 200 million years, when what is now Europe tore away from what is now North America, cracking up the subsurface with massive earthquakes. Some of these crumbly zones are still settling and readjusting themselves, and occasionally parts of them move with a jolt.

Based on the short historical record, quakes the size of April’s or slightly larger come along roughly every 100 years. But based on the sizes of known faults and other calculations, Sykes et al. have suggested that the area could see a magnitude 6 every 700 years, and a magnitude 7 every 3,400 years. The magnitude scale is logarithmic: each whole-number step corresponds to a tenfold increase in recorded ground motion and roughly a 32-fold increase in released energy. No one knows whether such quakes have occurred in the region during human history, or whether they could, but if one did, it would be catastrophic.
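
To put those magnitudes on a common energy scale, the sketch below applies the standard Gutenberg-Richter energy relation; this is textbook seismology, not a calculation from the papers discussed here.

```python
# Relative earthquake energies via the Gutenberg-Richter relation,
# log10(E) = 1.5 * M + 4.8 (E in joules). Illustrative only.

def seismic_energy_joules(magnitude: float) -> float:
    """Approximate radiated seismic energy for a given magnitude."""
    return 10 ** (1.5 * magnitude + 4.8)

for m in (5.0, 6.0, 7.0):
    ratio = seismic_energy_joules(m) / seismic_energy_joules(4.8)
    print(f"M{m}: ~{ratio:,.0f}x the energy of April's M4.8 quake")
# -> M5.0 ~2x, M6.0 ~63x, M7.0 ~1,995x
```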

The April 5 quake has brought about a spurt of new research. In cooperation with the USGS and other researchers, Kim helped place a temporary network of dozens of seismometers near the epicenter to monitor aftershocks, which continued for weeks. These signals are being used to better map various details of the quake, and the area’s faults.

Lamont-Doherty structural geologist Folarin Kolawole and colleagues have been mapping numerous bedrock fractures near the epicenter caused by past earthquakes of indeterminate ages. These could well be millions of years old, says Kolawole, but they could also point to current, unmapped zones of weakness lurking below.

Meanwhile, Lamont-Doherty geologist William Menke is working to document possible prehistoric quakes in the more recent past. New York’s Harriman State Park, just over the border from New Jersey, is littered with giant boulders dropped onto the surface when glaciers from the last ice age melted, some 15,000 to 20,000 years ago. Many are precariously balanced in their original positions. Menke’s hypothesis: if he can calculate the earthquake force that would be required to tip the boulders over, he can rule out an earthquake of that size, at least for that time period.
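
The physics behind this idea can be sketched with a quasi-static toppling criterion for a rigid block. This is an illustration of the general approach to precariously balanced rocks, not Menke’s actual calculation.

```python
# Quasi-static toppling criterion for a precariously balanced boulder
# (illustrative sketch, not Menke's method). A rigid block starts to rock
# when horizontal ground acceleration exceeds g * (half-width / height of
# center of mass above the rocking edge).

G = 9.81  # gravitational acceleration, m/s^2

def toppling_acceleration(half_width_m: float, com_height_m: float) -> float:
    """Minimum horizontal acceleration (m/s^2) needed to start rocking."""
    return G * half_width_m / com_height_m

# Hypothetical glacial boulder: 1.0 m half-width, center of mass 2.5 m up.
a_min = toppling_acceleration(1.0, 2.5)
print(f"Toppling threshold: {a_min:.2f} m/s^2 (~{a_min / G:.2f} g)")
# If the boulder has stood for ~15,000-20,000 years, peak ground
# accelerations at the site have presumably stayed below this threshold.
```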

Kim said that the new study suggests the need to re-evaluate how shaking from any future sizable quake may be distributed across the region. “Some that are not even that big could maybe focus energy toward population centers. If [the April] earthquake was just a little stronger, or a little closer to New York City, the effect would be much greater,” he said. “We need to understand this phenomenon and its implications for ground motion prediction.”

The study’s lead author is YoungHee Kim; the other coauthors are Sangwoo Han, Jun Yong Park and Min-Seong Seo, all of Seoul National University.


Creating a spatial map of the sea: New research visualizes how fishing communities can change fishing habits to adapt to climate change

In a massive research project spanning five years and stretching the length of the Northeast seaboard, a Wellesley College professor is examining how various fishing communities can change their fishing habits in order to adapt to climate change.

Rebecca Selden, an assistant professor of biological sciences at Wellesley, is creating a “spatial sea map” designed to illustrate the adaptive styles of 266 fishing communities stretching across the East Coast from North Carolina to Maine. Her research, conducted with colleagues from four other institutions, was published on October 15 in the ICES Journal of Marine Science.

Selden’s project is one of the first to provide detailed, high-resolution information about how individual communities could change their fishing patterns in response to climate change. Such community-level information is critical to understanding which communities might be most vulnerable or resilient to changes in the distribution of the species they fish.

Climate change is altering the seascape in many ways, Selden notes. The water itself is changing, becoming more acidified, for example. Species are relocating to adapt to shifting water temperatures. And the ocean is being used in new and different ways, for instance for ocean-based wind turbines.

“These changes can challenge fishing communities that rely on marine resources,” Selden says. “Many communities have diversified what they catch, or where they fish, to cope with changes in where fish are found.” But too often, we don’t understand how specific communities are adapting. “We need to understand what works in different locations,” Selden notes.

In order to better understand how individual communities are adapting, Selden helped develop a Communities at Sea (CaS) database that links historical fishing patterns at sea to port communities. This allows researchers and policy makers to quantitatively evaluate catch flexibility, catch switching, and fishing ground mobility of fishing communities in the Northeast U.S.
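
To make “catch flexibility” concrete, one common way to quantify it is an effective diversity index over a community’s landings mix. The sketch below is illustrative only; the paper’s actual metrics may differ.

```python
# Illustrative catch-flexibility metric (not necessarily the paper's):
# the effective number of species in a community's landings profile,
# computed as the exponential of Shannon entropy.
import math

def effective_catch_diversity(landings: dict[str, float]) -> float:
    """Effective species count; 1.0 = single-species, higher = more flexible."""
    total = sum(landings.values())
    shannon = -sum(
        (v / total) * math.log(v / total) for v in landings.values() if v > 0
    )
    return math.exp(shannon)

# Hypothetical trap-fleet landings (metric tons), before and after a
# lobster decline like the one seen in Southern New England:
before = {"lobster": 90.0, "jonah_crab": 8.0, "whelk": 2.0}
after = {"lobster": 40.0, "jonah_crab": 35.0, "whelk": 25.0}
print(f"{effective_catch_diversity(before):.1f}")  # -> 1.5
print(f"{effective_catch_diversity(after):.1f}")   # -> 2.9
```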

Her analysis shows that different communities, and different gear types, are adapting in different ways to changes in the seascape. For example, dredge fleets that specialize on scallop are highly mobile, which bodes well for being able to follow scallop as they move north or deeper. Trawl fleets, on the other hand, are more likely to maintain their traditional fishing locations, but change what they’re fishing for. Lobster trap communities in the Gulf of Maine maintained a focus on both their traditional fishing location and their traditional catch. Meanwhile, Southern New England trap fleets dealing with declines in lobster populations have shown surprising levels of catch flexibility, focusing less on lobster and more on emerging fisheries like Jonah crab and whelk.

As a result, ports that are only separated by tens of miles can have dramatically different capacities for change — and dramatically different methods of change — based on the portfolio of fleets that land there.

The resulting “spatial seascape” that Selden has created yields important community-level information that is critical to identifying which fishing communities are least able to change what they catch and where they fish. This information will help prioritize efforts to reduce barriers to diversification, develop new markets, and create policies to allow fishermen to successfully adapt to changing ocean conditions, Selden notes.

“Fishermen are not averse to adaptation,” Selden says. “The challenges [with adaptation] tend to be associated with regulations and policies, not fishermen or fishing communities themselves.”

Some of the communities in Selden’s study are small and perhaps relatively unfamiliar, like Harwichport, Massachusetts, with fewer than five vessels in each of the two fleets that currently land there. Other communities in the study are larger and better known, including New Bedford, Massachusetts, the nation’s top port in terms of total fishery revenue for the last 20 years.

One goal of Selden’s research is to draw attention to some of the smaller communities, which might otherwise be less likely to benefit from new policies, Selden says.

Selden is now working to compare the adaptive capacity of fishing communities on the U.S. East and West Coasts through a collaboration with natural and social scientists at NOAA’s Northwest Fisheries Science Center.

And Selden’s big hope is that her research can help influence policy decisions. “So many fishermen have an immense capacity for change,” she says. “It’s often regulations that are the limiting factor. By creating a complete marine spatial map of the lower 48 states, we can help people make better decisions at the local, regional and international levels.”


Jokey reform ideas removed from NHS website

The health department says it is removing “irrelevant” posts from an online consultation.
