New high-resolution 3D maps show how the brain’s blood vessels change with age

Healthy blood vessels matter for more than just heart health. Vascular well-being is critical for brain health and potentially for addressing age-related cognitive decline and neurodegenerative disorders like Alzheimer’s disease, according to a new study led by Penn State researchers. The findings point to an understudied but potentially key role that the brain’s vascular network, its energy infrastructure, plays in the onset of neurodegenerative disease.

They published their work today (July 30) in Nature Communications.

Using advanced imaging techniques, the team developed maps of a mouse brain that illustrate how vascular cells and structures like blood vessels change with age, and identified areas that are vulnerable to deterioration. When blood vessels degrade, nerve cells in the brain, called neurons, are starved of energy, causing them to malfunction or die. This degradation can lead to a condition called vascular dementia, the second leading cause of cognitive impairment in older adults, and to symptoms like sleep disturbance.

“With something like Alzheimer’s disease, by the time you can see vascular changes and significant brain shrinkage on an MRI, cell death has already occurred. We need to understand how these cells and structures change before a major catastrophe happens,” said Yongsoo Kim, associate professor of neural and behavioral sciences at Penn State College of Medicine and senior author of the study. “This study provides early signs of neurodegenerative disorders, potentially leading to earlier diagnosis, and clues for how we can slow down the aging process and cognitive changes.”

According to Kim, aging is one of the primary factors involved in neurodegenerative disorders.

“Yet, we really don’t have a good baseline understanding of how normal aging itself changes the brain, particularly the brain’s vasculature,” Kim said. And with the aging population in the United States growing, he said it’s critical to understand these changes, especially within the network of blood vessels.

Blood vessels, especially micro-vessels, regulate oxygen and energy supply and waste removal to and from neurons. Despite their importance, Kim said, most existing research focuses on how neuron structure and function degenerates over time, rather than the vasculature. When researchers do study the brain’s vasculature, they’ve primarily examined larger blood vessels or focused on a single, easy-to-access region of the brain, the somatosensory cortex. More importantly, typical neuroimaging techniques, like MRI, don’t provide high enough resolution to see what’s happening in the tiny blood vessels, which make up 80% to 85% of the brain’s vasculature, according to Kim.

Kim and the research team produced a detailed map of the vascular network of the whole mouse brain using two high-resolution 3D mapping techniques: serial two-photon tomography (which creates a series of stacked 2D images) and light sheet fluorescence microscopy (which images intact 3D samples to visualize the whole brain at single-cell resolution). They imaged the brains of young and old mice to chart vasculature changes across the brain with normal aging.

“Because we’re doing high-resolution mapping, we can reconstruct the whole vascular structure and scan the entire brain to pinpoint areas that undergo selective degeneration with age,” Kim said. “What we found is that the area that most people study showed the least amount of change, whereas profound change happens in the deep areas of the brain. This suggests that we’ve been looking at the wrong area when it comes to aging studies.”

The images showed that changes in the vascular network don’t occur equally across the brain. Rather, they were concentrated in the basal forebrain, deep cortical layers and hippocampal network, suggesting these areas are more vulnerable to vascular degeneration. These regions play a role in attention, sleep, memory processing and storage, among other functions.

As brains age, vascular length and branching density decrease by approximately 10%, indicating that there’s a sparser network to distribute blood. Arteries in older brains also appear more twisted than those in younger brains, which can impede blood flow, especially to areas further away from the main arteries, like the deep cortical layers, Kim explained.
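
To make that kind of comparison concrete, here is a toy sketch of the region-by-region bookkeeping such a whole-brain map enables. The arrays, region labels and densities below are synthetic assumptions for illustration only; the actual study reconstructs full vessel structures from the imaging data rather than working from made-up masks.

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (64, 64, 64)

# Assumed region labels and synthetic binary vessel masks for two age groups.
regions = rng.integers(1, 5, size=shape)          # 1-4: made-up region codes
young = rng.random(shape) < 0.040                 # vessel voxels, young brain (synthetic)
aged = rng.random(shape) < 0.035                  # vessel voxels, aged brain (synthetic)

names = {1: "cortex", 2: "basal forebrain", 3: "hippocampus", 4: "other"}
for code, name in names.items():
    mask = regions == code
    dy, da = young[mask].mean(), aged[mask].mean()
    print(f"{name:16s} vascular density: young {dy:.3f}, aged {da:.3f}, "
          f"change {100 * (da - dy) / dy:+.1f}%")
```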

The team also examined functional changes in the vasculature and found that the system responds more slowly in older brains. That means it can’t provide neurons with energy as quickly and readily as the cells may need. There’s also a loss of pericytes, a type of cell that regulates blood supply and blood vessel permeability. As a result, the blood vessels become “leaky,” compromising the blood-brain barrier.

This study builds on the group’s previous research, in which they mapped the vasculature of a young mouse brain. Next, they are studying how Alzheimer’s disease-induced changes in the brain influence vascular health and neuronal function. Ultimately, they said, they hope their work will lead to treatments for neurodegenerative disorders.

Hannah Bennett, a dual medical degree and doctoral degree student, and Steffy Manjila, a postdoctoral scholar, co-led the study along with Quingguang Zhang, who was an assistant research professor at Penn State at the time of the research and is currently an assistant professor at Michigan State University, and Yuan-ting Wu, who was previously a research scientist at Penn State and is currently a project scientist at Cedars-Sinai Medical Center. Other Penn State authors on the paper include: Patrick Drew, professor of engineering science and mechanics, of neurosurgery, of biology and of biomedical engineering and interim director of the Huck Institutes of the Life Sciences; Uree Chon, research technician; Donghui Shin, research technologist; Daniel Vanselow, research project manager; and Hyun-Jae Pi, data scientist.

The National Institutes of Health and the American Heart Association funded this work.


Genes or environment? A new model for understanding disease risk factors

Every disease is shaped by a genetic component as well as environmental factors like air pollution, climate and socioeconomic status. However, the extent to which genetics or environment plays a role in disease risk — and how much can be attributed to each — isn’t well understood. As such, the actions individuals can take to reduce their risk for disease aren’t often clear.

A team led by Penn State College of Medicine researchers found a way to tease apart the genetic and environmental contributions to disease risk using a large, nationally representative sample. They found that, in some cases, previous assessments overstated the contribution of genes to disease risk and that lifestyle and environmental factors play a larger role than previously believed. Unlike genetics, environmental factors, like exposure to air pollution, can be more easily modified, which means there are potentially more opportunities to mitigate disease risk. The researchers published their work in Nature Communications.

“We’re trying to disentangle how much genetics and how much the environment influences the development of disease. If we more accurately understand how each contributes, we can better predict disease risk and design more effective interventions, particularly in the era of precision medicine,” said Bibo Jiang, assistant professor of public health sciences at the Penn State College of Medicine and senior author of the study.

The researchers said that in the past, it’s been difficult to quantify and measure environmental risk factors since they can encompass everything from diet and exercise to climate. However, if environmental factors aren’t considered in models of disease risk, analyses may falsely attribute the shared disease risks among family members to genetics.

“People living in the same neighborhood share the same level of air pollution, socioeconomic status, access to health care providers and food environment,” said Dajiang Liu, distinguished professor, vice chair for research, director of artificial intelligence and biomedical informatics at the Penn State College of Medicine and co-senior author of the study. “If we can tease apart these shared environments, what’s remaining could more accurately reflect genetic heritability of disease.”

In this study, the team developed a spatial mixed linear effect (SMILE) model that incorporates both genetics and geolocation data. Geolocation — a person’s approximate geographical location — served as a surrogate measure for community-level environmental risk factors.
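
As a rough illustration of what a model combining genetic relatedness with location-based shared environment can look like, here is a hedged sketch in Python. The kernels, simulated data and variable names are assumptions chosen for demonstration; this is not the SMILE model's actual implementation, only the general idea of estimating separate genetic and spatial variance components.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 300

# Simulated inputs: a genetic relatedness matrix and 2D geolocations per person.
G = rng.standard_normal((n, 500))                 # genotype-like matrix (assumed)
K_gen = G @ G.T / G.shape[1]                      # genetic relatedness matrix
loc = rng.uniform(0, 10, size=(n, 2))             # approximate geolocation (assumed units)
d2 = ((loc[:, None, :] - loc[None, :, :]) ** 2).sum(-1)
K_spa = np.exp(-d2 / 2.0)                         # spatial "shared environment" kernel

# Simulate a phenotype with genetic, spatial and residual components.
def draw(K, var):
    L = np.linalg.cholesky(K + 1e-8 * np.eye(n))
    return np.sqrt(var) * (L @ rng.standard_normal(n))

y = draw(K_gen, 0.3) + draw(K_spa, 0.3) + np.sqrt(0.4) * rng.standard_normal(n)

# Estimate the three variance components by maximum likelihood.
def neg_loglik(log_vars):
    vg, vs, ve = np.exp(log_vars)
    V = vg * K_gen + vs * K_spa + ve * np.eye(n)
    _, logdet = np.linalg.slogdet(V)
    return 0.5 * (logdet + y @ np.linalg.solve(V, y))

res = minimize(neg_loglik, x0=np.log([0.5, 0.5, 0.5]), method="L-BFGS-B")
vg, vs, ve = np.exp(res.x)
total = vg + vs + ve
print(f"genetic {vg/total:.2f}, shared environment {vs/total:.2f}, residual {ve/total:.2f}")
```

If the spatial kernel is dropped from this sketch, the likelihood has nowhere to put the location-driven similarity except the genetic term, which is the intuition behind the recalibrated estimates described below.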

Using data from IBM MarketScan, a health insurance claims database with electronic health records from more than 50 million individuals covered by employer-based health insurance policies in the United States, the research team extracted information on more than 257,000 nuclear families and compiled disease outcomes for 1,083 diseases. They then augmented the data with publicly available environmental data, including climate and sociodemographic data as well as levels of fine particulate matter (PM2.5) and nitrogen dioxide (NO2).
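
For illustration only, here is a minimal sketch of how community-level exposures can be attached to family-level records by shared location. The column names and toy rows are hypothetical (MarketScan records are not public), and this shows only the data-joining step, not the study's full pipeline.

```python
import pandas as pd

# Toy family-level records: one row per member, with made-up disease indicators.
families = pd.DataFrame({
    "family_id":  [1, 1, 2],
    "member_id":  [10, 11, 20],
    "zip3":       ["170", "170", "941"],
    "dx_t2d":     [1, 0, 0],
    "dx_obesity": [0, 1, 1],
})

# Toy community-level environmental measures keyed on the same area code.
env = pd.DataFrame({
    "zip3": ["170", "941"],
    "pm25": [9.8, 7.2],
    "no2":  [14.1, 18.5],
})

# Every member of a family living in the same area gets the same exposures.
augmented = families.merge(env, on="zip3", how="left")

# Long format: one row per (member, disease), ready for a model that includes
# both within-family genetic relatedness and shared-environment terms.
outcomes = augmented.melt(
    id_vars=["family_id", "member_id", "zip3", "pm25", "no2"],
    value_vars=["dx_t2d", "dx_obesity"],
    var_name="disease", value_name="affected",
)
print(outcomes)
```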

The team’s analysis led to more refined estimates of the contributors to disease risk. For example, previous studies concluded that genetics contributed 37.7% of the risk of developing Type 2 diabetes. When the research team reassessed the data with their model, which accounts for environmental effects, the estimated genetic contribution to Type 2 diabetes risk decreased to 28.4%, meaning a bigger share of the risk can be attributed to environmental factors. Similarly, the estimated genetic contribution to obesity risk decreased from 53.1% to 46.3% when adjusted for environmental factors.
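
A simplified way to see why modeling shared environment shrinks the genetic estimate (a textbook-style variance decomposition, not the paper's exact formulation):

\[
h^2 = \frac{\sigma^2_g}{\sigma^2_g + \sigma^2_c + \sigma^2_e},
\qquad
\tilde{h}^2 = \frac{\sigma^2_g + \sigma^2_c}{\sigma^2_g + \sigma^2_c + \sigma^2_e} \ge h^2,
\]

where \(\sigma^2_g\) is genetic variance, \(\sigma^2_c\) is variance from community-level environment shared within families, and \(\sigma^2_e\) is residual variance. A model with no \(\sigma^2_c\) term absorbs that shared-environment variance into the genetic component and reports the inflated \(\tilde{h}^2\) rather than \(h^2\).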

“Previous studies concluded that genetics played a much larger role in disease risk prediction, and our study recalibrated those numbers,” Liu said. “That means that people can stay hopeful even though they have family members with Type 2 diabetes, for example, because there’s a lot they can do to reduce their own risk.”

The research team also used the data to quantitatively assess whether two specific air pollutants, PM2.5 and NO2, causally influence disease risks. Previous studies, the researchers said, lump PM2.5 and NO2 together as one collective measure of air pollution. In this study, however, they found that the two pollutants have distinct causal relationships with health conditions. For instance, the analysis indicated that NO2, but not PM2.5, directly contributes to conditions like high cholesterol, irritable bowel syndrome and both Type 1 and Type 2 diabetes. PM2.5, on the other hand, may have a more direct causal effect on lung function and sleep disorders.

Ultimately, the researchers said this model will allow for a more in-depth look at questions about why some diseases may be more prevalent in certain geographic locations.

Other Penn State authors on the paper include: Havell Markus and Austin Montgomery, both dual medical degree and doctoral degree students at the Penn State College of Medicine; Laura Carrel, professor of biochemistry and molecular biology; Arthur Berg, professor of public health sciences; and Qunhua Li, professor of statistics. Daniel McGuire, who was a doctoral student in the biostatistics program at the time of the research, co-led the study. Co-authors Lina Yang and Jingyu Xu, who were doctoral students in the biostatistics program at the time of the research, also contributed to the paper.

The National Institutes of Health and the Penn State College of Medicine’s artificial intelligence and biomedical informatics pilot funding program supported this work in part. Some of the materials employed in this work were provided by the Center for Applied Studies in Health Economics at the Penn State College of Medicine.


UK swelters as hottest day of the year confirmed

Temperatures on Tuesday reached 32C, exceeding the previous hottest day of the year, set earlier in the month.


Super-black wood can improve telescopes, optical devices and consumer goods

Thanks to an accidental discovery, researchers at the University of British Columbia have created a new super-black material that absorbs almost all light, opening potential applications in fine jewelry, solar cells and precision optical devices.

Professor Philip Evans and PhD student Kenny Cheng were experimenting with high-energy plasma to make wood more water-repellent. However, when they applied the technique to the cut ends of wood cells, the surfaces turned extremely black.

Measurements by Texas A&M University’s department of physics and astronomy confirmed that the material reflected less than one per cent of visible light, absorbing almost all the light that struck it.

Instead of discarding this accidental finding, the team decided to shift their focus to designing super-black materials, contributing a new approach to the search for the darkest materials on Earth.

“Ultra-black or super-black material can absorb more than 99 per cent of the light that strikes it — significantly more so than normal black paint, which absorbs about 97.5 per cent of light,” explained Dr. Evans, a professor in the faculty of forestry and BC Leadership Chair in Advanced Forest Products Manufacturing Technology.

Super-black materials are increasingly sought after in astronomy, where ultra-black coatings on devices help reduce stray light and improve image clarity. Super-black coatings can enhance the efficiency of solar cells. They are also used in making art pieces and luxury consumer items like watches.

The researchers have developed prototype commercial products using their super-black wood, initially focusing on watches and jewelry, with plans to explore other commercial applications in the future.

Wonder wood

The team named and trademarked their discovery Nxylon (niks-uh-lon), after Nyx, the Greek goddess of the night, and xylon, the Greek word for wood.

Most surprisingly, Nxylon remains black even when coated with an alloy, such as the gold coating applied to the wood to make it electrically conductive enough to be viewed and studied using an electron microscope. This is because Nxylon’s structure inherently prevents light from escaping rather than depending on black pigments.

The UBC team has demonstrated that Nxylon can replace expensive and rare black woods like ebony and rosewood for watch faces, and that it can be used in jewelry to replace the black gemstone onyx.

“Nxylon’s composition combines the benefits of natural materials with unique structural features, making it lightweight, stiff and easy to cut into intricate shapes,” said Dr. Evans.

Nxylon is made from basswood, a tree widely found in North America and valued for hand carving, boxes, shutters and musical instruments, but it can also be made from other types of wood, such as European lime wood.

Breathing new life into forestry

Dr. Evans and his colleagues plan to launch a startup, Nxylon Corporation of Canada, to scale up applications of Nxylon in collaboration with jewellers, artists and tech product designers. They also plan to develop a commercial-scale plasma reactor to produce larger super-black wood samples suitable for non-reflective ceiling and wall tiles.

“Nxylon can be made from sustainable and renewable materials widely found in North America and Europe, leading to new applications for wood. The wood industry in B.C. is often seen as a sunset industry focused on commodity products — our research demonstrates its great untapped potential,” said Dr. Evans.

Other researchers who contributed to this work include Vickie Ma, Dengcheng Feng and Sara Xu (all from UBC’s faculty of forestry); Luke Schmidt (Texas A&M); and Mick Turner (The Australian National University).


Ditching of social care plan is a tragedy – Dilnot

The plan would have introduced an £86,000 cap on the amount an older or disabled person would have to pay towards their support.


MicroRNA study sets stage for crop improvements

MicroRNAs can make plants more capable of withstanding drought, salinity, pathogens and more. However, in a recent study published in Nature Plants, Texas A&M AgriLife Research scientists showed just how much we didn’t know about the intricate processes plants use to produce them.

MicroRNAs are small molecules that can guide proteins to decrease gene expression, and engineering artificial versions allows scientists to target specific genes for crop improvement.

“Though these microRNA molecules are very tiny, their impacts are huge,” said Xiuren Zhang, Ph.D., Christine Richardson Endowed Professor in the Texas A&M College of Agriculture and Life Sciences Department of Biochemistry and Biophysics, adjunct professor in the Texas A&M College of Arts and Sciences Department of Biology, and principal investigator of the study.

Changhao Li, Ph.D., and Xingxing Yan served as co-first authors of the study, with supervision from Xiuren Zhang, Ph.D. The team’s work has substantially revised the current understanding of microRNA biogenesis in the model organism Arabidopsis thaliana. (Jiaying Zhu/Texas A&M AgriLife)

Using precise mutations and a clever experimental design, Texas A&M AgriLife researchers reevaluated the landscape of microRNAs in the model organism Arabidopsis thaliana and found that fewer than half of them were correctly identified as microRNAs, while the others are miscategorized or require further investigation.

In addition to clarifying genuine microRNA molecules in Arabidopsis thaliana, the study supplies an effective experimental design for repeating the analysis in other crops and even in animals, which likely need a similar review. The team’s discoveries also helped them create updated guidelines for designing artificial microRNAs, opening the door to improvement in crops like corn, wheat, soybeans and rice.

Xingxing Yan, a graduate research assistant, and Changhao Li, Ph.D., a postdoctoral research associate, were co-first authors of the study. It was funded by the National Institutes of Health, National Science Foundation and the Welch Foundation.

A decade-old endeavor

MicroRNAs have a fairly uniform length of 21 to 24 nucleotides. But in plants, Zhang said, their precursors come in a range of shapes and sizes.

Because of the precursors’ structural diversity, determining which key features are most important for their processing has been a challenge, and it’s left the question of how microRNAs are generated in plants largely unexplored and unverified.

Arabidopsis thaliana, also known as thale cress and mouse-ear cress, is a model organism for plant biology. Its relatively small genome, quick growth and production of many seeds make it exceptionally useful in research. (Xingxing Yan/Texas A&M AgriLife)

About 10 years ago, Zhang said, he and his lab found a pattern between a loop on the precursor microRNA structure and the first cut site. This initial cut is significant because it determines the first nucleotide on the mature microRNA molecule, an important factor for directing it to the correct location in a cell.

Unfortunately, of the 326 posited microRNA precursors in Arabidopsis thaliana, only a few had the ideal reference loop that Zhang’s lab found — according to the computational models, at least.

“The models are based on pure chemistry,” Zhang said. “They focus only on the free energy, on what should be the most stable form. But it couldn’t explain why so many diverse precursors can end up with products of the same size.”

Rather than relying on the models, Zhang’s lab sought to verify the microRNA precursors within plants. They wanted to find the first cut sites on the precursors and confirm their structural determinants within cells.

Unexpected findings

To do this, the researchers made highly specific mutations to the dicer protein, which, as its name implies, is responsible for making precise cuts to the microRNA precursor. Normally, the protein acts like two hands that hold a double strand of precursor RNA and cut at a site in each strand concurrently before releasing the RNA molecule.

“We made point mutations at two locations separately in the dicer-like protein to make them semi-active,” Yan said. “That way, they can only cut one strand and stop before further processing. This gives us a chance to capture the intermediate products of the microRNA precursor, telling us the initial processing sites and that first nucleotide.”

Their results showed that only 147 of the 326 posited microRNA precursors interact with the dicer protein definitively, marking these as genuine microRNA precursors. Eighty-one didn’t interact at all, suggesting they should be reclassified as a different type of RNA. Around 100 require further investigation.

The team also used an advanced high-throughput technique and new computational method to map out the structures of microRNA precursors in their natural cell conditions and found that, of the 147 genuine microRNA molecules, about 95% of their structures in cells differed from computer predictions.

“We found several results quite different from predictions and from the literature,” Li said. “We were able to combine biochemical results with next-generation sequencing to get more information, and now our understanding of the structures is much more accurate.”

The future

The team still has more microRNA precursors to validate in Arabidopsis thaliana, but Zhang said they are excited to pursue collaborations to investigate microRNA processing in agricultural crops for more practical applications.

“We want to find out more about what kind of microRNAs are in other crops, how they’re processed and how we can make artificial microRNAs in them,” he said. “This study provides resources that can be used widely, and now we can use it to revisit other crops, find what needs to be corrected, and see what else we can do with this tool.”


Study reveals link between transthyretin levels and heart disease risk

Physician-scientists from the University of Alabama at Birmingham Marnix E. Heersink School of Medicine have uncovered significant findings regarding the impact of transthyretin, or TTR, protein levels on heart disease risk. The study, recently published in Nature Communications, explores how variations in TTR levels are associated with adverse clinical outcomes, providing new insights into the prevention and management of amyloid heart disease. Transthyretin is a transport protein produced in the liver, and its misfolding is linked to the development of cardiac amyloidosis, a condition that leads to heart failure and increased mortality.

The study, led by Pankaj Arora, M.D., and Naman Shetty, M.D., examined data from 35,206 participants in the UK Biobank. The researchers investigated the clinical correlates of TTR levels, differences in TTR levels based on genetic variations and the association of TTR levels with health outcomes.

Arora and his team found that lower TTR levels are significantly associated with an increased risk of heart failure and all-cause mortality. Specifically, individuals with low TTR levels had a 17 percent higher risk of heart failure and an 18 percent higher risk of death from any cause compared to those with higher TTR levels. These findings were even more pronounced in individuals carrying the V142I TTR gene variant, which is known to destabilize the TTR protein.

The study revealed that TTR levels were lower in females compared to males and were influenced by several health factors. Higher systolic and diastolic blood pressure, total cholesterol, albumin levels, triglyceride levels, and creatinine levels were associated with increased TTR levels. Higher C-reactive protein levels were linked to lower TTR levels. Notably, carriers of the V142I TTR gene variant had significantly lower TTR levels compared to non-carriers, highlighting a genetic influence on this protein.

“Our research highlights the critical role of TTR levels in predicting heart disease risk,” Arora said. “By understanding the factors that influence TTR levels, we can better identify individuals at high risk and develop targeted interventions to prevent adverse outcomes.”

“These findings underscore the potential benefits of incorporating TTR level measurements in screening programs, especially for individuals with genetic predispositions,” Shetty said.

Arora, the senior author and a cardiologist at the UAB Cardiovascular Institute, says the implications of this study are far-reaching. It suggests that monitoring TTR levels could be a valuable tool in managing heart disease risk, particularly for those with known genetic variations like the V142I TTR variant. Low TTR levels raise the pre-test probability of a positive result on genetic testing for the V142I variant, testing that typically takes time to return results.

“This information can be used to counsel family members while they await the results of genetic testing,” Arora said. “This research marks a significant step forward in the quest to understand and mitigate the risks associated with cardiac amyloidosis and other heart-related conditions.”


NASA data shows July 22, 2024 was Earth’s hottest day on record

July 22, 2024, was the hottest day on record, according to a NASA analysis of global daily temperature data. July 21 and 23 of this year also exceeded the previous daily record, set in July 2023. These record-breaking temperatures are part of a long-term warming trend driven by human activities, primarily the emission of greenhouse gases. As part of its mission to expand our understanding of Earth, NASA collects critical long-term observations of our changing planet.

“In a year that has been the hottest on record to date, these past two weeks have been particularly brutal,” said NASA Administrator Bill Nelson. “Through our over two dozen Earth-observing satellites and over 60 years of data, NASA is providing critical analyses of how our planet is changing and how local communities can prepare, adapt, and stay safe. We are proud to be part of the Biden-Harris Administration efforts to protect communities from extreme heat.”

This preliminary finding comes from analyses of data from the Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2) and Goddard Earth Observing System Forward Processing (GEOS-FP) systems, which combine millions of global observations from instruments on land, sea, air and satellites using atmospheric models. GEOS-FP provides rapid, near-real-time weather data, while the MERRA-2 climate reanalysis takes longer but ensures the use of the best-quality observations. These models are run by the Global Modeling and Assimilation Office (GMAO) at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

Daily global average temperature values from MERRA-2 for the years 1980-2022 are shown in white, values for the year 2023 are shown in pink, and values for 2024 through June are shown in red. Daily global temperature values from July 1 to 23, 2024, from GEOS-FP are shown in purple. The results agree with an independent analysis from the European Union’s Copernicus Earth Observation Programme. While the analyses have small differences, they show broad agreement in the change in temperature over time and in the hottest days.
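
As a rough sketch of the underlying computation (synthetic data and assumed variable names, not NASA GMAO's actual pipeline), a daily global mean is an area-weighted average of the gridded temperature field, with each latitude band weighted by the cosine of its latitude so that shrinking polar grid cells are not over-counted:

```python
import numpy as np
import pandas as pd
import xarray as xr

# Synthetic stand-in for a daily 2 m air temperature field (kelvin) on a lat/lon grid;
# variable and coordinate names here are assumptions, not the MERRA-2/GEOS-FP schema.
time = pd.date_range("2024-07-01", periods=23, freq="D")
lat = np.arange(-89.5, 90.0, 1.0)
lon = np.arange(-180.0, 180.0, 1.25)
rng = np.random.default_rng(0)
t2m = xr.DataArray(
    288.0 + rng.normal(0.0, 1.0, size=(len(time), len(lat), len(lon))),
    coords={"time": time, "lat": lat, "lon": lon},
    dims=("time", "lat", "lon"),
    name="t2m",
)

# Weight each latitude band by cos(latitude), average over the globe, convert to Celsius.
weights = np.cos(np.deg2rad(t2m["lat"]))
global_mean_c = t2m.weighted(weights).mean(dim=("lat", "lon")) - 273.15

hottest = global_mean_c.idxmax(dim="time")
print(f"Hottest day in this synthetic sample: {hottest.values}, "
      f"{float(global_mean_c.max()):.2f} C")
```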

The latest daily temperature records follow 13 months of consecutive monthly temperature records, according to scientists from NASA’s Goddard Institute for Space Studies in New York. Their analysis was based on the GISTEMP record, which uses surface instrumental data alone and provides a longer-term view of changes in global temperatures at monthly and annual resolutions going back to the late 19th century.


Junior doctors offered 22% pay rise in deal to end strike action

It is thought the new pay deal being offered is worth 22%, on average, over two years.


Puberty blockers ban is lawful, says High Court

A ban on puberty blockers introduced by the last government using emergency legislation was lawful, the High Court rules.
