NASA’s Hubble traces hidden history of Andromeda galaxy

In the years following the launch of NASA’s Hubble Space Telescope, astronomers have tallied over 1 trillion galaxies in the universe. But only one galaxy stands out as the most important nearby stellar island to our Milky Way — the magnificent Andromeda galaxy (Messier 31). It can be seen with the naked eye on a very clear autumn night as a faint cigar-shaped object roughly the apparent angular diameter of our Moon.

A century ago, Edwin Hubble first established that this so-called “spiral nebula” was actually very far outside our own Milky Way galaxy — at a distance of approximately 2.5 million light-years, or roughly 25 Milky Way diameters. Prior to that, astronomers had long thought that the Milky Way encompassed the entire universe. Overnight, Hubble’s discovery turned cosmology upside down by unveiling an infinitely grander universe.

Now, a century later, the space telescope named for Hubble has accomplished the most comprehensive survey of this enticing empire of stars. The Hubble telescope is yielding new clues to the evolutionary history of Andromeda, and that history looks markedly different from the Milky Way’s.

Without Andromeda as a proxy for spiral galaxies in the universe at large, astronomers would know much less about the structure and evolution of our own Milky Way. That’s because we are embedded inside the Milky Way. This is like trying to understand the layout of New York City by standing in the middle of Central Park.

“With Hubble we can get into enormous detail about what’s happening on a holistic scale across the entire disk of the galaxy. You can’t do that with any other large galaxy,” said principal investigator Ben Williams of the University of Washington. Hubble’s sharp imaging capabilities can resolve more than 200 million stars in the Andromeda galaxy, though only those brighter than our Sun. They look like grains of sand across a beach. But that’s just the tip of the iceberg. Andromeda’s total population is estimated to be 1 trillion stars, with many less massive stars falling below Hubble’s sensitivity limit.

Photographing Andromeda was a herculean task because the galaxy is a much bigger target on the sky than the galaxies Hubble routinely observes, which are often billions of light-years away. The full mosaic was carried out under two Hubble programs. In total, it required over 1,000 Hubble orbits, spanning more than a decade.

This panorama started with the Panchromatic Hubble Andromeda Treasury (PHAT) program about a decade ago. Images were obtained at near-ultraviolet, visible, and near-infrared wavelengths using the Advanced Camera for Surveys and the Wide Field Camera 3 aboard Hubble to photograph the northern half of Andromeda.

This program was followed up by the Panchromatic Hubble Andromeda Southern Treasury (PHAST), recently published in The Astrophysical Journal and led by Zhuo Chen at the University of Washington, which added images of approximately 100 million stars in the southern half of Andromeda. This region is structurally unique and more sensitive to the galaxy’s merger history than the northern disk mapped by the PHAT survey.

The combined programs collectively cover the entire disk of Andromeda, which is seen almost edge-on — tilted by 77 degrees relative to Earth’s view. The galaxy is so large that the mosaic is assembled from approximately 600 separate fields of view. The mosaic image is made up of at least 2.5 billion pixels.

The complementary Hubble survey programs provide information about the age, heavy-element abundance, and stellar masses inside Andromeda. This will allow astronomers to distinguish between competing scenarios where Andromeda merged with one or more galaxies. Hubble’s detailed measurements constrain models of Andromeda’s merger history and disk evolution.

A Galactic ‘Train Wreck’

Though the Milky Way and Andromeda presumably formed around the same time many billions of years ago, observational evidence shows that they have very different evolutionary histories, despite growing up in the same cosmological neighborhood. Andromeda appears to be more highly populated with younger stars and unusual features like coherent streams of stars, researchers say. This implies it has a more active recent star-formation and interaction history than the Milky Way.

“Andromeda’s a train wreck. It looks like it has been through some kind of event that caused it to form a lot of stars and then just shut down,” said Daniel Weisz at the University of California, Berkeley. “This was probably due to a collision with another galaxy in the neighborhood.”

A possible culprit is the compact satellite galaxy Messier 32, which resembles the stripped-down core of a once-spiral galaxy that may have interacted with Andromeda in the past. Computer simulations suggest that when a close encounter with another galaxy uses up all the available interstellar gas, star formation subsides.

“Andromeda looks like a transitional type of galaxy that’s between a star-forming spiral and a sort of elliptical galaxy dominated by aging red stars,” said Weisz. “We can tell it’s got this big central bulge of older stars and a star-forming disk that’s not as active as you might expect given the galaxy’s mass.”

“This detailed look at the resolved stars will help us to piece together the galaxy’s past merger and interaction history,” added Williams.

Hubble’s new findings will support future observations by NASA’s James Webb Space Telescope and the upcoming Nancy Grace Roman Space Telescope. Essentially a wide-angle version of Hubble (with a mirror of the same size), Roman will capture the equivalent of at least 100 high-resolution Hubble images in a single exposure. These observations will complement and extend Hubble’s huge dataset.

Technology for oxidizing atmospheric methane won’t help the climate

As the atmosphere continues to fill with greenhouse gases from human activities, many proposals have surfaced to “geoengineer” climate-saving solutions, that is, to alter the atmosphere at a global scale either to reduce the concentration of carbon or to mute its warming effect.

One recent proposal seeks to infuse the atmosphere with hydrogen peroxide, claiming that it would both oxidize methane (CH4), an extremely potent greenhouse gas, and improve air quality.

Too good to be true?

University of Utah atmospheric scientists Alfred Mayhew and Jessica Haskins were skeptical, so they set out to test the claims behind this proposal. Their results, published on Jan. 3, confirm their doubts and offer a reality check to agencies considering such proposals as a way to stave off climate change.

“Our work showed that the efficiency of the proposed technology was quite low, meaning widespread adoption of the technology would be required to make any meaningful impact on atmospheric CH4,” said Mayhew, a postdoctoral researcher with the university’s Wilkes Center for Climate Science & Policy. “Then, our results indicate that if this technology is adopted at scale, then we start to see some negative air-quality side effects, particularly for wintertime particulate matter air pollution.”

To conduct the study, the Utah scientists modeled what would happen if you deployed the technology patented by a Canadian company, which is proposing to spray aerosolized hydrogen peroxide, or H₂O₂, into the atmosphere during daylight hours from 600-meter towers. These towers would approach the height of the world’s tallest radio towers.

“When that hydrogen peroxide is in the presence of sunlight, it’s going to make a really powerful oxidant, the hydroxyl radical OH,” said Haskins, an assistant professor of atmospheric sciences. “That’s a natural scrubber in the atmosphere, and it’s going to help speed up the conversion of methane to CO₂.”

Methane is a molecule of carbon and hydrogen held together by single bonds, as opposed to the double-bonded compounds that are far more common in the atmosphere. Hydroxyl radicals are more likely to oxidize those double-bonded molecules, such as the isoprene coming off trees or other volatile organic compounds, so OH is just not that efficient at breaking down methane, according to Haskins.

“OH doesn’t react fast with methane,” Haskins said. “It’s reacting with so many other things.”
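
To put rough numbers on that competition, the sketch below compares representative room-temperature rate constants for OH attacking methane versus isoprene. The specific values are our own illustrative assumptions drawn from standard gas-kinetics tables, not figures from the Utah study:

```python
# Representative room-temperature rate constants for OH reactions,
# in cm^3 molecule^-1 s^-1 (order-of-magnitude values from standard
# kinetics evaluations; treat them as illustrative).
K_OH_CH4      = 6.4e-15   # OH + methane: slow
K_OH_ISOPRENE = 1.0e-10   # OH + isoprene: orders of magnitude faster

ratio = K_OH_ISOPRENE / K_OH_CH4
print(f"OH reacts with isoprene ~{ratio:,.0f}x faster than with methane")
# -> roughly 16,000x, so extra OH is consumed by other gases long
#    before it makes a meaningful dent in methane.
```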

Methane’s outsized impact on the climate

While carbon dioxide from fossil fuels gets much of the blame for climate change, methane is also a big contributor. Eventually, methane breaks down into carbon dioxide and water.

The primary ingredient in the natural gas burned in home appliances and power plants, methane, or CH4, packs 76 times more climate-warming punch than carbon dioxide over a 20-year timeframe. Methane persists in the atmosphere for only 12 years, but the gas is blamed for nearly a third of the rise in global temperatures since the Industrial Revolution, according to the International Energy Agency.

Anthropogenic sources, primarily oil, gas and coal operations and landfills, account for 60% of global methane emissions.

Artificially speeding up methane oxidation could slow climate change, but such geoengineering projects could carry adverse environmental impacts, which Haskins’s lab seeks to characterize. A recent report from the National Academy of Sciences concluded that the unintended consequences of atmospheric methane removal technologies are likely significant but poorly understood. Haskins’s study heeds the report’s call to scrutinize these technologies, such as the one that would release vast amounts of hydrogen peroxide.

“We could buy ourselves about 50 years and avoid some of the immediate impacts of climate change if we did this, but no one had actually previously done any side-effects studies to see what was going to happen,” Haskins said. “This is the very first paper to assess any air-quality side effects of such geoengineering solutions.”

Geoengineering’s potential side effects

Manipulating a system as complex as Earth’s atmosphere is an inherently dangerous action, potentially resulting in unforeseen problems.

“There’s so many feedbacks that can go on in the climate. Atmospheric chemistry is just one example. You change one thing and you think it’s going to do this, but it actually may do the opposite in one place versus the other,” Haskins said. “You have to be really careful and do these sorts of assessments. Is this a responsible thing to do? What’s the impact going to be?”

By way of example, Haskins raised the troubling history of human-made gases called chlorofluorocarbons, or CFCs, which ate into the protective layer of ozone that shields Earth from harmful ultraviolet radiation.

“We started using CFCs in industry as propellants and refrigerants, and suddenly we cause the ozone hole,” she said. “And we’ve been dealing with the consequences of that for 40 years. And we still won’t have a fully resolved no-ozone-hole year until probably 2060, so we have to be careful of what we’re doing.”

Mayhew and Haskins used a global chemical-transport model, called GEOS-Chem, to simulate the proposal to release hydrogen peroxide from towers. The goal was to estimate how much methane would be oxidized under three different emission scenarios, from light to extreme.

Their simulation envisioned the use of 50 towers spread around North America. Replicating the company’s proposal, the medium-release scenario called for each tower to spray 612 grams, or 1.35 pounds, per second for 10 hours a day for a year.
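
For a rough sense of scale, a back-of-envelope calculation (ours, not the paper’s) converts that release rate into annual tonnage:

```python
# Back-of-envelope H2O2 tonnage implied by the medium-release scenario above.
G_PER_S       = 612   # grams sprayed per second, per tower
HOURS_PER_DAY = 10    # daylight-only operation
DAYS_PER_YEAR = 365

grams_per_year   = G_PER_S * 3600 * HOURS_PER_DAY * DAYS_PER_YEAR
tonnes_per_tower = grams_per_year / 1e6   # ~8,000 tonnes per tower per year
print(f"{tonnes_per_tower:,.0f} t/tower/yr; "
      f"all 50 towers: {50 * tonnes_per_tower:,.0f} t/yr")
```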

“This proposed solution just won’t remove any meaningful amount of methane from the atmosphere. It’s not going to solve global warming. At most, we found 50 towers could reduce 0.01% of annual anthropogenic methane emissions,” Haskins said. “You’d need about 352,000 of them to remove 50% of anthropogenic methane. It’s an insane number. And if you did 50 high-emission towers, you’d still need about 43,000.”

In the meantime, places with poor wintertime air quality could see particulate pollution get much worse.

“There’s potential that future research could show that the air quality impacts of placing these towers close to methane point sources is minimal if they’re activated at certain times of the year, and far from large population centers,” Mayhew said. “If that’s the case, then this technology (or similar approaches) could play a very small role in combating warming, but it’s clear from our work that the air-quality side effects should be placed as a central consideration for any proposed real-world implementation of technology like this.”

Insights into how populations conform or go against the crowd

Cultural traits — the information, beliefs, behaviors, customs, and practices that shape the character of a population — are influenced by conformity, the tendency to align with others, or anti-conformity, the choice to deliberately diverge. A new way to model this dynamic interplay could ultimately help explain societal phenomena like political polarization, cultural trends, and the spread of misinformation.

A study published in the Proceedings of the National Academy of Sciences outlines this novel approach. Presenting a mathematical model, SFI Complexity Postdoctoral Fellow Kaleda Denton and colleagues at Stanford University — former SFI Post-baccalaureate Fellow Elisa Heinrich Mora, SFI External Professor Marcus Feldman, and Michael Palmer — expand on previous research to offer a more realistic representation of how conformist and anti-conformist biases shape the transmission of cultural traits through a population.

“The idea behind this research was to come up with a better way to mathematically represent how individuals make decisions in the real world,” says Denton. “If we can do that, we can then scale things up to see what would happen in a population of 10,000 people over the long run.”

Traditional models of conformity often assume individuals gravitate toward the average or “mean” trait in a population. This concept works well if the most popular traits are near this mean, which may be the case for, say, working hours or food portion sizes. However, the mean is a poor indicator of popularity in other cases; for example, if most people fall on either the far left or far right of a political spectrum, but the mean lies in the center.

To address this gap, the authors designed a model that incorporates trait clustering. In this model, individuals conform by adopting traits that are more clustered together (e.g., variations of a far-left political belief) rather than the mean trait in the population (e.g., the centrist view). Anti-conformists, on the other hand, deliberately distance themselves from the traits of their peers, creating polarization.

Using computer simulations, the team analyzed how traits spread across populations over multiple generations. Conformity often led to groups clustering around specific traits, but not necessarily the average. Anti-conformity created a starkly different pattern: a U-shaped distribution, with individuals clustering at the extremes and leaving the middle sparsely populated.
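
The published model is more sophisticated, but a toy agent-based sketch conveys the flavor of these simulations. The peer sample size, the update rule, and the use of a peer median as a crude stand-in for the “most clustered” trait are all our own simplifying assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def step(traits, conformist=True, k=5, pull=0.5, noise=0.05):
    """One generation: each agent samples k peers and moves toward
    (conformist) or away from (anti-conformist) the peer median,
    used here as a crude stand-in for the densest cluster of traits."""
    new = np.empty_like(traits)
    for i, t in enumerate(traits):
        peers = rng.choice(traits, size=k, replace=False)
        target = np.median(peers)
        sign = 1.0 if conformist else -1.0
        new[i] = t + sign * pull * (target - t) + rng.normal(0.0, noise)
    return np.clip(new, -1.0, 1.0)  # traits live on a bounded spectrum

traits = rng.uniform(-1, 1, size=1000)  # e.g., stances on a political spectrum
for _ in range(100):
    traits = step(traits, conformist=False)

# Anti-conformity piles agents at the extremes (a U-shaped histogram);
# rerun with conformist=True to see clusters form instead.
print(np.histogram(traits, bins=10, range=(-1, 1))[0])
```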

One significant finding was that populations rarely converge to a single trait unless the unrealistic assumption of perfect behavioral copying is imposed. Instead, even small variations in how individuals interpret or adopt traits result in persistent diversity.

“These outcomes align with what we observe in the real world, where cultural practices and ideologies don’t simply average out but instead maintain significant variation,” Denton says.

The research also challenges the notion that conformity always leads to homogeneity. The model shows that under certain conditions, conformity can sustain diversity, while anti-conformity amplifies polarization.

Denton sees broad implications for the study. “This framework could help explain voting behavior, social media trends, or even how people estimate values in group settings,” she says. “It offers a way to understand how individual decisions aggregate into societal patterns, whether that’s consensus-building or polarization.” This model can be tested on real-world data in future studies.

“We’re excited to see if this framework works in different scenarios,” Denton said. “The ultimate goal is to understand how individual choices influence entire populations over time.”

The universe is expanding too fast to fit theories: Hubble tension in crisis

The Universe really seems to be expanding fast. Too fast, even.

A new measurement confirms what previous — and highly debated — results had shown: The Universe is expanding faster than predicted by theoretical models, and faster than can be explained by our current understanding of physics.

This discrepancy between model and data became known as the Hubble tension. Now, results published in the Astrophysical Journal Letters provide even stronger support to the faster rate of expansion.

“The tension now turns into a crisis,” said Dan Scolnic, who led the research team.

Determining the expansion rate of the Universe — known as the Hubble constant — has been a major scientific pursuit ever since 1929, when Edwin Hubble first discovered that the Universe was expanding.

Scolnic, an associate professor of physics at Duke University, explains it as trying to build the Universe’s growth chart: we know what size it was at the Big Bang, but how did it get to the size it is now? In his analogy, the Universe’s baby picture represents the distant Universe, the primordial seeds of galaxies. The Universe’s current headshot represents the local Universe, which contains the Milky Way and its neighbors. The standard model of cosmology is the growth curve connecting the two. The problem is: things don’t connect.

“This is saying, to some respect, that our model of cosmology might be broken,” said Scolnic.

Measuring the Universe requires a cosmic ladder, which is a succession of methods used to measure the distances to celestial objects, with each method, or “rung,” relying on the previous for calibration.

The ladder used by Scolnic was created by a separate team using data from the Dark Energy Spectroscopic Instrument (DESI), which is observing more than 100,000 galaxies every night from its vantage point at the Kitt Peak National Observatory.

Scolnic recognized that this ladder could be anchored closer to Earth with a more precise distance to the Coma Cluster, one of the galaxy clusters nearest to us.

“The DESI collaboration did the really hard part, their ladder was missing the first rung,” said Scolnic. “I knew how to get it, and I knew that that would give us one of the most precise measurements of the Hubble constant we could get, so when their paper came out, I dropped absolutely everything and worked on this non-stop.”

To get a precise distance to the Coma cluster, Scolnic and his collaborators, with funding from the Templeton Foundation, used the light curves from 12 Type Ia supernovae within the cluster. Just like candles lighting a dark path, Type Ia supernovae have a predictable luminosity that correlates to their distance, making them reliable objects for distance calculations.

The team arrived at a distance of about 320 million light-years, nearly in the center of the range of distances reported across 40 years of previous studies — a reassuring sign of its accuracy.

“This measurement isn’t biased by how we think the Hubble tension story will end,” said Scolnic. “This cluster is in our backyard, it has been measured long before anyone knew how important it was going to be.”

Using this high-precision measurement as a first rung, the team calibrated the rest of the cosmic distance ladder. They arrived at a value for the Hubble constant of 76.5 kilometers per second per megaparsec, which essentially means that the local Universe is expanding 76.5 kilometers per second faster every 3.26 million light-years.
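
To make the units concrete, a quick application of Hubble’s law (v = H0 · d) to the team’s two headline numbers gives the Coma Cluster’s implied recession velocity; the arithmetic below is our own illustration, not a result from the paper:

```python
H0 = 76.5           # km/s per megaparsec, the value reported above
MLY_PER_MPC = 3.26  # one megaparsec, expressed in millions of light-years

d_coma_mpc = 320 / MLY_PER_MPC  # Coma Cluster: ~320 million light-years -> ~98 Mpc
v_coma = H0 * d_coma_mpc        # Hubble's law: recession velocity v = H0 * d
print(f"Coma Cluster recession velocity: ~{v_coma:,.0f} km/s")  # ~7,500 km/s
```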

This value matches existing measurements of the expansion rate of the local Universe. However, like all of those measurements, it conflicts with measurements of the Hubble constant using predictions from the distant Universe. In other words: it matches the Universe’s expansion rate as other teams have recently measured it, but not as our current understanding of physics predicts it. The longstanding question is: is the flaw in the measurements or in the models?

Scolnic’s team’s new results add tremendous support to the emerging picture that the root of the Hubble tension lies in the models.

“Over the last decade or so, there’s been a lot of re-analysis from the community to see if my team’s original results were correct,” said Scolnic, whose research has consistently challenged the Hubble constant predicted using the standard model of physics. “Ultimately, even though we’re swapping out so many of the pieces, we all still get a very similar number. So, for me, this is as good of a confirmation as it’s ever gotten.”

“We’re at a point where we’re pressing really hard against the models we’ve been using for two and a half decades, and we’re seeing that things aren’t matching up,” said Scolnic. “This may be reshaping how we think about the Universe, and it’s exciting! There are still surprises left in cosmology, and who knows what discoveries will come next?”

This work was conducted with funding from the Templeton Foundation, the Department of Energy, the David and Lucile Packard Foundation, the Sloan Foundation, the National Science Foundation and NASA.

In the Northeast, 50% of adult ticks carry Lyme disease-causing bacteria

Across most of the Northeast, getting bitten by a blacklegged tick — also called a deer tick — is a risk during spring, summer, and fall. A new Dartmouth study, published in Parasites and Vectors, finds that 50% of adult blacklegged ticks carry the bacteria that causes Lyme disease while 20% to 25% of the younger (nymph) blacklegged ticks carry the bacteria.

A team of researchers from universities, health departments, and agricultural agencies from across the Northeast conducted a meta-analysis of data collected from 1989 to 2021 on the abundance of blacklegged ticks and on how many of them can pass along the pathogens responsible for Lyme disease and three other tick-borne diseases in the Northeast, including in Connecticut, New York, New Hampshire, Vermont, and Maine.

Data was collected in Maine starting in 1989, while most of the other states began data collection in the mid-2000s. Massachusetts and Rhode Island were not represented in the study due to either unavailable or insufficient data.

Lyme disease was first discovered in Lyme, Connecticut, in 1975. Its symptoms can vary depending on the stage and severity of the disease but can include a rash, fever, chills, fatigue, muscle or joint aches, and swollen lymph nodes. If left untreated, prolonged and more severe symptoms may develop.

Lyme disease is caused by a bacterium called Borrelia burgdorferi. Some, but not all, white-footed mice, chipmunks, birds, squirrels, and other small animals carry the bacteria in their blood, making them “competent” hosts. Blacklegged ticks are not born infected with the Lyme disease bacteria. But when a blacklegged tick feeds on an infected host, the tick can pick up the bacteria that causes Lyme disease and then potentially spread it to humans through its bite. Other animals, like white-tailed deer, are “incompetent” hosts: while they are a food source for blacklegged ticks, they do not transmit the Lyme disease bacteria.

Blacklegged ticks typically consume three blood meals over the course of a two-year life cycle: after they hatch into larvae in the midsummer of their first year; as nymphs during the following late spring, often in May or June; and as adults that fall, most likely between September and November.

In general, ticks must be attached to a person for at least 24 hours to transmit the Lyme disease bacteria. Adult blacklegged ticks are more likely to carry the bacteria than younger ticks, but at about the size of a sesame seed they are also easier to spot. The particular concern is the younger ticks, or nymphs, which are only about the size of a poppy seed, making them difficult to notice.

“While the bacteria responsible for Lyme disease has a complicated chain of transmission, our results show the relative abundance of blacklegged ticks, and just how many of them are carrying disease-causing pathogens throughout the Northeast,” says lead author Lucas Price, who was a postdoctoral fellow in geography at Dartmouth at the time of the study and is now a wildlife biologist at the Interior Department’s Bureau of Land Management.

The researchers analyzed the abundance of blacklegged ticks and the presence of Lyme disease bacteria and other pathogens so that they could determine how blacklegged ticks and the pathogens they carry are changing in time and space.

“Contrary to the well-documented spread of blacklegged ticks and Lyme disease over the past 30 years, we found very small changes in the abundance of blacklegged ticks, but think this is likely because we usually don’t start sampling a location for blacklegged ticks until they’re already established,” says senior author Jonathan Winter, an associate professor of geography and director of the Applied Hydroclimatology Group at Dartmouth. “However, we did find an increase in the percentage of blacklegged ticks that carry the Lyme disease bacteria.”

These findings underscore advice from the Centers for Disease Control and Prevention and health professionals, who recommend a range of tick bite prevention measures, including conducting full-body tick checks after spending time outdoors in regions where pathogen-carrying ticks are present. While much of the data were already available publicly prior to the study, the team made the surveys consistent across states, creating one of the most comprehensive tick abundance and pathogen prevalence datasets in the United States, and establishing a baseline that can be used in the future.

The researchers have another study underway examining the relationship between climate change and the prevalence of blacklegged ticks and Lyme disease in the Northeast.

Joseph Savage, a graduate student in the Ecology, Evolution, Environment & Society program at Dartmouth, also contributed to the study.

Immune complex shaves stem cells to protect against cancer

A group of immune proteins called the inflammasome can help prevent blood stem cells from becoming malignant by removing certain receptors from their surfaces and blocking cancer gene activity, according to a preclinical study by Weill Cornell Medicine investigators.

The study, published Jan. 2 in Nature Immunology, may lead to therapies that target the earliest stages of cancer. The findings bolster the idea that the inflammasome has a dual role — it promotes inflammation associated with poor outcomes in late cancer stages, but early on, it can help prevent cells from becoming cancerous in the first place.

“What was striking was that the innate immune system, which includes the inflammasome, has a role beyond infection,” said Dr. Julie Magarian Blander, the Gladys and Roland Harriman Professor of Immunology in Medicine and a member of the Jill Roberts Institute for Research in Inflammatory Bowel Disease at Weill Cornell Medicine. “We found that it functions in maintaining homeostasis in the tissue, keeping an eye on whether stem cells are proliferating too much. By doing so, it prevents cells from becoming cancerous and this activity is independent of inflammation.”

The co-first authors of the study are Dr. Andrew Kent, an assistant professor of medicine-hematology at the University of Colorado School of Medicine and Dr. Kristel Joy Yee Mon, a postdoctoral associate in Dr. Blander’s lab.

Origin Story

By the time patients typically go to the doctor with cancer symptoms, tumors have already formed. As a result, very little is known about cancer’s beginnings.

To get a better understanding of how the disease takes hold, Dr. Blander and her colleagues chose to study a mouse model of B-cell lymphoma called Eµ-myc, which has a mutation in the Myc oncogene. These mice have a long delay before tumors develop, giving researchers a chance to observe what happens early on in cancer. Because B-cell lymphoma develops in a type of white blood cell, the team examined their precursors, called hematopoietic stem cells, in the mice.

Genetically disrupting inflammasome activity in the Eµ-myc mice greatly accelerated stem cell proliferation and tumor development. The investigators were surprised to find that stem cells in control mice that lacked the inflammasome also proliferated at a fast pace compared with wild-type mice, suggesting that the complex has an important role in healthy cells, too. The team found that without the inflammasome, the stem cells have high levels of the protein Ras, which is another oncogene product. This protein can work together with mutant Myc to drive cancer, so the inflammasome’s normal job of keeping Ras in check delays tumorigenesis.

Ground zero for the protective activity was not the hematopoietic stem cells themselves, but the bone marrow stroma, a collection of many cell types surrounding and nurturing the stem cells.

Higher levels of soluble tumor necrosis factor (TNF) receptors were found in the stroma of control mice compared with the inflammasome-deficient mice. “It turned out that TNF receptors were being shed from stem cells in control mice, and they were being retained on stem cells from inflammasome-deficient mice. Higher TNF receptor levels lead to increased stem cell proliferation. Maintaining a healthy level of TNF receptors becomes important for these stem cells to maintain homeostatic control of proliferation,” said Dr. Blander. “We think that the inflammasome in the stroma is orchestrating something where it’s cleaving TNF receptors, shaving them off the stem cells.”

Next steps

Next, the team will test for the inflammasome’s protective effects in other tissues. In addition, they will determine which of the stromal cell types is responsible for the activity, and which molecules the inflammasome is using to suppress cell growth.

Ultimately, the researchers hope that the study will lay the groundwork for a therapeutic that would stave off cancer. “A therapy could target the inflammasome, but it should be directed only to the inflammation side of its activity that is associated with tumor progression,” said Dr. Blander, who is also a member of the Sandra and Edward Meyer Cancer Center at Weill Cornell Medicine. “You want to protect the inflammasome’s beneficial function of delaying tumorigenesis.”

Research on past hurricanes aims to reduce future risk

Tropical storms like hurricanes are not only terrifying, but also incredibly costly for coastal regions across the United States, Mexico, Central America and the Caribbean. Beyond the immediate devastation, these storms contribute to significant economic losses and human displacement. In 2023 alone, 2.5 million individuals attempted to cross the U.S. southern land border in climate migration linked to such events.

New research led by The University of Texas at Arlington emphasizes that studying the impacts of past tropical storms can help communities better prepare for future ones. A key part of the study is analyzing the types and quantities of storm-related precipitation in affected regions to understand its role in local water resources. By mitigating excessive damage, such preparation could enable more people to remain in their home countries. This is increasingly urgent as climate change is expected to make tropical storms 10-15% more frequent and intense.

“We already know that tropical storms have a huge impact on water resources in communities, but few studies have examined the water runoff from these events and how they impact local populations — that’s where our research comes in,” said Ricardo Sánchez-Murillo, lead author of the study and associate professor of earth and environmental sciences at UTA.

Dr. Sánchez-Murillo and his team, in collaboration with international partners from hurricane-prone regions in the Bahamas, Costa Rica, the Dominican Republic, El Salvador, Honduras, Jamaica, Mexico, Nicaragua, and Trinidad and Tobago, analyzed water “fingerprints” known as isotopic compositions. By studying isotopic data from past storms, they provided new insights into how storm-related precipitation influences regional water cycles, adding depth to our understanding of these weather events.

“Our comprehensive analysis of isotopic compositions in tropical storm-derived precipitation offers a deeper understanding of the role these weather systems play in regional water cycles and climate predictions,” said Sánchez-Murillo. “These results underscore the significance of accounting for storm-related precipitation. We feel that understanding precipitation impacts will help communities better prepare for extreme storms and manage local water resources both before and after the storms.”

The research team, which includes researchers from Brown University, Clemson University, Florida International University, Humboldt University, Oberlin College, Rice University, the University of Aberdeen, the University of Houston, the University of Tennessee and Washington State University, plans to expand its work. Future studies will investigate evaporation and groundwater recharge patterns resulting from tropical storms, as well as how storm paths might shift due to climate change.

“This research has broad implications for improving our understanding of how tropical storms impact water resources and climate, leading to better predictions and management strategies,” Sánchez-Murillo said.

This research was funded in part by grants from the International Atomic Energy Agency and by an Early Career Fellowship from the Gulf Research Program of the National Academies of Sciences, Engineering, and Medicine.

Bacteria in polymers form cables that grow into living gels

Scientists at Caltech and Princeton University have discovered that bacterial cells growing in a solution of polymers, such as mucus, form long cables that buckle and twist on each other, building a kind of “living Jell-O.”

The finding could be particularly important to the study and treatment of diseases such as cystic fibrosis, in which the mucus that lines the lungs becomes more concentrated, often causing bacterial infections that take hold in that mucus to become life threatening. This discovery could also have implications in studies of polymer-secreting conglomerations of bacteria known as biofilms — the slippery goo on river rocks, for example — and in industrial applications where they can cause equipment malfunctions and health hazards.

The work is described in a paper published on January 17 in the journal Science Advances.

“We’ve discovered that when many bacteria grow in fluids containing spaghetti-like molecules called polymers, such as mucus in the lungs, they form cable-like structures that intertwine like living gels,” says Sujit Datta, a professor of chemical engineering, bioengineering, and biophysics at Caltech and corresponding author of the new paper. “And, interestingly, there are similarities between the physics of how these structures form and the microscopic physics underlying many nonliving gels, like Purell or Jell-O.”

Datta recently moved to Caltech from Princeton University. One of his graduate students at Princeton, Sebastian Gonzalez La Corte, is lead author of the paper. He and Datta had been interested in how mucus concentration changes in the lungs and guts of cystic fibrosis patients — in whom more polymers than usual are present. Working with mucus samples provided by colleagues at MIT, Gonzalez La Corte grew E. coli bacteria (commonly used in laboratory studies) in regular liquid and in cystic fibrosis-like samples and then observed the specimens under a microscope to watch how the bacterial cells grew in each case.

He focused on cells that had lost the ability to swim, as is the case for many bacteria in nature. Under normal circumstances, when such a cell divides into two, the resulting cells separate and diffuse away from each other. However, Gonzalez La Corte found that in a polymeric solution, the copied cells remained stuck to each other, end to end.

“As cells continue to divide and stick to each other, they start to form these beautiful long structures that we call cables,” Gonzalez La Corte says. “At some point, they actually bend and fold on each other and form an entangled network.”

The team found that the cables continue to elongate and grow as long as the cells have the nutrients they need, eventually creating chains that are thousands of cells long.

Subsequent experiments showed that it does not seem to matter which bacterial species are introduced, nor does the type of organic polymer solution make a difference; once enough polymer surrounds the bacterial cells, the cables grow. The researchers even saw the same result with bacteria in synthetic polymers.

Although the initial motivation for the study was to better understand the growth of infections in patients with cystic fibrosis, the findings are more broadly relevant. Mucus plays an important role in the human body, not only in the lungs but also in the gut and in the cervicovaginal tract. And Datta says the work is also important in the context of biofilms, groupings of bacteria that grow an encapsulating polymer matrix of their own. There are biofilms in the human body, such as dental plaque, but they are also extremely common in soil and in industrial settings, where they can damage equipment and cause health hazards.

“That polymer matrix that they’ve secreted is what makes biofilms so tough to remove from surfaces and treat with antibiotics,” Datta says. “Understanding how cells grow in that matrix could be key to discovering how to better control biofilms.”

Understanding the Physics Behind the Cables

Through carefully designed experiments, the team found that the external pressure exerted by the polymers surrounding the dividing cells is what forces the cells together and holds them in place. In physics, such an attractive force that is under the control of an outside pressure is called a depletion interaction. Gonzalez La Corte used the theory of depletion interaction to create a theoretical model of bacterial cable growth. The model can predict when a cable will survive and grow in a polymeric environment.
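
For the physically inclined, the classic Asakura-Oosawa form of the depletion attraction between two hard spheres gives the flavor of this kind of calculation. The sketch below is a textbook idealization (hard spheres, dilute non-adsorbing polymer), not necessarily the geometry the team actually modeled:

```python
import numpy as np

def ao_depletion_energy(r, R, delta, n_p, kT=1.0):
    """Asakura-Oosawa depletion potential between two hard spheres.

    r     : center-to-center distance between the spheres
    R     : sphere radius (here, a crude stand-in for a cell)
    delta : depletant (polymer) radius
    n_p   : polymer number density; Pi = n_p * kT is its osmotic pressure
    Returns U(r) = -Pi * V_overlap where the exclusion zones overlap.
    """
    a = R + delta                   # radius of the excluded-volume shell
    if r >= 2 * a:
        return 0.0                  # shells no longer overlap: no attraction
    # Lens-shaped overlap volume of two spheres of radius a at separation r:
    v_overlap = (np.pi / 12.0) * (4 * a + r) * (2 * a - r) ** 2
    return -n_p * kT * v_overlap    # attraction scales with osmotic pressure

# Contact energy for illustrative (dimensionless) numbers:
print(ao_depletion_energy(r=2.0, R=1.0, delta=0.1, n_p=5.0))
```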

“Now we can actually use established theories from polymer physics, which were developed for completely different things, in these biological systems to quantitatively predict when these cables will arise,” Datta says.

Why Do the Bacteria Form These Cables?

“We discovered this interesting, unusual, very unexpected phenomenon,” Datta says. “We can also explain why it happens from a mechanistic, physics perspective. Now the question is: What are the biological implications?”

Interestingly, there are two possibilities: The bacteria could be clumping together to form this network of living gel in an effort to make themselves larger and therefore more difficult for immune cells to engulf and destroy. Alternatively, cable formation could actually be harmful to the bacteria. After all, the secretions from the host cause the bacteria to build the cables. “Mucus isn’t static; for example, in the lungs, it’s being constantly swept up by little hairs on the surface of the lungs and propelled upward,” Datta says. “Could it be that when bacteria are all clumped together in these cables, it’s actually easier to get rid of them — to expel them out of the body?”

For now, no one knows which possibility is correct, and Datta says that is what makes this project remain interesting. “Now that we have found this phenomenon, we can frame these new questions and design further experiments to test our suspicions,” he says.

Calorie labels have small effect on eating habits – study

The policy was brought in two years ago in England to try to encourage healthier food choices.

Deaths of 56 babies at Leeds hospitals may have been preventable, BBC told

Two whistleblowers also believe the Leeds Teaching Hospitals NHS Trust’s maternity units are unsafe.
