Category Archives: Mind Building
Fine-tuned brain-computer interface makes prosthetic limbs feel more real
You can probably complete an amazing number of tasks with your hands without looking at them. But if you put on gloves that muffle your sense of touch, many of those simple tasks become frustrating. Take away proprioception — your ability to sense your body’s relative position and movement — and you might even end up breaking an object or injuring yourself.
“Most people don’t realize how often they rely on touch instead of vision — typing, walking, picking up a flimsy cup of water,” said Charles Greenspon, PhD, a neuroscientist at the University of Chicago. “If you can’t feel, you have to constantly watch your hand while doing anything, and you still risk spilling, crushing or dropping objects.”
Greenspon and his research collaborators recently published papers in Nature Biomedical Engineering and Science documenting major progress on a technology designed to address precisely this problem: direct, carefully timed electrical stimulation of the brain that can recreate tactile feedback to give nuanced “feeling” to prosthetic hands.
The science of restoring sensation
These new studies build on years of collaboration among scientists and engineers at UChicago, the University of Pittsburgh, Northwestern University, Case Western Reserve University and Blackrock Neurotech. Together they are designing, building, implementing and refining brain-computer interfaces (BCIs) and robotic prosthetic arms aimed at restoring both motor control and sensation in people who have lost significant limb function.
On the UChicago side, the research was led by neuroscientist Sliman Bensmaia, PhD, until his unexpected passing in 2023.
The researchers’ approach to prosthetic sensation involves placing tiny electrode arrays in the parts of the brain responsible for moving and feeling the hand. On one side, a participant can move a robotic arm by simply thinking about movement, and on the other side, sensors on that robotic limb can trigger pulses of electrical activity called intracortical microstimulation (ICMS) in the part of the brain dedicated to touch.
For about a decade, Greenspon explained, this stimulation of the touch center could only provide a simple sense of contact in different places on the hand.
“We could evoke the feeling that you were touching something, but it was mostly just an on/off signal, and often it was pretty weak and difficult to tell where on the hand contact occurred,” he said.
The newly published results mark important milestones in moving past these limitations.
Advancing understanding of artificial touch
In the first study, published in Nature Biomedical Engineering, Greenspon and his colleagues focused on ensuring that electrically evoked touch sensations are stable, accurately localized and strong enough to be useful for everyday tasks.
By delivering short pulses to individual electrodes in participants’ touch centers and having them report where and how strongly they felt each sensation, the researchers created detailed “maps” of brain areas that corresponded to specific parts of the hand. The testing revealed that when two closely spaced electrodes are stimulated together, participants feel a stronger, clearer touch, which can improve their ability to locate and gauge pressure on the correct part of the hand.
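As a purely illustrative sketch of what building such a map might involve — not the study’s actual analysis code, and with electrode numbers, hand-region labels and the 0–10 intensity scale invented here — per-trial reports can be aggregated into a per-electrode summary of where on the hand stimulation is felt and how strongly:

```python
# Hypothetical sketch (not the study's code): aggregate participant reports
# into a per-electrode "projected field" map of the hand.
from collections import defaultdict
from statistics import mean

# Each trial: (electrode_id, reported_hand_region, reported_intensity 0-10)
trials = [
    (17, "thumb_tip", 3.5),
    (17, "thumb_tip", 4.0),
    (42, "index_base", 2.5),
]

def build_projected_field_map(trials):
    """Group reports by electrode and summarize where and how strongly
    stimulation is felt."""
    by_electrode = defaultdict(list)
    for electrode, region, intensity in trials:
        by_electrode[electrode].append((region, intensity))

    field_map = {}
    for electrode, reports in by_electrode.items():
        regions = [r for r, _ in reports]
        # Most frequently reported region stands in for the projected field
        modal_region = max(set(regions), key=regions.count)
        field_map[electrode] = {
            "region": modal_region,
            "mean_intensity": mean(i for _, i in reports),
        }
    return field_map

print(build_projected_field_map(trials))
```

The real mapping naturally involves far more trials and statistical checks of stability, but the bookkeeping is similar in spirit.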
The researchers also conducted exhaustive tests to confirm that the same electrode consistently creates a sensation corresponding to a specific location.
“If I stimulate an electrode on day one and a participant feels it on their thumb, we can test that same electrode on day 100, day 1,000, even many years later, and they still feel it in roughly the same spot,” said Greenspon, who was the lead author on this paper.
From a practical standpoint, any clinical device would need to be stable enough for a patient to rely on it in everyday life. An electrode that continually shifts its “touch location” or produces inconsistent sensations would be frustrating and require frequent recalibration. By contrast, the long-term consistency this study revealed could allow prosthetic users to develop confidence in their motor control and sense of touch, much as they would in their natural limbs.
Adding feelings of movement and shapes
The complementary Science paper went a step further to make artificial touch even more immersive and intuitive. The project was led by first author Giacomo Valle, PhD, a former postdoctoral fellow at UChicago who is now continuing his bionics research at Chalmers University of Technology in Sweden.
“Two electrodes next to each other in the brain don’t create sensations that ’tile’ the hand in neat little patches with one-to-one correspondence; instead, the sensory locations overlap,” explained Greenspon, who shared senior authorship of this paper with Bensmaia.
The researchers decided to test whether they could use this overlapping nature to create sensations that could let users feel the boundaries of an object or the motion of something sliding along their skin. After identifying pairs or clusters of electrodes whose “touch zones” overlapped, the scientists activated them in carefully orchestrated patterns to generate sensations that progressed across the sensory map.
Participants described feeling a gentle gliding touch passing smoothly over their fingers, despite the stimulus being delivered in small, discrete steps. The scientists attribute this result to the brain’s remarkable ability to stitch together sensory inputs and interpret them as coherent, moving experiences by “filling in” gaps in perception.
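The following is a minimal sketch of the general idea of sequenced, overlapping stimulation; the stimulator interface, electrode IDs and timings are invented for illustration and are not the study’s actual ICMS parameters:

```python
# Illustrative only: step a brief stimulus across a chain of electrodes whose
# projected fields overlap, so the evoked sensation appears to glide.
import time

class MockStimulator:
    """Stand-in for a real ICMS controller; simply logs start/stop events."""
    def start(self, electrode):
        print(f"start electrode {electrode}")
    def stop(self, electrode):
        print(f"stop  electrode {electrode}")

def sweep_stimulation(stim, electrode_chain, pulse_ms=200, overlap_ms=100):
    """Overlap successive pulse trains along the chain so the percept sweeps
    smoothly rather than arriving as separate taps."""
    step_s = (pulse_ms - overlap_ms) / 1000.0
    events = []  # (time offset in seconds, action, electrode)
    for i, electrode in enumerate(electrode_chain):
        onset = i * step_s
        events.append((onset, "start", electrode))
        events.append((onset + pulse_ms / 1000.0, "stop", electrode))
    events.sort()
    now = 0.0
    for t, action, electrode in events:
        time.sleep(t - now)
        now = t
        (stim.start if action == "start" else stim.stop)(electrode)

sweep_stimulation(MockStimulator(), electrode_chain=[17, 18, 19, 20])
```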
The approach of sequentially activating electrodes also significantly improved participants’ ability to distinguish complex tactile shapes and respond to changes in the objects they touched. They could sometimes identify letters of the alphabet electrically “traced” on their fingertips, and they could use a bionic arm to steady a steering wheel when it began to slip through the hand.
These advancements help move bionic feedback closer to the precise, complex, adaptive abilities of natural touch, paving the way for prosthetics that enable confident handling of everyday objects and responses to shifting stimuli.
The future of neuroprosthetics
The researchers hope that as electrode designs and surgical methods continue to improve, the coverage across the hand will become even finer, enabling more lifelike feedback.
“We hope to integrate the results of these two studies into our robotics systems, where we have already shown that even simple stimulation strategies can improve people’s abilities to control robotic arms with their brains,” said co-author Robert Gaunt, PhD, associate professor of physical medicine and rehabilitation and lead of the stimulation work at the University of Pittsburgh.
Greenspon emphasized that the motivation behind this work is to enhance independence and quality of life for people living with limb loss or paralysis.
“We all care about the people in our lives who get injured and lose the use of a limb — this research is for them,” he said. “This is how we restore touch to people. It’s the forefront of restorative neurotechnology, and we’re working to expand the approach to other regions of the brain.”
The approach also holds promise for people with other types of sensory loss. In fact, the group has also collaborated with surgeons and obstetricians at UChicago on the Bionic Breast Project, which aims to produce an implantable device that can restore the sense of touch after mastectomy.
Although many challenges remain, these latest studies offer evidence that the path to restoring touch is becoming clearer. With each new set of findings, researchers come closer to a future in which a prosthetic body part is not just a functional tool, but a way to experience the world.
New chainmail-like material could be the future of armor
In a remarkable feat of chemistry, a Northwestern University-led research team has developed the first two-dimensional (2D) mechanically interlocked material.
Resembling the interlocking links in chainmail, the nanoscale material exhibits exceptional flexibility and strength. With further work, it holds promise for use in high-performance body armor and other applications that demand lightweight, flexible and tough materials.
Publishing on Friday (Jan. 17) in the journal Science, the study marks several firsts for the field. Not only is it the first 2D mechanically interlocked polymer, but the novel material also contains 100 trillion mechanical bonds per square centimeter — the highest density of mechanical bonds ever achieved. The researchers produced this material using a new, highly efficient and scalable polymerization process.
“We made a completely new polymer structure,” said Northwestern’s William Dichtel, the study’s corresponding author. “It’s similar to chainmail in that it cannot easily rip because each of the mechanical bonds has a bit of freedom to slide around. If you pull it, it can dissipate the applied force in multiple directions. And if you want to rip it apart, you would have to break it in many, many different places. We are continuing to explore its properties and will probably be studying it for years.”
Dichtel is the Robert L. Letsinger Professor of Chemistry at the Weinberg College of Arts and Sciences and a member of the International Institute of Nanotechnology (IIN) and the Paula M. Trienens Institute for Sustainability and Energy. Madison Bardot, a Ph.D. candidate in Dichtel’s laboratory and IIN Ryan Fellow, is the study’s first author.
Inventing a new process
For years, researchers have attempted to develop mechanically interlocked molecules with polymers but found it near impossible to coax polymers to form mechanical bonds.
To overcome this challenge, Dichtel’s team took a whole new approach. They started with X-shaped monomers — which are the building blocks of polymers — and arranged them into a specific, highly ordered crystalline structure. Then, they reacted these crystals with another molecule to create bonds between the molecules within the crystal.
“I give a lot of credit to Madison because she came up with this concept for forming the mechanically interlocked polymer,” Dichtel said. “It was a high-risk, high-reward idea where we had to question our assumptions about what types of reactions are possible in molecular crystals.”
The resulting crystals comprise layers and layers of 2D interlocked polymer sheets. Within the polymer sheets, the ends of the X-shaped monomers are bonded to the ends of other X-shaped monomers. Then, more monomers are threaded through the gaps in between. Despite its rigid structure, the polymer is surprisingly flexible. Dichtel’s team also found that dissolving the polymer in solution caused the layers of interlocked monomers to peel off each other.
“After the polymer is formed, there’s not a whole lot holding the structure together,” Dichtel said. “So, when we put it in solvent, the crystal dissolves, but each 2D layer holds together. We can manipulate those individual sheets.”
To examine the structure at the nanoscale, collaborators at Cornell University, led by Professor David Muller, used cutting-edge electron microscopy techniques. The images revealed the polymer’s high degree of crystallinity, confirmed its interlocked structure and indicated its high flexibility.
Dichtel’s team also found the new material can be produced in large quantities. Previous polymers containing mechanical bonds typically have been prepared in very small quantities using methods that are unlikely to be scalable. Dichtel’s team, on the other hand, made half a kilogram of the new material and expects that even larger amounts will be possible as its most promising applications emerge.
Adding strength to tough polymers
Inspired by the material’s inherent strength, Dichtel’s collaborators at Duke University, led by Professor Matthew Becker, added it to Ultem. In the same family as Kevlar, Ultem is an incredibly strong material that can withstand extreme temperatures as well as acidic and caustic chemicals. The researchers developed a composite material of 97.5% Ultem fiber and just 2.5% of the 2D polymer. That small percentage dramatically increased Ultem’s overall strength and toughness.
Dichtel envisions his group’s new polymer might have a future as a specialty material for light-weight body armor and ballistic fabrics.
“We have a lot more analysis to do, but we can tell that it improves the strength of these composite materials,” Dichtel said. “Almost every property we have measured has been exceptional in some way.”
Steeped in Northwestern history
The authors dedicated the paper to the memory of former Northwestern chemist Sir Fraser Stoddart, who introduced the concept of mechanical bonds in the 1980s. Ultimately, he elaborated these bonds into molecular machines that switch, rotate, contract and expand in controllable ways. Stoddart, who passed away last month, received the 2016 Nobel Prize in Chemistry for this work.
“Molecules don’t just thread themselves through each other on their own, so Fraser developed ingenious ways to template interlocked structures,” said Dichtel, who was a postdoctoral researcher in Stoddart’s lab at UCLA. “But even these methods have stopped short of being practical enough to use in big molecules like polymers. In our present work, the molecules are held firmly in place in a crystal, which templates the formation of a mechanical bond around each one.
“So, these mechanical bonds have deep tradition at Northwestern, and we are excited to explore their possibilities in ways that have not yet been possible.”
The study, “Mechanically interlocked two-dimensional polymers,” was primarily supported by the Defense Advanced Research Projects Agency (contract number HR00112320041) and Northwestern’s IIN (Ryan Fellows Program).
The megadroughts are upon us
Increasingly common since 1980, persistent multi-year droughts will continue to advance with the warming climate, warns a study from the Swiss Federal Institute for Forest, Snow, and Landscape Research (WSL), with Professor Francesca Pellicciotti from the Institute of Science and Technology Austria (ISTA) participating. This publicly available forty-year global quantitative inventory, now published in Science, seeks to inform policy regarding the environmental impact of human-induced climate change. It also detected previously ‘overlooked’ events.
Fifteen years of a persistent, devastating megadrought — the longest lasting in a thousand years — have nearly dried out Chile’s water reserves, even affecting the country’s vital mining output. This is but one blatant example of how the warming climate is causing multi-year droughts and acute water crises in vulnerable regions around the globe. However, droughts tend only to be noticed when they damage agriculture or visibly affect forests. Thus, some pressing questions arise: Can we consistently identify extreme multi-year droughts and examine their impacts on ecosystems? And what can we learn from the drought patterns of the past forty years?
To answer these questions, researchers from the Swiss Federal Institute for Forest, Snow, and Landscape Research (WSL) and the Institute of Science and Technology Austria (ISTA) have analyzed global meteorological data and modeled droughts between 1980 and 2018. They demonstrated a worrying increase in multi-year droughts that became longer, more frequent, and more extreme, covering more land. “Each year since 1980, drought-stricken areas have spread by an additional fifty thousand square kilometers on average — that’s roughly the area of Slovakia, or the US states of Vermont and New Hampshire put together — causing enormous damage to ecosystems, agriculture, and energy production,” says ISTA Professor Francesca Pellicciotti, the Principal Investigator of the WSL-funded EMERGE Project, under which the present study was conducted. The team aims to unveil the possible long-lasting effects of persistent droughts around the globe and help inform policy preparing for more frequent and severe future megadroughts.
Unveiling extreme droughts that flew under the radar
The international team used the CHELSA climate data prepared by WSL Senior Researcher and study author Dirk Karger, which goes back to 1979. They calculated anomalies in rainfall and evapotranspiration — water evaporation from soil and plants — and their impact on natural ecosystems worldwide. This allowed them to determine the occurrence of multi-year droughts both in well-studied and less accessible regions of the planet, especially in areas like tropical forests and the Andes, where little observational data is available. “Our method not only mapped well-documented droughts but also shed light on extreme droughts that flew under the radar, such as the one that affected the Congo rainforest from 2010 to 2018,” says Karger. This discrepancy is likely due to how forests in various climate regions respond to drought episodes. “While temperate grasslands have been most affected in the past forty years, boreal and tropical forests appeared to withstand drought more effectively and even displayed paradoxical effects during the onset of drought.” But how long can these forests resist the harsh blow of climate change?
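As a simplified illustration of this kind of analysis — not the study’s actual method, thresholds or data — one can standardize the monthly climatic water balance (precipitation minus evapotranspiration) against each calendar month’s climatology and flag dry spells that persist for two years or more:

```python
# Simplified illustration only: flag multi-year droughts from the climatic
# water balance, precipitation minus evapotranspiration (P - ET).
import numpy as np

def multiyear_droughts(precip, evap, z_thresh=-1.0, min_months=24, window=12):
    """precip, evap: equal-length monthly series (mm) starting in January.
    Returns (start, end) month indices of dry spells lasting >= min_months."""
    balance = np.asarray(precip, float) - np.asarray(evap, float)
    months = np.arange(balance.size) % 12
    z = np.empty_like(balance)
    for m in range(12):                       # month-by-month climatology
        sel = months == m
        z[sel] = (balance[sel] - balance[sel].mean()) / balance[sel].std()
    # Smooth with a running mean so single wet months don't split an event
    z_smooth = np.convolve(z, np.ones(window) / window, mode="same")
    dry = z_smooth < z_thresh
    events, start = [], None
    for i, flag in enumerate(np.append(dry, False)):  # sentinel closes last run
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_months:
                events.append((start, i - 1))
            start = None
    return events

# Synthetic example: a flat climate with one imposed three-year rainfall deficit
rng = np.random.default_rng(0)
precip = rng.normal(80, 15, size=480)     # 40 years of monthly precipitation (mm)
evap = rng.normal(60, 10, size=480)       # monthly evapotranspiration (mm)
precip[200:236] -= 45                     # impose a ~36-month deficit
print(multiyear_droughts(precip, evap))   # expect one event spanning that window
```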
Contrasting impacts on ecosystems
The persistently rising temperatures, extended droughts, and higher evapotranspiration ultimately lead to drier and browner ecosystems, despite also causing heavier precipitation episodes. Thus, scientists can use satellite images to monitor the effect of drought by tracking changes in vegetation greenness over time. While this analysis works well for temperate grasslands, the changes in greenness cannot be tracked as easily over dense tropical forest canopies, leading to underestimated effects of drought in such areas. Thus, to ensure consistent results worldwide, the team developed a multistep analysis that better resolves the changes in high-leaf regions and ranked the droughts by their severity since 1980. Unsurprisingly, they showed that megadroughts had the highest immediate impact on temperate grasslands. ‘Hotspot’ regions included the western USA, central and eastern Mongolia, and particularly southeastern Australia, where the data overlapped with two well-documented multi-year ecological droughts. On the other hand, the team shed additional light on the paradoxical effects observed in the tropical and boreal forests. While tropical forests can offset the expected effects of drought as long as they have enough water reserves to buffer the decrease in rainfall, boreal forests and tundra react in their own distinct way. It turns out that the warming climate extends the boreal growth season since vegetation growth in these regions is limited by lower temperatures rather than water availability.
Droughts evolve in time and space
The results show that the trend of intensifying megadroughts is clear: The team generated the first global — and globally consistent — picture of megadroughts and their impact on vegetation at high resolution. However, the long-term effects on the planet and its ecosystems remain largely unknown. Meanwhile, the data already agree with the widely observed greening of the pan-Arctic. “But in the event of long-term extreme water shortages, trees in tropical and boreal regions can die, leading to long-lasting damage to these ecosystems. Especially, the boreal vegetation will likely take the longest to recover from such a climate disaster,” says Karger. Pellicciotti hopes the team’s results will help change our perception of droughts and how to prepare for them: “Currently, mitigation strategies largely consider droughts as yearly or seasonal events, which stands in stark contrast to the longer and more severe megadroughts we will face in the future,” she says. “We hope that the publicly available inventory of droughts we are putting out will help orient policymakers toward more realistic preparation and prevention measures.” As a glaciologist, Pellicciotti also seeks to examine the effects of megadroughts in the mountains and how glaciers can buffer them. She leads a collaborative project titled “MegaWat — Megadroughts in the Water Towers of Europe — From Process Understanding to Strategies for Management and Adaptation.”
Project and funding information
The present study was conducted within the scope of the EMERGE Project of the Swiss Federal Institute for Forest, Snow, and Landscape Research (WSL), with Professor Francesca Pellicciotti from the Institute of Science and Technology Austria (ISTA) serving as its Principal Investigator. The research was supported by funding from the Extreme Program of the WSL for the EMERGE project.
Large UK-wide pandemic preparedness tests planned this year
The stress test will involve thousands of people to help the UK prepare for potential future threats.
Patients dying in hospital corridors, say nurses
The Royal College of Nursing details harrowing cases from more than 5,000 nurses across the UK.
Rise of vaccine distrust – why more of us are questioning jabs
Confidence in all types of vaccination has taken a hit. The question is why, and what can be done about it?
NASA celebrates Edwin Hubble’s discovery of a new universe
For humans, the most important star in the universe is our Sun. The second-most important star is nestled inside the Andromeda galaxy. Don’t go looking for it — the flickering star is 2.5 million light-years away, and is 1/100,000th the brightness of the faintest star visible to the human eye.
Yet, a century ago, its discovery by Edwin Hubble, then an astronomer at Carnegie Observatories, opened humanity’s eyes to how large the universe really is. It revealed that our Milky Way galaxy is just one of hundreds of billions of galaxies, and it ushered in a coming-of-age for humans as a curious species that could scientifically ponder our own creation through the message of starlight. Carnegie Science and NASA are celebrating this centennial at the 245th meeting of the American Astronomical Society in Washington, D.C.
The seemingly inauspicious star, simply named V1, flung open a Pandora’s box full of mysteries about time and space that are still challenging astronomers today. Using the largest telescope in the world at that time, the Carnegie-funded 100-inch Hooker Telescope at Mount Wilson Observatory in California, Hubble discovered the demure star in 1923. This rare type of pulsating star, called a Cepheid variable, serves as a milepost marker for distant celestial objects. There are no tape measures in space, but by the early 20th century Henrietta Swan Leavitt had discovered that the pulsation period of Cepheid variables is directly tied to their luminosity.
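For readers curious about the arithmetic, a rough sketch of how a Cepheid’s period translates into a distance is shown below; the period–luminosity coefficients and V1’s parameters are approximate values used purely for illustration, not the calibration Hubble himself used:

```python
# Illustrative sketch: a Cepheid's pulsation period yields a distance via the
# period-luminosity (Leavitt) relation plus the distance modulus.
import math

def cepheid_distance_parsecs(period_days, apparent_mag):
    """Estimate distance using m - M = 5*log10(d / 10 pc)."""
    # Approximate V-band Leavitt relation: M ~ -2.43*(log10(P) - 1) - 4.05
    absolute_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    distance_modulus = apparent_mag - absolute_mag
    return 10 ** (distance_modulus / 5.0 + 1.0)

# V1 pulsates roughly monthly; with an apparent magnitude near 19 the estimate
# lands at hundreds of kiloparsecs -- far outside the Milky Way.
print(f"{cepheid_distance_parsecs(31.4, 19.0):.3e} pc")
```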
Many astronomers long believed that the edge of the Milky Way marked the edge of the entire universe. But Hubble determined that V1, located inside the Andromeda “nebula,” was at a distance that far exceeded anything in our own Milky Way galaxy. This led Hubble to the jaw-dropping realization that the universe extends far beyond our own galaxy.
In fact, Hubble had suspected there was a larger universe out there; now he had the proof. He was so amazed he scribbled an exclamation mark on the photographic plate of Andromeda that pinpointed the variable star.
As a result, the science of cosmology exploded almost overnight. Hubble’s contemporary, the distinguished Harvard astronomer Harlow Shapley, was devastated when Hubble notified him of the discovery. “Here is the letter that destroyed my universe,” he lamented to fellow astronomer Cecilia Payne-Gaposchkin, who was in his office when he opened Hubble’s message.
Just three years earlier, Shapley had presented his observational interpretation of a much smaller universe in a debate one evening at the Smithsonian Museum of Natural History in Washington. He maintained that the Milky Way galaxy was so huge, it must encompass the entirety of the universe. Shapley insisted that the mysteriously fuzzy “spiral nebulae,” such as Andromeda, were simply stars forming on the periphery of our Milky Way, and inconsequential.
Little could Hubble have imagined that 70 years later, an extraordinary telescope named after him, lofted hundreds of miles above the Earth, would continue his legacy. The marvelous telescope made “Hubble” a household word, synonymous with wondrous astronomy.
Today, NASA’s Hubble Space Telescope pushes the frontiers of knowledge over 10 times farther than Edwin Hubble could ever see. The space telescope has lifted the curtain on a compulsive universe full of active stars, colliding galaxies, and runaway black holes, among the celestial fireworks of the interplay between matter and energy.
Edwin Hubble was the first astronomer to take the initial steps that would ultimately lead to the Hubble Space Telescope, revealing a seemingly infinite ocean of galaxies. He thought that, despite their abundance, galaxies came in just a few specific shapes: pinwheel spirals, football-shaped ellipticals, and oddball irregular galaxies. He thought these might be clues to galaxy evolution — but the answer had to wait for the Hubble Space Telescope’s legendary Hubble Deep Field in 1995.
The most impactful finding from Edwin Hubble’s analysis was that the farther away a galaxy is, the faster it appears to be receding from Earth. The universe looked like it was expanding like a balloon. This was based on Hubble tying galaxy distances to the reddening of their light — the redshift — which increased proportionally the farther away the galaxies are.
The redshift data were first collected by Lowell Observatory astronomer Vesto Slipher, who spectroscopically studied the “spiral nebulae” a decade before Hubble. Slipher did not know they were extragalactic, but Hubble made the connection. Slipher first interpreted his redshift data as an example of the Doppler effect. This phenomenon is caused by light being stretched to longer, redder wavelengths when a source is moving away from us. To Slipher, it was curious that all the spiral nebulae appeared to be moving away from Earth.
Two years before Hubble published his findings, the Belgian physicist and Jesuit priest Georges Lemaître analyzed the Hubble and Slipher observations and was the first to conclude that the universe is expanding. This proportionality between galaxies’ distances and redshifts is today termed the Hubble–Lemaître law.
Because the universe appeared to be uniformly expanding, Lemaître further realized that the expansion rate could be run back into time — like rewinding a movie — until the universe was unimaginably small, hot, and dense. It wasn’t until 1949 that the term “big bang” came into fashion.
This was a relief to Edwin Hubble’s contemporary, Albert Einstein, who deduced the universe could not remain stationary without imploding under gravity’s pull. The rate of cosmic expansion is now known as the Hubble Constant.
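A back-of-the-envelope version of that “rewinding” argument is the Hubble time, 1/H0, sketched below with an assumed round value of the constant; the quoted 13.8-billion-year age additionally requires a full cosmological model:

```python
# Rough sketch: the "Hubble time" 1/H0 sets the age scale of the universe.
H0_km_s_per_Mpc = 70.0                 # assumed round value for illustration
km_per_Mpc = 3.086e19
seconds_per_year = 3.156e7

H0_per_second = H0_km_s_per_Mpc / km_per_Mpc
hubble_time_years = 1.0 / H0_per_second / seconds_per_year
print(f"1/H0 ~ {hubble_time_years:.2e} years")   # ~1.4e10, i.e. ~14 billion years
```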
Ironically, Hubble himself never fully accepted the runaway universe as an interpretation of the redshift data. He suspected that some unknown physics phenomenon was giving the illusion that the galaxies were flying away from each other. He was partly right in that Einstein’s theory of relativity explains the redshift as an effect proportional to the stretching of expanding space. The galaxies only appear to be zooming through the universe. Space is expanding instead.
After decades of increasingly precise measurements, the Hubble telescope nailed down the expansion rate, giving the universe an age of 13.8 billion years. Doing so required establishing the first rung of what astronomers call the “cosmic distance ladder,” the yardstick to far-flung galaxies. That rung is built from cousins of V1: Cepheid variable stars that the Hubble telescope can detect more than 100 times farther from Earth than the star Edwin Hubble first found.
Astrophysics was turned on its head again in 1998 when the Hubble telescope and other observatories discovered that the universe was expanding at an ever-faster rate, through a phenomenon dubbed “dark energy.” Einstein first toyed with this idea of a repulsive form of gravity in space, calling it the cosmological constant.
Even more mysteriously, the current expansion rate appears to be different than what modern cosmological models of the developing universe would predict, further confounding theoreticians. Today astronomers are wrestling with the idea that whatever is accelerating the universe may be changing over time. NASA’s Roman Space Telescope, with the ability to do large cosmic surveys, should lead to new insights into the behavior of dark matter and dark energy. Roman will likely measure the Hubble constant via lensed supernovae.
This grand century-long adventure, plumbing depths of the unknown, began with Hubble photographing a large smudge of light, the Andromeda galaxy, at the Mount Wilson Observatory high above Los Angeles.
In short, Edwin Hubble is the man who wiped away the ancient universe and discovered a new universe that would shrink humanity’s self-perception into being an insignificant speck in the cosmos.
This quasar may have helped turn the lights on for the universe
A Yale-led team of astronomers has detected an intensely brightening and dimming quasar that may help explain how some objects in the early universe grew at a highly accelerated rate.
The discovery, announced Jan. 14 at the winter meeting of the American Astronomical Society, is the most distant object detected by the NuSTAR X-ray space telescope (which launched in 2012) and stands as one of the most highly “variable” quasars ever identified.
“In this work, we have discovered that this quasar is very likely to be a supermassive black hole with a jet pointed towards Earth — and we are seeing it in the first billion years of the universe,” said Lea Marcotulli, a postdoctoral fellow in astrophysics at Yale and lead author of a new study published Jan. 14 in The Astrophysical Journal Letters.
Quasars are among the oldest, brightest objects in the universe. Formed from active galactic nuclei (AGN) — areas at the center of galaxies where a black hole is drawing in matter — quasars emit electromagnetic radiation that can be spotted in radio, infrared, visible, ultraviolet, X-ray, and gamma-ray wavelengths. This “visibility” has made quasars a helpful proxy for trying to understand the structure and evolution of the cosmos.
For example, astronomers look to quasars to study reionization, a period less than a billion years after the Big Bang when electrically neutral hydrogen atoms became charged and the first generation of stars lit up the universe.
“The epoch of reionization is considered the end of the universe’s dark ages,” said Thomas Connor, an astronomer at the Chandra X-Ray Center and co-corresponding author of the study. “The precise timeline and source class responsible for reionization are still debated, and actively accreting supermassive black holes are one proposed culprit.”
For the study, the researchers compared NuSTAR observations of a distant quasar — designated J1429+5447 — with unrelated observations taken four months earlier by the Chandra X-ray telescope. The researchers found that the quasar’s X-ray emissions had doubled in that very short time (due to relativistic effects, the four months on Earth corresponded to only about two weeks in the quasar’s frame).
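The time-dilation arithmetic works roughly as sketched below; the redshift used is an assumption for illustration, since the article does not quote one:

```python
# Rough sketch of "four months on Earth, about two weeks at the quasar".
def rest_frame_interval(observed_days, z):
    """Cosmological time dilation: intervals observed on Earth are stretched
    by a factor (1 + z) relative to the source's rest frame."""
    return observed_days / (1.0 + z)

observed = 4 * 30            # ~four months between the Chandra and NuSTAR epochs
z_assumed = 6.2              # plausible early-universe redshift (assumption)
print(f"{rest_frame_interval(observed, z_assumed):.0f} days in the quasar's frame")
# -> roughly 17 days, i.e. about two weeks
```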
“This level of X-ray variability, in terms of intensity and rapidity, is extreme,” said Meg Urry, the Israel Munson Professor of Physics and Astronomy in Yale’s Faculty of Arts and Sciences and co-author of the study. “It is almost certainly explained by a jet pointing toward us — a cone in which particles are transported up to a million light years away from the central, supermassive black hole. Because the jet moves at nearly the speed of light, effects of Einstein’s theory of special relativity speed up and amplify the variability.”
The researchers said their findings offer crucial, much-needed information for astronomers studying reionization. The findings may also point astronomers toward other supermassive black hole candidates from the early universe.
“Finding more supermassive black holes that are potentially hosting jets raises the question as to how these black holes grew so big in such a short timescale, and what the connection may be to jet triggering mechanisms,” Marcotulli said.
NASA supported the research.
Is eating more red meat bad for your brain?
People who eat more red meat, especially processed red meat like bacon, sausage and bologna, may have a higher risk of cognitive decline and dementia compared to those who eat very little red meat, according to a study published in the January 15, 2025, online issue of Neurology®, the medical journal of the American Academy of Neurology.
“Red meat is high in saturated fat and has been shown in previous studies to increase the risk of type 2 diabetes and heart disease, which are both linked to reduced brain health,” said study author Dong Wang, MD, ScD, of Brigham and Women’s Hospital in Boston. “Our study found processed red meat may increase the risk of cognitive decline and dementia, but the good news is that it also found that replacing it with healthier alternatives, like nuts, fish and poultry, may reduce a person’s risk.”
To examine the risk of dementia, researchers studied a group of 133,771 people with an average age of 49 who did not have dementia at the start of the study. Participants were followed for up to 43 years. Of this group, 11,173 people developed dementia.
Participants completed a food diary every two to four years, listing what they ate and how often.
Researchers defined processed red meat as bacon, hot dogs, sausages, salami, bologna and other processed meat products. They defined unprocessed red meat as beef, pork, lamb and hamburger. A serving of red meat is three ounces, about the size of a deck of cards.
Researchers calculated how much red meat participants ate on average per day.
For processed red meat, they divided participants into three groups. The low group ate an average of fewer than 0.10 servings per day; the medium group ate between 0.10 and 0.24 servings per day; and the high group, 0.25 or more servings per day.
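In code form, the grouping described above amounts to a simple binning by daily servings (cutoffs taken from the text; this is not the study’s analysis code):

```python
# Minimal sketch of the low/medium/high grouping for processed red meat intake.
def processed_meat_group(servings_per_day):
    """Classify average daily processed red meat intake."""
    if servings_per_day < 0.10:
        return "low"
    elif servings_per_day < 0.25:
        return "medium"      # 0.10 up to just under 0.25 servings/day
    else:
        return "high"        # 0.25 or more servings/day

for s in (0.05, 0.15, 0.30):
    print(s, processed_meat_group(s))
```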
After adjusting for factors such as age, sex and other risk factors for cognitive decline, researchers found that participants in the high group had a 13% higher risk of developing dementia compared to those in the low group.
For unprocessed red meat, researchers compared people who ate an average of less than one half serving per day to people who ate one or more servings per day and did not find a difference in dementia risk.
To measure subjective cognitive decline, researchers looked at a different group of 43,966 participants with an average age of 78. Subjective cognitive decline is when a person reports memory and thinking problems before any decline is large enough to show up on standard tests.
The subjective cognitive decline group took surveys rating their own memory and thinking skills twice during the study.
After adjusting for factors such as age, sex and other risk factors for cognitive decline, researchers found that participants who ate an average of 0.25 servings or more per day of processed red meat had a 14% higher risk of subjective cognitive decline compared to those who ate an average of fewer than 0.10 servings per day.
They also found people who ate one or more servings of unprocessed red meat per day had a 16% higher risk of subjective cognitive decline compared to people who ate less than a half serving per day.
To measure objective cognitive function, researchers looked at a different group of 17,458 female participants with an average age of 74. Objective cognitive function is how well your brain works to remember, think and solve problems.
This group took memory and thinking tests four times during the study.
After adjusting for factors such as age, sex and other risk factors for cognitive decline, researchers found that eating more processed red meat was associated with faster brain aging: about 1.61 additional years of aging in global cognition and 1.69 additional years in verbal memory for each additional serving per day.
Finally, researchers found that replacing one serving per day of processed red meat with one serving per day of nuts and legumes was associated with a 19% lower risk of dementia and 1.37 fewer years of cognitive aging. Making the same substitution for fish was associated with a 28% lower risk of dementia and replacing with chicken was associated with a 16% lower risk of dementia.
“Reducing how much red meat a person eats and replacing it with other protein sources and plant-based options could be included in dietary guidelines to promote cognitive health,” said Wang. “More research is needed to assess our findings in more diverse groups.”
A limitation of the study was that it primarily looked at white health care professionals, so the results might not be the same for other racial and ethnic groups or for people of other sex and gender identities.
The study was supported by the National Institutes of Health.