Meteorite contains evidence of liquid water on Mars 742 million years ago

An asteroid struck Mars 11 million years ago and sent pieces of the red planet hurtling through space. One of these chunks eventually crashed to Earth somewhere near Purdue and is one of the few meteorites that can be traced directly to Mars. The meteorite was found in a drawer at Purdue University in 1931 and is known as the Lafayette Meteorite.

During early investigations of the Lafayette Meteorite, scientists discovered that it had interacted with liquid water while on Mars, but they have long wondered when that interaction took place. An international collaboration of scientists, including two from Purdue University’s College of Science, has recently determined the age of the minerals in the Lafayette Meteorite that formed in the presence of liquid water. The team has published its findings in Geochemical Perspectives Letters.

Marissa Tremblay, assistant professor in the Department of Earth, Atmospheric, and Planetary Sciences (EAPS) at Purdue University, is the lead author of the publication. She uses noble gases such as helium, neon, and argon to study the physical and chemical processes shaping the surfaces of Earth and other planets. She explains that some meteorites from Mars contain minerals that formed through interaction with liquid water while still on Mars.

“Dating these minerals can therefore tell us when there was liquid water at or near the surface of Mars in the planet’s geologic past,” she says. “We dated these minerals in the Martian meteorite Lafayette and found that they formed 742 million years ago. We do not think there was abundant liquid water on the surface of Mars at this time. Instead, we think the water came from the melting of nearby subsurface ice called permafrost, and that the permafrost melting was caused by magmatic activity that still occurs periodically on Mars to the present day.”

In this publication, her team demonstrated that the age obtained for the timing of water-rock interaction on Mars is robust and that the chronometer used was not affected by events that occurred after Lafayette was altered in the presence of water.

“The age could have been affected by the impact that ejected the Lafayette Meteorite from Mars, the heating Lafayette experienced during the 11 million years it was floating out in space, or the heating Lafayette experienced when it fell to Earth and burned up a little bit in Earth’s atmosphere,” she says. “But we were able to demonstrate that none of these things affected the age of aqueous alteration in Lafayette.”

Ryan Ickert, senior research scientist with Purdue EAPS, is a co-author of the paper. He uses heavy radioactive and stable isotopes to study the timescales of geological processes. He demonstrated that other isotope data (previously used to estimate the timing of water-rock interaction on Mars) were problematic and had likely been affected by other processes.

“This meteorite uniquely has evidence that it has reacted with water. The exact date of this was controversial, and our publication dates when water was present,” he says.

Found in a drawer

Quite a bit is known about the Lafayette Meteorite’s origin story: it was ejected from the surface of Mars about 11 million years ago by an impact event.

“We know this because once it was ejected from Mars, the meteorite experienced bombardment by cosmic ray particles in outer space, that caused certain isotopes to be produced in Lafayette,” Tremblay says. “Many meteoroids are produced by impacts on Mars and other planetary bodies, but only a handful will eventually fall to Earth.”
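
For context, this is the logic of cosmic-ray exposure dating: the longer a meteoroid travels through space, the more cosmogenic isotopes accumulate. Below is a minimal sketch, assuming a stable cosmogenic nuclide and a constant production rate; the specific isotopes and production rates used for Lafayette are not given here.

```latex
% Sketch only: exposure age of a stable cosmogenic nuclide (such as ^3He
% or ^21Ne), assuming a constant production rate P while in space.
T_{\mathrm{exposure}} \approx \frac{N_{\mathrm{measured}}}{P}
```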

But once Lafayette reached Earth, the story gets a little muddy. It is known for certain that the meteorite was found in a drawer at Purdue University in 1931, but how it got there is still a mystery. In a recent publication, Tremblay and others made strides in reconstructing the meteorite’s history after its arrival on Earth.

“We used organic contaminants from Earth found on Lafayette (specifically, crop diseases) that were particularly prevalent in certain years to narrow down when it might have fallen, and whether the meteorite fall may have been witnessed by someone,” Tremblay says.

Meteorites: time capsules of the universe

Meteorites are solid time capsules from planets and other celestial bodies in our universe. They carry bits of data that geochronologists can unlock. They are set apart from Earth rocks by a fusion crust that forms during their descent through the atmosphere, a descent that often produces a fiery streak visible in the night sky.

“We can identify meteorites by studying what minerals are present in them and the relationships between these minerals inside the meteorite,” says Tremblay. “Meteorites are often denser than Earth rocks, contain metal, and are magnetic. We can also look for things like a fusion crust that forms during entry into Earth’s atmosphere. Finally, we can use the chemistry of meteorites (specifically their oxygen isotope composition) to fingerprint which planetary body they came from or which type of meteorite it belongs to.”

An international collaboration

The publication reflects an international collaboration of scientists. In addition to Tremblay and Ickert, the team includes Darren F. Mark, Dan N. Barfod, Benjamin E. Cohen, Martin R. Lee, Tim Tomkinson and Caroline L. Smith, representing the Scottish Universities Environmental Research Centre (SUERC), the Department of Earth and Environmental Science at the University of St Andrews, the School of Geographical and Earth Sciences at the University of Glasgow, the School of Earth Sciences at the University of Bristol, and the Science Group at The Natural History Museum in London.

“Before moving to Purdue, Ryan and I were both based at the Scottish Universities Environmental Research Centre, where the argon-argon isotopic analyses of the alteration minerals in Lafayette took place,” Tremblay says. “Our collaborators at SUERC, the University of Glasgow, and the Natural History Museum have previously done a lot of work studying the history of Lafayette.”
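
For context, argon-argon geochronology converts the measured ratio of radiogenic argon-40 to reactor-produced argon-39 into an age. A standard form of the age equation is shown below; the irradiation parameters used in the Lafayette analyses are not given here.

```latex
% Standard Ar-Ar age equation: \lambda is the total decay constant of ^40K,
% J is the neutron irradiation parameter determined from a co-irradiated
% standard of known age, and ^40Ar* is radiogenic argon.
t = \frac{1}{\lambda}\,
    \ln\!\left(1 + J\,\frac{{}^{40}\mathrm{Ar}^{*}}{{}^{39}\mathrm{Ar}_{\mathrm{K}}}\right)
```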

Dating the alteration minerals in Lafayette and, more generally, in this class of Martian meteorites called nakhlites has been a long-term objective in planetary science because scientists know that the alteration happened in the presence of liquid water on Mars. However, these materials are especially difficult to date, and previous attempts had been very uncertain, likely affected by processes other than aqueous alteration, or both.

“We have demonstrated a robust way to date alteration minerals in meteorites that can be applied to other meteorites and planetary bodies to understand when liquid water might have been present,” Tremblay says.

Because of the Stahura Undergraduate Meteorite Fund, Tremblay and Ickert will be able to continue studying the geochemistry and histories of meteorites, and undergraduates at Purdue EAPS will be able to assist in this research.

Organ donation: Opt-out defaults do not increase donation rates, study finds

A recent study by the Max Planck Institute for Human Development, in collaboration with the MSB Medical School Berlin and the Max Planck UCL Centre for Computational Psychiatry and Ageing Research, shows that switching to an opt-out organ donation policy, where all adults are presumed organ donors unless they explicitly opt out, does not increase donations from deceased donors. The results of the study have been published in the journal Public Health.

With the demand for donor organs far outstripping the supply, calls for changes in public policy are growing. An opt-out (‘presumed consent’) default policy is often seen as a promising approach. This policy stipulates that all adults are automatically considered potential organ donors after their death, unless they explicitly withdraw their consent during their lifetime. In contrast, the opt-in (‘explicit consent’) system requires potential donors to actively consent to donate their organs after they die. The discussion around implementing an opt-out policy has recently gained traction again in Germany, raising the question of whether such a change in policy would actually lead to an increase in the number of deceased organ donors.

A recent analysis of all member countries of the Organisation for Economic Co-operation and Development (OECD) found no significant difference in deceased donor rates between opt-in and opt-out countries, but significantly fewer living donors — individuals who voluntarily donate organs, like a kidney, while alive — in opt-out countries. However, such cross-sectional analyses cannot control for all country-specific factors like health infrastructure, culture, and religious issues — all of which can influence donation rates.

To address the limitations of prior research, the current study used a longitudinal approach, analyzing changes in deceased donor rates over time in five countries — Argentina, Chile, Sweden, Uruguay, and Wales — that had switched from an opt-in to an opt-out default policy. This method provided a more reliable assessment of the impact of opt-out policies by controlling for long-term trends and country-specific factors.
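
The study’s exact statistical model is not reproduced here, but the longitudinal logic can be sketched as a segmented (interrupted time-series) regression, in which a change in either the level or the slope of donor rates after the policy switch would show up as nonzero coefficients. The country, years, and rates below are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical annual deceased-donor rates (per million population) for one
# country, with an opt-out default introduced in 2012. A segmented regression
# tests whether the switch changed the level or the slope of the trend.
rng = np.random.default_rng(0)
years = np.arange(2000, 2020)
rates = 10 + 0.3 * (years - 2000) + rng.normal(0, 0.5, len(years))

df = pd.DataFrame({
    "rate": rates,
    "time": years - years[0],                      # long-term trend
    "post": (years >= 2012).astype(int),           # level change after switch
    "time_since": np.clip(years - 2012, 0, None),  # slope change after switch
})

model = smf.ols("rate ~ time + post + time_since", data=df).fit()
print(model.params)   # 'post' and 'time_since' capture any policy effect
```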

Data was collected from international databases, including the International Registry in Organ Donation and Transplantation (IRODaT) and the Global Observatory on Donation and Transplantation (GODT). Of the 39 countries that had changed from explicit to presumed consent by December 2019, only five could be included in the analysis due to a lack of historical data for changes made before the IRODaT database was launched in 1996 and because presumed consent practices often existed informally prior to formal legislation.

Consistent with previous cross-sectional analyses, the study found that switching the default from opt-in to opt-out did not lead to any increase in organ donation rates in the five countries considered. Moreover, the results indicated that the opt-out default did not cause even a slight upward curve in organ donations: the long-term trend remained the same, showing no change in the rate following the switch. As expected, the results did show a reduction in deceased donations with the onset of the COVID-19 pandemic, with only a slow recovery observed by 2022.

“Simply switching to an opt-out system does not automatically lead to more organ donations,” states author Mattea Dallacker, who led the project at the Center for Adaptive Rationality at the Max Planck Institute for Human Development. “Without accompanying measures, such as investments in the healthcare system and public awareness campaigns, a shift to an opt-out default is unlikely to increase organ donations. There is no easy solution to the complex challenge of boosting organ donation rates,” she continues.

The study also underscores the crucial role of relatives in organ donation decisions. Even in presumed consent systems, where individuals are considered donors unless they opt out, families are often consulted and can override the presumed consent. Since many people do not talk about their donation wishes with loved ones, presumed consent can lead to uncertainty and hesitation among families, potentially resulting in refusals.

“A possible alternative to the opt-out system is a mandatory choice system,” says Ralph Hertwig, Director at the Center for Adaptive Rationality at the Max Planck Institute for Human Development. “This would allow citizens to explicitly register their consent or objection to organ donation, when applying for a driver’s license or ID card, for example. This active choice system could prompt people to make an informed decision, which would eliminate the perceived ambiguity about their preference that appears to lead to higher family refusal rates. Good and accessible information about organ donation is essential for informed choice,” Hertwig continues.

Gas-churning monster black holes

Scientists using observations from NASA’s Neil Gehrels Swift Observatory have discovered, for the first time, the signal from a pair of monster black holes disrupting a cloud of gas in the center of a galaxy.

“It’s a very weird event, called AT 2021hdr, that keeps recurring every few months,” said Lorena Hernández-García, an astrophysicist at the Millennium Institute of Astrophysics, the Millennium Nucleus on Transversal Research and Technology to Explore Supermassive Black Holes, and University of Valparaíso in Chile. “We think that a gas cloud engulfed the black holes. As they orbit each other, the black holes interact with the cloud, perturbing and consuming its gas. This produces an oscillating pattern in the light from the system.”

A paper about AT 2021hdr, led by Hernández-García, was published Nov. 13 in the journal Astronomy and Astrophysics.

The dual black holes are in the center of a galaxy called 2MASX J21240027+3409114, located 1 billion light-years away in the northern constellation Cygnus. The pair are about 16 billion miles (26 billion kilometers) apart, close enough that light only takes a day to travel between them. Together they contain 40 million times the Sun’s mass.
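
That one-day figure is a straightforward light-travel-time check on the quoted separation:

```latex
% Light-travel time across the quoted separation of about 26 billion km:
t = \frac{d}{c}
  \approx \frac{2.6\times 10^{10}\ \mathrm{km}}{3.0\times 10^{5}\ \mathrm{km/s}}
  \approx 8.7\times 10^{4}\ \mathrm{s} \approx 1\ \mathrm{day}
```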

Scientists estimate the black holes complete an orbit every 130 days and will collide and merge in approximately 70,000 years.

AT 2021hdr was first spotted in March 2021 by the Caltech-led ZTF (Zwicky Transient Facility) at the Palomar Observatory in California. It was flagged as a potentially interesting source by ALeRCE (Automatic Learning for the Rapid Classification of Events). This multidisciplinary team combines artificial intelligence tools with human expertise to report events in the night sky to the astronomical community using the mountains of data collected by survey programs like ZTF.

“Although this flare was originally thought to be a supernova, outbursts in 2022 made us think of other explanations,” said co-author Alejandra Muñoz-Arancibia, an ALeRCE team member and astrophysicist at the Millennium Institute of Astrophysics and the Center for Mathematical Modeling at the University of Chile. “Each subsequent event has helped us refine our model of what’s going on in the system.”

Since the first flare, ZTF has detected outbursts from AT 2021hdr every 60 to 90 days.

Hernández-García and her team have been observing the source with Swift since November 2022. Swift helped them determine that the binary produces oscillations in ultraviolet and X-ray light on the same time scales as ZTF sees them in the visible range.

The researchers conducted a Goldilocks-type elimination of different models to explain what they saw in the data.

Initially, they thought the signal could be the byproduct of normal activity in the galactic center. Then they considered whether a tidal disruption event — the destruction of a star that wandered too close to one of the black holes — could be the cause.

Finally, they settled on another possibility, the tidal disruption of a gas cloud, one that was bigger than the binary itself. When the cloud encountered the black holes, gravity ripped it apart, forming filaments around the pair, and friction started to heat it. The gas got particularly dense and hot close to the black holes. As the binary orbits, the complex interplay of forces ejects some of the gas from the system on each rotation. These interactions produce the fluctuating light Swift and ZTF observe.

Hernández-García and her team plan to continue observations of AT 2021hdr to better understand the system and improve their models. They’re also interested in studying its home galaxy, which is currently merging with another one nearby — an event first reported in their paper.

“As Swift approaches its 20th anniversary, it’s incredible to see all the new science it’s still helping the community accomplish,” said S. Bradley Cenko, Swift’s principal investigator at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “There’s still so much it has left to teach us about our ever-changing cosmos.”

NHS managers to be sacked in failing hospitals

Hospitals in England will be ranked on care and finances, so patients can look for good service.

Vogue boss ‘concerned’ by return to skinny models

British Vogue’s editor says skinny models are back “in”, partly fuelled by weight loss drug Ozempic.

‘My wife died because the NHS used cheap labour’

Roy Pollitt’s wife died after a physician associate mistakenly left a drain in her body for 21 hours.

Young coral use metabolic tricks to resist bleaching

Coral larvae reduce their metabolism and increase nitrogen uptake to resist bleaching in high temperatures, according to a study published November 12 in the open-access journal PLOS Biology by Ariana S. Huffmyer of the University of Washington, US, and colleagues.

High ocean temperatures cause coral bleaching, which results from the disruption of the relationship between corals and their symbiotic algae, an increasing concern as global temperatures rise. However, relatively little research has examined the effects of high temperatures during early life stages of corals.

In this study, Huffmyer and colleagues exposed coral larvae to high temperatures at the Hawai’i Institute of Marine Biology. For three days during their first week of development, the larvae and their algal symbionts were exposed to temperatures 2.5 degrees Celsius above ambient, similar to the increases expected in seawater under climate change. The coral larvae showed no signs of bleaching in the heated water, and they were able to maintain rates of algal photosynthesis and the supply of carbon-based nutrition from the algae to the host. However, the larvae showed a 19% reduction in metabolism, as well as increased uptake and storage of nitrogen, both apparent strategies for improving coral survival.

Reduced metabolism allows the coral to conserve energy and resources, a response also seen in adult corals during bleaching. The change in nitrogen cycling appears to be an adaptation that limits the amount of nitrogen available to the algae, preventing algal overgrowth and the destabilization of the coral-algae relationship.

It remains unclear how effective these strategies are at higher temperatures and for longer durations. Further research into the details and limitations of coral reaction to high temperatures will provide crucial knowledge for predicting coral response and protecting coral reefs as global temperatures continue to rise.

The authors add, “This research reveals that coral larvae must invest in their nutritional partnership with algae to withstand stress, offering key insights into strategies to avoid bleaching in earliest life stages of corals.”

Assisted dying could stop harrowing deaths, says MP behind bill

Adults expected to die within six months would be eligible under the proposals for England and Wales.

‘I might be dead before a decision is made’: Terminally-ill people on assisted dying

Nik is worried assisted dying could lead to coercion – but Elise, who has cancer, wants the choice.

Giving robots superhuman vision using radio signals

In the race to develop robust perception systems for robots, one persistent challenge has been operating in bad weather and harsh conditions. For example, traditional, light-based vision sensors such as cameras or LiDAR (Light Detection And Ranging) fail in heavy smoke and fog.

However, nature has shown that vision doesn’t have to be constrained by light’s limitations — many organisms have evolved ways to perceive their environment without relying on light. Bats navigate using the echoes of sound waves, while sharks hunt by sensing electrical fields from their prey’s movements.

Radio waves, whose wavelengths are orders of magnitude longer than light waves, can better penetrate smoke and fog, and can even see through certain materials — all capabilities beyond human vision. Yet robots have traditionally relied on a limited toolbox: they either use cameras and LiDAR, which provide detailed images but fail in challenging conditions, or traditional radar, which can see through walls and other occlusions but produces crude, low-resolution images.

Now, researchers from the University of Pennsylvania School of Engineering and Applied Science (Penn Engineering) have developed PanoRadar, a new tool to give robots superhuman vision by transforming simple radio waves into detailed, 3D views of the environment.

“Our initial question was whether we could combine the best of both sensing modalities,” says Mingmin Zhao, Assistant Professor in Computer and Information Science. “The robustness of radio signals, which is resilient to fog and other challenging conditions, and the high resolution of visual sensors.”

In a paper to be presented at the 2024 International Conference on Mobile Computing and Networking (MobiCom), Zhao and his team from the Wireless, Audio, Vision, and Electronics for Sensing (WAVES) Lab and the Penn Research In Embedded Computing and Integrated Systems Engineering (PRECISE) Center, including doctoral student Haowen Lai, recent master’s graduate Gaoxiang Luo and undergraduate research assistant Yifei (Freddy) Liu, describe how PanoRadar leverages radio waves and artificial intelligence (AI) to let robots navigate even the most challenging environments, like smoke-filled buildings or foggy roads.

PanoRadar is a sensor that operates like a lighthouse, sweeping its beam in a circle to scan the entire horizon. The system consists of a rotating vertical array of antennas that scans its surroundings. As they rotate, these antennas send out radio waves and listen for their reflections from the environment, much like how a lighthouse’s beam reveals the presence of ships and coastal features.

Thanks to the power of AI, PanoRadar goes beyond this simple scanning strategy. Unlike a lighthouse that simply illuminates different areas as it rotates, PanoRadar cleverly combines measurements from all rotation angles to enhance its imaging resolution. While the sensor itself is only a fraction of the cost of typically expensive LiDAR systems, this rotation strategy creates a dense array of virtual measurement points, which allows PanoRadar to achieve imaging resolution comparable to LiDAR. “The key innovation is in how we process these radio wave measurements,” explains Zhao. “Our signal processing and machine learning algorithms are able to extract rich 3D information from the environment.”
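
PanoRadar’s actual processing pipeline is not described in detail here, but the idea of coherently combining returns from many rotation angles can be illustrated with a toy synthetic-aperture back-projection. The carrier frequencies, rotation radius, and reflector position below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Toy back-projection sketch (not PanoRadar's actual algorithm): a single
# radar antenna rotates on a small circle, records the round-trip phase of a
# point reflector at several frequencies per rotation angle, and the
# measurements are summed coherently over an image grid. Combining many
# rotation angles is what creates the dense set of virtual measurement points.

c = 3e8                                   # speed of light, m/s
freqs = np.linspace(76e9, 78e9, 32)       # assumed stepped-frequency sweep, Hz
radius = 0.05                             # assumed rotation radius, m
angles = np.linspace(0, 2 * np.pi, 180, endpoint=False)
antenna = radius * np.c_[np.cos(angles), np.sin(angles)]   # antenna positions

target = np.array([1.0, 0.4])             # true reflector position, m
r_meas = np.linalg.norm(antenna - target, axis=1)          # range per angle

# Simulated raw data: round-trip phase at each (angle, frequency) pair.
k = 2 * np.pi * freqs / c                                   # wavenumbers
data = np.exp(-1j * 2 * np.outer(r_meas, k))                # shape (angles, freqs)

# Back-projection: for each pixel, undo the expected phase and sum coherently.
xs = np.linspace(0.5, 1.5, 100)
ys = np.linspace(-0.1, 0.9, 100)
X, Y = np.meshgrid(xs, ys)
image = np.zeros_like(X, dtype=complex)
for pos, row in zip(antenna, data):
    r_pix = np.hypot(X - pos[0], Y - pos[1])                # pixel ranges
    image += (row[None, None, :] * np.exp(1j * 2 * r_pix[..., None] * k)).sum(-1)

peak = np.unravel_index(np.abs(image).argmax(), image.shape)
print("estimated reflector position:", xs[peak[1]], ys[peak[0]])
```

The peak of the coherent sum lands near the true reflector position because only there do the returns from all rotation angles and frequencies add in phase.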

One of the biggest challenges Zhao’s team faced was developing algorithms to maintain high-resolution imaging while the robot moves. “To achieve LiDAR-comparable resolution with radio signals, we needed to combine measurements from many different positions with sub-millimeter accuracy,” explains Lai, the lead author of the paper. “This becomes particularly challenging when the robot is moving, as even small motion errors can significantly impact the imaging quality.”
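
To see why sub-millimeter accuracy matters, consider the round-trip phase error introduced by a small position error. Assuming a millimeter-wave wavelength of roughly 4 mm (the operating frequency is not stated here), an error of just 1 millimeter already shifts the phase by about half a cycle:

```latex
% Round-trip phase error from a radial position error \Delta r at wavelength
% \lambda, evaluated for an assumed mmWave wavelength of about 4 mm:
\Delta\phi = \frac{4\pi\,\Delta r}{\lambda}
  \approx \frac{4\pi \times 1\ \mathrm{mm}}{4\ \mathrm{mm}} \approx \pi\ \mathrm{rad}
```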

Another challenge the team tackled was teaching their system to understand what it sees. “Indoor environments have consistent patterns and geometries,” says Luo. “We leveraged these patterns to help our AI system interpret the radar signals, similar to how humans learn to make sense of what they see.” During the training process, the machine learning model relied on LiDAR data to check its understanding against reality and was able to continue to improve itself.
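
The paper’s network architecture and training losses are not detailed here, but the LiDAR-supervised setup described above can be sketched as a simple regression from radar returns to depth, with co-registered LiDAR depths as the training target. All module names, shapes, and hyperparameters below are hypothetical.

```python
import torch
from torch import nn

# Hypothetical supervision sketch: a small network maps radar range profiles
# to per-beam depths, and co-registered LiDAR depths serve as ground truth.
class RadarToDepth(nn.Module):
    def __init__(self, n_bins=256, n_beams=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_bins, 512), nn.ReLU(),
            nn.Linear(512, n_beams),
        )

    def forward(self, x):
        return self.net(x)

model = RadarToDepth()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()                      # depth error in meters

# Stand-in tensors for one batch: radar profiles and matching LiDAR depths.
radar = torch.randn(32, 256)
lidar_depth = torch.rand(32, 64) * 10.0

for step in range(100):
    pred = model(radar)                    # predicted depths from radar
    loss = loss_fn(pred, lidar_depth)      # compare against LiDAR ground truth
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```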

“Our field tests across different buildings showed how radio sensing can excel where traditional sensors struggle,” says Liu. “The system maintains precise tracking through smoke and can even map spaces with glass walls.” This is because radio waves aren’t easily blocked by airborne particles, and the system can even “capture” things that LiDAR can’t, like glass surfaces. PanoRadar’s high resolution also means it can accurately detect people, a critical feature for applications like autonomous vehicles and rescue missions in hazardous environments.

Looking ahead, the team plans to explore how PanoRadar could work alongside other sensing technologies like cameras and LiDAR, creating more robust, multi-modal perception systems for robots. The team is also expanding their tests to include various robotic platforms and autonomous vehicles. “For high-stakes tasks, having multiple ways of sensing the environment is crucial,” says Zhao. “Each sensor has its strengths and weaknesses, and by combining them intelligently, we can create robots that are better equipped to handle real-world challenges.”

This study was conducted at the University of Pennsylvania School of Engineering and Applied Science and supported by a faculty startup fund.
