Breakthrough is a game changer in heart valve technology

Now, a team of UBC Okanagan researchers believes it has found a way to harness the strengths of both technologies, one that could be life-changing — and life-saving — for many. Dr. Hadi Mohammadi and his fellow researchers in the Heart Valve Performance Laboratory at UBC Okanagan are focused on developing the mechanical heart valves of the future.

Dr. Mohammadi, an Associate Professor with the School of Engineering, says their latest work, dubbed the iValve, is their most advanced yet and combines the best of both technologies — mechanical and tissue — when it comes to replacement heart valves.

“Tissue valves generally perform better than mechanical valves because of their shape, but last only 15 to 20 years on average, which would require another replacement. Mechanical valves can last a lifetime, but do not perform as well as tissue valves, requiring patients to take daily anticoagulants,” says Dr. Mohammadi.

“We have produced a new mechanical heart valve that combines the best of both worlds — offering the performance of tissue valves with the long-lasting durability of mechanical valves. We believe this valve could make life easier and safer for patients,” he adds.

The breakthrough valve was made possible through an international collaboration with ViVitro Labs and independent consultants Lawrence Scotten and Rolland Siegel. The research was funded by Angeleno Medical and published this month in the Journal of Biomechanics.

“This is the only valve of its kind to be designed and built in Canada,” notes Dr. Mohammadi. “We are incredibly proud of this valve as an example of the engineering innovation coming from UBC and Canada.”

Dr. Mohammadi also says while mechanical heart valve replacements have long been in use, the long-standing challenge has been to perfect the technology for the smallest hearts — tiny infants.

“What is particularly exciting about the iValve is that it was specifically designed for high-heart-rate applications, such as in pediatric patients,” explains Dr. Mohammadi.

Now that their prototype performs well in mechanical lab tests, the researchers will bring it to animal and clinical trials. If all goes well, they hope the iValve could be ready for those trials within two years.

In the meantime, they will also be using the technology and techniques to develop new valves.

“This valve is designed to allow blood flow to the aorta, which is the body’s largest artery, and the blood vessel that carries oxygen-rich blood away from the heart throughout your body,” explains Dr. Mohammadi. “Next, we will take what we have learned and develop one for the mitral valve. That valve is responsible for making sure that blood flows from your left atrium to your left ventricle. It also ensures that blood doesn’t flow backward between those two chambers.”

Heart Valve Performance Lab Manager Dr. Dylan Goode is excited about what the future holds for the iValve — and for the benefits it could bring to patients.

Dr. Goode began working with Dr. Mohammadi in 2018 while completing his Master of Applied Science in Mechanical Engineering. Recently, he successfully defended his doctoral dissertation, which documents his design, fabrication and testing of the iValve.

“We have shown that the iValve can provide the structural benefits of a mechanical heart valve and last a patient’s lifespan while providing improved hemodynamic performance, meaning an improvement of the way in which blood flows through vessels.”

Dr. Goode notes the new iValve could also mean a major improvement in lifestyle for patients who currently endure a routine of regular anticoagulant therapy — blood thinners — which can increase the risk of severe bleeding, blood clots or damage to tissues and organs if blood flow is impeded.


Experimental mRNA cancer vaccine shows potential for advanced stage cancer patients in Phase 1 trial

Interim data from the Phase I dose-escalation trial of the mRNA cancer immunotherapy mRNA-4359 show promise in patients with advanced solid cancers.

The investigational mRNA cancer immunotherapy is targeted at patients with lung cancer, melanoma and other solid tumours. Nineteen patients with advanced-stage cancers received between one and nine doses of the immunotherapy. Scientists found that it generated an immune response against the cancer and was well tolerated; adverse events included fatigue, injection-site pain and fever.

Results from the Phase I trial, also the first-in-human study of the therapy, are being presented on Saturday, 14 September at the European Society for Medical Oncology conference in Barcelona by the trial’s UK Chief Investigator from King’s College London and Guy’s and St Thomas’ NHS Foundation Trust. The trial is sponsored by Moderna.

The mRNA immunotherapy is just one of many cancer vaccines entering clinical trials around the world. The therapy works by presenting common markers of tumours to patients’ immune systems, training them to recognise and fight cancer cells that express those markers, and potentially to eliminate cells that could suppress the immune system.

The Phase I trial was designed primarily to test the safety and tolerability of the immunotherapy, with secondary and tertiary objectives of assessing radiographic and immunological responses.

In eight of the sixteen patients whose responses could be evaluated, tumour size did not grow and no new tumours appeared.

Data also showed the mRNA immunotherapy could activate the immune system in many patients, generating immune cells in the blood that could recognise the two proteins of interest (PD-L1 and IDO1). In some patients, researchers showed that the immunotherapy can increase levels of important immune cells that kill cancer cells and reduce levels of other immune cells that prevent the immune system from fighting cancer.

The results should be treated with caution, say the study authors, as the sample size was small and the primary objective of the study was to test for safety and determine the optimal dose of the immunotherapy. However, these promising early results support further research into mRNA-4359.

The trial continues to recruit patients with melanoma and lung cancer, who will receive the immunotherapy in combination with the drug pembrolizumab, to provide more information on the safety and efficacy of the therapy.

The UK’s Chief Investigator of the trial Dr Debashis Sarker, a Clinical Reader in Experimental Oncology at King’s College London and a consultant in medical oncology at Guy’s and St Thomas’ NHS Foundation Trust, said: “This study evaluating an mRNA cancer immunotherapy is an important first step in hopefully developing a new treatment for patients with advanced cancers.

“We have shown that the therapy is well tolerated without serious side effects and can stimulate the body’s immune system in a way that could help to treat cancer more effectively. However, as this study has only involved a small number of patients to date, it’s too early to say how effective this could be for people with advanced stage cancer.

“The trial continues to recruit patients with melanoma and lung cancers and is a huge international effort across the UK, USA, Spain and Australia.”

Kyle Holen, M.D., Moderna’s Senior Vice President and Head of Development, Therapeutics and Oncology, said: “We are encouraged by the Phase 1 results of mRNA-4359, which demonstrate its potential to elicit strong antigen-specific T-cell responses while maintaining a manageable safety profile.

“This novel approach could be a key component in shifting the tumour microenvironment toward a more immune-permissive state, offering potential hope for patients with advanced solid tumours.”


Key factors that impact long-term weight loss in patients prescribed GLP-1 RA medications

A Cleveland Clinic study identified key factors that can impact the long-term weight loss of patients with obesity who were prescribed injectable semaglutide or liraglutide for the treatment of type 2 diabetes or obesity. The study was published in JAMA Network Open.

“In patients with obesity who were prescribed semaglutide or liraglutide, we found that long-term weight reduction varied significantly based on the medication’s active agent, treatment indication, dosage and persistence with the medication,” said Hamlet Gasoyan, Ph.D., lead author of the study and a researcher with Cleveland Clinic’s Center for Value-Based Care Research.

Semaglutide (sold under the brand names Wegovy and Ozempic) and liraglutide (sold under the brand names Saxenda and Victoza) are glucagon-like peptide-1 receptor agonists, or GLP-1 RA medications. Those FDA-approved medications help lower blood sugar levels and promote weight loss.

Obesity is a complex chronic disease that affects more than 41% of the U.S. adult population. Clinical trials have shown that anti-obesity medications are effective; however, there are limited real-world data regarding the factors associated with long-term weight change and clinically significant weight loss.

In this study, the researchers identified key factors associated with the long-term weight loss of patients with obesity. They also pinpointed the factors linked to the probability of achieving 10% or more weight loss.

This retrospective cohort study included 3,389 adult patients with obesity who initiated treatment with injectable semaglutide or liraglutide between July 1, 2015, and June 30, 2022. Follow-up ended in July 2023.

At the start of the study, the median baseline body mass index among study participants was 38.5; 82.2% had type 2 diabetes as the treatment indication. Among the patients, 68.5% were white, 20.3% were Black, and 7.0% were Hispanic. More than half of the participants were female (54.7%). Most of the patients received treatment for type 2 diabetes: overall, 39.6% were prescribed semaglutide for type 2 diabetes, 42.6% liraglutide for type 2 diabetes, 11.1% semaglutide for obesity, and 6.7% liraglutide for obesity.

Results show that one year after the initial prescription’s fill, weight change was associated with the following factors:

  • Persistence with medication. On average, patients who were persistent with the medication at one year experienced a -5.5% weight change, versus -2.8% among patients who had 90 to 275 medication coverage days within the first year and -1.8% among those with fewer than 90 covered days.

Researchers found that four in 10 patients (40.7%) were persistent with their medication one year after their initial prescription’s fill. The proportion of patients who were persistent with semaglutide was 45.8% versus 35.6% in patients receiving liraglutide.

Among patients who persisted with their medication at 12 months, the average reduction in body weight was -12.9% with semaglutide for obesity, compared to -5.9% with semaglutide for type 2 diabetes. The reduction in body weight was -5.6% with liraglutide for obesity, compared to -3.1% with liraglutide for type 2 diabetes.

Studies have shown that achieving sustained weight loss of 10% or more provides clinically significant health benefits. With that in mind, Dr. Gasoyan and colleagues looked at the proportion of patients who achieved 10% or more weight reduction.

Overall, 37.4% of patients receiving semaglutide for obesity achieved 10% or more body weight reduction compared to 16.6% of patients receiving semaglutide for type 2 diabetes. In comparison, 14.5% of those receiving liraglutide for obesity achieved 10% or more body weight reduction versus 9.3% of those receiving liraglutide for type 2 diabetes.

Among patients who persisted with their medication one year after their initial prescriptions, the proportion who achieved 10% or more weight reduction was 61% with semaglutide for obesity, 23.1% with semaglutide for type 2 diabetes, 28.6% with liraglutide for obesity, and 12.3% with liraglutide for type 2 diabetes.
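The percent-change arithmetic behind these figures is straightforward; a minimal sketch (the weights below are hypothetical examples, not study data):

```python
def percent_weight_change(baseline_kg: float, current_kg: float) -> float:
    """Relative weight change from baseline, negative for weight loss."""
    return (current_kg - baseline_kg) / baseline_kg * 100

# A hypothetical 110 kg patient whose weight falls to 96.8 kg:
change = percent_weight_change(110.0, 96.8)   # ≈ -12.0
# The clinically significant threshold discussed in the study:
achieved_10_percent = change <= -10           # True
```

The same calculation, applied to each patient's baseline and one-year weights, is what yields the group-level percentages reported above.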

Based on the study’s multivariable analysis that accounted for relevant socio-demographic and clinical variables, the following factors were associated with higher odds of achieving 10% or more weight reduction one year after the initial prescriptions:

“Our findings could help inform patients and providers regarding some of the key factors that are associated with the probability of achieving sustained weight loss of a magnitude large enough to provide clinically significant health benefits,” said Dr. Gasoyan. “Having real-world data could help manage expectations regarding weight reduction with GLP-1 RA medications and reinforce that persistence is key to achieve meaningful results.”

In a previous study, Dr. Gasoyan and colleagues looked at the factors influencing the long-term use of anti-obesity medications. Future research will continue to explore patients’ persistence and health outcomes with GLP-1 RA medications.

Dr. Gasoyan is supported by a grant from the National Cancer Institute.


Antibody-drug conjugate found effective against brain metastases in patients with HER2-positive breast cancer

A drug that delivers chemotherapy directly to tumors has shown impressive activity against some of the hardest-to-reach cancer cells: those that have spread to the brain in patients with advanced HER2-positive breast cancer. The findings, from an international clinical trial led by Dana-Farber Cancer Institute researchers, reinforce earlier findings of the benefits of the drug — trastuzumab deruxtecan (T-DXd), an antibody-drug conjugate — in these patients, trial leaders say.

The results of the trial, dubbed the DESTINY-Breast12 study, were presented today at the European Society for Medical Oncology (ESMO) Congress 2024 in Barcelona, Spain, and published simultaneously in a paper in the journal Nature Medicine.

The findings point to T-DXd as a valuable new treatment option for patients with a particularly challenging form of cancer, researchers say. “As many as half of patients with HER2-positive breast cancer develop brain metastases, which often have a poorer prognosis than breast cancer that hasn’t spread to the brain,” says Nancy Lin, MD, leader of the trial and senior author of the study in Nature Medicine. Lin is the associate chief of the Division of Breast Oncology in Dana-Farber’s Susan F. Smith Center for Women’s Cancers and the director of the Metastatic Breast Cancer Program. Localized therapies such as surgery, radiosurgery, and radiation therapy to the brain are used to treat brain metastases, but the disease usually progresses in the central nervous system — the brain and spinal cord — within six to 12 months of treatment.

Trastuzumab deruxtecan consists of the drug deruxtecan — a chemotherapy agent — linked to an antibody that targets the HER2 protein on breast cancer cells. Trastuzumab itself is a mainstay treatment of HER2-positive breast cancer that has spread to other parts of the body, including the brain. But as with treatments directed specifically at the brain, patients receiving trastuzumab usually have their disease progress, often in the central nervous system.

“Additional systemic therapies for patients with brain metastases are urgently needed,” Lin remarks.

The DESTINY-Breast12 trial involved 504 patients with HER2-positive breast cancer treated at 78 cancer centers in Western Europe, Japan, Australia, and the U.S. Two hundred sixty-three participants had active or stable brain metastases and 241 had no brain metastases. All had received at least one therapy before enrolling in the trial.

After a median follow-up of 15.4 months, progression-free survival of participants with brain metastases — the length of time patients lived with the cancer before it worsened — was a median of 17.3 months, investigators found. Twelve-month progression-free survival was 61.6%. Seventy-one percent of participants had an intracranial objective response — a measurable decrease of their cancer in the central nervous system. As expected, there was also a high rate of response in tumors outside of the central nervous system in patients with or without brain metastases. Ninety percent of patients in both groups were alive a year after beginning T-DXd treatment.

The side effects associated with T-DXd were consistent with those reported in previous studies and included nausea, constipation, neutropenia (low levels of a type of white blood cells), fatigue, and anemia. Interstitial lung disease (ILD), a known risk of T-DXd, was observed at similar rates to prior studies, and vigilance to this potentially fatal side effect remains critical.

“Our data show that T-DXd has substantial and durable activity within the brain in patients with HER2-positive breast cancer that has metastasized there,” Lin says. “These results support the use of the drug going forward in this patient population.”


Babies born to women consuming a high-fat, sugary diet at greater risk of cardiovascular disease and diabetes in later life

Babies born to pregnant women with obesity are more likely to develop heart problems and diabetes as adults due to fetal damage caused by the high-fat, high-energy diet of their mother.

That’s the groundbreaking finding from a new study published in the Journal of Physiology that shows for the first time that maternal obesity alters a critical thyroid hormone in the fetal heart, disrupting its development.

Consuming a high-fat, sugary diet during pregnancy also increases the likelihood of the unborn baby becoming insulin resistant in adulthood, potentially triggering diabetes and causing cardiovascular disease. This is despite babies being a normal weight at birth.

University of South Australia researchers identified the link by analysing tissue samples from the fetuses of pregnant baboons fed a high-fat, high-energy diet at a biomedical research institute in the United States. They then compared these with samples from fetuses of baboons on a control diet.

Lead author, University of South Australia PhD candidate Melanie Bertossa, says the findings are significant because they demonstrate a clear link between an unhealthy diet high in saturated fats and sugar, and poor cardiovascular health.

“There has been a long-standing debate as to whether high-fat diets induce a hyper- or hypothyroid state in the fetal heart. Our evidence points to the latter,” Bertossa says.

“We found that a maternal high-fat, high-energy diet reduced concentrations of the active thyroid hormone T3, which acts like a switch around late gestation, telling the fetal heart to start preparing for life after birth. Without this signal, the fetal heart develops differently.”

Bertossa says that diets high in fat and sugar can alter the molecular pathways involved in insulin signalling and critical proteins involved in glucose uptake in the fetal heart. This increases the risk of cardiac insulin resistance, often leading to diabetes in adulthood.

“You’re born with all the heart cells you will ever have. The heart doesn’t make enough new heart muscle cells after birth to repair any damage, so changes that negatively impact these cells before birth could persist for a lifetime.

“These permanent changes could cause a further decline in heart health once children reach adolescence and adulthood when the heart starts to age.”

Senior author, UniSA Professor of Physiology Janna Morrison, says the study demonstrates the importance of good maternal nutrition in the leadup to pregnancy, not only for the mother’s sake but also for the health of the baby.

“Poor cardiac outcomes were seen in babies that had a normal birth weight — a sign that should guide future clinical practice,” Prof Morrison says.

“Cardiometabolic health screening should be performed on all babies born from these types of pregnancies, not just those born too small or too large, with the goal being to detect heart disease risks earlier.”

Prof Morrison says that if rising rates of high-fat, sugary diets are not addressed, more people will develop health complications such as diabetes and cardiovascular disease, which could result in shorter life spans in the decades ahead.

“Hopefully, with the knowledge we have now about the negative health impacts of obesity, there is potential to change this trajectory.”

The researchers are currently undertaking long-term studies of babies born to women on high-fat, high-energy diets to track their health over decades.


CRISPR/Cas9 modifies euglena to create potential biofuel source

News about biofuels sometimes mentions used cooking oil as a feedstock, but if these substances contain animal fat, they can solidify in colder temperatures. This happens because, chemically, the fatty acids of these and many other saturated fats have long carbon chains with single bonds. Enter the euglena. An Osaka Metropolitan University team has found a way to have one species of this microalgae produce wax esters with shorter carbon chains than usual.

Using CRISPR/Cas9 to edit the genome of Euglena gracilis, Dr. Masami Nakazawa and her team at the Graduate School of Agriculture’s Department of Applied Biochemistry produced stable mutants that created wax esters two carbons shorter than the wild-type species.

This improvement in the cold flow of the wax esters makes them more suitable as feedstock for biofuels. Among the factors favorable to using Euglena gracilis as a biofuel source are its ability to grow easily through photosynthesis and its anaerobic production of wax esters.

“This achievement is expected to serve as a fundamental technology for replacing some petroleum-based production of wax esters with biological sources,” Dr. Nakazawa affirmed.


Energy transmission in quantum field theory requires information

An international team of researchers has found a surprisingly simple relationship between the rates of energy and information transmission across an interface connecting two quantum field theories. Their work was published in Physical Review Letters on August 30.

The interface between different quantum field theories is an important concept that arises in a variety of problems in particle physics and condensed matter physics. However, it has been difficult to calculate the transmission rates of energy and information across interfaces.

Hirosi Ooguri, Professor at the Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU, WPI) at the University of Tokyo and Fred Kavli Professor at the California Institute of Technology, together with his collaborators, Associate Professor Yuya Kusuki at Kyushu University, and Professor Andreas Karch and graduate students Hao-Yu Sun and Mianqi Wang at the University of Texas at Austin, showed that for theories in two dimensions with scale invariance there are simple and universal inequalities between three quantities: the energy transmission rate, the information transmission rate, and the size of the Hilbert space (measured by the rate of increase of the number of states at high energy). Namely,

[ energy transmittance ] ≤ [ information transmittance ] ≤ [ size of the Hilbert space ].

These inequalities imply that, in order to transmit energy, information must also be transmitted, and both require a sufficient number of states. They also showed that no stronger inequality is possible.
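Written compactly (a schematic restatement of the relation above; the symbols are illustrative labels rather than the authors’ notation):

```latex
\mathcal{T}_{E} \;\le\; \mathcal{T}_{I} \;\le\; s ,
```

where \(\mathcal{T}_{E}\) denotes the energy transmittance across the interface, \(\mathcal{T}_{I}\) the information transmittance, and \(s\) the measure of the size of the Hilbert space.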

Both energy and information transmissions are important quantities, but they are difficult to calculate, and no relationship between them was known. By showing the inequality between these quantities, this paper sheds new light on this important but difficult problem.


New discovery aims to improve the design of microelectronic devices

A new study led by researchers at the University of Minnesota Twin Cities is providing new insights into how next-generation electronics, including memory components in computers, break down or degrade over time. Understanding the reasons for degradation could help improve the efficiency of data storage solutions.

The research is published in ACS Nano, a peer-reviewed scientific journal, and is featured on the journal’s cover.

Advances in computing technology continue to increase the demand for efficient data storage solutions. Spintronic magnetic tunnel junctions (MTJs) — nanostructured devices that use the spin of electrons to improve hard drives, sensors, and other microelectronic systems, including Magnetic Random Access Memory (MRAM) — offer promising alternatives for the next generation of memory devices.

MTJs have been the building blocks for the non-volatile memory in products like smart watches and in-memory computing with a promise for applications to improve energy efficiency in AI.

Using a sophisticated electron microscope, researchers looked at the nanopillars within these systems, which are extremely small, transparent layers within the device. The researchers ran a current through the device to see how it operates. As they increased the current, they were able to observe how the device degrades and eventually dies in real time.

“Real-time transmission electron microscopy (TEM) experiments can be challenging, even for experienced researchers,” said Dr. Hwanhui Yun, first author on the paper and postdoctoral research associate in the University of Minnesota’s Department of Chemical Engineering and Material Sciences. “But after dozens of failures and optimizations, working samples were consistently produced.”

By doing this, they discovered that over time, with a continuous current, the layers of the device get pinched and cause the device to malfunction. Previous research theorized this, but this is the first time researchers have been able to observe the phenomenon. Once the device forms a “pinhole” (the pinch), it is in the early stages of degradation. As the researchers continued to increase the current, the device melted down and completely burned out.

“What was unusual with this discovery is that we observed this burn out at a much lower temperature than what previous research thought was possible,” said Andre Mkhoyan, a senior author on the paper and professor and Ray D. and Mary T. Johnson Chair in the University of Minnesota Department of Chemical Engineering and Material Sciences. “The temperature was almost half of the temperature that had been expected before.”

Looking more closely at the device at the atomic scale, the researchers realized that materials this small have very different properties, including melting temperature. This means the device can fail on a very different time frame than previously known.

“There has been a high demand to understand the interfaces between layers in real time under real working conditions, such as applying current and voltage, but no one has achieved this level of understanding before,” said Jian-Ping Wang, a senior author on the paper and a Distinguished McKnight Professor and Robert F. Hartmann Chair in the Department of Electrical and Computer Engineering at the University of Minnesota.

“We are very happy to say that the team has discovered something that will be directly impacting the next generation microelectronic devices for our semiconductor industry,” Wang added.

The researchers hope this knowledge can be used in the future to improve design of computer memory units to increase longevity and efficiency.

In addition to Yun, Mkhoyan, and Wang, the team included University of Minnesota Department of Electrical and Computer Engineering postdoctoral researcher Deyuan Lyu, research associate Yang Lv, former postdoctoral researcher Brandon Zink, and researchers from the University of Arizona Department of Physics.

This work was funded by SMART, one of seven centers of nCORE, a Semiconductor Research Corp. program sponsored by the National Institute of Standards and Technology (NIST); University of Minnesota Grant-in-Aid funding; National Science Foundation (NSF); and Defense Advanced Research Projects Agency (DARPA). The work was completed in collaboration with the University of Minnesota Characterization Facility and the Minnesota Nano Center.


Bacteria work together to thrive in difficult conditions

Though a founding concept of ecology suggests that the physical environment determines where organisms can survive, modern scientists have suspected there is more to the story of how microbial communities form in the soil.

In a new study, researchers have determined through both statistical analysis and in experiments that soil pH is a driver of microbial community composition — but that the need to address toxicity released during nitrogen cycling ultimately shapes the final microbial community.

“The physical environment is affecting the nature of microbial interactions, and that affects the assembly of the community,” said co-lead author Karna Gowda, assistant professor of microbiology at The Ohio State University. “People in the field understood these two things must be important at some level, but there wasn’t a lot of evidence for it. We’re adding some specificity and mechanisms to this idea.”

The work helps clarify the microbial underpinnings of global nitrogen cycling and may provide a new way to think about emissions of nitrous oxide, a potent greenhouse gas, Gowda said.

The research was published recently in Nature Microbiology.

Microbes keep soil healthy and productive by recycling nutrients, and are particularly important for converting nitrogen into forms that plants can use. Underground organisms living in the same environment are also highly interconnected, preying on each other, participating in chemical exchanges and providing community benefits.

For this work, Gowda and colleagues used a dataset from a worldwide collection of topsoil samples, sequencing the genomes of microbes present in the samples and analyzing important characteristics of the soil — such as nitrogen and carbon content and pH, a measure of soil’s acidity.

“We wanted to look at trends that were widespread and that would manifest around the planet across very different environments,” Gowda said.

With billions of bacteria present in a sample of soil, the researchers relied on the genetic makeup of microbial communities to determine their functional roles.

The team zeroed in on genes that identified which bacteria were involved in denitrification — converting nitrogen compounds from bioavailable forms into nitrous oxide and dinitrogen gas that are released into the atmosphere. A bioinformatics analysis showed that soil pH was the most important environmental factor associated with the abundance of these organisms.

To test the statistical finding, the researchers conducted lab enrichment experiments, growing a natural microbial community under different conditions.

During denitrification, specific enzymes have roles in the conversion of nitrate into various nitrogen-containing compounds. One of these forms, nitrite, is more toxic in acidic soil (low pH) than it is under neutral conditions with higher pH.

The experiments showed that strains with enzymes called Nar, linked to creating toxic nitrite, and strains with enzymes called Nap, linked to consuming nitrite, fluctuated based on the acidity of the soil.

“We found more of Nar at low pH and less of Nap, and vice versa as the soil pH moved toward neutral,” Gowda said. “So we see two different types of organisms prevalent at acidic versus neutral pH, but we also find that that’s actually not explaining what’s going on. It’s not just the environment that’s determining who’s there — it’s actually the environment plus interactions between more organisms in the community.

“This means that pH is affecting the interaction between organisms in the community in a more or less consistent way — it’s always about the toxicity of nitrite. And this highlights how different bacteria work together to thrive in varying soil pH levels.”

That finding was novel and important, Gowda said. Bacteria and other microorganisms are known to be driven by a will to survive, but they also rely on each other to stay safe — and that cooperation has implications for environmental health, the research suggests.

“While individual fitness effects clearly play a role in defining patterns in many contexts, interactions are likely essential to explaining patterns in a variety of other contexts,” the authors wrote.

Understanding how interactions and the environment affect nitrous oxide emissions could provide new insights into reducing this potent greenhouse gas, Gowda said: Denitrifying bacteria are key sources and sinks of nitrous oxide in agricultural soils. While past studies have focused on the behavior of these nitrous oxide-emitting organisms in different pH conditions, considering their ecological interactions may offer new strategies to lower emissions.

This work was supported by the National Science Foundation, the University of Chicago, the National Institute of General Medical Sciences, a James S. McDonnell Foundation Postdoctoral Fellowship Award, and a Fannie and John Hertz Fellowship Award.

Co-authors include Seppe Kuehn, Kyle Crocker, Kiseok Keith Lee, Milena Chakraverti-Wuerthwein and Zeqian Li of the University of Chicago; Mikhail Tikhonov of Washington University in St. Louis; and Madhav Mani of Northwestern University.


Unveiling the math behind your calendar

In a world where organizing a simple meeting can feel like herding cats, new research from Case Western Reserve University reveals just how challenging finding a suitable meeting time becomes as the number of participants grows.

The study, published in the European Physical Journal B, dives into the mathematical complexities of this common task, offering new insights into why scheduling often feels so impossible.

“If you like to think the worst about people, then this study might be for you,” quipped researcher Harsh Mathur, professor of physics at the College of Arts and Sciences at CWRU. “But this is about more than Doodle polls. We started off by wanting to answer this question about polls, but it turns out there is more to the story.”

Researchers used mathematical modeling to calculate the likelihood of successfully scheduling a meeting based on several factors: the number of participants (m), the number of possible meeting times (τ) and the number of times each participant is unavailable (r).

What they found: As the number of participants grows, the probability of scheduling a successful meeting decreases sharply.

Specifically, the probability drops significantly when more than five people are involved — especially if participant availability remains consistent.
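The article doesn’t reproduce the paper’s formulas, but the setup it describes — m participants, τ candidate times, each participant unavailable at r of them — is easy to explore with a small Monte Carlo sketch. The simulation below is an illustration under a simplifying assumption (each participant’s r busy slots are drawn uniformly at random and independently), not the authors’ exact model:

```python
import random

def schedule_success_prob(m, tau, r, trials=20000):
    """Estimate the probability that m participants, each unavailable at
    r of tau candidate slots (chosen uniformly at random), still share
    at least one mutually free slot."""
    successes = 0
    for _ in range(trials):
        free = set(range(tau))  # slots still open to everyone so far
        for _ in range(m):
            busy = set(random.sample(range(tau), r))
            free -= busy
            if not free:
                break  # no common slot remains
        if free:
            successes += 1
    return successes / trials

# The chance of finding a common slot falls quickly as m grows:
for m in (2, 5, 8):
    print(m, schedule_success_prob(m, tau=10, r=3))
```

Even with modest unavailability (3 busy slots out of 10), the estimated success probability visibly erodes as the group grows past a handful of people, matching the trend the researchers report.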

“We wanted to know the odds,” Mathur said. “The science of probability actually started with people studying gambling, but it applies just as well to something like scheduling meetings. Our research shows that as the number of participants grows, the number of potential meeting times that need to be polled increases exponentially.

“The project had started half in jest, but this exponential behavior got our attention. It showed that scheduling meetings is a difficult problem, on par with some of the great problems in computer science.”

‘More to the story’

Interestingly, researchers found a parallel between scheduling difficulties and physical phenomena. They observed that as the probability of a participant rejecting a proposed meeting time increases, there’s a critical point where the likelihood of successfully scheduling the meeting drops sharply. It’s a phenomenon similar to what is known as “phase transitions” in physics, Mathur said, such as ice melting into water.

“Understanding phase transitions mathematically is a triumph of physics,” he said. “It’s fascinating how something as mundane as scheduling can mirror the complexity of phase transitions.”

Mathur also noted the study’s broader implications, from casual scenarios like sharing appetizers at a restaurant to more complex settings like drafting climate policy reports, where agreement among many is needed.

“Consensus-building is hard,” Mathur said. “Like phase transitions, it’s complex. But that’s also where the beauty of mathematics lies — it gives us tools to understand and quantify these challenges.”

Mathur said the study contributes insights into the complexities of group coordination and decision-making, with potential applications across various fields.

Joining Mathur in the study were physicists Katherine Brown, of Hamilton College, and Onuttom Narayan, of the University of California, Santa Cruz.
