More families refusing to donate relatives’ organs

Hundreds of patients died waiting for a transplant last year, amid concerns about consent rates.

Homemade ‘play-putty’ can read the body’s electric signals

A new study by University of Massachusetts Amherst researchers demonstrates the effectiveness of homemade play putty at reading brain, heart, muscle and eye activity. Published in Device, the research outlines the conductive properties of this material, dubbed “squishy circuits.”

“[Squishy circuits] are literally child’s play putty that is also conductive,” says Dmitry Kireev, assistant professor of biomedical engineering and senior author on the paper.

The conductive squishy circuits — whether homemade or store-bought — are made of flour, water, salt, cream of tartar and vegetable oil. “Salt is what makes it conductive,” Kireev explains. As a child’s toy, this modeling clay is a malleable way to teach kids about circuits: lights can be added to an art project by connecting them to a power source through the putty. Now, Kireev and his team have demonstrated that the material has more potential.

“We used the squishy circuits as an interface to measure electricity or measure bioelectrical potentials from a human body,” he says. They found that, compared to commercially available gel electrodes, these squishy circuits effectively captured various electrophysiology measurements: electroencephalogram (EEG) for brain activity, electrocardiogram (ECG) for heart recordings, electrooculogram (EOG) for tracking eye movement and electromyography (EMG) for muscle contraction.

“What makes one electrode material better than another in terms of the quality of the measurements is impedance,” he explains. Impedance is a measure that describes the quality of conductivity between two materials. “The lower the impedance between the electrode and the tissue, the better the conductivity in between and the better your ability to measure those bioelectrical potentials.”
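To make the role of impedance concrete, here is a minimal sketch using a common simplification of the electrode-skin interface: a series resistance plus a parallel resistor-capacitor element. The component values are hypothetical placeholders, not measurements from the study.

```python
# Toy electrode-skin interface model: Z(f) = R_s + (R_ct in parallel with C_dl).
# Lower |Z| at signal frequencies means better pickup of bioelectrical potentials.
import numpy as np

def electrode_impedance(f, R_s, R_ct, C_dl):
    """Impedance magnitude of a series R_s plus parallel R_ct || C_dl interface."""
    omega = 2 * np.pi * f
    z_parallel = R_ct / (1 + 1j * omega * R_ct * C_dl)
    return np.abs(R_s + z_parallel)

f = 50.0  # Hz, within typical EEG/ECG signal bands
# Hypothetical component values for two electrodes being compared.
print("electrode A |Z| =", electrode_impedance(f, R_s=1e3, R_ct=50e3, C_dl=1e-6), "ohm")
print("electrode B |Z| =", electrode_impedance(f, R_s=2e3, R_ct=80e3, C_dl=5e-7), "ohm")
```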

The study found that the impedance of the squishy-circuit electrode was on par with one of the commercially available gel electrodes and roughly half that of a second comparison electrode.

Kireev highlights several benefits to this material. First is cost: even using pre-made putty, the cost per electrode was about one cent. Typical electrodes cost on average between $0.25 and $1.

Also, the material is resilient: it can be formed and reformed, molded to the contours of the skin, combined with more putty to make it bigger, reused and easily reconnected if it comes apart. Other comparable state-of-the-art wearable bioelectronics have been made of carbon nanotubes, graphene, silver nanowires and organic polymers. While highly conductive, these materials can be expensive, difficult to handle or make, single use or fragile.

Kireev also highlights the availability of these materials. “It’s something you can do at home or in high school laboratories, for example, if needed,” he says. “You can democratize these applications [so it’s] more widespread.”

He gives credit to his research team of undergraduate students (some of whom have since graduated and are continuing with graduate studies at UMass): Alexandra Katsoulakis, Favour Nakyazze, Max Mchugh, Sean Morris, Monil Bhavsar and Om Tank.

Magnifying deep space through the ‘carousel lens’

In a rare and extraordinary discovery, researchers have identified a unique configuration of galaxies that form the most exquisitely aligned gravitational lens found to date. The Carousel Lens is a massive cluster-scale gravitational lens system that will enable researchers to delve deeper into the mysteries of the cosmos, including dark matter and dark energy.

“This is an amazingly lucky ‘galactic line-up’ — a chance alignment of multiple galaxies across a line-of-sight spanning most of the observable universe,” said David Schlegel, a co-author of the study and a senior scientist in Berkeley Lab’s Physics Division. “Finding one such alignment is a needle in the haystack. Finding all of these is like eight needles precisely lined up inside that haystack.”

The Carousel Lens is an alignment consisting of one foreground galaxy cluster (the ‘lens’) and seven background galaxies spanning immense cosmic distances and seen through the gravitationally distorted space-time around the lens. In the dramatic image below:

  • The lensing cluster, located 5 billion light years away from Earth, is shown by its four brightest and most massive galaxies (indicated by La, Lb, Lc, and Ld), and these constitute the foreground of the image.
  • Seven unique galaxies (numbered 1 through 7) appear through the lens. These are located far beyond, at distances from 7.6 to 12 billion light years away from Earth, approaching the limit of the observable universe.
  • Each galaxy’s repeated appearances (indicated by each number’s letter index, e.g., a through d) show how its shape is curved and stretched into multiple “fun house mirror” images by the warped space-time around the lens.
  • Of particular interest is the discovery of an Einstein Cross — the largest known to date — shown in galaxy number 4’s multiple appearances (indicated by 4a, 4b, 4c, and 4d). This rare configuration of multiple images around the center of the lens is an indication of the symmetrical distribution of the lens’ mass (dominated by invisible dark matter) and plays a key role in the lens-modeling process.

Light traveling from far-distant space can be magnified and curved as it passes through the gravitationally distorted space-time of nearer galaxies or clusters of galaxies. In rare instances, a configuration of objects aligns nearly perfectly to form a strong gravitational lens. Using an abundance of new data from the Dark Energy Spectroscopic Instrument (DESI) Legacy Imaging Surveys, recent observations from NASA’s Hubble Space Telescope, and the Perlmutter supercomputer at the National Energy Research Scientific Computing Center (NERSC), the research team built on their earlier studies (in May 2020 and Feb 2021) to identify likely strong lens candidates, laying the groundwork for the current discovery.
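As context for the lens-modelling step, the characteristic angular scale of such image configurations is the Einstein radius, a textbook relation (not a result reported in this study) that ties the observed image separations to the mass enclosed by the lens and the distances involved:

$$ \theta_E = \sqrt{\frac{4GM}{c^{2}}\,\frac{D_{LS}}{D_L D_S}} $$

where $M$ is the lensing mass and $D_L$, $D_S$ and $D_{LS}$ are the angular-diameter distances to the lens, to the source, and between lens and source.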

“Our team has been searching for strong lenses and modeling the most valuable systems,” explains Xiaosheng Huang, a study co-author and member of Berkeley Lab’s Supernova Cosmology Project, and a professor of physics and astronomy at the University of San Francisco. “The Carousel Lens is an incredible alignment of seven galaxies in five groupings that line up nearly perfectly behind the foreground cluster lens. As they appear through the lens, the multiple images of each of the background galaxies form approximately concentric circular patterns around the foreground lens, as in a carousel. It’s an unprecedented discovery, and the computational model generated shows a highly promising prospect for measuring the properties of the cosmos, including those of dark matter and dark energy.”

The study also involved several Berkeley Lab student researchers, including the lead author, William Sheu, an undergraduate student intern with DESI at the beginning of this study, now a PhD student at UCLA and a DESI collaborator.

The Carousel Lens will enable researchers to study dark energy and dark matter in entirely new ways based on the strength of the observational data and its computational model.

“This is an extremely unusual alignment, which by itself will provide a testbed for cosmological studies,” observes Nathalie Palanque-Delabrouille, director of Berkeley Lab’s Physics Division. “It also shows how the imaging done for DESI can be leveraged for other scientific applications,” such as investigating the mysteries of dark matter and the accelerating expansion of the universe, which is driven by dark energy.

Artificial intelligence grunt work can be outsourced using a new blockchain-based framework developed by Concordians

Tomorrow’s workplace will be run on mind-boggling amounts of data. To make sense of it all, businesses, developers and individuals will need better artificial intelligence (AI) systems, better trained AI workers and more efficient number-crunching servers.

While big tech companies have the resources and expertise to meet these demands, they remain beyond the reach of most small and medium-sized enterprises and individuals. To respond to this need, a Concordia-led international team of researchers has developed a new framework to make complex AI tasks more accessible and transparent to users.

The framework, described in an article published in the journal Information Sciences, specializes in providing solutions to deep reinforcement learning (DRL) requests. DRL is a subset of machine learning that combines deep learning, which uses layered neural networks to find patterns in huge data sets, and reinforcement learning, in which an agent learns how to make decisions by interacting with its environment based on a reward/penalty system.
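The following is a minimal, self-contained sketch of the DRL idea described above, assuming a toy one-dimensional environment and a small PyTorch Q-network; it illustrates the reward-driven learning loop, not the framework published in Information Sciences.

```python
# Minimal DRL sketch: a small neural network (the "deep" part) estimates action
# values while an agent learns by trial and error from rewards and penalties
# (the "reinforcement" part). Environment, sizes and hyperparameters are toy choices.
import random
import numpy as np
import torch
import torch.nn as nn

class LineWalk:
    """Toy environment: start at 0, reach +5 for a reward, -5 for a penalty."""
    def reset(self):
        self.pos = 0
        return np.array([self.pos], dtype=np.float32)
    def step(self, action):            # action: 0 = step left, 1 = step right
        self.pos += 1 if action == 1 else -1
        done = abs(self.pos) >= 5
        reward = 1.0 if self.pos >= 5 else (-1.0 if self.pos <= -5 else 0.0)
        return np.array([self.pos], dtype=np.float32), reward, done

qnet = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 2))  # state -> action values
opt = torch.optim.Adam(qnet.parameters(), lr=1e-3)
env, gamma, eps = LineWalk(), 0.99, 0.1

for episode in range(200):
    state, done = env.reset(), False
    while not done:
        s = torch.tensor(state).unsqueeze(0)
        # Epsilon-greedy: mostly exploit the network's estimate, sometimes explore.
        action = random.randrange(2) if random.random() < eps else int(qnet(s).argmax())
        next_state, reward, done = env.step(action)
        with torch.no_grad():
            future = 0.0 if done else gamma * qnet(torch.tensor(next_state).unsqueeze(0)).max().item()
        pred = qnet(s)[0, action]
        loss = (pred - (reward + future)) ** 2   # nudge the estimate toward the observed reward
        opt.zero_grad(); loss.backward(); opt.step()
        state = next_state
```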

DRL is used in industries as diverse as gaming, robotics, health care and finance.

The framework pairs developers, companies and individuals that have specific but out-of-reach AI needs with service providers who have the resources, expertise and models they require. The service is crowdsourced, built on a blockchain and uses a smart contract — a contract with a pre-defined set of conditions built into the code — to match the users with the appropriate service provider.
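The article does not spell out the contract logic, so the following hypothetical Python mock only illustrates the kind of matching rule such a smart contract could encode: pick the eligible provider with the best reputation within the requester’s budget. A real implementation would live on-chain.

```python
# Hypothetical matching rule for illustration only; names and fields are invented.
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    expertise: set     # e.g. {"DRL", "robotics"}
    rating: float      # reputation built from past tasks
    price: float       # asking price for a task

@dataclass
class Request:
    task: str          # e.g. "DRL"
    budget: float

def match(request, providers):
    """Return the highest-rated provider that covers the task within budget."""
    eligible = [p for p in providers if request.task in p.expertise and p.price <= request.budget]
    if not eligible:
        raise ValueError("no provider satisfies the request's conditions")
    return max(eligible, key=lambda p: p.rating)

providers = [Provider("lab_a", {"DRL"}, 4.2, 80.0), Provider("lab_b", {"DRL", "vision"}, 4.8, 95.0)]
print(match(Request("DRL", 100.0), providers).name)   # -> lab_b
```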

“Crowdsourcing the process of training and designing DRL makes the process more transparent and more accessible,” says Ahmed Alagha, a PhD candidate at the Gina Cody School of Engineering and Computer Science and the paper’s lead author.

“With this framework, anyone can sign up and build a history and profile. Based on their expertise, training and ratings, they can be allocated tasks that users are requesting.”

Democratizing DRL

According to his co-author and thesis supervisor Jamal Bentahar, a professor at the Concordia Institute for Information Systems Engineering, this service opens the potential offered by DRL to a much wider population than was previously available.

“To train a DRL model, you need computational resources that are not available to everyone. You also need expertise. This framework offers both,” he says.

The researchers believe that their system’s design will reduce costs and risk by distributing computation efforts via the blockchain. The potentially catastrophic consequences of a server crash or malicious attack are mitigated by having dozens or hundreds of other machines working on the same problem.

“If a centralized server fails, the whole platform goes down,” Alagha explains. “Blockchain gives you distribution and transparency. Everything is logged on it, so it is very difficult to tamper with.”

The difficult and costly process of training a model to work properly can be shortened by having an existing model available that only requires some relatively minor adjustments to fit a user’s particular needs.

“For instance, suppose a large city develops a model that can automate traffic light sequences to optimize traffic flow and minimize accidents. Smaller cities or towns may not have the resources to develop one on their own, but they can use the one the big city developed and adapt it for their own circumstances.”
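A hypothetical sketch of that reuse-and-adapt idea, assuming the shared model is a small PyTorch policy network: freeze the layers trained on the large city’s data and retrain only the output layer on the smaller town’s data. The network shape, file name and sizes are illustrative, not from the paper.

```python
# Illustrative transfer of an existing model to a new setting; all names are hypothetical.
import torch
import torch.nn as nn

# 8 traffic features in, 4 light-phase actions out (made-up sizes).
policy = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 4))
# policy.load_state_dict(torch.load("big_city_policy.pt"))  # pretrained weights (placeholder path)

for param in policy[0].parameters():       # freeze the shared feature layer
    param.requires_grad = False

# Optimise only the final layer on the smaller town's own traffic data.
optimizer = torch.optim.Adam(policy[2].parameters(), lr=1e-4)
```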

Hadi Otrok, Shakti Singh and Rabeb Mizouni of Khalifa University in Abu Dhabi contributed to this study.

Health warning over face-slap fighting

Doctors worry about brain damage from a new type of championship fighting that has grown in popularity.

New XEC Covid variant: What are the symptoms and is it spreading in the UK?

XEC has some new mutations that might help it spread this autumn, scientists say.

Buffer zones set to come in around abortion clinics

Those who break the law protecting areas around abortion clinics could face an unlimited fine.

NHS junior doctors to be known as resident doctors after job title change

They will now be called resident doctors in an attempt to better reflect their expertise.

How plant communities change when conquering uninhabited ground

Some plants are able to take over uninhabited spaces like sand dunes, volcanic substrates and rockfall areas. The first colonizers have specific traits that allow them to grow in such hostile environments. Other plants lack such traits but soon follow these pioneers. Ricardo Martínez-García from the Center for Advanced Systems Understanding (CASUS), an institute of the Helmholtz-Zentrum Dresden-Rossendorf (HZDR), and collaborators from Spain and Brazil investigated the type of interaction between different species on these newly conquered grounds with the help of a mathematical model based on existing root physiology knowledge. Their new model connects the type of species interaction with the general availability of a scarce soil resource. It also reveals the best strategy for pioneer plants that can harness a resource that is not otherwise freely available.

Plants interact in many different ways, and very often one individual supports another from a different species. The term experts use for that is facilitation. With symbiotic facilitation, both plants support each other. With commensalistic facilitation, the nurtured plant affects its benefactor neither positively nor negatively. The third type is called antagonistic facilitation. Here, the nurtured partner benefits at the expense of the benefactor. The benefactor, for example, leaves a self-produced resource to the partner even though it could use it itself. The benefactor appears to “accept” this situation: it shows neither a defensive reaction against the removal of the resource nor a complete halt to its production.

“There is an ongoing debate about whether antagonistic facilitation actually exists. Our study provides a clear result: this type of interaction between plant species could occur in nature,” says Dr. Ricardo Martínez-García, CASUS Young Investigator and corresponding author of the study, which will be published in the second October issue of the journal New Phytologist. “Proving antagonistic facilitation experimentally requires a big effort. To begin with, it must be ruled out that the interaction is symbiotic or commensalistic facilitation. In addition, it has to be shown that it is not classic competition, where both plants harm each other in the fight for resources.”

Plants as miners

Martínez-García, together with Ciro Cabal (King Juan Carlos University, Madrid, Spain) and Gabriel A. Maciel (South American Institute for Fundamental Research, São Paulo, Brazil), the two lead authors of the study, focused their modelling efforts on an example from nature where the existence of antagonistic facilitation had long been suspected: pioneer plants that begin to grow on uninhabited ground, with other plants soon following in their tracks. Such pioneer plants can engineer their environment to increase the availability of certain scarce soil resources like nitrogen and phosphorus. These abilities certainly help them thrive, and they do not seem to be bothered by opportunistic plants that appear and help themselves at the buffet. The bottom line is that the pioneer still benefits from its special traits. From an experimental point of view, the pioneer-plant example appears to be a manageable system. Nevertheless, practitioners have not yet been able to determine the type of interaction in this example with certainty. The results of the model presented here are now a strong argument for the existence of antagonistic facilitation in these pioneer areas. It stands to reason that this type of interaction probably exists not only here, but elsewhere in nature as well.

“Our model also shows that plant interactions are an emergent property of resource availability,” adds Cabal. “It turned out that in environments with low and intermediate resource availability, antagonistic facilitation is the best strategy. This, too, was suggested some time ago but had so far not been backed by either experimental data or theoretical models.” The research team was thus not only able to provide reliable evidence for the general existence of this type of interaction; it also showed that antagonistic facilitation is even the optimal interaction between two plant communities under some environmental conditions.

As the soil changes over time and more and more plant species flourish, the interaction between the species changes. Although the pioneer species continue to increase the availability of resources, this no longer benefits the other plants much, because resources are now generally plentiful. The phase of antagonistic facilitation is over and all plants compete against each other. Further into the future, the mining capability of the pioneers even becomes a burden in this competition, putting them at a disadvantage. In the end, other plant species prevail and pioneer plants are no longer to be found on the site.

How root modeling helps to explain ecological patterns

Modeling is an important tool in ecology because it allows researchers to test hypotheses and explore ideas that are hard to investigate in field or laboratory experiments. In these cases, computational simulations can help in understanding ecological dynamics and patterns and even guide the design of field and lab experiments. In a 2020 Science paper, Cabal, Martínez-García and others presented a mathematical model that predicts the density and spatial distribution of roots of interacting plants. A comparison with greenhouse experiments showed close agreement with the model’s predictions.

For the New Phytologist study, the 2020 root model has been extended and refined to represent the interaction of pioneer plants with their environment as well as with other plants. Among other things, the model takes into account the dynamics of an in-demand soil resource (input, decay, availability to the plants, the mining trait of the pioneer plants), the size and shape of the plants’ root systems, and the costs of growing and maintaining roots, mining the resource and transporting it within the plants.
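As a rough illustration of that cost-benefit bookkeeping (a toy sketch only; the functional forms, parameter values and names below are hypothetical and not taken from the published model), one can compare the net payoff of a resource-mining pioneer with that of a follower free-riding on the enriched patch:

```python
# Toy payoff comparison: uptake benefit minus root and mining costs; all values invented.
import numpy as np

x = np.linspace(0.0, 1.0, 50)                       # soil positions (arbitrary units)

def net_payoff(root_density, resource, uptake_rate=1.0, root_cost=0.3, mining_cost=0.0):
    uptake = uptake_rate * resource * root_density / (1.0 + root_density)   # saturating uptake
    cost = (root_cost + mining_cost) * root_density                         # build/maintain + mining
    return float(np.sum(uptake - cost))

# The pioneer pays a mining cost to raise local resource availability;
# the follower places roots nearby and benefits from the enrichment for free.
base_resource = 0.2 * np.ones_like(x)
mined_resource = base_resource + 0.5 * np.exp(-((x - 0.5) ** 2) / 0.02)

pioneer_roots = 0.8 * np.exp(-((x - 0.5) ** 2) / 0.02)
follower_roots = 0.8 * np.exp(-((x - 0.55) ** 2) / 0.02)

print("pioneer payoff :", net_payoff(pioneer_roots, mined_resource, mining_cost=0.2))
print("follower payoff:", net_payoff(follower_roots, mined_resource))
```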

New understanding of the limits on nano-noise

Thanks to nanoscale devices as small as human cells, researchers can create groundbreaking material properties, leading to smaller, faster, and more energy-efficient electronics. However, to fully unlock the potential of nanotechnology, addressing noise is crucial. A research team at Chalmers University of Technology, in Sweden, has taken a significant step toward unraveling fundamental constraints on noise, paving the way for future nanoelectronics.

Nanotechnology is rapidly advancing, capturing widespread interest across industries such as communications and energy production. At the nanoscale — a nanometer is a millionth of a millimeter — particles obey quantum mechanical laws. By harnessing these properties, materials can be engineered to exhibit enhanced conductivity, magnetism, and energy efficiency.

“Today, we witness the tangible impact of nanotechnology — nanoscale devices are ingredients in faster technologies and nanostructures make materials for power production more efficient,” says Janine Splettstösser, Professor of Applied Quantum Physics at Chalmers.

Devices smaller than a human cell unlock novel electronic and thermoelectric properties

To manipulate charge and energy currents down to the single-electron level, researchers use so-called nanoscale devices, systems smaller than human cells. These nanoelectronic systems can act as “tiny engines” performing specific tasks, leveraging quantum mechanical properties.

“At the nanoscale, devices can have entirely new and desirable properties. These devices, which are a hundred to ten thousand times smaller than a human cell, make it possible to design highly efficient energy conversion processes,” says Ludovico Tesser, PhD student in Applied Quantum Physics at Chalmers University of Technology.

Navigating nano-noise: a crucial challenge

However, noise poses a significant hurdle in advancing this nanotechnology research. This disruptive noise is created by electrical charge fluctuations and thermal effects within devices, hindering precise and reliable performance. Despite extensive efforts, researchers have yet to find out to what extent this noise can be eliminated without hindering energy conversion, and our understanding of its mechanisms remains limited. But now a research team at Chalmers has taken an important step in the right direction.

In their recent study, published as editor’s suggestion in Physical Review Letters, they investigated thermoelectric heat engines at the nanoscale. These specialised devices are designed to control and convert waste heat into electrical power.

“All electronics emit heat and recently there has been a lot of effort to understand how, at the nano-level, this heat can be converted to useful energy. Tiny thermoelectric heat engines take advantage of quantum mechanical properties and nonthermal effects and, like tiny power plants, can convert the heat into electrical power rather than letting it go to waste,” says Professor Splettstösser.

Balancing noise and power in nanoscale heat engines

However, nanoscale thermoelectric heat engines work better when subject to significant temperature differences. These temperature variations make the already challenging noise researchers are facing even trickier to study and understand. But now, the Chalmers researchers have managed to shed light on a critical trade-off between noise and power in thermoelectric heat engines.

“We can prove that there is a fundamental constraint to the noise that directly affects the performance of the ‘engine’. For example, we can see not only that if you want the device to produce a lot of power you need to tolerate higher noise levels, but also exactly how much noise. It clarifies a trade-off relation, that is, how much noise one must endure to extract a specific amount of power from these nanoscale engines. We hope that these findings can serve as a guideline for designing nanoscale thermoelectric devices with high precision going forward,” says Ludovico Tesser.
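The article does not reproduce the bound itself. For orientation only (this is the standard thermodynamic uncertainty relation for steady-state currents, not necessarily the exact constraint derived in the Chalmers study), the zero-frequency noise $S_J$ of a current $J$ is bounded from below by the entropy production rate $\dot{\Sigma}$:

$$ \frac{S_J}{\langle J \rangle^{2}} \ge \frac{2 k_B}{\dot{\Sigma}} $$

so suppressing the relative noise of a given output current can only be bought with more dissipation.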
