
The Conversation

How AI deciphers neural signals to help a man with ALS speak

Published on theconversation.com – Nicholas Card, Postdoctoral Fellow of Neuroscience and Neuroengineering, University of California, Davis – 2024-08-22 07:17:14

Casey Harrell, who has ALS, works with a brain-computer interface to turn his thoughts into words.

Nicholas Card

Nicholas Card, University of California, Davis

Brain-computer interfaces are a groundbreaking technology that can help paralyzed people regain functions they’ve lost, like moving a hand. These devices record signals from the brain and decipher the user’s intended action, bypassing damaged or degraded nerves that would normally transmit those brain signals to control muscles.


Since 2006, demonstrations of brain-computer interfaces in humans have primarily focused on restoring arm and hand movements by enabling people to control computer cursors or robotic arms. Recently, researchers have begun developing speech brain-computer interfaces to restore communication for people who cannot speak.

As the user attempts to speak, these brain-computer interfaces record the person’s unique brain signals associated with attempted muscle movements for speaking and then translate them into words. These words can then be displayed as text on a screen or spoken aloud using text-to-speech software.

I’m a researcher in the Neuroprosthetics Lab at the University of California, Davis, which is part of the BrainGate2 clinical trial. My colleagues and I recently demonstrated a speech brain-computer interface that deciphers the attempted speech of a man with ALS, or amyotrophic lateral sclerosis, also known as Lou Gehrig’s disease. The interface converts neural signals into text with over 97% accuracy. Key to our system is a set of artificial intelligence language models – artificial neural networks that help interpret natural ones.

Recording brain signals

The first step in our speech brain-computer interface is recording brain signals. There are several sources of brain signals, some of which require surgery to record. Surgically implanted recording devices can capture high-quality brain signals because they are placed closer to neurons, resulting in stronger signals with less interference. These neural recording devices include grids of electrodes placed on the brain’s surface or electrodes implanted directly into brain tissue.


In our study, we used electrode arrays surgically placed in the speech motor cortex, the part of the brain that controls muscles related to speech, of the participant, Casey Harrell. We recorded neural activity from 256 electrodes as Harrell attempted to speak.

A small square device with an array of spikes on the bottom and a bundle of wires on the top

An array of 64 electrodes that embeds into brain tissue to record neural signals.

UC Davis

Decoding brain signals

The next challenge is relating the complex brain signals to the words the user is trying to say.

One approach is to map neural activity patterns directly to spoken words. This method requires recording brain signals corresponding to each word multiple times to identify the average relationship between neural activity and specific words. While this strategy works well for small vocabularies, as demonstrated in a 2021 study with a 50-word vocabulary, it becomes impractical for larger ones. Imagine asking the brain-computer interface user to try to say every word in the dictionary multiple times – it could take months, and it still wouldn’t work for new words.


Instead, we use an alternative strategy: mapping brain signals to phonemes, the basic units of sound that make up words. In English, there are 39 phonemes, such as ch, er, oo, pl and sh, that can be combined to form any word. We can measure the neural activity associated with every phoneme multiple times just by asking the participant to read a few sentences aloud. By accurately mapping neural activity to phonemes, we can assemble them into any English word, even ones the system wasn’t explicitly trained with.

To map brain signals to phonemes, we use advanced machine learning models. These models are particularly well-suited for this task due to their ability to find patterns in large amounts of complex data that would be impossible for humans to discern. Think of these models as super-smart listeners that can pick out important information from noisy brain signals, much like you might focus on a conversation in a crowded room. Using these models, we were able to decipher phoneme sequences during attempted speech with over 90% accuracy.

The brain-computer interface uses a clone of Casey Harrell’s voice to read aloud the text it deciphers from his neural activity.

From phonemes to words

Once we have the deciphered phoneme sequences, we need to convert them into words and sentences. This is challenging, especially if the deciphered phoneme sequence isn’t perfectly accurate. To solve this puzzle, we use two complementary types of machine learning language models.

The first is n-gram language models, which predict which word is most likely to follow a set of n words. We trained a 5-gram, or five-word, language model on millions of sentences to predict the likelihood of a word based on the previous four words, capturing local context and common phrases. For example, after “I am very good,” it might suggest “today” as more likely than “potato.” Using this model, we convert our phoneme sequences into the 100 most likely word sequences, each with an associated probability.
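To make the n-gram idea concrete, here is a minimal, self-contained sketch of how such a model can be trained and queried. The toy corpus, the trigram order and the raw-count probability estimates are illustrative assumptions, not the actual 5-gram model or training data used in the study.

```python
from collections import defaultdict

# Toy n-gram language model: a candidate next word is scored by how often
# it followed the same (n-1)-word context in the training text.

def train_ngram(sentences, n=3):
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in sentences:
        words = sentence.lower().split()
        for i in range(len(words) - n + 1):
            context = tuple(words[i:i + n - 1])
            counts[context][words[i + n - 1]] += 1
    return counts

def next_word_prob(counts, context, word):
    """Probability of `word` given the preceding (n-1)-word context."""
    context = tuple(w.lower() for w in context)
    total = sum(counts[context].values())
    return counts[context][word] / total if total else 0.0

corpus = ["i am very good today", "i am very good thanks", "i am very tired today"]
model = train_ngram(corpus, n=3)
print(next_word_prob(model, ("very", "good"), "today"))   # 0.5
print(next_word_prob(model, ("very", "good"), "potato"))  # 0.0
```

Real systems smooth these raw counts so that unseen word sequences keep a small nonzero probability, but the lookup-and-normalize structure is the same.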


The second is large language models, which power AI chatbots and also predict which words most likely follow others. We use large language models to refine our choices. These models, trained on vast amounts of diverse text, have a broader understanding of language structure and meaning. They help us determine which of our 100 candidate sentences makes the most sense in a wider context.

By carefully balancing probabilities from the n-gram model, the large language model and our initial phoneme predictions, we can make a highly educated guess about what the brain-computer interface user is trying to say. This multistep process allows us to handle the uncertainties in phoneme decoding and produce coherent, contextually appropriate sentences.
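The balancing step can be sketched as a weighted rescoring of candidate sentences. All of the scores and weights below are invented for illustration; in the actual system the log-probabilities come from its phoneme decoder and language models, and the weights are tuned empirically.

```python
# Each candidate sentence carries three log-probabilities: from phoneme
# decoding, from the n-gram model, and from a large language model.
# A weighted sum of the three picks the winning sentence.

def rescore(candidates, w_phoneme=1.0, w_ngram=0.5, w_llm=0.5):
    def total(c):
        return (w_phoneme * c["logp_phoneme"]
                + w_ngram * c["logp_ngram"]
                + w_llm * c["logp_llm"])
    return max(candidates, key=total)

candidates = [
    {"text": "I am very good today",
     "logp_phoneme": -4.1, "logp_ngram": -6.0, "logp_llm": -5.2},
    {"text": "I am very good potato",
     "logp_phoneme": -3.9, "logp_ngram": -11.5, "logp_llm": -12.8},
]
best = rescore(candidates)
print(best["text"])  # "I am very good today"
```

Note that the second candidate has slightly *better* phoneme evidence, but the language models penalize it heavily, which is exactly how the combination corrects imperfect phoneme decoding.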

Diagram showing a man, his brain, wires and a computer screen

How the UC Davis speech brain-computer interface deciphers neural activity and turns it into words.

UC Davis Health

Real-world benefits

In practice, this speech decoding strategy has been remarkably successful. We’ve enabled Casey Harrell, a man with ALS, to “speak” with over 97% accuracy using just his thoughts. This breakthrough allows him to easily converse with his family and friends for the first time in years, all in the comfort of his own home.


Speech brain-computer interfaces represent a significant step forward in restoring communication. As we continue to refine these devices, they hold the promise of giving a voice to those who have lost the ability to speak, reconnecting them with their loved ones and the world around them.

However, challenges remain, such as making the technology more accessible, portable and durable over years of use. Despite these hurdles, speech brain-computer interfaces are a powerful example of how science and technology can come together to solve complex problems and dramatically improve people’s lives.

Nicholas Card, Postdoctoral Fellow of Neuroscience and Neuroengineering, University of California, Davis

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Tiny robots and AI algorithms could help to craft material solutions for cleaner environments

Published on theconversation.com – Mahshid Ahmadi, Assistant Professor of Materials Science and Engineering, University of Tennessee – 2024-09-17 07:31:57

Air pollution is a global problem, but scientists are investigating new materials that could help clean it up.
AP Photo/Sergei Grits

Mahshid Ahmadi, University of Tennessee

Many human activities release pollutants into the air, water and soil. These harmful chemicals threaten the health of both people and the ecosystem. According to the World Health Organization, air pollution causes an estimated 4.2 million deaths annually.

Scientists are looking into ways to clean up these pollutants, and one potential avenue is a class of materials called photocatalysts. When triggered by light, these materials undergo chemical reactions that initial studies have shown can break down common toxic pollutants.


I am a materials science and engineering researcher at the University of Tennessee. With the help of robots and artificial intelligence, my colleagues and I are making and testing new photocatalysts with the goal of mitigating air pollution.

Breaking down pollutants

The photocatalysts work by generating charged carriers in the presence of light. These charged carriers are tiny particles that can move around and cause chemical reactions. When they come into contact with water and oxygen in the environment, they produce substances called reactive oxygen species. These highly active reactive oxygen species can bond to parts of the pollutants and then either decompose the pollutants or turn them into harmless – or even useful – products.

A cube-shaped metal machine with a chamber filled with bright light, and a plate of tubes shown going under the light.
To facilitate the photocatalytic reaction, researchers in the Ahmadi lab put plates of perovskite nanocrystals and pollutants under bright light to see whether the reaction breaks down the pollutants.
Astita Dubey

But some materials used in the photocatalytic process have limitations. For example, they can’t start the reaction unless the light has enough energy – lower-energy light, such as infrared rays or visible light, won’t trigger the reaction.

Another problem is that the charged particles involved in the reaction can recombine too quickly, which means they join back together before finishing the job. In these cases, the pollutants either do not decompose completely or the process takes a long time to accomplish.

Additionally, the surface of these photocatalysts can sometimes change during or after the photocatalytic reaction, which affects how they work and how efficient they are.


To overcome these limitations, scientists on my team are working to develop new photocatalytic materials that break down pollutants efficiently. We also focus on making sure these materials are nontoxic so that our pollution-cleaning materials aren’t causing further pollution.

A plate of tiny tubes, with some colored dark blue, others light blue, and others transparent.
This plate from the Ahmadi lab is used while testing how perovskite nanocrystals and light break down pollutants, like the blue dye shown. The light blue color indicates partial degradation, while transparent water signifies complete degradation.
Astita Dubey

Teeny tiny crystals

Scientists on my team use automated experimentation and artificial intelligence to figure out which photocatalytic materials could be the best candidates to quickly break down pollutants. We’re making and testing materials called hybrid perovskites, which are tiny crystals – they’re about a 10th the thickness of a strand of hair.

These nanocrystals are made of a blend of organic (carbon-based) and inorganic (non-carbon-based) components.

They have a few unique qualities, like their excellent light-absorbing properties, which come from how they’re structured at the atomic level. They’re tiny, but mighty. Optically, they’re amazing too – they interact with light in fascinating ways to generate a large number of tiny charge carriers and trigger photocatalytic reactions.

These materials efficiently transport electrical charges, which allows them to harness light energy and drive the chemical reactions. They’re also used to make solar panels more efficient and in LED lights, which create the vibrant displays you see on TV screens.


There are thousands of potential types of hybrid nanocrystals. So, my team wanted to figure out how to make and test as many as we can quickly, to see which are the best candidates for cleaning up toxic pollutants.

Bringing in robots

Instead of making and testing samples by hand – which takes weeks or months – we’re using smart robots, which can produce and test at least 100 different materials within an hour. These small liquid-handling robots can precisely move, mix and transfer tiny amounts of liquid from one place to another. They’re controlled by a computer that guides their acceleration and accuracy.

A researcher in a white lab coat smiling at the camera next to a fume hood, with plates of small tubes inside it.
The Opentrons pipetting robot helps Astita Dubey, a visiting scientist working with the Ahmadi lab, synthesize materials and treat them with organic pollutants to test whether they can break down the pollutants.
Jordan Marshall

We also use machine learning to guide this process. Machine learning algorithms can analyze test data quickly and then learn from that data for the next set of experiments executed by the robots. These machine learning algorithms can quickly identify patterns and insights in collected data that would normally take much longer for a human eye to catch.
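As a rough illustration of this closed loop, the sketch below alternates between a simulated robotic "measurement" and a simple rule for choosing the next batch of materials to test. The degradation function, the batch size and the greedy pick-near-the-best selection rule are all stand-in assumptions; the lab's actual chemistry, robots and machine learning models are far more sophisticated.

```python
import random

random.seed(0)

def measure_degradation(composition):
    # Placeholder for the robotic experiment: a fake degradation score
    # that peaks at composition 0.6, with measurement noise.
    return 1.0 - abs(composition - 0.6) + random.uniform(-0.05, 0.05)

# Candidate compositions, e.g. the fraction of one component in the blend.
candidates = [i / 100 for i in range(101)]
tested = {}

for _ in range(5):  # five robot batches of 10 samples each
    if tested:
        best = max(tested, key=tested.get)
        # Exploit what we've learned: test untested compositions
        # closest to the current best measurement.
        batch = sorted((c for c in candidates if c not in tested),
                       key=lambda c: abs(c - best))[:10]
    else:
        batch = random.sample(candidates, 10)  # explore first
    for c in batch:
        tested[c] = measure_degradation(c)

print(max(tested, key=tested.get))  # composition with best measured score
```

After 50 measurements, the search concentrates around the best-performing region instead of sweeping all 101 candidates, which is the essential speedup the article describes.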

Our approach aims to simplify and better understand complex photocatalytic systems, helping to create new strategies and materials. By using automated experimentation guided by machine learning, we can now make these systems easier to analyze and interpret, overcoming challenges that were difficult with traditional methods.

Mahshid Ahmadi, Assistant Professor of Materials Science and Engineering, University of Tennessee

This article is republished from The Conversation under a Creative Commons license. Read the original article.


A public health historian sizes up their records

Published on theconversation.com – Zachary W. Schulz, Lecturer of History, Auburn University – 2024-09-17 07:33:53

The presidential debate on Sept. 10, 2024, did not add much context to what the two candidates would do on health care beyond their own records.
Visual China Group/Getty Images

Zachary W. Schulz, Auburn University

Health care is a defining issue in the 2024 election – Democratic presidential nominee Kamala Harris and Republican contender Donald Trump have starkly different records on the issue. Rather than focusing on what they promise to do, let’s examine what their past actions reveal about their approaches to Medicare, the Affordable Care Act, public health infrastructure, drug policy and child abuse and domestic violence prevention.

As a specialist in public health history and policy, I have carefully examined both candidates’ records on American health care. With years of experience in the health care field and being a cystic fibrosis patient myself, I have developed a deep understanding of our health care system and the political dynamics that shape it.


For me, as for many other Americans, health care is more than just a political issue; it is a deeply personal one.

Medicare

During Harris’ time in the Senate, she co-sponsored the Medicare for All Act, which aimed to expand Medicare to all Americans, effectively eliminating private insurance.

At the presidential debate on Sept. 10, 2024, Harris clarified her former support of “Medicare for All” by emphasizing her prior legislative efforts to preserve and expand protections for patients’ rights and access to affordable health care.

Harris’s legislative efforts, primarily around the 2017-2020 period, reflect a commitment to broadening access to Medicare and reducing costs for seniors. During that time, Harris advocated for the Medicare program to negotiate drug prices directly with pharmaceutical companies.


Later, as vice president, Harris cast a tie-breaking vote on the 2022 Inflation Reduction Act, allowing the federal government to negotiate drug prices for Medicare with pharmaceutical companies.

In contrast, during Trump’s presidency, he made several attempts, some of which were successful, to cut funding for Medicare. The 2020 budget proposed by his administration included cuts to Medicare totaling more than US$800 billion over 10 years, primarily by reducing payments to providers and slowing the growth of the program.

The proposed cuts did not take effect because they required Congressional approval, which was not granted. The plan faced significant opposition due to concerns about potential negative impacts on beneficiaries.

Affordable Care Act

Harris has been a staunch defender of the Affordable Care Act, also known as the ACA or โ€œObamacare.โ€ As a senator, Harris consistently voted against any efforts to repeal the ACA. She advocated for expanding its provisions, supporting legislation that aimed to strengthen protections for people with preexisting conditions and increase funding for Medicaid expansion.


Harris’ record shows a clear commitment to ensuring broader health coverage under the ACA. And, in the recent debate, Harris noted this record and reasserted her commitment to the act.

During his presidency, Trump led multiple efforts to repeal the ACA, including the 2017 American Health Care Act, which would have significantly reduced the scope of Medicaid expansion and removed individual mandates.

Although these efforts ultimately failed in the Senate, Trump succeeded in weakening the ACA by eliminating the individual mandate penalty through the 2017 Tax Cuts and Jobs Act. In the debate against Harris, Trump reiterated his position that the Affordable Care Act “was lousy health care,” though he did not ultimately offer a replacement plan, stating only that he has “concepts of a plan.”

Donald Trump claims that as president, he had an obligation to save Obamacare, otherwise known as the Affordable Care Act, but says it is too expensive. He says he has “concepts of a plan” for something to replace the ACA.

Public health infrastructure

Harris’ tenure in the Senate, from January 2017 to January 2021, shows a consistent pattern of supporting public health infrastructure. She co-sponsored several bills aimed at increasing funding for community health centers and expanding access to preventive care.


Harris also advocated for more federal funding to address public health emergencies, such as the opioid epidemic and the COVID-19 pandemic.

During Trump’s presidency, however, he made significant cuts to public health programs. The Trump administration proposed budget cuts to the Centers for Disease Control and Prevention and other public health agencies, arguing that they were necessary for fiscal responsibility. These proposals drew criticism for potentially undermining the nation’s ability to respond to public health emergencies, a concern that was underscored by the CDC’s struggles during the early days of the COVID-19 pandemic. Trump frequently has responded to these criticisms by asserting he “cut bureaucratic red tape” rather than essential services.

Drug pricing policy

Harris has also supported legislation to lower drug prices and increase transparency in the pharmaceutical industry. She co-sponsored the Drug Price Relief Act, which aimed to allow the federal government to negotiate drug prices for Medicare directly. She also supported efforts to import cheaper prescription drugs from Canada. Her record reflects a focus on reducing costs for consumers and increasing access to affordable medications.

Trump’s record on drug policy is mixed. While Trump took credit for some decreases in prescription drug prices during his presidency, his administration’s most significant regulatory changes favored pharmaceutical companies. The administration’s attempts to implement a rule allowing the importation of cheaper drugs from Canada faced significant hurdles and did not lead to immediate changes.


Trump also ended a rule that would have required pharmaceutical companies to disclose drug prices in television ads, citing concerns over its legality.

Child abuse and domestic violence

Harris has a strong record of advocating for the prevention of child abuse and domestic violence. During her time as California’s attorney general and as a senator, Harris pushed for legislation that increased funding for domestic violence prevention programs and expanded legal protections for survivors. She has consistently supported measures to enhance child welfare services and improve coordination among agencies to protect children.

Trump’s record on these issues is less defined, but his administration did sign into law the Family First Prevention Services Act, which aimed to keep more children safely at home and out of foster care by providing new resources to families. However, critics argue that the Trump administration’s broader cuts to social services and health programs could indirectly undermine efforts to combat child abuse and domestic violence. In addition, some experts suggest that Trump’s family separation policies on the southern border contributed to an increase in child trauma during his administration.

Zachary W. Schulz, Lecturer of History, Auburn University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


How researchers measure wildfire smoke exposure doesn’t capture long-term health effects – and hides racial disparities

Published on theconversation.com – Joan Casey, Associate Professor of Environmental and Occupational Health Sciences, University of Washington – 2024-09-16 07:26:33

Fine particulate matter from wildfires can cause long-term health harms.
Gary Hershorn/Getty Images

Joan Casey, University of Washington and Rachel Morello-Frosch, University of California, Berkeley

Kids born in 2020 worldwide will experience twice as many wildfires during their lifetimes as those born in 1960. In California and other western states, frequent wildfires have become as much a part of summer and fall as popsicles and Halloween candy.

Wildfires produce fine particulate matter, or PM₂.₅, that chokes the air and penetrates deep into the lungs. Researchers know that short-term exposure to wildfire PM₂.₅ increases acute care visits for cardiorespiratory problems such as asthma. However, the long-term effects of repeated exposure to wildfire PM₂.₅ on chronic health conditions are unclear.


One reason is that scientists have not decided how best to measure this type of intermittent yet ongoing exposure. Environmental epidemiologists and health scientists like us usually summarize long-term exposure to total PM₂.₅ – which comes from power plants, industry and transportation – as average exposure over a year. This might not make sense when measuring exposure to wildfire smoke. Unlike traffic-related air pollution, for example, levels of wildfire PM₂.₅ vary a lot throughout the year.

To improve health and equity research, our team has developed five metrics that better capture long-term exposure to wildfire PM₂.₅.

Measuring fluctuating wildfire PM₂.₅

To understand why current measurements of wildfire PM₂.₅ aren’t adequately capturing an individual’s long-term exposure, we need to delve into the concept of averages.

Say the mean level of PM₂.₅ over a year was 1 microgram per cubic meter. A person could experience that exposure as 1 microgram per cubic meter every day for 365 days, or as 365 micrograms per cubic meter on a single day.


While these two scenarios result in the same average exposure over a year, they might have very different biological effects. The body might be able to fend off harm from exposure to 1 microgram per cubic meter each day, but be overwhelmed by a huge, single dose of 365 micrograms per cubic meter.

For perspective, in 2022, Americans experienced an average total PM₂.₅ exposure of 7.8 micrograms per cubic meter. Researchers estimated that in the 35 states that experience wildfires, these wildfires added on average just 0.69 micrograms per cubic meter to total PM₂.₅ each year from 2016 to 2020. This perspective misses the mark, however.

For example, a census tract close to the 2018 Camp Fire experienced an average wildfire PM₂.₅ concentration of 1.2 micrograms per cubic meter between 2006 and 2020. But the actual fire had a peak exposure of 310 micrograms per cubic meter – the world’s highest level that day.

Orange haze blanketing a city skyline, small silhouette of a person taking a photo by a streetlight
Classic estimates of average PM₂.₅ levels miss the peak exposure of wildfire events.
Angela Weiss/AFP via Getty Images

Scientists want to better understand what such extreme exposures mean for long-term human health. Prior studies on long-term wildfire PM₂.₅ exposure focused mostly on people living close to a large fire, following up years later to check on their health status. This misses any new exposures that took place between baseline and follow-up.

More recent studies have tracked long-term exposure to wildfire PM₂.₅ that changes over time. For example, researchers reported associations between wildfire PM₂.₅ exposure over two years and risk of death from cancer and any other cause in Brazil. This work again relied on long-term average exposure and did not directly capture extreme exposures from intermittent wildfire events. Because the study did not evaluate it, we do not know whether a specific pattern of long-term wildfire PM₂.₅ exposure was worse for health.


Most days, people experience no wildfire PM₂.₅ exposure. Some days, wildfire exposure is intense. As of now, we do not know whether a few very bad days or many slightly bad days are riskier for health.

A new framework

How can we get more realistic estimates that capture the huge peaks in PM₂.₅ levels that people are exposed to during wildfires?

When thinking about the wildfire PM₂.₅ that people experience, exposure scientists – researchers who study contact between humans and harmful agents in the environment – consider frequency, duration and intensity. These interlocking factors help describe the body’s true exposure during a wildfire event.

In our recent study, our team proposed a framework for measuring long-term exposure to wildfire PM₂.₅ that incorporates the frequency, duration and intensity of wildfire events. We applied air quality models to California wildfire data from 2006 to 2020, deriving new metrics that capture a range of exposure types.

Five heat maps of California paired with bar graphs of exposures over time
The researchers proposed five ways to measure long-term wildfire PM₂.₅ exposure.
Casey et al. 2024/PNAS, CC BY-NC-ND

One metric we devised is number of days with any wildfire PM₂.₅ exposure over a long-term period, which can identify even the smallest exposures. Another metric is average concentration of wildfire PM₂.₅ during the peak week of smoke levels over a long period, which highlights locations that experience the most extreme exposures. We also developed several other metrics that may be more useful, depending on what effects are being studied.
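These two metrics are simple to compute from a daily concentration series. Below is a minimal sketch; the example series is invented for illustration, whereas the study derived its values from air quality models for California census tracts over 2006 to 2020.

```python
# Two long-term wildfire PM2.5 exposure metrics, computed from a list of
# daily concentrations in micrograms per cubic meter.

def days_with_exposure(daily_pm25):
    """Number of days with any wildfire PM2.5 exposure."""
    return sum(1 for v in daily_pm25 if v > 0)

def peak_week_mean(daily_pm25):
    """Highest 7-day average concentration in the series."""
    best = 0.0
    for i in range(len(daily_pm25) - 6):
        week = daily_pm25[i:i + 7]
        best = max(best, sum(week) / 7)
    return best

# One year of data: mostly zero, with a single intense week-long smoke event.
series = [0.0] * 365
series[200:207] = [40, 120, 310, 150, 60, 20, 5]

print(days_with_exposure(series))        # 7
print(round(peak_week_mean(series), 1))  # 100.7
```

Note how the annual mean of this series is under 2 micrograms per cubic meter, while the peak-week metric preserves the extreme event that the average hides.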

Interestingly, these metrics were quite correlated with one another, suggesting places with many days of at least some wildfire PM₂.₅ also had the highest levels overall. Although this can make it difficult to decide between different exposure patterns, the suitability of each metric depends in part on what health effects we are investigating.

Environmental injustice

We also assessed whether certain racial and ethnic groups experienced higher-than-average wildfire PM₂.₅ exposure and found that different groups faced the most exposure depending on the year.

Consider 2018 and 2020, two major wildfire years in California. The most exposed census tracts, by all metrics, were composed primarily of non-Hispanic white individuals in 2018 and Hispanic individuals in 2020. This makes sense, since non-Hispanic white people constitute about 41.6% and Hispanic people 36.4% of California’s population.

To understand whether other groups faced excess wildfire PM₂.₅ exposure, we used relative comparisons. This means we compared the true wildfire PM₂.₅ exposure experienced by each racial and ethnic group with what we would have expected if they were exposed to the state average.


We found that Indigenous communities had the most disproportionate exposure, experiencing 1.68 times more PM₂.₅ than expected. In comparison, non-Hispanic white Californians were 1.13 times more exposed to PM₂.₅ than expected, and multiracial Californians 1.09 times more exposed than expected.
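The relative comparison behind these numbers reduces to a ratio of each group's observed mean exposure to the statewide average. In the sketch below, the statewide and group means are hypothetical values chosen only to reproduce the ratios reported above, not the study's actual data.

```python
# Relative exposure disparity: observed group mean divided by the
# statewide mean. A ratio above 1 means higher-than-expected exposure.

def disparity_ratio(group_mean, state_mean):
    return group_mean / state_mean

state_mean = 0.50  # hypothetical statewide mean wildfire PM2.5 exposure
group_means = {    # hypothetical group means, chosen for illustration
    "Indigenous": 0.84,
    "non-Hispanic white": 0.565,
    "multiracial": 0.545,
}

for name, mean in group_means.items():
    print(name, round(disparity_ratio(mean, state_mean), 2))
# Indigenous 1.68, non-Hispanic white 1.13, multiracial 1.09
```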

Person holding child, sitting by two other people; in the foreground, a child approaches the camera
Better metrics for long-term PM₂.₅ exposure can help researchers better understand who’s most vulnerable to wildfire smoke.
Eric Thayer/Stringer via Getty Images News

Rural tribal lands had the highest mean wildfire PM₂.₅ concentrations – 0.83 micrograms per cubic meter – of any census tract in our study. A large portion of Native American people in California live in rural areas, often with higher wildfire risk due to decades of poor forestry management, including legal suppression of cultural burning practices that studies have shown to aid in reducing catastrophic wildfires. Recent state legislation has reduced liability risks of cultural burning on Indigenous lands in California.

Understanding the drivers and health effects of high long-term exposure to wildfire PM₂.₅ among Native American and Alaska Native people can help address substantial health disparities between these groups and other Americans.

Joan Casey, Associate Professor of Environmental and Occupational Health Sciences, University of Washington and Rachel Morello-Frosch, Professor of Environmental Science, Policy and Management and of Public Health, University of California, Berkeley

This article is republished from The Conversation under a Creative Commons license. Read the original article.
