
The Conversation

Brain implants to restore sight, like Neuralink’s Blindsight, face a fundamental problem − more pixels don’t ensure better vision


theconversation.com – Ione Fine, Professor of Psychology, University of Washington – 2024-08-06 07:44:42
Human vision can’t be fully reproduced with ones and zeros.
seamartini/iStock via Getty Images Plus

Ione Fine, University of Washington and Geoffrey Boynton, University of Washington

Elon Musk recently pronounced that the next Neuralink will be a “Blindsight” cortical implant to restore vision: “Resolution will be low at first, like early Nintendo graphics, but ultimately may exceed normal human vision.”

Unfortunately, this claim rests on the fallacy that neurons in the brain are like pixels on a screen. It’s not surprising that engineers often assume that “more pixels equals better vision.” After all, that is how monitors and phone screens work.

In our newly published research, we created a computational model of human vision to simulate what sort of vision an extremely high-resolution cortical implant might provide. A movie of a cat with a resolution of 45,000 pixels is sharp and clear. A movie generated using a simplified model of 45,000 cortical electrodes, each of which stimulates a single neuron, still shows a recognizable cat, but most of the details of the scene are lost.

The movie on the left is generated using 45,000 pixels. The one on the right is generated using a simulation of a cortical prosthesis with 45,000 electrodes, each of which stimulates a single neuron.

The reason why the movie generated by electrodes is so blurry is because neurons in the human visual cortex do not represent tiny dots or pixels. Instead, each neuron has a particular receptive field, which is the location and pattern a visual stimulus must have in order to make that neuron fire. Electrically stimulating a single neuron produces a blob whose appearance is determined by that neuron’s receptive field. The tiniest electrode – one that stimulates a single neuron – will produce a blob that is roughly the size of your pinkie’s width held at arm’s length.

Consider what happens when you look at a single star in the night sky. Each point in space is represented by many thousands of neurons with overlapping receptive fields. A tiny spot of light, such as a star, results in a complex pattern of firing across all these neurons.

To generate the visual experience of seeing a single star with cortical stimulation, you would need to reproduce a pattern of neural responses that is similar to the pattern that would be produced by natural vision.

In order to do this, you would obviously need thousands of electrodes. But you would also need to replicate the correct pattern of neuronal responses, which requires knowing every neuron’s receptive field. Our simulations show that knowing the location of each neuron’s receptive field in space is not enough – if you don’t also know the orientation and size of each receptive field, then the star becomes a fuzzy mess.


So, even a single star – a single, bright pixel – generates an immensely complex neural response in the visual cortex. Imagine the even more complex pattern of cortical stimulation needed to accurately reproduce natural vision.

Some scientists have suggested that by stimulating exactly the right combination of electrodes, it would be possible to produce natural vision. Unfortunately, no one has yet suggested a sensible way to determine the receptive field of each individual neuron in a specific blind patient. Without that information, there is no way to see the stars. Vision from cortical implants will remain grainy and imperfect, regardless of the number of electrodes.

Sight restoration is not simply an engineering problem. Predicting what kind of vision a device will provide requires knowing how the technology interfaces with the complexities of the human brain.

How we created our virtual patients

In our work as computational neuroscientists, we develop simulations that predict the perceptual experience of patients seeking to restore their sight.


We have previously created a model to predict the perceptual experience of patients with a retinal implant. To create a virtual patient to predict what cortical implant patients would see, we simulated the neurophysiological architecture of the area of the brain involved in the first stage of visual processing. Our model approximates how receptive fields increase in size from central to peripheral vision and the fact that every neuron has a unique receptive field.
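The essential ingredients of such a virtual patient can be sketched in a few lines of Python. This is a toy illustration with invented parameters (Gaussian receptive fields, an arbitrary eccentricity scaling), not the published model: each simulated electrode drives one neuron, and the evoked "percept" is a sum of receptive-field-shaped blobs rather than pixels.

```python
import numpy as np

def gaussian_rf(h, w, cy, cx, sigma_y, sigma_x, theta):
    """An oriented 2-D Gaussian receptive field centered at (cy, cx)."""
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    y, x = ys - cy, xs - cx
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotate into the RF's axes
    yr = -x * np.sin(theta) + y * np.cos(theta)
    rf = np.exp(-(xr**2 / (2 * sigma_x**2) + yr**2 / (2 * sigma_y**2)))
    return rf / rf.sum()

def simulate_percept(image, n_neurons=500, seed=0):
    """Each electrode stimulates one neuron; stimulation evokes a blob
    shaped like that neuron's receptive field, scaled by how strongly
    the image would have driven the neuron."""
    rng = np.random.default_rng(seed)
    h, w = image.shape
    percept = np.zeros_like(image, dtype=float)
    for _ in range(n_neurons):
        cy, cx = rng.uniform(0, h), rng.uniform(0, w)
        ecc = np.hypot(cy - h / 2, cx - w / 2)    # distance from center of gaze
        size = 1.0 + 0.15 * ecc                   # RFs grow toward the periphery
        theta = rng.uniform(0, np.pi)             # each neuron's preferred orientation
        rf = gaussian_rf(h, w, cy, cx, size, 2.5 * size, theta)
        drive = float((rf * image).sum())         # neuron's response to the image
        percept += drive * rf                     # evoked blob, not a pixel
    return percept / percept.max()

# A single bright "star" on a dark background becomes a smeared blob.
img = np.zeros((64, 64))
img[32, 32] = 1.0
percept = simulate_percept(img)
```

Even in this crude sketch, the single-pixel star spreads across many pixels of the percept, because every stimulated neuron reports a blob whose shape is set by its receptive field.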

Our model successfully predicted data describing the perceptual experience of participants across a wide range of studies on cortical stimulation in people. Having confirmed that our model could predict existing data, we used it to make predictions about the quality of vision that possible future cortical implants might produce.

Models like ours are an example of virtual prototyping, which involves using computer simulations to improve product design. These models can facilitate the development of new technology and evaluate device performance. Our study shows they can also offer more realistic expectations about what kind of vision bionic eyes might provide.

First do no harm

In our almost 20 years researching bionic eyes, we’ve seen the complexity of the human brain defeat company after company. Patients pay the cost when these devices fail, left stranded with orphaned technologies in their eye or brain.


The Food and Drug Administration could mandate that sight recovery tech companies must develop failure plans that minimize harm to patients when technologies stop working. Possibilities include requiring companies implanting neuroelectronic devices into patients to participate in technology escrow agreements and carry insurance to ensure continuing medical care and technology support if they go bankrupt.

If cortical implants can achieve anything close to the resolution of our simulations, that would still be an accomplishment worth celebrating. Grainy and imperfect vision would be life-changing for many thousands of people who currently suffer from incurable blindness. But this is a moment for cautious rather than blind optimism.

Ione Fine, Professor of Psychology, University of Washington and Geoffrey Boynton, Professor of Psychology, University of Washington

This article is republished from The Conversation under a Creative Commons license. Read the original article.



Found dead in the snow − how microbes can help pinpoint time of death for forensic investigations in frigid conditions


theconversation.com – Noemi Procopio, Senior Research Fellow, School of Law and Policing, University of Central Lancashire – 2024-09-09 07:25:01

Extreme weather conditions can make reconstructing the scene of a death more difficult.

Nick Thompson/iStock via Getty Images Plus

Noemi Procopio, University of Central Lancashire and Lavinia Iancu, University of North Dakota


What happens to a dead body in an extremely cold environment? Does it decompose? How do these conditions affect how forensic scientists understand when the person died?

Estimating time of death, also called the post-mortem interval, is a complex task. It plays an important role in forensic investigations, as it can provide critical insights into the timeline of events leading up to a person’s death. This information can narrow down potential scenarios and suspects, aiding in the resolution of criminal cases.

A multitude of factors are at play at a death scene, ranging from environmental conditions to the individual’s health status prior to death. Historically, scientists have estimated time of death by observing post-mortem physical and biological changes in the body, such as stiffening, fluid collection and cooling.

These methods are limited, however, by their variability and dependence on external factors. Calculating the post-mortem interval became more precise with the advent of molecular biology. But it’s still a challenging task, especially in extreme cold weather conditions. There is often a lack of obvious signs of decomposition on a frozen body during the first months after death.


We are forensic scientists leading the forensics programs at the University of North Dakota and the University of Central Lancashire. We use molecular biology and bioinformatics to develop tools that help researchers and investigators more accurately estimate the post-mortem interval. Our recently published research in Frontiers in Microbiology found that studying the microbes involved in decomposition could predict time elapsed since death in extreme cold conditions with high accuracy.

Decomposition in cold environments

Our study took place in Grand Forks, North Dakota, one of the coldest cities in the United States, where winters are characterized by temperatures that can drop to -40 degrees Fahrenheit (-40 degrees Celsius) and high winds that can reach up to 31 miles per hour (50 kilometers per hour).

In an extremely cold environment like North Dakota’s winters, traditional methods might not be enough to understand decomposition and estimate time of death. For instance, the body cools much faster in cold conditions, which can skew estimates based on body temperature.


The researchers set their investigation into time of death in Grand Forks, N.D., where winters can be brutal.

Lavinia Iancu, CC BY-SA


Similarly, cold environments can delay the onset and duration of rigor mortis, or body stiffening. The rate of decomposition, along with the activity of insects and other scavengers that contribute to the breakdown of the body, can also be slowed or halted by freezing temperatures.

Snow is another important factor when investigating decomposition. It can insulate a body by trapping residual heat and raising its temperature slightly higher than the surrounding environment. This insulating effect allows the body to decompose at a slower rate compared with bodies exposed to open air.

Microbes and time since death

In conditions of extreme cold, it becomes necessary to employ additional means to understand decomposition and estimate the time of death. Advanced molecular techniques, such as analyzing the microbiome, gene expression and protein degradation, can help provide valuable information about the crime scene.

Each organism has distinct microbial characteristics that act like a fingerprint. The necrobiome, a community of microbes associated with decomposing remains, plays a crucial role in decay. Specific microbes are present during different stages of decomposition, contributing to the breakdown of tissues and the recycling of nutrients. Forensic investigators can sample what microbes are living in a dead body to deduce how long ago a person died based on the makeup of the microbial population.


Our study focused on identifying common patterns in the microbial changes that occur during decomposition in extreme cold environments. Over a period of 23 weeks, we collected and analyzed 393 samples of microbes from the inside and outside of the noses of dead pigs covered in snow. Pigs decompose similarly to humans and are commonly used in forensic research. We developed models to estimate the post-mortem interval by pairing microbial genetic data with environmental data such as snow depth and outdoor temperature.


The researchers collect samples from the inside and outside of the noses of dead pigs.

Lavinia Iancu, CC BY-ND

Overall, we found that the bacteria Psychrobacter, Pseudomonas and Carnobacterium may best predict time after death in extreme winter conditions up to six months after death, with a margin of error of just over nine days.

We found that different bacteria are most abundant at different time intervals. For example, levels of Psychrobacter increase five weeks after death and are most abundant at 10 weeks, while Pseudomonas levels increase between five and nine weeks and peak at 18 weeks.
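To make the "microbial fingerprint" idea concrete, here is a toy sketch in Python. It is not the study's actual model (which pairs sequencing data with environmental variables such as snow depth and temperature): the abundance curves are invented, loosely shaped like the time courses described above, and time since death is estimated by matching a noisy observation to the closest reference profile.

```python
import numpy as np

rng = np.random.default_rng(7)
weeks = np.arange(0, 24)

def bump(t, peak, width):
    """Gaussian-shaped abundance curve: rises, peaks, then declines."""
    return np.exp(-((t - peak) ** 2) / (2 * width ** 2))

# Hypothetical reference profiles (illustrative numbers, not the study's data),
# one column per taxon, one row per week since death.
reference = np.column_stack([
    bump(weeks, 10, 3),   # "Psychrobacter"-like: peaks near week 10
    bump(weeks, 18, 4),   # "Pseudomonas"-like: peaks near week 18
    bump(weeks, 14, 5),   # "Carnobacterium"-like: peaks in between
])

def estimate_pmi(sample, reference, weeks):
    """Nearest-profile lookup: the estimated post-mortem interval is the
    reference week whose microbial profile is closest to the sample."""
    distances = np.linalg.norm(reference - sample, axis=1)
    return int(weeks[np.argmin(distances)])

# A noisy observation from remains that are actually 10 weeks old.
observed = reference[10] + rng.normal(0, 0.02, 3)
est = estimate_pmi(observed, reference, weeks)
```

Because each week has a distinct combination of abundances, even a noisy sample lands close to the correct reference week; the real models achieve this with many more taxa and regression rather than simple nearest-neighbor matching.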


Improving forensics

Death is often an unpleasant topic to bring into a conversation. But from a forensic perspective, techniques and methods to determine when someone has died can help bring justice and peace for loved ones.

Our study found that decomposition does not completely halt even in cold environments. Studying the microenvironment – the local conditions surrounding the body, including temperature, humidity and microbial activity – can reveal crucial information about the decomposition process. The key microbial species we identified served as biomarkers of death, allowing us to develop time-of-death models that researchers can use to overcome the limitations of just visually examining remains.

Microbes can become a crucial piece of the puzzle during the process of investigating a death by aiding in constructing more precise timelines, even in extreme conditions.

Noemi Procopio, Senior Research Fellow, School of Law and Policing, University of Central Lancashire and Lavinia Iancu, Assistant Professor of Forensic Science, Director of the Forensic Science Program, University of North Dakota

This article is republished from The Conversation under a Creative Commons license. Read the original article.




FDA’s new regulations underscore the complexity around screening for women with dense breasts


theconversation.com – Nancy Kressin, Emeritus Professor of Medicine, Boston University – 2024-09-09 07:24:09

Breast density raises the risk of breast cancer and can also make it more difficult for breast cancer to be detected.

picture alliance/Getty Images

Nancy Kressin, Boston University; Christine M. Gunn, Dartmouth College; Priscilla J. Slanetz, Boston University, and Tracy A. Battaglia, Yale University


The Food and Drug Administration implemented a rule to go into effect on Sept. 10, 2024, requiring mammography facilities to notify women about their breast density. The goal is to ensure that women nationwide are informed about the risks of breast density, advised that other imaging tests might help find cancers and urged to talk with their providers about next steps based on their individual situation.

The FDA originally issued the rule on March 10, 2023, but extended the implementation date to give mammography facilities additional time to adhere.

The Conversation U.S. asked a team of experts in social science and patients’ health behaviors, health policy, radiology, and primary care and health services research to explain the FDA’s new regulations about these communications and what women should consider as they decide whether to pursue additional imaging tests, often called supplemental screening.

What is breast density and why does it matter?

Breast density falls into four categories: fatty, scattered tissue, heterogeneously dense or extremely dense.


Dense breasts are composed of more fibrous, connective tissue and glandular tissue – meaning glands that produce milk and tubes that carry it to the nipple – than fatty tissue. Because fibroglandular tissue and breast masses both look white on mammographic images, greater breast density makes it more difficult to detect cancer. Nearly half of all American women are categorized as having dense breasts.

Having dense breasts also increases the risk of getting breast cancer, though the reason for this is unknown.

Because of this, decisions about breast cancer screening get more complicated. While evidence is clear that regular mammograms save lives, additional testing such as ultrasound, MRI or contrast-enhanced mammography may be warranted for women with dense breasts.

What does the new FDA rule say?

The FDA now requires specific language to ensure that all women receive the same “accurate, complete and understandable breast density information.” After a mammogram, women must be informed:


– Whether their breasts are dense or not dense

– That having dense breasts increases the risk of breast cancer

– That having dense breasts makes it harder to find breast cancer on mammograms

– That for those with dense breasts, additional imaging tests might help find cancer


They must also be advised to discuss their individual situation with their provider, to determine which, if any, additional screening might be indicated.


Conversations between patients and doctors are crucial for determining whether supplemental screening would be beneficial.

PonyWang/E+ via Getty Images

Why did the FDA issue the new rule?

Prior to the federal rule, 38 U.S. states required some form of breast density notification. But some states had no notification requirements, and among the others there were many inconsistencies that raised concerns among advocates, often women with dense breasts whose advanced cancer had not been detected on a mammogram.

The FDA standardized the information women must receive. It is written at an eighth grade reading level and may address racial and literacy-level differences in women’s knowledge about breast density and reactions to written notifications.


For instance, our research team found disproportionately more confusion and anxiety among women of color, those with low literacy and women for whom English was not their first language. And some women with low literacy reported decreased future intentions to undergo mammographic screening.

What is the value of additional screening?

Standard mammograms use X-rays to produce two-dimensional images of the breast. A newer type of mammography imaging called tomosynthesis produces 3D images, which find more cancers among women with dense breasts. So, researchers and doctors generally agree that women with dense breasts should undergo tomosynthesis screening when available.

There is still limited scientific evidence to guide recommendations for supplemental breast screening beyond standard mammography or tomosynthesis for women with dense breast tissue. Data shows that supplemental screening with ultrasound, MRI or contrast-enhanced mammography may detect additional cancers, but there are no prospective studies confirming that such additional screening saves more lives.

So far, there is no data from randomized clinical trials showing that supplemental breast MRIs, the most often-recommended supplemental screening, reduce deaths from breast cancer.


However, more early stage – but not late-stage – cancers are found with MRIs, which may require less extensive surgery and less chemotherapy.

Various professional organizations and experts interpret the available data about supplemental screening differently, arriving at different conclusions and recommendations. An important consideration is the woman’s individual level of risk, since emerging evidence suggests that women whose personal risk of developing breast cancer is high are most likely to benefit from supplemental screening.

Some organizations have concluded that current evidence is too limited to make a recommendation for supplemental screening, or they do not recommend routine use of supplemental screening for women based solely on breast density. Others recommend additional screening for women with extremely or heterogeneously dense breasts, even when their risk is at the intermediate level.

What should women consider about added screening?

Because personal risk of breast cancer is a crucial consideration in deciding whether to undergo supplemental screening, women should understand their own risk.


The American College of Radiology recommends that all women undergo risk assessment by age 25. Women and their providers can use risk calculators such as Tyrer-Cuzick, which is free and available online.

Women should also understand that breast density is only one of several risks for breast cancer, and some of the others can be modified. Engaging in regular physical activity, maintaining a healthy weight, limiting alcohol use and eating a healthy diet rich in vegetables can all decrease breast cancer risk.

Are there potential harms?

Amid the debate about the benefits of supplemental breast screening, there is less discussion about its possible harms. Most common are false alarms: results that suggest a finding of cancer and require follow-up testing. Less commonly, a biopsy is needed, which may lead to short-term fear and anxiety, medical bills or potential complications from interventions.

Supplemental screening can also lead to overdiagnosis and overtreatment – the small risk of identifying and treating a cancer that would have never posed a problem.


MRI screening also involves use of a chemical substance called gadolinium to improve imaging. Although tiny amounts of gadolinium are retained in the body, the FDA considers the contrast agent to be safe when given to patients with normal kidney function.

MRIs may also identify incidental findings outside the breast – such as in the lungs – that warrant additional concern, testing and cost. Women should consider their tolerance for such risks, relative to their desire for the potential benefits of additional screening.

The out-of-pocket cost of additional screening beyond a mammogram is also a consideration; only 29 states plus the District of Columbia require insurance coverage for supplemental breast cancer screening, and only three states – New York, Connecticut and Illinois – mandate insurance coverage with no copays.

How can you learn more?

Though the FDA urges women to talk with their providers, our research found that few women have such conversations and that many providers lack sufficient knowledge about breast density and current guidelines for breast screening.


It’s not yet clear why, but providers receive little or no training about breast density and report little confidence in their ability to counsel patients on this topic.

To address this knowledge deficit in some health care settings, radiologists, whose screening guidelines are more stringent than some other organizations, sometimes provide a recommendation for supplemental screening as part of their mammography report to the provider who ordered the mammogram.

Learning more about the topic in advance of a discussion with a provider can help a woman better understand her options.

Numerous online resources can provide more information, including the American Cancer Society, the website Dense Breast-info and the American College of Radiology.


Armed with information about the complexities of breast density and its impact on breast cancer screening, women can discuss their personal risk with their providers and evaluate the options for supplemental screening, with consideration of how they value the benefits and harms associated with different tests.

Nancy Kressin, Emeritus Professor of Medicine, Boston University; Christine M. Gunn, Assistant Professor of Health Policy and Clinical Practice, Dartmouth College; Priscilla J. Slanetz, Professor of Radiology, Boston University, and Tracy A. Battaglia, Associate Director of Cancer Care Equity, Yale Cancer Center, Yale University

This article is republished from The Conversation under a Creative Commons license. Read the original article.





The Boeing Starliner has returned to Earth without its crew – a former astronaut details what that means for NASA, Boeing and the astronauts still up in space


theconversation.com – Michael E. Fossum, Vice President, Texas A&M University – 2024-09-07 10:41:12

The Boeing Starliner, shown as it approached the International Space Station.
NASA via AP

Michael E. Fossum, Texas A&M University

Boeing’s crew transport space capsule, the Starliner, returned to Earth without its two-person crew right after midnight Eastern time on Sept. 7, 2024. Its remotely piloted return marked the end of a fraught test flight to the International Space Station, which left two astronauts, Butch Wilmore and Sunita “Suni” Williams, on the station for months longer than intended after thruster failures led NASA to deem the capsule unsafe to pilot back.

Wilmore and Williams will stay on the International Space Station until February 2025, when they’ll return to Earth on a SpaceX Dragon capsule.


The Conversation U.S. asked former commander of the International Space Station Michael Fossum about NASA’s decision to return the craft uncrewed, the future of the Starliner program and its crew’s extended stay at the space station.

What does this decision mean for NASA?

NASA awarded contracts to both Boeing and SpaceX in 2014 to provide crew transport vehicles to the International Space Station via the Commercial Crew Program. At the start of the program, most bets were on Boeing to take the lead, because of its extensive aerospace experience.

However, SpaceX moved very quickly with its new rocket, the Falcon 9, and its cargo ship, Dragon. While they suffered some early failures during testing, they aggressively built, tested and learned from each failure. In 2020, SpaceX successfully launched its first test crew to the International Space Station.

Meanwhile, Boeing struggled through some development setbacks. The outcome of this first test flight is a huge disappointment for Boeing and NASA. But NASA leadership has expressed its support for Boeing, and many experts, including me, believe it remains in the agency’s best interest to have more than one American crew launch system to support continued human space operations.


NASA is also continuing its exchange partnership with Russia. This partnership provides the agency with multiple ways to get crew members to and from the space station.

As space station operations continue, NASA and its partners have enough options to get people to and from the station that they’ll always have the essential crew on the station – even if there are launch disruptions for any one of the capable crewed vehicles. Starliner as an option will help with that redundancy.

NASA has a few options to get astronauts up to the International Space Station.
Roscosmos State Space Corporation via AP

What does this decision mean for Boeing?

I do think Boeing’s reputation is going to ultimately suffer. The company is going head-to-head with SpaceX. Now, the SpaceX Dragon crew spacecraft has several flights under its belt. It has proven a reliable way to get to and from the space station.

It’s important to remember that this was a test flight for Starliner. Of course, the program managers want each test flight to run perfectly, but you can’t anticipate every potential problem through ground testing. Unsurprisingly, some problems cropped up – you expect them in a test flight.

The space environment is unforgiving. A small problem can become catastrophic in zero gravity. It’s hard to replicate these situations on the ground.


The technology SpaceX and Boeing use is also radically different from the kind of capsule technology used in the early days of the Mercury, Gemini and Apollo programs.

NASA has evolved and made strategic moves to advance its mission over the past two decades. The agency has leaned into its legacy of thinking outside the box. It was an innovative move to break from tradition and leverage commercial competitors to advance the program. NASA gave the companies a set of requirements and left it up to them to figure out how they would meet them.

What does this decision mean for Starliner’s crew?

I know Butch Wilmore and Suni Williams as rock-solid professionals, and I believe their first concern is completing their mission safely. They are both highly experienced astronauts with previous long-duration space station experience. I’m sure they are taking this in stride.

Prior to joining NASA, Williams was a Naval aviator and Wilmore a combat veteran, so these two know how to face risk and accomplish their missions. This kind of unfavorable outcome is always a possibility in a test mission. I am sure they are leaning forward with a positive attitude and using their bonus time in space to advance science, technology and space exploration.


Their families shoulder the bigger impact. They were prepared to welcome the crew home in less than two weeks and now must adjust to unexpectedly being apart for eight months.

Right now, NASA is dealing with a ripple effect, with more astronauts than expected on the space station. More people means more consumables – like food and clothing – required. The space station has supported a large crew for short periods in the past, but with nine crew members on board today, the station’s life support systems have to work harder to purify recycled drinking water, generate oxygen and remove carbon dioxide from their atmosphere.

Wilmore and Williams are also consuming food, and they didn’t arrive with the clothes and other personal supplies they needed for an eight-month stay, so NASA has already started increasing those deliveries on cargo ships.

What does this decision mean for the future?

Human spaceflight is excruciatingly hard and relentlessly unforgiving. A million things must go right to have a successful mission. It’s impossible to fully understand the performance of systems in a microgravity environment until they’re tested in space.


NASA has had numerous failures and near-misses in the quest to put Americans on the Moon. They lost the Apollo 1 crew in a fire during a preflight test. They launched the first space shuttle in 1981, and dealt with problems throughout that program’s 30-year run, including the terrible losses of Challenger and Columbia.

After having no other U.S. options for over 30 years, three different human spacecraft programs are now underway. In addition to the SpaceX Crew Dragon and the Boeing Starliner, NASA’s Orion spacecraft for the Artemis II mission is planned to fly four astronauts around the Moon in the next couple of years.

These programs have had setbacks and bumps along the way – and there will be more – but I haven’t been this excited about human spaceflight since I was an 11-year-old cheering for Apollo and dreaming about putting the first human footprints on Mars.

Michael E. Fossum, Vice President, Texas A&M University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
