
Enzymes are the engines of life − machine learning tools could help scientists design new ones to tackle disease and climate change

Published on theconversation.com – Sam Pellock, Postdoctoral Scholar in Biochemistry, University of Washington – 2025-02-13 13:02:00

Enzymes have complicated molecular structures that are hard to replicate.
Design Cells/iStock via Getty Images Plus

Sam Pellock, University of Washington

Enzymes are molecular machines that carry out the chemical reactions that sustain all life, an ability that has captured the attention of scientists like me.

Consider muscle movement. Your body releases a molecule called acetylcholine to trigger your muscle cells to contract. If acetylcholine sticks around for too long, it can paralyze your muscles – including your heart muscle cells – and, well, that’s that. This is where the enzyme acetylcholinesterase comes in. This enzyme can break down thousands of acetylcholine molecules per second to ensure muscle contraction is stopped, paralysis avoided and life continued. Without this enzyme, it would take a month for a molecule of acetylcholine to break down on its own – about 10 billion times slower.
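That “10 billion times slower” figure is easy to sanity-check with a back-of-the-envelope calculation. In the Python sketch below, the turnover of 4,000 molecules per second and the one-month spontaneous lifetime are rough stand-ins for the article’s “thousands per second” and “a month,” not measured values.

SECONDS_PER_MONTH = 30 * 24 * 60 * 60       # about 2.6 million seconds

enzyme_rate = 4_000                         # molecules broken down per second (assumed)
uncatalyzed_rate = 1 / SECONDS_PER_MONTH    # roughly one molecule per month on its own

speedup = enzyme_rate / uncatalyzed_rate
print(f"Approximate rate enhancement: {speedup:.1e}")  # ~1e10, i.e. about 10 billion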

You can imagine why enzymes are of particular interest to scientists looking to solve modern problems. What if there were a way to break down plastic, capture carbon dioxide or destroy cancer cells as fast as acetylcholinesterase breaks down acetylcholine? If the world needs to take action quickly, enzymes are a compelling candidate for the job – if only researchers could design them to handle those challenges on demand.

Designing enzymes, unfortunately, is very hard. It’s like working with an atom-sized Lego set, but the instructions were lost and the thing won’t hold together unless it’s assembled perfectly. Newly published research from our team suggests that machine learning can act as the architect on this Lego set, helping scientists build these complex molecular structures accurately.

What’s an enzyme?

Let’s take a closer look at what makes up an enzyme.

Enzymes are proteins – large molecules that do the behind-the-scenes work that keeps all living things alive. These proteins are made up of amino acids, a set of building blocks that can be stitched together to form long strings that get knotted up into specific shapes.

The specific structure of a protein is key to its function in the same way that the shapes of everyday objects are. For example, much like a spoon is designed to hold liquid in a way that a knife simply can’t, the enzymes involved in moving your muscles aren’t well suited for photosynthesis in plants.

For an enzyme to function, it adopts a shape that perfectly matches the molecule it processes, much like a lock matches a key. The unique grooves in the enzyme – the lock – that interact with the target molecule – the key – are found in a region of the enzyme known as the active site.

Diagram showing a substrate binding to an enzyme, changing both the enzyme and its own shape, then getting released as two new products
The induced fit model of enzymes states that both the enzyme and its substrate change shape when they interact.
OpenStax, CC BY-SA

The active site of the enzyme precisely orients amino acids to interact with the target molecule when it enters. This makes it easier for the molecule to undergo a chemical reaction to turn into a different one, making the process go faster. After the chemical reaction is done, the new molecule is released and the enzyme is ready to process another.

How do you design an enzyme?

Scientists have spent decades trying to design their own enzymes to make new molecules, materials or therapeutics. But making enzymes that look like and go as fast as those found in nature is incredibly difficult.

Enzymes have complex, irregular shapes that are made up of hundreds of amino acids. Each of these building blocks needs to be placed perfectly or else the enzyme will slow down or completely shut off. The difference between a speed-racer enzyme and a slowpoke can be a distance of less than the width of a single atom.

Initially, scientists focused on modifying the amino acid sequences of existing enzymes to improve their speed or stability. Early successes with this approach primarily improved the stability of enzymes, enabling them to catalyze chemical reactions at higher temperatures. But this approach was less useful for improving the speed of enzymes. To this day, modifying individual amino acids is generally not an effective way to improve natural enzymes.

Clump of spirals and threads, with a small molecule at its center
This model of acetylcholinesterase shows acetylcholine (green) bound to its active site.
Sam Pellock, CC BY-SA

Researchers found that using a process called directed evolution, in which the amino acid sequence of an enzyme is randomly changed until it can perform a desired function, proved much more fruitful. For example, studies have shown that directed evolution can improve chemical reaction speed, thermostability, and even generate enzymes with properties that aren’t seen in nature. However, this approach is typically labor-intensive: You have to screen many mutants to find one that does what you want. In some cases, if there’s no good enzyme to start from, this method can fail to work at all.

Both of these approaches are limited by their reliance on natural enzymes. That is, restricting your design to the shapes of natural proteins likely limits the kinds of chemistry that enzymes can facilitate. Remember, you can’t eat soup with a knife.

Is it possible to make enzymes from scratch, rather than modify nature’s recipe? Yes, with computers.

Designing enzymes with computers

The first attempts to computationally design enzymes still largely relied on natural enzymes as a starting point, focusing on placing enzyme active sites into natural proteins.

This approach is akin to trying to find a suit at a thrift store: It is unlikely you will find a perfect fit because the geometry of an enzyme’s active site (your body in this analogy) is highly specific, so a random protein with a rigidly fixed structure (a suit with random measurements) is unlikely to perfectly accommodate it. The resulting enzymes from these efforts performed much more slowly than those found in nature, requiring further optimization with directed evolution to reach speeds common among natural enzymes.

Recent advances in deep learning have dramatically changed the landscape of designing enzymes with computers. Enzymes can now be generated in much the same way that AI models such as ChatGPT and DALL-E generate text or images, and you don’t need to use native protein structures to support your active site.

YouTube video: AI tools are helping researchers design new proteins.

Our team showed that when we prompt an AI model, called RFdiffusion, with the structure and amino acid sequence of an active site, it can generate the rest of the enzyme structure that would perfectly support it. This is equivalent to prompting ChatGPT to write an entire short story based on a prompt that only says to include the line “And sadly, the eggs never showed up.”

We used this AI model specifically to generate enzymes called serine hydrolases, a group of proteins that have potential applications in medicine and plastic recycling. After designing the enzymes, we mixed them with their intended molecular target to see whether they could catalyze its breakdown. Encouragingly, many of the designs we tested were able to break down the molecule – and they did so better than previously designed enzymes for the same reaction.

To see how accurate our computational designs were, we used a method called X-ray crystallography to determine the shapes of these enzymes. We found that many of them were a nearly perfect match to what we digitally designed.

Our findings mark a key advance in enzyme design, highlighting how AI can help scientists start to tackle complex problems. Machine learning tools could help more researchers access enzyme design and tap into the full potential of enzymes to solve modern-day problems.

Sam Pellock, Postdoctoral Scholar in Biochemistry, University of Washington

This article is republished from The Conversation under a Creative Commons license. Read the original article.


What causes the powerful winds that fuel dust storms, wildfires and blizzards? A weather scientist explains

Published on theconversation.com – Chris Nowotarski, Associate Professor of Atmospheric Science, Texas A&M University – 2025-03-20 07:49:00

When huge dust storms like this one in the Phoenix suburbs in 2022 hit, it’s easy to see the power of the wind.
Christopher Harris/iStock via Getty Images Plus

Chris Nowotarski, Texas A&M University

Windstorms can seem like they come out of nowhere, hitting with a sudden blast. They might be hundreds of miles long, stretching over several states, or just in your neighborhood.

But they all have one thing in common: a change in air pressure.

Just like air rushing out of your car tire when the valve is open, air in the atmosphere is forced from areas of high pressure to areas of low pressure.

The stronger the difference in pressure, the stronger the winds that will ultimately result.

A weather map with a line between high and low pressure stretching across the U.S.
On this forecast for March 18, 2025, from the National Oceanic and Atmospheric Administration, ‘L’ represents low-pressure systems. The shaded area over New Mexico and west Texas represents strong winds and low humidity that combine to raise the risk of wildfires.
NOAA Weather Prediction Center

Other forces related to the Earth’s rotation, friction and gravity can also alter the speed and direction of winds. But it all starts with this change in pressure over a distance – what meteorologists like me call a pressure gradient.
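To put a rough number on that relationship, here is a minimal Python sketch (not from the article) using the textbook geostrophic-wind balance between the pressure gradient force and the Coriolis force. The air density, Coriolis parameter and pressure figures are typical illustrative values, not observations.

AIR_DENSITY = 1.2     # kg/m^3, typical near the surface
CORIOLIS_F = 1.0e-4   # 1/s, a typical midlatitude Coriolis parameter

def geostrophic_wind(pressure_drop_hpa: float, distance_km: float) -> float:
    """Wind speed (m/s) in balance with a given pressure change over a distance."""
    gradient_pa_per_m = (pressure_drop_hpa * 100.0) / (distance_km * 1000.0)
    return gradient_pa_per_m / (AIR_DENSITY * CORIOLIS_F)

# The same 8-hPa pressure difference, spread over different distances:
print(round(geostrophic_wind(8, 500), 1))  # ~13.3 m/s (~30 mph): breezy
print(round(geostrophic_wind(8, 200), 1))  # ~33.3 m/s (~75 mph): damaging winds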

So how do we get pressure gradients?

Strong pressure gradients ultimately owe their existence to the simple fact that the Earth is round and rotates.

Because the Earth is round, the sun is more directly overhead during the day at the equator than at the poles. This means more energy reaches the surface of the Earth near the equator. And that causes the lower part of the atmosphere, where weather occurs, to be both warmer and have higher pressure on average than the poles.

Nature doesn’t like imbalances. As a result of this temperature difference, strong winds develop at high altitudes over midlatitude locations, like the continental U.S. This is the jet stream, and even though it’s several miles up in the atmosphere, it has a big impact on the winds we feel at the surface.

Wind speed and direction in the upper atmosphere on March 14, 2025, show waves in the jet stream. Downstream of a trough in this wave, winds diverge and low pressure can form near the surface.
NCAR

Because Earth rotates, these upper-altitude winds blow from west to east. Waves in the jet stream – a consequence of Earth’s rotation and variations in the surface land, terrain and oceans – can cause air to diverge, or spread out, at certain points. As the air spreads out, the number of air molecules in a column decreases, ultimately reducing the air pressure at Earth’s surface.

The pressure can drop quite dramatically over a few days or even just a few hours, leading to the birth of a low-pressure system – what meteorologists call an extratropical cyclone.

The opposite chain of events, with air converging at other locations, can form high pressure at the surface.

In between these low-pressure and high-pressure systems is a strong change in pressure over a distance – a pressure gradient. And that pressure gradient leads to strong winds. Earth’s rotation causes these winds to spiral around areas of high and low pressure. These highs and lows are like large circular mixers, with air blowing clockwise around high pressure and counterclockwise around low pressure. This flow pattern blows warm air northward toward the poles east of lows and cool air southward toward the equator west of lows.

A map shows pressure changes don't follow a straight line.
A map illustrates lines of surface pressure, called isobars, with areas of high and low pressure marked for March 14, 2025. Winds are strongest when isobars are packed most closely together.
Plymouth State University, CC BY-NC-SA

As the waves in the jet stream migrate from west to east, so do the surface lows and highs, and with them, the corridors of strong winds.

That’s what the U.S. experienced when a strong extratropical cyclone produced winds stretching thousands of miles that whipped up dust storms, spread wildfires and even caused tornadoes and blizzards in the central and southern U.S. in March 2025.

Whipping up dust storms and spreading fires

The jet stream over the U.S. is strongest and often the most “wavy” in the springtime, when the south-to-north difference in temperature is often the strongest.

Winds associated with large-scale pressure systems can become quite strong in areas where there is limited friction at the ground, like the flat, less forested terrain of the Great Plains. One of the biggest risks is dust storms in arid regions of west Texas or eastern New Mexico, exacerbated by drought in these areas.

Downtown is barely visible through a haze of dust.
A dust storm hit Albuquerque, N.M., on March 18, 2025. Another dust storm a few days earlier in Kansas caused a deadly pileup involving dozens of vehicles on I-70.
AP Photo/Roberto E. Rosales

When the ground and vegetation are dry and the air has low relative humidity, high winds can also spread wildfires out of control.

Even more intense winds can occur when the pressure gradient interacts with terrain. Winds can sometimes rush faster downslope, as happens in the Rockies or with the Santa Ana winds that fueled devastating wildfires in the Los Angeles area in January.

Violent tornadoes and storms

Of course, winds can become even stronger and more violent on local scales associated with thunderstorms.

When thunderstorms form, hail and precipitation in them can cause the air to rapidly fall in a downdraft, causing very high pressure under these storms. That pressure forces the air to spread out horizontally when it reaches the ground. Meteorologists call these straight line winds, and the process that forms them is a downburst. Large thunderstorms or chains of them moving across a region can cause large swaths of strong wind over 60 mph, called a derecho.

Finally, some of nature’s strongest winds occur inside tornadoes. They form when the winds surrounding a thunderstorm change speed and direction with height. This can cause part of the storm to rotate, setting off a chain of events that may lead to a tornado and winds as strong as 300 mph in the most violent tornadoes.

YouTube video: How a tornado forms. Source: NOAA.

Tornado winds are also associated with an intense pressure gradient. The pressure inside the center of a tornado is often very low and varies considerably over a very small distance.

It’s no coincidence that localized violent winds from thunderstorm downbursts and tornadoes often occur amid large-scale windstorms. Extratropical cyclones often draw warm, moist air northward on strong winds from the south, which is a key ingredient for thunderstorms. Storms also become more severe and may produce tornadoes when the jet stream is in close proximity to these low-pressure centers. In the winter and early spring, cold air funneling south on the northwest side of strong extratropical cyclones can even lead to blizzards.

So, the same wave in the jet stream can lead to strong winds, blowing dust and fire danger in one region, while simultaneously triggering a tornado outbreak and a blizzard in other regions.

Chris Nowotarski, Associate Professor of Atmospheric Science, Texas A&M University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


5 years on, true counts of COVID-19 deaths remain elusive − and research is hobbled by lack of data

Published on theconversation.com – Dylan Thomas Doyle, Ph.D. Candidate in Information Science, University of Colorado Boulder – 2025-03-20 07:47:00

National COVID-19 memorial wall for the five-year anniversary on March 11, 2025, in London, England.
Andrew Aitchison/In Pictures via Getty Images

Dylan Thomas Doyle, University of Colorado Boulder

In the early days of the COVID-19 pandemic, researchers struggled to grasp the rate of the virus’s spread and the number of related deaths. While hospitals tracked cases and deaths within their walls, the broader picture of mortality across communities remained frustratingly incomplete.

Policymakers and researchers quickly discovered a troubling pattern: Many deaths linked to the virus were never officially counted. A study analyzing data from over 3,000 U.S. counties between March 2020 and August 2022 found nearly 163,000 excess deaths from natural causes that were missing from official mortality records.

Excess deaths, meaning those that exceed the number expected based on historical trends, serve as a key indicator of underreported deaths during health crises. Many of these uncounted deaths were later tied to COVID-19 through reviews of medical records, death certificates and statistical modeling.
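The excess-deaths calculation itself is simple arithmetic: deaths observed during a crisis minus the number a historical baseline would predict. The Python sketch below uses made-up weekly counts for one hypothetical county purely to illustrate the idea.

expected_from_history = [210, 205, 215, 220, 212]  # baseline predicted from pre-pandemic trends
observed = [230, 260, 310, 355, 340]               # deaths actually recorded each week

excess = [obs - exp for obs, exp in zip(observed, expected_from_history)]
print(excess)       # weekly excess deaths: [20, 55, 95, 135, 128]
print(sum(excess))  # 433 total excess deaths over the period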

In addition, lack of real-time tracking for medical interventions during those early days slowed vaccine development by delaying insights into which treatments worked and how people were responding to newly circulating variants.

Five years since the beginning of COVID-19, new epidemics such as bird flu are emerging worldwide, and researchers are still finding it difficult to access the data about people’s deaths that they need to develop lifesaving interventions.

How can the U.S. mortality data system improve? I’m a technology infrastructure researcher, and my team and I design policy and technical systems to reduce inefficiency in health care and government organizations. By analyzing the flow of mortality data in the U.S., we found several areas of the system that could use updating.

Critical need for real-time data

A death record includes key details beyond just the fact of death, such as the cause, contributing conditions, demographics, place of death and sometimes medical history. This information is crucial for researchers to be able to analyze trends, identify disparities and drive medical advances.
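As a rough illustration of what such a record holds, here is a hypothetical Python sketch of the fields described above. The field names are invented for clarity; they do not reflect the actual layout of the U.S. Standard Certificate of Death.

from dataclasses import dataclass
from datetime import date

@dataclass
class DeathRecord:
    date_of_death: date
    place_of_death: str                 # e.g., hospital, home, long-term care facility
    immediate_cause: str                # condition directly leading to death
    contributing_conditions: list[str]  # other conditions listed on the certificate
    age: int
    sex: str
    race_ethnicity: str

record = DeathRecord(
    date_of_death=date(2022, 1, 15),
    place_of_death="Hospital, Example County",
    immediate_cause="COVID-19",
    contributing_conditions=["diabetes mellitus", "hypertension"],
    age=78,
    sex="F",
    race_ethnicity="White, non-Hispanic",
)
print(record.immediate_cause)  # researchers aggregate fields like this to analyze trends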

Approximately 2.8 million death records are added to the U.S. mortality data system each year. But in 2022 – the most recent official count available – when the world was still in the throes of the pandemic, 3,279,857 deaths were recorded in the federal system. Still, this figure is widely considered to be a major undercount of true excess deaths from COVID-19.

In addition, real-time tracking of COVID-19 mortality data was severely lacking. This process involves the continuous collection, analysis and reporting of deaths from hospitals, health agencies and government databases by integrating electronic health records, lab reports and public health surveillance systems. Ideally, it provides up-to-date insights for decision-making, but during the COVID-19 pandemic, these tracking systems lagged and failed to generate comprehensive data.

Two health care workers in full PPE attending to a patient lying on hospital bed
Getting real-time COVID-19 data from hospitals and other agencies into the hands of researchers proved difficult.
Gerald Herbert/AP Photo

Without comprehensive data on prior COVID-19 infections, antibody responses and adverse events, researchers faced challenges designing clinical trials to predict how long immunity would last and optimize booster schedules.

Such data is essential in vaccine development because it helps identify who is most at risk, which variants and treatments affect survival rates, and how vaccines should be designed and distributed. And as part of the broader U.S. vital records system, mortality data is essential for medical research, including evaluating public health programs, identifying health disparities and monitoring disease.

At the heart of the problem is the inefficiency of government policy, particularly outdated public health reporting systems and slow data modernization efforts that hinder timely decision-making. These long-standing policies, such as reliance on paper-based death certificates and disjointed state-level reporting, have failed to keep pace with real-time data needs during crises such as COVID-19.

These policy shortcomings lead to delays in reporting and lack of coordination between hospital organizations, state government vital records offices and federal government agencies in collecting, standardizing and sharing death records.

History of US mortality data

The U.S. mortality data system has been cobbled together through a disparate patchwork of state and local governments, federal agencies and public health organizations over the course of more than a century and a half. It has been shaped by advances in public health, medical record-keeping and technology. From its inception to the present day, the mortality data system has been plagued by inconsistencies, inefficiencies and tensions between medical professionals, state governments and the federal government.

The first national efforts to track information about deaths began in the 1850s when the U.S. Census Bureau started collecting mortality data as part of the decennial census. However, these early efforts were inconsistent, as death registration was largely voluntary and varied widely across states.

In the early 20th century, the establishment of the National Vital Statistics System brought greater standardization to mortality data. For example, the system required all U.S. states and territories to standardize their death certificate format. It also consolidated mortality data at the federal level, whereas mortality data was previously stored at the state level.

However, state and federal reporting remained fragmented. For example, states had no uniform timeline for submitting mortality data, resulting in some states taking months or even years to finalize and release death records. Local or state-level paperwork processing practices also remained varied and at times contradictory.

Close-up of blank form titled CERTIFICATE OF DEATH
Death record processing varies by state.
eric1513/iStock via Getty Images Plus

To begin to close gaps in reporting timelines to aid medical researchers, in 1981 the National Center for Health Statistics – a division of the Centers for Disease Control and Prevention – introduced the National Death Index. This is a centralized database of death records collected from state vital statistics offices, making it easier to access death data for health and medical research. The system was originally paper-based, with the aim of allowing researchers to track the deaths of study participants without navigating complex bureaucracies.

As time has passed, the National Death Index and state databases have become increasingly digital. The rise of electronic death registration systems in recent decades has improved processing speed when it comes to researchers accessing mortality data from the National Death Index. However, while the index has solved some issues related to gaps between state and federal data, other issues, such as high fees and inconsistency in state reporting times, still plague it.

Accessing the data that matters most

With the Trump administration’s increasing removal of CDC public health datasets, it is unclear whether policy reform for mortality data will be addressed anytime soon.

Experts fear that the removal of CDC datasets has set a precedent for the Trump administration to cross further lines in its attempts to influence the research and data published by the CDC. The longer-term impact of the current administration’s public health policy on mortality data and disease response is not yet clear.

What is clear is that five years since COVID-19, the U.S. mortality tracking system remains unequipped to meet emerging public health crises. Without addressing these challenges, the U.S. may not be able to respond quickly enough when the next crisis threatens American lives.

Dylan Thomas Doyle, Ph.D. Candidate in Information Science, University of Colorado Boulder

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Atlantic sturgeon were fished almost to extinction − ancient DNA reveals how Chesapeake Bay population changed over centuries

Published on theconversation.com – Natalia Przelomska, Research Associate in Archaeogenomics, National Museum of Natural History, Smithsonian Institution – 2025-03-20 07:47:00

Sturgeon can be several hundred pounds each.
cezars/E+ via Getty Images

Natalia Przelomska, Smithsonian Institution and Logan Kistler, Smithsonian Institution

Sturgeons are one of the oldest groups of fishes. Sporting an armor of five rows of bony, modified scales called dermal scutes and a sharklike tail fin, this group of several-hundred-pound beasts has survived for approximately 160 million years. Because their physical appearance has changed very little over time, supported by a slow rate of evolution, sturgeon have been called living fossils.

Despite their survival through several geological time periods, many present-day sturgeon species are threatened with extinction, with 17 of 27 species listed as “critically endangered.”

Conservation practitioners such as the Virginia Commonwealth University monitoring team are working hard to support recovery of Atlantic sturgeon in the Chesapeake Bay area. But it’s not clear what baseline population level people should strive toward restoring. How do today’s sturgeon populations compare with those of the past?

Three people carefully lower a large fish over the side of a boat toward the water
VCU monitoring team releases an adult Atlantic sturgeon back into the estuary.
Matt Balazik

We are a molecular anthropologist and a biodiversity scientist who focus on species that people rely on for subsistence. We study the evolution, population health and resilience of these species over time to better understand humans’ interaction with their environments and the sustainability of food systems.

For our recent sturgeon project, we joined forces with fisheries conservation biologist Matt Balazik, who conducts on-the-ground monitoring of Atlantic sturgeon, and Torben Rick, a specialist in North American coastal zooarchaeology. Together, we wanted to look into the past and see how much sturgeon populations have changed, focusing on the James River in Virginia. A more nuanced understanding of the past could help conservationists better plan for the future.

Sturgeon loomed large for millennia

In North America, sturgeon have played important subsistence and cultural roles in Native communities, which marked the seasons by the fishes’ behavioral patterns. Large summertime aggregations of lake sturgeon (Acipenser fulvescens) in the Great Lakes area inspired one folk name for the August full moon – the sturgeon moon. Woodland Era pottery remnants at archaeological sites from as long as 2,000 years ago show that the fall and springtime runs of Atlantic sturgeon (Acipenser oxyrinchus) upstream were celebrated with feasting.

triangular-shaped bone with round cavities
Archaeologists uncover bony scutes – modified scales that resemble armor for the living fish – in places where people relied on sturgeon for subsistence.
Logan Kistler and Natalia Przelomska

Archaeological finds of sturgeon remains show that early colonial settlers in North America, notably those who established Jamestown in the Chesapeake Bay area in 1607, also prized these fish. When Captain John Smith was leading Jamestown, he wrote that “there was more sturgeon here than could be devoured by dog or man.” The fish may have helped the survival of this fortress-colony, which was both stricken with drought and fostering turbulent relationships with the Native inhabitants.

This abundance is in stark contrast to today, when sightings of migrating fish are sparse. Exploitation during the past 300 years was the key driver of Atlantic sturgeon decline. Demand for caviar drove the relentless fishing pressure throughout the 19th century. The Chesapeake was the second-most exploited sturgeon fishery on the Eastern Seaboard up until the early 20th century, when the fish became scarce.

Man pulls large fish over side of boat
Conservation biologists capture the massive fish for monitoring purposes, which includes clipping a tiny part of the fin for DNA analysis.
Matt Balazik

At that point, local protection regulations were established, but only in 1998 was a moratorium on harvesting these fish declared. Meanwhile, abundance of Atlantic sturgeon remained very low, which can be explained in part by their lifespan. Short-lived fish such as herring and shad can recover population numbers much faster than Atlantic sturgeon, which live for up to 60 years and take a long time to reach reproductive age – up to around 12 years for males and as many as 28 years for females.

To help manage and restore an endangered species, conservation biologists tend to split the population into groups based on ranges. The Chesapeake Bay is one of five “distinct population segments” created for Atlantic sturgeon under the 2012 U.S. Endangered Species Act listing.

Since then, conservationists have pioneered genetic studies on Atlantic sturgeon, demonstrating through the power of DNA that natal river – where an individual fish is born – and season of spawning are both important for distinguishing subpopulations within each regional group. Scientists have also described genetic diversity in Atlantic sturgeon; more genetic variety suggests they have more capacity to adapt when facing new, potentially challenging conditions.

map highlighting Maycock's Point, Hatch Site, Jamestown and Williamsburg on the James River
The study focused on Atlantic sturgeon from the Chesapeake Bay region, past and present. The four archaeological sites included are highlighted.
Przelomska NAS et al., Proc. R. Soc. B 291: 20241145, CC BY

Sturgeon DNA, then and now

Archaeological remains are a direct source of data on genetic diversity in the past. We can analyze the genetic makeup of sturgeons that lived hundreds of years ago, before intense overfishing depleted their numbers. Then we can compare that baseline with today’s genetic diversity.
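One common way to make that comparison concrete is a diversity statistic such as nucleotide diversity: the average number of differences per site between pairs of sequences in a group. The Python sketch below is not the study’s actual pipeline; it uses toy sequences only to show how an ancient-versus-modern comparison works.

from itertools import combinations

def nucleotide_diversity(sequences: list[str]) -> float:
    """Average pairwise differences per site across all pairs of aligned sequences."""
    pairs = list(combinations(sequences, 2))
    if not pairs:
        return 0.0
    diffs = sum(sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs)
    return diffs / (len(pairs) * len(sequences[0]))

# Toy alignments standing in for archaeological and modern sturgeon sequences
ancient = ["ACGTACGT", "ACGTACGA", "ACCTACGT", "ACGTTCGT"]
modern  = ["ACGTACGT", "ACGTACGT", "ACGTACGA", "ACGTACGT"]

print(nucleotide_diversity(ancient))  # ~0.19: more variety in the past
print(nucleotide_diversity(modern))   # ~0.06: diversity lost after overfishing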

The James River was a great case study for testing out this approach, which we call an archaeogenomics time series. Having obtained information on the archaeology of the Chesapeake region from our collaborator Leslie Reeder-Myers, we sampled remains of sturgeon – their scutes and spines – at a precolonial-era site where people lived from about 200 C.E. to about 900 C.E. We also sampled from important colonial sites Jamestown (1607-1610) and Williamsburg (1720-1775). And we complemented that data from the past with tiny clips from the fins of present-day, live fish that Balazik and his team sampled during monitoring surveys.

scattering of small bone shards spilling out of ziplock bag, with a purple-gloved hand
Scientists separate Atlantic sturgeon scute fragments from larger collections of zooarchaeological remains, to then work on the scutes in a lab dedicated to studying ancient DNA.
Torben Rick and Natalia Przelomska

DNA tends to get physically broken up and biochemically damaged with age. So we relied on special protocols in a lab dedicated to studying ancient DNA to minimize the risk of contamination and enhance our chances of successfully collecting genetic material from these sturgeon.

Atlantic sturgeon have 122 chromosomes of nuclear DNA – over five times as many as people do. We focused on a few genetic regions, just enough to get an idea of the James River population groupings and how genetically distinct they are from one another.

We were not surprised to see that fall-spawning and spring-spawning groups were genetically distinct. What stood out, though, was how starkly different they were, which is something that can happen when a population’s numbers drop to near-extinction levels.

We also looked at the fishes’ mitochondrial DNA, a compact molecule that is easier to obtain ancient DNA from compared with the nuclear chromosomes. With our collaborator Audrey Lin, we used the mitochondrial DNA to confirm our hypothesis that the fish from archaeological sites were more genetically diverse than present-day Atlantic sturgeon.

Strikingly, we discovered that mitochondrial DNA did not always group the fish by season or even by their natal river. This was unexpected, because Atlantic sturgeon tend to return to their natal rivers for breeding. Our interpretation of this genetic finding is that over very long timescales – many thousands of years – changes in the global climate and in local ecosystems would have driven a given sturgeon population to migrate into a new river system, and possibly at a later stage back to its original one. This notion is supported by other recent documentation of fish occasionally migrating over long distances and mixing with new groups.

Our study used archaeology, history and ecology together to describe the decline of Atlantic sturgeon. Based on the diminished genetic diversity we measured, we estimate that the Atlantic sturgeon populations we studied are about a fifth of what they were before colonial settlement. Less genetic variability means these smaller populations have less potential to adapt to changing conditions. Our findings will help conservationists plan for the continued recovery of these living fossils well into the future.

Natalia Przelomska, Research Associate in Archaeogenomics, National Museum of Natural History, Smithsonian Institution and Logan Kistler, Curator of Archaeobotany and Archaeogenomics, National Museum of Natural History, Smithsonian Institution

This article is republished from The Conversation under a Creative Commons license. Read the original article.
