
The Conversation

Trump’s push to control Greenland echoes US purchase of Alaska from Russia in 1867


theconversation.com – William L. Iggiagruk Hensley, Visiting Distinguished Professor, University of Alaska Anchorage – 2025-01-08 16:06:00

The U.S. bought Alaska and its significant natural resources and beautiful scenery for what amounts to a steal.

AP Photo/Mark Thiessen

William L. Iggiagruk Hensley, University of Alaska Anchorage

President-elect Donald Trump is again signaling his interest in Greenland through a series of provocative statements in which he’s mused about the prospect of the U.S. taking ownership – perhaps by force or economic coercion – of the world’s largest island by area.

Talk of a takeover of Greenland may seem fanciful. But it wouldn’t be the first time the U.S. was able to procure a piece of the Arctic. The U.S. bought Alaska from Russia in 1867. To mark the 150th anniversary of the sale in 2017, we asked William L. Iggiagruk Hensley, a visiting professor at the University of Alaska Anchorage, to write about that historic sale. This is the article we published then, with minor updates.

On March 30, 1867, U.S. Secretary of State William H. Seward and Russian envoy Baron Edouard de Stoeckl signed the Treaty of Cession. With the stroke of a pen, Tsar Alexander II had ceded Alaska, his country’s last remaining foothold in North America, to the United States for US$7.2 million.

That sum, amounting to just $138 million in today’s dollars, brought to an end Russia’s 125-year odyssey in Alaska and its expansion across the treacherous Bering Sea, which at one point extended the Russian Empire as far south as Fort Ross, California, 90 miles from San Francisco Bay.

Today, Alaska is one of the richest U.S. states thanks to its abundance of natural resources, such as petroleum, gold and fish, as well as its vast expanse of pristine wilderness and strategic location as a window on Russia and gateway to the Arctic.

So, what prompted Russia to withdraw from its American beachhead? And how did it come to possess it in the first place?

As a descendant of Inupiaq Eskimos, I have been living and studying this history all my life. In a way, there are two histories of how Alaska came to be American – and two perspectives. One concerns how the Russians took “possession” of Alaska and eventually ceded it to the U.S. The other is from the perspective of my people, who have lived in Alaska for thousands of years, and for whom the anniversary of the cession brings mixed emotions, including immense loss but also optimism.

Russia looks east

The lust for new lands that brought Russia to Alaska and eventually California began in the 16th century, when the country was a fraction of its current size.

That began to change in 1581, when Russia overran a Siberian territory known as the Khanate of Sibir, which was controlled by a grandson of Genghis Khan. This key victory opened up Siberia, and within 60 years the Russians were at the Pacific.

The Russian advance across Siberia was fueled in part by the lucrative fur trade, a desire to expand the Russian Orthodox Christian faith to the “heathen” populations in the east and the addition of new taxpayers and resources to the empire.

In the early 18th century, Peter the Great – who created Russia’s first navy – wanted to know how far the Asian landmass extended to the east. The Siberian city of Okhotsk became the staging point for two explorations he ordered. And in 1741, Vitus Bering successfully crossed the strait that bears his name and sighted Mt. Saint Elias, near what is now the village of Yakutat, Alaska.

Bering’s second Kamchatka expedition ended in disaster for him personally: adverse weather on the return journey led to a shipwreck on one of the westernmost Aleutian Islands, and he died of scurvy there in December 1741. For Russia, however, the expedition was an incredible success. The surviving crew repaired the ship, stocked it with hundreds of the sea otters, foxes and fur seals that were abundant there and returned to Siberia, impressing Russian fur hunters with their valuable cargo. This prompted something akin to the Klondike gold rush 150 years later.

Challenges emerge

But maintaining these settlements wasn’t easy. Russians in Alaska, who numbered no more than 800 at their peak, faced the reality of being half a globe away from Saint Petersburg, then the capital of the empire, making communications a key problem.

Also, Alaska was too far north to allow for significant agriculture and therefore unfavorable as a place to send large numbers of settlers. So they began exploring lands farther south, at first looking only for people to trade with so they could import the foods that wouldn’t grow in Alaska’s harsh climate. They sent ships to what is now California, established trade relations with the Spaniards there and eventually set up their own settlement at Fort Ross in 1812.

Thirty years later, however, the entity set up to handle Russia’s American explorations failed and sold what remained. Not long after, the Russians began to seriously question whether they could continue their Alaskan colony as well.

For starters, the colony was no longer profitable after the sea otter population was decimated. Then there was the fact that Alaska was difficult to defend, and Russia was short on cash due to the costs of the Crimean War.

Americans eager for a deal

So, clearly, the Russians were ready to sell, but what motivated the Americans to want to buy?

In the 1840s, the United States had expanded its interests to Oregon, annexed Texas, fought a war with Mexico and acquired California. Afterward, William H. Seward, who would later become secretary of state, wrote in March 1848:

“Our population is destined to roll resistless waves to the ice barriers of the north, and to encounter oriental civilization on the shores of the Pacific.”

Almost 20 years after expressing his thoughts about expansion into the Arctic, Seward accomplished his goal.

In Alaska, the Americans foresaw a potential for gold, fur and fisheries, as well as more trade with China and Japan. The Americans worried that England might try to establish a presence in the territory, and the acquisition of Alaska, it was believed, would help the U.S. become a Pacific power. And overall the government was in an expansionist mode backed by the then-popular idea of “manifest destiny.”

So a deal with incalculable geopolitical consequences was struck, and the Americans seemed to get quite a bargain for their $7.2 million.

Just in terms of wealth, the U.S. gained about 370 million acres of mostly pristine wilderness, including 220 million acres of what are now federal parks and wildlife refuges. Hundreds of billions of dollars in whale oil, fur, copper, gold, timber, fish, platinum, zinc, lead and petroleum have been produced in Alaska over the years – allowing the state to do without a sales or income tax and give every resident an annual stipend. Alaska still likely has billions of barrels of oil reserves.

The state is also a key part of the United States defense system, with military bases located in Anchorage and Fairbanks, and it is the country’s only connection to the Arctic, which ensures it has a seat at the table as melting glaciers allow the exploration of the region’s significant resources.

Impact on Alaska Natives

But there’s an alternate version of this history.

When Bering finally located Alaska in 1741, it was home to about 100,000 people, including Inuit, Athabascan, Yupik, Unangan and Tlingit. Some 17,000 of them lived on the Aleutian Islands alone.

Despite the relatively small number of Russians who at any one time lived at one of their settlements – mostly on the Aleutian Islands, Kodiak, the Kenai Peninsula and Sitka – they ruled over the Native populations in their areas with an iron hand, taking children of the leaders as hostages, destroying kayaks and other hunting equipment to control the men, and showing extreme force when necessary.

The Russians brought with them weaponry such as firearms, swords, cannons and gunpowder, which helped them secure a foothold in Alaska along the southern coast. They used firepower, spies and secured forts to maintain security, and they selected Christianized local leaders to carry out their wishes. They also met resistance, however, such as from the Tlingits, who were capable warriors, ensuring their hold on territory was tenuous.

By the time of the cession, only 50,000 Indigenous people were estimated to be left, as well as 483 Russians and 1,421 Creoles (descendants of Russian men and Indigenous women).

On the Aleutian Islands alone, the Russians enslaved or killed thousands of Aleuts. Their population plummeted to 1,500 in the first 50 years of Russian occupation due to a combination of warfare, disease and enslavement.

When the Americans took over, the United States was still engaged in its Indian wars, so they looked at Alaska and its Indigenous inhabitants as potential adversaries. Alaska was made a military district by Gen. Ulysses S. Grant.

For their part, Alaska Natives claimed that they still held title to the territory as its original inhabitants, having neither lost the land in war nor ceded it to any country – including the U.S., which technically didn’t buy it from the Russians but rather bought the right to negotiate with the Indigenous populations. Still, Natives were denied U.S. citizenship until 1924, when the Indian Citizenship Act was passed.

During that time, Alaska Natives had no rights as citizens and could not vote, own property or file for mining claims. The Bureau of Indian Affairs, in conjunction with missionary societies, in the 1860s began a campaign to eradicate Indigenous languages, religion, art, music, dance, ceremonies and lifestyles.

It was only in 1936 that the Indian Reorganization Act authorized tribal governments to form, and not until nine years later was overt discrimination outlawed by Alaska’s Anti-Discrimination Act of 1945. The law banned signs such as “No Natives Need Apply” and “No Dogs or Natives Allowed,” which were common at the time.

Statehood and a disclaimer

Eventually, however, the situation improved markedly for Natives.

Alaska finally became a state in 1959, when President Dwight D. Eisenhower signed the Alaska Statehood Act, allotting it 104 million acres of the territory. And in an unprecedented nod to the rights of Alaska’s Indigenous populations, the act contained a clause emphasizing that citizens of the new state were declining any right to land subject to Native title – a thorny topic in itself, because Natives claimed the entire territory.

A result of this clause was that in 1971 President Richard Nixon ceded 44 million acres of federal land, along with $1 billion, to Alaska’s Native populations, which numbered about 75,000 at the time. That came after a Land Claims Task Force that I chaired gave the state ideas about how to resolve the issue.

Today, Alaska has a population of 740,000, of which 120,000 are Natives.

As the United States celebrates the signing of the Treaty of Cession, we all – Alaskans, Natives and Americans of the lower 48 – should salute Secretary of State William H. Seward, the man who eventually brought democracy and the rule of law to Alaska.

This article was first published on March 29, 2017.

William L. Iggiagruk Hensley, Visiting Distinguished Professor, University of Alaska Anchorage

This article is republished from The Conversation under a Creative Commons license. Read the original article.




Contaminated milk from one plant in Illinois sickened thousands with Salmonella in 1985 − as outbreaks rise in the US, lessons from this one remain true


theconversation.com – Michael Petros, Clinical Assistant Professor of Environmental and Occupational Health Sciences, University of Illinois Chicago – 2025-05-07 07:34:00

A valve that mixed raw milk with pasteurized milk at Hillfarm Dairy may have been the source of contamination. This was the milk processing area of the plant.
AP Photo/Mark Elias

Michael Petros, University of Illinois Chicago

In 1985, contaminated milk in Illinois led to a Salmonella outbreak that infected hundreds of thousands of people across the United States and caused at least 12 deaths. At the time, it was the largest single outbreak of foodborne illness in the U.S. and remains the worst outbreak of Salmonella food poisoning in American history.

Many questions circulated during the outbreak. How could this contamination occur in a modern dairy farm? Was it caused by a flaw in engineering or processing, or was this the result of deliberate sabotage? What roles, if any, did politics and failed leadership play?

From my 50 years of working in public health, I’ve found that reflecting on the past can help researchers and officials prepare for future challenges. Revisiting this investigation and its outcome provides lessons on how food safety inspections go hand in hand with consumer protection and public health, especially as hospitalizations and deaths from foodborne illnesses rise.

Contamination, investigation and intrigue

The Illinois Department of Public Health and the U.S. Centers for Disease Control and Prevention led the investigation into the outbreak. The public health laboratories of the city of Chicago and state of Illinois were also closely involved in testing milk samples.

Investigators and epidemiologists from local, state and federal public health agencies found that specific lots of milk with expiration dates up to April 17, 1985, were contaminated with Salmonella. The outbreak may have been caused by a valve at a processing plant that allowed pasteurized milk to mix with raw milk, which can carry several harmful microorganisms, including Salmonella.

Overall, labs and hospitals in Illinois and five other Midwest states – Indiana, Iowa, Michigan, Minnesota and Wisconsin – reported over 16,100 cases of suspected Salmonella poisoning to health officials.

To make dairy products, skimmed milk is usually separated from cream, then blended back together in different proportions to achieve the desired fat content. While most dairies pasteurize their products after blending, Hillfarm Dairy in Melrose Park, Illinois, pasteurized the milk first and then blended it into various products such as skim milk and 2% milk.

Subsequent examination of the production process suggested that Salmonella may have grown in the threads of a screw-on cap used to seal an end of a mixing pipe. Investigators had also found this strain of Salmonella 10 months earlier in a much smaller outbreak in the Chicago area.

Salmonella is a common cause of food poisoning.
Volker Brinkmann/Max Planck Institute for Infection Biology via PLoS One, CC BY-SA

Finding the source

The contaminated milk was produced at Hillfarm Dairy in Melrose Park, which was operated at the time by Jewel Companies Inc. During an April 3 inspection of the company’s plant, the Food and Drug Administration found 13 health and safety violations.

The legal fallout of the outbreak expanded when the Illinois attorney general filed suit against Jewel Companies Inc., alleging that employees at as many as 18 stores in the grocery chain violated water pollution laws when they dumped potentially contaminated milk into storm sewers. Later, a Cook County judge found Jewel Companies Inc. in violation of the court order to preserve milk products suspected of contamination and maintain a record of what happened to milk returned to the Hillfarm Dairy.

Political fallout also ensued. The Illinois governor at the time, James Thompson, fired the director of the Illinois Public Health Department when it was discovered that he was vacationing in Mexico at the onset of the outbreak and failed to return to Illinois. Notably, the health director at the time of the outbreak was not a health professional. Following this episode, the governor appointed public health professional and medical doctor Bernard Turnock as director of the Illinois Department of Public Health.

In 1987, after a nine-month trial, a jury determined that Jewel officials did not act recklessly when Salmonella-tainted milk caused one of the largest food poisoning outbreaks in U.S. history. No punitive damages were awarded to victims, and the Illinois Appellate Court later upheld the jury’s decision.

Video: Raw milk is linked to many foodborne illnesses.

Lessons learned

History teaches more than facts, figures and incidents. It provides an opportunity to reflect on how to learn from past mistakes in order to adapt to future challenges. The largest Salmonella outbreak in the U.S. to date provides several lessons.

For one, disease surveillance is indispensable to preventing outbreaks, both then and now. People remain vulnerable to ubiquitous microorganisms such as Salmonella and E. coli, and early detection of an outbreak could stop it from spreading and getting worse.

Additionally, food production facilities can maintain a safe food supply with careful design and monitoring. Revisiting consumer protections can help regulators keep pace with new threats from new or unfamiliar pathogens.

Finally, there is no substitute for professional public health leadership with the competence and expertise to respond effectively to an emergency.

Michael Petros, Clinical Assistant Professor of Environmental and Occupational Health Sciences, University of Illinois Chicago

This article is republished from The Conversation under a Creative Commons license. Read the original article.





Note: The following A.I. based commentary is not part of the original article, reproduced above, but is offered in the hopes that it will promote greater media literacy and critical thinking, by making any potential bias more visible to the reader –Staff Editor.

Political Bias Rating: Centrist

The article provides an analytical, factual recounting of the 1985 Salmonella outbreak, with an emphasis on public health, safety standards, and lessons learned from past mistakes. It critiques the failures in leadership and oversight during the incident but avoids overt ideological framing. While it highlights political accountability, particularly the firing of a public health official and the appointment of a medical professional, it does so in a balanced manner without assigning blame to a specific political ideology. The content stays focused on the public health aspect and the importance of professional leadership, reflecting a centrist perspective in its delivery.



Predictive policing AI is on the rise − making it accountable to the public could curb its harmful effects


theconversation.com – Maria Lungu, Postdoctoral Researcher of Law and Public Administration, University of Virginia – 2025-05-06 07:35:00

Data like this seven-day crime map from Oakland, Calif., feeds predictive policing AIs.
City of Oakland via CrimeMapping.com

Maria Lungu, University of Virginia

The 2002 sci-fi thriller “Minority Report” depicted a dystopian future where a specialized police unit was tasked with arresting people for crimes they had not yet committed. Directed by Steven Spielberg and based on a short story by Philip K. Dick, the drama revolved around “PreCrime” − a system informed by a trio of psychics, or “precogs,” who anticipated future homicides, allowing police officers to intervene and prevent would-be assailants from claiming their targets’ lives.

The film probes at hefty ethical questions: How can someone be guilty of a crime they haven’t yet committed? And what happens when the system gets it wrong?

While there is no such thing as an all-seeing “precog,” key components of the future that “Minority Report” envisioned have become reality even faster than its creators imagined. For more than a decade, police departments across the globe have been using data-driven systems geared toward predicting when and where crimes might occur and who might commit them.

Far from an abstract or futuristic conceit, predictive policing is a reality. And market analysts are predicting a boom for the technology.

Given the challenges in using predictive machine learning effectively and fairly, predictive policing raises significant ethical concerns. With no technological fixes on the horizon, one approach to addressing these concerns stands out: Treat government use of the technology as a matter of democratic accountability.

Troubling history

Predictive policing relies on artificial intelligence and data analytics to anticipate potential criminal activity before it happens. It can involve analyzing large datasets drawn from crime reports, arrest records and social or geographic information to identify patterns and forecast where crimes might occur or who may be involved.

Law enforcement agencies have used data analytics to track broad trends for many decades. Today’s powerful AI technologies, however, take in vast amounts of surveillance and crime report data to provide much finer-grained analysis.

Police departments use these techniques to help determine where they should concentrate their resources. Place-based prediction focuses on identifying high-risk locations, also known as hot spots, where crimes are statistically more likely to happen. Person-based prediction, by contrast, attempts to flag individuals who are considered at high risk of committing or becoming victims of crime.

These types of systems have been the subject of significant public concern. Under a so-called “intelligence-led policing” program in Pasco County, Florida, the sheriff’s department compiled a list of people considered likely to commit crimes and then repeatedly sent deputies to their homes. More than 1,000 Pasco residents, including minors, were subject to random visits from police officers and were cited for things such as missing mailbox numbers and overgrown grass.

Video: Lawsuits forced the Pasco County, Fla., Sheriff’s Office to end its troubled predictive policing program.

Four residents sued the county in 2021, and last year they reached a settlement in which the sheriff’s office admitted that it had violated residents’ constitutional rights to privacy and equal treatment under the law. The program has since been discontinued.

This is not just a Florida problem. In 2020, Chicago decommissioned its “Strategic Subject List,” a system in which police used analytics to predict which prior offenders were likely to commit new crimes or become victims of future shootings. In 2021, the Los Angeles Police Department discontinued its use of PredPol, a software program designed to forecast crime hot spots that was criticized for low accuracy rates and for reinforcing racial and socioeconomic biases.

Necessary innovations or dangerous overreach?

The failure of these high-profile programs highlights a critical tension: Even though law enforcement agencies often advocate for AI-driven tools for public safety, civil rights groups and scholars have raised concerns over privacy violations, accountability issues and the lack of transparency. And despite these high-profile retreats from predictive policing, many smaller police departments are using the technology.

Most American police departments lack clear policies on algorithmic decision-making and provide little to no disclosure about how the predictive models they use are developed, trained or monitored for accuracy or bias. A Brookings Institution analysis found that in many cities, local governments had no public documentation on how predictive policing software functioned, what data was used, or how outcomes were evaluated.

Video: Predictive policing can perpetuate racial bias.

This opacity is what’s known in the industry as a “black box.” It prevents independent oversight and raises serious questions about the structures surrounding AI-driven decision-making. If a citizen is flagged as high-risk by an algorithm, what recourse do they have? Who oversees the fairness of these systems? What independent oversight mechanisms are available?

These questions are driving contentious debates in communities about whether predictive policing as a method should be reformed, more tightly regulated or abandoned altogether. Some people view these tools as necessary innovations, while others see them as dangerous overreach.

A better way in San Jose

But there is evidence that data-driven tools grounded in democratic values of due process, transparency and accountability may offer a stronger alternative to today’s predictive policing systems. What if the public could understand how these algorithms function, what data they rely on, and what safeguards exist to prevent discriminatory outcomes and misuse of the technology?

The city of San Jose, California, has embarked on a process that is intended to increase transparency and accountability around its use of AI systems. San Jose maintains a set of AI principles requiring that any AI tools used by city government be effective, transparent to the public and equitable in their effects on people’s lives. City departments also are required to assess the risks of AI systems before integrating them into their operations.

If implemented properly, these measures can effectively open the black box, dramatically reducing the degree to which AI companies can hide their code or their data behind things such as protections for trade secrets. Enabling public scrutiny of training data can reveal problems such as racial or economic bias, which can be mitigated but are extremely difficult, if not impossible, to eradicate.

Research has shown that when citizens feel that government institutions act fairly and transparently, they are more likely to engage in civic life and support public policies. Law enforcement agencies are likely to have stronger outcomes if they treat technology as a tool – rather than a substitute – for justice.

Maria Lungu, Postdoctoral Researcher of Law and Public Administration, University of Virginia

This article is republished from The Conversation under a Creative Commons license. Read the original article.





Note: The following A.I. based commentary is not part of the original article, reproduced above, but is offered in the hopes that it will promote greater media literacy and critical thinking, by making any potential bias more visible to the reader –Staff Editor.

Political Bias Rating: Center-Left

The article provides an analysis of predictive policing, highlighting both the technological potential and ethical concerns surrounding its use. While it presents factual information, it leans towards caution and skepticism regarding the fairness, transparency, and potential racial biases of these systems. The framing of these issues, along with an emphasis on democratic accountability, transparency, and civil rights, aligns more closely with center-left perspectives that emphasize government oversight, civil liberties, and fairness. The critique of predictive policing technologies without overtly advocating for their abandonment reflects a balanced but cautious stance on technology’s role in law enforcement.



Worsening allergies aren’t your imagination − windy days create the perfect pollen storm


theconversation.com – Christine Cairns Fortuin, Assistant Professor of Forestry, Mississippi State University – 2025-05-05 07:45:00

Windy days can mean more pollen and more sneezing.
mladenbalinovac/E+ via Getty Images

Christine Cairns Fortuin, Mississippi State University

Evolution has fostered many reproductive strategies across the spectrum of life. From dandelions to giraffes, nature finds a way.

One of those ways creates quite a bit of suffering for humans: pollen, the infamous male gametophyte of the plant kingdom.

In the Southeastern U.S., where I live, you know it’s spring when your car has turned yellow and pollen blankets your patio furniture and anything else left outside. Suddenly there are long lines at every car wash in town.

On heavy pollen days, cars can end up covered in yellow grains.
Scott Akerman/Flickr, CC BY

Even people who aren’t allergic to pollen – clearly an advantage for a pollination ecologist like me – can experience sneezing and watery eyes during the release of tree pollen each spring. Enough particulate matter in the air will irritate just about anyone, even if your immune system does not launch an all-out attack.

So, why is there so much pollen? And why does it seem to be getting worse?

2 ways trees spread their pollen

Trees don’t have an easy time in the reproductive game. As a tree, you have two options to disperse your pollen.

Option 1: Employ an agent, such as a butterfly or bee, that can carry your pollen to another plant of the same species.

The downside of this option is that you must invest in a showy flower display and a sweet scent to advertise yourself, and sugary nectar to pay your agent for its services.

A bee enjoys pollen from a cherry blossom. Pollen is a primary source of protein for bees.
Ivan Radic/Flickr, CC BY

Option 2, the budget option, is much less precise: Get a free ride on the wind.

Wind was the original pollinator, evolving long before animal-mediated pollination. It requires neither a showy flower nor a nectar reward. What it does require for pollination to succeed is ample amounts of lightweight, small-diameter pollen.

Why wind-blown pollen makes allergies worse

Wind is not an efficient pollinator, however. The probability of one pollen grain landing in the right location – the stigma or ovule of another plant of the same species – is infinitesimally small.

Therefore, wind-pollinated trees must compensate for this inefficiency by producing copious amounts of pollen, and it must be light enough to be carried.

For allergy sufferers, that can mean air filled with microscopic pollen grains that can get into your eyes, throat and lungs, sneak in through window screens and convince your immune system that you’ve inhaled a dangerous intruder.

When wind blows the tiny pollen grains of live oaks, allergy sufferers feel it.
Charles Willgren/Flickr, CC BY

Plants relying on animal-mediated pollination, by contrast, can produce heavier and stickier pollen to adhere to the body of an insect. So don’t blame the bees for your allergies – it’s really the wind.

Climate change has a role here, too

Plants initiate pollen release based on a few factors, including temperature and light cues. Many of our temperate tree species respond to cues that signal the beginning of spring, including warmer temperatures.

Studies have found that pollen seasons have intensified over the past three decades as the climate has warmed. One study that examined 60 locations across North America found that pollen seasons expanded by an average of 20 days from 1990 to 2018 and pollen concentrations increased by 21%.

That’s not all. Increasing carbon dioxide levels may also be driving increases in the quantity of tree pollen produced.

Why the Southeast gets socked

What could make this pollen boost even worse?

For the Southeastern U.S. in particular, strong windstorms are becoming more common and more intense − and not just hurricanes.

Anyone who has lived in the Southeast for the past couple of decades has likely noticed this. The region has more tornado warnings, more severe thunderstorms, more power outages. This is especially true in the mid-South, from Mississippi to Alabama.

Severity of wind and storm events mapped from NOAA data, 2012-2019, shows high activity over Mississippi and Alabama. Red areas have the most severe events.
Christine Cairns Fortuin

Since wind is the vector of airborne pollen, windier conditions can also make allergies worse. Pollen remains airborne for longer on windy days, and it travels farther.

To make matters worse, increasing storm activity may be doing more than just transporting pollen. Storms can also break apart pollen grains, creating smaller particles that can penetrate deeper into the lungs.

Many allergy sufferers may notice worsening allergies during storms.

The peak of spring wind and storm season tends to correspond to the timing of the release of tree pollen that blankets our world in yellow. The effects of climate change, including longer pollen seasons and more pollen released, and corresponding shifts in windy days and storm severity are helping to create the perfect pollen storm.

Christine Cairns Fortuin, Assistant Professor of Forestry, Mississippi State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.





Note: The following A.I. based commentary is not part of the original article, reproduced above, but is offered in the hopes that it will promote greater media literacy and critical thinking, by making any potential bias more visible to the reader –Staff Editor.

Political Bias Rating: Centrist

The content is a scientific and educational article focusing on the biology of pollen, its effects on allergies, and the influence of climate change on pollen production. It presents factual information supported by research studies and references, without taking a partisan stance. While it acknowledges climate change as a factor, the discussion remains grounded in scientific observation rather than political opinion, leading to a neutral, centrist tone.

