
Harvard expands its definition of antisemitism – when does criticism of Israel cross a line?

Published on theconversation.com – Joshua Shanes, Professor of Jewish Studies, College of Charleston – 2025-01-24 07:37:00

Harvard has adopted a broader definition of antisemitism.
Education Images/Universal Images Group via Getty Images

Joshua Shanes, College of Charleston

As part of its agreement resolving two federal lawsuits filed by Jewish students alleging antisemitic discrimination, Harvard University will adopt the International Holocaust Remembrance Alliance, or IHRA, “working definition” of antisemitism.

This definition is favored by many Jewish community leaders and politicians because its broad language can be applied to most anti-Israel rhetoric. Among its supporters is Kenneth Marcus, who served as assistant secretary of education during the first Trump administration and represented the students as chairman of the Louis D. Brandeis Center for Human Rights Under Law.

In contrast, many scholars prefer either the competing Jerusalem Declaration on Antisemitism or the definition offered by the Nexus Task Force, a committee of experts led by the Bard Center for the Study of Hate. I am a member of the Nexus group and also helped compose its 2024 “Campus Guide to Identifying Antisemitism.”

The controversy over this move indicates that many well-intentioned people still struggle to understand what exactly constitutes antisemitism and when anti-Israel rhetoric crosses the line.

As a scholar of modern Jewish history, I offer this primer to help answer that question.

History of antisemitism

There has been a sharp increase in antisemitism around the world since the Oct. 7, 2023, massacre by Hamas and Israel’s subsequent military attacks in the Gaza Strip.

Anti-Jewish animosity dates to antiquity. The early Christian church attacked Jews, whom it blamed for crucifying Christ, and claimed to replace them as God’s chosen people. The Gospel of John in the New Testament accused Jews of being Satan’s children, while others called them demons intent on sacrificing the souls of men.

Medieval Christians added other myths, such as the blood libel – the lie that Jews ritually murdered Christian children for their blood. Other myths accused them of poisoning wells or desecrating the consecrated host of the Eucharist to reenact the murder of Christ; some even claimed that Jews had inhuman biology such as horns or that they suckled at the teats of pigs.

Such lies led to violent persecution of Jews over many centuries.

Modern antisemitism

In the 19th century, these myths were supplemented by a new element: race – the claim that Jewishness was immutable and could not be changed via conversion. Though this idea first appeared in 15th-century Spain, it was deeply connected to the rise of modern nationalism.

Nineteenth-century ethno-nationalists rejected the idea of the nation as a political community bound together by a social contract. They began imagining the nation instead as a biological community linked by common descent, in which Jews might be tolerated but could never truly belong.

Finally, in 1879, the German journalist Wilhelm Marr popularized the term “antisemitism” to signal that his anti-Jewish ideology was based on race, not religion. Marr imagined Jews as a foreign, “Semitic” race, referring to the language group that includes Hebrew. The term has since persisted to mean specifically anti-Jewish hostility or prejudice.

The myth of a Jewish conspiracy

Modern antisemitism built on those premodern foundations, which never completely disappeared, but was fundamentally different. It emerged as part of the new politics of the democratic modern era.

Antisemitism became the core platform of new political parties, which used it to unite otherwise opposing groups, such as shopkeepers and farmers, anxious about the modernizing world. In other words, it was not merely prejudice; it was a worldview that explained the entire world to its believers by blaming all of its faults on this scapegoat.

Unlike earlier anti-Jewish hatred, this was less about religion and more about political and social issues. Antisemites believed the conspiracy theory that Jews all over the world controlled the levers of government, media and banking, and that defeating them would solve society’s problems.

Thus, one of the most important features of modern antisemitic mythology was the belief that Jews constituted a single, malevolent group, with one mind, organized for the purpose of conquering and destroying the world.

Negative traits attributed to Jews

Antisemitic books and cartoons often used claws or tentacles to symbolize the “international Jew,” a shadowy figure they blamed for leading a global conspiracy, strangling and destroying society. Others depicted him as a puppet master running the world.

In the late 19th century, Edmond Rothschild, head of the most famous Jewish banking family, was villainized as the symbol of international Jewish wealth and nefarious power. Today, the billionaire liberal philanthropist George Soros is often portrayed in similar ways.

This myth that Jews constitute an international creature plotting to harm the nation has inspired massacres of Jews since the 19th century, beginning with the Russian pogroms of 1881 and leading up to the Holocaust.

More recently, in 2018, Robert Bowers murdered 11 Jews at the Tree of Life synagogue in Pittsburgh because he was convinced that Jews, collectively under the guidance of Soros, were working to destroy America by facilitating the mass migration of nonwhite people into the country.

Modern antisemites ascribe many immutable negative traits to Jews, but two are particularly widespread. First, Jews are said to be ruthless misers who care more about their allegedly ill-gotten wealth than the interests of their countries. Second, Jews’ loyalty to their countries is considered suspect because they are said to constitute a foreign element.

Since Israel’s establishment in 1948, this hatred has focused on the accusation that Jews’ primary loyalty is to Israel, not the countries they live in.

Antisemitism and anti-Zionism

In recent years, the relationship between antisemitism and anti-Zionism has taken on renewed importance. Zionism has many factions but roughly refers to the modern political movement that argues Jews constitute a nation and have a right to self-determination in their ancestral homeland, the land of Israel.

Some activists claim that anti-Zionism – ideological opposition to Zionism – is inherently antisemitic because they equate it with denying Jews the right to self-determination and therefore equality.

Others feel that there needs to be a clearer separation between anti-Zionism and antisemitism. They argue that equating anti-Zionism with antisemitism leads to silencing criticism of Israel’s structural mistreatment of Palestinians.

Zionism in practice has meant the achievement of a flourishing safe haven for Jews, but it has also led to dislocation or inequality for millions of Palestinians, including refugees, West Bank Palestinians who still live under military rule, and even Palestinian citizens of Israel who face legal and social discrimination. Anti-Zionism opposes this, and critics argue that it should not be labeled antisemitic unless it taps into those antisemitic myths or otherwise calls for violence or inequality for Jews.

This debate is evident in these competing definitions of antisemitism. Remarkably, the three main definitions tend to agree on the nature of antisemitism except regarding the relationship of anti-Israel rhetoric to antisemitism. The IHRA definition, which is by design vague and open to interpretation, allows for a wider swath of anti-Israel activism to be labeled antisemitic than the others.

The Jerusalem Declaration, in contrast, understands rhetoric to have “crossed the line” only when it engages in antisemitic mythology, blames diaspora Jews for the actions of the Israeli state, or calls for the oppression of Jews in Israel. Defenders of the IHRA definition use it to label a call for binational democracy – meaning citizenship for West Bank Palestinians – as antisemitic. Likewise, they label boycotts, even of West Bank settlements that most of the world considers illegal, as antisemitic. The Jerusalem Declaration does not.

In other words, the key to identifying whether anti-Israel discourse masks antisemitism is to look for evidence of antisemitic mythology. For example, if Israel is described as leading an international conspiracy, or as holding the key to solving the world’s problems, all three definitions agree this is antisemitic.

Equally, if Jews or Jewish institutions are held responsible for Israeli actions or are expected to take a stand one way or another regarding them, again all three definitions agree that this crosses the line because it is based on the myth of a global Jewish conspiracy.

Identity and pride

Critically, for many Jews living in other countries, Zionism is not primarily a political argument about the state of Israel. It instead constitutes a sense of Jewish identity and pride, even a religious identity. In contrast, many protests against Israel and Zionism are focused not on ideology but on the Israeli government and its real or alleged actions.

This disconnect can lead to confusion. Protests that conflate Jews with Israel simply because those Jews are Zionists are antisemitic. On the other hand, Jews sometimes take protests against Israel in defense of Palestinian rights to be attacks on their Zionist identity, and thus antisemitic, when they are not. There are certainly gray areas, but in general, calls for Palestinian equality, I believe, are legitimate even when they upset people with Zionist identities.

Harvard’s statement captures this distinction. It notes that, “For many Jewish people, Zionism is a part of their Jewish identity,” and adds that Jews who subscribe to this identity must not be excluded from campus events on that basis.

This does not mean that Jews are protected from hearing contrary views, any more than they are protected from hearing Christian preachers on campus or professors who teach secular views of the Bible. It means that they cannot be excluded based only on those beliefs.

This does not, however, require an adoption of the IHRA definition of antisemitism, which goes much further. Many advocates of the IHRA definition use it to label political calls for Palestinian equality as antisemitic, as well as accusations against Israel that they consider wrong or unfair.

Harvard’s adoption of the IHRA definition, accordingly, would mean that any speech that calls for full equality for Palestinians risks academic and legal sanction, even without any material discrimination against Jewish students. It is thus opposed by students who advocate for Palestinian rights as well as supporters of free speech more generally.

Editor’s note: This is an updated version of an article first published on Jan. 29, 2024.

Joshua Shanes, Professor of Jewish Studies, College of Charleston

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Being alone has its benefits − a psychologist flips the script on the ‘loneliness epidemic’

Published on theconversation.com – Virginia Thomas, Assistant Professor of Psychology, Middlebury – 2025-04-04 07:18:00

Studies show that choosing ‘me time’ is not a recipe for loneliness but can boost your creativity and emotional well-being.
FotoDuets/iStock via Getty Images Plus

Virginia Thomas, Middlebury

Over the past few years, experts have been sounding the alarm over how much time Americans spend alone.

Statistics show that we’re choosing to be solitary for more of our waking hours than ever before, tucked away at home rather than mingling in public. Increasing numbers of us are dining alone and traveling solo, and rates of living alone have nearly doubled in the past 50 years.

These trends coincided with the surgeon general’s 2023 declaration of a loneliness epidemic, leading to recent claims that the U.S. is living in an “anti-social century.”

Loneliness and isolation are indeed social problems that warrant serious attention, especially since chronic states of loneliness are linked with poor outcomes such as depression and a shortened lifespan.

But there is another side to this story, one that deserves a closer look. For some people, the shift toward aloneness represents a desire for what researchers call “positive solitude,” a state that is associated with well-being, not loneliness.

As a psychologist, I’ve spent the past decade researching why people like to be alone – and spending a fair amount of time there myself – so I’m deeply familiar with the joys of solitude. My findings join a host of others that have documented a long list of benefits gained when we choose to spend time by ourselves, ranging from opportunities to recharge our batteries and experience personal growth to making time to connect with our emotions and our creativity.

Video: Being alone can help remind people who they are.

So it makes sense to me that people live alone as soon as their financial circumstances allow, and that when asked why they prefer to dine solo, they say simply, “I want more me time.”

It’s also why I’m not surprised that a 2024 national survey found that 56% of Americans considered alone time essential for their mental health. Or that Costco is now selling “solitude sheds” where for around US$2,000 you can buy yourself some peace and quiet.

It’s clear there is a desire, and a market, for solitude right now in American culture. But why does this side of the story often get lost amid the warnings about social isolation?

I suspect it has to do with a collective anxiety about being alone.

The stigma of solitude

This anxiety stems in large part from our culture’s deficit view of solitude. In this type of thinking, the desire to be alone is seen as unnatural and unhealthy, something to be pitied or feared rather than valued or encouraged.

This isn’t just my own observation. A study published in February 2025 found that U.S. news headlines are 10 times more likely to frame being alone negatively than positively. This type of bias shapes people’s beliefs, with studies showing that adults and children alike have clear judgments about when it is – and importantly when it is not – acceptable for their peers to be alone.

This makes sense given that American culture holds up extraversion as the ideal – indeed as the basis for what’s normal. The hallmarks of extraversion include being sociable and assertive, as well as expressing more positive emotions and seeking more stimulation than the opposite personality – the more reserved and risk-averse introverts. Even though not all Americans are extraverts, most of us have been conditioned to cultivate that trait, and those who do reap social and professional rewards. In this cultural milieu, preferring to be alone carries stigma.

But the desire for solitude is not pathological, and it’s not just for introverts. Nor does it automatically spell social isolation and a lonely life. In fact, the data doesn’t fully support current fears of a loneliness epidemic, something scholars and journalists have recently acknowledged.

In other words, although Americans are indeed spending more time alone than previous generations did, it’s not clear that we are actually getting lonelier. And despite our fears for the eldest members of our society, research shows that older adults are happier in solitude than the loneliness narrative would lead us to believe.

Video: It’s all a balancing act – along with solitude, you need to socialize.

Social media disrupts our solitude

However, solitude’s benefits don’t automatically appear whenever we take a break from the social world. They arrive when we are truly alone – when we intentionally carve out the time and space to connect with ourselves – not when we are alone on our devices.

My research has found that solitude’s positive effects on well-being are far less likely to materialize if the majority of our alone time is spent staring at our screens, especially when we’re passively scrolling social media.

This is where I believe the collective anxiety is well placed, especially the focus on young adults who are increasingly forgoing face-to-face social interaction in favor of a virtual life – and who may face significant distress as a result.

Social media is by definition social. It’s in the name. We cannot be truly alone when we’re on it. What’s more, it’s not the type of nourishing “me time” I suspect many people are longing for.

True solitude turns attention inward. It’s a time to slow down and reflect. A time to do as we please, not to please anyone else. A time to be emotionally available to ourselves, rather than to others. When we spend our solitude in these ways, the benefits accrue: We feel rested and rejuvenated, we gain clarity and emotional balance, we feel freer and more connected to ourselves.

But if we’re addicted to being busy, it can be hard to slow down. If we’re used to looking at a screen, it can be scary to look inside. And if we don’t have the skills to validate being alone as a normal and healthy human need, then we waste our alone time feeling guilty, weird or selfish.

The importance of reframing solitude

Americans choosing to spend more time alone is indeed a challenge to the cultural script, and the stigmatization of solitude can be difficult to change. Nevertheless, a small but growing body of research indicates that it is possible, and effective, to reframe the way we think about solitude.

For example, viewing solitude as a beneficial experience rather than a lonely one has been shown to help alleviate negative feelings about being alone, even among study participants who were severely lonely. People who perceive their time alone as “full” rather than “empty” are more likely to experience it as meaningful, using it for growth-oriented purposes such as self-reflection or spiritual connection.

Even something as simple as a linguistic shift – replacing “isolation” with “me time” – causes people to view their alone time more positively and likely affects how their friends and family view it as well.

It is true that if we don’t have a community of close relationships to return to after being alone, solitude can lead to social isolation. But it’s also true that too much social interaction is taxing, and such overload negatively affects the quality of our relationships. The country’s recent gravitational pull toward more alone time may partially reflect a desire for more balance in a life that is too busy, too scheduled and, yes, too social.

Just as connection with others is essential for our well-being, so is connection with ourselves.

Virginia Thomas, Assistant Professor of Psychology, Middlebury

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Hard work feels worth it, but only after it’s done – new research on how people value effort

Published on theconversation.com – Piotr Winkielman, Professor of Psychology, University of California, San Diego – 2025-04-04 07:16:00

How many stairs would you climb for that payoff?
Ozgur Donmaz/DigitalVision via Getty Images

Piotr Winkielman, University of California, San Diego and Przemysław Marcowski, University of California, San Diego

When deciding if something is worth the effort, whether you’ve already exerted yourself or face the prospect of work changes your calculus. That’s what we found in our new research, published in the Journal of Experimental Psychology: General.

When you consider a future effort, more work makes the outcome less appealing. But once you’ve completed the work, more effort makes the outcome seem more valuable. We also discovered that behind this general principle of timing lie individual differences in how future and past effort shape the value people place on the fruits of their labor.

What’s it worth to you?

In our experiment, we gave participants a choice between a fixed amount of money and a household item – a mug – that they could take home if they exerted some amount of physical effort, roughly equivalent to walking up one, two or three flights of stairs.

This setup allowed us to determine the value each person placed on the effort – did it add to or subtract from the value of the item? For instance, if putting in a little more effort made someone switch their decision and decide to go with the cash instead of the mug, we could tell that they valued the mug plus that amount of effort less than that sum of money.

We also manipulated the timing of the effort. When the effort lay in the future, participants decided whether to take the cash or to earn the mug by exerting some effort. When the effort lay in the past, participants decided whether to keep the mug they had already earned with effort or exchange it for the cash.

As we had expected, future effort generally detracted from the value of the mug, but the past effort generally increased it.

But these general trends do not tell the whole story. Not everyone responds to effort the same way. Our study also uncovered striking individual differences. Four distinct patterns emerged:

  1. For some people, extra effort always subtracted value.
  2. Others consistently preferred items with more work.
  3. Many showed mixed patterns, where moderate effort increased value but excessive effort decreased it.
  4. Some experienced the opposite: initially disliking effort, then finding greater value at higher levels.

These changing patterns show that one’s relationship with effort isn’t simple. For many people, there’s a sweet spot – a little effort might make something more valuable, but push too far and the value drops. It’s like enjoying a 30-minute workout but dreading a 2-hour session, or conversely, feeling that a 5-minute workout isn’t worth changing clothes for, but a 45-minute session feels satisfying.

Our paper offers a mathematical model that accounts for these individual differences by proposing that your mind flexibly computes costs and benefits of effort.

Why violate the ‘law of less work’?

Why should timing even matter for effort? It seems obvious that reason and nature would teach you to always avoid and dislike effort.

A hummingbird that puts in lots of extra work to get the same amount of nectar won’t last long.
Juan Carlos Vindas/Moment via Getty Images

A hummingbird that prefers a hard-to-get flower over an easy equal alternative might win an A for effort, but, exhausted, would not last long. The cruel world requires “resource rationality” – optimal, efficient use of limited physical and mental resources, balancing the benefits of actions with the required effort.

That insight is captured by the classic psychological “law of less work,” basically boiling down to the idea that given equivalent outcomes, individuals prefer easier options. Anything different would seem irrational or, in plain language, stupid.

If so, then how come people, and even animals, often prize things that require hard work for no additional payoff? Why is being hard-to-get a route to value? Anyone who has labored hard for anything knows that investing effort makes the final prize sweeter – whether in love, career, sports or Ikea furniture assembly.

Could the answer to this “paradox of effort” be that in the hummingbird example, the decision is about future effort, and in the Ikea effect, the effort is in the past?

Our new findings explain seemingly contradictory phenomena in everyday life. In health care, starting an exercise regimen feels overwhelming when focusing on upcoming workouts, but after establishing the habit, those same exercises become a source of accomplishment. At work, professionals might avoid learning difficult new skills, yet after mastering them, they value their enhanced abilities more because they were challenging to acquire.

John F. Kennedy supported space exploration efforts, ‘not because they are easy, but because they are hard.’
Robert Knudsen. White House Photographs. John F. Kennedy Presidential Library and Museum, Boston, CC BY

What still isn’t known

Sayings like “No pain, no gain” or “Easy come, easy go” populate our language and seem fundamental to our culture. But researchers still don’t fully understand why some people value effortful options more than others do. Is it physical aptitude, past experiences, a sense of meaning, perception of difficulty as importance or impossibility, moralization of effort, specific cultural beliefs about hard work? We don’t know yet.

We’re now studying how effort shapes different aspects of value: monetary value; hedonic value, as in the pleasure one gets from an item; and the aesthetic value, as in the sense of beauty and artistry. For instance, we’re investigating how people value artful calligraphy after exerting different amounts of effort to view it.

This work may shed light on curious cultural phenomena, like how people value their experience seeing the Mona Lisa after waiting for hours in crowds at the Louvre. These studies could also help researchers design better motivation systems across education, health care and business.

Piotr Winkielman, Professor of Psychology, University of California, San Diego and Przemysław Marcowski, Postdoctoral Researcher in Decision Science, University of California, San Diego

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Susan Monarez, Trump’s nominee for CDC director, faces an unprecedented and tumultuous era at the agency

Published on theconversation.com – Jordan Miller, Teaching Professor of Public Health, Arizona State University – 2025-04-03 07:21:00


The Trump administration laid off thousands more employees at the CDC on April 1, 2025, as part of its workforce reduction.
Anadolu/Getty Images

Jordan Miller, Arizona State University

The job of director of the Centers for Disease Control and Prevention carries immense responsibility for shaping health policies, responding to crises and maintaining trust in public health institutions.

Since the Trump administration took office in January 2025, the position has been held on an interim basis by Susan Monarez, whom Trump has now nominated to take the job permanently after his first nominee, former Florida Congressman David Weldon, was withdrawn, in part over his anti-vaccine views.

Monarez, in contrast, is a respected scientist who endorses vaccines and has robust research experience. While she is new to the CDC, she is an accomplished public servant, having worked in several other agencies over the course of her career.

Monarez’s nomination comes at a time when the Department of Health and Human Services is in the midst of mass layoffs, and health professionals – and many in the public – have lost confidence in the federal government’s commitment to supporting evidence-based public health and medicine.

After having already cut nearly 10% of the CDC’s employees earlier in the year, the White House laid off thousands more HHS employees on April 1, gutting the CDC’s workforce by more than 24% in total.

As a teaching professor and public health educator, I appreciate the importance of evidence-based public health practice and the CDC director’s role in advancing public health science, disease surveillance and response, and a host of other functions that are essential to public health.

The CDC is essential to promoting and protecting health in the U.S. and abroad, and the next director will shape its course in a challenging era.

A critical time for public health

In addition to the massive overhaul of the country’s public health infrastructure, the U.S. also faces a multistate measles outbreak and growing concerns over avian flu. Cuts to both the workforce and federal programs are hobbling measles outbreak response efforts and threatening the country’s ability to mitigate avian flu.

The Trump administration has also brought in several individuals who have long held anti-science views.

Robert F. Kennedy Jr.’s appointment as head of the Department of Health and Human Services was widely condemned by health experts, given his lack of credentials and his history of spreading health misinformation.

So the stakes are high for the CDC director, who will report directly to Kennedy.

Two CDC workers – one who has been at the agency for 25 years and the other for 10 – protest mass layoffs on April 1, 2025.
AP Photo/Ben Gray

An abrupt pivot

Prior to his inauguration, Trump had signaled he would nominate Weldon, a physician who has promoted anti-vaccine theories.

But in March, Trump withdrew Weldon’s nomination less than an hour before his confirmation hearing was set to begin, after several Republicans in Congress relayed that they would not support his appointment.

Instead, Trump tapped Monarez for the top spot.

The role of a CDC director

The CDC relies on its director to provide scientific leadership, shape policy responses and guide the agency’s extensive workforce in addressing emerging health threats.

Prior to January 2025, the CDC director was appointed directly by the president. The position did not require Senate confirmation, unlike other top positions within the Department of Health and Human Services. The selection was primarily an executive decision, although it was often influenced by political, public health and scientific considerations. But as of Jan. 20, 2025, changes approved in the 2022 omnibus budget require Senate confirmation for incoming CDC directors.

In the past, the appointed individual was typically a highly respected figure in public health, epidemiology or infectious disease, with experience leading large organizations, shaping policy and responding to public health emergencies. Public health policy experts expect that requiring Senate confirmation will enhance the esteem associated with the position and lend weight to the person who ultimately steps into the role. Yet, some have expressed concern that the position could become increasingly politicized.

Who is Susan Monarez?

Monarez holds a Ph.D. in microbiology and immunology. She has been serving as acting director of the CDC since being appointed to the interim position by Donald Trump on Jan. 24.

Prior to stepping into this role, she had served since January 2023 as deputy director of the Advanced Research Projects Agency for Health, or ARPA-H, a newer initiative established in 2022 through a US$1 billion appropriation from Congress to advance biomedical research.

Monarez has robust research experience, as well as administrative and leadership bona fides within the federal government. In the past, she has explored artificial intelligence and machine learning for population health. Her research has examined the intersection of technology and health, as well as antimicrobial resistance, and she has led initiatives to expand access to behavioral and mental health care, reduce health disparities in maternal health, quell the opioid epidemic and improve biodefense and pandemic preparedness.

Monarez has not yet laid out her plans, but she will no doubt have a challenging role, balancing the interests of public health with political pressures.

Reactions to her nomination

Reactions to Monarez’s nomination among health professionals have been mostly positive. For instance, Georges Benjamin, executive director of the American Public Health Association, remarked that he appreciates that she is an active researcher who respects science.

But some have advocated for her to take a more active role in protecting public health from political attacks.

In her interim position, Monarez has not resisted Trump’s executive orders, even those that are widely seen by other health professionals as harmful to public health.

Since taking office, the current Trump administration has issued directives to remove important health-related data from government websites and has discouraged the use of certain terms in federally funded research.

Monarez has not pushed back on those directives, even though some of her own research includes terms that would now be flagged in the current system, such as “health equity,” and even though health leaders expressed their concerns in a letter sent to her in January.

One of the duties of Susan Monarez, the nominee to lead the CDC, is to communicate critical health information to the public.
NIH/HHS/Public domain

CDC staff have said that Monarez has not been visible as acting director. As of early April, she has not attended any all-hands meetings since joining the CDC in January, nor has she held the meeting of the advisory committee to the director that typically takes place every February. One agency higher-up described her as a “nonentity” in her role so far. Monarez has also reportedly been involved in decisions to drastically cut the CDC workforce.

While some have commented on the fact that she is the first nonphysician to head the agency in decades, that may actually be an advantage. The CDC’s primary functions are in scientific research and applying that research to improve public health. Doctoral scientists receive significantly more training in conducting research than medical doctors, whose training rightly prioritizes clinical practice, with many medical schools providing no training in research at all. Monarez’s qualifications are well-aligned with the requirements of the director role.

A time of change

The CDC was founded at a time of great change, in the aftermath of World War II.

Now, in 2025, the U.S. is again at a time of change, with the advent of powerful technologies that will affect public health in still unforeseeable ways. New and reemerging infectious diseases, like measles, COVID-19 and Ebola, are sparking outbreaks that can spread quickly in population-dense cities.

A shifting health information ecosystem can spread health misinformation and disinformation rapidly. Political ideologies increasingly devalue health and science.

All these factors pose real threats to health in the U.S. and globally.

The next CDC director will undoubtedly play a key role in how these changes play out, both at home and abroad.

This story is part of a series of profiles of Cabinet and high-level administration positions.

Jordan Miller, Teaching Professor of Public Health, Arizona State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
