theconversation.com – Jonathan Losos, William H. Danforth Distinguished University Professor, Arts & Sciences at Washington University in St. Louis – 2024-08-05 08:44:38
Uncommon Courses is an occasional series from The Conversation U.S. highlighting unconventional approaches to teaching.
Title of course:
“The Science of Cats”
What prompted the idea for the course?
I’m an evolutionary biologist who has spent my career studying the evolution of small lizards in the Caribbean. I’m also a lifelong cat lover, but it never occurred to me to do anything scientific with house cats. They’re hard to study – ever tried to follow your cat around to see what they’re doing? And in contrast to amply studied lions, tigers and other wild felines, I was under the impression that there wasn’t any interesting research being conducted on the domestic representative of the cat clan, Felis catus.
Twelve years ago, I learned that I was completely wrong. Thanks to John Bradshaw’s book “Cat Sense” and the BBC’s “The Secret Life of the Cat,” I discovered that ailurologists were using the same cutting-edge methods – GPS tracking, genome sequencing, isotopic analysis – to study domestic cats that I use to study lizards and other researchers use with all manner of other creatures.
Thus was born my class on the science of cats. I’d lure students in with their love of felines and then, when they weren’t looking, I’d teach them how scientists study biodiversity – ecology, evolution, genetics and behavior.
What does the course explore?
In essence, the course is about the past, present and future of cats: where they came from, why they do what they do, what the future may hold. And, critically, how we know what we know – that is, how scientists address these sorts of questions.
The course concludes with students writing an original paper or making a mini-documentary. These projects have spanned a vast range of topics in biology and beyond, such as the impact of cats on bird populations, sexism and the crazy cat lady trope, the health effects pro and con of living with felines, the role of hybridization as a creative or constraining force in evolution, the top-down role of larger predators like coyotes and dingoes in controlling cat numbers, and the prospects for new genetic technologies to create allergen-free cats or to curb free-roaming cat populations.
Society needs more biodiversity scientists to understand our rapidly changing world. Cats pose scientific questions of broad interest, and they may serve as a gateway introduction to the world of biological research.
What’s a critical lesson from the course?
Important research on the natural world does not require traveling to remote corners of the world. Research on common animals in local surroundings – even household pets – can make important advances in basic and applied knowledge.
What materials does the course feature?
In addition to reading research papers, we took field trips that were both eye-opening and fun. We went out at the crack of dawn to join a homeless-cat advocate feeding unowned felines in a rundown part of town. We also learned about cats in ancient times from an Egyptologist, traveled to a cat show to marvel at the diversity of cat breeds, observed wild felines at the Saint Louis Zoo and examined cats in art at university museums.
What will the course prepare students to do?
Cat research is the vehicle for students to see the applicability of scientific ideas to animals they know and care for greatly. The course not only requires students to synthesize knowledge from many different fields, but also gets them to think about real-world contemporary debates, such as what to do about outdoor cats and the ethics of breeding.
theconversation.com – Virginia Thomas, Assistant Professor of Psychology, Middlebury – 2025-04-04 07:18:00
Studies show that choosing ‘me time’ is not a recipe for loneliness but can boost your creativity and emotional well-being.
Loneliness and isolation are indeed social problems that warrant serious attention, especially since chronic states of loneliness are linked with poor outcomes such as depression and a shortened lifespan.
But there is another side to this story, one that deserves a closer look. For some people, the shift toward aloneness represents a desire for what researchers call “positive solitude,” a state that is associated with well-being, not loneliness.
As a psychologist, I’ve spent the past decade researching why people like to be alone – and spending a fair amount of time there myself – so I’m deeply familiar with the joys of solitude. My findings join a host of others that have documented a long list of benefits gained when we choose to spend time by ourselves, ranging from opportunities to recharge our batteries and experience personal growth to making time to connect with our emotions and our creativity.
It’s also why I’m not surprised that a 2024 national survey found that 56% of Americans considered alone time essential for their mental health. Or that Costco is now selling “solitude sheds” where for around US$2,000 you can buy yourself some peace and quiet.
It’s clear there is a desire, and a market, for solitude right now in American culture. But why does this side of the story often get lost amid the warnings about social isolation?
I suspect it has to do with a collective anxiety about being alone.
The stigma of solitude
This anxiety stems in large part from our culture’s deficit view of solitude. In this type of thinking, the desire to be alone is seen as unnatural and unhealthy, something to be pitied or feared rather than valued or encouraged.
This isn’t just my own observation. A study published in February 2025 found that U.S. news headlines are 10 times more likely to frame being alone negatively than positively. This type of bias shapes people’s beliefs, with studies showing that adults and children alike have clear judgments about when it is – and importantly when it is not – acceptable for their peers to be alone.
This makes sense given that American culture holds up extraversion as the ideal – indeed as the basis for what’s normal. The hallmarks of extraversion include being sociable and assertive, as well as expressing more positive emotions and seeking more stimulation than the opposite personality – the more reserved and risk-averse introverts. Even though not all Americans are extraverts, most of us have been conditioned to cultivate that trait, and those who do reap social and professional rewards. In this cultural milieu, preferring to be alone carries stigma.
But the desire for solitude is not pathological, and it’s not just for introverts. Nor does it automatically spell social isolation and a lonely life. In fact, the data doesn’t fully support current fears of a loneliness epidemic, something scholars and journalists have recently acknowledged.
In other words, although Americans are indeed spending more time alone than previous generations did, it’s not clear that we are actually getting lonelier. And despite our fears for the eldest members of our society, research shows that older adults are happier in solitude than the loneliness narrative would lead us to believe.
It’s all a balancing act – along with solitude, you need to socialize.
Social media disrupts our solitude
However, solitude’s benefits don’t automatically appear whenever we take a break from the social world. They arrive when we are truly alone – when we intentionally carve out the time and space to connect with ourselves – not when we are alone on our devices.
My research has found that solitude’s positive effects on well-being are far less likely to materialize if the majority of our alone time is spent staring at our screens, especially when we’re passively scrolling social media.
This is where I believe the collective anxiety is well placed, especially the focus on young adults who are increasingly forgoing face-to-face social interaction in favor of a virtual life – and who may face significant distress as a result.
Social media is by definition social. It’s in the name. We cannot be truly alone when we’re on it. What’s more, it’s not the type of nourishing “me time” I suspect many people are longing for.
True solitude turns attention inward. It’s a time to slow down and reflect. A time to do as we please, not to please anyone else. A time to be emotionally available to ourselves, rather than to others. When we spend our solitude in these ways, the benefits accrue: We feel rested and rejuvenated, we gain clarity and emotional balance, we feel freer and more connected to ourselves.
But if we’re addicted to being busy, it can be hard to slow down. If we’re used to looking at a screen, it can be scary to look inside. And if we don’t have the skills to validate being alone as a normal and healthy human need, then we waste our alone time feeling guilty, weird or selfish.
The importance of reframing solitude
Americans choosing to spend more time alone is indeed a challenge to the cultural script, and the stigmatization of solitude can be difficult to change. Nevertheless, a small but growing body of research indicates that it is possible, and effective, to reframe the way we think about solitude.
For example, viewing solitude as a beneficial experience rather than a lonely one has been shown to help alleviate negative feelings about being alone, even among people who were severely lonely. People who perceive their time alone as “full” rather than “empty” are more likely to experience their alone time as meaningful, using it for growth-oriented purposes such as self-reflection or spiritual connection.
Even something as simple as a linguistic shift – replacing “isolation” with “me time” – causes people to view their alone time more positively and likely affects how their friends and family view it as well.
It is true that if we don’t have a community of close relationships to return to after being alone, solitude can lead to social isolation. But it’s also true that too much social interaction is taxing, and such overload negatively affects the quality of our relationships. The country’s recent gravitational pull toward more alone time may partially reflect a desire for more balance in a life that is too busy, too scheduled and, yes, too social.
Just as connection with others is essential for our well-being, so is connection with ourselves.
When deciding if something is worth the effort, whether you’ve already exerted yourself or face the prospect of work changes your calculus. That’s what we found in our new research, published in the Journal of Experimental Psychology: General.
When you consider a future effort, more work makes the outcome less appealing. But once you’ve completed the work, more effort makes the outcome seem more valuable. We also discovered that behind this general principle of timing lie individual differences in how future and past effort shape the value people place on the fruits of their labor.
What’s it worth to you?
In our experiment, we gave participants a choice between a fixed amount of money and a household item – a mug – that they could take home if they exerted some amount of physical effort, roughly equivalent to walking up one, two or three flights of stairs.
This setup allowed us to determine the value each person placed on the effort – did it add to or subtract from the value of the item? For instance, if putting in a little more effort made someone switch their decision and decide to go with the cash instead of the mug, we could tell that they valued the mug plus that amount of effort less than that sum of money.
We also manipulated the time aspect of effort. When the effort was in the future, participants decided whether they wanted to go with the cash or get the mug with some effort. When the effort was in the past, participants decided whether they wanted to cash in the mug they had already earned with effort.
As we had expected, future effort generally detracted from the value of the mug, while past effort generally increased it.
But these general trends do not tell the whole story. Not everyone responds to effort the same way. Our study also uncovered striking individual differences. Four distinct patterns emerged:
For some people, extra effort always subtracted value.
Others consistently preferred items with more work.
Many showed mixed patterns, where moderate effort increased value but excessive effort decreased it.
Some experienced the opposite: initially disliking effort, then finding greater value at higher levels.
These changing patterns show that one’s relationship with effort isn’t simple. For many people, there’s a sweet spot – a little effort might make something more valuable, but push too far and the value drops. It’s like enjoying a 30-minute workout but dreading a 2-hour session, or conversely, feeling that a 5-minute workout isn’t worth changing clothes for, but a 45-minute session feels satisfying.
Our paper offers a mathematical model that accounts for these individual differences by proposing that your mind flexibly computes costs and benefits of effort.
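The model itself isn’t spelled out here, but as a rough, purely illustrative sketch – the functional form, parameter names and numbers below are assumptions for illustration, not the equations from the published paper – one can imagine a value function with an effort term whose sign flips depending on whether the work lies ahead or behind, and whose curvature allows the “sweet spot” patterns described above:

```python
# Illustrative sketch only: a hypothetical toy model, not the authors' published equations.

def subjective_value(item_value, effort, timing, a=1.0, b=0.3):
    """Toy model of how effort might shape an item's subjective value.

    item_value -- baseline worth of the item (e.g., the mug), in dollars
    effort     -- amount of physical effort, e.g., flights of stairs (0-3)
    timing     -- "future" (work still to be done) or "past" (work already done)
    a, b       -- hypothetical individual-difference parameters: a scales how fast
                  effort adds value, b how fast it starts to cost
    """
    # Inverted-U effort term: a little effort can add value, too much subtracts it.
    effort_term = a * effort - b * effort ** 2
    # Prospective work discounts the item; completed work enhances it.
    sign = -1.0 if timing == "future" else 1.0
    return item_value + sign * effort_term

# A $4 mug earned with two "flights" of effort:
print(subjective_value(4.0, effort=2, timing="future"))  # 3.2 -- looks worse before the work
print(subjective_value(4.0, effort=2, timing="past"))    # 4.8 -- looks better after the work
```

In a sketch like this, varying the hypothetical parameters a and b changes whether extra effort always adds value, always subtracts it, or helps up to a point and then hurts – one way a single model could accommodate the four patterns listed above.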
Why violate the ‘law of less work’?
Why should timing even matter for effort? It seems obvious that reason and nature would teach you to always avoid and dislike effort.
A hummingbird that prefers a hard-to-get flower over an easy equal alternative might win an A for effort, but, exhausted, would not last long. The cruel world requires “resource rationality” – optimal, efficient use of limited physical and mental resources, balancing the benefits of actions with the required effort.
That insight is captured by the classic psychological “law of less work,” basically boiling down to the idea that given equivalent outcomes, individuals prefer easier options. Anything different would seem irrational or, in plain language, stupid.
If so, then how come people, and even animals, often prize things that require hard work for no additional payoff? Why is being hard-to-get a route to value? Anyone who has labored hard for anything knows that investing effort makes the final prize sweeter – whether in love, career, sports or Ikea furniture assembly.
Could the answer to this “paradox of effort” be that in the hummingbird example, the decision is about future effort, and in the Ikea effect, the effort is in the past?
Our new findings explain seemingly contradictory phenomena in everyday life. In health care, starting an exercise regimen feels overwhelming when focusing on upcoming workouts, but after establishing the habit, those same exercises become a source of accomplishment. At work, professionals might avoid learning difficult new skills, yet after mastering them, they value their enhanced abilities more because they were challenging to acquire.
We’re now studying how effort shapes different aspects of value: monetary value; hedonic value, as in the pleasure one gets from an item; and the aesthetic value, as in the sense of beauty and artistry. For instance, we’re investigating how people value artful calligraphy after exerting different amounts of effort to view it.
This work may shed light on curious cultural phenomena, like how people value their experience seeing the Mona Lisa after waiting for hours in crowds at the Louvre. These studies could also help researchers design better motivation systems across education, health care and business.
The job of director of the Centers for Disease Control and Prevention carries immense responsibility for shaping health policies, responding to crises and maintaining trust in public health institutions.
As a teaching professor and public health educator, I appreciate the importance of evidence-based public health practice and the CDC director’s role in advancing public health science, disease surveillance and response and a host of other functions that are essential to public health.
But in March 2025, President Donald Trump withdrew the nomination of his first pick for the post, Dave Weldon, less than an hour before Weldon’s confirmation hearing was set to begin, after several Republicans in Congress signaled that they would not support his appointment.
The CDC relies on its director to provide scientific leadership, shape policy responses and guide the agency’s extensive workforce in addressing emerging health threats.
Prior to January, the CDC director was appointed directly by the president. The position did not require Senate confirmation, unlike other agency head positions within the Department of Health and Human Services. The selection was primarily an executive decision, although it was often influenced by political, public health and scientific considerations. But as of Jan. 20, 2025, changes approved in the 2022 omnibus budget require Senate confirmation for incoming CDC directors.
In the past, the appointed individual was typically a highly respected figure in public health, epidemiology or infectious disease, with experience leading large organizations, shaping policy and responding to public health emergencies. Public health policy experts expect that requiring Senate confirmation will enhance the esteem associated with the position and lend weight to the person who ultimately steps into the role. Yet, some have expressed concern that the position could become increasingly politicized.
Prior to stepping into this role, Monarez had been serving since January 2023 as deputy director of the Advanced Research Projects Agency for Health, or ARPA-H, a newer agency established in 2022 through a US$1 billion appropriation from Congress to advance biomedical research.
Monarez has not yet laid out her plans, but she will no doubt have a challenging role, balancing the interests of public health with political pressures.
Reactions to her nomination
Reactions to Monarez’s nomination among health professionals have been mostly positive. For instance, Georges Benjamin, executive director of the American Public Health Association, remarked that he appreciates that she is an active researcher who respects science.
While some have commented on the fact that she is the first nonphysician to head the agency in decades, that may actually be an advantage. The CDC’s primary functions are in scientific research and applying that research to improve public health. Doctoral scientists receive significantly more training in conducting research than medical doctors, whose training rightly prioritizes clinical practice, with many medical schools providing no training in research at all. Monarez’s qualifications are well-aligned with the requirements of the director role.
Now, in 2025, the U.S. is again at a time of change, with the advent of powerful technologies that will affect public health in still unforeseeable ways. New and reemerging infectious diseases, like measles, COVID-19 and Ebola, are sparking outbreaks that can spread quickly in population-dense cities.