
The Conversation

Should AI be permitted in college classrooms? 4 scholars weigh in


Does AI enhance or cripple a person’s analytical skills?
Yevhen Lahunov/iStock via Getty Images Plus

Nicholas Tampio, Fordham University; Asim Ali, Auburn University; Patricia A. Young, University of Maryland, Baltimore County, and Shital Thekdi, University of Richmond

One of the most intense discussions taking place among faculty is whether to permit students to use artificial intelligence in the classroom. To gain perspective on the matter, The Conversation reached out to four scholars for their take on AI as a learning tool and the reasons why they will or won’t be making it a part of their classes.

Nicholas Tampio, professor of political science: Learn to think for yourself

As a professor, I believe the purpose of a college class is to teach students to think: to read scholarship, ask questions, formulate a thesis, collect and analyze data, draft an essay, take feedback from the instructor and other students, and write a final draft.

One problem with ChatGPT is that it allows students to produce a decent paper without thinking or writing for themselves.

In my American political thought class, I assign speeches by Martin Luther King Jr. and Malcolm X and ask students to compose an essay on what King and X might say about a current American political debate, such as the Supreme Court’s recent decision on affirmative action.

Students could get fine grades if they used ChatGPT to “write” their papers. But they will have missed a chance to enter a dialogue with two profound thinkers about a topic that could reshape American higher education and society.

The point of learning to write is not simply intellectual self-discovery. My students go on to careers in journalism, science, academia and business. Their employers often ask them to research and write about a topic.

Few employers are likely to hire someone simply to run prompts through large language models, which rely on algorithms that scrape databases filled with errors and biases. Already, a lawyer has gotten in trouble for using ChatGPT to craft a motion filled with fabricated cases. Employees succeed when they can research a topic and write intelligently about it.

Artificial intelligence is a tool that defeats a core purpose of a college education – to learn how to think, and write, for oneself.

Patricia A. Young, professor of education: ChatGPT doesn’t promote advanced thinking

College students who are operating from a convenience or entitlement mentality – one in which they think, “I am entitled to use whatever technology is available to me” – will naturally gravitate toward using ChatGPT with or without their professor’s permission. Using ChatGPT and submitting a course assignment as your own creation is called AI-assisted plagiarism.

Patricia A. Young, University of Maryland, Baltimore County

Some professors allow the use of ChatGPT as long as students cite ChatGPT as the source. As a researcher who specializes in the use of technology in education, I believe this practice needs to be thought through. Does this mean that ChatGPT would need to cite its sources, so that students could cite ChatGPT as a type of secondary source according to APA style, a standard academic style of citing papers? What Pandora’s box are we opening? Some users report that ChatGPT never reveals its sources anyway.

The proliferation of free AI means students won’t have to think much while writing – just engage in a high level of copy and paste. We used to call that plagiarism. AI-assisted plagiarism ushers in the potential for a new era of academic misconduct.

The concern will come when students take higher-level courses or a job and lack the literacy skills to perform on an exceptional level. We will have created a generation of functionally illiterate adults who lack the capacity to engage in advanced thinking – like critiquing, comparing or contrasting information.

Yes, students can and should use smart tools, but we need to hypothesize and measure the costs to human ingenuity and the future of the human race.

Asim Ali, instructor of information systems management: AI is another teacher

I teach information management, and in the spring of 2023, I had students use ChatGPT for an essay assignment and then record a podcast discussing how AI will impact their careers. This semester I am being more intentional by providing guidance on the possibilities and limitations of AI tools for each assignment. For example, students learn that using generative AI on a self-reflection assignment may not help, but using AI to analyze a case study is potentially a great way to find insights they may have overlooked. This emulates their future jobs in which they may use AI tools to enhance the quality of their work product.

Asim Ali, Auburn University

My experience with adapting to AI for my own course inspired me to create a resource for all my colleagues. As executive director of the Biggio Center for the Enhancement of Teaching and Learning, I oversee the instructional design and educational teams at Auburn University. We created a self-paced, online course called Teaching With AI.

Now there are over 600 faculty at Auburn and hundreds of faculty at almost 35 institutions engaging with the content and each other through discussion boards and practical exercises.

I receive messages from faculty sharing ways they are changing their assignments or discussing AI with their students. Some see AI as a threat to humans, but discussing AI with my students and with colleagues across the country has actually helped me develop human connections.

Shital Thekdi, associate professor of analytics & operations: What can you do that AI can’t?

This semester, I will ask students in my Statistics for Business and Economics course to discuss the question, “What is your value beyond the AI tools?” I want them to reframe the conversation: not as a question of academic integrity, but as a challenge. I believe students must recognize that the jobs they imagine will exist for them could be eliminated because of these new technologies. So the pressure is on students to understand not only how to use these tools but also how to be better than the tools.

Shital Thekdi, University of Richmond

I hope my students will consider ethical reasoning and the role of human connections. While AI can be trained to make value-based decisions, individuals and groups have their own values that can differ considerably from those used by AI.
And AI tools do not have the capacity to form human connections and experiences.

Students will remain vital contributors to business and society as AI tools develop. I believe it’s our responsibility as educators to prepare our students for a rapidly evolving cultural and technological landscape.

Nicholas Tampio, Professor of Political Science, Fordham University; Asim Ali, Instructor of Information Systems Management, Auburn University; Patricia A. Young, Professor of Education, University of Maryland, Baltimore County, and Shital Thekdi, Associate Professor of Analytics and Operations, University of Richmond

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Comparing the Trump and Harris records on technology regulation

theconversation.com – Anjana Susarla, Professor of Information Systems, Michigan State University – 2024-10-18 07:22:00

The Federal Trade Commission is one of the main venues for regulation of big tech and its wares.

Alpha Photo/Flickr, CC BY-NC

Anjana Susarla, Michigan State University

It’s not surprising that technology regulation is an important issue in the 2024 U.S. presidential campaign.

The past decade has seen advanced technologies, from social algorithms to large language model artificial intelligence systems, profoundly affect society. These changes, which spanned the Trump and Biden-Harris administrations, spurred calls for the federal government to regulate the technologies and the powerful corporations that wield them.

As a researcher of information systems and AI, I examined both candidates’ records on technology regulation. Here are the important differences.

Algorithmic harms

With artificial intelligence now widespread, governments worldwide are grappling with how to regulate various aspects of the technology. The candidates offer different visions for U.S. AI policy. One area where there is a stark difference is in recognizing and addressing algorithmic harms from the widespread use of AI technology.

AI affects your life in ways that might escape your notice. Biases in algorithms used for lending and hiring decisions could end up reinforcing a vicious cycle of discrimination. For example, a student who can’t get a loan for college would then be less likely to get the education needed to pull herself out of poverty.

At the AI Safety Summit in the U.K. in November 2023, Harris spoke of the promise of AI but also the perils from algorithmic bias, deepfakes and wrongful arrests. Biden signed an executive order on AI on Oct. 30, 2023, that recognized AI systems can pose unacceptable risks of harm to civil and human rights and individual well-being. In parallel, federal agencies such as the Federal Trade Commission have carried out enforcement actions to guard against algorithmic harms.

President Joe Biden signs an executive order addressing the risks of artificial intelligence on Oct. 30, 2023, with Vice President Kamala Harris at his side. AP Photo/Evan Vucci

By contrast, the Trump administration did not take a public stance on mitigation of algorithmic harms. Trump has said he wants to repeal President Biden’s AI executive order. In recent interviews, however, Trump noted the dangers from technologies such as deepfakes and challenges posed to security from AI systems, suggesting a willingness to engage with the growing risks from AI.

Technological standards

The Trump administration signed the American AI Initiative executive order on Feb. 11, 2019. The order pledged to double AI research investment and established the first set of national AI research institutes. The order also included a plan for AI technical standards and established guidance for the federal government’s use of AI. Trump also signed an executive order on Dec. 3, 2020, promoting the use of trustworthy AI in the federal government.

The Biden-Harris administration has tried to go further. Harris convened the heads of Google, Microsoft and other tech companies at the White House on May 4, 2023, to undertake a set of voluntary commitments to safeguard individual rights. The Biden administration’s executive order contains an important initiative to probe the vulnerability of very large-scale, general-purpose AI models trained on massive amounts of data. The goal is to determine the risks hackers pose to these models, including the ones that power OpenAI’s popular ChatGPT and DALL-E.

Donald Trump departs from Washington, D.C., on Feb. 11, 2019, shortly after signing an executive order on artificial intelligence that included setting technical standards. Nicholas Kamm/AFP via Getty Images

Antitrust

Antitrust law enforcement – restricting or conditioning mergers and acquisitions – is another way the federal government regulates the technology industry.

The Trump administration’s antitrust dossier includes its attempt to block AT&T’s acquisition of Time Warner. A federal judge eventually allowed the merger after the Justice Department under the Trump administration sued to block the deal. The Trump administration also filed an antitrust case against Google focused on its dominance in internet search.

Biden signed an executive order on July 9, 2021, to enforce antitrust laws arising from the anticompetitive effects of dominant internet platforms. The order also targeted the acquisition of nascent competitors, the aggregation of data, unfair competition in attention markets and the surveillance of users. The Biden-Harris administration has filed antitrust cases against Apple and Google.

The Biden-Harris administration’s merger guidelines in 2023 outlined rules to determine when mergers can be considered anticompetitive. While both administrations filed antitrust cases, the Biden administration’s antitrust push appears stronger in terms of its impact in potentially reorganizing or even orchestrating a breakup of dominant companies such as Google.

Cryptocurrency

The candidates have different approaches to cryptocurrency regulation. Late in his administration, Trump tweeted in support of cryptocurrency regulation. Also late in Trump’s administration, the federal Financial Crimes Enforcement Network proposed regulations that would have required financial firms to collect the identity of any cryptocurrency wallet to which a user sent funds. The regulations were not enacted.

Trump has since shifted his position on cryptocurrencies. He has criticized existing U.S. laws and called for the United States to be a Bitcoin superpower. The Trump campaign is the first presidential campaign to accept payments in cryptocurrencies.

The Biden-Harris administration, by contrast, has laid out regulatory restrictions on cryptocurrencies with the Securities and Exchange Commission, which brought about a series of enforcement actions. The White House vetoed the Financial Innovation and Technology for the 21st Century Act that aimed to clarify accounting for cryptocurrencies, a bill favored by the cryptocurrency industry.

Data privacy

Biden’s AI executive order calls on Congress to adopt privacy legislation, but it does not provide a legislative framework for doing so. The Trump White House’s American AI Initiative executive order mentions privacy only in broad terms, calling for AI technologies to uphold “civil liberties, privacy, and American values.” The order did not mention how existing privacy protections would be enforced.

Across the U.S., several states have tried to pass legislation addressing aspects of data privacy. At present, there is a patchwork of statewide initiatives and a lack of comprehensive data privacy legislation at the federal level.

The paucity of federal data privacy protections is a stark reminder that while the candidates are addressing some of the challenges posed by developments in AI and technology more broadly, a lot still remains to be done to regulate technology in the public interest.

Overall, the Biden administration’s efforts at antitrust and technology regulation seem broadly aligned with the goal of reining in technology companies and protecting consumers. It’s also reimagining monopoly protections for the 21st century. This seems to be the chief difference between the two administrations.

Anjana Susarla, Professor of Information Systems, Michigan State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.



Some people love to scare themselves in an already scary world − here’s the psychology of why

theconversation.com – Sarah Kollat, Teaching Professor of Psychology, Penn State – 2024-10-18 07:53:00

A controlled scary experience can leave you exhilarated and relaxed afterward.

gremlin/E+ via Getty Images

Sarah Kollat, Penn State

Fall for me as a teenager meant football games, homecoming dresses – and haunted houses. My friends organized group trips to the local fairground, where barn sheds were turned into halls of horror, and masked actors nipped at our ankles with (chainless) chain saws as we waited in line, anticipating deeper frights to come once we were inside.

I’m not the only one who loves a good scare. Halloween attractions company America Haunts estimates Americans are spending upward of US$500 million annually on haunted house entrance fees simply for the privilege of being frightened. And lots of fright fans don’t limit their horror entertainment to spooky season, gorging on horror movies, shows and books all year long.

To some people, this preoccupation with horror can seem tone deaf. School shootings, child abuse, war – the list of real-life horrors is endless. Why seek manufactured fear for entertainment when the world offers real terror in such large quantities?

As a developmental psychologist who writes dark thrillers on the side, I find the intersection of psychology and fear intriguing. To explain what drives this fascination with fear, I point to the theory that emotions evolved as a universal experience in humans because they help us survive. Creating fear in otherwise safe lives can be enjoyable – and is a way for people to practice and prepare for real-life dangers.

Fear can feel good

Controlled fear experiences – where you can click your remote, close the book, or walk out of the haunted house whenever you want – offer the physiological high that fear triggers, without any real risk.

When you perceive yourself under threat, adrenaline surges in your body and the evolutionary fight-or-flight response is activated. Your heart rate increases, you breathe deeper and faster, and your blood pressure goes up. Your body is preparing to defend itself against the danger or get away as fast as possible.

This physical reaction is crucial when facing a real threat. When experiencing controlled fear – like jump scares in a zombie TV show – you get to enjoy this energized sensation, similar to a runner’s high, without any risks. And then, once the threat is dealt with, your body releases the neurotransmitter dopamine, which provides sensations of pleasure and relief.

In one study, researchers found that people who went through a high-intensity haunted house as a controlled fear experience displayed less brain activity in response to stimuli and less anxiety post-exposure. This finding suggests that exposing yourself to horror films, scary stories or suspenseful games can actually calm you afterward. The effect might also explain why my husband and I choose to relax by watching zombie shows after a busy day at work.

Going through something frightening together – like a haunted house attraction – can be a bonding experience. AP Photo/John Locher

The ties that bind

An essential motivation for human beings is the sense of belonging to a social group. According to the surgeon general, Americans who miss those connections are caught up in an epidemic of loneliness, which leaves people at risk for mental and physical health issues.

Going through intense fear experiences together strengthens the bonds between individuals. Good examples include veterans who served together in combat, survivors of natural disasters, and the “families” created in groups of first responders.

I’m a volunteer firefighter, and the unique connection created through sharing intense threats, such as entering a burning building together, manifests in deep emotional bonds with my colleagues. After a significant fire call, we often note the improved morale and camaraderie of the firehouse. I feel a flood of positive emotions anytime I think of my firefighting partners, even when the events occurred months or years ago.

Controlled fear experiences artificially create similar opportunities for bonding. Exposure to stress triggers not only the fight-or-flight response, but in many situations it also initiates what psychologists call the “tend-and-befriend” system. A perceived threat prompts humans to tend to offspring and create social-emotional bonds for protection and comfort. This system is largely regulated by the so-called “love hormone” oxytocin.

The tend-and-befriend reaction is particularly likely when you experience stress around others with whom you have already established positive social connections. When you encounter stressors within your social network, your oxytocin levels rise to initiate social coping strategies. As a result, when you navigate a recreational fear experience like a haunted house with friends, you are setting the emotional stage to feel bonded with the people beside you.

Sitting in the dark with friends while you watch a scary movie or navigating a haunted corn maze with a date is good for your health, in that it helps you strengthen those social connections.

Consuming lots of horror as entertainment may make some people more resilient in real life. Edwin Tan/E+ via Getty Images

An ounce of prevention = a pound of cure

Controlled fear experiences can also be a way for you to prepare for the worst. Think of the early days of the COVID-19 pandemic, when the films “Contagion” and “Outbreak” trended on streaming platforms as people around the world sheltered at home. By watching threat scenarios play out in controlled ways through fiction, you can learn about your fears and emotionally prepare for future threats.

For example, researchers at Aarhus University’s Recreational Fear Lab in Denmark demonstrated in one study that people who regularly consumed horror media were more psychologically resilient during the COVID-19 pandemic than nonhorror fans. The scientists suggest that this resilience might be the result of a kind of rehearsal these fans went through – they practiced coping with the fear and anxiety provoked by their preferred form of entertainment. As a result, they were better prepared to manage the real fear triggered by the pandemic.

When I’m not teaching, I’m an avid reader of crime fiction. I also write psychological thrillers under the pen name Sarah K. Stephens. As both a reader and writer, I notice similar themes in the books I am drawn to, all of which tie into my own deep-rooted fears: mothers who fail their children somehow, women manipulated into subservience, lots of misogynist antagonists.

I enjoy writing and reading about my fears – and seeing the bad guys get their just desserts in the end – because it offers a way for me to control the story. Consuming these narratives lets me mentally rehearse how I would handle these kinds of circumstances if any were to manifest in my real life.

Survive and thrive

In the case of controlled fear experiences, scaring yourself is a pivotal technique to help you survive and adapt in a frightening world. By eliciting powerful, positive emotions, strengthening social networks and preparing you for your worst fears, you’re better able to embrace each day to its fullest.

So the next time you’re choosing between an upbeat comedy and a creepy thriller for your movie night, pick the dark side – it’s good for your health.

Sarah Kollat, Teaching Professor of Psychology, Penn State

This article is republished from The Conversation under a Creative Commons license. Read the original article.



To make nuclear fusion a reliable energy source one day, scientists will first need to design heat- and radiation-resilient materials

theconversation.com – Sophie Blondel, Research Assistant Professor of Nuclear Engineering, University of Tennessee – 2024-10-18 07:22:00

A fusion experiment ran so hot that the wall materials facing the plasma retained defects.

Christophe Roux/CEA IRFM, CC BY

Sophie Blondel, University of Tennessee

Fusion energy has the potential to be an effective clean energy source, as its reactions generate incredibly large amounts of energy. Fusion reactors aim to reproduce on Earth what happens in the core of the Sun, where very light elements merge and release energy in the process. Engineers can harness this energy to heat water and generate electricity through a steam turbine, but the path to fusion isn’t completely straightforward.

Controlled nuclear fusion has several advantages over other power sources for generating electricity. For one, the fusion reaction itself doesn’t produce any carbon dioxide. There is no risk of meltdown, and the reaction doesn’t generate any long-lived radioactive waste.

I’m a nuclear engineer who studies materials that scientists could use in fusion reactors. Fusion takes place at incredibly high temperatures. So to one day make fusion a feasible energy source, reactors will need to be built with materials that can survive the heat and irradiation generated by fusion reactions.

Fusion material challenges

Several types of elements can merge during a fusion reaction. The one most scientists prefer is deuterium plus tritium. These two elements have the highest likelihood of fusing at temperatures that a reactor can maintain. This reaction generates a helium atom and a neutron, which carries most of the energy from the reaction.

Humans have successfully generated fusion reactions on Earth since 1952 – some even in their garage. But the trick now is to make it worth it. You need to get more energy out of the process than you put in to initiate the reaction.

Fusion reactions happen in a very hot plasma, which is a state of matter similar to gas but made of charged particles. The plasma needs to stay extremely hot – over 100 million degrees Celsius – and condensed for the duration of the reaction.

To keep the plasma hot and condensed and create a reaction that can keep going, you need special materials making up the reactor walls. You also need a cheap and reliable source of fuel.

While deuterium is very common and obtained from water, tritium is very rare. A 1-gigawatt fusion reactor is expected to burn 56 kilograms of tritium annually. But the world has only about 25 kilograms of tritium commercially available.
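The 56-kilogram figure can be sanity-checked with back-of-the-envelope arithmetic, assuming each deuterium-tritium reaction releases about 17.6 MeV and consumes exactly one tritium nucleus, and that the reactor runs continuously at 1 gigawatt of fusion power:

```python
# Rough check of the ~56 kg/year tritium consumption figure.
MEV_TO_J = 1.602e-13               # joules per MeV
E_PER_REACTION = 17.6 * MEV_TO_J   # ~2.82e-12 J released per D-T fusion
SECONDS_PER_YEAR = 3.156e7
AVOGADRO = 6.022e23
TRITIUM_MOLAR_MASS = 3.016         # grams per mole

power_w = 1e9                      # 1 GW of continuous fusion power
reactions_per_year = power_w * SECONDS_PER_YEAR / E_PER_REACTION
tritium_kg = reactions_per_year * TRITIUM_MOLAR_MASS / AVOGADRO / 1000

print(f"Tritium burned per year: {tritium_kg:.0f} kg")  # ≈ 56 kg
```

The result lands right on the article’s figure, which is reassuring for such a simple estimate.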

Researchers need to find alternative sources for tritium before fusion energy can get off the ground. One option is to have each reactor generating its own tritium through a system called the breeding blanket.

The breeding blanket makes up the first layer of the plasma chamber walls and contains lithium that reacts with the neutrons generated in the fusion reaction to produce tritium. The blanket also converts the energy carried by these neutrons to heat.

The fusion reaction chamber at ITER will electrify the plasma.

Fusion devices also need a divertor, which extracts the heat and ash produced in the reaction. The divertor helps keep the reactions going for longer.

These materials will be exposed to unprecedented levels of heat and particle bombardment. And there aren’t currently any experimental facilities to reproduce these conditions and test materials in a real-world scenario. So, the focus of my research is to bridge this gap using models and computer simulations.

From the atom to full device

My colleagues and I work on producing tools that can predict how the materials in a fusion reactor erode, and how their properties change when they are exposed to extreme heat and lots of particle radiation.

As they get irradiated, defects can form and grow in these materials, which affect how well they react to heat and stress. In the future, we hope that agencies and private companies can use these tools to design fusion power plants.

Our approach, called multiscale modeling, consists of looking at the physics in these materials over different time and length scales with a range of computational models.

We first study the phenomena in these materials at the atomic scale through accurate but expensive simulations. For instance, one simulation might examine how hydrogen moves within a material during irradiation.

From these simulations, we look at properties such as diffusivity, which tells us how much the hydrogen can spread throughout the material.
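As a minimal sketch of how a diffusivity falls out of trajectory data, the snippet below applies the 1D Einstein relation, D = ⟨x²⟩ / (2t), to trajectories from an unbiased random walk. The hop length and hop rate are invented placeholders; in a real workflow the trajectories would come from molecular dynamics, not a toy walk:

```python
import random

# Estimate a diffusivity from particle trajectories via the Einstein
# relation in 1D: D = <x^2> / (2 t). Hypothetical hop parameters.
random.seed(42)
HOP_LENGTH = 1.0e-10   # meters per hop (assumed, ~1 angstrom)
HOP_RATE = 1.0e12      # hops per second (assumed)
N_ATOMS = 2000
N_HOPS = 1000

msd = 0.0
for _ in range(N_ATOMS):
    x = 0.0
    for _ in range(N_HOPS):
        x += random.choice((-HOP_LENGTH, HOP_LENGTH))
    msd += x * x
msd /= N_ATOMS          # mean squared displacement over all trajectories

t = N_HOPS / HOP_RATE   # elapsed time per trajectory
D = msd / (2.0 * t)     # Einstein relation, 1D
print(f"Estimated diffusivity: {D:.2e} m^2/s")
```

For these parameters the theoretical value is HOP_LENGTH² × HOP_RATE / 2 = 5e-9 m²/s, and the sampled estimate converges toward it as more trajectories are averaged.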

We can integrate the information from these atomic level simulations into less expensive simulations, which look at how the materials react at a larger scale. These larger-scale simulations are less expensive because they model the materials as a continuum instead of considering every single atom.

The atomic-scale simulations could take weeks to run on a supercomputer, while the continuum one will take only a few hours.
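A continuum model of this kind can be sketched in a few lines: hydrogen concentration diffusing through a slab, solved with an explicit finite-difference scheme. The diffusivity here is a placeholder standing in for a value handed up from the atomic-scale simulations:

```python
# 1D diffusion of hydrogen through a slab, explicit finite differences.
D = 1.0e-9                 # m^2/s, assumed from lower-level simulations
L = 1.0e-3                 # slab thickness, 1 mm
NX = 51                    # grid points across the slab
DX = L / (NX - 1)
DT = 0.4 * DX * DX / D     # time step within the explicit stability limit

c = [0.0] * NX
c[0] = 1.0                 # fixed concentration on the plasma-facing side

for _ in range(20000):
    new = c[:]
    for i in range(1, NX - 1):
        new[i] = c[i] + D * DT / (DX * DX) * (c[i+1] - 2*c[i] + c[i-1])
    new[0], new[-1] = 1.0, 0.0   # boundary conditions on both faces
    c = new

# The steady state of this problem is a straight line from 1 to 0,
# so the midpoint concentration approaches 0.5.
print(f"Concentration at midpoint: {c[NX // 2]:.3f}")
```

This run simulates several diffusion time constants (L²/D ≈ 1,000 s of physical time), which is why it settles onto the linear steady-state profile.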

All this modeling work happening on computers is then compared with experimental results obtained in laboratories.

For example, if one side of the material has hydrogen gas, we want to know how much hydrogen leaks to the other side of the material. If the model and the experimental results match, we can have confidence in the model and use it to predict the behavior of the same material under the conditions we would expect in a fusion device.

If they don’t match, we go back to the atomic-scale simulations to investigate what we missed.

Additionally, we can couple the larger-scale material model to plasma models. These models can tell us which parts of a fusion reactor will be the hottest or have the most particle bombardment. From there, we can evaluate more scenarios.

For instance, if too much hydrogen leaks through the material during the operation of the fusion reactor, we could recommend making the material thicker in certain places, or adding something to trap the hydrogen.

Designing new materials

As the quest for commercial fusion energy continues, scientists will need to engineer more resilient materials. The field of possibilities is daunting – engineers can manufacture multiple elements together in many ways.

You could combine two elements to create a new material, but how do you know what the right proportion is of each element? And what if you want to try mixing five or more elements together? It would take way too long to try to run our simulations for all of these possibilities.

Thankfully, artificial intelligence is here to assist. By combining experimental and simulation results, analytical AI can recommend combinations that are most likely to have the properties we’re looking for, such as heat and stress resistance.
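A toy version of that screening loop: fit a surrogate model to a handful of (composition, measured property) pairs, then rank untested compositions by the predicted property. The compositions, the 1-nearest-neighbor surrogate, and the heat-resistance scores below are all invented for illustration; a real campaign would use richer features and a stronger model:

```python
def predict(x, train):
    # 1-nearest-neighbor surrogate: return the measured property of the
    # tested composition closest (in squared distance) to candidate x.
    best = min(train, key=lambda rec: sum((a - b) ** 2
                                          for a, b in zip(rec[0], x)))
    return best[1]

# Hypothetical data: (fraction_W, fraction_Cr) -> heat-resistance score.
tested = [((0.9, 0.1), 0.80), ((0.7, 0.3), 0.65), ((0.5, 0.5), 0.40)]
candidates = [(0.85, 0.15), (0.6, 0.4), (0.45, 0.55)]

# Rank untested compositions by predicted score, best first.
ranked = sorted(candidates, key=lambda x: predict(x, tested), reverse=True)
print("Most promising candidate to test next:", ranked[0])
```

The payoff is exactly the one described above: the surrogate narrows a large candidate pool down to the few compositions worth the cost of physical testing.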

The aim is to reduce the number of materials that an engineer would have to produce and test experimentally to save time and money.

Sophie Blondel, Research Assistant Professor of Nuclear Engineering, University of Tennessee

This article is republished from The Conversation under a Creative Commons license. Read the original article.
