Teaching Experts Are Worried About ChatGPT, but Not for the Reasons You Think
Will students use artificial intelligence to cheat their way through writing assignments? Can professors tell the difference between a chatbot’s prose and their students’? And what does it mean for professors if the answer to those questions is “yes”?
These and other questions have flooded news sites and social media since OpenAI, the artificial-intelligence lab, released a tool called ChatGPT, which promises to revolutionize how we write. Enter a prompt and in seconds it will produce an essay, a poem, or other text that ranges in quality, users say, from mediocre to pretty good. It can do so because it has been trained on vast amounts of digital text pulled from the internet.
Scholars of teaching, writing, and digital literacy say there’s no doubt that tools like ChatGPT will, in some shape or form, become part of everyday writing, the way calculators and computers have become integral to math and science. It is critical, they say, to begin conversations with students and colleagues about how to shape and harness these AI tools as an aid to, rather than a substitute for, learning.
In doing so, they say, academics must also recognize that this initial public reaction says as much about our darkest fears for higher education as it does about the threats and promises of a new technology. In this vision, college is a transactional experience where getting work done has become more important than challenging ourselves to learn. Assignments and assessments are so formulaic that nobody could tell if a computer completed them. And faculty members are too overworked to engage and motivate their students.
“Academia really has to look at itself in the mirror and decide what it’s going to be,” said Josh Eyler, director of the Center for Excellence in Teaching and Learning at the University of Mississippi, who has criticized the “moral panic” he has seen in response to ChatGPT. “Is it going to be more concerned with compliance and policing behaviors and trying to get out in front of cheating, without any evidence to support whether or not that’s actually going to happen? Or does it want to think about trust in students as its first reaction and building that trust into its response and its pedagogy?”
There is some truth underlying that nightmare vision of higher ed, of course. Budget constraints that lead to large-enrollment classes and a reliance on part-time instructors can fuel teaching that feels rote. Such problems aren’t readily solved. But others can be mitigated. Students might cheat because the value of the work is not apparent to them, or because their courses or curricula don’t make sense to them. Those problems, said Eyler, “are totally in our power to correct.”
Whatever strategies professors adopt, teaching experts said, underlying them all is a need to talk to students about why they write. For most professors, writing represents a form of thinking. But for some students, writing is simply a product, an assemblage of words repeated back to the teacher. It’s tempting to blame the students, but that’s how many of them were taught to write in high school.
Generations of students “have been trained to write simulations like an algorithm in school,” only to arrive at college to be told that writing is more than that, said John Warner, a blogger and author of two books on writing. “It feels like a bait and switch to students.”
The challenge of creating authentic assessments, evaluations that measure true learning, is a longstanding one, he noted, recalling his days as an undergraduate cramming for exams in large classes. “I forget everything I learned within hours.”
But the vast majority of students don’t come to college wanting to bluff their way to a degree, Warner said. “If you can create an atmosphere where students are invested in learning, they are not going to reach for a workaround. They are not going to plagiarize. They are not going to copy, they are not going to dodge the work. But the work has to be worth doing on some level, beyond getting the grade.”
At Purdue University, Melinda Zook, a history professor who runs Cornerstone, an undergraduate program that focuses on understanding and interpreting transformative texts, has advised her colleagues to “keep doing what you’re doing.” That’s because the courses are small and built around frequent feedback and discussion focused on the value and purpose of the liberal arts. ChatGPT is much less of a threat to that kind of project-based learning, she said, than to traditional humanities courses. “The fact is the professoriate cannot teach the way we used to,” she said in an email. “Today’s students have to take ownership over every step of the learning experience. No more traditional five-paragraph essays, no more ‘read the book and write about it.’”
Anna Mills teaches English at the College of Marin, a community college in California that draws a lot of first-generation and lower-income students, as well as those for whom English is a second language.
In June, she began experimenting with GPT-3, a predecessor of the model on which ChatGPT was built, to test the software and read up on where it’s headed. Mills, for one, does not think using a text-producing chatbot is going to pose the same ethical quandary to students as plagiarism or contract cheating, in which a student pays someone else to do the work. “They think, ‘this is a new technology. These are tools available to me. So why not use them?’ And they’re going to be doing that in a hybrid way. Some of it’s theirs and some of it’s the generator’s.”
But students are also puzzled and sometimes unsettled about how this technology does what it does. That’s one reason digital literacy has to include AI language tools, she said. Mills has shown her students how Elicit, an AI research assistant, can be an effective search tool. And she assigns readings on how AI can amplify biases, such as racism and anti-Muslim rhetoric.
She is concerned, too, that responses to ChatGPT and other AI might be inequitable. Students who are less fluent in English may be more likely to be accused of using such tools, for example, if they turn in fluid prose. Similarly, if instructors switch to oral presentations, in-class-only writing, or handwritten assignments, that could be a challenge for students with learning disabilities.
Mills has started putting together resource lists and has begun conversations with others in higher education. The Modern Language Association and the Conference on College Composition and Communication, for example, are assembling a joint task force in hopes of providing professional guidance for instructors and departments.
“We need to become part of a societal process of thinking about, how do we want to roll this out? How should such a powerful tool be constructed?” she said. For example, “Should we just trust the tech companies to figure out how to prevent harm? Or should there be more involvement from government and from academia?”
In August at the University of Mississippi, faculty members from the Department of Writing and Rhetoric started holding workshops for colleagues across campus on AI’s potential impact. They are also discussing how tools such as Elicit and Fermat can help students brainstorm, design research questions, and explore different points of view.
Teacher candidates in Dave Cormier’s course at the University of Windsor will spend the spring term looking at how AI tools will affect the future classroom. Cormier, a learning specialist for digital strategy and special projects in the Office of Open Learning, is going to ask them to consider a range of possibilities: some might choose to incorporate such tools, while others might want to limit access to the internet in their classrooms.
Like others, Cormier said digital literacy has to include an understanding of how AI works. One way to do that might be to ask students to run a writing prompt through a program several times over, and look for patterns in those responses. Those patterns could then lead to a discussion of where and how the tool gathers and processes data. “Getting to the next part of the story is the literacy that I’m constantly trying to bring across with my students,” he said.
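Cormier’s pattern-spotting exercise is easy to prototype. Below is a minimal sketch, assuming the OpenAI Python client and an API key; the prompt, the model name, and the three-word phrase window are illustrative placeholders, not anything Cormier prescribes. It sends the same prompt to a model several times, then counts which phrases recur across the independent runs.

```python
# Illustrative sketch: probe a text generator for recurring patterns.
# Assumes the `openai` package (v1+) and OPENAI_API_KEY set in the environment.
from collections import Counter

from openai import OpenAI

client = OpenAI()

PROMPT = "Explain, in one paragraph, why the Roman Republic fell."  # placeholder prompt
RUNS = 5

# Collect several independent responses to the identical prompt.
responses = []
for _ in range(RUNS):
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder model name
        messages=[{"role": "user", "content": PROMPT}],
    )
    responses.append(completion.choices[0].message.content)

def phrases(text: str, window: int = 3) -> set[str]:
    """Return the distinct `window`-word phrases in one response."""
    words = text.lower().split()
    return {" ".join(words[i : i + window]) for i in range(len(words) - window + 1)}

# Count how many of the runs share each phrase.
counts = Counter()
for response in responses:
    counts.update(phrases(response))

# Phrases appearing in more than one independent run hint at the model's habits.
for phrase, n in counts.most_common(20):
    if n > 1:
        print(f"{n}/{RUNS} responses contain: {phrase!r}")
```

Phrases that surface in most of the runs give students something concrete to examine, an opening for exactly the discussion Cormier describes about where and how the tool gathers and processes its data.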
Of course, any strategy to deal with AI takes place against a backdrop of scarcity. Warner, for example, noted that first-year writing programs are often staffed by graduate students and adjunct faculty members, and that large class sizes make more intensive writing assignments a challenge.
Alternative assignments and assessments take an investment of time, too, that some faculty members feel like they can’t spare. “There are not a lot of incentives in the structure of higher education to spend time on those things,” said Warner. In a large course, “you get locked into having to do prompts that can be assessed quickly along a limited set of criteria. Otherwise you can’t work through the stuff you have to grade.”
Whether AI chatbots become a faculty nightmare or just another teaching tool may ultimately come down to this: not the state of the technology, but whether professors are allowed the time to create meaningful work for their students.