How ChatGPT Could Help or Hurt Students With Disabilities
User-friendly artificial-intelligence tools like ChatGPT are new enough that professors aren’t yet sure how they will shape teaching and learning. That uncertainty holds doubly true for how the technology could affect students with disabilities.
On the one hand, these tools can function like personal assistants: Ask ChatGPT to create a study schedule, simplify a complex idea, or suggest topics for a research paper, and it can do that. That could be a boon for students who have trouble managing their time, processing information, or ordering their thoughts.
On the other hand, fears about cheating could lead professors to change testing and assessment in ways that hurt students who cannot do well on, say, an oral exam or an in-class test. And instead of using these tools as simple study aids, students who lack confidence in their ability to learn might allow AI-generated output to replace their own voices or ideas.
Such scenarios can, of course, apply to a wide range of students. You don’t need to have attention-deficit hyperactivity disorder to struggle with ordered thinking. Nor are students with severe anxiety the only ones to stress out over an oral exam. But teaching experts worry that in the rush to figure out, or rein in, these tools, instructors may neglect to consider the ways in which they affect students with disabilities in particular.
“People are really focused, for good reasons, on academic integrity and academic honesty, and trying to redefine what that means with these new tools,” says Casey Boyle, director of the Digital Writing and Research Lab at the University of Texas at Austin, who chairs a working group on digital-content accessibility. But people are just now starting to talk about the opportunities and challenges around AI and disability.
Students with disabilities have long faced challenges in the classroom, starting with the difficulty of securing accommodations that can help them learn better, such as receiving note-taking assistance or extra time to take tests, or being allowed to type instead of writing by hand. Boyle says he has heard of instructors moving from take-home writing assignments to timed, in-class writing exercises to keep students from using ChatGPT. Students who struggle under heavy cognitive loads, who have dyslexia, or who have trouble focusing are not going to perform well under those conditions.
“Students with disabilities or students who require accommodations are already working uphill,” Boyle says. “When we overreact, what we’re doing is increasing the slope of those hills.”
Welcome Assistance
While professors are understandably concerned that students may use AI tools inappropriately, some teaching experts caution against banning them outright, because the tools could assist students with disabilities in several ways.
- Students with mobility challenges may find it easier to use generative AI tools — such as ChatGPT or Elicit — to help them conduct research if that means they can avoid a trip to the library.
- Students who have trouble navigating conversations — such as those on the autism spectrum — could use these tools for “social scripting.” In that scenario, they might ask ChatGPT to give them three ways to start a conversation with classmates about a group project.
- Students who have trouble organizing their thoughts might benefit from asking a generative AI tool to suggest an opening paragraph for an essay they’re working on — not to plagiarize, but to help them get over “the terror of the blank page,” says Karen Costa, a faculty-development facilitator who, among other things, focuses on teaching, learning, and living with ADHD. “AI can help build momentum.”
- Students who have trouble processing information could use ChatGPT for productive repetition, a practice most teachers already rely on to reinforce learning. AI can take that further by letting those students repeatedly generate examples, definitions, questions, and scenarios for the concepts they are studying.
“I really want you as a student to do that critical thinking and not give me content produced by an AI,” says Manjeet Rege, a professor and chair of the department of software engineering and data science at the University of St. Thomas. But because students may spend three hours in a lecture session, he says, “at the end of it, if you would like to take aspects of that, put it into a generative AI model and then look at analogies and help you understand that better, yes, absolutely, that is something that I encourage.”
Teaching experts point out that instructors can use AI tools themselves to support students with disabilities. One way to do that might be to run your syllabus through ChatGPT to improve its accessibility, says Thomas Allen, an associate professor of computer science and data science at Centre College, in Kentucky.
Allen, who has ADHD, is particularly aware of the ways that an overly complex syllabus can stymie students. A 20-page document, for example, with lots of graphics could trip up students with a range of disabilities, such as people with low vision or those who have dyslexia, autism, or ADHD. “That’s using AI to solve a problem that we created,” he says, “by not having an accessible classroom to start with.”
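For instructors who like to tinker, that workflow can even be scripted. What follows is a minimal, purely illustrative sketch, assuming the OpenAI Python client, an API key in the environment, a hypothetical file named syllabus.txt, and a placeholder model name; it is not a tool recommended by Allen or anyone else quoted here.

```python
# Illustrative sketch only: ask a chat model for a plain-language,
# accessible rewrite of a course syllabus. Assumes the OpenAI Python
# client (openai>=1.0), an OPENAI_API_KEY in the environment, and a
# hypothetical plain-text file named syllabus.txt.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("syllabus.txt", encoding="utf-8") as f:
    syllabus = f.read()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; substitute whatever model you use
    messages=[
        {
            "role": "user",
            "content": (
                "Rewrite the following course syllabus in plain language. "
                "Use short sentences, clear headings, and a simple linear "
                "structure that works well with screen readers. Do not add, "
                "remove, or change any policies, dates, or requirements.\n\n"
                + syllabus
            ),
        },
    ],
)

# The output is a draft, not a finished document: review it line by line
# against the original before sharing it with students.
print(response.choices[0].message.content)
```

The same caution raised elsewhere in this story applies here: the rewritten syllabus is a starting point to be checked against the original, not something to hand out as-is.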
Disability-rights advocates have long encouraged instructors to use an approach called universal design for learning, or UDL. In a nutshell, this method enables students to engage with material in many ways. A common example is putting captioning on videos. Another is to provide text explanations of graphics. These strategies can benefit all learners, advocates note, creating more-inclusive classrooms.
“Professors who have designed their courses with UDL at the heart of their pedagogy are going to be better prepared and more adaptive, not only to AI but any other weird and challenging things,” says Costa.
Teaching experts caution that these tools have to be used with care. In simplifying a syllabus, or lecture notes, ChatGPT could change the meaning of words or add things that were not said, Allen notes. And it will reflect biases in the human-generated ideas and language on which it was trained. “You can’t trust the output as it is,” says Allen.
Risks and Challenges
A more-subtle challenge, teaching experts say, is that because students with disabilities can lack confidence as learners, they may be more likely than others to replace their own words and ideas with AI output, rather than use it as an assistant.
Students have, for example, put first drafts of papers through ChatGPT to get feedback on the clarity of their language, the coherence of their arguments, and other measures of good writing. If the AI tools significantly change their words — and not necessarily in a way that an instructor would think is an improvement — a student who doesn’t have faith in their own work and sees the tool as an expert might defer to it. “The outputs I’ve been seeing are overly rational and overly linear and overly correct in a very unproductive way,” says Boyle.
One way to mitigate that risk is to teach all students about the strengths and limitations of AI. That includes showing students how to write thoughtful and specific prompts to get the most useful feedback; discussing the ways that generative AI tools can produce confident-sounding, yet false or flat, writing; and reminding students that ChatGPT is a word predictor without actual intelligence, so it should not be treated as a replacement for a teacher, counselor, or tutor.
“If you keep deferring to the technology, you won’t grow and develop because you’re leaning on this technology,” says S. Mason Garrison, an assistant professor of quantitative psychology at Wake Forest University. “This is a problem for anyone, but it could disproportionately impact folks who are genuinely worried their work isn’t good enough.”
Disability-rights advocates point to two other challenges that could affect students with disabilities more than others.
One is that students who use AI to help generate ideas or smooth out their writing may be more likely to have their work flagged by an AI detector. That’s a problem for a range of students, including those for whom English is not their first language. But a neurodivergent student might face particular trouble in the aftermath, says Allen.
“Sometimes we have difficulty looking people in the eye, and we fidget. It’s part of our social challenges,” he says. “If you get called in and some instructor or the dean says, ‘Your writing has been flagged; tell me why you cheated,’ you’re fidgeting, you’re looking at your shoes. That may be interpreted as guilt. And maybe the student used it to take on the persona of a character and had a conversation, but used that to inform their thinking. That’s a different use case from typing in the prompt and using what it spits out.”
The other challenge is that many students don’t seek accommodations until they need them, and few have ever had to sit through an oral exam or write an essay by hand. They may not realize they would need an accommodation until a professor, wary of AI, springs one of those formats on them.
“In all likelihood, the first time that happens to a student, they’re not going to be able to get the accommodation in time because they never thought they needed it,” says Garrison. “There’s probably going to be a lot of surprises like that. And for professors, it might not even occur to them that that’s something you put in your syllabus.”
One central piece of advice from teaching experts: Include students, and particularly students with disabilities, when designing policies on AI use. That will only become more important as generative AI evolves and becomes embedded in other technologies.
“It’s not all on you to figure this out and have all the answers,” says Costa. “Partner with your students and explore this together.”