Right now, on a purely technical/stylistic level, ChatGPT is an okay writer.
It's not great. But it's not bad, either. It's better (and again, we're talking purely technical here -- leaving aside factual hallucinations and the like) than some of my students, and I teach at a law school. Of course, even when I taught undergraduates I was inordinately concerned that many of my students seemingly never learned and never were taught how to write. So there has always been a cadre of students who are very smart and diligent, but just didn't really have writing in their toolkit. And I'd say ChatGPT has now exceeded their level.
The thing that worries me most about ChatGPT, though, isn't that it's better than some of my law students. It's that it will always be better than essentially every middle schooler.
Learning to write is a process. Repetition is an important part of that process (this blog was a great asset to my writing just because it meant I was writing essentially every day for years). But part of that process is writing repeatedly even when one is not good at writing. Writing a bunch of objectively mediocre essays in middle school is how you learn to write better ones in high school and even better ones in college.
ChatGPT is going to short-circuit that scaffolding. It is one thing to say that an excellent writer in, say, high school, can still outperform ChatGPT. But how will that kid become excellent if, in the years leading up to that, they're always going to underperform a bot that could do all their homework in 35 seconds? The pressure to kick that work over to the bot will be irresistible, and we're already learning that it's difficult-to-impossible to catch. How can we get middle schoolers to spend time being bad writers when they can instantly access tools that are better?
There might be workarounds. I've heard suggestions of reverting to long-hand essay writing and more in-class assignments. There might be ways to leverage ChatGPT as a comparator -- have students write their own essay, then compare it to an AI-generated one and play spot-the-difference. Frankly, I think we might also be wise to abolish grading, at least in lower-level writing-oriented classes, to take away the temptation to use the bot. I don't care how conscientious you are, there aren't a lot of 14-year-olds who can stand putting in hours trying to actually do their homework and then getting blown out of the water by little Cameron, who popped the prompt into an LLM and 45 seconds later is back to playing Overwatch. And again, that's going to be the reality, because ChatGPT's output just is better than anything one can reasonably expect a young writer to produce.
In many ways, large language models are like any mechanism of mass production. They displace older artisans, not because their product is better -- it isn't, it's objectively worse -- but through sheer volume and accessibility. The art is worse, but it's available to the masses on the cheap.
And as with mass production, this isn't necessarily a bad thing, even though it's disruptive. It's fine that many people now can, in effect, be "okay writers" essentially for free. It's like mass-produced clothing -- yes, most people's t-shirts are of lower quality than a bespoke Italian suit, but that's okay because now most people can afford a bunch of t-shirts of acceptable quality. The alternative was never "everyone gets an entire wardrobe of bespoke Italian suits"; it was "a couple of people enjoy the benefits of intense luxury and most people get scraps." Likewise, I'm not so naive as to think that most people, in the absence of ChatGPT, would have become great writers. So this is a net benefit -- it brings acceptable-level writing to the masses.
If that was all that happened -- the big middle gets expanded access to cheap, okay writing, with "artisanal" great writing remaining costly and being reserved for the "elite" -- it might not be that bad. But the question is whether this process will inevitably short-circuit the development of great writers. You have to pass through a long period of being a crummy writer before you become a good or great writer. Who is still going to do that when adequacy is so easily at hand?
I'm not tempted to use ChatGPT because even though my writing takes longer, I'm confident that at the end my work product will be better. But that's only true because I spent a long time writing terribly. Luckily for me, I didn't have an alternative. Kids these days? They absolutely have an alternative. It's going to be very hard to get them to pass that up.
As a child, I didn’t speak enough in school and because of fine motor problems, my handwriting was not very legible. I was eventually given the accommodation of being allowed to use a word processor. Eventually, I began using the family computer to write stories. When someone struggles to express themselves, being given the keys to do so is....
Well, even with being able to type out my words, I still sometimes struggle to come up with the right word.
Anyway...
Because of my lived experience, it's upsetting for me to hear of teachers wanting to force children to use handwriting in order to prevent the use of ChatGPT. Some folks might argue that exceptions would be made for children with disabilities. But getting a diagnosis is far from easy. The best accommodations are those that can be given out easily to anyone who needs them whether officially diagnosed or not.
Computers are great accommodations for all who love writing and have awful handwriting.
As for the classroom, I think too much of American education is focused on catching children cheating and/or not doing their homework.
Maybe required writing assignments will need to be taken out of the curriculum. What if only students who love to write used writing as a learning tool? And for those who are on the fence, maybe it would help to create writing assignments that are more appealing. Even those of us who love reading and writing do not love reading and writing about subjects that bore us. So having more interest-focused, independent learning would probably help.
One idea for classrooms is to encourage students to use ChatGPT and other AI for educational conversations. This is how I usually use Bard. Instead of asking it to write something for me, I usually use it to brainstorm or excitedly blab on about something I recently learned or saw. What if, instead of assessing students with tests and writing assignments, teachers read over conversations between the students and an AI? That could give a picture of a student's communication abilities, their level of curiosity about the subject, their understanding of the subject, etc.
In my head I'm arguing with myself. "But AI isn't always correct. It might give students false information." Yeah. But the same can be said for teachers, parents, and other grown-ups educating the young. Especially all the kiddos being raised by parents who are Fox News couch potatoes.