Let me say something about repetition and patterns. I spent 27 years as a coach and judge of HS Forensics. I've spent countless hours listening to students use 3 point analysis to explain why the US should continue to fund NASA (for example) and I've sat through countless debates on resolutions like, "Resolved: Civil Disobedience is a just response to oppressive government."
By and large, regardless of the topic or the resolution, students followed certain formats. In Debate (Lincoln Douglas debate) it was the Toulmin rhetorical method (Claims, Evidence, Warrants, Impacts) and in Extemporaneous, it was generally 3 point analysis organized around time, location, or hierarchy.
Over time, as a coach, my students became far more fluent and advanced in their rhetorical choices, and their choices were more agile, creative, and nimble.
But they could not have gotten there without first understanding these forms.
And yes, debate is a game with rather predictable patterns at the novice level. But at some point, with enough experience and real-world feedback, they make a huge cognitive leap.
Surely I'm not suggesting that all students should engage in the rigorous and often ridiculous event of HS Forensics, where some of the very best speeches I've ever seen are those that lampoon just how predictable their speeches are.
But I am suggesting that kids need lots and lots of practice and that understanding the importance of form as a scaffold is important...so long as we also understand we need to help them move away from this.
I've spent thousands of dollars and endless hours freewriting and revising work through attendance at Bard College's Institute for Writing and Thinking. I know how I write, and that knowledge is a debt I owe to Peter Elbow, who never taught the way most HS English teachers teach:
"The Teacherless Writing Class
According to Elbow, improving your writing has nothing to do with learning discrete skills or getting advice about what changes you need to make. This stuff doesn’t help. What helps is understanding how other people experience your work. Not just one person, but a few. You need to keep getting it from the same people so they get progressively better at transmitting their experiences while you get better at receiving them."
If what ChatGPT does, as you and many here and elsewhere are saying, is force out those who profess forms and efficiencies over voice and engaging prose, then count me in. I'll put on my VR headset and lead the way to something more human, something unpredictable, something closer to our own truths in words.
I confess a fondness for the five paragraph theme, but mostly only as a means of writing parodies of five paragraph essays. I don’t really want to teach again (though I did teach an online writing class a year ago), but I do think I’d do a much better job at 47 than I did at 24, in part because I’ve realized that writing is genuinely difficult for a lot of people—and that coming up with forms of writing that make sense as “writing one might actually be called upon to do in one’s actual life” is a better mode of approaching the whole business.
(Some day perhaps someone will decide I am qualified to teach seminars on How to Write a Work Email in addition to my dream job of teaching people How to Run a Meeting.)
Excellent points, John. I think that the Prussian model of education, which is designed to segregate the large majority of students into obedient workers while praising and rewarding a subset of leaders, has finally met its match in the AI. The AI accomplishes what many of my elementary school teachers wanted us to do, which is to digest the information that they gave us without question and spit it back out in an approved way. The Japanese have a saying that the nail that sticks out furthest gets hit the hardest, and that certainly can apply in American public schools. I agree with you that this system is not worth saving. Mediocrity is easy enough to achieve anyway.
Sent this to my husband, a high school English teacher whose department is currently reckoning with “what to do about ChatGPT.” Terrific read; thank you.
Very pleased to hear both that you sent it and that they're wrestling with the question. I'm sitting here stewing over the Google AI Olympics ad that has a father telling his daughter to use AI to write a letter to Sydney McLaughlin and thinking that tech companies are going to be the opposite of helpful.
Oof. Did NOT know about that ad.
Thankfully it's being widely pilloried. It makes me hopeful in some sense that people are reflexively recoiling from the dehumanizing parts of the tech.
This gets at the problem exactly. Students hate writing because we obsess over the product, the structure, the details of what the writing should turn into, instead of the process of writing itself. We fixate on the outcome of student writing as if 8th graders are going to be able to produce something valuable if they could only figure out how to get those sentence transitions right. And we then lose out on all of the value that comes from encouraging a regular practice of writing as a way of expression, rather than a system for producing an output.
But of course the output is easier to grade on an AP test, and since the whole point of school is to get good marks on standardized tests I suppose it all makes sense.
Students arrived in my college first-year writing class not as bad writers, but as people who both disliked (or worse) and feared writing, which was the worst part from my perspective. I'd experienced writing as liberating, a way to help me understand the world, and they'd just never been given the chance because of things like - as you say - the AP test. These are students who are among the most "prepared" for college around, and yet they had almost no experience with the way of writing I was asking them to do in our FYW course. It baffled me at the time, which is why I went investigating. And now, here we are...I honestly wonder if things will change or we'll just get a doubling down on what hasn't been working for a long time.
I hope it prompts a better system in public schools, but there's also the worry that we could have something even worse. I know anecdotally (lots of grade school English teachers in my life) that they see the problem as clearly as you seem to in university. One teacher friend does scheduled free-writing, with the grade coming solely from participation. Another teaches International Baccalaureate students, where the focus is on much longer form writing, with written projects having a more mentorship-focused evaluation approach.
But the vast majority of US students are taught to 'the test', and it makes sense when you consider that (woefully underpaid) teachers get bonuses based on their school's grades or even on the number of students who individually pass the AP exams.
There's some cool stuff happening outside of the school systems. David Perell runs a program called 'Liftoff' which exclusively focuses on teaching high-schoolers to be writers: https://writeofpassage.school/2022/09/12/our-vision-for-write-of-passage/

But things like that are obviously small scale and early days.
This is nice. Concerning GPT-3, I knew I had no fear because, like others, I don't deny my sentimental desire, or, as George Orwell put it in "Why I Write," "sheer egoism."
We want to write. We want to communicate. We want to express. And we want to reproduce. As long as we keep these instincts alive, and don't deny them because of some hyper notion of rationality, writing remains alive and GPT-3 won't be the Lord.
John, thank you for breaking down AI like this — and for your kind advocacy for Ancestor Trouble!
Thank you for sharing your expertise in a really interesting and thoughtful essay.
"The reason the appearance of this tech is so shocking is because it forces us to confront what we value, rather than letting the status quo churn along unexamined" -- right to the marrow; excellent insight.
"It’s not every week that someone with my particular employment profile and expertise has something they’re knowledgable about become a hot topic of national discussion"
Hmm. Sounds just like what an AI chatbot *would* say. (LOL)
In my senior AP English class, once we sat for the exam in early April, fake curriculum was over and the real teaching began. Groups had to create a 3 minute movie based on any novel. Our film may have been heavily influenced by Cruel Intentions (17 year olds have notoriously sketchy taste). I will remember that assignment for the rest of my life.
Also, which question on the quiz did you miss??? 🙃
AP exams are the ne plus ultra of examples of writing that's divorced from thinking. We should spend a lot more time thinking about school experiences that students will remember.
I missed the Vikram Seth question. I'd actually read all the other books, so they were easy, but I guessed wrong on Q.4.
John, I've been interested to hear your take on this and it doesn't disappoint. It reminded me of two great (not-five-paragraph) essays worth sharing. Your thoughts about what needs to change in college writing pedagogy made me think of this provocative piece by John Michael Colón about what needs to change in college reading pedagogy (to save the humanities in the same way you'd like to save college writing): https://thepointmag.com/letter/on-the-end-of-the-canon-wars/?mc_cid=0f46429ec7&mc_eid=21b801de99. Colón says: "The value of the humanities is, upon exposure to real humanistic practice, self-evident"--that when students read and engage with great books, when they wrestle with them, debate and discuss them, they learn from them. Then they value them without needing to be told "why." Reading and writing are two sides of the same coin. We can't have one without the other, and both need to be valued if either is to be saved.
The second essay is my all-time favorite piece on NLP-generated writing by one of my favorite essayists (and writing teachers), Meghan O'Gieblyn, cleverly titled "Babel": https://www.nplusonemag.com/issue-40/essays/babel-4/#fn27-13678. The essay was published in 2021 (a lifetime ago in AI years), but it feels uncannily prescient given the current buzz around ChatGPT. O'Gieblyn questions the nature of machine writing and human writing, and subtly challenges their differences. Of GPT-3, she says: "There was something prismatic in its voice, an uncanny chorus of intertextuality, the haunting of writing by other writing. The internet was driven from its earliest days by the promise of universal authorship. Hypertext and collaborative software were going to revive the myth and the folktale, narratives created not by a single creative genius but by the collective effort of any oral storytelling culture. It is tempting at times to see this technology as the realization of that dream, a repository of the collective wisdom and knowledge we’ve accumulated as a species. All of humanity speaking in a single voice."
I'm no O'Gieblyn, but I've done some writing and research in this area. (I did an independent study called "Can a Bot Read? What Happens When the Digital Becomes Literate" with two wonderful, well-respected Information Science professors.) And I'd gently push back on the claim that ChatGPT is a "bullshitter" who doesn't "read" or "write." I think that's true only in the most literal sense. As O'Gieblyn points out, algorithms like GPT-3 and its cousin ChatGPT, are trained on massive data sets of human language. So while it's not explicitly programmed with grammar, it absorbs our speech patterns, which contain grammatical structures. We humans have effectively taught it to "speak," and we've done so by giving up our data to corporations who then use that data to train these machines. The machines learn by trawling or "reading" that data and looking for patterns, a process that I can't help but compare to close reading (only the machines are infinitely better close readers than humans are. They see things our brains can't even register). So, while these systems don't "read" and "write" using the exact same processes as humans, there are some similarities and the effect is often the same. I worry that when we dismiss ChatGPT and its ilk as "bullshitters" or "toys," we ignore their real power over us. I'm no futurist either but I suspect these technologies are going to have a much bigger impact on society than most people realize. (Remember when the iPhone first came out and everyone thought it was just a Walkman you could call your friends on?) Anyway, I'm happy people in reading and writing communities are talking about these technologies, even if it's just the start of the conversation. Thanks for writing about it!
These points are well-taken. I don't want anyone to think that I'm minimizing the potential threat of this technology, or, as you say, "the power they hold over us," but that's one of the reasons I'll still insist it can't read or write. The bots can hoover up and spit out syntax, but they have no independent understanding or appreciation. Unlike humans they have no emotional response to something they read or write.
One of the things I try to do with students is to encourage them to trust their immediate emotional (and physical) reactions to texts as meaningful data, perhaps the most meaningful data. I give them experiences to practice a method I call ROAS (I wish I had a better acronym) which stands for:
React
Observe
Analyze
Synthesize
This, for me, is a human response to text and by starting with the human response before we attempt to analyze the text and then synthesize that analysis into meaning, we're doing something no algorithm can duplicate. The really bad turn that school took was codifying a set of answers (for both reading and writing) that students were expected to figure out. That is the student acting like an algorithm. It's sort of incredible how far away we've gotten from letting students be human.
The AI can definitely create "text," and there's an interesting argument in the epistemological realm about whether or not its remixing of stuff that humans generated makes it "writing" but for me, writing is more than arranging syntax. Reading is more than taking in information. They are embodied experiences, and without the bodies, the meaning isn't the same.
I love that approach! Beginning with a reader's emotional connection to a text seems both human and humane :). As a lit student, I learned with traditional Socratic-style discussions and loose essay assignments. For six years it was basically the same thing: read a text, talk about it, write about it (however you like as long as it's within word count). I had a tremendous amount of freedom. I've only recently realized how lucky I was. If the assignments had been more prescriptive or not reading- and writing-centered, I'm not sure I would have finished college. It will be interesting to see what effect language-model technologies like ChatGPT have on humanities programs that are built around the essay.
You're correct in every point, but ... For most students, all of the immediate incentives are to get the grade, or if they've got a somewhat longer view, to learn to write the sort of things schools will ask them to write in the future. As someone commented about Harvard students, they're "incredibly good at figuring out how to do exactly what they need to do to get the grade. They're incredibly strategic. And I think that's really true of students everywhere." I want to agree with the concept of students wanting to learn how to write to express themselves, but the fraction of students who want to express themselves creatively in writing is fairly small, and I suspect the need for that skill in the "real world" outside school is significantly smaller than the supply of people who can do it now under the current blighted system.
The only sector of literature that I know anything about is science fiction, and some famous S.F. writer noted that the number of people who made a living writing S.F. at any one time "could fit in a van".
There's an irony, in that writing prompts to get ChatGPT to produce adequate versions of writing assignments is itself a matter of writing, and fairly creative writing at that, in that it's not a known process (yet). In a way, it's like a lot of the better-grade writing people need to do in life, not an expression of "what they have to say" but requiring considerable skill to write a text that will induce someone/something to do some particular act the writer wants.
Totally agree about the problem of incentives. Attacking those bad incentives is a huge focus of Why They Can't Write, and The Writer's Practice is an attempt to provide a curriculum that incentivizes the kinds of experiences that are meaningful when it comes to learning how to write, to disrupt that strategic thinking about school and schooling by offering something better: learning!
This goes beyond "creative" expression in the sense of creative writing. It invokes the kind of writing we're doing here, an exchange of views rooted in values, trying to engage with a genuine audience.
Really it's just about thinking of writing as a human activity and experience as opposed to a means of producing an end product.
The link at the beginning of the post, to “More Than Words: How to Think About Writing in the Age of AI” took me right back to this article? I think that is unintentional?
Writing is a tool for externalized thought. Humans have always tried to make their lives easier with tools. Adopting the habit of using a tool means losing one ability to gain another. The problem is what we are willing to give up. I think we'll need to be very careful here. Reasoning is just too valuable to let it go willy-nilly.
In "Discipline and Punish," Foucault identified three levels or technologies of power, with increasing efficacy and sense-making ability: the power of the sovereign, which inks its meaning in blood; the power of the civil order, with its ability to direct thought and norms; and disciplinary power, which enshrines sense in the daily habits of its subjects, so that acting within the accepted bounds set by power becomes the only move the subject is capable of; power bakes in its own reflexes. We are in the age of a new information power, one that bakes the habit directly into our world. It creates synchronicities through which it influences us, like the ads or content we consume. We consume to consume more, and become consumed in the process. This is just another step in the same trend: after removing facts from our hyper-realities, we remove the very technology for sense-making that is reasoning, and writing.
This isn't a cry for the good old days. This kind of power breeds performance; it is efficient and cannot be countered by something less efficient. I think we need to double down on factuality and on creating the bounds and checks to keep reasoning and sense-making within the hands of every person. The clarity that writing and reasoning bring has an advantage over the hallucinations of something fully disconnected from truth and reality. Without reasoning or communication, we end up in the fragmented, alienated world we live in, which breeds conflict, because we have lost the ability to face the Other and communicate with them even when they are unlike us.
"I cannot emphasize this enough: ChatGPT is not generating meaning. It is arranging word patterns. I could tell GPT to add in an anomaly for the 1970s - like the girl looking at Billy’s Instagram - and it would introduce it into the text without a comment about being anomalous."
I asked ChatGPT to introduce the girl looking at Billy's instagram. The response:
"Instagram didn't exist in the 1970s. Do you want to keep the setting authentic to the '70s or update the story to a contemporary timeframe where Instagram fits naturally?"
Let me say something about repetition and patterns. I spent 27 years as a coach and judge of HS Forensics. I've spent countless hours listening to students use 3 point analysis to explain why the US should continue to fund NASA (for example) and I've sat through countless debates on resolutions like, "Resolved: Civil Disobedience is a just response to oppressive government."
By and large, regardless of the topic or the resolution, students followed certain formats. In Debate (Lincoln Douglas debate) it was the Toulmin rhetorical method (Claims, Evidence, Warrants, Impacts) and in Extemporaneous, it was generally 3 point analysis organized around time, location, or hierarchy.
Over time, as a coach, my students became far more fluent and advanced in their rhetorical choices, and their choices were more agile, creative, and nimble.
But they could not have gotten there without first understanding these forms.
And yes, debate is a game with rather predictable patterns at the novice level. But at some point, with enough experience and real-world feedback, they make a huge cognitive leap.
Surely I'm not suggesting that all students should engage in the rigorous and often ridiculous event of HS Forensics, where some of the very best speeches I've ever seen are those that lampoon just how predictable their speeches are.
But I am suggesting that kids need lots and lots of practice and that understanding the importance of form as a scaffold is important...so long as we also understand we need to help them move away from this.
I've spent thousands of dollars and endless hours freewriting and revising work through attendance at Bard College's Institute for Writing and Thinking. I know how I write, and that knowledge is a debt I owe to Peter Elbow, who never taught the way most HS English teacher teach:
"The Teacherless Writing Class
According to Elbow, improving your writing has nothing to do with learning discrete skills or getting advice about what changes you need to make. This stuff doesn’t help. What helps is understanding how other people experience your work. Not just one person, but a few. You need to keep getting it from the same people so they get progressively better at transmitting their experiences while you get better at receiving them."
If what ChatGPT does is, as you and many here and elsewhere are saying, is force out those who profess forms and efficiencies over voice and engaging prose, then count me in. I'll put on my VR headset and lead the way to something more human, something unpredictable, something closer to our own truths in words.
I confess a fondness for the five paragraph theme, but mostly only as a means of writing parodies of five paragraph essays. I don’t really want to teach again (through I did teach an online writing class a year ago), but I do think I’d do a much better job at 47 than I did at 24, in part because I’ve realized that writing is genuinely difficult for a lot of people—and that coming up with forms of writing that make sense as “writing one might actually be called upon to do in one’s actual life” is a better mode of approaching the whole business.
(Some day perhaps someone will decide I am qualified to teach seminars on How to Write a Work Email in addition to my dream job of of teaching people How to Run a Meeting.)
Excellent points, John. I think that the Prussian model of education which is designed to segregate the large majority of students into obedient workers while praising rewarding a subset of leaders has finally met its match in the AI. The AI accomplishes what many of my elementary school teachers wanted us to do, which is to digest the information that they gave us without question and spit it back out in an approved way. The Japanese have a saying that the nail that sticks out furthest gets hit the hardest, and that certainly can apply in American public schools. I agree with you that this system is not worth saving. Mediocrity is easy enough to achieve anyway.
Sent this to my husband, a high school English teacher whose department is currently reckoning with “what to do about ChatGPT.” Terrific read; thank you.
Very pleased to hear both that you sent it and that they're wrestling with the question. I'm sitting here stewing over the Google AI Olympics ad that has a father telling his daughter to use AI to write a letter to Sydney McLaughlin and thinking that that tech companies are going to be the opposite of helpful.
Oof. Did NOT know about that ad.
Thankfully it's being widely pilloried. It makes me hopeful in some sense that people are reflexively recoiling from the reflexive dehumanizing parts of the tech.
This gets at the problem exactly. Students hate writing because we obsess over the product, the structure, the details of what the writing should turn into, instead of the process of writing itself. We fixate on the outcome of student writing as if 8th graders are going to be able to produce something valuable if they could only figure out how to get those sentence transitions right. And we then lose out on all of the value that comes from encouraging a regular practice of writing as a way of expression, rather than a system for producing an output.
But of courze the output is easier to grade on an AP test, and since the whole point of school is to get good marks on standardized tests I suppose it all makes sense.
Students arrived in my college first-year writing class not as bad writers, but as people who both disliked (or worse) and feared writing, which was the worst part from my perspective. I'd experienced writing as liberating, a way to help me understand the world, and they'd just never been given the chance because of things like - as you say - the AP test. These are students who are among the most "prepared" for college around, and yet they had almost no experience with the way of writing I was asking them to do in our FYW course. It baffled me at the time, which is why I went investigating. And now, here we are...I honestly wonder if things will change or we'll just get a doubling down on what hasn't been working for a long time.
I hope it prompts a better system in public schools, but there's also the worry that we could have something even worse. I know anecdotally (lots of grade school English teachers in my life) that they see the problem as clearly as you seem to in university. One teacher friend does scheduled free-writing, with the grade coming solely from participation. Another teaches International Baccalaureate students, where the focus is on much longer form writing, with written projects having a more mentorship-focused evaluation approach.
But the vast majority of US students are taught to 'the test', and it makes sense when you consider that (woefully underpaid) teachers get bonuses based on their school grades or even on the number of students that individually pass the AP exams.
There's some cool stuff happening outside of the school systems. David Perell runs a program called 'Liftoff' which exclusively focuses on teaching high-schoolers to be writers: https://writeofpassage.school/2022/09/12/our-vision-for-write-of-passage/
But things like that are obviously small scale and early days.
This is nice. Concerning GPT3, I knew I had no fear because like others, I don't deny my sentimental desire, or as George Orwell wrote in "Why I Write," for Sheer egoism.
We want to write. We want to communicate. We want to express. And we want to reproduce. As long as we keep these instincts alive - or not deny them because of some hyper notion of rationality, writing remains alive and GPT3 won't be the Lord.
John, thank you for breaking down AI like this — and for your kind advocacy for Ancestor Trouble!
Thank you for sharing your expertise in a really interesting and thoughtful essay.
"The reason the appearance of this tech is so shocking is because it forces us to confront what we value, rather than letting the status quo churn along unexamined" -- right to the marrow; excellent insight.
"It’s not every week that someone with my particular employment profile and expertise has something they’re knowledgable about become a hot topic of national discussion"
Hmm. Sounds just like what an AI chatbot *would* say. (LOL)
In my senior AP English class, once we sat for the exam in early April, fake curriculum was over and the real teaching began. Groups had to create a 3 minute movie based on any novel. Our film may have been heavily influenced by Cruel Intentions (17 year olds have notoriously sketchy taste). I will remember that assignment for the rest of my life.
Also, which question on the quiz did you miss??? 🙃
AP exams are the ne plus ultra of examples of writing that's divorced from thinking. We should spend a lot more time thinking about school experiences that students will remember.
I missed the Vikram Seth question. I'd actually read all the other books, so they were easy, but my guessed wrong on Q.4.
John, I've been interested to hear your take on this and it doesn't disappoint. It reminded me of two great (not-five-paragraph) essays worth sharing. Your thoughts about what needs to change in college writing pedagogy made me think of this provocative piece by John Michael Colón about what needs to change in college reading pedagogy (to save the humanities in the same way you'd like to save college writing): https://thepointmag.com/letter/on-the-end-of-the-canon-wars/?mc_cid=0f46429ec7&mc_eid=21b801de99. Colón says: "The value of the humanities is, upon exposure to real humanistic practice, self-evident"--that when students read and engage with great books, when they wrestle with them, debate and discuss them, they learn from them. Then they value them without needing to be told "why." Reading and writing are two sides of the same coin. We can't have one without the other, and both need to be valued if either is to be saved.
The second essay is my all-time favorite piece on NLP-generated writing by one my favorite essayists (and writing teacher) Meghan O'Gieblyn, cleverly titled "Babel": https://www.nplusonemag.com/issue-40/essays/babel-4/#fn27-13678). The essay was published in 2021 (a lifetime ago in AI years), but it feels uncannily prescient given the current buzz around ChatGPT. O'Gieblyn questions the nature of machine writing and human writing, and subtly challenges their differences. Of GPT-3, she says: "There was something prismatic in its voice, an uncanny chorus of intertextuality, the haunting of writing by other writing. The internet was driven from its earliest days by the promise of universal authorship. Hypertext and collaborative software were going to revive the myth and the folktale, narratives created not by a single creative genius but by the collective effort of any oral storytelling culture. It is tempting at times to see this technology as the realization of that dream, a repository of the collective wisdom and knowledge we’ve accumulated as a species. All of humanity speaking in a single voice."
I'm no O'Gieblyn, but I've done some writing and research in this area. (I did an independent study called "Can a Bot Read? What Happens When the Digital Becomes Literate" with two wonderful, well-respected Information Science professors.) And I'd gently push back on the claim that ChatGPT is a "bullshitter" who doesn't "read" or "write." I think that's true only in the most literal sense. As O'Gieblyn points out, algorithms like GPT-3 and its cousin ChatGPT, are trained on massive data sets of human language. So while it's not explicitly programmed with grammar, it absorbs our speech patterns, which contain grammatical structures. We humans have effectively taught it to "speak," and we've done so by giving up our data to corporations who then use that data to train these machines. The machines learn by trawling or "reading" that data and looking for patterns, a process that I can't help but compare to close reading (only the machines are infinitely better close readers than humans are. They see things our brains can't even register). So, while these systems don't "read" and "write" using the exact same processes as humans, there are some similarities and the effect is often the same. I worry that when we dismiss ChatGPT and its ilk as "bullshitters" or "toys," we ignore their real power over us. I'm no futurist either but I suspect these technologies are going to have a much bigger impact on society than most people realize. (Remember when the iPhone first came out and everyone thought it was just a Walkman you could call your friends on?) Anyway, I'm happy people in reading and writing communities are talking about these technologies, even if it's just the start of the conversation. Thanks for writing about it!
These points are well-taken. I don't want anyone to think that I'm minimizing the potential threat of this technology, or, as you say, "the power they hold over us," but that's one of the reasons I'll still insist it can't read or write. The bots can hoover up and spit out syntax, but they have no independent understanding or appreciation. Unlike humans, they have no emotional response to something they read or write.
One of the things I try to do with students is to encourage them to trust their immediate emotional (and physical) reactions to texts as meaningful data, perhaps the most meaningful data. I give them experiences to practice a method I call ROAS (I wish I had a better acronym), which stands for:
React
Observe
Analyze
Synthesize
This, for me, is a human response to text. By starting with the human response before we attempt to analyze the text and then synthesize that analysis into meaning, we're doing something no algorithm can duplicate. The really bad turn that school took was codifying a set of answers (for both reading and writing) that students were expected to figure out. That is the student acting like an algorithm. It's sort of incredible how far we've gotten from letting students be human.
The AI can definitely create "text," and there's an interesting argument in the epistemological realm about whether or not its remixing of stuff that humans generated makes it "writing," but for me, writing is more than arranging syntax. Reading is more than taking in information. They are embodied experiences, and without the bodies, the meaning isn't the same.
I love that approach! Beginning with a reader's emotional connection to a text seems both human and humane :). As a lit student, I learned with traditional Socratic-style discussions and loose essay assignments. For six years it was basically the same thing: read a text, talk about it, write about it (however you like, as long as it's within the word count). I had a tremendous amount of freedom. I've only recently realized how lucky I was. If the assignments had been more prescriptive, or not reading- and writing-centered, I'm not sure I would have finished college. It will be interesting to see what effect language-model technologies like ChatGPT have on humanities programs that are built around the essay.
You're correct in every point, but ... For most students, all of the immediate incentives are to get the grade, or if they've got a somewhat longer view, to learn to write the sort of things schools will ask them to write in the future. As someone commented about Harvard students, they're "incredibly good at figuring out how to do exactly what they need to do to get the grade. They're incredibly strategic. And I think that's really true of students everywhere." I want to agree with the concept of students wanting to learn how to write to express themselves, but the fraction of students who want to express themselves creatively in writing is fairly small, and I suspect the need for that skill in the "real world" outside school is significantly smaller than the supply of people who can do it now under the current blighted system.
The only sector of literature that I know anything about is science fiction, and some famous S.F. writer noted that the number of people who made a living writing S.F. at any one time "could fit in a van".
There's an irony, in that writing prompts to get ChatGPT to produce adequate versions of writing assignments is itself a matter of writing, and fairly creative writing at that, in that it's not a known process (yet). In a way, it's like a lot of the better-grade writing people need to do in life, not an expression of "what they have to say" but requiring considerable skill to write a text that will induce someone/something to do some particular act the writer wants.
Totally agree about the problem of incentives. Attacking those bad incentives is a huge focus of Why They Can't Write, and The Writer's Practice is an attempt to provide a curriculum that incentivizes the kinds of experiences that are meaningful when it comes to learning how to write, to disrupt that strategic thinking around school and schooling by offering something better: learning!
This goes beyond "creative" expression in the sense of creative writing. It invokes the kind of writing we're doing here, an exchange of views rooted in values, trying to engage with a genuine audience.
Really it's just about thinking of writing as a human activity and experience as opposed to a means to an end to produce an end product.
The link at the beginning of the post, to “More Than Words: How to Think About Writing in the Age of AI” took me right back to this article? I think that is unintentional?
Writing is a tool for externalized thought. Humans have always tried to make their lives easier with tools. Adopting the habit of using a tool means losing an ability to gain another one. The problem is what we are willing to give up. I think we'll need to be very careful here. Reasoning is just too valuable to let it go willy-nilly.
In "Discipline and Punish," Foucault identified three levels or technologies of power, with increasing efficacy and sense-making ability: the power of the sovereign, which inks its meaning in blood; the power of the civil order, with its ability to direct thought and norms; and disciplinary power, which enshrines sense in the daily habits of its subjects, so that freedom to act within the accepted bounds set by power becomes the only move the subject is capable of. It bakes in its own reflexes. We are in the age of a new information power, one that bakes the habit directly into our world. It creates synchronicities through which it influences us, like the ads or content we consume. We consume, to consume more, and become consumed in the process. This is just another step in the same trend: after removing facts from our hyper-realities, we remove the very technology for sense-making that is reasoning, and writing.
This isn't a cry for the good old days. This kind of power breeds performance; it is efficient and cannot be countered by something less efficient. I think we need to double down on factuality and on creating the bounds and checks to keep reasoning and sense-making in the hands of every person. The clarity that writing and reasoning bring has an advantage over the hallucinations of something fully disconnected from truth and reality. Without reasoning or communication, we end up in the fragmented, alienated world we live in now, which breeds conflict, because we have lost the ability to face the Other and communicate with them even when they are unlike us.
bad popup, disable the nag in substack settings
"I cannot emphasize this enough: ChatGPT is not generating meaning. It is arranging word patterns. I could tell GPT to add in an anomaly for the 1970s - like the girl looking at Billy’s Instagram - and it would introduce it into the text without a comment about being anomalous."
I asked ChatGPT to introduce the girl looking at Billy's Instagram. The response:
"Instagram didn't exist in the 1970s. Do you want to keep the setting authentic to the '70s or update the story to a contemporary timeframe where Instagram fits naturally?"