
What I think you really illustrate here is how necessary your comprehensive and sophisticated prior knowledge and work as a writer (across multiple contexts, even) is to the purposeful use of the tool. As I think about it, the different dimensions of what I call "the writer's practice" that you're illustrating are almost uncountable.

The amount of practice and depth of experience it takes to get to that place is measured in years and years. I don't think of myself as a Luddite when it comes to technology. I've continually integrated technology into the way I teach over the years, and because this stuff is still new my thoughts are still developing, but I tend to lean toward a first-do-no-harm approach when it comes to integrating these things into the work students do in school, and to focus on helping them develop the writing practices that will allow them to build the knowledge, skills, attitudes, and habits of mind that let them use these tools with agency, as you clearly do.

One thing I'll observe, though, is that your accurate observation about the difficulty of trying to forge a full-time career as a novelist, needing to be "lucky or prolific," is an acknowledgement of a bad status quo that I think generative AI is going to make worse. Essentially, there is no winning that race because it's set up for writers to lose. I want to spend more time talking about how it becomes possible for the human work of writing to be supported in ways that don't require inordinate amounts of luck, or an exhausting sprint on a treadmill of productivity.

It's possible that ship has sailed. I'm not naive, and I've been living that reality for the last 20-plus years, but I'm not ready to accept the idea that we have to jump on this train or risk getting crushed under its wheels.

I watched that Sal Khan talk right after it was released, and I have to say that it's a good example of what I'd like to push against. One of the current problems (IMO) we have in education is students feeling alienated from the work we ask them to do in school contexts. I don't know how outsourcing what could, and I would say should, be done by humans (but isn't, because it's apparently too costly) is progress. I'm not saying that an AI tutor has no potentially helpful role, but the thought of students spending time writing for their digital tutor rather than for other humans makes me profoundly depressed.

Who is going to read these books you're working so hard on if those are the experiences we provide to students as they're developing as humans?
