19 Comments
May 28, 2023 · edited May 28, 2023 · Liked by John Warner

I just saw Altman’s session at Wisdom 2.0 with Jack Kornfield. Altman reminded me of my younger self when I was negotiating with AOL at the height of its power. I got rolled. The tech giants will say anything, but the juice that drives them seems much more banal, like getting Samsung to use Microsoft's Bing rather than Google (a multibillion-dollar contract).

Altman is socially pleasing window dressing and is not in control. His company has seven directors and he does not control them. I appreciate your cold water and agree that the tech giants pursuing competitive gains will do what they want. The FTC notably stated that companies that fire their ethics teams should not be surprised by regulation. I hope this is true and that we don’t have more regulatory capture. Thank you for your tech perspective. https://jackkornfield.com/how-will-ai-impact-our-world-a-conversation-with-sam-altman-of-chatgpt/

author

I saw somewhere that Microsoft is rolling out GPT integration into the Office suite of programs imminently, so it's going to be tough to avoid this even if you want to. I was worried about the response of some senators after Altman's testimony. I worry about how few lessons have been learned from the prior hands-off treatment of tech, but with OpenAI promising magical productivity gains, no one wants to be the person who "strangles" innovation. How sure are we that it's truly innovative?

May 28, 2023 · Liked by John Warner

I truly appreciate your article because it reminds us how much of what is being sold is hype. In Europe they are willing to stop technology that plagiarizes from databases without permission, violates privacy, or provides terribly wrong and harmful answers (in part because answers are sourced from some extremist database, per an investigation by the Washington Post). I am delighted that the FTC noted that regulation was being considered for companies that fired their ethics teams. I have been educated by Melanie Mitchell's and Gary Marcus's public tweets and writings. And your newsletter today is excellent in helping your readers remember that this technology and those forcing it on us do not have our interests at heart!

PS One thing that did help me when I watched Altman was a class on body language that Jack Brown, Ph.D. (on Twitter) is teaching. For my writing I wanted to develop a better body-language vocabulary. When I saw Sam Altman, I used some of that analysis to interpret "what am I seeing." He looked like a guilty child - something I would not have picked up on without developing a deeper somatic vocabulary.

author

I read Melanie Mitchell's book when it came out a few years ago and it's really shaped how I think about these things. I appreciated that she's not a skeptic at all, but she approaches the technology with a kind of humility that gets underneath the surface-level amazement at what it SEEMS capable of. It's like Elon's self-driving cars. What can be done seems close enough to self-driving that you think full self-driving can't be far off. It turns out, though, that that last seemingly small gap is, in reality, an unbridgeable gulf.

May 28, 2023 · Liked by John Warner

I can't help but see a future where actual, human creativity is reserved for the hobby projects of the rich. Where art belongs to whatever version of the church and the Medici family is in power. Probably Tesla and Khloé Kardashian, respectively.

We might be living in a golden age of art, where the average person can watch shows with great dialogue and read a book full of ideas written by another human being.

I worry that in the future all art and research will only be available in areas that match the interests of those able to privately fund the artist/scientist.

Those who can afford the processing power. The gatekeepers will tax the use of supercomputers.

author

I think we are already heading towards this scenario. Even now it's very difficult to support yourself through your writing, and almost impossible if you're tying yourself to trying to produce "art." Fewer people are majoring in the humanities because, in part, they fear that those majors won't pay off vs. the cost of college. The structures are hostile to the sustainability of a creative life.

May 28, 2023 · Liked by John Warner

While I beg to differ with Sam Altman, a greater mind than mine, AI is not what I have been waiting for. But it, like many other tech-forward tools (the internet itself, for example), is only a tool. I am trying to write a memoir of my family, for my family, not for publication. Rather than make it a chronological list of events, names, and dates, I am trying to make it readable, humorous, and poignant. With no prior experience writing creative non-fiction, I have difficulties. I write and write and write. With the magic of editing tools (Grammarly, ProWriting Aid, and the editor in Word), I sound better all the time. I also learn from the suggestions. I was stuck the other day and wanted a word starting with p for an alliterative string of descriptions I was trying out. My thesaurus skills weren't cutting it. Hmm, I'll try ChatGPT. Four seconds later, boom! Another tool at my disposal.

Yes, straight up, AI is a disruptive technology. Still, nothing will replace human creativity. When I sent a paragraph out there for suggestions, what came back wasn't me, so I went on my merry way in my own voice.

author

I do appreciate the way some of these tools have improved. Until the last few months, I never used a grammar check because the way it flagged my writing irritated the crap out of me. I knew the stuff I was doing didn't adhere to all the rules, but as you say, I want to write in my own voice.

Now, there are times when I do find it helpful as a quick check on something that I'm trying to get into the world when I don't have a ton of time for polish. And thanks to spellcheck, all my too-fast typing errors get corrected automatically as long as I get close enough on the word I'm looking for.

May 28, 2023 · Liked by John Warner

You're really good at this column thing, John. You lean into controversy *just* enough that I'm inspired to comment, which is unusual for me. Here are some of my reactions:

1) I definitely do not agree with that tweet, nor with the idea that AI is going to destroy creativity or writing.

2) I'm not sure AI, a dynamic and broad technology (like the internet) with the ability to do many, many things, can be compared to bad loans causing a financial crisis. I guess the idea is people will believe in AI so much that, when the rug is pulled, they'll be fucked? It reminds me of those who warn about the collapse of the internet, or the collapse of the power grid.

3) "At the speed with which AI is moving ... we'll have destroyed the existing structures that allow writing by humans to happen." I'm not sure I get this. Writing by humans can always happen. In fact, it *has* to happen. It's communication. It's stories. Text is the primary form of communication in the web 2.0 and 3.0 world. Writing by humans isn't going anywhere. In fact, there may be more of it.

4) Back to the tweet and your essay, this doomsday vibe seems to be *actually* AI's Big Hype, and it skips a lot of bases. You're right that Altman is playing to it, but only because he knows how prevalent it is in public discourse. I think he's trying to rise to that fear to show people he gets them, which I'm sure he finds a little ridiculous given how basic AI is right now. It makes sense, though: the fear. It's usually writers saying this stuff. But have we forgotten about AI's promise to eliminate straight-up annoying busy-work tasks? E.g., building a website, mocking up a creative idea, copy editing or formatting a book, sending 1,000 emails, automating repeat tasks, turning low-quality sound into high-quality (Adobe Podcast), generating graphics without spending a week designing (so we can get back to the bigger tasks). Creatives, I'm sure, are going to flourish because of AI. It makes so many new things possible, saves time, and makes it easier for artists to express themselves, eliminating technical or time-based blockers. It's a tool, like a smartphone. It makes life easier and makes more things possible.

5) Lastly, I was confused by your last paragraph. Do you think that, "if generative AI is the gateway to mass human flourishing and creativity" it follows that no one will read/listen to books? I don't buy it.

Thanks for this thought-provoking piece, John.

author

Some thoughts on your thoughts.

Re 2): I'm not making a direct comparison of the tech to the loans, but working through the notion of how we - and I mean all of us - sometimes allow our desires for the lives we think we're supposed to want to override good sense. AI is sold as a kind of progress, the tech people have been waiting for, according to Altman. I'm trying to explore how true or not true that is, as well as deal with the trap of believing in things like the American Dream of home ownership, or in my case, writing and reading as experiences where the doing by the humans is of primary importance.

3): When I say writing, I don't mean the mere production of text but, as referenced above, writing as a fully embodied human experience rooted in the act of thinking and feeling as humans, as opposed to assembling text the way GPT does. I'm trying to draw a distinction between text production and what I think of as writing. That's sort of the core of this book and the thing that I fear will be lost.

4): No offense, but this all sounds naive to me, and like the kind of thinking I'm hoping to guard against. Sure, all these tools can be used to aid creative work, but they are also in the process of destroying entire markets for, say, graphic artists, the people whose work was used without pay or permission to create these image-generating programs. What you describe as a technical barrier is actually a barrier of skill, practice, and talent, not technology. Why are time and efficiency more important than craft and quality?

Why is easier better?

How will creatives flourish in a marketplace where you can't get paid for your creativity? This isn't working out for journalists at the moment, which harms all of us who rely on journalism to keep our democratic order functioning.

The unthinking embrace of efficiency as the primary value has destroyed how we teach writing in school, denying students the kinds of experiences that help them learn how to think as writers.

5): I mean that no one is going to remember my book if I'm wrong. No one will read it if I'm right either, for that matter.

May 28, 2023 · Liked by John Warner

Not sure AI will ever replace human creativity when it comes to the arts - but in the near term it's perfectly capable of producing human-sounding tweets. Mass disinformation at scale. Entire AI-generated "humans" can be spun up and programmed to post banal everyday tweets to build credibility and then collectively directed to prop up whatever lie, clickbait, or disinformation campaign their operators see fit to amplify. Websites to cite can be spun up just as quickly, with credible-enough-sounding copy attached to sensational headlines. All the current book banning happening in the States can be traced back to 11 individuals; imagine what a small number of folks handling an army of bots could do. Dating honeytraps and elaborate texting scams are brute-force phishing attacks that rely on volume - perfect for AI. It's a bit tinfoil-hat sounding, but you can never go wrong betting on people's basest instincts.

author

These are big concerns, and we don't have the capacity to police them or rein them in.

The question I have is, if people aren’t writing any longer because of AI, what will the AI use to learn what writing is? Isn’t everything it learns based on what WE give it? Language and speech are always changing; can you imagine the future equivalent of AI spitting out Old English essays...

author

It's a good question. I wonder what happens when an increasing amount of the data it's trained on has been generated by the AI.

May 28, 2023 · Liked by John Warner

As a retired teacher and communications director, I wrote a number of blog entries. While writing blog entries last fall as a subcontractor, I tried AI either to get a topic started or to add depth of knowledge and interesting information to certain paragraphs. I found that it simply created (hallucinated, as you put it) information that was completely untrue. The company warned that its AI data was current only through 2019, but if it writes that the capital of the United States is Philadelphia and that a species of freshwater fish lives in saltwater, these kinds of “choices” obviously can't be blamed on a 2019 cutoff. At this point, AI is no more helpful to a writer than playing Dungeons & Dragons. Maybe less so.

Since humanity can’t even come together to prevent the ongoing cascade of climate-catastrophic events, I’m afraid the chance that we come together to prevent the AI debacle is just about nil.

author

The lawyer who got himself in trouble actually asked GPT if the cites were accurate and it told him yes. I think your example is an indicator that we don't know if this stuff is actually going to work the way the developers promise, or if it'll just keep falling flat. The problem from my POV is that it's arriving whether it works or not.

The lie of inevitability is such a powerful tool when it pays off, and looks really bad when it doesn't. If you get enough people with enough power saying something is inevitable, and acting as if it is, it becomes a powerful self-fulfilling prophecy. However, I'm reminded of the treatment of NFTs, an obviously absurd and stupid idea that couldn't be forced on people no matter how many celebrities and wealthy people promoted them. AI is obviously very different, since the US has treated technology and technological "disruption" as an inherent part of progress (and therefore implicitly good). The Received Wisdom podcast from the Michigan Ford School has a good episode about how US science investment since the 1960s has encouraged this sort of "new frontier" or "manifest destiny" thinking about technology.

In this passage, I think you should have stopped at the point I marked: "but it’s also true that individuals could have resisted taking out these mortgages that were obviously too good to be true.

Or could they? Maybe this was simply impossible given the functioning of society"

I wish I could remember the title, but I recently read a review of a book on public policy that tracked the history of housing policy in the US and showed how the government has spent huge amounts of money subsidizing homeownership, both as a way to gain wealth and stability and as a way to maintain the racial stratification of US society. From white homesteaders getting free land after pushing Native Americans off of it, to redlining, to the continued subsidization of mortgages through tax rebates and first-time homeowner grants, the US has ignored deeper systemic issues by promoting this individualistic solution for advancement. So I actually do think that it was the kind of choice that made sense to a lot of struggling people. Homeownership is the American staple of security and stability.

I sometimes shudder to confess that I write tweets for a living, but I do. A while back my boss asked ChatGPT to write some for our organization and showed them to me. I said, "Well, they'd be great if we wanted to put people to sleep."

I think the people who think AI can replace humans are people who have no understanding of what writing (and reading, and research) is in the first place. Sure, AI can write a strategic plan and it probably sounds like every other strategic plan you've ever read, because strategic plans are terrible. If AI takes over writing, it won't be because it's good at writing--it will be because people don't get what writing is.
