Looking for quick, anodyne text? Something passable, mostly grammatically correct, and superficially professional?
You’re in luck. Tools like ChatGPT, JasperAI, and many others can spew out serviceable prose in no time. Beyond standalone tools, text-generation capabilities have already arrived in Microsoft Word, Google Docs, and other word processors. Ditto for just about every other mainstream software application, like Notion.
But should you use AI to write something of consequence, like a book?
I’ll argue against doing so—and not just out of self-interest. As it turns out, large language models have a great deal in common with Xerox machines and lossy compression.
For more on this subject, give Ted Chiang’s “ChatGPT Is a Blurry JPEG of the Web” in The New Yorker a read. If you lack the time, here’s the money paragraph:
Sometimes it’s only in the process of writing that you discover your original ideas. Some might say that the output of large language models doesn’t look all that different from a human writer’s first draft, but, again, I think this is a superficial resemblance. Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly, and it is accompanied by your amorphous dissatisfaction, your awareness of the distance between what it says and what you want it to say. That’s what directs you during rewriting, and that’s one of the things lacking when you start with text generated by an A.I.
No one can stop you from using generative AI to write for you or with you. Hell, Reid Hoffman did it a few months back.
Will you save time? Sure, but the results won’t differentiate your idea or book. Maybe that changes in the future. For now, though, if you’re intent on writing an original and meaningful text, it’s best to go old school and put the time in.