AI will not write for me.
LLMs are very good at manipulating language. Still, there's a lot of slop out there. It's not "slop" because it's inherently useless (though plenty of it is), but because it lacks the perspective, experience, and accountability of a human. At best, AI-generated content is the soulless, plastic average of those experiences, which necessarily robs the reader of something special, even if it does move helpful, transactional information from a machine into my brain.
I say all this as a daily, satisfied user of AI. Most of the time, it's for solving technical problems (including writing chunks of code), helping me think of a word different from the one I used a couple of sentences prior, or getting quick answers to questions. Incidental value like this is genuinely helpful, and I'm glad to have it as a tool at my disposal.
Where I draw the line is letting it articulate an idea on my behalf, disconnected from the idiosyncratic perspective only I can offer, and risking expressing an idea I don't actually stand behind.
That's why I've committed to never letting AI write for me. It may help generate a contrived code example, or feed me an analogy that helps articulate an idea. But it will not own the expression of the idea itself. You can count on that while reading everything on this site.
🫡