I Fed A Chatbot The Complete Works of Baudrillard and Now It Won't Stop Screaming
At some point in every phone conversation with my 85-year-old father, the following exchange takes place:
Him: Let me ask you something...what do you think about this...1-A? A-1?
Me: AI? I fucking hate it.
Him: Yeah, that's it...I can't remember what it stands for...
I'm not including this to burn my dad—who seems to have real short-term memory problems that he attributes to not enough Omega-3s in his diet, as opposed to, say, the decades of regular cannabis use—because I think his lack of understanding has more to do with his general technophobia than anything neurological. I am reasonably sure that my dad has never been on the internet. He is incapable of figuring out a smartphone, to the point of having somehow done something to send all of my calls directly to voicemail, which he never checks, so I have to wait for him to call me if I want to talk to him.
Okay, maybe I'm burning him a little bit. It doesn't matter; he will never get on the internet and see this, and even if he did, he wouldn't remember it.
The word "luddite" gets tossed around a lot as shorthand for people who are antagonistic towards AI and other new technologies, but of course the actual legacy of the Luddites is more specifically a skepticism of, and objection to, technology used to benefit the ownership class while exploiting workers. Lots of AI-haters, myself very much included, love technology, but have found ourselves increasingly uncomfortable with the way it is being developed and deployed seemingly without any concern for future impacts beyond what can be entered on a projected-earnings sheet to send stock prices into orbit.
The "AI is inevitable" contingent needs you to believe that's true, because that framing is essential to their purposes. The sell is that we have to invest this enormous amount of money in infrastructure now, without any returns, because that will enable us to reap the benefits way down the line. Web 2.0 largely rewarded the people and companies who were able to continuously dump unfathomable sums of money into products that only became profitable by eliminating competition and undermining existing institutions, and those people and their sycophants are the same ones pushing all of their money back on the pass line and rolling the dice on things like VR and AI.
Part of the problem is that most of these people, even those with technical knowledge, are dumb as shit. They have zero interest in anything that can't be distilled down to a TikTok explainer, and have so warped the idea of a marketplace that there's no personal benefit to them in learning anything more complex. Even the above example of betting at a craps table is inaccurate. Since profits and losses don't really matter, it's more like just slamming the MAX BET button on a slot machine over and over again hoping for a jackpot without ever having to look at your wins and losses. One win pays for all, and if you thought the dice have no memory, wait until you get a load of the modern stock market.
I don't have a particularly advanced understanding of economics, and my objections to AI have a lot more to do with a deep aversion to the types of things it's being hyped as useful for. Almost all of the tech claims are bunk, dressed up as "Oh, it's a tool for helping people create/build, not a replacement for it," an ironic assertion since it is absolutely being sold as a way to replace the actual creative and constructive work of human beings so that uncreative people can feel capable of saying "I made a thing."
Even steering clear of the artistic stuff, the use cases for AI are almost always bad. The one I hear the most is that it's great for replying to emails, but email itself is a tool for communicating between people, and the part it makes sense to outsource is the routine, non-productive portion. Rather than sitting down with a pen and paper and writing, then revising, and finally mailing off information that is presumably important enough to warrant the effort, and then waiting hours, days, months for a reply, you can type it up, digitally edit, and even have the machine check the spelling for you before sending it almost instantly to the other party, who can reply almost as quickly. Automating the content delivery system makes sense because it's a vessel for the information you presumably want to convey. Telling your phone or computer to send a message about X topic to a colleague who will presumably just scan the AI summary before having their phone or computer compose a response isn't a brilliant time-saving hack; it indicates either that you are bad at that aspect of your job, or that your job only exists right now because someone needs a person to periodically perform very routine tasks that will be handed over to the AI you're using as soon as your boss thinks they can get away with it.
Sigh...I didn't mean to write this post. I sat down after walking my dogs and thinking about why I like procedural generation, which is often lumped in with AI, but not LLMs and GenAI, which are most of what we're talking about right now. I was playing some Caves of Qud this morning, and reveling in the unpredictability of semi-random systems, which rely on procedural generation but still allow a sense of authorial intent. I like it because there is (usually) thought put into the frame that contains these systems that keeps them from feeling like Mad Libs, but they still have enough unpredictable elements to carom off one another in interesting ways. Put another way, there's nothing that exists within the system that wasn't intended by its designer, but the number of possible combinations is so large that it's impossible even for the designer to predict exactly what will happen on a given run.
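That principle—nothing appears that the designer didn't author, but the combinations outrun anyone's ability to predict them—can be sketched in a few lines. This is a toy illustration, not anything resembling Caves of Qud's actual code; all the names and lists here are invented for the example.

```python
import random

# Hypothetical designer-authored parts. Nothing outside these lists can
# ever appear in the output; the dice only decide the combination.
CREATURES = ["glow-moth", "salt kraken", "chrome bear"]
TRAITS = ["two-headed", "phase-shifting", "rusted"]
WANTS = ["hoards books", "fears water", "worships the sun"]

def generate_encounter(seed: int) -> str:
    """Deterministically build one encounter from authored parts."""
    # A seeded RNG: the same seed always yields the same encounter,
    # but the author can't guess which one without running it.
    rng = random.Random(seed)
    return f"A {rng.choice(TRAITS)} {rng.choice(CREATURES)} that {rng.choice(WANTS)}."

# Only 3 * 3 * 3 = 27 outcomes here, but a real game multiplies dozens
# of such lists together, and the space quickly outgrows what any
# designer can anticipate run by run.
print(generate_encounter(42))
```

Every run stays inside the designer's frame, which is exactly what keeps this kind of system from feeling like Mad Libs: the authorship is in the lists and the sentence that combines them, not in any individual roll.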
It's why I like Backgammon more than Chess: Both contain a huge number of potential moves, but Backgammon features a pair of dice to ensure there's an element of unpredictability, whereas with Chess, you have to be better than your opponent at Chess 100% of the time. I know that makes it more attractive for people who are good at Chess, but I crave that unpredictability. Also, I am bad at Chess.
I've been working on a new D&D adventure, and it's really sharpened my disdain for outsourcing creative work to AI. I could very easily have ChatGPT or one of the many automated character generators fill in the stats I need for the NPCs in the game without it mattering all that much, but I imagine I'd spend just as much time correcting and tweaking the output as I'd save by not building those characters myself in the first place. Likewise, I could ask AI to describe or map a town with specific characteristics I wanted, but even though that drudgery isn't what anyone mentions when they talk about the fun part of creating something, the process is integral to the finished product. It's commonplace to complain bitterly about the process of writing, because the process of writing fucking sucks. It's uncomfortable, and often agonizing, but it's also impossible to avoid if you want to write something, because it's right there in the name of the thing. The fact that we type on keyboards or dictate or blink out a manuscript in Morse code doesn't change that verb, because it's the method of organizing and composing and then sharing one's thoughts that is fundamental, not the tools we use to get there. The parts we automate are the parts we don't value, and if that includes the contents of the message as well as the message delivery system, what part is left that is even worth sharing?
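For what it's worth, the rote stat-filling described above really is trivially automatable without any AI at all, which is part of the point: the mechanical portion was never the hard part. A minimal sketch, assuming the classic 3d6-per-ability tabletop convention (the specific rules and function names here are my own, not anything the adventure or any generator actually uses):

```python
import random

# The six standard D&D ability scores.
ABILITIES = ["STR", "DEX", "CON", "INT", "WIS", "CHA"]

def roll_3d6(rng: random.Random) -> int:
    """Sum three six-sided dice: the classic 3d6 roll (range 3-18)."""
    return sum(rng.randint(1, 6) for _ in range(3))

def generate_npc_stats(seed=None):
    """Roll one score per ability, optionally seeded for reproducibility."""
    rng = random.Random(seed)
    return {ability: roll_3d6(rng) for ability in ABILITIES}

print(generate_npc_stats(seed=1))
```

A dozen lines of dice-rolling covers the part that "doesn't matter all that much"; everything the NPC is actually for—who they are, what they want, how they fit the town—is the part no generator fills in.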