A different AI essay than the one everybody else is writing

I had this magnificently lukewarm take on AI art all written up and ready to post. It was the sort of thing you’ve probably heard elsewhere: blah blah blah, the tool isn’t the problem, it’s the bosses who are going to use the tool to blah blah blah, etc. The main takeaways were:

After I finished that essay I thought it would be funny if I fed the thesis statement into ChatGPT and asked it to write a “profanity-laced essay” on the topic. It did so. (The title was “AI Art and the Capitalist Conundrum: A Profanity-Laced Essay.”) It was a pretty awful essay, written in a frankly offensive pastiche of my house style circa 2010, but it made enough of the same points as my essay that after some reflection I thought to myself: “If ChatGPT can capture the gist of this argument in a few seconds, what the fuck am I even doing here?”

Deeply ashamed, I shelved the essay. Several days later, though, I thought of something else to say on the topic. This is stuff I’ve been thinking about for a long time because of the book I’m going to publish (more on that as the pub date draws nigh). In the meantime, here’s what I’m thinking:

I never read Yuval Noah Harari’s bestselling pop-sci book, “Sapiens.” I wish I had, it sounds dope. What I have read is an excerpt from that book, about wheat. You can read the excerpt here, but let me give you the gist of it if you’re even lazier than I am: Human beings have made wheat the most successful plant on the planet, at great cost to ourselves, thinking all along that we were the ones manipulating wheat, and not the other way around.

I believe that AI is doing the same thing. Not intentionally — I don’t think AI has any more intentionality than a stalk of wheat — but functionally. Harari describes the backbreaking labor humans undertook to cater to wheat’s many needs — picking stones, carrying water, guarding against pests, and so on. Is it any less tedious to work as a data annotator, meticulously labeling unthinkable quantities of data so that it can be used to train neural networks? What about the people who ride in Google’s self-driving cars? Or the people paid to edit AI-generated content rather than creating it themselves?

What about you? Yes, every single one of you. Have you filled out a CAPTCHA recently? One of the ones that makes you tell it which of the following sixteen images contain stop signs? Who do you think that shit is for? It’s not for the website you’re trying to log into. It’s for the Google car that’s about to blow through an intersection unless you answer this question right fucking now. The real kick in the dick for me was when I got a CAPTCHA a few days ago that asked me to identify images of dogs and cakes that had clearly been generated by an AI. I was just trying to create an account so I could look at some boobs online and instead I ended up doing an AI’s homework for it.

We are feeding the neural networks. We are hosting them on our servers, with our electricity. We are telling them our secrets. We are chewing up our data and spitting it into their pixelated mouths. Just like wheat before them, they have domesticated us.

There will never be an AI takeover of society. There will be no grand reveal, where the machines smugly announce that they have been manipulating us all along. That would require ego, and wheat has no ego. We are not being manipulated by anyone. We have only manipulated ourselves.

I think the reflexive take here is to assume this is bad. Because, you know, it feels bad. The implicit argument of the Harari excerpt is that things would have been a lot better if human society hadn’t been hijacked by a bunch of dumb plants. And it’s certainly not good. I’m not stoked about serving the Plant God, or the Machine God, because I’m a human being and human beings aren’t supposed to serve jack shit except for other human beings. It feels like a perversion of our purpose, an abdication of our divine right.

Sure, there are legions of blue checks with machine dick in their mouths, passionately arguing that the AI Singularity is a Good Thing, Actually — that we have given rise to a new species that will merge with us and turn our shitty dads into spaceships or whatever. But those are usually the same dudes who are like “climate change is fine actually because we can just move to mars and also poor people don’t matter,” so I don’t feel like wasting precious pixels arguing with them.

Instead I’m gonna do something unprecedented and radical: I’m going to argue that the silent AI takeover is not terrible, nor is it super great. It’s disturbing, it’s insidious, it’s inevitable, but it’s not the thing that’s going to kill us all. It’s easy to turn a non-sentient process into a villain when that process makes us feel less important and powerful than we’ve decided we’re supposed to be. But wheat didn’t have a terrifying master plan. It settled for making life slightly shittier overall. AI is similar. It’s not the kind of thing you write a dramatic sci-fi story about. It’s something that happens in the background of a story, because stories, at least, will always be about people first.

That’s why I’ve chosen to surrender to our new digital overlords. Because what else am I going to do? Stop creating data? I can’t even give up bread. I guarantee that ChatGPT was trained on the posts from this very website, and writing this post is just giving it more to work with. If this is the cost of having cool opinions online, then it’s a price I’m willing to pay.

One thought on “A different AI essay than the one everybody else is writing”

  1. That “Sapiens” quote is an interesting perspective, but it seems dishonest that they’ve chosen not to mention the rather obvious point that wheat and other crops also allowed literal billions of humans to live who otherwise would not have been able to survive, including myself and everyone else who reads this comment.

    I really doubt AI will be that big of a deal, but I have hope that the ratio of positive to negative influence might end up being similarly decent. I think whether things go that well really depends on whether the technology remains accessible to the people or whether the biggest parts of it end up siloed in corporate and government spaces.
