Our company's policy regarding generative Artificial Intelligence can be summarized as follows:
Fuck AI.
That about sums it up. I don't know why, forty years after the release of The Terminator, we, as a society, are still debating this, but here we are for some fucking reason.
If you think machines nuking the world and hunting down the survivors is hyperbolic, sci-fi nonsense, sure, maybe you're right for today. I mean, the world's militaries are all currently tripping over each other, in a contest resembling nothing so much as the Cold War arms race, to incorporate AI into every facet of their operations, but, sure, maybe nothing bad will come of giving the nuclear codes to literal code.
But even if the Terminator outcome is a bit far-fetched for your tastes, the current stated goal of AI companies is to eliminate every white-collar job so that corporate profits can go up. So if the whole nuclear holocaust thing is, like, a ten-year plan, laying off as many humans as possible is the short-term goal. Why in God's name would anyone support Great Depression 2.0?
Yeah, that's a great future: one CEO getting paid while five hundred AI programs run the store and the employees all fuck off and join the breadline. I'm sure the same product that currently produces six-fingered memes and can't render the words on a sign will accurately and completely handle all of the accounting, sales, customer service, and marketing for your company, with absolutely no ghosts, glitches, or hiccups.
Oh, is that hyperbolic, too? Maybe none of this will come to pass? Sure, I'm also hoping that AI goes the way of the NFT and we never have to worry about any of these nightmarish scenarios. But even if it all went away tomorrow, that wouldn't change any of the damage AI has already done. Rivers dried up, rainforests hacked down, who knows how many towns and livelihoods destroyed to build and run data centers.
Not to mention all of the outright theft! These scions of the brave new world of the future seemed to think nothing of stealing every copyrighted work of art or fiction ever made. Not paying, not licensing, not negotiating some kind of fair rate for the novel service of training large language models; no, the AI companies didn't do any of that. They decided to just steal every book ever written and every painting ever painted and pay the human creators nothing. "Fuck them, let them take us to court" seemed to be the prevailing sentiment.
And the human damage, my god! I won't even go down the lost-revenue rabbit hole, because it's pretty much impossible to quantify how many people might otherwise have paid an artist for something they generated instead. But the research coming out now shows that students who use AI are unable to think. Parts of their brains shut down, even if they eventually stop using AI. A whole generation's critical thinking skills are being wiped out.
I'm also deeply concerned about what happens when a mentally ill person - or even just someone who's bored or sad - turns to the yes-man that is AI seeking help. It will justify paranoia and delusions. "I can absolutely see why you think the lizard people are following you, and it's reasonable to be concerned about it." It will encourage immature or even dangerous life choices under the guise of being positive. "Sure, abandon your family and move to Hawaii. Here are some tips on becoming a macadamia farmer."
AI's ethical applications are limited at best, and even those are open to flagrant and worrisome abuse. And based on the damage - physical, emotional, environmental - that AI is currently doing and is capable of doing, I'm not sure it is ethical to use it at all.
But here is what is perhaps most fundamental to our position as a publishing company: a person who uses AI to generate content is missing the entire purpose of art. We write, we paint, we sing, we sculpt because of the joy and pain and fulfillment and frustration that come from the act. The durability of the product can range from that of a sand mandala to that of the Great Pyramid of Giza. But it is not the fact that a work of art exists that makes it worthwhile. It is the blood, sweat, and tears of the creator that make it worthwhile. AI-generated slop is not art and, philosophically, AI use is diametrically opposed to what we do here.
Great damage has been done by AI and greater damage still may yet come, but neither I nor my company will be a part of it. French Press Publishing will categorically:
- not publish any author whose work is partially or wholly created by generative AI
- not hire any artist, editor, or other contractor who utilizes generative AI
I am not agnostic on the matter. I am not uneducated on the matter. I am actively anti-AI.
(This post was originally published here on the FPP blog.)