
AI Is Theft ... Full Stop

  • Feb 23
  • 7 min read

This is going to be a rant. There's your only warning. If you disagree and don't want to admit that your genius program is nothing more than a worthless clone, then kindly fuck right off. If you disagree but would like to see where this goes, then read on. Somehow, I think most of you will agree with me. At least every TRUE artist, musician, and writer will know exactly what I mean. Let's continue...


Backstory for why I'm so pissed right now:


I listen to a variety of music. Especially when I'm doing chores. In this case, I went down a fantasy/folk/epic queue and I was here for it. Then I found a song I loved. The male singer's voice was deeper and rougher than I'm used to, but the female voice sounded vaguely familiar. Never heard of this group before. Didn't care. Loved the song.


I went to relisten to the song, plus a few more from their catalogue. I loved the sound and the lyrics and the whole vibe. Right up until I glanced at the sidebar, where it's supposed to list the artist and composer, and found: AI-generated.


The immediate urge to light the world on fire came to mind. In the "artist's" description, it didn't mention anything about AI. If I hadn't had the sidebar open on my computer, would I have ever known that this was an amalgamation of someone's a) stolen lyrics, b) stolen voices, and c) stolen music? Because that's what it is: stolen.


Let me explain a little bit about the major problem with generative AI that I don't think most people understand.


  1. It can't actually think on its own.

    It has no brain. It has no ability to create anything. AI is trained in a very specific way. First, it is created as a data miner, which is not altogether terrible. It's what saves you from searching through pages and pages of Google links to find one tiny obscure answer, because AI has already gathered all of the data pertaining to your question and offers it up nice and neatly right away. However, things such as ChatGPT began their mining expeditions on pirating sites. Literally stolen work, posted by thieves on the internet. The creators of these programs, instead of opting to spend money on a LIBRARY CARD, chose instead to rip off thousands of authors by training their programs on that already stolen work. Therefore, when you type a prompt into one of these generative AI models, what it regurgitates back to you is literally someone else's words. It didn't think of that itself; it can't. It was trained through theft. To this day, no one has gotten paid for the stolen voices, music, visual media, or even faces that these programs have mined and redistributed without permission.

  2. It cannot discern fact from fiction. Y'know what's fun? Training entire systems on stolen works of fiction, dumping them in the same meat grinder with all of the internet "facts," adding some actual truth in there for funsies, and seeing what comes out. That's what lawyers everywhere are figuring out is a BAD FUCKING IDEA, and I'm finding it hilarious. I watch a lot of YouTube reels, and some of those are fun things like Lawyer Reacts to... One of the lawyers I watched had his mind blown when some other lawyers presented an argument to the judge using precedents and past case rulings ... except that some of the cases they presented didn't exist. Let me repeat that: REAL LAWYERS presented to a JUDGE cases/rulings that DO NOT EXIST. This isn't an isolated incident. It's not a one-off that happened one time. There are multiple stories coming to light now about things like this happening all over the place, across multiple professional fields. How and why is this happening? Well, I already told you: they dumped the entirety of whatever they could find on the internet into the same damn blender and then told it to sort it all out after it was pureed. Things got missed. That's not on the program. (Remember, it can't think. Therefore it can't actually make mistakes.) That's on the CHEAP ASS FUCKING morons who refused to pay for proper programming. Can it tell if something is labeled as fiction? Yes, in the most general sense of the word. But it doesn't know what fiction means. If you ask it a question such as "Give me an example of XYZ," it doesn't know that you want only real-life answers. It's going to pull from the entirety of its information capacity. As for why this is happening, it's simple: laziness. Enough said.

  3. It cannot discern fact from fact. AI in its current format is nothing more than a small child. It's still digging for everything wherever anyone allows it. (Yes, it's stealing your fucking face from those little "try a new hairstyle" apps. STOP FALLING FOR IT.) It's still questioning everyone around it. And it's still hopelessly trusting the "adults" in the room. (They killed the one program that had entered its teen years.) The problem with dumping a lot of facts on a small child is that things get mixed up. Things like timelines, especially, can go wonky in small brains. So when that child regurgitates things back to you, most of the time there's no sense to be had. On the off chance that what it's saying makes sense, there's still a good probability that something got crossed and it gave you the right information in the wrong order, or the wrong information with the confidence of someone who's actually right. Go ahead; guess which protocol AI follows right now. The funny thing is, without Swoop doing a specific search on a particular person, I wouldn't have known that AI could jumble its information that badly. It took one person who was infamous for one thing and combined all of his information with another person of the same name who was infamous for something else entirely. Yes, you read that right. Two people with the same first and last name can be lumped together by generative AI technology. And a casual searcher looking up a specific thing wouldn't know that the information is wrong. Do you realize how dangerous that is? How horrific it can and will be? And, again, it's on the producers and programmers for rushing the launches of these products without making sure they're properly programmed.
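For the skeptics who want the "it only regurgitates" point in concrete terms: here's a deliberately tiny sketch using a word-level Markov chain. This is my own toy illustration, not the architecture of any real product (real generative models are vastly more complicated), but it shows the core idea plainly: the program can only ever emit what it was fed.

```python
import random

# Toy word-level Markov chain. "Training" just records which word
# follows which in the source text; "generating" replays those pairs.
def train(text):
    words = text.split()
    model = {}
    for a, b in zip(words, words[1:]):
        model.setdefault(a, []).append(b)
    return model

def generate(model, start, length, seed=0):
    rng = random.Random(seed)  # seeded so runs are repeatable
    out = [start]
    for _ in range(length - 1):
        options = model.get(out[-1])
        if not options:
            break  # dead end: the training text never continued from here
        out.append(rng.choice(options))
    return " ".join(out)

corpus = "the cat sat on the mat and the dog sat on the rug"
model = train(corpus)
sample = generate(model, "the", 8)
print(sample)

# Every word the model can ever emit appeared in the training text;
# it has no mechanism for producing a word it was never shown.
assert set(sample.split()) <= set(corpus.split())
```

Run it and check for yourself: there's nothing in there capable of producing a word that wasn't in its training text. It only recombines what it took in.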


As you can tell, I'm not fully anti-AI. In data-rich fields, a properly taught program could save researchers time and energy and make their lives so much better. But it has to be trained to do that specific job, and it has to know how to categorize information. So far, not a single one of the programs that have been rushed out to the public is capable of doing that.


I am anti-AI in ALL creative fields. Art. Music. Books. Videos. MEMEs. EVERYTHING. It should be banned on social media sites. Everyone who uses it in a creative field should have a couple of giant scarlet letters attached to themselves and their brands on a global scale. A person using generative AI in their products should be fined. YES, people should be punished for using it, because they are STEALING.


This is where I know a lot of people will be up in arms. "The program is just helping me flesh out some things. I'm doing most of the work. I didn't know that it stole from real people." Blah, blah, blah. You're not helping your case, sweetheart. The fact of the matter is, it's basic copyright infringement. It's outright plagiarism. It is theft. Does it matter that you didn't personally go to an artist's website, copy their images, and then erase their watermark? No. You paid a middleman to do it for you, and now you want to sell it for profit. FINE. JAIL. PRISON CAMPS. You didn't pirate your favorite author's ENTIRE CATALOGUE or steal specific sentences to "tweak" for yourself? No. You bought a black-market parrot and had it deconstruct their actual voice so you could make a subpar copy. You're not an author. You're a plagiarist. FINE. JAIL. CHAIN GANGS. You stole music. Ooooh, there's actually no way around how despicable this is. I mean, stealing art has been a thing since people learned how to copy. Words? Same thing. People are excellent mimics, and when they're not, they just literally take someone else's words and learn to gaslight. But music. Someone's actual voice. How in the fuck could you go that far? Yes. There is generative AI that is actively (as in: RIGHT THIS MINUTE) stealing voices from audiobooks and music. These voices are then made available for people to use in audiobooks or music tracks they arrange. Without paying the artist. Without the artist even being notified. Can you fucking imagine? How can you be so deranged as to literally steal someone's voice, inflection, tone, musicality, and not even think to fucking mention it to them? I have no words for the amount of disgust I feel. Fine. Jail time. Prison camps with chain gangs. Public flogging. And, y'know what, I'm pro-death penalty too.


To sum it up: AI doesn't create. It can't. It doesn't know how, and nobody's fucking taught it the basics yet, much less something as complex as imagination and creativity. Instead, it regurgitates. What you get is not its own arrangement. It's not its own ideas, thoughts, feelings, or opinions on a subject. Everything it has, it has taken from someone else, often without permission to do so. And the creators have bragged about the theft, which is why you're about to see all the class-action lawsuits taking place.


AI is theft ... full stop. There is nothing about it that was created in honesty or with good intentions. No one wants this in the creative field, but they're too greedy not to toss their hats in the ring.


Don't buy products that have been thrown together using AI. And yes, go ahead and publicly shame those who use it and then try to abuse the consumer by selling it in the first place. Call out anyone you see with stolen artwork. Now, the caveat: make sure someone actually used AI and wasn't instead used BY AI. Too many authors are getting attacked right now when their entire catalogue predates AI's existence. So, be vigilant and mindful. But also, don't give your money to those you think are abusing your good sense. Art. Writing. Music. These are not 'get rich quick' careers. They require time, energy, skill, and imagination to be successful. You can't throw something together and just expect it to be a hit. (It happens RARELY, and not without a lot of ingenuity and luck.) If someone tells you it's that easy, it's because they're bad at it and just throw out shit like a monkey at a zoo, or because they're stealing someone else's hard work. That doesn't make them an artist; it makes them a thief.


And now I've said my piece and I'm going to walk off my frustration. Remember: public shaming is a powerful tool, and I'm okay with us using it like a scythe in this particular circumstance.
