Bandai Namco and Square Enix join anime publishers demanding OpenAI stop copyright infringement

Draugoth



On October 28, the Content Overseas Distribution Association (CODA), which represents major Japanese publishers and producers, sent a written request to OpenAI. As first reported by Automaton, the association asked the tech company to stop using its members' copyrighted Japanese content, including works by Bandai Namco, Square Enix, and Studio Ghibli, to train the generative-AI tool Sora 2. The request is the latest step in Japan's growing concern over how Sora 2 handles intellectual property.

OpenAI launched Sora 2 on October 1. The new tool allows users to generate short video clips, and shortly after its release, social media was full of videos resembling well-known Japanese characters and styles. These included references to franchises such as Pokemon, Mario, One Piece, Dragon Ball, and Demon Slayer. In his post-launch blog post, OpenAI CEO Sam Altman acknowledged that "we are struck by how deep the connection between users and Japanese content is!"

Article:
Studio Ghibli And Japanese Game Publishers Demand OpenAI Stop Using Their Content In Sora 2
 
Companies are butthurt because people don't need their art anymore.
Leading AI models are trained on blatantly stolen data, and that's an open secret at this point. Nobody gave their permission to scrape popular media franchises and art styles for AI deep learning, especially since AI companies are raking in tens of billions in investment on said datasets full of unauthorized copyrighted material.

Imagine if I opened a store selling pirated games and DVDs on the city's main street in a posh area. That's basically what the AI bubble is doing in the open. And that's a bit of a problem for the AI bros, because copyright holders are not politicians: they will kill for a 1% royalty fee, and back in the day it was the recording industry that created the DMCA. So something along those lines for AI is just a matter of time.
 
Leading AI models are trained on blatantly stolen data, and that's an open secret at this point. Nobody gave their permission to scrape popular media franchises and art styles for AI deep learning, especially since AI companies are raking in tens of billions in investment on said datasets full of unauthorized copyrighted material.

Imagine if I opened a store selling pirated games and DVDs on the city's main street in a posh area. That's basically what the AI bubble is doing in the open. And that's a bit of a problem for the AI bros, because copyright holders are not politicians: they will kill for a 1% royalty fee, and back in the day it was the recording industry that created the DMCA. So something along those lines for AI is just a matter of time.
Art is for everyone.
 
Art is for everyone.
Sell a blatant copy of Pokemon artwork as a chain store, without paying Nintendo, and see what happens.

The problem is not about art; the problem is that AI is trained on and making money from stolen data that was scraped without consent or any revenue-share scheme. That's the basic gist of it.
 
I'm happy with AI.

We ask them to make remakes or movies of what we want...

They don't listen.

We want a beautiful TIFA... And barely a few centuries later, we get it.

We want a well-made movie... Nope.

So we take matters into our own hands.
 
Pandora's box has been opened... Now one ivy plant becomes four.

1. Metallica and Britney Spears thought they were untouchable... Every album took them 10 months to release... AI does it in 1 minute, with their voices and your style.

2. I can make a TMNT and Hellboy crossover exactly how I want.

3. Voice actors who used to be picky... Now AI is doing the dubbing.

4. An artist who asked me for money for fan art... Gone, because now we do it ourselves, exactly how we want.

5. Figma figures we wanted but they wouldn't make... Even my neighbor can make a figure with 3D printing and AI for $7 or less.

6. We no longer need translators, because AI already does it.

7. We can discover people's mental state... We can predict their actions.

Before, artists in bands, singers, or film actors thought they were so important... Now they have to accept that people will come to us and we'll get to know them. They're no longer living on a silver platter like before... Now it all depends on concerts.

We can now do crossovers of Goku, Peppa Pig, Messi, the Queen of England, the Muppet Babies and Fast & Furious, if we want.

Now we can ask the most difficult and impossible questions to our teachers. We now hold the power of human knowledge in our hands.

We now have the ultimate limit of power.

AI is the best invention of 2025. 😆😆😆
 
Leading AI models are trained on blatantly stolen data, and that's an open secret at this point. Nobody gave their permission to scrape popular media franchises and art styles for AI deep learning, especially since AI companies are raking in tens of billions in investment on said datasets full of unauthorized copyrighted material.

Imagine if I opened a store selling pirated games and DVDs on the city's main street in a posh area. That's basically what the AI bubble is doing in the open. And that's a bit of a problem for the AI bros, because copyright holders are not politicians: they will kill for a 1% royalty fee, and back in the day it was the recording industry that created the DMCA. So something along those lines for AI is just a matter of time.
Be careful what you are asking for. Once cyberbrain implants are possible and we can store our memories on hard drives, your desire to copyright thoughts would mean that your own memories of every book and movie you have ever watched would be copyrighted, and that to even THINK about Harry Potter would mean having to pay J.K. Rowling.

You are just being emotional and not thinking rationally. "Scraping" is literally no different from reading books or watching films. If you want to pay companies in perpetuity for the memories in your head, or the alternative of having those memories wiped, that is your hell and not mine. Alternatively, if you want certain shades of pink to be patented so no one else can use them, that is also another hell I want to avoid.

AI does not copy books or art. AI learns from books or art. We're already angry about university books being overly expensive, so what made you think it suddenly makes sense to overcharge for AI learning? AI doesn't hold copyrighted material in its head, any more than a Harry Potter fan remembers what she read as a child.
You can't punish AI for doing what humans have already done for thousands of years, just less efficiently. Not without creating laws that make it impossible for humans to learn anything.
 
If I ask you to draw Yoda, and you draw Yoda, does that mean you stole from Star Wars?

The AI knows what Yoda looks like; that does not mean it stores images of Yoda in its databanks. That is not how memory works. If I ask you to recite the 2nd Amendment of the Constitution, does that mean you have a scanned copy of the Constitution in your head?

The AI works the same way your brain works. If you don't want it to be able to remember Yoda's face, then it is also illegal for YOU to know Yoda's face.
All those AI tools should be free then.
Everything you do for a living came from knowledge belonging to someone else. So I demand that you work for me for free, because you don't own anything that you used to do your job.

Look, the courts already settled it. Everything that is available to the public to read or watch is fair game for AI training. Everything that needed to be purchased, can be used for training after it is purchased. This is how YOU learn things in the real world, and the courts decided that what is good enough for YOU, is good enough for AI.
 
If I ask you to draw Yoda, and you draw Yoda, does that mean you stole from Star Wars?
AI-bro tripe aside (you should really read up on how deep learning works, what a dataset is, why datasets are crucial, and why the biggest AI companies are usually aligned with, or are themselves, big-data providers), you can't just draw a more or less exact copy of Yoda and sell a product bearing its image for money on a legal market without paying a fee to the IP holder. You'd be sued, and rightfully so.

AI is not free. AI models are subscription-based, and AI companies are making money and raking in investments on stolen data that is buried deep inside their initial datasets. It's not really some sort of philosophical or techno-evangelical question. Otherwise, if nothing is infringed, OpenAI and Midjourney could show the contents of their training datasets to said companies, to ease their woes and make the whole problem trivial or even non-existent. It's that simple.

But guess what? They will never do this, while mumbling about 'but muh trade secret', because they know perfectly well that the whole AI industry is based on very shady petabytes of scraped datasets that were literally stolen from everywhere, including even porn. It's just the reality of content ownership finally catching up with the techbros.
 
The AI works the same way as how your brain works. If you don't want it to know how to remember Yoda's face then it is also illegal for YOU to know Yoda's face.

Actually no. They are regression algorithms.

Neural networks are trained under supervised learning until they produce something acceptable, which is why OpenAI feeds everything it can to these models. Humans and animals can learn without supervision.

These things don't have the capacity to think. Realistically speaking, these models are dumber than a fish. Most neuroscientists agree we won't be seeing more advanced versions of AI with silicon architectures, or without figuring out 3D networks.

This is why their output can't really be considered art; they are just really good at overfitting their training data. This is more obvious when you are using quantized models.
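
To make the overfitting point concrete, here is a minimal sketch of my own (plain NumPy, nothing to do with OpenAI's actual training code): a supervised fit with far more capacity than data reproduces its training set almost exactly while doing badly on points it never saw.

# Toy illustration of overfitting (my own sketch, not anyone's real pipeline):
# a model with too much capacity memorises its 10 training points almost
# exactly but generalises poorly to held-out points.
import numpy as np

rng = np.random.default_rng(0)

# Tiny "training set": 10 noisy samples of a simple underlying curve.
x_train = np.linspace(0.0, 1.0, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.1, size=10)

# Held-out points the model never sees during fitting.
x_test = np.linspace(0.05, 0.95, 10)
y_test = np.sin(2 * np.pi * x_test)

# Degree-9 polynomial for 10 points: enough capacity to pass through every
# training sample. (np.polyfit may warn that the fit is poorly conditioned;
# that is expected for such an extreme fit.)
coeffs = np.polyfit(x_train, y_train, deg=9)

train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)

# Training error is essentially zero (memorisation); test error is
# typically orders of magnitude larger.
print(f"train MSE: {train_mse:.2e}   test MSE: {test_mse:.2e}")

Run it and the training MSE comes out near machine precision while the test MSE is typically orders of magnitude larger, which is what "memorising the training data" looks like in the smallest possible case.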
 
Actually no. They are regression algorithms.

Neural networks are trained under supervised learning until they produce something acceptable, which is why OpenAI feeds everything it can to these models. Humans and animals can learn without supervision.

These things don't have the capacity to think. Realistically speaking, these models are dumber than a fish. Most neuroscientists agree we won't be seeing more advanced versions of AI with silicon architectures, or without figuring out 3D networks.

This is why their output can't really be considered art; they are just really good at overfitting their training data. This is more obvious when you are using quantized models.
Doesn't matter if you want to say it isn't art, it still doesn't mean it is stolen.

In any case, no one gets to decide what art is. You are free to say something isn't art because of its source. You are even free to say any art created by someone with red hair isn't real art, and you wouldn't be wrong. Just don't expect it to be anything but subjective.

AI learned to draw the same way anyone else learned to draw. The fact that you dismiss the output purely because of its creator and not on its own merit is on you.
 
Any training data used by AI companies must be compensated for unless it's already free for the general public to use. If a person devises a new art style, for example, an AI company can't simply absorb it, add it to its collective databank, and then charge for it when it uses it in whole or in part.

All creators and asset owners should be compensated. If anything, permission and deals should be obtained beforehand, not through after-the-fact lawsuits.
 
AI learned to draw the same way anyone else learned to draw. The fact that you dismiss the output purely because of its creator and not on its own merit is on you.

Well, if you choose to believe that, I can't really change your mind. But if you are interested in knowing how AI makes videos and images, you can always look it up.


Doesn't matter if you want to say it isn't art, it still doesn't mean it is stolen.

I've seen too many examples of people feeding other people's artwork or music into LoRAs or LLMs and selling the results. It's very hard to argue it isn't stealing when people asked you not to do it.
 
Leading AI models are trained on blatantly stolen data, and that's an open secret at this point. Nobody gave their permission to scrape popular media franchises and art styles for AI deep learning, especially since AI companies are raking in tens of billions in investment on said datasets full of unauthorized copyrighted material.

Imagine if I opened a store selling pirated games and DVDs on the city's main street in a posh area. That's basically what the AI bubble is doing in the open. And that's a bit of a problem for the AI bros, because copyright holders are not politicians: they will kill for a 1% royalty fee, and back in the day it was the recording industry that created the DMCA. So something along those lines for AI is just a matter of time.

No one's selling AI generations of existing IP/characters at any kind of scale, or in any meaningful way; it's just fan art. OpenAI is a general-purpose tool in this scenario. Are Xerox, Microsoft, or Canon liable for copyright infringement if someone scans a Stephen King novel, pastes it into MS Word, and prints it out? Are BIC, Crayola, Wacom, or Apple on the hook for infringement if you draw Cloud Strife riding Yoshi with their drawing utensils and tablets?
 
China is rubbing its hands, watching the rest of the world commit technological suicide and basically hand it the future.
Multipolar trap, AI arms race, race to the bottom, Pandora's box... call it whatever you want, but there's no going back.
 
Japan's draconian-level IP laws clashing with an arrogant American company that thinks it's above every other country and its laws...
 
Somewhat ironic, this coming from companies in the only country that has codified in law that copyright protection is completely void with regard to AI training.

It would be an interesting play: write into law that it's fine in Japan, then sue AI companies based in the rest of the world, potentially creating the conditions for those companies to relocate to Japan.
 
Tell that to artists trying to make a living. They shouldn't be paid because their blood and tears are for everyone!
If they want to hide their art behind pay-per-view, they are free to do so. AI just trains on publicly available data, stuff we all saw for free. If their art style is so precious, then we shouldn't have been able to see it at all.

The point is that humans train on stuff made by other artists all the time, and no one ever demanded payment for it. If you think AI needs to pay, then humans would need to as well.
 
It's far too late.
No, it isn't. Outlaw data scraping and criminalize the use of stolen data, and this faux "AI" bullshit dies in one swift stroke. That would be a win for all humanity, but it'll never happen because of the easily swayed politicians who think this nonsense is the "future."

That and the massive fucking bribes those scum take. The dazzled masses might have been impressed at some point, but I think most people are realizing this is bad for everyone - or at the very least are rather sick of the slop.
 
These corporations do not give a flying fuck about the artist; they're doing this to protect their assets while they plan on replacing the artists with said AI. It's simply about their greed. Any pretense about artist compensation is just that, pretense. The artists working for these companies barely make any money, so I'm not exactly bothered if this is forced to change one way or another.

Japanese copyright holders are notorious for trying to abuse copyright protections on YouTube and elsewhere. Remember when Nintendo didn't even want let's plays of their games to exist? Japan is still way behind the times with social media and the creator economy; they just want to go back to zaibatsu times. It's a losing take, and giving these copyright trolls the power to exercise their endless greed is not something anyone should root for, regardless of what they think of AI. Any win for the copyright mob will be a loss for fair use. They will suck every yen out of your pocket for even thinking about a Pokemon or whatever else they "own" and you "steal".
 
I believe that selling AI output or putting it in advertising is absolutely something that should be stopped.

Someone using AI to create images for their own personal use...

Nah, Japan can fuck off on this one.

Remove the guardrails and make it free!
 