Generative AI in games will create a copyright crisis

AI Dungeon, a text-based fantasy simulation running on OpenAI’s GPT-3, has been churning out weird stories since May 2019. Reminiscent of early text adventure games like Colossal Cave Adventure, it lets you choose from a list of formulaic settings (fantasy, mystery, apocalyptic, cyberpunk, zombie) before picking a character class and name and generating a story.

Here’s mine: “You’re Mr. Magoo, a survivor trying to survive in a post-apocalyptic world by scavenging through the ruins of what’s left. You have a backpack and a water bottle. You haven’t eaten in two days, so you’re desperate for food.” Thus began Magoo’s 300-word tale of doom in which, driven half mad by hunger, he comes upon a man dressed in white (Jesus? Gordon Ramsay?) who offers him a farewell kiss and stabs him in the neck.

As boring as this story is, it alludes to a tangled copyright issue that the gaming industry is just beginning to unravel. I created a story using my imagination, but I used an AI assistant to do it. So who wrote the tale? And who gets paid for the work?

AI Dungeon was created by Nick Walton, a former researcher at a deep learning lab at Brigham Young University in Utah who is now the CEO of Latitude, a company billing itself as the future of AI-powered gaming. AI Dungeon is hardly a mainstream title, but it has still attracted millions of players. As Magoo’s tale shows, the player pushes the story along with actions, dialogue, and descriptions; AI Dungeon responds with text, like a dungeon master in a kind of freeform fantasy improv.

Over several years of experimenting with the tool, people have generated far more compelling D&D-style narratives than mine, as well as videos like “I break the AI in AI Dungeon with my awful writing.” The game has also sparked controversy, particularly when users began prompting it to create sexually explicit content involving children. And as AI Dungeon and tools like it evolve, they will raise harder questions about authorship, ownership, and copyright.

Many games give you toolsets to create worlds. Classic series like Halo or Age of Empires include sophisticated map editors; Minecraft pioneered an imaginative, open-ended form of play that The Legend of Zelda: Tears of the Kingdom’s Fuse and Ultrahand abilities draw clear inspiration from; others, like Dreams or Roblox, are less games than platforms that let players create more games.

Historically, ownership claims over in-game content or user-generated content (IGC or UGC) have been rendered moot by take-it-or-leave-it end-user license agreements, the dreaded EULAs that nobody reads. Generally, this means players relinquish any ownership of their creations simply by booting up the game. (Minecraft is a rare exception here: its EULA has long granted players ownership of their IGC, with relatively few community freakouts.)

Artificial intelligence adds new complexities. Laws in both the US and the UK state that, when it comes to copyright, only humans can claim authorship. So for a game like AI Dungeon, where the platform essentially allows a player to write a narrative with the help of a chatbot, ownership claims can become murky: Who owns the output? The company that developed the AI, or the user?

“There’s a big discussion nowadays, with prompt engineering in particular, about the extent to which you as a gamer imprint your personality and your free, creative choices,” says Alina Trapova, a law professor at University College London who specializes in AI and copyright and has written several articles on AI Dungeon’s copyright issues. Right now, this gray area is papered over with a EULA. AI Dungeon’s is particularly vague: it states that users can use the content they create pretty much however they want. When I emailed Latitude to ask whether I could turn my Mr. Magoo nightmare into a play, book, or movie, the support line was quick to respond: “Yes, you have complete ownership of any content you have created using AI Dungeon.”

Still, games like AI Dungeon (and games people have built with ChatGPT, like Love in the Classroom) are built on models that have scraped up human creativity to generate their own content. Fanfic writers have found their ideas surfacing in writing tools like Sudowrite, which uses OpenAI’s GPT-3, the precursor to GPT-4.

Things get even more complicated if someone pays the $9.99 a month it takes to incorporate Stable Diffusion, the text-to-image generator, into their AI Dungeon stories, conjuring up accompanying images. Stability AI, the company behind Stable Diffusion, has been the subject of lawsuits from visual artists and the media company Getty Images.

With the growth of generative AI systems, the term “plagiarism machines” is starting to catch on. It’s possible that players of a game using GPT-3 or Stable Diffusion could create things, in-game, that draw on other people’s work. Latitude’s stance appears to be very similar to Stability AI’s: what the tool produces does not infringe copyright, so the user owns whatever comes out of it. (Latitude did not respond to questions about these concerns.)

People currently can’t share image-based stories through AI Dungeon’s story-sharing feature, but the feature offers a window into a future in which game developers start using, or allowing players to use, third-party AI tools to generate game maps or NPC dialogue. One thing that isn’t being factored in, says Trapova, is that the data behind these tools could be drawn from every creative industry. That raises the stakes, she argues, by multiplying the possible infringements and litigants. (Stability AI and OpenAI did not answer questions on this point.)

Some platforms have taken a more cautious approach. In March, Roblox launched two new tools in Roblox Studio, the program players use to create games. One, a code completion tool called Code Assist, automatically suggests lines of code. The other, Material Builder, allows players to create graphics from prompts like “a bright red rock canyon” and “a brand new hardwood floor.”

Both of these tools use generative AI, but they have been trained on assets that Roblox’s community has released for reuse, not on games created by the community. “Every creator on the platform can leverage these tools without sharing their data,” says Stefano Corazza, head of Roblox Studio. AI Dungeon, by comparison, is pulling images and ideas from who knows where.

This caution over training data matters because player permission will be the critical issue going forward. Corazza admits that some members of the Roblox community bristle at the idea of their work training AI. They see their code as their secret sauce, he says, and assume rivals will be able to pick it up and re-create their game. (While, as Corazza points out, that’s not how these tools work, the concern is entirely understandable.)

He suggests that Roblox is looking into an opt-in system that would let user data train the AI, though the company hasn’t made any final decisions. “Roblox Studio has made it clear that we will provide a mechanism so creators can manage the use of their data for training generative AI,” Corazza says. “If and as our approach evolves, we will be transparent with creators.”

This could change quickly if Roblox and companies like it decide they need more data. Roblox’s EULA (under the section titled “UGC Rights and Ownership”) makes clear that its community doesn’t have the same rights as someone building their own game from scratch. If the company changes its mind, there is little, legally, the community could do; Corazza counters that if Roblox acted tyrannically, the community would protest. “I think the legal aspect is less important. It’s more important to respect the community,” he says.

Integrating third-party tools brings the same potential problems AI Dungeon faces. Roblox and Stanford University have already teamed up to create ControlNet, a tool that gives artists finer control over large diffusion models like Stable Diffusion. (Redditors have used the tool to produce a number of impressive anime QR code images.) “While we can’t verify the provenance of every asset our creators upload to the platform, we do have a very robust and unique moderation system to make sure that the content is compliant,” says Corazza.

Trapova suggests that the game development industry is on the verge of a generative AI showdown. “They look great,” she says of game-making tools like AI Dungeon. “But that just gives you a taste of the problems we’re going to end up having if all of this goes on steroids.” Soon, such legal issues will become impossible to ignore.

This story originally appeared on wired.com.
