Tensions over generative AI in games are sky-high. They're about to get even higher
Image: Sandfall Interactive/Kepler Interactive
After pulling off a sweep at The Game Awards earlier this month, Clair Obscur: Expedition 33 is still making headlines — though likely not any that Sandfall Interactive is celebrating. On Dec. 18, the acclaimed RPG added two more awards to its growing collection, nabbing Best Debut Game and Game of the Year at the Indie Game Awards. But just two days later, the show’s organizers announced that both awards would be rescinded due to the game’s use of generative AI, which they say violates the show’s “hard stance” on the tech.
Despite the fact that the Indie Game Awards is a tiny production compared to The Game Awards, the controversial decision has drawn major attention. It represents a breaking point in what has been the most heated debate in gaming throughout 2025. As more studios have incorporated generative AI into their development workflows, backlash against the tech has risen in kind. Now, it’s becoming increasingly clear that the resistance isn’t going away. That sets the stage for a messy war in 2026, one that highlights a brewing education crisis that threatens to muddy some already murky waters.
Image: Sandfall Interactive/Kepler Interactive
While tech companies have been talking up how generative AI could be used in video game development for a few years now, the hype cycle reached its crescendo in 2025. Publishers like Ubisoft and Xbox got more serious about experimenting with the tech, generative AI placeholders were discovered in both Clair Obscur: Expedition 33 and The Alters, and Krafton announced its intention to become an “AI-first company.” All of that, and more, happened during a tough year for the video game industry that saw mass layoffs across companies big and small. It signaled an existential moment for the medium, fueling fears that the tech was pushing humans out of game development and ushering in an era of machine-created “slop.”
Resistance to the trend was strong throughout the year, but it reached a new peak last week. Following the Game Awards, Bloomberg published an interview with Larian Studios in which founder Swen Vincke said that the company was using generative AI to help out with various tasks, from PowerPoint assistance to concept art creation. The comments, coming from a celebrated developer that has often advocated for a games industry that takes care of its workers, sparked outrage and sent Vincke into damage control mode as he tried to clarify his position on the tech. Larian now plans to host a Reddit AMA in early 2026 where Vincke will aim to clear up any questions players have about the studio’s AI usage directly.
That context is crucial to understanding what exactly happened at the Indie Game Awards. Tensions surrounding generative AI were already high heading into the show, but they rose even higher after Clair Obscur: Expedition 33 won its most coveted prize. The decision earned the show some criticism, considering that its organizers made a point of stressing their anti-AI stance during the broadcast. That talk seemed at odds with Clair Obscur’s win, given that the game originally shipped on April 24 with what appeared to be an AI-generated placeholder asset. After the placeholder was discovered in late April, Sandfall Interactive removed and replaced it within a few days. In June, the studio eventually confirmed that it had used AI, in some form, on the project.
The Indie Game Awards will have a lot of questions to reckon with in the aftermath of its decision. Why let the game compete at all when its AI usage was common knowledge well before the show? Is it fair to ding a game for using the tech in its development at a time when generative AI was still experimental and attitudes around it were still forming? And, most importantly, what is the line for the show going forward? After all, The Roottrees Are Dead was nominated for Best Narrative at the show despite the fact that the game began its life as a game jam project that featured AI-generated art. Sure, those elements were fully replaced by human-made art in the full version of the game, but is that so different from Sandfall Interactive using AI placeholders?
Figuring out that line is going to be no easy task in 2026, and not just for the Indie Game Awards. The more vocal players have become in recent months, the less specific the target has become. Some have begun lumping generative AI in with the kind of everyday AI that’s always been core to video game development. That’s already causing some major headaches for anyone trying to navigate some complicated tech talk.
Amid the debates over the weekend, The Escapist published an article blasting the Indie Game Awards for what it called a “performative” decision to give Clair Obscur’s award to Blue Prince. The article claimed that the latter also used AI in its development (a claim made without sourcing) and asserted that “All games use AI in some way.” It cited things like “NPC behavior” to support that argument.
In response, Blue Prince publisher Raw Fury confirmed that the game did not use generative AI in its development, and while the article has since been updated, the initial claim spread through social media. Case in point: Look through the comments section on Polygon’s story about the Indie Game Awards’ decision and you’ll find multiple people repeating the claim that Blue Prince was made with generative AI. In one Facebook thread I saw about the news, another commenter went one step further, claiming that any game made with Unity is guilty of using AI.
Image: Dogubomb/Raw Fury
These arguments miss one crucial fact: AI and generative AI are not the same thing. The latter has been a point of controversy over the years for very specific reasons. Chief among those is the fact that generative AI is usually trained on a wide data set made up of content that companies like OpenAI don't own. That has sparked concerns about digital plagiarism, as any art created by generative AI could potentially be copying an existing work of art. If the placeholder poster that was present in Clair Obscur was riffing off someone else’s art and Sandfall Interactive redrew it, would that count as plagiarism? The lines are blurry considering the current lack of regulation on the tech, leaving many players uncomfortable. And that’s before getting into concerns about the reported environmental impact of generative AI or the existential threat it poses to human workers who may find themselves automated out of a job because of it.
Those are specific concerns built around generative AI and they have little to do with the tech that’s used to create NPC behavior. The more that critics use AI as a shorthand to discuss the generative strain of it, the more difficult it’s going to become to draw a clear line. Game developers are currently struggling with that breakdown of language. In an interview with Polygon during The Game Awards, The Last of Us creator Bruce Straley explained how generative AI has poisoned the well and made it difficult for him to pitch his upcoming game Coven of the Chicken Foot, which revolves around an AI companion in a more traditional gaming sense.
“It’s difficult to even pitch the concept of this creature, because in my world, NPCs are AI,” Straley told Polygon. “AI programmers are a type of personnel you have on staff in the programming department. Now you can’t say that because if somebody does have an opinion about AI, I can’t now call this creature the most advanced AI companion. People are going to think we did machine learning, and LLMs, and all that. No, we did none of that.”
Image: Sandfall Interactive via Polygon
As much as tech companies may be hoping for acceptance in 2026, it’s clear that the war is only just beginning. The backlash to Larian Studios’ comments and the Indie Game Awards’ decision on Clair Obscur both signal that players are ready to dig their heels in and push back against something that’s being positioned as an inevitability. If they’re going to win that fight, education is going to be crucial. A scattershot pushback to generative AI that lumps it in with concepts like procedural generation or pathfinding only threatens to weaken the prosecution’s case.
Is plagiarism the problem? Is it a labor issue that can’t be resolved by a clean data set? Or is it a philosophical matter of forking over human creativity to a machine that can’t think on its own? The battle lines need to be clearly drawn, and that’s going to require casual critics to actually understand the thing they’re criticizing.