I only care if the game is good. What tools are used to make it is irrelevant to my enjoyment.
Yeah, I'm sure the higher-ups of this one studio alone will use it responsibly because they're "not like the others" and will definitely resist making bad decisions despite deadlines.
There’s nothing to complain about here. Games require tons of placeholders, in art, dialogue, and code. They will iterate dozens of times before the final product, and given Larian’s own production standards, there’s no chance anything but the most inconsequential or forgotten items made by an LLM will stay in.
Among the devs responding is a former Larian staffer, environment artist Selena Tobin. “consider my feedback: i loved working at @larianstudios.com until AI,” Tobin writes. “reconsider and change your direction, like, yesterday. show your employees some respect. they are world-class & do not need AI assistance to come up with amazing ideas.”
there’s no chance anything but the most inconsequential or forgotten items made by an LLM will stay in.
Concept art is not a placeholder. It's part of the creative process. By using AI to generate text and images, you've already influenced the creative process negatively.
The article doesn’t say Larian is using it for concept art.
Those were hypothetical statements from people outside the studio.
Literally the first sentence
A few hours ago, reports surfaced that Larian are making use of generative AI during development of their new RPG Divinity - specifically, to come up with ideas, produce placeholder text, develop concept art, and create materials for PowerPoint presentations.
Idk, but from the quote by Larian CEO Swen Vincke it seems pretty obvious to me that they used it, or are still using it, to generate or "enhance" concept art, and that it's a highly discussed topic within the company?
They honestly should have expected this, given people's visceral reaction to anything AI. Personally, I have huge problems with AI and refuse to play most games that have used it. I think it's poisoning every creative industry and replacing important jobs with the vague excuse that it makes things "easier", while making the games soulless in the process. I'm willing to give Larian the benefit of the doubt simply because their previous games were amazing, but imma wait for the reviews on this one. This game is still going to be in development for another four years, and none of us know what'll happen between now and then, but for now I'll remain cautiously optimistic.
Nothing wrong with using AI to organize or supplement workflow. That’s literally the best use for it.
Except for the ethical question of how the AI was trained, or the environmental aspect of using it.
There is no ethics under capitalism, so that’s a moot point.
No one [intelligent] is using an LLM for workflow organization. Despite what the media will try to convince you, not every AI is an LLM, or even an LLM trained on all the copyrighted shit you can find on the internet.
There’s more to AI than LLM.
There are AIs that are ethically trained. There are AIs that run on local hardware. We'll eventually need AI ratings to distinguish use types, I suppose.
There are AIs that are ethically trained
Can you please share examples and criteria?
The cat’s out of the bag. Focus your energy on stopping fascist oligarchs then regulating AI to be as green and democratic as possible. Or sit back and avoid it out of ethical concerns as the fascists use it to target and eliminate you.
Holy false dichotomy. I can care about more than one thing at a time. The existence of fascists doesn’t mean I need to use and like AI lmao
The cat’s out of the bag
That’s 👏 not 👏 an 👏 excuse 👏 to be 👏 SHITTY!
The number of people who think that saying that the cat’s out of the bag is somehow redeeming is completely bizarre. Would you say this about slavery too in the 1800s? Just because people are doing it doesn’t mean it’s morally or ethically right to do it, nor that we should put up with it.
No one 👏👏 is 👏👏 excusing 👏👏 being 👏👏 shitty.
The “cat” does not refer to unethical training of models. Tell me, if we somehow managed to delete every single unethically trained model in existence AND miraculously prevent another one from being ever made (ignoring the part where the AI bubble pops) what would happen? Do you think everyone would go “welp, no more AI I guess.” NO! People would immediately get to work making an “ethically trained” model (according to some regulatory definition of “ethical”), and by “people” I don’t mean just anyone, I mean the people who can afford to gather or license the most exclusive training data: the wealthy.
“Cat’s out of the bag” means the knowledge of what’s possible is out there and everyone knows it. The only thing you could gain by trying to put it “back in the bag” is to help the ultra wealthy capitalize on it.
So, much like with slavery and animal testing and nuclear weapons, what we should do instead is recognize that we live in a reality where the cat is out of the bag, and try to prevent harm caused by it going forward.

The world is on fire, but if you don’t add fire to the fire, you might get burned.
The only good use for LLMs and generative AI is spreading misinformation.
We’ve had tools to manage workflows for decades. You don’t need Copilot injected into every corner of your interface to achieve this. I suspect the bigger challenge for Larian is working in a development suite that can’t be accused of having “AI Assist” hiding somewhere in the internals.
Yup! Certifying a workflow as AI-free would be a monumental task now. First, you’d have to designate exactly what kinds of AI you mean, which is a harder task than I think people realize. Then, you’d have to identify every instance of that kind of AI in every tool you might use. And just looking at Adobe, there’s a lot. Then you, what, forbid your team from using them, sure, but how do you monitor that? Ya can’t uninstall generative fill from Photoshop. Anyway, that’s why anything with a complicated design process marked “AI-Free” is going to be the equivalent of greenwashing, at least for a while. But they should be able to prevent obvious slop from being in the final product just in regular testing.
Yeah, do you use any Microsoft products at all (like 98% of corporate software development does)? Everything from Teams to Word to Visual Studio has Copilot sitting there. It would just take one employee asking it a question to render a no-AI pledge a lie.
It’s simple: go back to binary.
Or just have a hard cut-off for software released after 2022.
It’s the only way I search for recipes anymore - a date filter from 1/1/1990 - 1/1/2022.
Keep going. Handmade analog mediums only.
Coincidentally, this paper published yesterday indicates that LLMs are worse at coding the closer you get to low-level stuff like assembly or binary. Or more precisely, ya stop seeing improvements pretty early on in scaling up the models. If I'm reading it right, which I'm probably not.
Just stop using computers at all to program computer games.
You know it doesn’t have to be all or nothing, right?
In the early design phase, for example, quick placeholder objects are invaluable for composing a scene. Say you want a dozen different effigies built from wood and straw – you let the clanker churn them out. If you like them, an environment artist can replace them with bespoke models, as detailed and as optimized as the scene needs. If you don't like them, you can just chuck them in the trash, and you won't have wasted the work of an artist, who can spend that time on artwork that will actually appear in the released product.
Larian haven’t done anything to make me question their credibility in this matter.
You know it doesn’t have to be all or nothing, right?
Part of the “magic” of AI is how much of the design process gets hijacked by inference. At some scale you simply don’t have control of your own product anymore. What is normally a process of building up an asset by layers becomes flattened blobs you need to meticulously deconstruct and reconstruct if you want them to not look like total shit.
That’s a big part of the reason why “AI slop” looks so bad. Inference is fundamentally not how people create complex and delicate art pieces. It’s like constructing a house by starting with the paint job and ending with the framing lumber, then asking an architect to fix where you fucked up.
If you don’t like them, you can just chuck them in the trash and you won’t have wasted the work of an artist
If you engineer your art department to start with verbal prompts rather than sketches and rough drawings, you’re handcuffing yourself to the heuristics of your AI dataset. It doesn’t matter that you can throw away what you don’t like. It matters that you’re preemptively limiting yourself to what you’ll eventually approve.
This is just the whole robot sandwich thing to me.
A tool is a tool. Fools may not use them well, but someone who understands how to properly use a tool can get great things out of it.
Doesn’t anybody remember how internet search was in the early days? How you had to craft very specific searches to get something you actually wanted? To me this is like that. I use generative AI as a search engine and just like with altavista or google, it’s up to my own evaluation of the results and my own acumen with the prompt to get me where I want to be. Even then, I still need to pay attention and make sure what I have is relevant and useful.
I think artists could use gen AI to make more good art than ever, but just like a photographer… a thousand shots only results in a very small number of truly amazing outcomes.
Gen AI can’t think for itself or for anybody, and if you let it do the thinking and end up with slop well… garbage in, garbage out.
At the end of the day right now two people can use the same tools and ask for the same things and get wildly different outputs. It doesn’t have to be garbage unless you let it be though.
I will say, gen AI seems to be the only way to combat the insane BEC attacks we have today. I can't babysit every single user's every email, but it sure as hell can bring me a shortlist of things to look at. Something might get through, but before I had a tool, a ton of shit got through, and we almost paid tens of thousands of dollars on a single bogus but convincing-looking invoice. It went as far as a fucking bank account penny test (they verified two ACH deposits). Four different people gave their approvals, head of accounting included, before a junior person asked us if we saw anything fishy. This is just one example of why gen AI can have real practical use cases.
This is just the whole robot sandwich thing to me.
If home kitchens were being replaced by pre-filled Automats, I’d be equally repulsed.
A tool is a tool. Fools may not use them well, but someone who understands how to properly use a tool can get great things out of it.
The most expert craftsman won’t get a round peg to fit into a square hole without doing some damage. At some point, you need to understand what the tool is useful for. And the danger of LLMs boils down to the seeming industrial scale willingness to sacrifice quality for expediency and defend the choice in the name of business profit.
Doesn’t anybody remember how internet search was in the early days? How you had to craft very specific searches to get something you actually wanted?
Internet search was as much constrained by what was online as what you entered in the prompt. You might ask for a horse and get a hundred different Palominos when you wanted a Clydesdale, not realizing the need to be specific. But you’re never going to find a picture of a Vermont Morgan horse if nobody bothered to snap a photo and host it where a crawler could find it.
Taken to the next level with LLMs, you’re never going to infer a Vermont Morgan if it isn’t in the training data. You’re never going to even think to look for one, if the LLM hasn’t bothered to index it properly. And because these AI engines are constantly eating their own tails, what you get is a basket of horses that are inferred between a Palomino and a Clydesdale, sucked back into training data, and inferred in between a Palomino and a Palomino-Clydesdale, and sucked back into the training data, and, and, and…
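That interpolation loop is easy to see in miniature. Here's a toy sketch (purely an analogy, not a real image model, and all names here are made up for illustration): treat each "horse" as a number, make every new generation a blend of two samples from the previous one, and watch the variety collapse.

```python
# Toy illustration of the "eating its own tail" loop: each generation
# only interpolates between the previous generation's samples, so the
# spread of "horses" can shrink but never grow.
import random

def next_generation(samples, size=100):
    """Each new sample is a blend (average) of two existing ones."""
    return [(random.choice(samples) + random.choice(samples)) / 2
            for _ in range(size)]

random.seed(0)
population = [random.uniform(0.0, 100.0) for _ in range(100)]  # initial variety

for gen in range(20):
    population = next_generation(population)

spread_after = max(population) - min(population)
print(f"spread after 20 generations: {spread_after:.2f}")  # tiny vs. the initial ~100
```

Nothing outside the training distribution ever appears, and even the variety that was there gets averaged away.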
I think artists could use gen AI to make more good art than ever
I don't think using an increasingly elaborate and sophisticated crutch will teach you to sprint faster than Usain Bolt. Removing steps in the artistic process and relying on glorified Clipart Catalogs will not improve your output. It will speed up your output and meet some minimum viable standard for release. But the goal of that process is to remove human involvement, not improve human involvement.
I will say, gen AI seems to be the only way to combat the insane BEC attacks we have today.
Which is great. Love to use algorithmic defenses to combat algorithmic attacks.
But that’s a completely different problem than using inference to generate art assets.
How do you think a human decides what to sketch? They talk about the requirements.
I was saying that as well.
I get the knee jerk reaction because everything has been so horrible everywhere lately with AI, but they’re actually one of the few companies using it right.
Welp, there goes all of my enthusiasm for the next game. Will have to check out other actual game developers and artists instead of whatever Larian and their genAI prompt monkeys have become.
I’ve had an idea of making a visual novel with gen AI, but I’d want to attach “Placeholder: AI Artwork” in a visible location for each sprite. And I only even consider that because I’m not exactly a known game dev and don’t have ready access to artists.
Larian should likely expect if they’re taking shortcuts in their position, they’d get backlash. I can at least recognize that they’re trying to be moderate about it.
Also, you don't have the infamous AAA deadlines, so you're not liable to ship them in the final product anyway due to "an oopsie."
Yeah, the outrage is overblown.
This doesn't mean they're enforcing a Copilot quota or vibe-coding the game or shipping slop; it could be simple autocompletion, or (say) a component that makes the mocap pipeline easier.
Don’t let Tech Bros poison dumb tools that could help out devs like Larian.
…Now, if they ship slop into the final game or announce an “OpenAI partnership,” that’s a different story.
…Now, if they ship slop into the final game
At a certain level, it is going to be a chore to determine who is or is not slopping up with AI media. Not every asset comes out with six fingers and a half-melted face.
I can see legitimate frustration with an industry that seems reliant on increasingly generic and interchangeable assets. AI just becomes the next iteration of this problem. You’ve expanded the warehouse of prefab images, but you’re still stuck with end products that are uncannily similar to everything else on the market.
And that’s before you get to the IP implications of farming all your content out to a third party that doesn’t seem to care where its base library is populated from.
At a certain level, it is going to be a chore to determine who is or is not slopping up with AI media. Not every asset comes out with six fingers and a half-melted face.
Image/video diffusion is a tiny subset of genAI. I’d bet nothing purely autogenerated makes it into a game.
I can see legitimate frustration with an industry that seems reliant on increasingly generic and interchangeable assets. AI just becomes the next iteration of this problem. You’ve expanded the warehouse of prefab images, but you’re still stuck with end products that are uncannily similar to everything else on the market.
See above. And in many spaces, there is a sea of models to choose from, and you can easily tune them to whatever style you want.
And that’s before you get to the IP implications of farming all your content out to a third party that doesn’t seem to care where its base library is populated from.
Their tools can be totally in-house, disconnected from the outside web, if they wish. They might just be a part of the pipeline on their graphics workstations.
Keep a distinction between “some machine learning in tedious parts of our workflows” and “a partnership with Big Tech APIs.” Those are totally different things.
It sounds like Larian is talking about the former, and I’m not worried about any loss of creativity from that.
At a certain level, it is going to be a chore to determine who is or is not slopping up with AI media. Not every asset comes out with six fingers and a half-melted face.
That’s a good point. I think a lot of people are dismissing AI content because there’s this fallacy and desire to believe it’s all “slop”. It’s willfully ignorant to wave it all away like that. Sure, we’ve all seen the stupid stuff, and it’s really annoying, but we absorb the good stuff without even knowing it. Anyone claiming they can reliably spot AI generated images is fooling themselves even at this early stage.
I’d like to know when something is real, especially real art and even pictures of nature, but I don’t think I can.
Considering we’ve already got the one former Larian employee speaking out against this, it’ll be interesting to see how many more show up off the record (or maybe on the record anonymously). I’m sure there was an internal battle over it.
There aren’t many (possibly none) with more goodwill banked among enthusiast gamers than Vincke, so I feel like we’re about to see just how far a popular figure can step into this particular puddle without coming out soaked.
As someone who owns D:OS2, this is really disappointing and below their standards. I’ll be giving this new Divinity a pass.
They don't need to be using AI to create concepts, and if they do, I don't think the "concepts" will be all that great in the first place. Not to mention the ethical perils of using models trained on the work of other artists who were neither licensed nor compensated.
This is some classic CEO “step on a rake and then get mad at everyone else” nonsense. They openly talked about how they liked AI, and get mad at us for saying “cool, that’s a game I’m gonna skip then!”
That’s an awfully early point to judge a game, with basically zero knowledge of what they’re actually doing/using.
What if they’re referencing a small, home grown model to assist with mocap? Or a sketch->3D drafting tool? Would that be enough to write it off?
What if it’s a home grown model to assist with mocap?
Well, that's not what it is (a), at least according to the CEO. They used it for concepts, not animations. And also, (b) I'm not really in the place to give people the benefit of the doubt when using AI that is trained off stolen materials. I sincerely doubt they're using a "home grown model" because anyone who knows even a scrap of how LLMs/GANs work knows that the data needed to train a model would be far beyond the reach of a company of Larian's scale. They've likely just licensed it from one of the many grifting oligarch AI peddlers.
We don’t need defenders coming in here trying to pretend that the CEO hasn’t just clarified that they are using AI for preproduction, we know this and it’s not up for debate now.
Would that be enough to write it off?
As someone who really appreciates and likes animation, in that particular example, then yes, it would probably be enough to write it off. And frankly, why do I need to play their game when I could just AI-generate my own slop and save the 70 bucks? In reality, it's actually fine for me; I have plenty of games and can replay the old Divinity games before these guys lost their way. They used to be a company that followed a passion for CRPGs with goodwill behind them, but now that BG3 has been a runaway hit, it seems like they've forgotten about the community that got them to where they are today in favor of some AAA gaming nonsense.
Edit:
That’s an awfully early point to judge a game, with basically zero knowledge of what they’re actually doing/using.
Frankly, there are plenty of games that people judge from the outset. There’s a reason why we have the saying “First impressions matter”. They’ve left a bad taste in anyone who dares question the ethics of AI use, but thankfully there might be an audience of people out there who like slop more than I dislike it so they could be ok. No skin off my nose.
because anyone who knows even a scrap of how LLMs/GANs work knows that the data needed to train a model would be far beyond the reach of a company of Larian's scale
If it's like an image/video model, they could start with existing open weights and fine-tune it. There are tons to pick from, and libraries to easily plug them into.
If it's not, and it's something really niche that doesn't already exist to their satisfaction, it probably doesn't need to be that big a model. A lot of niche stuff like sketch → 3D tools gets trained on university-student budgets of project time and money (though plenty of those already exist).
We don’t need defenders coming in here trying to pretend that the CEO hasn’t just clarified that they are using AI for preproduction, we know this and it’s not up for debate now.
No. We don’t know.
And frankly, why do I need to play their game when I could just AI generate my own slop and save the 70 bucks
I dunno what you're on about; that has nothing to do with tools used in preproduction. How do you know they'll even use text models? Much less that a single output would ever be shipped in the final game? And how are you equating LLM slop to a Larian RPG?
…now that BG3 has been a runaway hit, it seems like they've forgotten about the community that got them to where they are today in favor of some AAA gaming nonsense.
Except literally every word that comes out of their interviews is about care for their developers and their community, which they continue to support.
Frankly, there are plenty of games that people judge from the outset. There’s a reason why we have the saying “First impressions matter”. They’ve left a bad taste in anyone who dares question the ethics of AI use, but thankfully there might be an audience of people out there who like slop more than I dislike it so they could be ok. No skin off my nose.
Read that again; pretend it’s not about AI.
It sounds like the language gamergate followers use as an excuse to hate something they've never even played, after they've read some headline they don't like.
…Look, if Divinity comes out and it has any slop in it, it can burn in hell. If it comes out that they partnered with OpenAI or whomever extensively, it deserves to get shunned and raked over coals.
But I do not like this zealous, uncompromising hate for something that hasn’t even come out, that we know little about, from a studio we have every reason to give the benefit of the doubt. It reminds me of the most toxic “gamer” parts of Reddit and other cesspools of the internet, and I don’t want it to spread here.
We’re not looking to, but if it happens, it happens.
I would play a game with AI NPCs. Not the artwork, but the mechanic. You could run hundreds of interactable characters with simple prompts that state their goals and personalities. Like a modern Radiant AI.
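A rough sketch of what that could look like under the hood. The actual model call is stubbed out here, since it could be any local or hosted LLM, and all the names and characters below are made up for illustration:

```python
# Sketch of a "modern Radiant AI" NPC: each character is just a short
# prompt describing goals and personality, handed to an LLM per interaction.
from dataclasses import dataclass

@dataclass
class NPC:
    name: str
    personality: str
    goal: str

def build_prompt(npc: NPC, player_line: str) -> str:
    """Assemble the system-style prompt the LLM would receive."""
    return (
        f"You are {npc.name}, an NPC in a fantasy RPG.\n"
        f"Personality: {npc.personality}\n"
        f"Current goal: {npc.goal}\n"
        f'The player says: "{player_line}"\n'
        f"Reply in character, in one or two sentences."
    )

def respond(npc: NPC, player_line: str) -> str:
    prompt = build_prompt(npc, player_line)
    # Placeholder: a real implementation would send `prompt` to an LLM
    # (e.g. a small local model) and return its completion.
    return f"[{npc.name} would answer here, guided by: {npc.goal}]"

guard = NPC("Maro", "gruff but fair city guard", "keep the gate shut after dark")
print(respond(guard, "Can you let me through?"))
```

The interesting part is that adding a hundredth character costs one more prompt, not a hundredth dialogue tree.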
These games already exist and they’re just boring, no thought behind any dialogue, just a waterfall of LLM slop.
Any names I can take a look at? I'm curious to see it.
I’m not sure about games personally, but there’s the Mantella mod for Skyrim which is pretty much exactly what you’re looking for.
Check out this video if you want to see how it works in practice.
Shit, that is awesome. Sounds as shit as Skyrim dialogue, but the responsiveness to the world and the player's actions is such a nice feature to bring life to such a boring game.
Game's dead on arrival, clanker.
Gamers will and do bitch about anything and everything. I couldn’t care less if they use AI if the product is good.
I really hope this marks the end of the internet’s weird love fest with this company and CEO. They don’t care about you in the slightest.
I don't give a shit if they care or not, they make amazing games
Yeah because why like a company that has released banger after banger after banger without any MTX or other bullshit attached to it.
That may be true. Still a company, still doesn’t care about anyone besides as a consumer. I don’t get the weird parasocial relationships people develop with any company/corporation.
Because one day they won't. The more money a company makes, the closer they get to being like every other successful company.
So what, then I can still dislike the company lmao. What’s the point of hating something in advance?
Not really “in advance” if they’re openly discussing doing the thing that people are mad about right now, more like right on time
Because then you don’t get cool points for going puts sunglasses on “I always knew those guys were evil! I’ve been hating them longer than you!”
well until that day, we’ll love them. one day the fucking sun will explode, but you won’t catch me saying we should stop using solar generators.
That’s not a very good comparison, but do whatever you want, bud.