I am actually getting ready to start a campaign using AI. Granted, I am spending hours providing input, but ChatGPT is polishing it. I have one chat where I am building my overall world, then a second chat for a specific town. I give it prompts to kick out businesses and NPCs for that town, have it generate quests from some of those NPCs, then take those quests and go back and forth with more prompts until I have something I am happy with. It's like using a Play-Doh press: sure, I could make a burger shape with my hands, but having a mold to help shape the final product can smooth the edges. I am still providing a ton of input; it's just filling in the gaps.
The current generative AI technologies mostly fail at the DM role because they must be prompted. DMing is all about spotlighting and pacing, and you can't do that if you can only respond to "player" requests. I think what Cocks is saying reflects the same thing I have been seeing literally everywhere: while the creators in the space shout down anyone who says AI is fine, the consumers are replacing us with the ability to quickly generate art, ideas, and prose. Why wouldn't they? It's free. I also think it's super interesting that there is such a huge divide between opinions on AI writing and AI "art". Somehow replacing writers with automation is OK, but replacing artists isn't. So strange. Thanks for calling out how unhealthy that creator -> creator ire is.
Actually, NaNoWriMo, one of the biggest writing groups in the world, made a statement in favor of AI generated writing, and half their leadership resigned in protest.
@@inquisitorkobold6037 Oh wow, I hadn't heard that. I was more referring to the discourse in the ttrpg space. Creators like baron derop and others seem to be fine with generative writing tools and get a pass, while anyone who touches generative image tools gets screeched at very loudly. Glad to know that people in the space are pushing back across the board.
The real harm is that delegating more and more of the creative process to AI is, in my opinion, damaging to human creativity and flourishing. People often think that the negative impacts of AI will be dramatic, Terminator-like situations. But the reality is much more subtle and much more insidious. The less we engage with the creative arts (writing, editing, sketching, drawing, composing), the more soulless our works become.
Creative endeavors are wonderful because of their diversity and variety; AI works toward homogeneity.
(All my opinion as someone who encounters and has to investigate AI content in both my personal and professional life!)
@@TalesFromElsewhereGames Art is communication between sentient beings, and AI isn't communicating anything. That's why it's always hollow. Anything interesting was taken from real human works.
@@Julez60 I agree!
AI has a purpose, but that purpose is in data analytics and data scraping. I'd prefer it stay far from the world of art!
@@TalesFromElsewhereGames 100% agree
Well said.
Excellent take. You were specifically commenting on the creative process, but I think this argument you make can be extended out to the impact of AI on human life in general. What gives life purpose? Is life worth living if we are never challenged? I think of all the times I've had to learn a new skill out of necessity and that journey from initial frustration to deep satisfaction when I actually achieve what it is I set out to accomplish; what if I could have skipped straight to the end in every one of those scenarios?
I'm using the current crop of text generators the same way I have been using them for the last 30 years (going back to Markov chains and similar techniques): For procedural generation of details. In that, they're just an extension of random roll tables, which I use extensively as well for prep and during play.
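The Markov-chain approach mentioned above can be sketched in a few lines. This is a minimal illustrative example (the function names and sample town list are my own invention, not from the comment): it learns which letter-pairs follow which in a handful of training names, then walks those transitions to generate new, similar-sounding details for prep.

```python
import random

def build_chain(names, order=2):
    """Map each letter-pair in the training names to the letters that can follow it."""
    chain = {}
    for name in names:
        padded = "^" * order + name.lower() + "$"  # start/end markers
        for i in range(len(padded) - order):
            key = padded[i:i + order]
            chain.setdefault(key, []).append(padded[i + order])
    return chain

def generate(chain, order=2, max_len=12):
    """Walk the chain from the start marker until the end marker or max_len."""
    key, out = "^" * order, []
    while len(out) < max_len:
        nxt = random.choice(chain[key])
        if nxt == "$":  # reached an end-of-name marker
            break
        out.append(nxt)
        key = key[1:] + nxt  # state is just the last `order` characters
    return "".join(out).capitalize()

towns = ["Greenhollow", "Stonebridge", "Oakshade", "Riverton", "Ashford"]
chain = build_chain(towns)
print(generate(chain))
```

Like a roll table, the output is only as varied as what you feed in, which is exactly why it works as a prep aid rather than a replacement for it.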
16:00 This is an excellent point you make here, and I think this is where the broader conversation about AI needs to get to. Whether we like it or not, humanity is transitioning to a future where generative AI plays some part in our daily lives. The sooner we move past this initial rejection of the technology, the sooner we can start working out what it means for our respective communities and begin to reconcile the fact that it will exist and people will use it regardless of whether we want them to.
My GM has used AI-generated token images for enemies, as have I, whenever I have something specific in mind that I can't find online.
But stories... no! Both as GM and as a player, coming up with my adventure ideas and character backstory is the fun part.
When it comes to generative AI slop: if you can't be bothered to write a book, why should anyone be bothered to read it?
Well... assuming it's good... because it's good.
@@mikethedriver5673 That's really sad.
One of the main things that makes a story is the author communicating with you through their art. AI isn't communicating anything. There's no conscious being there.
@@Julez60 Well, you just skipped an entire philosophical debate there. First, that AI isn't communicating anything: you wouldn't really know that in the abstract. You could imagine a scenario where someone prompting an AI to write a book also asks it to communicate "something" in that book through symbolism or what have you. Or, like various other emergent features of AI, it could simply try to communicate "something" in its books or through its art, because it's clear in its training data that when people write books, many of them try to communicate "something" within them.
On the more philosophical side, there are a whole lot of people who believe in "death of the author" and advocate for it, so I could see a good argument for why something like that shouldn't matter to them.
@@mikethedriver5673 It takes consciousness to communicate, and that AI chatbot is an unthinking neural net that had to mass-steal the works of real authors to function.
You know what I mean when I say it's not communicating anything.
@@Julez60 So does this mean AI conclusively kills the whole "death of the author" school of thought?
Uuh... no, BG3 doesn't use an AI DM. The enemies have behaviors programmed by humans, and the whole story was written by humans.
Bit disappointing to see you acknowledge that companies just construct consensus through their insistent broadcasting of baseless claims, but then uncritically repeat the "AI is here to stay" line. As John briefly mentioned, AI has already been here for a very long time in the form of the various behaviors videogame NPCs are designed to exhibit. In contrast, what the industry is now trying to sell as "AI" is little more than a familiar interface to a specific family of machine learning models which largely parrot back the distributions of text they're exposed to, conditional on the preceding tokens. This last point is very relevant to understanding their limitations: as you alluded, a human writer will call upon various memories and ideas when deciding where to go next with their writing, while the architectures LLMs are based on have states that are entirely described by the chain of text generated so far. As such, there's no room for any form of separate, sustained thought process necessary to produce rich overarching narratives. That's also why, despite the companies' best attempts, it remains trivially easy to "jailbreak" these models by exposing them to the right inputs.
Another larger point being missed here with regard to the resource consumption of these models is that, beyond the ecological impact they may or may not have, you're presupposing the continued existence of the complex supply chains needed to keep all of it running. A large fraction of the rare metals those data centers are built on come from low-resource, high-vulnerability regions of the world. Who will work the mines as the frequency and intensity of floods and heatwaves continue to increase?
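The "state is just the text so far" limitation described above can be illustrated with a toy, word-level sampler. This is a deliberately crude sketch of autoregressive generation (the transition table and function names are invented for illustration, not how any real LLM is stored): the only thing the next choice can depend on is the visible token history; there is no separate, persistent memory.

```python
import random

# Toy transition table: for each word, the words observed to follow it.
BIGRAMS = {
    "the": ["dragon", "tavern", "king"],
    "dragon": ["sleeps", "attacks"],
    "tavern": ["burns"],
    "king": ["sleeps"],
}

def continue_text(tokens, steps=3, seed=0):
    rng = random.Random(seed)
    out = list(tokens)
    for _ in range(steps):
        options = BIGRAMS.get(out[-1])
        if not options:  # no known continuation for this token
            break
        out.append(rng.choice(options))  # conditioned only on the text so far
    return out

print(" ".join(continue_text(["the"])))
```

Real transformers condition on a much longer window with far richer statistics, but the structural point stands: everything the model "knows" about the ongoing story has to be re-derived from the token chain on every step.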
Yeah, I'm sure we missed a lot. If you think of any other gaps, let us know. Like I mentioned, this AI business isn't something I'm particularly well educated on. The more "under the hood" explanations, I think, the better.
-John
For the memory part, interfaces like SillyTavern have had that for quite a while; they can be used to feed the AI a "what happened so far..." summary, world details, and even goals for the NPCs and story, if you know what you're doing. I've seen ChatGPT recently add a similar, though far more limited, "memory" feature to its web interface too.
So we're getting there, but as with all tools, it takes practice and inspiration to use it to create interesting and/or useful results.
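The "memory" trick described above is, at bottom, prompt assembly. Here is a minimal sketch of the idea, not SillyTavern's actual mechanism (all names and the truncation strategy are my own assumptions): persistent blocks like the running summary and NPC goals are prepended to each new player input before anything is sent to the model.

```python
def build_context(summary, world_notes, npc_goals, player_input, budget=2000):
    """Assemble one prompt from persistent 'memory' blocks plus the new input.
    If the budget is exceeded, the oldest text is crudely truncated away."""
    blocks = [
        ("World details", world_notes),
        ("What happened so far", summary),
        ("NPC goals", npc_goals),
        ("Player", player_input),
    ]
    parts = [f"[{title}]\n{text}" for title, text in blocks if text]
    prompt = "\n\n".join(parts)
    return prompt[-budget:]  # keep the most recent characters within budget

prompt = build_context(
    summary="The party rescued the smith and angered Baron Vex.",
    world_notes="Low-magic setting; the town of Marleybone sits on a trade road.",
    npc_goals="Baron Vex wants the party arrested quietly.",
    player_input="We visit the smith's forge.",
)
print(prompt)
```

The craft is in what you put in those blocks and how you refresh the summary, which is exactly the "practice and inspiration" part.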
Perhaps an outlier, but I just super don't care how other people run their games.
I like building the game, so I won't use AI for that. I'm interested in engaging with other people's rad ideas, so I've got no interest in playing in an AI game. I do not get hyped about high quality visuals, wordy descriptions, or catchy AI tunes. I get hyped out of my mind that my friend had a good time creating those things. You get out what you put in and production value is a red herring.
I don't think any industry has a right to exist. I do think a lot of the training data has been handled poorly. In general, I stray away from AI things because there's nothing interesting about it.
Running AI is cheap. Training it is not. The energy-cost argument doesn't really work, since these models are going to be trained anyway; choosing not to use them won't change the relevant part of the energy footprint. I have run freely available models on par with ChatGPT 3.5 at home on consumer hardware, and they coherently generate paragraphs of text per second. The energy cost of running a model is barely more than playing a video game on ultra settings for the same amount of time.
As you seem to say yourself later on, it's not inevitable that we use AI, or that it is here to stay. This is the line pushed by OpenAI and other companies creating generative AI tools, and it's a calculated message meant to pressure businesses into using LLMs and other tools. It's also an excuse to ignore all the problems with the technology: "it's here, so we'd better learn to live with it". If we start from that premise, then those companies have succeeded without needing to solve any of those problems.
Not sure which part you're keying in on, but I do think that AI is here to stay (barring some calamity that sets us back to the stone age). I also don't think it's an "excuse to ignore all the problems with the technology," either. As I see it, our goal should be to ensure regulation of the technology from the basis that ethical boundaries exist (i.e., scraping creative work and personal info without permission) and should be observed.
So, I feel like I think faster than I type, so I'll talk to the AI and tell it my ideas, and it'll just organize them. I'm just writing this here because it seems like a lot of people think that AI just comes up with the ideas and that's all it does.
Good use of it, for sure.
Was that bit about artists supposedly being compensated for the initial scrape just a fever dream?
I think AI is great. It's helped me develop my games a lot...
The energy cost of running AI is pretty minimal. With a reasonable rig, Stable Diffusion takes a couple of seconds to generate an image, at a cost comparable to running a game for that amount of time. It's something, yeah, but really nothing terribly big. You also need to consider the energy cost of a human drawing digital art, which takes dramatically longer; while any individual moment may be less energy-intensive, the screen alone, over the hours it takes, is likely always going to cost more than a few seconds of GPU time.
I think the total energy cost of both training and inference is more the issue. Training the AI takes a substantial amount of energy. Even if you "allocate" or distribute those energy costs across all the many users who will eventually use that model checkpoint, it is still a substantial cost, especially as the models get more advanced and parameter counts grow. Inference is pretty cheap, you're right; a decent rig can get you 4-5 images in seconds, or minutes at worst. I will say that if people are "wasteful", say, generating hundreds or thousands of AI images on a local rig or cloud service overnight and saving them all for filtering/perusal later, then things get interesting. The same power-consumption criticism leveled at crypto and blockchain would start applying to AI art generation.
My biggest, and indeed only, problem with AI is that it is often trained on data the people who created the AI had no legal right to use, and the people who made that data are given no credit. The fact that it's cutting-edge technology just means it's fairly unregulated. People are afraid that it is a cheap loophole that companies like Wizards of the Coast will use to sidestep paying content creators.
I block channels that use AI for thumbnails on principle, and I appreciate you not using them. Also, y'all did jumble the issue with AI: an AI that taught itself chess is fine; an AI that learned by analyzing masters' games copies how they play. People haven't consented to that second one. If it had learned all by itself, it would be a different issue.
You'd look real good in a dress.
Tech Bros are gonna Tech Bro; however, after dealing with AI and its critics for _decades_, I can say the majority of objections are in bad faith, if not outright hypocritical, and come from the same sorts who claimed AI could never create art in the first place. They complain about low quality, yet their work is subpar. They claim quality doesn't matter, yet the market says otherwise. They complain about loss of work, yet don't take commissions. And even if you train AI exclusively on your own work, they'll still go after you. Fact is, art has always been about manifesting compelling internal visions (which by definition no AI can replace) through any means necessary. And with few exceptions, the folks most afraid of it are the least creative, because they know they'll be the first to be 'replaced'.
I think that's a level-headed take, and I agree in part. Some of the artists I worked with in the games industry were interested in AI for its capability to enhance and expedite work. At the end of the day, the creativity is all them, but if it can save time trudging through the non-evocative work so you can get to the fun stuff, it seems like every other tool we've ever invented.
You are here to paint all critics of generative AI as incompetent, jealous, and insecure, as if that somehow furthers your argument? There is no line of logic here, just ad hominem. "Compelling internal visions" is just a cop-out for being an eternal "ideas guy" who's too lazy to ever actually learn a skill and too incompetent to manage. There are people out there who draw with their feet, or type with their chin, because of disabilities, and you're here defending prompting. You claim to separate yourself from "Tech Bros", but your argument is literally at the same level of idiocy as those who tried to sell me NFTs in 2021.
@@cynthiacrescent I much agree with this as well. The OP's post is certainly sweeping, which isn't helpful to the discussion overall (and something I shouldn't have glossed over when I read it the first time). I definitely read it through the lens of creating commercial art for large projects in specific industries, however. We will eventually get to the point where AI is just a normal part of the workflow for many projects (and indeed it already is). At the same time, I don't think that eliminates the need and desire for folks to work 'without' such tools. There will be an audience for both, but I don't know that a 2D art position in the games/advertising/television industry is going to look the same as a 2D art position in the tabletop/hobbies/art-for-the-sake-of-art industries -- they'll have different needs, audiences, and external pressures. The first group will need to embrace new technology as the industry around them evolves, while the latter group is probably going to become a counter-culture where bespoke, human-made art is what the audience demands.
There is no ethical dilemma. There's a vocal minority that wants to soapbox. Gaming is not a jobs program. If AI gets you a superior product, use it. If not, don't.
+
True