General Discussion
Mastodon.art just banned AI art
Post about it from the person in charge there: https://masto.ai/@Curator@mastodon.art/109785175102774337
Blog post about the rule change: https://dotart.blog/dotart-blog/ai-art-rule-change
Behind the scenes I've been rejecting applications from people who specify they want an account to post AI art; I never boost or interact with any AI art from Curator, and I tend to mute off-instance accounts I see posting AI art using the art hashtags I follow, just to keep them out of my search results. But just for us to have a more definitive stance on it, we're updating our rules to be more strict on what we're allowing going forward.
The main point of contention with AI art is that the publicly accessible models people use to generate images were trained using other people's artwork without their consent. It's a glaring issue in ethics, and obviously one that's close to home for an instance full of artists who've had their work scraped by crawlers that pass on the data to things like Stable Diffusion for training.
As such, from this point forward, no AI art is allowed on .art. If you want to post AI art, there are plenty of other instances that don't regulate posting it, and you can create an account on one of those.
We're aware that there are some grey areas, like people who generate and then heavily modify AI art to use in work that's otherwise their own (and where those lines of ownership might begin and end), or people training AI on their own art. But given how prominent the ethics issue is with the training models right now, and the lack of transparency from a lot of models about where they got their training data, we're keeping the rules tight and all-encompassing. If, in future, some models emerge that are verifiably trained ethically, without infringement on existing works, we will revisit the rule.

lindysalsagal
(22,760 posts)What's the point of it? I want to see what a human artist sees in their environment or imagination.
Polybius
(20,957 posts)I'm not sure I understand the logic.
dalton99a
(90,669 posts)
FalloutShelter
(13,885 posts)






blogslug
(39,026 posts)The other day I saw some promotional image for some MAGA thing. It was a "painting" of TFG walking beside a lion. It was done Jon McNaughton-style but the lion was missing a leg and another leg was all twisted up. Definitely AI.
It occurred to me the AI engine probably trained on Jon McNaughton works and likely plagiarized him so some skeevy MAGA event could print a flyer. That made me smile.
AZSkiffyGeek
(12,743 posts)AI art has had trouble with fingers from what I've seen, but I couldn't tell on TFG's tiny hands. His feet and the lion's legs were definitely fucked up though.
blogslug
(39,026 posts)~snort~
TheProle
(3,762 posts)
demmiblue
(38,801 posts)
Celerity
(52,479 posts)




Brenda
(1,847 posts)the last one, like! Space traveler in daguerreotype.
Celerity
(52,479 posts)for me
these are all (none are AI btw) creepy AF:
Voltaire2
(15,377 posts)Each Mastodon instance sets its own policies. Other instances do not have this ban.
I think this policy is silly. Generative AI systems are just going to get better, and it will basically be impossible to determine, without a forensic investigation, whether this essay, that picture, or this video is AI-produced.
If one thinks of a system like dalle as an imagination camera, it becomes a tool to assist in the process of transforming a visual concept into a physical image. One might as well ban real cameras and Photoshop.
highplainsdem
(58,608 posts)without touching or in any way setting up the camera, and the camera created that photo by ripping off and reassembling bits of other photographs taken by real photographers. No skill's required. No talent's required.
AI art is no more real art than a paint-by-numbers kit is. It's just more elaborate.
Voltaire2
(15,377 posts)Nobody creates in a vacuum. AI is as much 'real art' as, for example, collages created from re-assembling bits and pieces of magazines and newspapers. Also, these AI systems are not spontaneously generating things (yet); it is a human-directed activity. The concept comes from a person; the tool produces an image (dalle) or text (chatgpt).
highplainsdem
(58,608 posts)It's a lazy pretense of art.
Voltaire2
(15,377 posts)The assistants produce the art as directed by the artist. I guess that is 'lazy art' too.
(Also, the atelier - the artist's workshop - goes back to the Middle Ages. The master directs; the apprentices execute the directions.)
highplainsdem
(58,608 posts)a directing artist takes sole credit is controversial.
See this:
https://fineartviews.com/blog/38751/artists-debate-over-the-use-of-artist-assistants-where-do-you-stand
The Old Masters tended to be very open about the identity of their artist assistants. After all, having a highly skilled art assistant in your studio meant that you, the master artist, had trained the assistant well. One can imagine the bragging rights an Old Master had when his relationship with a pupil came full-circle -- his methods and teachings lived on in the pupil... who was now a master in his own right. Today we rarely see that kind of transparency in regard to the use of artist assistants in general.
Today artists who utilize artist assistants, such as Damien Hirst, tend to keep said information a tightly guarded secret -- at least within the context of the mainstream art world. In other words, the Old Masters viewed their assistants as pupils -- a point of pride -- whereas Hirst and others view their assistants as mere employees. The artist working for Hirst is providing a service that has been paid for rather than being a student who is learning from his or her teacher.
The differences mentioned above may not seem like much -- but it is when you consider that the Old Masters, when using artist assistants, were passing on a tradition of methods and teachings whereas Damien Hirst and others are simply 'getting a job done'. In fact, Hirst has openly suggested that he utilizes the service of assistants because he does not have time to be bothered with creating the artwork himself. Some articles have implied that Hirst does not look at his assistant created artwork until it is finished -- at which point he decides if the piece is exhibit worthy or not. That is a far cry from how and why the Old Masters used art assistants. I assume that is a major factor in David Hockney's opinion concerning the use of art assistants in general.
One of the comments/replies there points out that "if artwork created 100 percent by assistants is still viewed as the work of the person who hired them... that means anyone with the resources to hire a team of highly skilled artists could theoretically become an art world sensation and museum magnet having never held a tool of art in his or her own hand. True, many don't care either way -- but many, many, many artists and art lovers would experience a kick in the gut."
Voltaire2
(15,377 posts)I'm not arguing that ateliers are good or bad. That is a separate issue. I'm arguing that dalle is the equivalent.
Response to highplainsdem (Reply #21)
speak easy
This message was self-deleted by its author.
highplainsdem
(58,608 posts)you claimed. And neither the way some artists now use assistants (which is not respected by other artists) nor DALL- E are like that old tradition of master artists teaching assistants who are essentially apprentices with the aim of them becoming recognized artists themselves. Did you even look at what I linked to? Or even read the excerpts?
Voltaire2
(15,377 posts)by all the famous artists doing exactly this. Did you read it?
highplainsdem
(58,608 posts)to "all the famous artists" using assistants to do the work while taking full credit. Those are your words.
Hockney was talking about one artist, Damien Hirst.
The linked article at https://twocoatsofpaint.com/2012/01/quick-read-damien-hirst-edition.html has this comment, which IMO nails it:
I feel he gives abstract and conceptual artists a bad name, and I am appalled that I cannot look at any work with his name on it without feeling it is cheapened. Fortunately, I can mostly avoid that.
And that article links to this one - https://www.theguardian.com/artanddesign/2012/jan/03/david-hockney-damien-hirst-rival-exhibitions - which probably explains why Hirst has others create for him:
My impression is that the people most interested in using ChatGPT to "create" will be people with little or no talent who want to claim to be creative. And perhaps a few who could actually write something decent or create some worthwhile piece of art or music themselves, but don't want to put the time and effort into it, especially if they can con anyone into believing that it's really creative to just give instructions to software, and that the AI isn't unethically using real artists' work.
Mossfern
(4,431 posts)I'm just getting back into creating works with "real" media. I hold an MFA in painting, after decades of not painting (long irrelevant story). My training was in the academic style - "from the studio of..." Yes, I'm an old fart, but art, to me, is not about making a pleasing image. It's about life, experience, texture, feeling the tip of a brush on a canvas, wielding a hammer and chisel - does anyone make etchings any more?
AI 'creations' feel like an abomination to me.
Yes, that's quite harsh language and I apologize to any who are offended by my words, but it's true.
We're becoming less and less human.....oh, I'm going off now.
OK
It's merely 'art' with no soul.
tinrobot
(11,804 posts)Most of the control you have over the output is with the words you provide. On top of that, the photographer takes every word literally and doesn't have a trained eye, so the final result is usually mediocre. Some pleasant surprises, but a lot of duds.
The more I work with it, the more I see the shortcomings.
But there are plenty of ways to use it that don't involve copying other people's work. Of course, people have been copying others' styles since cave paintings were a thing. Regardless, it is not going away. People will use it; the genie is already out of the bottle.
highplainsdem
(58,608 posts)via lawsuits, threats of lawsuits, and general recognition that acceptance of fake art from AI damages society overall and real artists in particular.
People who want to be artists should take the time to master real skills, not use AI as a crutch and pretend it's equivalent to real art.
People who want something written should write it. Surrendering communication to a machine is both lazy and stupid. And anyone whose AI-produced writing causes any harm (as could have happened with CNET, whose AI-produced articles on finance contained advice they admitted was potentially harmful) should be held legally liable for it.
tinrobot
(11,804 posts)People claimed it wasn't 'real' music, musicians feared for their livelihoods, there were lots of lawsuits, etc, etc...
And yet, here we are. Sampling never went away. It became an accepted part of popular music. You probably even like a song or two that uses it.
And even with sampling, there's still demand for musicians. A talented musician can still make a decent living.
This technology will follow a similar arc. This technology by itself is not creative. It's not going to replace all other forms of art. It will just be one more medium in which to create.
Ms. Toad
(37,856 posts)Both are tools. You can accomplish great things with tools - as well as pretty crappy things.
The challenge with AI is not that it is a crutch, or that using it is lazy. A lot of innovation has been called lazy or a crutch until we got used to it. As to photography, there was (and still is, in some artistic circles) a lot of criticism of both digital cameras and the digital darkroom. For that matter, there was criticism that photography, itself, wasn't real art.
One challenge is that AI builds on the work of others, and most of it does not properly license those works. In other words, most AI tools currently steal from the original artists. If there are ways to limit AI to using properly licensed original content, it may well be a useful tool for legitimate artists.
A second challenge is that the work product is largely indistinguishable from original works, so it not only creates a substantial risk of theft from the original artists - it creates an additional risk that its product will be submitted by someone who asserts they created it. That creates a problem for traditional evaluation tools in academic endeavors.
But as long as the original artists are compensated for their work (and have the right to opt out of having their work used), whoever is creating the AI art will still have to sort the wheat from the chaff (as photographers who use camera phones do now), and it will either succeed or fail on its merits.
highplainsdem
(58,608 posts)no indication the people behind this software ever intended to.
Hence the lawsuits.
Which I hope will be followed by laws with serious penalties and punishment for violations.
Ms. Toad
(37,856 posts)You described it as inherently bac - specifically that it was bad because it was lazy art and people needed to learn the skills. (And suggested lawsuits were an appropriate way to solve the problem you described.)
Those are two very different issues. Your assertion was, at its core, that the use of automated tools to do the work which traditionally has been done manually is bad. It's not inherently bad - it's a choice, similar to the choice to use film or digital cameras; to use a physical darkroom v. a digital one.
The focus of any lawsuit wouldn't be on whether using newer tools is bad. It's on whether the manufacturer or user of the tools is complying with the law: did they properly license the underlying work? Are they claiming they wrote an essay when the rules of the relevant forum prohibit the use of AI tools in any submitted essay?
Lawsuits aren't the way to resolve whether using tools in creating art or writing is a good thing (or not).
Back when I started teaching I had similar concerns about using calculators in high school math/science classes. I used a slide rule when I was in high school. My graduation present was a 4-function calculator which cost around $100. So 4 years later, when I started teaching, I prohibited students from using calculators because I believed it made them lazy - and because relying on "magic" to do arithmetic, especially for those who struggled with arithmetic, was bad (and would lead to students who could not cope in the world because they believed whatever the calculator spit out was correct). (I was also concerned about the cost of the tools, which put them out of reach of students with fewer resources.)
I was wrong. As soon as calculators were reasonably priced, I asked the school to purchase calculators (to relieve any lingering concerns about equity of resources). I used them in my very, very basic math class (essentially: now that you know 1+1=2, what do you do with it?). I was required to have my students take the same final exam as every other class - where calculators were prohibited - so my students were not permitted to use calculators on the exam. I was terrified the first year that once my students were deprived of calculators they would bomb. What I discovered was that once they no longer had to struggle with the mechanics of performing arithmetic, they were able to understand how to use it appropriately.
The same may be true of AI, particularly in writing. A lot of people struggle with the mechanics of writing. AI could be used by creative instructors in the same way I used calculators - to relieve the mechanical barrier to writing so that students aren't struggling too much to understand how to use it. It can also be used to teach art appreciation/critique.
But the reluctance of those in the fields to accept a new tool is unrelated to whether the tool can be legally used (as currently constructed).
highplainsdem
(58,608 posts)music by ripping off human work.
And at this point I haven't seen anything to suggest that telling AI to write something for you in any way teaches someone to write.
I imagine your feelings about calculators would have been different, too, if schools had decided they didn't need nearly as many math teachers because people would just use calculators.
FWIW, I do know some people who can't do even very basic math without a calculator. They just didn't see a need for it. Just as people who think it's okay to have AI write or create music or images aren't likely to see any need to learn and ideally master skills like those of the writers, musicians and artists AI rips off. They'll become spectators and consumers of what AI produces for them. No more truly creative than people watching football games are truly athletes. But unlike football fans, who at least usually know which teams and players they're watching, they won't likely know who originally created the works AI ripped off. It doesn't supply that information and may never be able to.
I posted a thread a while back - another DUer tried to kick it the other day but it was too old - about a novelist who's been able to sell her work but recently decided she'd just have AI do much of the work for her. She's even played around with having AI copy the style of much more successful writers. Do you really think she's improving as a writer, learning anything about writing, by doing that?
Ms. Toad
(37,856 posts)It is relatively easily resolved by using a licensing program similar to the one used for music currently so that people whose work is used are compensated for it.
What troubles me about this discussion (broadly - not just these few posts) is the assertion that taking advantage of technological advances is inherently bad or somehow makes the results not "real."
AI is just a tool, in the same way the digital darkroom automates processes which used to be manual. Many photographers, including me, prefer making the changes by hand, using tools similar to those available in the physical darkroom. And - even before the automation of the processes in the electronic darkroom, many photographers preferred the physical darkroom - and believed work produced in the digital darkroom wasn't real art.
Any time there are technological innovations, a fair number of people reject the shortcuts it facilitates as somehow cheating. AI is just another tool.
As for how AI can be used to teach writing - there are a lot of teachers already implementing it. A lot of writing is formulaic - you learn the formula and make decisions as to what to plug into the formula. Many students struggle with matching their work to a formula. AI does formulaic well. I have not only taught math, but also writing. While I haven't used AI in teaching writing, I can easily see a role for it in having students critique AI writing, pick it apart, compare their own work to it, use it as a starting point for the structure and improve it, or fact-check it (AI lies with absolutely no conscience).
In other words, I'd use it exactly the same way as I did calculators - as a tool to teach them about writing.
As for being unable to do basic arithmetic without a calculator - there isn't anything inherently wrong with that. What people do need to be able to do is to problem solve (what arithmetic do I use) and estimate a general expected answer (so they can identify garbage-in/garbage-out answers). All of which I taught (and they learned) more easily when they had calculators as tools to assist them. When students haven't mastered basic arithmetic by the time they are in their 2nd - 5th year of high school (and no, that's not a typo), they aren't going to master it. Far better to teach them to estimate what their weekly paycheck is so they can ballpark what it should be, and if it isn't what is expected - to use a calculator to check it precisely.
I've been through this revolution, personally, with a number of technological advances in a number of fields (photography, writing, mathematics, computers), across many disciplines. With each advance, there is always a body of people who can't get past the equivalent of the "When I was your age I walked 5 miles to school, uphill both ways" attitude. There are others (and I'm generally in that category) who prefer more traditional tools. That doesn't make new tools inherently bad - it's simply a matter of preference.
highplainsdem
(58,608 posts)students watch AI fill in a formula.
And again, there's no practical way to have AI identify what it's plagiarizing, let alone find some way to compensate the writers, artists and musicians being ripped off. Its use now is unethical.
And unfortunately teachers using AI to "create" may lose sight of how unethical it is.
See this discussion thread on a music forum, the second reply there:
https://forums.stevehoffman.tv/threads/chatgpt.1166665/
I'm not going to quote that post but you should read it, about a group of teachers discussing students using ChatGPT to cheat. And one teacher of music theory liking song lyrics ChatGPT wrote for her so much that she's tempted to have it write more songs for her to "pass off" as hers.
The relevant words there are "pass off" because it would be fraud. It would be cheating. Because she wouldn't have created the songs. AI would have, from sources she wouldn't even be able to identify.
If a teacher finds it that tempting, how do you expect kids not to try to use it to cheat?
If you want your students to see how humans write, to understand how they create, ask local writers if they'll help. Or check to see if there are any such videos on YouTube, because there probably are. You can't learn writing, as a human, from AI. It can't tell you how it creates.
And btw, formulaic writing isn't something teachers should usually aim for.
Elessar Zappa
(16,374 posts)AI is here to stay, there's no avoiding it, and it will be beneficial in many ways and harmful in others. But we'll adapt, just like we did with previous technological changes.
highplainsdem
(58,608 posts)still serious penalties for exceeding those limits.
Plagiarism was made much easier thanks to copy-and-paste and the internet, but plagiarism is still penalized.
Liberals are horrified by the gun advocates who think people should be able to buy any gun available, carry it openly, and feel justified in using it if they feel there's any excuse for doing so.
I don't understand the attitude that because AI programs have been created, they can't or shouldn't be regulated, and their misuse punished as harshly as necessary to make that misuse rare and socially unacceptable.
Elessar Zappa
(16,374 posts)The law just has to catch up.
48656c6c6f20
(7,638 posts)It makes me Slightly Nettled Over being Better I guess.
speak easy
(12,435 posts)
Mossfern
(4,431 posts)AI itself would be the piece of art.
Nothing more is needed.
I don't think the creators were thinking of Dada.

Equivalent VIII, Tate Modern. The case is closed.
Mossfern
(4,431 posts)AI 'creations' are not conceptual art.
Conceptual art was/is to make some sort of statement.
Very often there was a written statement to complement the piece.
Voltaire2
(15,377 posts)And conceptual art need not 'make a statement'; it is simply the expression of a concept by the artist in its simplest form, typically written.
Mossfern
(4,431 posts)As I am a person who's basically a Luddite, can you give me an example of what a dalle statement for conceptual art would be?
The statement I was referring to in my post would be about the piece itself - otherwise a smudge on the wall is just a smudge on the wall. That's my concept of what conceptual art would be. Way back in the early '70s I displayed a conceptual piece; the piece itself required human presence, and the response was unexpected, but interesting.
I have absolutely no idea of the technicalities of AI created art, music, prose.
As I said before, I'm just an old fart suffering future shock way more than I ever thought I would.

"The End of The Bull Run" Dalle-2
An artist who cut down the Wall Street bull and took this picture would be prosecuted.
Recursion
(56,582 posts)Somewhere out there, the AI visual art version of Marley Marl is doing something mind-blowing, and in a decade or so we'll wonder how we got by without it.
Lancero
(3,244 posts)And today, we somehow consider photography an art form in spite of all those artists it helped put out of work.
The written word is, too, considered an art form - could there not be art found in spinning the words needed to guide the AI in creating whatever photo you're desiring?
Perhaps some artists are just angry that this is the death knell for their own deriding statements towards writers - you know the saying, "A picture's worth a thousand words"? Yeah. Writers only need a dozen or so now.
highplainsdem
(58,608 posts)babbled by any baby, or picked at random from a dictionary, and AI will try to make something of it.
Ask AI for, say, a dozen images of a child imagining monsters hiding under the bed, and it will provide them. Most likely by quickly plagiarizing illustrations it's been fed specifically or scraped off the internet.
And it might provide the same AI-generated images to someone else giving the same directions.
Does that make the people giving the directions artists? Of course not.
Lancero
(3,244 posts)You just don't understand the art.
Many people didn't understand how photography could be an art form. But we consider it such today. Press a button, boom, art.
For AI art, you'll have to constantly refine the directions you give it until it manages to put something out in line with whatever artistic vision guides you. Because that is what art is. The vision, not the method used to make that vision a reality.
Don't forget to get rid of your camera, don't want it to steal jobs from people who paint portraits and sunsets.
highplainsdem
(58,608 posts)And art is NOT simply a vision or concept. If it were, every human being would be an artist much of every day. Art is a vision PLUS skill PLUS work.
Using AI for art turns the skill and work part of that, plus much of the vision, over to software exploiting work done by human artists.
Some people using AI are already turning the basic concepts over to it. Asking it to generate entire plots and characters for novels based on a few guidelines, sometimes just the genre they want.
Do you call that creative writing?
Lancero
(3,244 posts)And doing so will take time and effort, which is work.
Vision, skill to guide the AI into creating that vision, and the time and effort to continue refining the words you're using to guide the AI.
By your own definition - Vision, plus skill, plus work - then yes... AI art IS art. Even if you don't understand the art.
highplainsdem
(58,608 posts)specifying every pixel, every line, every color and hue, you are not creating that image yourself.
That would be enormously time-consuming, and it would be a genuine use of AI as an artistic tool - perhaps by an artist physically unable to draw or paint because of an illness or injury - but it isn't what AI is being used for.
It's being used to draw on other people's work, to plagiarize others' creativity and work, without permission.
Which is why lawsuits have been filed. Like this one:
https://www.theverge.com/2023/1/17/23558516/ai-art-copyright-stable-diffusion-getty-images-lawsuit
Lancero
(3,244 posts)Still art, even if it's remixed. Lawsuits like these aren't meant to determine if something is art - they're meant to determine who gets the money should that art be monetized.
We already have a term for such works though - We call them derivative works.
But if you want to argue that something needs to hold a valid copyright to be considered legitimate art, well... Go right ahead? Certainly sucks that you're going to write off everything predating copyright laws as not legitimate art though.
Voltaire2
(15,377 posts)
highplainsdem
(58,608 posts)Not surprisingly, it has a lot of critics:
https://www.artdex.com/conceptual-art-difficult-understand/
Despite being dismissed as real art by many experts, conceptual artists claim that their art is based on the essence of art as an idea or concept, existing despite often the absence of a material presence. This angered many artists who felt that this then meant that anyone could deem themselves, artists. Throw a pile of rotting Autumn leaves in the gap between a burnt tire and give it a name and suddenly, it was art. Voila!
It appears that conceptual art rose as a form of rebellion by those who didn't want to be limited by the confines of contemporary art. Or perhaps, as some have suspected, conceptual art was intended to poke fun at the masses who were easily deceived when presented with a completely new concept that was so beyond description and understanding yet easily convinced it was art; reducing conceptualism to a social experiment.
However, conceptual art does have a rather strong following, particularly for those that feel that art, conceptual or not, is meant to be philosophical. Art needs not be beautiful, fall into a category, follow a technique or even have a name! Art is an open-ended question and meant to make you feel something and simply communicate with us. So whether conceptualism is met with admiration, disdain, or confusion (more often than not); it's done its job as proper art to render a reaction from its audience.
See the messy unmade bed surrounded by clutter in the photo with that article, and think how lucky we are to have millions of conceptual artists who won't make their beds or get rid of clutter. What an artistic treasure!
By the standards of conceptual art, a baby throwing up is offering a serious artistic comment on life, the universe, the parents, or maybe just the most recent meal. Attach a pretentious label and presto! it's art. By those standards, the insurrectionists decorating the Capitol with their feces were also artists commenting on politics.
Such a wonderful thing, conceptual art.
And think how wonderfully thought-provoking unused spiral notebooks are! Monuments of emptiness inviting creations! An open unused notebook displayed in an art gallery should be worth 7 figures, don't you think? Such an amazing concept...emptiness enticing creation.