Penny Arcade - Comic - Terminarter

Posts

  • Akira13 Registered User regular
    dennis wrote: »
    Let me know when it spontaneously mentions pedipalps.

    lol... I will say I guess it reflects well on Jerry that this is significantly above-average output, in terms of creative writing, for GPT. Getting it to write this way is effectively impossible if you just try to *ask* it to do so.

  • niallo Registered User regular
    Those Chat-GPT things are absolutely terrible. Utter garbage. Like a kid going 'Long words lol!'

  • Akira13 Registered User regular
    I disagree, TBH; these sound fairly close to something that Jerry would actually write, if you ask me. It's an intentional style he's established over the course of years, and a human would probably come up with something similar if asked to imitate him. GPT-3.5 (which powered ChatGPT when it got super popular for a bit) wasn't nearly as good. The pics I posted are GPT-4, though, which is DRASTICALLY better in every way.

  • niallo Registered User regular
    OK. All long words are the same to you?

  • Bremen Registered User regular
    I mean, it reads like someone trying to parody his style, but that in itself shows that the AI is at least able to imitate a good understanding of his style. Which is both impressive and a little creepy.

  • dennis aka bingley Registered User regular
    edited January 2024
    We all understand that the only direction for AI to go is to get better. Either in 5 years or 50 years, AI will be good enough that we won't even disagree that it can generate copy that reads exactly like Jerry wrote it. Add literature to the things that AI will "make easier" for us. Why would anyone buy a copy of the next Stephen King (okay, whoever is hot in 5-50 years), when they can get a dirt cheap (or maybe free) auto-generated one free of pesky expensive copyright payments to a popular author?

    Edit: Or even if you postulate a post-scarcity society where no one has to pay for anything, most people who find storytelling to be a calling will actually want to tell stories to people who will listen. That will be very hard, given that stories take time and effort to create while the AI just farts them out by the millions.

    Sorry, authors. You'll just have to retrain to... well, I'm sure there's SOMETHING left for you. Or you could just keep writing your stories, knowing that almost no one will ever read them or care.

    This is the future that I don't enjoy contemplating for humanity.

    dennis on
  • V1m Registered User regular
    dennis wrote: »
    Let me know when it spontaneously mentions pedipalps.

    And, of course, Spokane.

  • Moonlighter Registered User regular
    dennis wrote: »
    We all understand that the only direction for AI to go is to get better. Either in 5 years or 50 years, AI will be good enough that we won't even disagree that it can generate copy that reads exactly like Jerry wrote it. Add literature to the things that AI will "make easier" for us. Why would anyone buy a copy of the next Stephen King (okay, whoever is hot in 5-50 years), when they can get a dirt cheap (or maybe free) auto-generated one free of pesky expensive copyright payments to a popular author?

    Sorry, authors. You'll just have to retrain to... well, I'm sure there's SOMETHING left for you. Or you could just keep writing your stories, knowing that almost no one will ever read them or care.

    This is the future that I don't enjoy contemplating for humanity.

    I think this is a key point that has been lost a number of times in this discussion. Free art may seem cool, and it is, and I don't think it's the issue at hand. I'd wager that it's the fact that current AI are generating free art using other people's material without their permission. Maybe companies will hire artists to train their models, etc., as predicted, but that's secondary to permission in the present moment - as in the Stephen King and Jerry examples, AI arguably can't recreate these authors without using their stuff, and that goes back to "do they have permission to?" Again, the "they" in this case is companies like Google, etc., not the tools themselves. If you built an ethical AI model that only allowed samples from people with permission (e.g. the Darth Vader example mentioned), that seems like a less egregious road to the future dennis mentions (although it's probably still there). I agree with the poster who said that analogies fail because there's nothing like this.

    So while free art and a labor free society are logical extensions of AI tools and a ton of sci-fi literature, the issue of who has the right to train them with what is more salient to me.

    I can feel myself itching to use analogies. It's so tempting. Human language just does it so easily!

  • Overkillengine Registered User regular
    edited January 2024
    dennis wrote: »
    We all understand that the only direction for AI to go is to get better. Either in 5 years or 50 years, AI will be good enough that we won't even disagree that it can generate copy that reads exactly like Jerry wrote it. Add literature to the things that AI will "make easier" for us. Why would anyone buy a copy of the next Stephen King (okay, whoever is hot in 5-50 years), when they can get a dirt cheap (or maybe free) auto-generated one free of pesky expensive copyright payments to a popular author?

    Sorry, authors. You'll just have to retrain to... well, I'm sure there's SOMETHING left for you. Or you could just keep writing your stories, knowing that almost no one will ever read them or care.

    This is the future that I don't enjoy contemplating for humanity.

    I think this is a key point that has been lost a number of times in this discussion. Free art may seem cool, and it is, and I don't think it's the issue at hand. I'd wager that it's the fact that current AI are generating free art using other people's material without their permission. Maybe companies will hire artists to train their models, etc., as predicted, but that's secondary to permission in the present moment - as in the Stephen King and Jerry examples, AI arguably can't recreate these authors without using their stuff, and that goes back to "do they have permission to?" Again, the "they" in this case is companies like Google, etc., not the tools themselves. If you built an ethical AI model that only allowed samples from people with permission (e.g. the Darth Vader example mentioned), that seems like a less egregious road to the future dennis mentions (although it's probably still there). I agree with the poster who said that analogies fail because there's nothing like this.

    So while free art and a labor free society are logical extensions of AI tools and a ton of sci-fi literature, the issue of who has the right to train them with what is more salient to me.

    I can feel myself itching to use analogies. It's so tempting. Human language just does it so easily!

    Yeah as I had argued on another board on this same type of topic:
    A human being given permission to look at and thus potentially learn from art could be argued to be legally distinct from giving an automated tool permission to do so.

    Basically usage permissions need to be updated and clarified as technology advances since it is not reasonable to expect an artist to have perfect foreknowledge of such.


    And a good test is to ask the AI to recreate known works by an artist whose IP rights are suspected of being infringed upon.

    Going to be kind of hard to argue you didn't steal their work if it can reliably reproduce a specified artist's style. (And any potential end consumers are going to want that ability, so falsifying that becomes self defeating)

    Which will hopefully lead to permissions being obtained proactively, and auditable registries of them being maintained.
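
    Purely as an illustration of what an "auditable registry" could even look like in practice (nothing like this exists as a standard; every name, field, and licence term below is invented), each ingested work could be recorded with a content hash, the rights holder, and the terms they granted, so a third party can later audit what went into a model:

```python
# Hypothetical sketch of an auditable training-permission registry.
# Every name, field, and licence term here is invented for illustration.
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class PermissionRecord:
    content_sha256: str      # fingerprint of the exact work that was ingested
    rights_holder: str       # who granted permission
    licence: str             # terms under which training was allowed
    granted_on: str          # ISO date of the grant

def register_work(registry: list, data: bytes, rights_holder: str,
                  licence: str, granted_on: str) -> PermissionRecord:
    record = PermissionRecord(
        content_sha256=hashlib.sha256(data).hexdigest(),
        rights_holder=rights_holder,
        licence=licence,
        granted_on=granted_on,
    )
    registry.append(record)
    return record

def was_licensed(registry: list, data: bytes) -> bool:
    # An auditor can later check whether a given work was ever licensed for training.
    digest = hashlib.sha256(data).hexdigest()
    return any(r.content_sha256 == digest for r in registry)

registry: list = []
register_work(registry, b"<bytes of the licensed artwork>",
              rights_holder="Example Artist", licence="training-only, revocable",
              granted_on="2024-01-15")

print(json.dumps([asdict(r) for r in registry], indent=2))
print(was_licensed(registry, b"<bytes of the licensed artwork>"))   # True
print(was_licensed(registry, b"<some unlicensed scrape>"))          # False
```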

    Overkillengine on
  • Bremen Registered User regular
    edited January 2024
    dennis wrote: »
    We all understand that the only direction for AI to go is to get better. Either in 5 years or 50 years, AI will be good enough that we won't even disagree that it can generate copy that reads exactly like Jerry wrote it. Add literature to the things that AI will "make easier" for us. Why would anyone buy a copy of the next Stephen King (okay, whoever is hot in 5-50 years), when they can get a dirt cheap (or maybe free) auto-generated one free of pesky expensive copyright payments to a popular author?

    Edit: Or even if you postulate a post-scarcity society where no one has to pay for anything, most people who find storytelling to be a calling will actually want to tell stories to people who will listen. That will be very hard, given that stories take time and effort to create while the AI just farts them out by the millions.

    Sorry, authors. You'll just have to retrain to... well, I'm sure there's SOMETHING left for you. Or you could just keep writing your stories, knowing that almost no one will ever read them or care.

    This is the future that I don't enjoy contemplating for humanity.

    I'm not sure an AI will ever be able to create unique novels or art, at least not without becoming a true human-level AI capable of creativity and innovation, at which point I wouldn't draw much distinction between a book written by a human and a book written by said AI. Assuming sufficient improvements, we might reach the point where an AI could publish a coherent novel, but it would be pretty soulless.

    As for whether humans can find meaning producing novels in a sea of soulless, mass-produced mediocrity... honestly, I feel like we don't need AI for that. The thing about novels in the digital era is that one novel can be sold infinite times, so doubling the market size doubles the number of authors but not the market for unique books (because if everyone buys 10 books a year, they can all be buying the same 10 books). That's why we have stuff like Kindle Unlimited with a vast sea of very cheap books, many very generic and soulless. But the cream does still rise to the top, and some authors see a lot of success there because people like their books and word spreads.

    Bremen on
  • dennis aka bingley Registered User regular
    I'm surprised anyone in 2024 would be hanging an argument on "I'm not sure AI will ever be able to ___________". Maybe early 2023. :grin:

    While you're not entirely wrong about how much stuff is out there, I feel like you're talking about multiplying that by a million. I am able to find stuff in the sea of mediocrity now, but in this scenario it seems unlikely.

    Or, more likely, those other things won't be mediocre. They'll be just as good as one created by the person whose style it is aping.

  • lazegamer The magnanimous cyberspace Registered User regular
    edited January 2024
    dennis wrote: »
    We all understand that the only direction for AI to go is to get better. Either in 5 years or 50 years, AI will be good enough that we won't even disagree that it can generate copy that reads exactly like Jerry wrote it. Add literature to the things that AI will "make easier" for us. Why would anyone buy a copy of the next Stephen King (okay, whoever is hot in 5-50 years), when they can get a dirt cheap (or maybe free) auto-generated one free of pesky expensive copyright payments to a popular author?

    Sorry, authors. You'll just have to retrain to... well, I'm sure there's SOMETHING left for you. Or you could just keep writing your stories, knowing that almost no one will ever read them or care.

    This is the future that I don't enjoy contemplating for humanity.

    I think this is a key point that has been lost a number of times in this discussion. Free art may seem cool, and it is, and I don't think it's the issue at hand. I'd wager that it's the fact that current AI are generating free art using other people's material without their permission. Maybe companies will hire artists to train their models, etc., as predicted, but that's secondary to permission in the present moment - as in the Stephen King and Jerry examples, AI arguably can't recreate these authors without using their stuff, and that goes back to "do they have permission to?" Again, the "they" in this case is companies like Google, etc., not the tools themselves. If you built an ethical AI model that only allowed samples from people with permission (e.g. the Darth Vader example mentioned), that seems like a less egregious road to the future dennis mentions (although it's probably still there). I agree with the poster who said that analogies fail because there's nothing like this.

    So while free art and a labor free society are logical extensions of AI tools and a ton of sci-fi literature, the issue of who has the right to train them with what is more salient to me.

    I can feel myself itching to use analogies. It's so tempting. Human language just does it so easily!

    Yeah as I had argued on another board on this same type of topic:
    A human being given permission to look at and thus potentially learn from art could be argued to be legally distinct from giving an automated tool permission to do so.

    Basically usage permissions need to be updated and clarified as technology advances since it is not reasonable to expect an artist to have perfect foreknowledge of such.


    And a good test is to ask the AI to recreate known works by an artist whose IP rights are suspected of being infringed upon.

    Going to be kind of hard to argue you didn't steal their work if it can reliably reproduce a specified artist's style. (And any potential end consumers are going to want that ability, so falsifying that becomes self defeating)

    Which will hopefully lead to permissions being obtained proactively, and auditable registries of them being maintained.

    The question is whether creating embeddings from a non-public work and training a model using those embeddings is direct infringement. You can download every image on the internet without permission, shrink them down to a thumbnail and share them freely while providing some metadata about them. That's not infringement under fair use. The question is not whether non-public works were used, but whether or not the collection of weights and features is a derivative work, and if the use of those weights and features is transformative.
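
    For anyone who hasn't worked with this stuff, here's a rough sketch of what "embeddings and weights" mean in practice (the corpus, labels, and model below are made up, and real generative models use learned neural embeddings rather than TF-IDF): what a trained model stores is a matrix of learned numbers, not copies of the source texts, and whether those numbers amount to a derivative work is exactly the open question.

```python
# Minimal sketch: turning texts into feature vectors ("embeddings") and
# training a tiny model ("weights"). Purely illustrative -- the corpus,
# labels, and model are invented, and no real pipeline works exactly this way.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

corpus = [
    "The merchant of sighs arranged his wares with lapidary care.",     # "author A"-style sample
    "Click here for ten weird tricks to optimize your daily commute.",  # "author B"-style sample
    "His prose unspooled in baroque, recursive clauses.",               # "author A"
    "Top five gadgets reviewed, ranked, and rated for value.",          # "author B"
]
labels = ["A", "B", "A", "B"]

# "Embeddings": each text becomes a vector of numbers. The vectorizer's output
# holds term statistics, not the original strings.
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(corpus)

# "Weights": training distills those vectors into model coefficients.
clf = LogisticRegression().fit(X, labels)
print(clf.coef_.shape)   # a small matrix of floats -- the model's entire "memory"

# The legal question in the post: are X and clf.coef_ a derivative work of the
# corpus, and is their use transformative? The code can't answer that.
print(clf.predict(vectorizer.transform(["Ornate clauses coiled like ivy."])))
```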

    And while it's sometimes called stealing/theft/piracy ("you wouldn't download a car"), infringing a rightsholder's temporary monopoly on reproducing a work is not the same thing.

    lazegamer on
    I would download a car.
  • niallo Registered User regular
    The question of crime and theft and so on always comes back to the damage done. That's why, for example, attempted murder has a lesser punishment than a successful one.

    Nobody really believes IP piracy is the same as theft, because the original item is not taken and the owner not deprived of it. If it's a crime, it's not theft. It does less harm.

    The question is always how much damage does it do? Pirates downplay it, IP owners exaggerate it, and we try to work out what the truth is, e.g. wondering how many pirates would have ever bought the original.

    With iterative generation, LLMs and so on, the lawsuits are about the damage they do, because these new technologies threaten the entire industry of art and the livelihood of writers and artists.

    So the question should NOT be whether they are transformative. That's applying an existing legal model to an entirely new technology - an approach that has, very suddenly, become outdated.

    This comes back to what law is for. The law should exist to protect people, and change as needed. There are multiple vectors of change - precedents, legislation, interpretation and more. The underlying question should be - and usually is - 'will this new tech damage society and individuals massively'.

    Unfortunately, the legal and legislative industries are not honest, and obfuscate many of their fundamental processes. So what might be absolutely clear to a legal scholar - e.g. what law is, what its purpose is - is hidden from us normal people, and we fumble through discussions of 'fair use' without sufficient grounding in these powerful, important concepts.

    When I think back to my own education, and how so much of it focused on the Wars of the Roses rather than 'What's a law for?' or 'How do you avoid being really poor?', I get very, very angry.


  • Quid Definitely not a banana Registered User regular
    dennis wrote: »
    Sorry, authors. You'll just have to retrain to... well, I'm sure there's SOMETHING left for you. Or you could just keep writing your stories, knowing that almost no one will ever read them or care.

    There will always be people who want to read or listen to the stories from a person. We have entire industries centered around providing "authentic" goods and services despite cheaper, easier to obtain options existing. Something as deeply intrinsic to the human experience as storytelling won't cease to exist because computers can generate theoretical Dan Brown novels.

  • MichaelLC In what furnace was thy brain? Chicago Registered User regular
    Quid wrote: »
    dennis wrote: »
    Sorry, authors. You'll just have to retrain to... well, I'm sure there's SOMETHING left for you. Or you could just keep writing your stories, knowing that almost no one will ever read them or care.

    There will always be people who want to read or listen to the stories from a person. We have entire industries centered around providing "authentic" goods and services despite cheaper, easier to obtain options existing. Something as deeply intrinsic to the human experience as storytelling won't cease to exist because computers can generate theoretical Dan Brown novels.

    Sure in twenty years I'll be able to buy a novel or painting off Etsy, but that will be nothing compared to every company using AI to not pay artists of all types. Need a logo? AI. Need a script? AI. Need a movie? AI. Need a package design? AI

    There's no convincing anyone who doesn't already see that copying a human's work and reproducing it for your own profit - and none for the artist - is a bad thing.

  • dennis aka bingley Registered User regular
    Quid wrote: »
    dennis wrote: »
    Sorry, authors. You'll just have to retrain to... well, I'm sure there's SOMETHING left for you. Or you could just keep writing your stories, knowing that almost no one will ever read them or care.

    There will always be people who want to read or listen to the stories from a person. We have entire industries centered around providing "authentic" goods and services despite cheaper, easier to obtain options existing. Something as deeply intrinsic to the human experience as storytelling won't cease to exist because computers can generate theoretical Dan Brown novels.

    There might always be people, but a) will there be enough and b) will you ever be able to get your name in front of their eyes.

    The vast majority of all artists already struggle to make a living from their art.

  • palidine40 Registered User regular
    IntotheSky wrote: »
    But the people at fault are the end users of these generative AI tools, not their creators. We don't place liability on Photoshop when people use it to create and distribute infringing works; why would we blame the AI tools and the developers, especially when actively prompting them to do so is the only way to reliably create infringing work using them.

    Blaming just the buyer doesn't work as a solution, not long term, not under repeated assault.

    Drug dealers, Prostitution, Guns, Theft, Alien Employment

    Addiction treatment centers help with drugs/"other" addictions. Equalization of felt rights and actual social status (power coupon access) can help against gun violence and theft. Alien employment is complicated and a hot-button issue requiring multi-layered solutions from both ends.

    These ways of breaking the law all have something in common with how these AI tools harm artists and industry, though there'd be too much text to line up the similarities here. There's no simple solution to any one of them.

    Regulating them as a whole, from the top to the bottom, would likely be a better solution. But I am going to hazard a guess and say you would blame the lab cooks, the cartel leaders, the pimps, the street dealers... so why is that different from the devs and the companies, to you?

    I don't think you're identifying the result as a direct harm. That's likely not out of personal malice - maybe it's not a personally impactful enough event to you, maybe you haven't seen some perspectives, maybe you haven't felt the pain or loss yourself, maybe you haven't done a cost analysis of the impacts on an individual, maybe you have a different valuation of the people/things in the equation than I do.

  • Quid Definitely not a banana Registered User regular
    dennis wrote: »
    Quid wrote: »
    dennis wrote: »
    Sorry, authors. You'll just have to retrain to... well, I'm sure there's SOMETHING left for you. Or you could just keep writing your stories, knowing that almost no one will ever read them or care.

    There will always be people who want to read or listen to the stories from a person. We have entire industries centered around providing "authentic" goods and services despite cheaper, easier to obtain options existing. Something as deeply intrinsic to the human experience as storytelling won't cease to exist because computers can generate theoretical Dan Brown novels.

    There might always be people, but a) will there be enough and b) will you ever be able to get your name in front of their eyes.

    The vast majority of all artists already struggle to make a living from their art.

    I'm responding to the scenario you proposed. Why would anyone be entitled to others reading their work? What even is "enough" people in this post scarcity world?

  • dennis aka bingley Registered User regular
    Quid wrote: »
    dennis wrote: »
    Quid wrote: »
    dennis wrote: »
    Sorry, authors. You'll just have to retrain to... well, I'm sure there's SOMETHING left for you. Or you could just keep writing your stories, knowing that almost no one will ever read them or care.

    There will always be people who want to read or listen to the stories from a person. We have entire industries centered around providing "authentic" goods and services despite cheaper, easier to obtain options existing. Something as deeply intrinsic to the human experience as storytelling won't cease to exist because computers can generate theoretical Dan Brown novels.

    There might always be people, but a) will there be enough and b) will you ever be able to get your name in front of their eyes.

    The vast majority of all artists already struggle to make a living from their art.

    I'm responding to the scenario you proposed. Why would anyone be entitled to others reading their work? What even is "enough" people in this post scarcity world?

    "Entitled" isn't really a word I would use for that. Unless you consider almost every artist currently alive to be "entitled", because they tend to want people to see their work. So that out of the way, why do they want others to read their work? Because they think their work is worth reading.

    As for what is "enough", I'd say you'll have to use common sense and judgment on that one. It's not really any different from how it works in our current, not-yet-post-scarcity world.

  • Quid Definitely not a banana Registered User regular
    edited January 2024
    dennis wrote: »
    Quid wrote: »
    dennis wrote: »
    Quid wrote: »
    dennis wrote: »
    Sorry, authors. You'll just have to retrain to... well, I'm sure there's SOMETHING left for you. Or you could just keep writing your stories, knowing that almost no one will ever read them or care.

    There will always be people who want to read or listen to the stories from a person. We have entire industries centered around providing "authentic" goods and services despite cheaper, easier to obtain options existing. Something as deeply intrinsic to the human experience as storytelling won't cease to exist because computers can generate theoretical Dan Brown novels.

    There might always be people, but a) will there be enough and b) will you ever be able to get your name in front of their eyes.

    The vast majority of all artists already struggle to make a living from their art.

    I'm responding to the scenario you proposed. Why would anyone be entitled to others reading their work? What even is "enough" people in this post scarcity world?

    "Entitled" isn't really a word I would use for that. Unless you consider almost every artist currently alive to be "entitled", because they tend to want people to see their work. So that out of the way, why do they want others to read their work? Because they think their work is worth reading.

    As for what is "enough", I'd say you'll have to use common sense and judgment on that one. It's not really any different from in our pre-scarcity world.

    Every artist I know wants to be paid a living wage so they can do whatever they want. Every artist, every human, is entitled to having their needs met. An audience isn't a need.

    Common sense is a terrible metric when deciding what other people have to do for a single individual. If an artist can't attract an audience then they don't get to have one. Just like everyone else.

    Quid on
  • dennis aka bingley Registered User regular
    edited January 2024
    And with this type of outlook, we lose human-made art, except for as a rare sideshow.

    dennis on
  • Eat it You Nasty Pig. tell homeland security 'we are the bomb' Registered User regular
    the point is that this apparently terrifying future where most human-created art is not commercially valuable and passes unseen and un-remarked-upon is already here, and has been for quite some time. It doesn't seem to have impacted humans' desire to make the art in the first place. No matter how good the current language/diffusion model AIs get, all they can ultimately do is imitate; they can't produce anything really novel and will always need a stream of new inputs to remain 'current.'

    this thread has made great hay out of 'write me a post in the style of tycho brahe,' but that only works if 1) "tycho brahe" has produced enough work for the algorithm to munch and 2) it has disseminated widely enough that Joe User even knows to ask for it. Leaving aside how the author may personally feel about that type of thing, it can only even happen if they're already known/successful.

    hold your head high soldier, it ain't over yet
    that's why we call it the struggle, you're supposed to sweat
  • Quid Definitely not a banana Registered User regular
    edited January 2024
    dennis wrote: »
    And with this type of outlook, we lose human-made art, except for as a rare sideshow.

    I don't think so. The vast, vast majority of us create just for ourselves and no one else. Lots of art is completely unprofitable and/or ephemeral, existing in the moment before disappearing altogether, and we continue to do it.

    I'm confident humanity will continue to create. It's instinctual and we enjoy it. Skinner boxes can't replace that joy.

    Quid on
  • Lttlefoot Registered User regular
    I'm aware of my own irony here [image]

  • MichaelLC In what furnace was thy brain? Chicago Registered User regular
    edited January 2024
    Quid wrote: »
    dennis wrote: »
    And with this type of outlook, we lose human-made art, except for as a rare sideshow.

    I don't think so. The vast, vast majority of us create just for ourselves and no one else. Lots of art is completely unprofitable and/or ephemeral, existing in the moment before disappearing altogether, and we continue to do it.

    I'm confident humanity will continue to create. It's instinctual and we enjoy it. Skinner boxes can't replace that joy.

    I honestly can't tell if you're being dishonest in your statement or really don't understand the problem.

    Sorry to yell, but IT'S THE ARTISTS WHO CREATE ART AS THEIR JOB/FOR PROFIT THAT AI IS DESTROYING.

    ARTISTS MAKE COMPANY LOGOS, GAMES, PICTURES, PRODUCT PHOTOS AND VIDEOS AS THEIR JOB. THIS IS THE WORK THAT AI, WHICH STEALS THEIR WORK AND SELLS IT FOR PENNIES, IS RUINING.

    NOT SOMEONE MAKING ART FOR FUN.

    Maybe you were just stating that artists will continue to create, which is true. But in regards to "AI", that is not the market being replaced.

    MichaelLC on
  • Quid Definitely not a banana Registered User regular
    The proposition was a post-scarcity society where wages wouldn't matter. I'm not going to talk about something that isn't that. It always results in confusing different people's statements.

  • giraffe chauffek Registered User new member
    I lost the comic address for "whatever you call it. Dope or whatever you call it." I think it's a gabe only comic.

  • niallo Registered User regular
    A post-scarcity society is as relevant to this discussion of employment, IP rights, the fundamentals of what art actually is (e.g. communication between people), and the capacities of generative 'AI', as Narnia is.

    Art would be fine in a post-scarcity society, because Mr Tumnus is awesome.

  • Lttlefoot Registered User regular
    I just read about a program called Glaze that you can run your artwork through before putting it online; the result looks the same to humans, but different to AI.
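
    To be clear, what follows isn't Glaze's actual algorithm (the real tool, from a University of Chicago research team, is considerably more sophisticated), but the basic "looks the same to humans, different to a model" idea can be sketched as a tiny adversarial perturbation: nudge the pixels within a small budget so a feature extractor's output moves as far as possible. The extractor below is a random, untrained stand-in, purely for illustration.

```python
# Toy sketch of a "perceptually small, feature-space large" perturbation.
# NOT Glaze's algorithm; the feature extractor is a random untrained CNN
# used only to illustrate the mechanism.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for a model's image encoder (assumption: any differentiable extractor).
feature_extractor = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
for p in feature_extractor.parameters():
    p.requires_grad_(False)

artwork = torch.rand(1, 3, 64, 64)   # stand-in for the artist's image
epsilon = 4 / 255                    # max per-pixel change: roughly invisible to humans
delta = torch.zeros_like(artwork, requires_grad=True)

original_features = feature_extractor(artwork)

for _ in range(50):
    cloaked = (artwork + delta).clamp(0, 1)
    # Push the cloaked image's features away from the original's.
    loss = -torch.norm(feature_extractor(cloaked) - original_features)
    loss.backward()
    with torch.no_grad():
        delta -= 0.5 / 255 * delta.grad.sign()   # gradient step
        delta.clamp_(-epsilon, epsilon)          # keep the change tiny in pixel space
    delta.grad.zero_()

print("max pixel change:", delta.abs().max().item())
print("feature distance:", torch.norm(
    feature_extractor((artwork + delta).clamp(0, 1)) - original_features).item())
```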
