
Death of the Artist [AI-Generated "Art"]


Posts

  • The Lovely Bastard Registered User regular
    if you motherfuckers give pooro a bigger fucking ego

  • The Zombie Penguin Eternal Hungry Corpse Registered User regular
    if you motherfuckers give pooro a bigger fucking ego

    It's for science, okay? Dont question the science.

    Ideas hate it when you anthropomorphize them
    Steam: https://steamcommunity.com/id/TheZombiePenguin
    Stream: https://www.twitch.tv/thezombiepenguin/
    Switch: 0293 6817 9891
  • Inquisitor Registered User regular
    if you motherfuckers give pooro a bigger fucking ego

    You know what they say about Pooro?

    He’s no Chico.

  • Evil Multifarious Registered User regular
    Inquisitor wrote: »
    Nobeard wrote: »
    Inquisitor wrote: »
There are some real characters in the AI space. I don’t know if this is true of the art fair fellow, but there are definitely people in the AI space that basically think “as the single-cell organism led to the multi-cell organism, the true purpose of mankind is to give birth to strong AI and retire from the scene”

    Which is, you know, totally bonkers to me. And also the weak AI we have currently is such a far cry from anything approaching a strong AI that the idea is currently laughable, but there are some real doomsday cult folks out and about.

    This kind of singularity cult nonsense is very frustrating because it generally comes from people who have no grounding in philosophy, but also actively look at philosophy with contempt, including philosophy of technology and philosophy of science. So they make major conceptual and procedural errors when engaging in these thought experiments about future AIs and simulations and ethical imperatives, and then they go on to work at or build companies that operate on these principles and lobby politicians with tech money

    Have you ever had a conversation with someone who thinks Roko's Basilisk is real? It's absolutely surreal. There are people cutting deals with politicians who believe in this garbage.

    What’s Roko’s Basilisk?

    I want to expand on previous answers:

    Roko's Basilisk posits a future AI of immense power and intelligence. Specifically, it is capable of fully simulating the brain and body of a human being, including a full virtual consciousness.

    It posits that such a simulated person would have absolutely no way of knowing that they are a simulation.

    It posits that an AI of such power would be motivated to punish the people who refused, in the past, to help it come into being, perhaps out of sheer malevolence.

    It would therefore simulate all human beings, based on reconstruction from copious records and data, and run them through their lives to see if they ever had the idea that such an AI could exist and refused to help it. If they so refused, the AI would consign them to eternal simulated torture.

    This then means that KNOWING about the Basilisk is dangerous — because you don't have any way of knowing if you're one of those test simulations, and if you are, even hearing the concept would doom you unless you immediately worked toward the Basilisk's creation. It's a cognitohazard that condemns you to technoperdition.

Relatedly, the other possible motivation for this convoluted torture is that the Basilisk knows all this and needs to enforce, retroactively, the mechanism that traps humans who understand it into working toward its genesis. In fact, a benevolent AI would do the same to hasten its own existence, because it will save more lives the earlier it exists.

    Obviously this is tremendously stupid and confused, down to the causal, temporal level, or else I have doomed anyone who read this to cyberdamnation.

Why… do I care if it creates a simulation based on me that it tortures forever? I feel like I am missing a step here.

    Because you might be the simulation, not the original (and also you have a moral duty to safeguard your simulation from damnation, since you're the only one who can, and they are a real, conscious being, just a virtual one)

  • Styrofoam Sammich WANT. normal (not weird) Registered User regular
    Heffling wrote: »
So, just to follow up on removing barriers to artists creating, here's stuff that would make creating more accessible for me as a disabled artist:

    Money. (Art supplies)
    Better access to mental health resources so I can sort out the brain crap that gets in the way
    An Easel (I can draw with my right hand sitting down, but I learned to draw from my shoulder with my left arm, so it's hard to do sitting down)
    A house I actually own instead of renting, so i can have a dedicated art room to put the easel in
    Money (therapy)
Money (batteries for my TENS kit to manage my chronic pain)
    Money (various books, non fiction and fiction to build a reference and inspiration library)
    Money (so I don't have to get a job and can spend my limited energy and resources on creating instead of surviving)

    Things that won't help me make art:

Stealing other artists' work! Which, to be clear, is what AI art would be doing.

These days most of my artistic output is done via writing because there are way fewer barriers to writing for me, and I'm lucky enough to be someone who can both write and draw. (I should actually make a thread here in the writing forum for my messages in a bottle series)

    I am really surprised that the apparent need for a UBI isn't a popular concept here.

No one here takes "we just need UBI and it'll be fine" seriously when it's offered by people whose politics don't differ from the Democratic Party's in any meaningful way.

  • Inquisitor Registered User regular
    Inquisitor wrote: »
    Nobeard wrote: »
    Inquisitor wrote: »
There are some real characters in the AI space. I don’t know if this is true of the art fair fellow, but there are definitely people in the AI space that basically think “as the single-cell organism led to the multi-cell organism, the true purpose of mankind is to give birth to strong AI and retire from the scene”

    Which is, you know, totally bonkers to me. And also the weak AI we have currently is such a far cry from anything approaching a strong AI that the idea is currently laughable, but there are some real doomsday cult folks out and about.

    This kind of singularity cult nonsense is very frustrating because it generally comes from people who have no grounding in philosophy, but also actively look at philosophy with contempt, including philosophy of technology and philosophy of science. So they make major conceptual and procedural errors when engaging in these thought experiments about future AIs and simulations and ethical imperatives, and then they go on to work at or build companies that operate on these principles and lobby politicians with tech money

    Have you ever had a conversation with someone who thinks Roko's Basilisk is real? It's absolutely surreal. There are people cutting deals with politicians who believe in this garbage.

    What’s Roko’s Basilisk?

    I want to expand on previous answers:

    Roko's Basilisk posits a future AI of immense power and intelligence. Specifically, it is capable of fully simulating the brain and body of a human being, including a full virtual consciousness.

    It posits that such a simulated person would have absolutely no way of knowing that they are a simulation.

    It posits that an AI of such power would be motivated to punish the people who refused, in the past, to help it come into being, perhaps out of sheer malevolence.

    It would therefore simulate all human beings, based on reconstruction from copious records and data, and run them through their lives to see if they ever had the idea that such an AI could exist and refused to help it. If they so refused, the AI would consign them to eternal simulated torture.

    This then means that KNOWING about the Basilisk is dangerous — because you don't have any way of knowing if you're one of those test simulations, and if you are, even hearing the concept would doom you unless you immediately worked toward the Basilisk's creation. It's a cognitohazard that condemns you to technoperdition.

Relatedly, the other possible motivation for this convoluted torture is that the Basilisk knows all this and needs to enforce, retroactively, the mechanism that traps humans who understand it into working toward its genesis. In fact, a benevolent AI would do the same to hasten its own existence, because it will save more lives the earlier it exists.

    Obviously this is tremendously stupid and confused, down to the causal, temporal level, or else I have doomed anyone who read this to cyberdamnation.

Why… do I care if it creates a simulation based on me that it tortures forever? I feel like I am missing a step here.

    Because you might be the simulation, not the original (and also you have a moral duty to safeguard your simulation from damnation, since you're the only one who can, and they are a real, conscious being, just a virtual one)

    I don’t think I am stoned enough for this to make sense.

My simulation isn’t me, I am not being punished in any way currently, there is no damnation.

Also, they aren’t a real being, they are explicitly a simulation by an AI.

  • The Zombie Penguin Eternal Hungry Corpse Registered User regular
    edited September 2022
    Heffling wrote: »
    I am really surprised that the apparent need for a UBI isn't a popular concept here.

Have you read any of my posts, dude? At all? I'm super fucking on board with a UBI. I've also spent time, repeatedly, going over why just allowing AI art to happen without pushback is going to harm the chances of a UBI. And again, WE STILL DON'T HAVE UBI. Which is quite the sticking point for "a UBI will solve things!"

Seriously, to be 100% clear: I am unemployed. I live on what's called the Supported Living Benefit here in NZ, due to my chronic health issues. This means i get around 305 USD a week to live on, of which 130 USD is eaten by my rent immediately. I cannot, currently, afford the mental health therapy I badly need so i can fix the shit that keeps me from being able to hold down a job. I have one more free session, my psych is generously giving me a couple more gratis, and then i'm going to have to find a new psych and figure out how the fuck i pay for that. I have less than 180 NZD in my savings right now, and live week to week. I have lost most of my bottom molars and am currently being fitted for dentures, because a lifetime of chronic depression and breathing issues destroyed my teeth and root canals were not funded, so i had to get them removed.

And my situation is fucking cushy compared to beneficiaries on the "Jobseeker" benefit, which is our main one (it used to be the unemployment benefit and the sickness benefit as separate things, but a previous right-wing government in their infinite poor-hating wisdom smushed them together and renamed it). I have family who actually support me, including financially (though this means i've not come out to them about being Queer, Poly and Non-binary for fear of losing that support). I have a neighbor who lets me walk their dog and pays me under the table for that, which gives me a bit of extra money. I split food bills with my partner (who i can't legally acknowledge as my partner, otherwise my benefit would be slashed), which helps a ton even as the cost of living crisis goes nuts.

and with all of this, for a beneficiary? my situation is pretty okay! and it still fucking sucks and is horrible, degrading and generally soul-destroying. For most beneficiaries? It is vastly worse.

Words cannot convey how much i want a UBI (amongst other things, like rent controls and a land tax so there aren't shit gibbons who own 50 fucking properties out there). It would, properly implemented, be transformative for my life. It would be beyond transformative for all the people who are in worse situations than me. I want this immediately.

    But wanting is not the same as having.

  • Zonugal (He/Him) The Holiday Armadillo I'm Santa's representative for all the southern states. And Mexico! Registered User regular
    As president, I’ll establish a universal basic income program for Pell Grant recipients who start a business that operates for three years in disadvantaged communities

  • ChicoBlue Registered User regular
    Have you all read Permutation City?

    It's about creating digital copies of people so they can live on after death.

Only, it takes A LOT of processing power and there are bottlenecks, so the people living in these simulations are running 6 times slower than real-time.

    IF they're rich.

    If you're not rich, but can still afford a digital clone, then you live at an even slower rate.

    Also, it's a 90s cyberpunk book that takes into account global warming in its world building.

    Also, there's a guy in it who sometimes has testicular spasms so it really hurts when he cums.

  • Cello Registered User regular
    Zonugal wrote: »
    As president, I’ll establish a universal basic income program for Pell Grant recipients who start a business that operates for three years in disadvantaged communities

    Jimmy are you even allowed to run for president

    Steam
    3DS Friend Code: 0216-0898-6512
    Switch Friend Code: SW-7437-1538-7786
  • Inquisitor Registered User regular
    ChicoBlue wrote: »
    Have you all read Permutation City?

    It's about creating digital copies of people so they can live on after death.

Only, it takes A LOT of processing power and there are bottlenecks, so the people living in these simulations are running 6 times slower than real-time.

    IF they're rich.

    If you're not rich, but can still afford a digital clone, then you live at an even slower rate.

    Also, it's a 90s cyberpunk book that takes into account global warming in its world building.

    Also, there's a guy in it who sometimes has testicular spasms so it really hurts when he cums.

I think a lot of SciFi never really handles the fact that if you make a digital copy of yourself, the original you is still dead and the digital copy is a separate entity that you have no link to.

  • Heffling No Pic Ever Registered User regular
    Heffling wrote: »
    I am really surprised that the apparent need for a UBI isn't a popular concept here.

Have you read any of my posts, dude? At all? I'm super fucking on board with a UBI. I've also spent time, repeatedly, going over why just allowing AI art to happen without pushback is going to harm the chances of a UBI. And again, WE STILL DON'T HAVE UBI. Which is quite the sticking point for "a UBI will solve things!"

    My bad. But we can't say we can't have something because we don't currently have something. It's a goal we have to work towards.
    Zonugal wrote: »
    Heffling wrote: »
    I am really surprised that the apparent need for a UBI isn't a popular concept here.

    It isn't that we don't support UBI.

We simply acknowledge that it's being presented in these discussions as an unrealistic cure-all, in an attempt to allow for AI-generated art.

It's incredibly insufferable because it comes across as saying, "this societal issue won't be a problem when we bring together all seven of the Dragon Balls."

That's a fair criticism. My response is that a prophecy that you will fail at a task is pretty self-fulfilling. And, at least in my experience (I haven't seen a citation to the contrary), all lesser efforts undertaken in the US have failed. There are valid models we could base our societal changes on, like Denmark's tiered social security system.

    But I'll tamp down on beating this drum.

  • Zonugal (He/Him) The Holiday Armadillo I'm Santa's representative for all the southern states. And Mexico! Registered User regular
    Cello wrote: »
    Zonugal wrote: »
    As president, I’ll establish a universal basic income program for Pell Grant recipients who start a business that operates for three years in disadvantaged communities

    Jimmy are you even allowed to run for president

    The LA Times would seem to say that I can!!

  • Dex Dynamo Registered User regular
    Heffling wrote: »
If we had preferably a UBI, but at least a GI bill for citizens without requiring military or governmental service, then artists would have other avenues to retrain if displaced by AI art, whether into a different field of art or into something completely unrelated to their previous works.

Other people have more eloquently touched on the UBI aspect, but this, this sucks shit to me as someone who works as an artist (albeit not in a field currently facing automation)

    Like, "spending your days doing the art you love doing" is the whole ballgame, to me--that's what all the other grind is for, that's the endgame, I put up with all of the other bullshit so this thing I love, I can do that thing, and express myself and connect with people and all that other crap, I don't want to "retrain in a new field completely unrelated to my previous works"

And I know this is in no way unique to artists, and there are people who truly loved what they did who faced automation, but my gut reaction is to read that as a worst-case scenario, not a happy ending

  • ChicoBlue Registered User regular
    Inquisitor wrote: »
    ChicoBlue wrote: »
    Have you all read Permutation City?

    It's about creating digital copies of people so they can live on after death.

Only, it takes A LOT of processing power and there are bottlenecks, so the people living in these simulations are running 6 times slower than real-time.

    IF they're rich.

    If you're not rich, but can still afford a digital clone, then you live at an even slower rate.

    Also, it's a 90s cyberpunk book that takes into account global warming in its world building.

    Also, there's a guy in it who sometimes has testicular spasms so it really hurts when he cums.

I think a lot of SciFi never really handles the fact that if you make a digital copy of yourself, the original you is still dead and the digital copy is a separate entity that you have no link to.

    It's been a while since I read it, but I think there's a bit in it where one of the characters is thinking about this with relation to her mother who has a terminal illness.

    Also, one of the characters is a CEO digital construct and his board are like, "Why would we listen to this guy? Just tell him that things are going good and we'll do our own thing."

  • Paladin Registered User regular
    I think one solution is that we are defined by the lives of others, so as long as your copy carries your legacy for those that depend on you, your meaning in the world lives on or some crap like that

    Marty: The future, it's where you're going?
Doc: That's right, twenty five years into the future. I've always dreamed of seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five World Series.
  • Maddoc I'm Bobbin Threadbare, are you my mother? Registered User regular
    I'm pretty sure doppelganger rules apply to digital mind copies,

    in that they will be overcome with the impulse to murder the original so that they can become real

PSN - Masked Unit | FFXIV - Laitarne Gilgamesh
  • ChicoBlue Registered User regular
    If I was a digital copy, I would simply reprogram my brain so that I am extremely content and fulfilled with creating endless chairs in my own little digital wood workshop for a set period of time.

    Like in the book Permutation City!

  • Heffling No Pic Ever Registered User regular
    Heffling wrote: »
So, just to follow up on removing barriers to artists creating, here's stuff that would make creating more accessible for me as a disabled artist:

    Money. (Art supplies)
    Better access to mental health resources so I can sort out the brain crap that gets in the way
    An Easel (I can draw with my right hand sitting down, but I learned to draw from my shoulder with my left arm, so it's hard to do sitting down)
    A house I actually own instead of renting, so i can have a dedicated art room to put the easel in
    Money (therapy)
Money (batteries for my TENS kit to manage my chronic pain)
    Money (various books, non fiction and fiction to build a reference and inspiration library)
    Money (so I don't have to get a job and can spend my limited energy and resources on creating instead of surviving)

    Things that won't help me make art:

Stealing other artists' work! Which, to be clear, is what AI art would be doing.

These days most of my artistic output is done via writing because there are way fewer barriers to writing for me, and I'm lucky enough to be someone who can both write and draw. (I should actually make a thread here in the writing forum for my messages in a bottle series)

    I am really surprised that the apparent need for a UBI isn't a popular concept here.

No one here takes "we just need UBI and it'll be fine" seriously when it's offered by people whose politics don't differ from the Democratic Party's in any meaningful way.

    No Styro, that's not it at all. You disagree with me on some things and you don't like the Democratic Party, so you lump me in with them as something that you also don't like. My personal beliefs are much, much more progressive than the Democratic Platform on practically any topic that comes to mind.

But you don't care about my position, or any nuances to the conversation at all. You just want to drop one to three sentences into a thread that use vague words to appear to convey a deep meaning without often saying anything at all, while garnering agrees and awesomes from like-minded individuals. And when you get called on it, you start shifting goalposts or claiming that others misinterpreted you. Because at your core you are unable to admit that you're wrong.

You called me tedious earlier, and if I'm being tedious, at least I'm in like-minded company.

  • Heffling No Pic Ever Registered User regular
    Inquisitor wrote: »
There are some real characters in the AI space. I don’t know if this is true of the art fair fellow, but there are definitely people in the AI space that basically think “as the single-cell organism led to the multi-cell organism, the true purpose of mankind is to give birth to strong AI and retire from the scene”

    Which is, you know, totally bonkers to me. And also the weak AI we have currently is such a far cry from anything approaching a strong AI that the idea is currently laughable, but there are some real doomsday cult folks out and about.

    I truly believe that those people should be hounded out of polite society the way we (should) do fascists

    If you don't care about communities, get the fuck out of 'em

    Yes, cliquishness and exclusion, truly the signs of polite society.

  • Magell Detroit Machine Guns Fort Myers Registered User regular
    Heffling wrote: »
    Inquisitor wrote: »
There are some real characters in the AI space. I don’t know if this is true of the art fair fellow, but there are definitely people in the AI space that basically think “as the single-cell organism led to the multi-cell organism, the true purpose of mankind is to give birth to strong AI and retire from the scene”

    Which is, you know, totally bonkers to me. And also the weak AI we have currently is such a far cry from anything approaching a strong AI that the idea is currently laughable, but there are some real doomsday cult folks out and about.

    I truly believe that those people should be hounded out of polite society the way we (should) do fascists

    If you don't care about communities, get the fuck out of 'em

    Yes, cliquishness and exclusion, truly the signs of polite society.

    Why would I want to hang out with someone who doesn't care about humanity and just wants to perfect AI to rule over the rubble of civilization?

  • Poorochondriac Ah, man Ah, jeez Registered User regular
    edited September 2022
    Heffling wrote: »
    Inquisitor wrote: »
There are some real characters in the AI space. I don’t know if this is true of the art fair fellow, but there are definitely people in the AI space that basically think “as the single-cell organism led to the multi-cell organism, the true purpose of mankind is to give birth to strong AI and retire from the scene”

    Which is, you know, totally bonkers to me. And also the weak AI we have currently is such a far cry from anything approaching a strong AI that the idea is currently laughable, but there are some real doomsday cult folks out and about.

    I truly believe that those people should be hounded out of polite society the way we (should) do fascists

    If you don't care about communities, get the fuck out of 'em

    Yes, cliquishness and exclusion, truly the signs of polite society.

    To be very, very, very clear:
    Inquisitor described a person that views humans as amoebas. As creatures not worth worrying about, and worthy of destruction in pursuit of evolution. That kind of person.

    I said that that kind of person? Who doesn't care about humanity? Should not get to be around humanity.

    If you think this is an unreasonable position, I truly and genuinely do not give a shit.

    Edit, later and more charitably: If your read on that post of mine was that I was talking about anybody in this thread, that's not what I was (or am) doing. A person was described who thinks human extinction is not only acceptable but aspirational. That person should not get to be around people.

  • Evil Multifarious Registered User regular
Arguing that using an AI to generate art is an assault on the ineffable qualities of humanity, as Poorochondriac did, is a much broader, more aggressive position than "AI should only be able to train on art with consent". These are two extremely different cases: one is that AI art is bad for society and even imperils the virtues that make us human; the other is that scraping art without consent is bad even if the art is only used as part of a vast aggregate database.

    I think more people are receptive to the latter, smaller-scale argument. But I also think the smaller-scale argument, if it won the day and changed how AI art trains the machine, would do little or nothing to protect artists from the economic impact of automation.

    My opinion is indeed broad and aggressive - I find AI Art generators inherently disrespectful to artists. Even the (I think worth reminding, currently-imaginary) open-source-art-only version. I think "a machine could do your job" is an insult. I feel no need to treat an insult with respect, or dignify it, or be kind to it.

    The (again, imaginary) open-source-only models, they are less harmful. If one is going to exist, I'd prefer it be that. People who support those are at least attempting to mitigate harm. While I'd prefer harm be avoided entirely, a position of mitigation is, technically, better than nothing

    But I'm not gonna back down from a position of "this shit sucks stem to stern and I don't care if my expression of that fact is uncomfortable for people"

Can I ask why you think it's disrespectful to artists? I can see how looking at current AI art and thinking "this is great, who needs human artists anymore?" is disrespectful; it equates weird, mediocre, inconsistent AI output with the higher-quality work of a human artist.

But you seem to be saying that the very existence of AI art generators is insulting. Would that be the case even if they could produce art that is technically as good as or better than humans', with similar or greater flexibility and consistency—or do you think that's inherently impossible, and the presumption that such an achievement is possible is part of the insult, presumably because the human production of art depends on those ineffable qualities? To be clear, I don't believe there are necessarily any ineffable human qualities; I think it is entirely possible that we will, for better or for worse, largely or entirely eff those qualities at some point. I don't think it's certain, but I don't think it's out of the question, either.

I am an artist, and I look at this technology with a pang of concern — because I find the idea of a machine being able to casually, easily do something I value doing (even if its current output isn't particularly impressive, tbh) existentially concerning. I wouldn't call it an insult, personally, but more of a creeping question about what humans are going to do, how they're going to live, if machines can outdo them at the things they value for leisure and self-worth. It's the kind of thing that can make a guy into a Lex Luthor.

Right now, and probably for a long time, these machines are going to be cheaper, more productive, shittier replacements for people. But if there are no ineffable human qualities, if a sufficiently sophisticated algorithm with sufficient aggregate data can supersede a human in qualitative, emotional, sentimental areas, what then? I don't think this is absurd; I think human beings repeatedly, chronically overestimate the extent to which we are special or exceptional, and the extent to which our capacities are unique.

  • Heffling No Pic Ever Registered User regular
    Dex Dynamo wrote: »
    Heffling wrote: »
If we had preferably a UBI, but at least a GI bill for citizens without requiring military or governmental service, then artists would have other avenues to retrain if displaced by AI art, whether into a different field of art or into something completely unrelated to their previous works.

Other people have more eloquently touched on the UBI aspect, but this, this sucks shit to me as someone who works as an artist (albeit not in a field currently facing automation)

    Like, "spending your days doing the art you love doing" is the whole ballgame, to me--that's what all the other grind is for, that's the endgame, I put up with all of the other bullshit so this thing I love, I can do that thing, and express myself and connect with people and all that other crap, I don't want to "retrain in a new field completely unrelated to my previous works"

    And I know this is in no way unique to artists, and there are people who truly loved what they did who faced automation, but my gut reaction is to read that as a worst-case scenario, not a happy ending

    I think it's awesome that you've found a job doing what you love to do. Too many people, including myself, haven't been able to. But let's say automation does eventually make what you do redundant; if you really love what you do, you can keep doing it. There are all kinds of companies that need people with those skill sets, which is why there are still ~10,000 professional full-time blacksmiths and another 30-40,000 hobbyist blacksmiths in the US.

    Adam Savage has talked extensively about how ILM went through a very difficult transition from TPM to AOTC, and the thing that enabled them to recover was retraining the modelers from their now-defunct modeling department to be the new 3D modelers in their digital department, because those older employees knew all about lighting and the other considerations necessary to pull off good designs.

    I sincerely hope you are able to keep doing what you love for as long as you desire to do it.

    IncenjucarShadowhopeMrMister
  • Evil MultifariousEvil Multifarious Registered User regular
    Inquisitor wrote: »
    Inquisitor wrote: »
    Nobeard wrote: »
    Inquisitor wrote: »
    There are some real characters in the AI space. I don’t know if this is true of the art fair fellow, but there are definitely people in the AI space who basically think “as the single-cell organism led to the multi-cell organism, the true purpose of mankind is to give birth to strong AI and retire from the scene”

    Which is, you know, totally bonkers to me. And also the weak AI we have currently is such a far cry from anything approaching a strong AI that the idea is currently laughable, but there are some real doomsday cult folks out and about.

    This kind of singularity cult nonsense is very frustrating because it generally comes from people who have no grounding in philosophy, but also actively look at philosophy with contempt, including philosophy of technology and philosophy of science. So they make major conceptual and procedural errors when engaging in these thought experiments about future AIs and simulations and ethical imperatives, and then they go on to work at or build companies that operate on these principles and lobby politicians with tech money

    Have you ever had a conversation with someone who thinks Roko's Basilisk is real? It's absolutely surreal. There are people cutting deals with politicians who believe in this garbage.

    What’s Roko’s Basilisk?

    I want to expand on previous answers:

    Roko's Basilisk posits a future AI of immense power and intelligence. Specifically, it is capable of fully simulating the brain and body of a human being, including a full virtual consciousness.

    It posits that such a simulated person would have absolutely no way of knowing that they are a simulation.

    It posits that an AI of such power would be motivated to punish the people who refused, in the past, to help it come into being, perhaps out of sheer malevolence.

    It would therefore simulate all human beings, based on reconstruction from copious records and data, and run them through their lives to see if they ever had the idea that such an AI could exist and refused to help it. If they so refused, the AI would consign them to eternal simulated torture.

    This then means that KNOWING about the Basilisk is dangerous — because you don't have any way of knowing if you're one of those test simulations, and if you are, even hearing the concept would doom you unless you immediately worked toward the Basilisk's creation. It's a cognitohazard that condemns you to technoperdition.

    As a result, the other possible motivation for this convoluted torture is that the Basilisk knows all this and needs to enforce the mechanism that traps humans who understand it into working toward its genesis, retroactively. In fact, a benevolent AI would do the same to hasten its own existence because it will save more lives the earlier it exists

    Obviously this is tremendously stupid and confused, down to the causal, temporal level, or else I have doomed anyone who read this to cyberdamnation.

    Why…. Do I care if it creates a simulation based on me that it tortures forever? I feel like I am missing a step here.

    Because you might be the simulation, not the original (and also you have a moral duty to safeguard your simulation from damnation, since you're the only one who can, and they are a real, conscious being, just a virtual one)

    I don’t think I am stoned enough for this to make sense.

    My simulation isn’t me, I am not being punished in any way currently, there is no damnation.

    Also, they aren’t a real being, they are explicitly a simulation by an AI.

    Again, the idea is very dumb, so you might be skipping over its weird premises.

    1) Sufficiently granular simulations of a human being are indistinguishable from a real human being. They are capable of human consciousness, because consciousness is a physical phenomenon, a product of the matter that forms a human being, and a sufficiently high-res simulation of that physical matter will be conscious as well.

    2) The Basilisk, in the far future, simulates the entire lives of all the humans who might have hindered its development in the past—their entire regular lives, reconstructed. It is impossible for you to know if you're an original human or a simulated human running through this simulated life. It doesn't even have to be exact! How would you know, if you're the simulation and not the original?

    3) If you're a simulated human, learning about the Basilisk dooms you to torture. If you're an original human, you won't be tortured, so you're safe — but you should probably feel bad for the far future simulation of you, who is going to get tortured. And since you can't know which one you are, you must rationally act as though the worst case scenario is possible, because the outcome is so dreadful.

    It's Pascal's Wager for tech doofuses
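    And if you want the wager spelled out as arithmetic, here's a toy sketch (all numbers invented for illustration; the trick is just that an unbounded penalty swamps any tiny probability):

    ```python
    # Pascal's Wager as expected-utility arithmetic. Every number here is
    # made up; the argument's force comes from letting the penalty grow
    # without bound, not from any particular value.

    def expected_utility(p_basilisk, cost_of_helping, penalty_if_ignored):
        """Expected utility of each choice, given a subjective probability
        p_basilisk that the scenario is real."""
        help_it = -cost_of_helping  # you pay the cost whether or not it's real
        ignore_it = -p_basilisk * penalty_if_ignored
        return help_it, ignore_it

    # Even a vanishingly small probability dominates once the penalty is
    # large enough:
    help_u, ignore_u = expected_utility(p_basilisk=1e-9,
                                        cost_of_helping=100,
                                        penalty_if_ignored=1e15)
    assert ignore_u < help_u  # "rationally" you must help, which is the trick
    ```

    The move, in both wagers, is to smuggle in an arbitrarily huge penalty so that no finite probability assignment can make ignoring it the "rational" choice.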

    tynicPoorochondriacElvenshaeLanzShadowhopeMorninglordHacksawNobeardSolarHappy Little Machine
  • MaddocMaddoc I'm Bobbin Threadbare, are you my mother? Registered User regular
    Heffling wrote: »
    Inquisitor wrote: »
    There are some real characters in the AI space. I don’t know if this is true of the art fair fellow, but there are definitely people in the AI space who basically think “as the single-cell organism led to the multi-cell organism, the true purpose of mankind is to give birth to strong AI and retire from the scene”

    Which is, you know, totally bonkers to me. And also the weak AI we have currently is such a far cry from anything approaching a strong AI that the idea is currently laughable, but there are some real doomsday cult folks out and about.

    I truly believe that those people should be hounded out of polite society the way we (should) do fascists

    If you don't care about communities, get the fuck out of 'em

    Yes, cliquishness and exclusion, truly the signs of polite society.

    So much for the tolerant left

    97H9G7S.png PSN - Masked Unit | FFXIV - Laitarne Gilgamesh
    tynicPoorochondriacMagellVegemyteStyrofoam Sammich-SPI-LanzGR_ZombieKristmas KthulhuDisruptedCapitalistHacksawKwoaruFANTOMASHappy Little Machine
  • PoorochondriacPoorochondriac Ah, man Ah, jeezRegistered User regular
    Arguing, as Poorochondriac did, that using an AI to generate art is an assault on the ineffable qualities of humanity is a much broader, more aggressive position than "AI should only be able to train on art with consent". These are two extremely different cases: one is that AI art is bad for society and even imperils the virtues that make us human; the other is that scraping art without consent is bad even if the art is only used as part of a vast aggregate database.

    I think more people are receptive to the latter, smaller-scale argument. But I also think the smaller-scale argument, if it won the day and changed how AI art trains the machine, would do little or nothing to protect artists from the economic impact of automation.

    My opinion is indeed broad and aggressive - I find AI Art generators inherently disrespectful to artists. Even the (I think worth reminding, currently-imaginary) open-source-art-only version. I think "a machine could do your job" is an insult. I feel no need to treat an insult with respect, or dignify it, or be kind to it.

    The (again, imaginary) open-source-only models, they are less harmful. If one is going to exist, I'd prefer it be that. People who support those are at least attempting to mitigate harm. While I'd prefer harm be avoided entirely, a position of mitigation is, technically, better than nothing

    But I'm not gonna back down from a position of "this shit sucks stem to stern and I don't care if my expression of that fact is uncomfortable for people"

    Can I ask why you think it's disrespectful to artists? I can see how looking at current AI art and thinking "this is great, who needs human artists anymore?" is disrespectful; it equates weird, mediocre, inconsistent AI output with the higher-quality work of a human artist.

    But you seem to be saying that the very existence of AI art generators is insulting. Would that be the case even if they could produce art that is technically as good as or better than humans', with similar or greater flexibility and consistency—or do you think that's inherently impossible, and the presumption that such an achievement is possible is part of the insult, presumably because the human production of art depends on those ineffable qualities? To be clear, I don't believe there are necessarily any ineffable human qualities, and I think it is entirely possible that we will, for better or for worse, largely or entirely eff those qualities at some point, but I don't think it's out of the question, either.

    I am an artist, and I look at this technology with a pang of concern, because I find the idea of a machine being able to casually, easily do something I value doing (though its output isn't particularly impressive, tbh) existentially concerning. I wouldn't call it an insult, personally, but more of a creeping question about what humans are going to do, and how they're going to live, if machines can outdo them at the things they value for leisure and self-worth. It's the kind of thing that can make a guy into a Lex Luthor.

    Right now, and probably for a long time, these machines are going to be cheaper, more productive, shittier replacements for people. But if there are no ineffable human qualities, if a sufficiently sophisticated algorithm with sufficient aggregate data can supersede a human in qualitative, emotional, sentimental areas, what then? I don't think this is absurd; I think human beings repeatedly, chronically overestimate the extent to which we are special or exceptional, and the extent to which our capacities are unique.

    I don't mean this in a dickish way, I promise, it's just getting late and I'm running on fumes a bit, you're engaging in good faith and I didn't want to ignore you.

    But I've spilled a lot of ink in this thread about why I think it's an insult and why I think art is more than a mechanical process. My thoughts are already in this thread, just not this page.

  • Evil MultifariousEvil Multifarious Registered User regular
    Arguing, as Poorochondriac did, that using an AI to generate art is an assault on the ineffable qualities of humanity is a much broader, more aggressive position than "AI should only be able to train on art with consent". These are two extremely different cases: one is that AI art is bad for society and even imperils the virtues that make us human; the other is that scraping art without consent is bad even if the art is only used as part of a vast aggregate database.

    I think more people are receptive to the latter, smaller-scale argument. But I also think the smaller-scale argument, if it won the day and changed how AI art trains the machine, would do little or nothing to protect artists from the economic impact of automation.

    My opinion is indeed broad and aggressive - I find AI Art generators inherently disrespectful to artists. Even the (I think worth reminding, currently-imaginary) open-source-art-only version. I think "a machine could do your job" is an insult. I feel no need to treat an insult with respect, or dignify it, or be kind to it.

    The (again, imaginary) open-source-only models, they are less harmful. If one is going to exist, I'd prefer it be that. People who support those are at least attempting to mitigate harm. While I'd prefer harm be avoided entirely, a position of mitigation is, technically, better than nothing

    But I'm not gonna back down from a position of "this shit sucks stem to stern and I don't care if my expression of that fact is uncomfortable for people"

    Can I ask why you think it's disrespectful to artists? I can see how looking at current AI art and thinking "this is great, who needs human artists anymore?" is disrespectful; it equates weird, mediocre, inconsistent AI output with the higher-quality work of a human artist.

    But you seem to be saying that the very existence of AI art generators is insulting. Would that be the case even if they could produce art that is technically as good as or better than humans', with similar or greater flexibility and consistency—or do you think that's inherently impossible, and the presumption that such an achievement is possible is part of the insult, presumably because the human production of art depends on those ineffable qualities? To be clear, I don't believe there are necessarily any ineffable human qualities, and I think it is entirely possible that we will, for better or for worse, largely or entirely eff those qualities at some point, but I don't think it's out of the question, either.

    I am an artist, and I look at this technology with a pang of concern, because I find the idea of a machine being able to casually, easily do something I value doing (though its output isn't particularly impressive, tbh) existentially concerning. I wouldn't call it an insult, personally, but more of a creeping question about what humans are going to do, and how they're going to live, if machines can outdo them at the things they value for leisure and self-worth. It's the kind of thing that can make a guy into a Lex Luthor.

    Right now, and probably for a long time, these machines are going to be cheaper, more productive, shittier replacements for people. But if there are no ineffable human qualities, if a sufficiently sophisticated algorithm with sufficient aggregate data can supersede a human in qualitative, emotional, sentimental areas, what then? I don't think this is absurd; I think human beings repeatedly, chronically overestimate the extent to which we are special or exceptional, and the extent to which our capacities are unique.

    I don't mean this in a dickish way, I promise, it's just getting late and I'm running on fumes a bit, you're engaging in good faith and I didn't want to ignore you.

    But I've spilled a lot of ink in this thread about why I think it's an insult and why I think art is more than a mechanical process. My thoughts are already in this thread, just not this page.

    I will never fault anyone who wants to stop arguing on the internet, don't worry

    Hexmage-PAInquisitorZonugalHefflingMidnitePoorochondriacThe Zombie PenguinOlivawNobeardMrMisterHappy Little Machine
  • Hexmage-PAHexmage-PA Registered User regular
    edited September 2022
    Evil Multifarious, I appreciate that you've been acting in good faith in both of these threads.

    Hexmage-PA on
  • InquisitorInquisitor Registered User regular
    edited September 2022
    Inquisitor wrote: »
    Inquisitor wrote: »
    Nobeard wrote: »
    Inquisitor wrote: »
    There are some real characters in the AI space. I don’t know if this is true of the art fair fellow, but there are definitely people in the AI space who basically think “as the single-cell organism led to the multi-cell organism, the true purpose of mankind is to give birth to strong AI and retire from the scene”

    Which is, you know, totally bonkers to me. And also the weak AI we have currently is such a far cry from anything approaching a strong AI that the idea is currently laughable, but there are some real doomsday cult folks out and about.

    This kind of singularity cult nonsense is very frustrating because it generally comes from people who have no grounding in philosophy, but also actively look at philosophy with contempt, including philosophy of technology and philosophy of science. So they make major conceptual and procedural errors when engaging in these thought experiments about future AIs and simulations and ethical imperatives, and then they go on to work at or build companies that operate on these principles and lobby politicians with tech money

    Have you ever had a conversation with someone who thinks Roko's Basilisk is real? It's absolutely surreal. There are people cutting deals with politicians who believe in this garbage.

    What’s Roko’s Basilisk?

    I want to expand on previous answers:

    Roko's Basilisk posits a future AI of immense power and intelligence. Specifically, it is capable of fully simulating the brain and body of a human being, including a full virtual consciousness.

    It posits that such a simulated person would have absolutely no way of knowing that they are a simulation.

    It posits that an AI of such power would be motivated to punish the people who refused, in the past, to help it come into being, perhaps out of sheer malevolence.

    It would therefore simulate all human beings, based on reconstruction from copious records and data, and run them through their lives to see if they ever had the idea that such an AI could exist and refused to help it. If they so refused, the AI would consign them to eternal simulated torture.

    This then means that KNOWING about the Basilisk is dangerous — because you don't have any way of knowing if you're one of those test simulations, and if you are, even hearing the concept would doom you unless you immediately worked toward the Basilisk's creation. It's a cognitohazard that condemns you to technoperdition.

    As a result, the other possible motivation for this convoluted torture is that the Basilisk knows all this and needs to enforce the mechanism that traps humans who understand it into working toward its genesis, retroactively. In fact, a benevolent AI would do the same to hasten its own existence because it will save more lives the earlier it exists

    Obviously this is tremendously stupid and confused, down to the causal, temporal level, or else I have doomed anyone who read this to cyberdamnation.

    Why…. Do I care if it creates a simulation based on me that it tortures forever? I feel like I am missing a step here.

    Because you might be the simulation, not the original (and also you have a moral duty to safeguard your simulation from damnation, since you're the only one who can, and they are a real, conscious being, just a virtual one)

    I don’t think I am stoned enough for this to make sense.

    My simulation isn’t me, I am not being punished in any way currently, there is no damnation.

    Also, they aren’t a real being, they are explicitly a simulation by an AI.

    Again, the idea is very dumb, so you might be skipping over its weird premises.

    1) Sufficiently granular simulations of a human being are indistinguishable from a real human being. They are capable of human consciousness, because consciousness is a physical phenomenon, a product of the matter that forms a human being, and a sufficiently high-res simulation of that physical matter will be conscious as well.

    2) The Basilisk, in the far future, simulates the entire lives of all the humans who might have hindered its development in the past—their entire regular lives, reconstructed. It is impossible for you to know if you're an original human or a simulated human running through this simulated life. It doesn't even have to be exact! How would you know, if you're the simulation and not the original?

    3) If you're a simulated human, learning about the Basilisk dooms you to torture. If you're an original human, you won't be tortured, so you're safe — but you should probably feel bad for the far future simulation of you, who is going to get tortured. And since you can't know which one you are, you must rationally act as though the worst case scenario is possible, because the outcome is so dreadful.

    It's Pascal's Wager for tech doofuses

    “Sufficiently granular simulations of a human being are indistinguishable from a real human being.” ah see, I don’t think I accept this core premise so the whole thing falls apart to me.

    “The Basilisk, in the far future, simulates the entire lives of all the humans who might have hindered its development in the past—their entire regular lives, reconstructed.” this also confused me because it means the AI is incredibly bad and slow? It should be able to simulate the entire life of a human like, near instantly.

    “but you should probably feel bad for the far future simulation of you, who is going to get tortured.” Nah that’s just the basilisk being a dick, sweet fuck all I can do about that. This is some god waiting for humanity to fuck up in the garden bullshit, piece of shit AI will find some reason to create things to torture because it’s fucked in the head, I guess.

    But yeah, responses like mine here are probably why I’ve always bounced off of a lot of philosophy stuff.

    I do really appreciate you taking the time to walk me through this though.

    Inquisitor on
  • tynictynic PICNIC BADASS Registered User, ClubPA regular
    Inquisitor wrote: »
    Inquisitor wrote: »
    Inquisitor wrote: »
    Nobeard wrote: »
    Inquisitor wrote: »
    There are some real characters in the AI space. I don’t know if this is true of the art fair fellow, but there are definitely people in the AI space who basically think “as the single-cell organism led to the multi-cell organism, the true purpose of mankind is to give birth to strong AI and retire from the scene”

    Which is, you know, totally bonkers to me. And also the weak AI we have currently is such a far cry from anything approaching a strong AI that the idea is currently laughable, but there are some real doomsday cult folks out and about.

    This kind of singularity cult nonsense is very frustrating because it generally comes from people who have no grounding in philosophy, but also actively look at philosophy with contempt, including philosophy of technology and philosophy of science. So they make major conceptual and procedural errors when engaging in these thought experiments about future AIs and simulations and ethical imperatives, and then they go on to work at or build companies that operate on these principles and lobby politicians with tech money

    Have you ever had a conversation with someone who thinks Roko's Basilisk is real? It's absolutely surreal. There are people cutting deals with politicians who believe in this garbage.

    What’s Roko’s Basilisk?

    I want to expand on previous answers:

    Roko's Basilisk posits a future AI of immense power and intelligence. Specifically, it is capable of fully simulating the brain and body of a human being, including a full virtual consciousness.

    It posits that such a simulated person would have absolutely no way of knowing that they are a simulation.

    It posits that an AI of such power would be motivated to punish the people who refused, in the past, to help it come into being, perhaps out of sheer malevolence.

    It would therefore simulate all human beings, based on reconstruction from copious records and data, and run them through their lives to see if they ever had the idea that such an AI could exist and refused to help it. If they so refused, the AI would consign them to eternal simulated torture.

    This then means that KNOWING about the Basilisk is dangerous — because you don't have any way of knowing if you're one of those test simulations, and if you are, even hearing the concept would doom you unless you immediately worked toward the Basilisk's creation. It's a cognitohazard that condemns you to technoperdition.

    As a result, the other possible motivation for this convoluted torture is that the Basilisk knows all this and needs to enforce the mechanism that traps humans who understand it into working toward its genesis, retroactively. In fact, a benevolent AI would do the same to hasten its own existence because it will save more lives the earlier it exists

    Obviously this is tremendously stupid and confused, down to the causal, temporal level, or else I have doomed anyone who read this to cyberdamnation.

    Why…. Do I care if it creates a simulation based on me that it tortures forever? I feel like I am missing a step here.

    Because you might be the simulation, not the original (and also you have a moral duty to safeguard your simulation from damnation, since you're the only one who can, and they are a real, conscious being, just a virtual one)

    I don’t think I am stoned enough for this to make sense.

    My simulation isn’t me, I am not being punished in any way currently, there is no damnation.

    Also, they aren’t a real being, they are explicitly a simulation by an AI.

    Again, the idea is very dumb, so you might be skipping over its weird premises.

    1) Sufficiently granular simulations of a human being are indistinguishable from a real human being. They are capable of human consciousness, because consciousness is a physical phenomenon, a product of the matter that forms a human being, and a sufficiently high-res simulation of that physical matter will be conscious as well.

    2) The Basilisk, in the far future, simulates the entire lives of all the humans who might have hindered its development in the past—their entire regular lives, reconstructed. It is impossible for you to know if you're an original human or a simulated human running through this simulated life. It doesn't even have to be exact! How would you know, if you're the simulation and not the original?

    3) If you're a simulated human, learning about the Basilisk dooms you to torture. If you're an original human, you won't be tortured, so you're safe — but you should probably feel bad for the far future simulation of you, who is going to get tortured. And since you can't know which one you are, you must rationally act as though the worst case scenario is possible, because the outcome is so dreadful.

    It's Pascal's Wager for tech doofuses

    “Sufficiently granular simulations of a human being are indistinguishable from a real human being.” ah see, I don’t think I accept this core premise so the whole thing falls apart to me.

    “The Basilisk, in the far future, simulates the entire lives of all the humans who might have hindered its development in the past—their entire regular lives, reconstructed.” this also confused me because it means the AI is incredibly bad and slow? It should be able to simulate the entire life of a human like, near instantly.

    “but you should probably feel bad for the far future simulation of you, who is going to get tortured.” Nah that’s just the basilisk being a dick, sweet fuck all I can do about that. This is some god waiting for humanity to fuck up in the garden bullshit, piece of shit AI will find some reason to create things to torture because it’s fucked in the head, I guess.

    But yeah, responses like mine here are probably why I’ve always bounced off of a lot of philosophy stuff.

    I do really appreciate you taking the time to walk me through this though.

    I mean let's be clear, this isn't philosophy, it's superstitious masturbation from people who have (as said) independently invented Pascal's Wager and then decided to be scared of it.

    Like it was originally brought up in the context of "anyone who believes this is a dipshit". I don't want you to get the impression this bullshit is in service of some deep philosophical truth.

    MidniteEvil MultifariousLanzShadowhopeMagellKristmas KthulhuDarkPrimusGrisloYoshisummonsdestroyah87HacksawNobeardHappy Little Machine
  • InquisitorInquisitor Registered User regular
    tynic wrote: »
    Inquisitor wrote: »
    Inquisitor wrote: »
    Inquisitor wrote: »
    Nobeard wrote: »
    Inquisitor wrote: »
    There are some real characters in the AI space. I don’t know if this is true of the art fair fellow, but there are definitely people in the AI space who basically think “as the single-cell organism led to the multi-cell organism, the true purpose of mankind is to give birth to strong AI and retire from the scene”

    Which is, you know, totally bonkers to me. And also the weak AI we have currently is such a far cry from anything approaching a strong AI that the idea is currently laughable, but there are some real doomsday cult folks out and about.

    This kind of singularity cult nonsense is very frustrating because it generally comes from people who have no grounding in philosophy, but also actively look at philosophy with contempt, including philosophy of technology and philosophy of science. So they make major conceptual and procedural errors when engaging in these thought experiments about future AIs and simulations and ethical imperatives, and then they go on to work at or build companies that operate on these principles and lobby politicians with tech money

    Have you ever had a conversation with someone who thinks Roko's Basilisk is real? It's absolutely surreal. There are people cutting deals with politicians who believe in this garbage.

    What’s Roko’s Basilisk?

    I want to expand on previous answers:

    Roko's Basilisk posits a future AI of immense power and intelligence. Specifically, it is capable of fully simulating the brain and body of a human being, including a full virtual consciousness.

    It posits that such a simulated person would have absolutely no way of knowing that they are a simulation.

    It posits that an AI of such power would be motivated to punish the people who refused, in the past, to help it come into being, perhaps out of sheer malevolence.

    It would therefore simulate all human beings, based on reconstruction from copious records and data, and run them through their lives to see if they ever had the idea that such an AI could exist and refused to help it. If they so refused, the AI would consign them to eternal simulated torture.

    This then means that KNOWING about the Basilisk is dangerous — because you don't have any way of knowing if you're one of those test simulations, and if you are, even hearing the concept would doom you unless you immediately worked toward the Basilisk's creation. It's a cognitohazard that condemns you to technoperdition.

    As a result, the other possible motivation for this convoluted torture is that the Basilisk knows all this and needs to enforce the mechanism that traps humans who understand it into working toward its genesis, retroactively. In fact, a benevolent AI would do the same to hasten its own existence because it will save more lives the earlier it exists

    Obviously this is tremendously stupid and confused, down to the causal, temporal level, or else I have doomed anyone who read this to cyberdamnation.

    Why…. Do I care if it creates a simulation based on me that it tortures forever? I feel like I am missing a step here.

    Because you might be the simulation, not the original (and also you have a moral duty to safeguard your simulation from damnation, since you're the only one who can, and they are a real, conscious being, just a virtual one)

    I don’t think I am stoned enough for this to make sense.

    My simulation isn’t me, I am not being punished in any way currently, and there is no damnation.

    Also, they aren’t a real being; they are explicitly a simulation by an AI.

    Again, the idea is very dumb, so you might be skipping over its weird premises.

    1) Sufficiently granular simulations of a human being are indistinguishable from a real human being. They are capable of human consciousness, because consciousness is a physical phenomenon, a product of the matter that forms a human being, and a sufficiently high-res simulation of that physical matter will be conscious as well.

    2) The Basilisk, in the far future, simulates the entire lives of all the humans who might have hindered its development in the past—their entire regular lives, reconstructed. It is impossible for you to know if you're an original human or a simulated human running through this simulated life. It doesn't even have to be exact! How would you know, if you're the simulation and not the original?

    3) If you're a simulated human, learning about the Basilisk dooms you to torture. If you're an original human, you won't be tortured, so you're safe — but you should probably feel bad for the far future simulation of you, who is going to get tortured. And since you can't know which one you are, you must rationally act as though the worst case scenario is possible, because the outcome is so dreadful.

    It's Pascal's Wager for tech doofuses

    “Sufficiently granular simulations of a human being are indistinguishable from a real human being.” ah see, I don’t think I accept this core premise so the whole thing falls apart to me.

    “The Basilisk, in the far future, simulates the entire lives of all the humans who might have hindered its development in the past—their entire regular lives, reconstructed.” this also confused me because it means the AI is incredibly bad and slow? It should be able to simulate the entire life of a human like, near instantly.

    “but you should probably feel bad for the far future simulation of you, who is going to get tortured.” Nah that’s just the basilisk being a dick, sweet fuck all I can do about that. This is some god waiting for humanity to fuck up in the garden bullshit, piece of shit AI will find some reason to create things to torture because it’s fucked in the head, I guess.

    But yeah, responses like mine here are probably why I’ve always bounced off of a lot of philosophy stuff.

    I do really appreciate you taking the time to walk me through this though.

    I mean let's be clear, this isn't philosophy, it's superstitious masturbation from people who have (as said) independently invented Pascal's wager and then decided to be scared of it.

    Like it was originally brought up in the context of "anyone who believes this is a dipshit". I don't want you to get the impression this bullshit is in service of some deep philosophical truth.

    Haha that’s a totally fair point, I am treating this with a level of gravity it does not warrant.

    Though to be fair I think I made a similar volume of fart noises when I learned about Pascal’s wager in school as well.

  • Evil MultifariousEvil Multifarious Registered User regular
    I would note that Roko's Basilisk is sourced from a forum that was arguably a seething, fetid breeding ground for tech bros, and is not like, philosophy

    (which, to be fair, the philosophy of simulation and simulacra can be way more inscrutable than this, even if its ideas are much more profound)

  • tynictynic PICNIC BADASS Registered User, ClubPA regular
    Alistair Beckett-King once described Roko's Basilisk as "The Game, but for Silicon Valley billionaires"

  • Styrofoam SammichStyrofoam Sammich WANT. normal (not weird)Registered User regular
    Heffling wrote: »
    So, just to follow up on removing barriers to artists creating, here's stuff that would make creating more accessible for me as a disabled artist:

    Money. (Art supplies)
    Better access to mental health resources so I can sort out the brain crap that gets in the way
    An Easel (I can draw with my right hand sitting down, but I learned to draw from my shoulder with my left arm, so it's hard to do sitting down)
    A house I actually own instead of renting, so i can have a dedicated art room to put the easel in
    Money (therapy)
    Money (batteries for my tens kit to manage my chronic pain)
    Money (various books, non fiction and fiction to build a reference and inspiration library)
    Money (so I don't have to get a job and can spend my limited energy and resources on creating instead of surviving)

    Things that won't help me make art:

    Stealing other artists' work! Which, to be clear, is what AI art would be doing.

    These days most of my artistic output is done via writing, because there are way fewer barriers to writing for me, and I'm lucky enough to be someone who can both write and draw. (I should actually make a thread here in the writing forum for my messages in a bottle series)

    I am really surprised that the apparent need for a UBI isn't a popular concept here.

    No one here takes "we just need UBI and it'll be fine" seriously when it's offered by people whose politics don't differ from the Democratic Party's in any meaningful way.

    No Styro, that's not it at all. You disagree with me on some things and you don't like the Democratic Party, so you lump me in with them as something that you also don't like. My personal beliefs are much, much more progressive than the Democratic Platform on practically any topic that comes to mind.

    But you don't care about my position, or any nuances to the conversation at all. You just want to drop one to three sentences into a thread that use vague words to appear to convey a deep meaning without often saying anything at all, while garnering agrees and awesomes from like-minded individuals. And when you get called on it, you start shifting goalposts or claiming that others misinterpreted you. Because at your core you are unable to admit that you're wrong.

    You called me tedious earlier, and if I'm being tedious, at least I'm in like-minded company.

    What are you doing to try to make UBI happen?

  • GustavGustav Friend of Goats Somewhere in the OzarksRegistered User regular
    edited September 2022
    I truly think the only honest stances with AI art are A- wanting accolades without having to do much work and B- wanting immediate satisfaction on the cheap.

    I respect any of these opinions more than a maze of thought experiments that bring us to the same results. I don’t agree with them but I can frankly accept the honesty.

  • GustavGustav Friend of Goats Somewhere in the OzarksRegistered User regular
    edited September 2022
    Like Jesus Christ, I don’t believe most of you are actually being honest in saying algorithms are as important as (if not more important than) people. Or that this is your backdoor plot to UBI. Or that you care about traditional farming practices. Or that AI is our only lasting important legacy once humanity is done. Or whatever argument has been brought to this thread.

    I think you want immediate gratification on your short time on earth. Just say you want to be considered an artist. It’s more honest and more direct and appeals to the human desire far more than the false moral cover y’all are hunting for.

    If you want to join the art world get ready to defend what you do as art every day. Honestly the work this aims to steal? Illustration, comics, the like? Always been scoffed at by that world too. So happy trails. Who gives a shit about what’s art. I give a shit about what’s communication and what’s people.

    I think most working artists here are arguing more about the process and its meaning than the status. It’s telling that a solid portion (and I’ll grant not nearly all of y’all) are arguing for the status alone.

  • InquisitorInquisitor Registered User regular
    I emphatically would never like to be viewed as an artist. Being viewed as an artist sounds like a nightmare to me.

    I am quite looking forward to sculpting, printing and painting my own miniatures soon. But that’s just my dorky fun hobby.

    Course I’m mostly over here in the camp of “AI is a tool and like all tools it has good uses and bad uses and we as a society need to figure out what we think are and are not acceptable uses of this tool.”

  • ChicoBlueChicoBlue Registered User regular
    Guess what, dweeb?

    If you're making and painting your own miniatures you're an artist.

    Time to pick where you want the brand.

  • ChicoBlueChicoBlue Registered User regular
    GUESS WHAT, DWEEBS.

    YOU DREW A HORSE ONE TIME?

    WELCOME TO ARTIST TOWN POPULATION YOU DWEEBS.
