
AI Government


  • Cedar Brown Registered User regular
    edited March 2010
    AI government? That conjures up some images in my mind.

    "Excuse me, Secretary, has the President come to a decision yet?"

    "Ummm... I don't know. Last time I checked he was partially stuck into the wall and sprinting. That was an hour ago, shall I check again?"



    It's going to be a very, very long time before the general populace accepts AI systems in government, especially since technology can be used as a tool for corruption. Who has control over the system? Who's going to watch them?

    Cedar Brown on
  • TheAceofSpades Registered User regular
    edited March 2010
    Can you imagine the conspiracy theories that would endlessly flow? The AI's hardware was built in China and not eligible for ruling!!

    TheAceofSpades on
  • DarkWarrior __BANNED USERS regular
    edited March 2010
    Ouch...yeah you can't have a President computer made in China. That won't go down well at all.

    DarkWarrior on
  • Bliss 101 Registered User regular
    edited March 2010
    Can you imagine the conspiracy theories that would endlessly flow? The AI's hardware was built in China and not eligible for ruling!!

    The AI's approach to religion would also be an interesting topic.

    I dread to think of an AI that's programmed to accept some religious dogma as part of its reasoning process.

    Bliss 101 on
  • Zilla360 21st Century. |She/Her| Trans* Woman In Aviators Firing A Bazooka. ⚛️ Registered User regular
    edited March 2010
    Hit. Nail. Head. :lol::^:

    Zilla360 on
  • Quid Definitely not a banana Registered User regular
    edited March 2010
    Kamar wrote: »
    DanHibiki wrote: »
    Modern Man wrote: »
    DanHibiki wrote: »
    why would an AI driven government not be democratic? If anything it would be far more democratic than the government is now, since it would be able to take the suggestions of all citizens and devise a strategy that would best benefit everyone, and filter out political trolls like Beck.

    Mind you, there would have to be a far better system for voting than the one now.
    That's not democracy. That's a benevolent dictatorship. What if the AI decides human input is not useful? Would 50%+1 of the electorate have the power to kick the AI out of office and replace it with another AI, or with human leaders?

    Not that I'm in favor of a pure democracy. I prefer a system where the government is heavily limited in what it can and cannot do, barring super-majority vote to change the constitution. Even if the majority decide my right to free speech is disruptive, they still can't take it away.

    why would it?

    Why does everyone think that an AI has only two modes, off and kill-all-humans?

    Name one popular movie with an AI that isn't creepy as fuck, outright evil, or harmful to humans through coldblooded logic.

    I think this is the issue. People watch only The Terminator and The Matrix and then associate AI only with "Kill all Humans".

    Quid on
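(Editor's aside: DanHibiki's notion upthread of a government that could "take the suggestions of all citizens and devise a strategy" is, at bottom, a preference-aggregation problem. A minimal sketch using a Borda count; the policy names and ballots are invented purely for illustration, not taken from the thread.)

```python
# Toy preference aggregation via Borda count: each citizen submits a
# ranked ballot, and options earn points by rank position.
from collections import defaultdict

def borda_count(ballots):
    """An option ranked k-th (0-based) on a ballot of n options earns
    n - 1 - k points. Returns (option, score) pairs, best first."""
    scores = defaultdict(int)
    for ballot in ballots:
        n = len(ballot)
        for rank, option in enumerate(ballot):
            scores[option] += n - 1 - rank
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Hypothetical ballots over three spending priorities:
ballots = [
    ["transit", "parks", "roads"],
    ["parks", "transit", "roads"],
    ["roads", "transit", "parks"],
]
print(borda_count(ballots))  # [('transit', 4), ('parks', 3), ('roads', 2)]
```

Note the design choice: Borda rewards broad acceptability (transit wins despite being nobody's unanimous first pick), which is one answer to "best benefit everyone", though by Arrow's theorem no ranked-voting rule is free of trade-offs.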
  • jothki Registered User regular
    edited March 2010
    Quid wrote: »
    Why wouldn't it go for the option that best benefits everyone?

    If the computer negotiates from a position of "let's do the best for everyone" and other countries negotiate from a position of "what's best for their people", the computer is always going to attempt to take control of the other countries.

    The problem with having a leader that can clearly perceive obstacles is that he's going to try to act to remove them. That's not inherently a problem, but there are a hell of a lot of obstacles, and they don't like to get out of the way or even admit that they're obstacles.

    It would obviously be possible to program the computer to try to avoid tampering with the freedoms of others, but there's a difficult balance there. The AI would need to believe that it is more capable of making the right decisions than individual humans or humanity in general, or otherwise it wouldn't have any reason to try to make those decisions. If you make an AI that sees itself as a threat to its own objectives, it'll just destroy itself or refuse to act.

    jothki on
  • Zilla360 21st Century. |She/Her| Trans* Woman In Aviators Firing A Bazooka. ⚛️ Registered User regular
    edited March 2010
    jothki wrote: »
    If you make an AI that sees itself as a threat to its own objectives, it'll just destroy itself or refuse to act.
    "Healthcare debate - so stupid - error - divide by zero - BZZZT! - MUST TERMINATE SELF!" *Formats and installs Windows ME.*

    Zilla360 on
  • Mr_Rose 83 Blue Ridge Protects the Holy Registered User regular
    edited March 2010
    Which brings up why this discussion is full of silly goosery: everybody assumes

    Perfect AI == their opinions, which is of course arrogance beyond belief.

    I would have a great laugh if right after this AI was enacted it totally deregulated the economy right before appointing Glenn Beck as its right hand man.

    Should fat people be restricted in what they are allowed to eat? According to some threads on this board lately I think your perfect AI could go either way depending on who is describing it.

    I kind of like how you assume you know my entire position from one post containing two diametrically opposed possible outcomes. Especially the part where you additionally conclude that because I indicate a dislike of religion I can't possibly be in favour of a deregulated economy under this hypothetical AI, when in fact such an entity would be the one thing we'd need to make such an economy viable in the long term; an impartial, omniscient consumer advocate.

    Mr_Rose on
    ...because dragons are AWESOME! That's why.
    Nintendo Network ID: AzraelRose
    DropBox invite link - get 500MB extra free.
  • Scroffus Registered User regular
    edited March 2010
    SpeedySwaf wrote: »
    I've wondered about this too, but in a slightly different light. We have scientists who've already created machines that can mimic some basic survival techniques, and after these AIs have evolved for a couple thousand generations, they begin showing behavior very similar to that of some actual animals, in the sense that they start forming "packs" to secure their own livelihood, so to speak.

    So I wonder if we could do something in regards to economics or the like...have groups of machines in simple, but similar situations that we may find ourselves, and after so many generations, see how they've learned to balance the well being of themselves and the others, if at all.

    Do you have a link to this? (It's not that I'm challenging it, I just think it would be interesting to read).

    Scroffus on
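(Editor's aside: since no link was posted, here is a generic toy model of the kind of evolutionary-agent experiment SpeedySwaf describes, not the actual study. Agents carry a heritable "cooperate" gene, cooperators split a payoff bonus that grows with group size, and fitness-proportional selection over generations tends to favor pack-forming. All parameter values are invented.)

```python
# Minimal evolutionary simulation: does "pack" behavior emerge?
import random

def step(population, group_bonus=3.0, solo_payoff=1.0):
    """One generation: score each agent, then resample the population
    proportional to fitness, with a small mutation rate."""
    cooperators = sum(population)
    fitness = []
    for coop in population:
        if coop and cooperators > 1:
            # pack members get a bonus that scales with pack size
            fitness.append(solo_payoff + group_bonus * (cooperators - 1) / len(population))
        else:
            fitness.append(solo_payoff)
    new_pop = random.choices(population, weights=fitness, k=len(population))
    # 1% chance each gene flips, so variation never fully disappears
    return [g if random.random() > 0.01 else 1 - g for g in new_pop]

random.seed(0)
pop = [random.randint(0, 1) for _ in range(100)]  # 1 = cooperator
for _ in range(200):
    pop = step(pop)
print(sum(pop), "of", len(pop), "agents now cooperate in the pack")
```

After a few hundred generations the cooperator share climbs toward fixation, a crude echo of the pack-forming SpeedySwaf mentions; real experiments in evolutionary robotics use far richer agents and environments.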
  • Morninglord I'm tired of being Batman, so today I'll be Owl. Registered User regular
    edited March 2010
    For me, this is basically "we can't do this, instead of consciously evolving ourselves let's elect a nanny"

    Nice idea but I prefer not to give up, thanks.

    Morninglord on
    (PSN: Morninglord) (Steam: Morninglord) (WiiU: Morninglord22) I like to record and toss up a lot of random gaming videos here.
  • Atomika Live fast and get fucked or whatever Registered User regular
    edited March 2010
    I can't really think of a feasible approach to governing by AI, but I do think it would be really interesting to have an interactive, non-partisan, non-binding method of interaction between government and the general population, if for nothing more than more accurate information for people electing representatives and more accurate and swift forms of opinion polling.


    "63% of American households disapprove of the latest sweeping social reforms, but of those respondents, 84% have a less than 4th grade reading level and an IQ under 75. The remaining 16% have documented histories of accidentally hitting 'No' when they really mean 'Yes' on government surveys."

    Atomika on
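(Editor's aside: joke crosstabs aside, the "more accurate and swift forms of opinion polling" Atomika mentions has a quantifiable core: the sampling margin of error of a poll. A quick sketch; the 63% figure comes from the post's joke, and the sample size of 1,000 is an invented example input.)

```python
# 95% margin of error for a polled proportion, using the normal
# approximation for a simple random sample.
import math

def margin_of_error(p, n, z=1.96):
    """Margin of error for proportion p from n respondents at the
    confidence level implied by z (1.96 -> roughly 95%)."""
    return z * math.sqrt(p * (1 - p) / n)

# If 63% of 1,000 sampled households disapprove:
moe = margin_of_error(0.63, 1000)
print(f"63% ± {moe * 100:.1f} points")  # prints: 63% ± 3.0 points
```

Swifter polling mostly buys larger n, and the error shrinks only with the square root of n, which is why even a fast, honest polling channel cannot make a single survey decisive.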
  • Morninglord I'm tired of being Batman, so today I'll be Owl. Registered User regular
    edited March 2010
    I can see a lot of potential for AI in all fields of life, especially as impartial observers.

    But as flawed as we are, as wrong as we often are, as bad as we are at governing ourselves....if we don't do it why do we deserve to survive at all?

    Asking for an AI to govern us is not wanting to deal with trying to change other people. That's a real shame, because you can change other people. It just takes time and patience.

    Morninglord on
  • Elitistb Registered User regular
    edited March 2010
    Quid wrote: »
    I think this is the issue. People watch only the Terminator and The Matrix then only associate AI with "Kill all Humans".
    The amusing part about The Matrix is that in the movies they barely cover the backstory you'd get from the Animatrix, where you learn that the humans were the silly geese that virtually caused all the problems.
    Asking for an AI to govern us is not wanting to deal with trying to change other people. That's a real shame, because you can change other people. It just takes time and patience.
    Missing something here. "It just takes time and patience while you wait for their generation to die and hope their children are reasonable." That looks more like reality.

    Elitistb on
  • Apothe0sis Have you ever questioned the nature of your reality? Registered User regular
    edited March 2010
    Someone may already have mentioned this, but the books Recursion, Capacity and... the Other one, by Tony Ballentyne (or something, I can't be bothered walking down stairs, the pain is too great) deal with having an AI government in an interesting way.

    There is a range of issues it deals with, such as the fact that by the time an A.I. is powerful enough to design other A.I.s, it is so complex and so far beyond the comprehension of a human, or even a team of humans, that we can have no idea what its motivations are.

    The books are really good even if you reject the premises injected into the latter two books, which I do.

    They're pretty firmly my favourite Sci Fi books ever.

    Apothe0sis on
  • Morninglord I'm tired of being Batman, so today I'll be Owl. Registered User regular
    edited March 2010
    Elitistb wrote: »
    Quid wrote: »
    I think this is the issue. People watch only the Terminator and The Matrix then only associate AI with "Kill all Humans".
    The amusing part about The Matrix is that in the movies they barely cover the backstory you'd get from the Animatrix, where you learn that the humans were the silly geese that virtually caused all the problems.
    Asking for an AI to govern us is not wanting to deal with trying to change other people. That's a real shame, because you can change other people. It just takes time and patience.
    Missing something here. "It just takes time and patience while you wait for their generation to die and hope their children are reasonable." That looks more like reality.

    I kind of implicitly meant "time and patience as you work towards what you want", not "time and patience while you sit on your ass whining, not doing anything"

    Morninglord on
  • Elitistb Registered User regular
    edited March 2010
    That doesn't mean you'll change some people's minds. Many of the big societal shifts were caused by generational change: one generation primarily opposed something, the next did not.

    Elitistb on
  • Buttcleft Registered User regular
    edited March 2010
    Elitistb wrote: »
    Quid wrote: »
    I think this is the issue. People watch only the Terminator and The Matrix then only associate AI with "Kill all Humans".
    The amusing part about The Matrix is that in the movies they barely cover the backstory you'd get from the Animatrix, where you learn that the humans were the silly geese that virtually caused all the problems.
    Asking for an AI to govern us is not wanting to deal with trying to change other people. That's a real shame, because you can change other people. It just takes time and patience.
    Missing something here. "It just takes time and patience while you wait for their generation to die and hope their children are reasonable." That looks more like reality.

    I kind of implicitly meant "time and patience as you work towards what you want", not "time and patience while you sit on your ass whining, not doing anything"

    most of the time in movies or the backstories of the movies or what have you,

    the AI function of kill all humans is a direct developmental result of some atrocious thing humans did to it previously, an example of which was previously mentioned in this quote train

    Buttcleft on
  • DanHibiki Registered User regular
    edited March 2010
    For me, this is basically "we can't do this, instead of consciously evolving ourselves let's elect a nanny"

    Nice idea but I prefer not to give up, thanks.

    you just used a computer to type that up rather than chiseling it on the side of a rock. You already gave up, just to a different degree.

    DanHibiki on
  • Buttcleft Registered User regular
    edited March 2010
    DanHibiki wrote: »
    For me, this is basically "we can't do this, instead of consciously evolving ourselves let's elect a nanny"

    Nice idea but I prefer not to give up, thanks.

    you just used a computer to type that up rather than chiseling it on the side of a rock. You already gave up, just to a different degree.

    Using the computer as a tool to tell you his message,

    and the computer telling you what it wants his message to you to be,

    are vastly separate things.

    Buttcleft on
  • DarkWarrior __BANNED USERS regular
    edited March 2010
    I think the fact is, we can evolve much faster as a people, change as a people, if we have an impartial, intelligent, informed entity controlling the more basic facets of life, like speed limits, money allocation and minor things like gay marriage, which consume HUGE amounts of time for humans to debate. Thus we are able to evolve and become better people quicker, because we're no longer rabidly foaming at the mouth to get our voice heard; it's just heard.

    You can say you don't want to give up on humans and human leaders, that's fine; it's just that by the time they've sorted things out you'll have been dead for about 6 centuries.

    DarkWarrior on
  • frandelgearslip Registered User regular
    edited March 2010
    Mr_Rose wrote: »
    Which brings up why this discussion is full of silly goosery: everybody assumes

    Perfect AI == their opinions, which is of course arrogance beyond belief.

    I would have a great laugh if right after this AI was enacted it totally deregulated the economy right before appointing Glenn Beck as its right hand man.

    Should fat people be restricted in what they are allowed to eat? According to some threads on this board lately I think your perfect AI could go either way depending on who is describing it.

    I kind of like how you assume you know my entire position from one post containing two diametrically opposed possible outcomes. Especially the part where you additionally conclude that because I indicate a dislike of religion I can't possibly be in favour of a deregulated economy under this hypothetical AI, when in fact such an entity would be the one thing we'd need to make such an economy viable in the long term; an impartial, omniscient consumer advocate.

    I used your quote because you assumed (like Mr. Transhumanism :P ) that you knew what the AI would do. The second paragraph had nothing to do with you and was directed at the posters in general, most of whom are assuming that the computer will be a far left Democrat.
    You can say you don't want to give up on humans and human leaders, that's fine; it's just that by the time they've sorted things out you'll have been dead for about 6 centuries.

    That's a silly goose of an argument. None of us are going to be alive for your nanny computer either.

    frandelgearslip on
  • DarkWarrior __BANNED USERS regular
    edited March 2010
    Mr_Rose wrote: »
    Which brings up why this discussion is full of silly goosery: everybody assumes

    Perfect AI == their opinions, which is of course arrogance beyond belief.

    I would have a great laugh if right after this AI was enacted it totally deregulated the economy right before appointing Glenn Beck as its right hand man.

    Should fat people be restricted in what they are allowed to eat? According to some threads on this board lately I think your perfect AI could go either way depending on who is describing it.

    I kind of like how you assume you know my entire position from one post containing two diametrically opposed possible outcomes. Especially the part where you additionally conclude that because I indicate a dislike of religion I can't possibly be in favour of a deregulated economy under this hypothetical AI, when in fact such an entity would be the one thing we'd need to make such an economy viable in the long term; an impartial, omniscient consumer advocate.

    I used your quote because you assumed (like Mr. Transhumanism :P ) that you knew what the AI would do. The second paragraph had nothing to do with you and was directed at the posters in general, most of whom are assuming that the computer will be a far left Democrat.
    You can say you don't want to give up on humans and human leaders, that's fine; it's just that by the time they've sorted things out you'll have been dead for about 6 centuries.

    That's a silly goose of an argument. None of us are going to be alive for your nanny computer either.

    We would under our perfect computer Overlord. Get your head around that.

    DarkWarrior on
  • frandelgearslip Registered User regular
    edited March 2010
    Mr_Rose wrote: »
    Which brings up why this discussion is full of silly goosery: everybody assumes

    Perfect AI == their opinions, which is of course arrogance beyond belief.

    I would have a great laugh if right after this AI was enacted it totally deregulated the economy right before appointing Glenn Beck as its right hand man.

    Should fat people be restricted in what they are allowed to eat? According to some threads on this board lately I think your perfect AI could go either way depending on who is describing it.

    I kind of like how you assume you know my entire position from one post containing two diametrically opposed possible outcomes. Especially the part where you additionally conclude that because I indicate a dislike of religion I can't possibly be in favour of a deregulated economy under this hypothetical AI, when in fact such an entity would be the one thing we'd need to make such an economy viable in the long term; an impartial, omniscient consumer advocate.

    I used your quote because you assumed (like Mr. Transhumanism :P ) that you knew what the AI would do. The second paragraph had nothing to do with you and was directed at the posters in general, most of whom are assuming that the computer will be a far left Democrat.
    You can say you don't want to give up on humans and human leaders, that's fine; it's just that by the time they've sorted things out you'll have been dead for about 6 centuries.

    That's a silly goose of an argument. None of us are going to be alive for your nanny computer either.

    We would under our perfect computer Overlord. Get your head around that.

    In the time it takes to invent said computer, the six centuries you posited could also have passed.

    If you're handwaving and saying nannybot exists today, then I am handwaving and saying my 600 years have passed also.

    frandelgearslip on
  • DanHibiki Registered User regular
    edited March 2010
    You can say you don't want to give up on humans and human leaders, that's fine; it's just that by the time they've sorted things out you'll have been dead for about 6 centuries.

    That's a silly goose of an argument. None of us are going to be alive for your nanny computer either.
    Maybe you, I exercise.

    DanHibiki on
  • Modern Man Registered User regular
    edited March 2010
    I think the fact is, we can evolve much faster as a people, change as a people, if we have an impartial, intelligent, informed entity controlling the more basic facets of life, like speed limits, money allocation and minor things like gay marriage, which consume HUGE amounts of time for humans to debate. Thus we are able to evolve and become better people quicker, because we're no longer rabidly foaming at the mouth to get our voice heard; it's just heard.

    You can say you don't want to give up on humans and human leaders, that's fine; it's just that by the time they've sorted things out you'll have been dead for about 6 centuries.
    The problem is, your definition of "basic facets of life" and "minor things" is purely subjective. Your argument in favor of an AI government seems to boil down to the belief that such government will make the decisions you want on issues like gay marriage and allocation of tax money.

    It seems like you don't like the fact that some of your fellow citizens might disagree with you, so you want some AI to make the "right" (read: your preferred) decisions on these matters.

    It should be pointed out that some Christians believe that Jesus will return and rule the Earth like a benevolent, all-knowing father figure and will make all the "right" (read: those Christians' preferred) decisions on all Earthly matters. If you don't think that's a good idea, then neither is the idea of an AI government.

    Modern Man on
    Aetian Jupiter - 41 Gunslinger - The Old Republic
    Rigorous Scholarship

  • DanHibiki Registered User regular
    edited March 2010
    Modern Man wrote: »
    I think the fact is, we can evolve much faster as a people, change as a people, if we have an impartial, intelligent, informed entity controlling the more basic facets of life, like speed limits, money allocation and minor things like gay marriage, which consume HUGE amounts of time for humans to debate. Thus we are able to evolve and become better people quicker, because we're no longer rabidly foaming at the mouth to get our voice heard; it's just heard.

    You can say you don't want to give up on humans and human leaders, that's fine; it's just that by the time they've sorted things out you'll have been dead for about 6 centuries.
    The problem is, your definition of "basic facets of life" and "minor things" is purely subjective. Your argument in favor of an AI government seems to boil down to the belief that such government will make the decisions you want on issues like gay marriage and allocation of tax money.

    It seems like you don't like the fact that some of your fellow citizens might disagree with you, so you want some AI to make the "right" (read: your preferred) decisions on these matters.

    It should be pointed out that some Christians believe that Jesus will return and rule the Earth like a benevolent, all-knowing father figure and will make all the "right" (read: those Christians' preferred) decisions on all Earthly matters. If you don't think that's a good idea, then neither is the idea of an AI government.

    If he was benevolent, open to debate and real, I would not mind it.

    Jesus isn't such a bad guy, he's basically an ancient liberal. It's his dad that was the psycho with a superiority complex.

    DanHibiki on
  • MagicPrime FiresideWizard Registered User regular
    edited March 2010
    This reminds me of that story (Asimov's "The Last Question") with the Universal AC and it finding out how to stop entropy.

    MagicPrime on
    BNet • magicprime#1430 | PSN/Steam • MagicPrime | Origin • FireSideWizard
    Critical Failures - Havenhold CampaignAugust St. Cloud (Human Ranger)
  • DarkWarrior __BANNED USERS regular
    edited March 2010
    Modern Man wrote: »
    I think the fact is, we can evolve much faster as a people, change as a people, if we have an impartial, intelligent, informed entity controlling the more basic facets of life, like speed limits, money allocation and minor things like gay marriage, which consume HUGE amounts of time for humans to debate. Thus we are able to evolve and become better people quicker, because we're no longer rabidly foaming at the mouth to get our voice heard; it's just heard.

    You can say you don't want to give up on humans and human leaders, that's fine; it's just that by the time they've sorted things out you'll have been dead for about 6 centuries.
    The problem is, your definition of "basic facets of life" and "minor things" is purely subjective. Your argument in favor of an AI government seems to boil down to the belief that such government will make the decisions you want on issues like gay marriage and allocation of tax money.

    It seems like you don't like the fact that some of your fellow citizens might disagree with you, so you want some AI to make the "right" (read: your preferred) decisions on these matters.

    It should be pointed out that some Christians believe that Jesus will return and rule the Earth like a benevolent, all-knowing father figure and will make all the "right" (read: those Christians' preferred) decisions on all Earthly matters. If you don't think that's a good idea, then neither is the idea of an AI government.

    I never said I wanted it to vote my way because I think my way is right; I just want it to make impartial, uncorrupted, informed decisions without being tinged by the irrational fear, hatred and pettiness that Americans are seeing right now in Congress. The government now doesn't agree with me, and an AI wouldn't necessarily either, but at least I'd know it's not in it for itself.

    DarkWarrior on
  • Zombie Nirvana Registered User regular
    edited March 2010
    I'd also like to put my name forward as a benevolent monarch figure. I'd do at least as good a job as the AI. I promise.

    Zombie Nirvana on
  • lazegamer The magnanimous cyberspace Registered User regular
    edited March 2010
    Modern Man wrote: »
    I think the fact is, we can evolve much faster as a people, change as a people, if we have an impartial, intelligent, informed entity controlling the more basic facets of life, like speed limits, money allocation and minor things like gay marriage, which consume HUGE amounts of time for humans to debate. Thus we are able to evolve and become better people quicker, because we're no longer rabidly foaming at the mouth to get our voice heard; it's just heard.

    You can say you don't want to give up on humans and human leaders, that's fine; it's just that by the time they've sorted things out you'll have been dead for about 6 centuries.
    The problem is, your definition of "basic facets of life" and "minor things" is purely subjective. Your argument in favor of an AI government seems to boil down to the belief that such government will make the decisions you want on issues like gay marriage and allocation of tax money.

    It seems like you don't like the fact that some of your fellow citizens might disagree with you, so you want some AI to make the "right" (read: your preferred) decisions on these matters.

    It should be pointed out that some Christians believe that Jesus will return and rule the Earth like a benevolent, all-knowing father figure and will make all the "right" (read: those Christians' preferred) decisions on all Earthly matters. If you don't think that's a good idea, then neither is the idea of an AI government.

    I never said I wanted it to vote my way because I think my way is right; I just want it to make impartial, uncorrupted, informed decisions without being tinged by the irrational fear, hatred and pettiness that Americans are seeing right now in Congress. The government now doesn't agree with me, and an AI wouldn't necessarily either, but at least I'd know it's not in it for itself.

    Without the motivation and will of a human (or similarly constructed) being, though, what makes you think the decisions are to your benefit? To an AI that doesn't have an equivalent of emotion, what drives it to make decisions that favor humanity? You could cage its ability to think around a concrete objective, like the survival of the human race. However, you can't direct it to do the things that will make people happy without it having some understanding of the emotion. And if it has emotion, then you have the same problems as with a human leader who doesn't act entirely rationally.

    lazegamer on
    I would download a car.
  • DanHibiki Registered User regular
    edited March 2010
    You don't need to be feeling an emotion to understand the emotion, you need intelligence.

    The main trick is that it's impartial.

    DanHibiki on
  • DarkWarrior __BANNED USERS regular
    edited March 2010
    lazegamer wrote: »
    Modern Man wrote: »
    I think the fact is, we can evolve much faster as a people, change as a people, if we have an impartial, intelligent, informed entity controlling the more basic facets of life, like speed limits, money allocation and minor things like gay marriage, which consume HUGE amounts of time for humans to debate. Thus we are able to evolve and become better people quicker, because we're no longer rabidly foaming at the mouth to get our voice heard; it's just heard.

    You can say you don't want to give up on humans and human leaders, that's fine; it's just that by the time they've sorted things out you'll have been dead for about 6 centuries.
    The problem is, your definition of "basic facets of life" and "minor things" is purely subjective. Your argument in favor of an AI government seems to boil down to the belief that such government will make the decisions you want on issues like gay marriage and allocation of tax money.

    It seems like you don't like the fact that some of your fellow citizens might disagree with you, so you want some AI to make the "right" (read: your preferred) decisions on these matters.

    It should be pointed out that some Christians believe that Jesus will return and rule the Earth like a benevolent, all-knowing father figure and will make all the "right" (read: those Christians' preferred) decisions on all Earthly matters. If you don't think that's a good idea, then neither is the idea of an AI government.

    I never said I wanted it to vote my way because I think my way is right; I just want it to make impartial, uncorrupted, informed decisions without being tinged by the irrational fear, hatred and pettiness that Americans are seeing right now in Congress. The government now doesn't agree with me, and an AI wouldn't necessarily either, but at least I'd know it's not in it for itself.

    Without the motivation and will of a human (or similarly constructed) being, though, what makes you think the decisions are to your benefit? To an AI that doesn't have an equivalent of emotion, what drives it to make decisions that favor humanity? You could cage its ability to think around a concrete objective, like the survival of the human race. However, you can't direct it to do the things that will make people happy without it having some understanding of the emotion. And if it has emotion, then you have the same problems as with a human leader who doesn't act entirely rationally.

    It would depend how intelligent it was; I'm assuming a highly advanced AI capable of evolving and learning. It'd be programmed with parameters to stay within and guidelines to lead it, the way we are as we grow, so it knows that killing people shouldn't really be a good thing to do. It's not so much about making people happy as meeting criteria it knows are beneficial to its people.

    So it wouldn't legalise cannabis because it makes people happy, but because by its understanding it won't make them unhappy and it will increase its income to be spent on its other projects.

    DarkWarrior on
    Quid Definitely not a banana Registered User regular
    edited March 2010
    For me, this is basically "we can't do this, instead of consciously evolving ourselves let's elect a nanny"

    Nice idea but I prefer not to give up, thanks.

    Who said anything about giving up improving ourselves? I don't need Palin in charge in order to figure out how to become a better person.

    Quid on
    Morninglord I'm tired of being Batman, so today I'll be Owl. Registered User regular
    edited March 2010
    DanHibiki wrote: »
    For me, this is basically "we can't do this, instead of consciously evolving ourselves let's elect a nanny"

    Nice idea but I prefer not to give up, thanks.

    You just used a computer to type that up rather than chiseling it on the side of a rock. You already gave up, just to a different degree.

    Nonsense.

    Your example isn't even remotely analogous. I did all my own thinking when I wrote that, and I made my own choice and wrote it under my own power. My computer did not type it for me; I typed it into a computer.
    Quid wrote: »
    For me, this is basically "we can't do this, instead of consciously evolving ourselves let's elect a nanny"

    Nice idea but I prefer not to give up, thanks.

    Who said anything about giving up improving ourselves? I don't need Palin in charge in order to figure out how to become a better person.

    Individual improvement is pretty pointless if those individuals don't try to interact with anything other than themselves.

    I think we need much better large scale social intelligence. If we had machines run us, there wouldn't be any need to develop that, any need to develop deeper understanding of anything further than our own individual lives and pursuits.

    Since we are social creatures foremost and individuals second, to me it's stifling our (potential) evolution.

    People aren't going to bother if they don't have to. We only get better at things with struggle.

    Morninglord on
    (PSN: Morninglord) (Steam: Morninglord) (WiiU: Morninglord22) I like to record and toss up a lot of random gaming videos here.
    electricitylikesme Registered User regular
    edited March 2010
    The point isn't to have machines run us though. It's to have a machine understand us, to provide utility from government we otherwise cannot achieve.

    My whole OP was motivated by "what if we could give everyone that one friend who knows about politics", which is what seems to happen a lot in other threads we had - notably on healthcare - where plenty of people have managed to convince others of the utility of what was passed by patiently explaining the entire thing to them.

    People would still be running their government, just the representative would be a vastly more capable one which really would be concerned with your individual needs.

    Lobby groups and money only really hold power in politics because they affect how well you can appear to communicate with your audience - exposure ultimately wins elections, because you really can't talk to all your citizens individually, and you definitely can't remember or understand most of them (the upper limit seems to be maybe 400-500 people in your monkeysphere). Make an AI sufficiently capable, conversely, and we could create an entity with a monkeysphere that could encompass billions of people, and be capable of reaching and talking to all of them. Such a thing would be immune to traditional lobbyists.

    I also take issue with the idea that not having emotions means it couldn't govern humans effectively: we govern people without sharing their emotions all the time; we're just ruled by our own, and pretty poor at it. Applying intelligence and rationality to any problem, you can deal with the emotions of others and understand them without experiencing them personally. Humanity's great flaw is that we're really bad at considering the emotions of others when they're suitably removed from us.

    electricitylikesme on
    DarkWarrior __BANNED USERS regular
    edited March 2010
    I still think a few AIs and a super server feeding into a human-nano augmented hybrid is the best solution! And practical.

    DarkWarrior on
    Quid Definitely not a banana Registered User regular
    edited March 2010
    Individual improvement is pretty pointless if those individuals don't try to interact with anything other than themselves.

    I think we need much better large scale social intelligence. If we had machines run us, there wouldn't be any need to develop that, any need to develop deeper understanding of anything further than our own individual lives and pursuits.

    Since we are social creatures foremost and individuals second, to me it's stifling our (potential) evolution.

    People aren't going to bother if they don't have to. We only get better at things with struggle.

    Okay, again, I can still do all of this regardless of who's in charge of the country. The government isn't the only large scale social interaction; it's just the one with by far the most serious consequences should things go wrong.

    Whereas, should the drama society screw things up and vote the wrong person to take charge, the worst case scenario is there's no production of King Lear this autumn.

    Quid on
    Morninglord I'm tired of being Batman, so today I'll be Owl. Registered User regular
    edited March 2010
    Quid wrote: »
    Individual improvement is pretty pointless if those individuals don't try to interact with anything other than themselves.

    I think we need much better large scale social intelligence. If we had machines run us, there wouldn't be any need to develop that, any need to develop deeper understanding of anything further than our own individual lives and pursuits.

    Since we are social creatures foremost and individuals second, to me it's stifling our (potential) evolution.

    People aren't going to bother if they don't have to. We only get better at things with struggle.

    Okay, again, I can still do all of this regardless of who's in charge of the country. The government isn't the only large scale social interaction; it's just the one with by far the most serious consequences should things go wrong.

    Whereas, should the drama society screw things up and vote the wrong person to take charge, the worst case scenario is there's no production of King Lear this autumn.

    I can't believe you are equating a small-scale drama production with large-scale government decisions and trying to claim that the problems and complexity involved are equal.

    I honestly cannot believe you are trying to do that, Quid. It's like watching someone trying to ram a square peg into a triangle hole.

    Morninglord on
    (PSN: Morninglord) (Steam: Morninglord) (WiiU: Morninglord22) I like to record and toss up a lot of random gaming videos here.
    Quid Definitely not a banana Registered User regular
    edited March 2010
    Because they are still analogous. You're claiming people need to run a government in order to grow because it's a widespread social institution, but there are a myriad of others. Deciding which road to build next isn't necessary for human growth.

    But hey, replace it with whatever large scale social institution you would like.

    Quid on