You can understand emotion but you can't replicate it. Without emotion there is no ambition, meaning that no AI will ever be "smart" enough to try and take over the world and kill humanity.
You're just making things up.
So now you're just going to dismiss my arguments. That's cool.
You have no evidence for your assertions. Why shouldn't your argument be dismissed? You are making a claim: "Computers will never be able to match human creativity." You need to back it up with something besides your gut opinion, which at this point looks like nothing more than a poorly thought out preconceived notion.
Lols. Honestly though, do you think Einstein would have come up with the Theory of Relativity if he hadn't led the life that he had, hadn't felt the way he felt? I know computers are smart, in terms of number-crunching and analytical thought. But humans have abilities that far surpass computers because of their emotions.
I think the whole problem with this intellectual exercise is that no one has yet defined AI. Are we talking about something that can manage a large amount of data or something that is inherently human but run by circuitboards instead of neurons?
I don't think it will be possible to create an AI that is exactly human, as circuits will likely never be able to fully capture the complexity of the entire human body (whether I've eaten recently changes how my brain works, for example), but I don't see any reason to assume a self-aware, sentient entity capable of some emotion is impossible.
The beauty is that they don't have to.
An AI just has to accurately simulate proper brain function, which is a lot easier than reproducing a human at the subatomic level.
The beauty of that is you can make the AI far more intelligent (with instant access to any available data) and more versatile than a human brain, and just as creative/artistic, etc.
It's true, but no AI will be human, and I doubt it will even be close unless that is an intended design parameter. I think AI will produce art, and I think it will be very alien to our mindset. Which, incidentally enough, I think is excellent. Guernica might reveal a lot about the human condition, but I am at least as interested in a window into an inhuman mindset.
But there will inevitably be an AI or android built to be in every way mentally human, simply because it is the sort of thing that fascinates science and humanity. "Can an artificial human be identical to a human?" That will be a question on the minds of every scientist from the moment that the first AI is built. It won't be for any practical purpose, but it will be built just to prove that point.
I have no doubt that these copies will be more human than human.
You have no evidence for your assertions. Why shouldn't your argument be dismissed? You are making a claim: "Computers will never be able to match human creativity." You need to back it up with something besides your gut opinion, which at this point looks like nothing more than a poorly thought out preconceived notion.
Huh. Now show me some evidence that says computers have any honest emotion. None? Okay. Then this whole argument becomes entirely hypothetical, and I'm entitled to my opinion as much as yours.
First of all, you are assuming that we would provide our AI with no emotions. This is a baseless assumption. It stands to reason that we would indeed program our future machines with an obsessive interest in the tasks we want them to perform.
Are you suggesting that Einstein understanding that matter and energy are the same substance came from him ogling hooters when he was 13?
The assertion that emotions are inherently human and can't be understood or replicated is different from the assertion that humans may one day be able to create emotion in an AI.
edit: A hypothetical argument is not a go-ahead for baseless assumptions.
1. You're shifting the goalposts. Are you arguing that computers can't create art or that computers can't have emotions? Different but related things.
2. I am not really asserting anything; I just refuse to discount any possibilities. You have made concrete assertions with regards to the limitations of AI, and therefore the burden of proof is on you.
Naw, but maybe the intellectual leap he took to make such an assertion resulted directly from his emotions and human interaction.
I'm not arguing that computers can't create art; they can. I'm saying that they can't create any art of value because they don't have and will never have emotions.
Simply put, I don't think AI will ever have emotion and that will be its one and only limitation. That limitation alone will prevent it from ever being a truly perfect intelligence.
Naw, but maybe the intellectual leap he took to make such an assertion resulted directly from his emotions and human interaction.
Pure speculation. It is equally likely that his formidable brain could have led to even deeper discoveries had he not been inhibited by quirks of human psychology.
I'm not arguing that computers can't create art; they can. I'm saying that they can't create any art of value because they don't have and will never have emotions.
This is a meaningless quibble.
Simply put, I don't think AI will ever have emotion and that will be its one and only limitation. That limitation alone will prevent it from ever being a truly perfect intelligence.
Where is your evidence? You have said emotion cannot be understood by humans. Why is that?
Why couldn't we give computers emotions?
It's not a meaningless argument, it's just a debate that belongs in another thread.
The depth and scope of what exactly emotion is means that it is entirely impossible to replicate.
The end-run around your entire flawed analysis is the computer program that simulates all the electro-chemical processes of a human neuron and of a human brain. Is such a thing inconceivable? We simulate more and more biological processes every day, with more and more accuracy. Once we can create a brain simulator, there is nothing we need to "instill" into it for it to create art or emotion. It will create them on its own.
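To make the "simulate the electro-chemical processes" idea a little more concrete, here is a toy sketch in Python (purely illustrative, and absurdly simplified next to anything resembling a real brain simulator): a single leaky integrate-and-fire neuron, one of the standard simplified models computational neuroscience actually uses. The parameter values are generic textbook-style numbers chosen for the example, not anything from this thread; the only point is that "electro-chemical processes" reduce to equations a computer can step through.

# Toy leaky integrate-and-fire neuron: the membrane voltage decays toward a
# resting value, is pushed up by input current, and "fires" (then resets)
# when it crosses a threshold. Purely illustrative.

def simulate_neuron(input_current, dt=0.001, tau=0.02, v_rest=-65.0,
                    v_reset=-70.0, v_threshold=-50.0, resistance=10.0):
    """Return the spike times produced by a list of input-current samples."""
    v = v_rest
    spike_times = []
    for step, current in enumerate(input_current):
        # Euler step of: tau * dV/dt = -(V - V_rest) + R * I
        v += (-(v - v_rest) + resistance * current) * (dt / tau)
        if v >= v_threshold:               # threshold crossed: spike
            spike_times.append(step * dt)  # record the time in seconds
            v = v_reset                    # reset the membrane potential
    return spike_times

# One simulated second of constant input makes the neuron fire at a steady rate:
print(simulate_neuron([2.0] * 1000))

Scaling from one crude model neuron up to an accurate whole brain is, of course, the entire open problem; the sketch only shows that nothing mystical is required at the level of a single cell.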
I misspoke in saying we can't understand emotion. We can understand it to a point. My assertion is that its depth and scope mean that we will never fully understand it and therefore never be able to recreate it.
Why is that?
It's all just energy/matter doing shit.
Oh well, if it's just energy and matter doing shit, let's get that chem lab at home going and figure it out.
Remember when you were complaining about strawmen? The brain is just matter and energy. That doesn't mean it isn't highly complex.
In the act of simulating a brain you're going to need to replicate emotions.
...right. Which is exactly what Yar is proposing.
No. He said that it will create them on its own. It won't. It's not human. It won't be born with them. As its creators, the ones who develop it will need to instill them in the computer in question. I'm saying that's impossible. Emotions are simply too complex.
Emotions are part of the brain. Ergo, an artificial duplicate of the brain will have emotions. I do not understand why you can't handle this idea. Also, emotions may very well be the least complex activities in the brain.
Emotions are actually a whole-body thing. If people stop at the brain when designing AI, they're missing all the functions of the spine and a lot of other systems.
If there is no supernatural force behind the mind, then we can safely say that yes, emotions can be explained and accurately simulated.
This whole emotions thing is kind of tangential, anyway. When we go ahead with creating AI, I suspect that creating Robot Shakespeare will be well behind Robot Einstein in terms of priorities.
Although Robot Einstein might surprise us and write a sonnet. That would be an exciting day.
Hahahahahaa. I have a horrible headache so I'm out for the night. Just to clarify, I don't think AI in the future is necessarily impossible. But I think it's bound by the abilities of its makers. And to an extent, however smart it will be, I don't think it will have the creative abilities of your average fourth grader. Because it will be bound by the limitations of its creators.
And it's that lack of creative ability that will prevent it from making humanity obsolete.
Night all.
People who design bulldozers aren't comparatively strong. Being capable of something and designing something that is capable of more are two very different things.
Edit: Also, you never explained why the fact that emotions are essentially just chemical reactions keeps AIs from experiencing the same thing.
This is basically what you're all talking about and have no idea.
Artificial Intelligence is meaningless; we've achieved it. All it is, in terms of engineering, is a system that responds and adapts differently based on a possibly dynamic set of rules over a stream of incoming information.
When machines can think as we think, that is to say, when they can achieve abstract thought, it should be considered Artificial Consciousness.
Consciousness leads to self-awareness; once machines are self-aware they can improve their own designs, and most likely advance faster than we can, thus taking humanity out of technological advancement.
I highly doubt that said machines will be cold, utterly logical creatures. If these entities are capable of consciousness, it is not a stretch of reason to assume they will exhibit other contemporary aspects of humanity. They should only deem it worthy to engage us in conflict if we give them sufficient reason to, and not some silly "humanity is a threat to itself and must be exterminated" reason from Sci-Fi.
I would think that the most extreme scenario will be one that is more popular in fiction now: we let them do everything for us, only to become slaves to our own children.
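To illustrate that mundane engineering sense of "artificial intelligence," here is a toy sketch in Python (the rules, messages, and names are invented for the example): a small system that responds to a stream of incoming information by applying a set of rules, and adapts by re-weighting those rules based on feedback. It fits the definition above, and there is obviously nothing resembling consciousness anywhere in it.

# Toy "responds and adapts" system: check each incoming message against a set
# of rules, act on the best-weighted rule that matches, then adjust the
# weights based on feedback. Everything here is made up for illustration.

rules = {
    "looks_like_spam": lambda msg: "free money" in msg.lower(),
    "looks_urgent":    lambda msg: "!" in msg,
}
weights = {name: 1.0 for name in rules}

def respond(msg):
    """Return the highest-weighted rule that matches the message, if any."""
    matching = [name for name, rule in rules.items() if rule(msg)]
    return max(matching, key=weights.get) if matching else None

def adapt(rule_name, was_correct):
    """Strengthen or weaken the rule that fired (the 'dynamic rules' part)."""
    if rule_name is not None:
        weights[rule_name] *= 1.1 if was_correct else 0.9

# A stream of incoming information, with feedback after each response:
for msg, feedback in [("FREE MONEY!!!", True), ("Lunch at noon!", False)]:
    decision = respond(msg)
    adapt(decision, feedback)
    print(msg, "->", decision, weights)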
Naw, but maybe the intellectual leap he took to make such an assertion resulted directly from his emotions and human interaction.
Or maybe because he was doing math and the numbers added up.
Oddly enough, computers are good at math and numbers.
See, this is what I'm talking about when I talk about straw man stuff. The theory of relativity is not all math and numbers.
And yes, computers are good at number crunching: the numbers humans tell them to crunch.
The depth and scope of what exactly emotion is means that it is entirely impossible to replicate.
Ahahahahahahahahahaha
AI is man-made. DUR.
Anyway, there is an entire society on this stuff, led by a renowned genius.
This is romanticized bullshit. Emotions are chemicals reacting in a human brain. Why is that impossible to study and replicate?
edit: See above.
Oh well, if it's just energy and matter doing shit, let's get that chem lab at home going and figure it out.
Clearly a chem lab at home is just as good as thousands of scientists worldwide in hundreds of labs with billions of dollars of equipment.
Sorry, my sarcasm was lost on you. But now that you know it's sarcasm, maybe you get the joke.
That you have no supported argument? Har de har har?
http://www.damninteresting.com/?p=870