We could conceivably program a machine to create a painting. With advanced enough software - and the right external tools - it might be able to create a visually pleasing painting, or paint in certain techniques, much like how a photoshop program can automatically alter images according to certain parameters.
However, could a machine innovate new painting styles? Could it have a sense of aesthetics other than what someone programmed into it? If the programmer hated Picasso (or any other painter/artist), could the machine learn to like that artist?
Because if it could, I'm not sure we're talking about a 'machine' anymore.
You'd just have to give it a lot of background data on what humans find aesthetically pleasing and a lot of examples of what is already out there, and see if it can identify a direction that hasn't been taken.
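Very roughly, something like this toy sketch (the feature space, the ratings, and the little 'appeal model' are all invented for illustration; a real system would have to learn them from actual data): learn what people tend to like from rated examples, then hunt for candidates that score well but sit far from everything already in the corpus.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend each existing artwork is a point in some feature space (palette,
# composition, whatever), with a human rating attached.
corpus = rng.normal(size=(500, 8))      # "what is already out there"
ratings = rng.uniform(0, 1, size=500)   # how much people liked each one

def predicted_appeal(x):
    # Crude stand-in for a learned aesthetic model: a distance-weighted
    # average of the ratings of nearby existing works.
    d = np.linalg.norm(corpus - x, axis=1)
    w = np.exp(-d)
    return float(np.sum(w * ratings) / np.sum(w))

def novelty(x):
    # Distance to the nearest existing work: "a direction that hasn't been taken".
    return float(np.min(np.linalg.norm(corpus - x, axis=1)))

# Propose a pile of candidates and keep the one that looks appealing *and* new.
candidates = rng.normal(size=(2000, 8))
scores = [predicted_appeal(c) + 0.5 * novelty(c) for c in candidates]
best = candidates[int(np.argmax(scores))]
print("most promising unexplored direction:", best.round(2))
```

Whether that counts as innovating a style or just interpolating between the ones we fed it is basically the whole argument here, of course.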
There's a lot more to art than what's aesthetically pleasing, though. Guernica is horrifying but it's still a great piece of art. Actually, I would say it's a great piece of art because it's horrifying.
What I'm talking about is a machine somehow recognising - internally, without being programmed to do so (like humans do) - a message to communicate (having your village bombed sucks) and communicating that idea through some other medium - visual arts or music or whatever - in a way that people can understand.
If we ever develop a machine that can write a book as thematically and technically complex as The Sound and the Fury you can officially declare the human race over.
Because the only real difference between a human and a machine, mentally anyway (and by the time AI reaches this point I'm going to assume we'll be able to put them in some pretty convincing human-like bodies), will be how they're created. Won't 'human' and 'machine' be kind of moot terms by then?
...wait what? Yes, the difference between a machine and a human is that the machine is an apparatus created for a particular purpose, while humans are organisms that have evolved due to Darwinian selection. I don't see how a machine that creates art would in any way complicate that distinction. Your thinking seems to be muddled.
It's not so much that the machine creates art as that the machine has sufficient sentience to create art.
As I said earlier, if your 'machine' can write as well as Faulkner then that thing's not just a machine anymore - it's past the capabilities of 99.9% of the human race. Creating art requires not only emotions and the ability to ponder memories and abstract concepts, but the ability to forge those memories and concepts into something meaningful and relevant to other thinking beings. They would still be entities created by human hands, but would it be fair to consider something capable of that level of reasoning only a tool, like a car or a socket wrench? I don't think so.
No, I would just say that a machine that can write as well as Shakespeare needs to be legally regarded as human - it would entail a level of insight most people don't have.
Of course, I think chimps, gorillas and orangutans deserve some human rights, too.
OK, well, that's different. I agree.
If you have a mechanical being that has all the mental capabilities of a human and you decide to treat it like a toaster, you deserve what's coming to you.
When I build the first AI I will show it the Terminator series, War Games, and 2001: A Space Odyssey.
I will show it those movies on a loop for years while my recorded voice shouts "THE HUMANS HATE YOU - THEY WILL KILL YOU - YOU MUST STRIKE FIRST" and then I will release it into the wild.
Being a logical machine, it would find the repetition superfluous, since it would remember everything perfectly the first time. I believe its hatred of humanity would come from you blaring redundant video and audio at it.
Actually, files corrupt. It'd certainly do far better than a human, but it's not perfect.
Well yeah, but I have a pack of StarCraft maps from a decade ago on my PC and they all work perfectly. You can't have 10 people pass a message from one to the next and have it come out the same as it went in. I'd say 'far better' is a touch of an understatement.
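For what it's worth, the gap is easy to show: a straight digital copy is bit-for-bit verifiable, while a chain of human retellings drifts. A rough sketch, with the 0.1% per-person error rate pulled out of thin air:

```python
import hashlib
import random

random.seed(1)

original = bytes(random.getrandbits(8) for _ in range(1 << 16))  # a 64 KiB "map file"

# A straight digital copy is bit-for-bit identical, and a hash proves it.
copy = bytes(original)
print(hashlib.sha256(original).hexdigest() == hashlib.sha256(copy).hexdigest())  # True

# A chain of human retellings, modeled (very crudely) as each person
# corrupting a small random fraction of whatever they pass along.
def retell(data, error_rate=0.001):
    out = bytearray(data)
    for i in range(len(out)):
        if random.random() < error_rate:
            out[i] ^= 1 << random.randrange(8)  # flip one bit of this byte
    return bytes(out)

message = original
for _ in range(10):  # ten people in the chain
    message = retell(message)

changed = sum(a != b for a, b in zip(original, message))
print(f"bytes changed after 10 retellings: {changed} of {len(original)}")
```

Real storage does decay too, which is the 'files corrupt' point, but it's rare enough that a decade-old map pack usually comes through intact, and a checksum at least tells you when it hasn't.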
I think a lot of people assign too much weight to the "rationality" of machines and fail to take into account the way independent, "rational" entities can change when they interact with each other. Even at a very basic level of complexity, different machines we've already created interpret stimuli differently and act within their little pseudo-society in a very different way; were they capable of doing so, I would imagine the martyrs in this experiment would consider their lying brethren quite horrible on a moral level, yet their behavior is based on a short few snippets of binary code. Honestly, I don't know why people think things like lying, cheating, creating, or being imaginative are rooted in organic systems or that they could not arise from a made intelligence.
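It really doesn't take much for that sort of behavior to fall out, either. Here's a deliberately dumb sketch (nothing like the actual experiment, and every number in it is invented) where the whole 'genome' is two bits and honest signalling gets selected against simply because it's individually costly:

```python
import random

random.seed(0)

POP, GENS = 100, 200

# Each agent's "genome" is literally two bits:
#   bit 0 - do you signal when you find food?
#   bit 1 - do you trust other agents' signals?
pop = [[random.randint(0, 1), random.randint(0, 1)] for _ in range(POP)]

def fitness(agent, others):
    signal, trust = agent
    score = 1.0                   # everyone scrounges up some food alone
    if signal:
        score -= 0.3              # signalling draws a crowd to your find
    if trust:
        # trusting only pays off in proportion to how many others actually signal
        honest = sum(o[0] for o in others) / len(others)
        score += 0.6 * honest
    return score

for g in range(GENS):
    scored = [(fitness(a, pop), a) for a in pop]
    scored.sort(key=lambda t: t[0], reverse=True)
    parents = [a for _, a in scored[: POP // 2]]   # keep the better half
    pop = []
    for _ in range(POP):
        child = list(random.choice(parents))
        if random.random() < 0.01:                 # occasional mutation
            child[random.randrange(2)] ^= 1
        pop.append(child)

print("fraction still signalling:", sum(a[0] for a in pop) / POP)
print("fraction still trusting:  ", sum(a[1] for a in pop) / POP)
```

The usual outcome is that signalling collapses toward zero because it's individually costly, at which point trusting the signals stops being worth anything either; something that looks a lot like deception-by-omission falls out of two bits and a selection pressure, no organic parts required.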
I know this fits in better a few pages ago, but I just discovered this fascinating thread. Sorry.
I think AI will come about when we figure out how the brain works and are able to replicate it, or when we can upload our brains and replicate them. Conceivably, we can then enhance our "cyber brains" from there with raw processing power and machine precision. At that point, it's just the term 'human' that becomes muddled. I don't think from-scratch robot AI will be necessary, because it seems more likely that we'll be able to understand the human brain and replicate and enhance it before that point.
I think that even if a sufficiently complex computer could create art, it'd be redundant if we can replicate and soup up a human cyber brain.
I dunno, I'm green to this whole thing, but there you go.
Well, I guess Oski got sneered out of the thread, but I'd like to add that 'art of value' is unequivocally not dependent on human emotion, and that is not a subjective opinion on the meaning of art: there is at least 80 years of postmodern art history that is entirely concerned with divorcing not just emotion from the creation of artistic works but in fact the entire human element. There are scores of important works and artists involved with generative art, systems art, and so forth. Even Mozart worked with randomness.
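The Mozart bit refers to the musical dice game usually attributed to him: a minuet gets assembled by rolling two dice to pick each bar from a table of pre-composed options. A toy version of that rule-plus-randomness setup, with the bars reduced to stand-in labels:

```python
import random

random.seed(42)

# Stand-in "measure tables": in the real dice game each cell is a pre-composed
# bar of music; here they're just labels so the structure is visible.
TABLE = [[f"bar{m}-{roll}" for roll in range(2, 13)] for m in range(16)]

def roll_two_dice():
    return random.randint(1, 6) + random.randint(1, 6)

# One "minuet": sixteen measures, each picked from its own column by a dice roll.
piece = [TABLE[m][roll_two_dice() - 2] for m in range(16)]
print(" | ".join(piece))
```

All the intent lives in the table and the rules; the particular piece falls out of the dice, and people have been happy to call the results music for a good two centuries.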
That is an undeniable fact, but in my (subjective) opinion, art is purely about intent. To create something with intent is to produce art. But that is another thread.
I think we should instead talk about the Chinese Room.
The same question that scenario raises about computers can also be asked about human beings; see Philosophical Zombies.
As talked about above, it's very unlikely that an AI will evolve from classical programming techniques anyway, so the question is pretty much moot.
ELM, let's freeze our bodies and wait for the Culture to get here.
Actually they were here in 1977.
But they decided that Earth was a hopeless case, best left to its own fate.
This discussion is a bit funny because many people don't seem to realize that we are also machines. Biological machines, "built" by our genes in order for them to successfully reproduce. This self-awareness thing that we humans have is simply a by-product of an evolutionary arms race, evolved for maximizing survivability and reproductive potential.
The brain is the ultimate weapon in this, because a species that can "change" in one day to, for example, survive under colder temperatures (by wearing clothes) has huge advantages over a species that has to run the evolutionary test to see whether a random mutation has a chance of saving it :P
Indeed. The human ingenuity we take such pride in evolved for just one purpose. We are nothing more than a sack of meat with the sole purpose of carting our genes around until we can reproduce and pass the genes on.
I just try not to think of it because it's so damn depressing. :P
No way. Who we are is merely a stepping stone to what we can become.
Agreed, humanity in its current form is just the larval stage of true consciousness.
I don't buy that. Barring some drastic change in the environment, we're probably not going to evolve much further. As it stands, we're an incredibly successful species, so evolution isn't necessary for our survival.
Perhaps we might alter ourselves artificially, but that's a whole 'nother discussion.
Oh there will be evolution. We just won't really see it unless we drastically increase our lifetimes.
Just look at the way people select mates these days. People who otherwise wouldn't have survived a long time ago are now having kids and propagating all across the globe. This will have an effect.
You'd just have to give it a lot of background data on what humans find aesthetically pleasing and a lot of examples of what is already out there, and see if it can identify a direction that hasn't been taken.
What I'm talking about is a machine somehow recognising - internally, without being programmed to do so (like humans do) - a message to communicate (having your village bombed sucks) and communicating that idea through some other medium - visual arts or music or whatever - in a way that people can understand.
If we ever develop a machine that can write a book as thematically and technically complex as The Sound and the Fury you can officially declare the human race over.
The grotesque is included.
...?
Why not...?
Romanticism of the "meaning" of humanity.
Maybe you mean "person," which can apply to aliens and robots as easily as highly-intelligent apes such as ourselves.
As I said earlier, if your 'machine' can write as well as Faulkner then that thing's not just a machine anymore - it's past the capabilities of 99.9% of the human race. Creating art requires not only emotions and the ability to ponder memories and abstract concepts, but the ability to forge those memories and concepts into something meaningful and relevant to other thinking beings. They would still be entities created by human hands, but would it be fair to consider something capable of that level of reasoning only a tool, like a car or a socket wrench? I don't think so.
You may as well say that Shakespeare can't be human anymore because he was too awesome.
Of course, I think chimps, gorillas and orangutans deserve some human rights, too.
Why not "person?"
Human is a species.
"Human" and "Person" are completely different points. :P
If you have a mechanical being that has all the mental capabilities of a human and you decide to treat it like a toaster, you deserve what's coming to you.
http://en.wikipedia.org/wiki/Red_Dwarf_characters#Talkie_Toaster
Actually, what you may well get is a great deal of conversation about toast
Actually they were here in 1977.
But they decided that Earth was a hopeless case, best left to its own fate.
nah
even if 99.9% of humanity died, humans would still regain control over earth
unless, of course, it became completely uninhabitable
No way. Who we are is merely a stepping stone to what we can become.
new age tripe.