Hey everyone. I'm familiar with the Kurzweil singularity, I've read a few of Dresden Codak's comics, and I know a bit about transhumanism. I just want to get a feel for the kind of people who believe in it.
If you're a transhumanist, if you read h+, tell me why and how you believe the things that you do. Why is the singularity so appealing to you? What does it mean?
Transhumanists believe what they believe because they really buy into all the hype about the future.
The singularity is basically a big nerd rapture: a ridiculous idea that someday technology will be so advanced that everyone will be saved, live forever, and become all-knowing and all-powerful. It's the equivalent of Jesus coming down and rescuing all the people who believed in him, and it's probably just not going to happen. In essence, you can think of transhumanists as atheists who want religious ideas made real through technology: eternal life, life beyond death.
I saw Old Man DuKane post, and figured I'd say something as well, even though I'm not a Transhumanist -
Like he said, Transhumanism is just the new Millennialism. Instead of "God will save us any day now!" it's "Technology will save us any day now!"
The fundamental belief seems to be that the exponential acceleration of technology will continue, and that we're not just climbing a Gaussian curve punctuated by announcements like "We invented a cochlear implant!" The SINGULARITY and being a cyborg are two different things.
Transhumanism is often mixed in with belief in the technological singularity (Wikipedia separates them into Transhumanism and Singularitarianism).
If you believe in the betterment of humankind or the human condition through technology... implants, nano-machines, hell, even connectivity through things like the internet... you can deem yourself a transhumanist.
But the technological singularity is another thing altogether. People who believe in that are essentially substituting a technological faith for religious faith, and they're pretty much on the same level of superstition.
What's the difference between these and wearing a Rolex and calling yourself a transhumanist?
No, if you believe a person could be mechanically deconstructed completely into atoms and then rebuilt at another location and still be the same person, you are a transhumanist. You believe in relative souls.
I've never heard of this, but after looking at a few websites, I don't think I understand the singularity either. It's just a matter of creating a super-intelligent computer? We're not all being uploaded into the godhead or something?
If you create a super-intelligent computer that knows how to make itself even smarter recursively, you end up in an infinite recursion of increasing knowledge, until that knowledge reaches infinity, at which point things are just willed into existence.
In reality, there is nowhere to store all that knowledge, nor is there any basis for the idea that just because you possess infinite knowledge you can now just make things happen instantly.
...and in fact a singularity is impossible.
For instance, if a singularity were possible, and it were possible to create an infinitely all-knowing machine, then that machine should in theory be able to produce every possible 32-bit image at a resolution of 1900x1280. Thus every possible graphic, photograph and image that could ever exist would be contained within the set the machine could calculate. Imagine anything, and it's there. All the porn ever possible, with anyone and anything, all in there. The future, the present, the past, every single moment of your life, in every possible alternate universe, and every possible historical event that has occurred or ever will occur, all rendered in varying levels of accuracy and detail, several times. And this is only the beginning.
Except it's not possible. The Universe would die of heat death several times over before all those images could even be computed, even if you computed one image per Planck time, the smallest meaningful unit of time.
To do this task you would have to somehow create even smaller units of time through time travel. In fact you'd have to create negative time. In layman's terms, you'd have to make an algorithm that actually ran faster the larger the input got: an algorithm that runs in 1/n time. Impossible sh!t.
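A few lines of arithmetic back up the scale of that claim (a sketch using the poster's own figures of 1900x1280 pixels at 32 bits, and one image per Planck time):

```python
import math

# Number of distinct images at the stated size: 2^(pixels * bits per pixel).
bits_per_image = 1900 * 1280 * 32          # 77,824,000 bits of freedom
digits_in_count = bits_per_image * math.log10(2)
print(f"the image count has about {digits_in_count:,.0f} decimal digits")

# For comparison: Planck times elapsed since the Big Bang
# (~4.3e17 seconds of cosmic history / ~5.4e-44 s per Planck time).
ticks_since_big_bang = 4.3e17 / 5.4e-44
print(f"Planck times so far: roughly 10^{math.log10(ticks_since_big_bang):.0f}")
```

A whole universe's age of Planck-time ticks is a number with about 61 digits, while the image count has roughly 23 million digits, so the heat-death point holds for this particular task.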
But then there are those who say that this machine already exists, that it is actually the universe itself. Believe what you will. Either way, transhumanists are full of sh!t.
Uploading a brain would be child's play compared to this... but this is where transhumanists descend into batsh!t fundie craziness.
Uploading a brain to a machine is not the same as extending your life; in fact it doesn't do sh!t for you. All you're doing is making a copy of the state of your mind, which is not in itself an actual instance of you as you are now. If someone blows your brains out after you upload your mind you will be very much dead, not frolicking in the wonderland of a cyber singularity heaven with your peers. Your copy may be doing that, but you won't. Waste of f*cking time.
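The copy-versus-instance distinction above maps neatly onto a programming idiom (a loose analogy only, which assumes nothing about how minds actually work):

```python
import copy

# An "upload" in this analogy copies state; it does not move identity.
original_mind = {"name": "you", "memories": ["first bike", "graduation"]}
uploaded_mind = copy.deepcopy(original_mind)

print(uploaded_mind == original_mind)   # True  -- the same state, bit for bit
print(uploaded_mind is original_mind)   # False -- a distinct instance
```

Deleting `original_mind` after the copy leaves `uploaded_mind` untouched, which is exactly the poster's point: the copy carries on, the original doesn't.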
The only plausible way to really transfer yourself to a cyber brain might be to do it organically piece by piece. Let some little machines live in your brain to act as neurons and coexist with your cells, and then consistently introduce more and more until the majority of your brain is stored in little machine neurons. Who the f*ck really knows though.
It seems to me, in my limited knowledge of the subject, that the "singularity" has a basis in fact (recursive computer-aided computer development) but extends into flat-out impossible pseudo-science rather rapidly.
I mean, yes, someday someone will design a computer whose sole task is to design another computer that operates more efficiently, and humanity will see a rapid increase in computer science, until we hit a ceiling called physics. Time is only so fast, electrons only transfer so rapidly, the metals available only conduct so well. Something will bottleneck things before we reach "omg infinity".
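That bottleneck argument can be sketched as a toy model (the ceiling and growth numbers here are arbitrary illustrations, not physical constants):

```python
# Toy model: each computer generation designs the next, but the gain
# shrinks in proportion to the headroom left under a hard physical ceiling.
CEILING = 1000.0  # stand-in for limits like signal speed and conductivity

def next_generation(capability: float) -> float:
    """Gain is 50% of current capability, scaled by remaining headroom."""
    headroom = (CEILING - capability) / CEILING
    return capability * (1 + 0.5 * headroom)

cap = 1.0
for _ in range(60):
    cap = next_generation(cap)

print(cap)  # plateaus just below CEILING rather than running off to infinity
```

However high you set the ceiling, the curve flattens against it instead of diverging.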
For instance, if a singularity were possible, and it were possible to create an infinitely all-knowing machine, then that machine should in theory be able to produce every possible 32-bit image at a resolution of 1900x1280. Thus every possible graphic, photograph and image that could ever exist would be contained within the set the machine could calculate. Imagine anything, and it's there. All the porn ever possible, with anyone and anything, all in there. The future, the present, the past, every single moment of your life, in every possible alternate universe, and every possible historical event that has occurred or ever will occur, all rendered in varying levels of accuracy and detail, several times. And this is only the beginning.
Except it's not possible. The Universe would die of heat death several times over before all those images could even be computed, even if you computed one image per Planck time, the smallest meaningful unit of time.
What the hell is this? Non sequitur much? I've never heard any transhumanist tenet that says WE MUST SEE EVERY POSSIBLE IMAGE AT THIS RESOLUTION AND BIT DEPTH. I don't think anything has to be infinitely all-knowing, either. Just far better than what we have. Shit, it doesn't even matter if the curve is Gaussian so long as the plateau is high enough.
Here's what I understand of transhumanism.
1) Through use of tools, we can make ourselves better. This isn't controversial.
2) Through use of tools, we can make better tools. This isn't controversial either.
3) At some point, we shall be able to make a particular tool called AI. An artificial intelligence. This is controversial.
4) At some point, we shall be able to make such a good AI that it can make better AIs on its own. This is also controversial, and is what leads to singularity--an explosion of technological advancement that drives itself.
5) The result of 4 and 1 can lead to something that is no longer necessarily human. Like a Transhuman.
I buy the premise so long as AI can actually be developed, and (not believing in souls) I think that's certainly possible, since regular intelligence apparently works. Right now, though, there's no clear path heading that way.
Going to recommend this to D&D so I can argue with people.
What's the difference between these and wearing a Rolex and calling yourself a transhumanist?
Nothing, but that's a small improvement: it's a man-made tool that betters the individual, but you'd be hard-pressed to say it improves the human condition or aids mankind as a whole. Of course, if you were to do this (slap on a watch, call yourself a transhumanist), you would also have to believe in the other possible improvements, like the ones I listed, among other things.
It's as ProPatriaMori stated:
1) Through use of tools, we can make ourselves better. This isn't controversial.
2) Through use of tools, we can make better tools. This isn't controversial either.
This is what I subscribe to anyway; I'm not big on the technological singularity.
Uploading a brain to a machine is not the same as extending your life; in fact it doesn't do sh!t for you. All you're doing is making a copy of the state of your mind, which is not in itself an actual instance of you as you are now. If someone blows your brains out after you upload your mind you will be very much dead, not frolicking in the wonderland of a cyber singularity heaven with your peers. Your copy may be doing that, but you won't. Waste of f*cking time.
That's debatable but is an entirely different topic altogether. Also, why do you censor yourself?
How do people actually go about believing in speculative fiction? Surely at best it's just something they aspire to?
The same way people believe in building skyscrapers, blowing up the moon, and other such grand projects: it's all just speculative fiction at some point. Building an AI or a cyborg human isn't that far-fetched really; we've already taken the first steps.
Do any transhumanists actually believe what is being said in this thread? All of the transhumanists I've met don't believe in it so much as say well that's what trends point to, but if it doesn't happen then it's just an incorrect prediction. It's not much different than what stock analysts do. If what they say doesn't happen then, oh well, it was just an incorrect prediction.
Believing in transhumanism seems scientifically untenable (because believing in anything is scientifically untenable), and the converse, believing that it is impossible for any kind of strong AI to be made, is also scientifically untenable, since there isn't yet proof either for or against it.
All the transhumanists I've met are at best agnostic but optimistic about transhumanist ideas. They're not outright "believing" in anything, since it's a movement of people into science, who would be aware that believing in anything is scientifically untenable.
You seem to be more referring to transhumanist belief in the singularity. What about just improving the human being through technological or other types of modification? And doesn't science make incorrect predictions all the time?
I made a general transhumanism thread here, as H/A isn't the best place for debate.
What's the difference between these and wearing a Rolex and calling yourself a transhumanist?
One is equipping yourself with a handy device for telling time and the other is a ridiculous quasi-religious nerd wank.
You tell them if they don't, Xenu will eat their souls?
Sorry, wrong fiction
"The Singularity will happen, just not in my lifetime"
Makes you look a little less self-serving