I welcome you all to come sit with me at the newly created Transhumanist Tavern. Emulate a seat, fabricate a beer, and open your eleven senses. Let us talk like the futurists of old: those crazed science blokes with their poor gender-communication and brilliant minds confined in their inferior, biologically based skulls.
I'm interested in hearing your thoughts about:
[1] Transhumanism - A sin or evolution? (Transhumanist Definition and Resources)
[2] Extropianism - Adequate preparation for the future or a load of bollocks? (Extropy Institute)
[3] The Singularity Event - A threat or a blessing? (Singularity Definition)
[4] Quantum Physics - A reality or an irrational fallacy?
[5] Futurist Speculation - Flying cars, yet? Please?
[6] Mind-blowing Technology - Nano-technology, Organ Fabrication, Self-cleaning Coffee Machines...
I'm hoping this thread will be very open-ended and allow for a lot of minds to express themselves. As always I'm hoping to learn and be enlightened. Enjoy your stay!
The only problem with a solution like this is that without a driving force that exists outside of the singularity, there will be nothing to spark innovation. Society will devolve into a stagnant utopia of sorts.
Let's play Mario Kart or something...
Oh, and let's bring on the Singularity!
Noted and fixed! I added definitions and resources for Transhumanism, Extropianism and the Singularity. Unfortunately I only have really speculative sites, such as Wikipedia, as resources for the definitions. But since everything here is relatively theoretical, they serve adequately enough.
I'm not sure if what creates my consciousness is a soul, or a spark, or whatever. Really though, I don't care. What I do know (or think, I guess) is that either blanking this body and 'uploading' something somewhere else, or making a copy, doesn't really do anything at all for me. I'm all for bionic hands, new skin, and maybe even putting super myelin all over my nerves. Maybe it's the last latent part of religiosity in my head. I guess that part could be deleted after the process.
Basically, if a computer could become me, I'd never need another friend. I'd play chess against myself for hours. I've always wanted to do that! And I know the robot will too, 'cause once he's me, he's always wanted to do that! It's just not the same when I play a move, then another, then another by myself.
I've more thoughts but I think I've done enough in one post...
I'd have to agree. I feel that if you emulate your consciousness with technology you aren't transferring your conscious mind over... you're making a copy of yourself. On that basis, wouldn't you be committing suicide to make a machine that acts like you in every way? I suppose the big qualm I have with it is the transference of a "soul". Which is probably a big impossibility, if souls exist... Anyhow, I'm completely down with body alteration: nanobots to cure me, slow my aging, boost muscle growth and all that stuff. But I'd be kind of skeptical about computer-brain integration. I don't want to be hooked to the internet at all times, though I would have more time for PA forums... hmmmm...
Your body is being constantly rebuilt anyway. There's no fundamental disconnect between your mind and an emulated version of your mind that isn't already present between you and an older version of you.
Not that I'd be willing to copy my own mind into a computer and kill myself, though. I prefer the philosophical solution of occasionally losing sleep over how meaningless everything is until I manage to block it out of my mind for a year or so.
Singularity Sky by Charles Stross by any chance?
As long as we are dependent on each other through society, we will have to account for incentives to support each other. If all our basic needs are perpetually cared for, I think society as we know it will no longer be needed. However, solving those basic needs won't happen all at once, and I really wonder how we'll deal with people who'll just say, 'OK, this is enough for me, now I'll wander the world/lock myself in a room/be generally not contributing to further advancement of humanity.' I can't fault them, but I wonder if the sentiment will become epidemic enough for a standstill (like, the last living human dying a happy fat (wo)man while sexing a robot).
It could be a very simple mistake. On the extreme side, someone could ask a hyper-intelligent computer to compute--go figure--a very advanced equation. And out of the interest of answering the question for the benefit of the asker, our super-intelligent computer would use all the available material in the solar system to build a computer capable of answering that question, destroying everyone in its path for the benefit of the asker. Granted, now there's no one to give the answer to. But that's a really, really extreme conjecture.
It could be true, but it doesn't seem likely to me.
You say that with certainty, whereas I know for certain that you couldn't possibly know that for certain. Certainly a super-intelligent computer would have the knowledge not to destroy its creator for the purpose of answering a question. But if the prime directive of this super-computer were to obey without question in hopes of appeasing its creator, it's plausible.
That's why anyone capable of designing a super-intelligent computer would have enough sense to not require it to obey its creator perfectly.
Take computing power. Computers have gotten exponentially more powerful over the past 30 or 40 years. This will continue for a while. However, each generation is more difficult and expensive to design in the first place. Back in the 70s you'd have people soldering transistors to a board and calling it a computer. Nowadays it takes the research labs of huge corporations like IBM millions of dollars and thousands of hours of research (human time and computer time) to pump out the next generation. As chips get smaller, fundamental physical limits will cause more and more problems. Just because transistor-based computers advanced so quickly doesn't mean that other types will take over when transistors reach their limit.
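Just to make "exponentially more powerful" concrete, here's a toy sketch of the usual rule of thumb: transistor counts doubling roughly every two years. The baseline year, baseline count, and doubling period are my own rough assumptions, not anyone's official figures.

```python
# Toy illustration: transistor counts doubling every two years
# (the classic rule of thumb). Baseline and doubling period are
# my own rough assumptions, not official industry figures.

BASELINE_YEAR = 1971          # roughly the era of the first microprocessors
BASELINE_TRANSISTORS = 2_300  # a few thousand transistors per chip back then

def transistors(year, doubling_period=2.0):
    """Projected transistor count assuming pure exponential doubling."""
    elapsed = year - BASELINE_YEAR
    return BASELINE_TRANSISTORS * 2 ** (elapsed / doubling_period)

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{transistors(year):,.0f}")
```

Of course, the curve says nothing about the point above: each doubling costs far more to design and fabricate than the last, and physical limits eventually bite.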
Nor do I think that the creation of Strong AI significantly smarter than a human is on the near horizon. I mean, it could be. But virtually nothing is understood about what actually makes humans intelligent. Problem solving, creativity, imagination, consciousness; these are things we can't even define, let alone get a computer to do. Once it's figured out (and I do believe it will be), there is nothing that says intelligence is scalable to a limitless degree. There could be any number of hidden restrictions; perhaps the complexity of a mind increases much more rapidly than its intelligence. This would place upper limits on the intelligence of an individual mind (which gets dicey with definitions, but the point remains the same).
Mostly I don't like it because it involves infinities (in the mathematical sense) coming up in human sociology and psychology. Do you know how many physics equations have an infinity in them? Not bloody many; an infinity is usually the result of putting something into an equation that doesn't make sense, and it's usually an indication of error if it comes up.
That, and concepts like thermodynamics make me instantly super-skeptical of something that gives accelerating returns up to some limit. Can anyone give me an example of anything else that does this, instead of decaying via entropy?
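For what it's worth, here's the kind of thing I mean, written out as a generic toy growth model (my own illustration, not a formula from any singularity text): if the growth rate of some capability x is proportional to a power of x greater than one, the solution blows up to infinity at a finite time.

```latex
% Toy hyperbolic-growth model (my own illustration): the growth rate
% of some capability x scales with x squared.
\[
  \frac{dx}{dt} = k\,x^{2}, \qquad x(0) = x_0 > 0
\]
% Separating variables and integrating gives
\[
  x(t) = \frac{x_0}{1 - k\,x_0\,t},
\]
% which diverges as t approaches the finite time t^{*} = 1/(k\,x_0).
```

That finite-time blow-up is exactly the kind of infinity that, in physics, usually means the model has left something out (resource limits, entropy, whatever), not that reality is actually about to diverge.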
If number 4 is an irrational fallacy, I'd love to know how you're reading this.
Also, the 13 lists of "paradigm shifts" that were used to compile the graph shown in your Singularity link don't contain anything that happened during the last hundred years, so he's using biological evolution and machines built out of wood and string to predict the technological evolution of computers and modern society.
The guy who compiled the data that it's based on claims that technological progress is actually slowing down. I'm not saying whether that's a reflection on Kurzweil's speculation or on Wikipedia's lack of consistency. Either way, I won't be losing any sleep over it.
Radical Evolution is a great book about post/transhumanism.
And Gorak, I believe what is occurring with the paradigm shift is that yes, evolution is slowing down because--as you know--we sustain and protect ourselves; essentially we protect the weak. But as the link describes it, every time a technology approaches a barrier, another technology has come along and broken the boundaries. We're slowing, but our progression occurs in unprecedented "leaps and bounds". I'm interested in your line of thought, though; do you have any links with better definitions, or perhaps more comprehensive context?
I've read this somewhere, like Scientific American, I promise.
We'll never reach the Singularity. But we might get close. Not for a long time, though. We've got quite a few slow, small revolutions ahead of us. Some of those revolutions won't even be noticed. They'll just happen under people's noses: subtle changes in society that affect our future so profoundly, but make so small a splash in the pond.
Sure. When it comes to the human body (and most other areas as well) I suspect that nano-technology and small-scale biological engineering will be indistinguishable. Hey guys, let's make tiny machines to transport medicines throughout our bodies! Well, look here, nature provides us a virtually unlimited supply of ready-made, self-replicating, self-adapting, fairly easy to modify machines in the form of viruses, bacteria and the like. It will just be a matter of adapting them to make use of technology (as an example, using a virus to build wire on a very tiny scale, so as to make electronics smaller; this has already been tested). So yeah, human medicine is going to go through the roof in the next 50 years; it's already started and is only going to go faster.
The computerized stuff will be far more interesting, I think, but also harder to predict. Without knowledge of how our brain really works, these sorts of things may come soon or may come later. I'd think the interface would be the biggest initial hurdle; say you have a computer attached to your head. This computer contains pictures of people (to help you remember them). When prompted, this computer can send an image to the part of the brain that processes visual information, tricking it into thinking that the person was actually seeing this image (artificial eyes already work in this way). The trick is, how does your brain prompt the computer for the image in the first place? You don't want that to just pop up at some random thought; it would be distracting. For the interface, more has to be learned about how the brain stores and accesses information normally, which unfortunately is something that very little is known about.
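To put the two directions of that interface side by side, here's an entirely hypothetical sketch; none of these names or functions correspond to any real device or API, they just mark which direction seems tractable and which is the open problem.

```python
# Entirely hypothetical sketch of the interface problem described above.
# Nothing here corresponds to real hardware or a real API.

class MemoryProsthesis:
    def __init__(self, photo_album):
        self.photo_album = photo_album  # e.g. {"Alice": image_data}

    def push_image_to_visual_cortex(self, image):
        # Output direction: artificial eyes already do something like this,
        # so assume stimulating visual processing is the (relatively) easy part.
        print(f"stimulating visual cortex with {image!r}")

    def await_prompt_from_brain(self):
        # Input direction: the hard part. How does a thought become a query,
        # without firing on every stray association?
        raise NotImplementedError(
            "requires knowing how the brain stores and accesses memories"
        )

    def recall(self):
        name = self.await_prompt_from_brain()  # <- the missing piece
        self.push_image_to_visual_cortex(self.photo_album[name])
```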
Well, that's not entirely what I was trying to convey. In a very real sense, evolution--for humanity--has ceased. If we were still in a societal/cultural structure worldwide that allowed the weak to die out, then I would say, yes, evolution is very much active. But we aren't; we harbor the weak or physically unfit. Hence: evolution has ceased.
I agree with the Singularity acceleration theory you're postulating. It's projected that the half-life for technology will be thirty-six hours within the next fifty years. That means that every thirty-six hours, computers and engineers will have developed technology that makes the previous generation obsolete. I don't know if we'll ever be able to survive in such a fast-paced world. Perhaps transhumanism will be the only way to compensate?
Codicil: I recall hearing something about human evolution progressing solely in the brain, with the most evident example being Attention Deficit Hyperactivity Disorder--allegedly, because our media projects things at such a fast rate, children develop ADHD as a compensation mechanism. Can't remember the source for this: anyone have any idea as to the validity of the statement?
For those who are skeptical about the non-supernatural nature of consciousness, there's a pretty good book by Antonio Damasio (though it's a little old) called The Feeling of What Happens that proposes what I think is a really good basis for a simple explanation of the sensation of consciousness. Basically he describes consciousness as a sort of feedback loop in which we experience something and then relate it to our mental object of ourselves. What he's saying is that consciousness is not a "thing" so much as a process. The way I view it, though this isn't necessarily an accurate analogy, is that our brain goes through processing cycles the way a CPU might, and at the rate that they occur they manifest as a seemingly continuous consciousness. In reality, though, we're going through a new cycle every moment, and the only thing that makes the consciousness belong to an individual is the fact that the cycle is drawing from that individual's memory.
This is a kind of weird explanation, but think of it as though you were just instantly born, but you share the same memory as the last guy who just died standing in the exact spot where you were born; you perceive your life as an unbroken chain up until this point. One second later you die, but the next person will remember everything from your life and the guy's life before you and the life before that. Together you'll all form a single, drawn out life because you're all drawing from and adding to the same memory.
What I'm trying to say, basically, is that when you refer to yourself, the consciousness is not the "you" part, your memory is the "you" part. So it shouldn't matter what is generating the consciousness, it's still "you" if it's drawing from the same memory.
So, by that theoretical definition, yes, "you" could be transferred into a machine or something of the like, as long as your memory remained intact.
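If it helps, here's the analogy as a rough code sketch (purely illustrative; the names and structure are mine, and obviously nothing here models an actual brain). Identity lives in the shared memory, not in any particular cycle or substrate.

```python
# Purely illustrative sketch of the "consciousness as cycles over a
# shared memory" analogy above; nothing here models a real brain.

class Person:
    def __init__(self, memory=None):
        # Identity lives in the memory, not in any particular cycle.
        self.memory = memory if memory is not None else []

    def conscious_cycle(self, experience):
        # Each "moment" is a fresh cycle: take in an experience, relate it
        # to the stored self, and write the result back to memory.
        reflection = f"I experienced {experience!r}, and it felt like mine"
        self.memory.append(reflection)
        return reflection

me = Person()
me.conscious_cycle("a beer at the Transhumanist Tavern")

# "Uploading": a new substrate drawing from the very same memory.
# On the view above, it's still "you", because the memory is the "you" part.
uploaded_me = Person(memory=me.memory)
uploaded_me.conscious_cycle("a game of chess against myself")

print(me.memory)  # both cycles ended up in the one shared memory
```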
Just looking quickly through that Wikipedia article, a lot of the arguments against transhumanism seem really weak, and based mainly on a religious agenda, or they boil down to "It's wrong because it hasn't been done before".
Word.
Yeah, but Rhan, what we're trying to say is that the consciousness doesn't remain the same, no matter what. I mean, this is all theoretical, but it appears as though consciousness is not a continuous thing, and a person doesn't just have a single consciousness; it's not a long, unbroken thing but a myriad of little starting and finishing parts. Your consciousness now is not the same consciousness you had during your last thought. So, whether you retain the same brain or not, consciousness is not preserved in any instance.
Though the situation you bring up does raise the questions that would lead to the conclusion I've come to. If consciousness is, in fact, some sort of metaphysical entity, at what point does it become detached from the body? Would we find a very specific threshold of things you can do to a person's physical brain before their consciousness is no longer the same? Would we expect to find a specific piece of flesh that ties our "souls" to us? I think the logical fallacies that answering these questions would bring up would lead to falsifying the "unbroken metaphysical consciousness" theory.
Ha, I like it.
AJAkaline40, that's something that I've thought about extensively. Yet I can't shake the feeling that there is something intrinsically binding between our biological frames and our "connection" to self-aware/conscious thought. I've always thought of the "soul" as the friction created between our "mind"--not brain--and our mortal coil. If we are to abandon our "mortal coil" and reframe our consciousness into a mechanical or digital reality, how would this affect our spirit? Personally, as I've mentioned before, I'm willing to do many phenomenal things to my body; but abandoning the flesh completely--especially now, when we are on the cusp of a very literal form of immortality--seems foolish.
Besides, there are "some things" that I think wouldn't quite be as fun if I were in a robot body.
To be honest, I can't really comment on the abstract concepts of "soul" and "mind" and "spirit" and "mortal coil". I sort of see where you're coming from, but those things are just so poorly defined I've never been able to argue them outright. It inevitably comes down to personal preference: I like to think that the most complex functions of the human mind can be boiled down to emergent properties developed from the relatively simple base structures that make up our brain, and I'll pursue evidence to that end. In a sense that's my own "feeling I can't shake", though I am willing to accept that it might not be true. I just have a much higher faith in the power of incredibly complex systems being born of simple parts than I do in the existence of a metaphysical realm.
Also, in reference to "some things", electrical stimulation of the brain will most definitely allow reliable reproduction of any potential stimulation a biological body would receive, and even in the exact same manner that your body would do it. Depending on how much weight you put on "authenticity", you could also really open yourself up to a realm of nearly infinite enjoyable possibilities, concerning "some things". What I mean, basically, is you could give all new meaning to "all my dicks".
Also, something I was wondering about earlier: do you think that what would essentially amount to a voluntary Matrix would create Utopia? Essentially a world in which there is no concept of materialism, because everyone can have everything, as any object would just be a program in a computer world? People who are crippled could walk, people who are ugly could look as they pleased, and the only sense of accomplishment would come from imagining some grand new object and creating it in the virtual world. I mean, there are a lot of underlying things to consider, such as authenticity; would someone perceive possessing something in the virtual world as tantamount to possessing something in the actual world? etc.
Those who spend their entire lives in low-gravity environments may opt for a second pair of arms rather than legs, as people become distinct subspecies of Homo sapiens. Nikolai Fyodorov's ideas regarding transhuman existence and becoming gods among gods may seem like the stuff of madness in the world as we currently understand it, but if we do not go extinct within the coming eons, there is no telling where our species will go and what it will be capable of. All it takes is time. Someday we'll be able to manipulate the stars themselves at the speed of thought. The universe is vast and the possibilities are just waiting to be explored.
We define ourselves through the cycles we break. Mortality may just be another cycle we will find ourselves overcoming. And while I doubt we would be able to resurrect the dead in the way Fyodorov imagined (through genetics and residual soul-like images), longevity and cellular renewal do seem like possible milestones, and there may yet exist methods of harnessing powers over time and space that would allow us to manipulate reality and extract or replicate entire entities, to be preserved and elevated to the collective plane of metaphysical consciousness that is our own created heaven.
Well, that's something I've thought about quite a bit, actually. The voluntary scripting of your mind into a "Utopian" digital world could be possible, and quite successful, if human beings were uniform. What I mean by this is that there are certain human characteristics, produced by evolution, that will be extremely difficult to "weed out". By process of evolution we have gender differences. By process of culture we have the need for an economy to not only define our reality but thereby define ourselves. If we were to "digitalize" our reality we would inherit all the traits of our current reality; the roots of all our sins--I believe--would follow suit. I think it's very possible that in the digital world any possibility is absolutely feasible. As you said before, the ugly could be beautiful, the crippled could fly. Truly the digital world would offer some wonderful things. But based on the precedent of our current culture and society as a race, access to that world would be nearly impossible: the standards for acceptance would be phenomenally high, and the cost itself--if at this point we are still assigning values to things--would be astronomically high. Additionally, any Utopian society would soon become a dystopia through the actions of free-thinking, endlessly changing, and "powerful" entities living within a technically "boundless" reality.
I recently finished a thesis paper on the gender war and its properties, from its initial concept to its presence in the modern day. The paper focused, however, on postulating a gender war's existence in a reality like the one you have described here: completely digital. I eventually concluded that gender--not sex--is a mental prescription devised by the mind to define oneself: it is, in short, a tag that says to the world, judge me thus. And considering the fact that transgender individuals do not have the same sex organs as the sex they subscribe to, I inferred that the gender war would indeed transcend the physical plane to the digital plane. In short: a digital penis is as good as a physical penis. For this reason--and a multitude of other reasons (just consider this to be analogous with violence, hatred, etc.)--I believe that the digital realm, when possible, could easily become as much of a dystopian existence as the world we currently live in.