does the "cut" not kill the person? Unless you believe in a soul or metaphysical consciousness, then the "cut" would end the life of the person being downloaded or whatever.
it wouldn't feel like just going to sleep and waking up in a new robot body. not for you. instead, you'd just die.
that's what i am hypothesizing. oh, sure, the "pasted" intelligence would consider itself you and would tell you it felt like just going to sleep, but that's just a duplicate emulating the person it was based on
I don't see the problem
your objection seems to be strictly philosophical over whether you're technically human or not
who gives a fuck?
I'm not saying there aren't other problems, but I don't see that in particular as an issue at all
okay, say we have the technology to cybernetically replace or even augment a man's arm
no big deal, dude's still the same dude
then we replace his other arm, his legs, his heart, his entire digestive system, etc.
until eventually he's just a brain in a robotic body
is he still the same person?
cognitively, most likely, as the brain is the center of what is "you"
but then we start, bit by bit, replacing the brain with cybernetic implants
sure, he's smarter, has faster response time, can wirelessly access computer systems almost like a telepath
but at what point is he no longer the same guy at all?
where does that line get crossed
and it calls into question fundamental aspects of what we consider to be self-awareness
would your self-awareness remain the same, would you remain the same consciousness, even if your consciousness was "downloaded" into a robot brain?
or would you just die, and a perfect facsimile of you come into existence
Throughout your life all of the cells in your body are replaced naturally.
In fact, I've read somewhere that this takes only about seven years.
The person you were when you were born is, by your definitions, dead. You are a new person down to the cellular level already, without the need for cybernetics.
And now that you've told me my future digital consciousness is going to have a hell of a guilt trip. Gee, thanks a lot, Pony.
How can it be slavery to you? Aren't you religious? (Jewish / Buddhist, right?)
Without an actual soul it would still be a thing, and any sentience it possesses would be a simulation in the end.
i am of the personal belief that what defines us as self-aware persons isn't a magic soul exclusive to mankind.
if you make an artificial person that has all the sapience and intelligence of a human being, you have created a person and as a person they have rights
The question is whether true consciousness can be created artificially.
Which then raises the question of what "true" consciousness is.
Personally, I don't think humanity will ever be able to create an artificial consciousness, or that we will be able to somehow "transfer" consciousness from a natural construct to a synthetic one.
i believe it can happen and, in the case of creating full artificial consciousness, will eventually happen
so it creates ethical issues and philosophical conversations that should be had
Let us say we potentially can create a wonderful intelligence through some other means than fucking. If we then make a lower form of intelligence, are we then denying that creation its rights?
I enjoy the philosophical conversations surrounding the issue even though I don't think it will ever be a real-world problem.
Because honestly, I don't have a real scientific reason why I think consciousness will never be artificially created. It's more of a philosophical or religious belief.
That was a pretty crazy movie. I'm still not sure if there was anything that made sense beyond the pretty animation, but I probably just need to watch it again.
like the right to not be a sex slave
What if it wants to be a sex slave? Is it still truly a slave then?
guys what if you were just crazy and carved a lady out of cheese, thought it was intelligent and made that your sex slave
are you morally wrong there? it has no real intelligence, just what you gave it in your mind. but in your mind it is as human as anyone else.
i don't believe morality is an objective force in the universe
as a result
if the person doesn't feel bad about fucking their strange cheese person despite believing they are intelligent
it doesn't mean they're doing something immoral
but they're the kind of person who should be watched because not only do they have some kind of weird schizophrenic break with reality, they don't feel particularly bad about it
What? Why? If my child is born with Down's Syndrome are his rights violated?
and that raises a further question about AI programming
in creating an AI, are you going to allow it all the randomness the human brain allows for, or are you going to "lean" the AI in a certain way so it wants to do certain things?
are you going to make soldier robots who enjoy war, who feel it is their purpose to defend their country and fight the enemy?
are you going to make prostibots who crave sex and feel it is something they need?
if you're making a robot, it only has the "needs" that you create for it
a bunch of internet nerds get all woody about "the singularity" and brain implants and shit like that
all i can think is
wouldn't that just be like slowly killing yourself
From an atheist's standpoint, no. Because 'you' don't even exist in the first place.
It doesn't really matter if your body is completely made up of flesh or metal as long as your cyber brain is capable of the same functions as the original.
let's say for example you don't believe in a soul or spirit or some other kind of remote quantity that makes you the person you are
let's look at it from the cold standpoint of pure science: you are the person you think you are because your cognitive processes allow you to. you think therefore you are, simply because your brain is advanced enough to think such things.
if that's the case, all that is you, your self-awareness, your sapience, is tied directly to the meat inside your skull
if that's the case, and you replace all that meat with machine parts, who you are, your consciousness, has died and isn't carried on to the new machine. the machine simply emulates who you were, but is a completely different being.
Well, there have been people who have had to have large amounts of their brain removed for various reasons and they remain the same person. On the other hand, some people with comparatively less brain damage lose all sense of who they are or lose their ability to think entirely. Generally speaking, it's a real mystery where the "self" resides in the mind, or whether the "self" exists in some way independent of the mind.
It's scary as hell to think about, personally. I believe that the soul exists metaphysically rather than physically, but it still is scary to read stories about people who are in a car crash and become entirely different people.
this is what i am saying
these are pretty important questions when we live in an age where things like cybernetic brain augmentations are becoming possible
old man otter might see it as pointless philosophical hand-wringing but it's kind of important to question how we actually cognitively perceive the self, especially if we're going to start dicking with the bits that most likely define those things
except that what you're failing to consider is that the person that's appeared to change to everyone else still sees him as himself
I've done considerable reading on the subject by doctors like Oliver Sacks, and one of the overarching themes is that even when someone suffers major brain trauma that seems to completely change the person from everyone else's perspective, or develops Alzheimer's, Sacks has found that while the person is aware that others see them as different, they still have their self-awareness and identify as themselves.
again, you aren't getting what i am saying
i'm not talking about a guy with a single brain implant
i'm talking about synthetic brains, or people trying to "upload" their consciousness to a robotic body
if you are john smith, and you upload your brain patterns to a synthetic brain, the synthetic self is going to call itself john smith and believe it is john smith, because it has all the cerebral make-up that defines john smith as who he is.
but he is not john smith. he's john's clone, at best.
if john smith dies, he is dead. his robotic clone might live on, but the guy who was there originally is now dead.
i think any rational person can see it that way
now compare that "digital transfer" to something like slowly over time replacing your brain with synthetic parts
would there be a true transfer of the self, or are you just slowly killing john smith and creating a duplicate?
that's the whole ship of theseus thing
no, I get exactly what you're saying
it's pretty fucking clear that you're the one that doesn't understand
my point is quite fucking simple
if it thinks it is John Smith, then where's the fucking problem?
sense of self is intact, crisis averted
the only problem is external, how you define John Smith as an observer and your insistence that he's not just because he has a different body
whereas I don't really care if he is or isn't, as long as he has a self identity and is able to cope
If I'm all like hey Destructo 9000 make me a pastrami on rye and then blow up that car over there with your death laser Destructo 9000 better fucking hop to it
either way, the robots are getting some fun too.
but he is john smith
like you say
he's got the same everything that makes up john smith
therefore he is john smith
as far as i am concerned i'd put that in the same category as dudes who jerk it to cartoon child porn
like, i don't think a crime is being committed because no child is actually being harmed there
but it's still gross and makes them a sick fuck
A sex golem
What if they transferred your consciousness into two different robots?
they've created two duplicates of you
they aren't you
you aren't them
Is that Freakazoid?
how are they not you?
you've gone mad with power
you've made inserts for the vagina out of various other cheeses.
the gouda was your favorite
If they possess all my memories I'd argue they are me. Basically as I said, I do not believe us to be more than walking sacks of flesh.
So the closest thing to a metaphysical 'me' would be the sum of my memories.
And if you copy them, they are basically me, until a second later, when we've had different experiences and become different beings.
what differentiates that from you?
what else is quantifiably you besides those memories and mannerisms?