
Uploading consciousness into a computer


Guest Rambo

READ THE FIRST POST BEFORE YOU VOTE  

67 members have voted

  1. Would you upload your consciousness to a computer?

    • As soon as it became available
    • Only if I was about to die
    • I don't know
    • Never


Recommended Posts

Well, yeah, of course they'll both "be" you from an outsider's perspective. I'm saying: which one will you experience being? Can that be predicted?

 

This is a nonsensical question. You'll be forked, and both entities will subjectively experience being "you" and would have equal validity of the claim of being the "real" you.

 

Stop trying to imagine that "you" exist outside of the natural world. You don't. The concept of experiencing being "you" is just one lifeform's way of experiencing being itself, and copying it would merely make there be two such lifeforms having the exact same experience. They're equally you. Neither has more claim to that than the other.

 

Isn't the sentence "They are you" nonsensical?


  • Replies 144

Of course. Both will be subjectively and objectively you. But if you keep a journal starting before the copy is made, and continuing through the duplication and after it, there will now be two journals, one corresponding to each copy. Eventually their contents will change as the copies have different experiences.

 

So to rephrase my question: Can you predict the eventual contents of your journal?

 

Both of them? If you knew all the inputs the brains would receive (now different for both of them), and you have a snapshot of their state before their divergence (woo, backups!) then you could predict both their outputs, sure. But then the distinction between a simulation of a lifeform and an actual lifeform seems artificial. You could equally say that my brain's merely predicting what someone would do in the situations it gets in.
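The determinism being described — same snapshot plus same inputs gives the same journal — can be sketched in a few lines of Python. Everything here (the `step` function, the journal entries) is made up purely for illustration:

```python
import copy

def step(state, stimulus):
    """Toy 'brain update': deterministically fold one stimulus into the state."""
    return {"journal": state["journal"] + [stimulus]}

# Snapshot of the shared state just before the fork (woo, backups!).
snapshot = {"journal": ["went to the archivist"]}

# Fork: two identical copies, which then receive different inputs.
original = copy.deepcopy(snapshot)
upload = copy.deepcopy(snapshot)

for s in ["walked home", "made dinner"]:
    original = step(original, s)
for s in ["woke in a simulated world", "explored it"]:
    upload = step(upload, s)

# Given the snapshot and each input stream, both journals are fully
# predictable -- and they diverge only because the inputs did.
print(original["journal"])
print(upload["journal"])
```

The point of the toy model is that nothing besides the inputs distinguishes the two copies: rewind either one to the snapshot, replay the other's inputs, and you get the other's journal.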


Well, yeah, of course they'll both "be" you from an outsider's perspective. I'm saying: which one will you experience being? Can that be predicted?

 

This is a nonsensical question. You'll be forked, and both entities will subjectively experience being "you" and would have equal validity of the claim of being the "real" you.

 

Of course. Both will be subjectively and objectively you. But if you keep a journal starting before the copy is made, and continuing through the duplication and after it, there will now be two journals, one corresponding to each copy. Eventually their contents will change as the copies have different experiences.

 

So to rephrase my question: Can you predict the eventual contents of your journal?

 

I thought you were asking which copy the current you will be (meaning that the current you will have the first-person perspective of that copy). This is what concerns me the most.


Isn't the sentence "They are you" nonsensical?

 

Say you're a bacterium, and you reproduce asexually by eating and then splitting in two. You wouldn't have an existential crisis over which one is the original and which one is the copy. (Or which is the parent and which is the child, if you like.) Such a distinction would be meaningless. Sure, you've become two different lifeforms, just like two human twins are, but asking which one is "you" is like asking a pair of twins which one is the "real" one.


who said there would be no physical aspect?

 

So is having your consciousness on some computer or in the cloud in any way equivalent to having a body? I was assuming it isn't, tbh. Feel free to share your vision of uploading your consciousness into a computer and what sex would be like.


Of course. Both will be subjectively and objectively you. But if you keep a journal starting before the copy is made, and continuing through the duplication and after it, there will now be two journals, one corresponding to each copy. Eventually their contents will change as the copies have different experiences.

 

So to rephrase my question: Can you predict the eventual contents of your journal?

 

Both of them? If you knew all the inputs the brains would receive (now different for both of them), and you have a snapshot of their state before their divergence (woo, backups!) then you could predict both their outputs, sure.

 

From a third person perspective. My question was rather: Can you predict the eventual contents of your journal?


I thought you were asking which copy the current you will be (meaning that the current you will have the first-person perspective of that copy). This is what concerns me the most.

Yes, that is what I meant; sorry if I haven't explained myself well. It is indeed a hugely important issue.

 

Isn't the sentence "They are you" nonsensical?

 

Say you're a bacterium, and you reproduce asexually by eating and then splitting in two. You wouldn't have an existential crisis over which one is the original and which one is the copy. (Or which is the parent and which is the child, if you like.) Such a distinction would be meaningless. Sure, you've become two different lifeforms, just like two human twins are, but asking which one is "you" is like asking a pair of twins which one is the "real" one.

Adam is right, you have to distinguish between a third-person perspective (which Zoe is talking about) and a first-person perspective.


I thought you were asking which copy the current you will be (meaning that the current you will have the first-person perspective of that copy). This is what concerns me the most.

 

Let me put it this way: imagine going to a mind archivist, getting your brain scanned, and walking out of there. Now imagine going to a mind archivist, getting your brain scanned, and waking up in a simulated world. Both of these lifeforms have a continuous, uninterrupted consciousness with a linear narrative, and think of themselves as the "real" original person. Both are correct. They both have a subjective first-person experience, and both of them remember being the same person as each other right up until the moment of the scan, at which point their memories and actions diverged.


A simpler, more practical example might be identical twins, which begin as a single cell. Although you could argue the starting cell has no consciousness (some people might argue otherwise), it's basically a real-world example of cloning.


My question was rather: Can you predict the eventual contents of your journal?

 

It's nonsensical to ask "when you split up into two different people, which one will you be?" Both were you. You'll just have to get used to non-linear identities with branching.

 

And we haven't even gotten into issues such as: if you experience something traumatic, or die, and then your latest backup is revived, she is you — only she's not the last few days of you; she's resumed from an earlier version.


I thought you were asking which copy the current you will be (meaning that the current you will have the first-person perspective of that copy). This is what concerns me the most.

 

Let me put it this way: imagine going to a mind archivist, getting your brain scanned, and walking out of there. Now imagine going to a mind archivist, getting your brain scanned, and waking up in a simulated world. Both of these lifeforms have a continuous, uninterrupted consciousness with a linear narrative, and think of themselves as the "real" original person. Both are correct. They both have a subjective first-person experience, and both of them remember being the same person as each other right up until the moment of the scan, at which point their memories and actions diverged.

 

All you've written is correct. But you can't have two first-person perspectives at the same time. Which one will you maintain is the question: the "current you". Understand?

 

who said there would be no physical aspect?

 

So is having your consciousness on some computer or in the cloud in any way equivalent to having a body? I was assuming it isn't, tbh. Feel free to share your vision of uploading your consciousness into a computer and what sex would be like.

 

It would depend on the simulation of the environment; you assumed there wasn't any, or that it was very poorly done.


You can't have two first-person perspectives at the same time. Which one will you maintain is the question: the "current you". Understand?

You're just being split up. If you remove one of them just after copying you'll just live on as the other one since there's still one of you. Nothing is lost. Why would you be worried about being the one that gets removed if the removed one wasn't conscious during the operation?

 

Edit: Or here's a thought. What if the device that extracted our brain destroyed every cell right after extracting it? Then it would be more like a move operation than a copy-and-delete. Would that be different? I don't think it would.
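The claim that copy-then-delete and a piece-by-piece move end in the same state can be sketched like this — all of the data and names are hypothetical stand-ins for a brain:

```python
import copy

# Two identical "brains" as plain data (purely illustrative).
brain_a = {"memories": ["childhood", "yesterday"]}
brain_b = {"memories": ["childhood", "yesterday"]}

# Procedure 1: copy everything at once, then delete the original.
upload_a = copy.deepcopy(brain_a)
del brain_a

# Procedure 2: "move" cell by cell, destroying each piece as it is read.
upload_b = {"memories": []}
while brain_b["memories"]:
    upload_b["memories"].append(brain_b["memories"].pop(0))

# Either way, exactly one intact copy remains and the original is gone.
assert upload_a == upload_b
```

The two procedures differ only in the ordering of the copy and destroy steps; the final state is identical, which is the intuition behind calling it a "move".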


What do you mean split up? You can't be split up.

 

Yeah, I don't get what you guys are talking about. Your subjective conscious experience is linked to your brain. Any "copies" made via computer would have their own, new consciousness that is unrelated to your own.

 

The only way to transfer your consciousness to some other medium would be to methodically deconstruct and reconstruct your brain using another material (like nanomachines or some shit), similar to the Ship of Theseus.
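The gradual, Ship-of-Theseus-style replacement described here can be sketched as a loop. This is a toy model: the list of "neurons" and the `network_works` check are stand-ins, not anything neurologically real:

```python
def network_works(neurons):
    """Toy stand-in for 'the brain still functions': every slot is occupied."""
    return all(n is not None for n in neurons)

neurons = ["bio"] * 10

# Replace one neuron at a time with a functional equivalent; the network
# keeps working at every step, so continuity is never interrupted.
for i in range(len(neurons)):
    neurons[i] = "machine"
    assert network_works(neurons)

# At the end, the same still-running network is entirely synthetic.
assert neurons == ["machine"] * 10
```

The crucial property is that there is no step at which the network stops functioning, which is what distinguishes this from a scan-and-copy.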


This is still not clear to me, and neither is the Ship of Theseus problem. Well, the exact Ship of Theseus paradox is clear, but there are many variations that make it complex. What if you deconstruct your brain and reconstruct two copies of it?


Unrelated? If it's a perfect copy of your brain, then at the time of copying the copy is in the exact same state as the original. After that, these two instances of your brain and state diverge, and of course each will be a separate consciousness. You'll be one of them after the copy, or both of them looking at it beforehand. But I don't see how you could call the copy "unrelated". Why would deconstruction be necessary? Why would it need to be physically reconstructed? What about a non-destructive brain scan that loads it into a huge digital data structure which can be used in an unbiased simulation?


You couldn't be both of them according to our laws of physics. How can they be related? You'll be one of them or none of them. The problem is which one, and why.


Unrelated? If it's a perfect copy of your brain, then at the time of copying the copy is in the exact same state as the original. After that, these two instances of your brain and state diverge, and of course each will be a separate consciousness. You'll be one of them after the copy, or both of them looking at it beforehand. But I don't see how you could call the copy "unrelated". Why would deconstruction be necessary? Why would it need to be physically reconstructed? What about a non-destructive brain scan that loads it into a huge digital data structure which can be used in an unbiased simulation?

 

I'm talking about preserving your own consciousness while modifying your brain so that you can "connect" to a computer. You would have to methodically "convert" the brain to some other form to do that (taking each neuron, destroying it, and replacing it with some type of machine). Otherwise, the consciousness that exists in the copy or new form or whatever would not be "you." There is no physical connection between the two, so they wouldn't have the same consciousness. Preserving that physical connection is the key.


I disagree. Maybe I can demonstrate with code.

 

Brain is created (birth):

realbrain = MyBrain()

Sick of the real brain; let's copy it to another (using Python's copy module for a perfect snapshot):

import copy

realbrain.pause()
copyofbrain = copy.deepcopy(realbrain)

These two are now exactly the same; if I resume them both at this point, they each become a separate instance and consciousness. But what if I delete my real brain at this point and resume the copy?

del realbrain
copyofbrain.resume()

 

The copy doesn't know it's a copy. The deleted brain doesn't know it's been deleted. It's a seamless transition, provided that the simulation and the copy are perfect.


What if you don't delete the copy?

 

What would you feel from the first-person perspective? You're the owner of the real brain. If someone copied your brain without you knowing it, you probably wouldn't know it. And if someone turned your brain off, you wouldn't just go over to the copied brain. That's exactly what you're describing there. I wouldn't be so sure about Hoodie's solution, but it certainly is more logical.

 

1. Two copies of a consciousness cannot be related; from the first-person perspective you can't be both.

2. Your consciousness can't go from one copy to another when one is destroyed.

3. That's it: it's not a seamless transition, it's creating a copy of your consciousness and deleting you.


Your consciousness doesn't go from one copy to the other when one of them is destroyed; your consciousness is in both after copying, and when one of them gets discarded, that's the only one that goes on. The one that's frozen is just data. So you've got a 100% chance to go on. If you resumed both, you'd have a 50% chance to end up in either one of them.


Your consciousness doesn't go from one copy to the other when one of them is destroyed; your consciousness is in both after copying, and when one of them gets discarded, that's the only one that goes on. The one that's frozen is just data. So you've got a 100% chance to go on. If you resumed both, you'd have a 50% chance to end up in either one of them.

 

I don't see how your consciousness would be in both. I think it would definitely stay in the original.

 

The problem with your viewpoint, as I understand it, is that you're viewing the brain (and consciousness) as a static thing that can be copied. Consciousness, in my opinion (because science has yet to definitively answer this question), is the sum of all the electrical activity occurring in your brain at any given point. This activity is constantly fluctuating and changing, but it's always there. It started as a chain reaction when your nervous system was developing in the womb, and it's stayed there your entire life. Copying it? How do you copy something that's constantly in flux? Short of stopping time and meticulously replicating one's brain atom by atom, it's impossible. Even detailed brain scans take time and are not an instantaneous snapshot of brain activity. Not to mention the process of copying the brain...

 

And even if you somehow copied it perfectly and instantaneously, the two copies would diverge immediately, thereby creating two entirely separate conscious beings.


If, say, 5% of brain activity is conscious and the rest is unconscious, then in this uploaded version you could have 100% access to your brain: perceive everything your brain actually processes, and understand yourself a bit better (by "a bit better" I mean completely :p).

 

innit


Archived

This topic is now archived and is closed to further replies.
