
Uploading consciousness into a computer


Guest Rambo

READ THE FIRST POST BEFORE YOU VOTE  

67 members have voted

  1. Would you upload your consciousness to a computer?

    • As soon as it became available
    • Only if I was about to die
    • I don't know
    • Never


Recommended Posts

  • Replies 144
Guest sirch

maybe you are

 

if you use Facebook or Twitter, or frequent forums regularly, then you are kind of uploading yourself... to the net!


In 2005 de Garis published a book describing his views on this topic entitled The Artilect War.

Cosmism is a moral philosophy that favours building or growing strong artificial intelligence and ultimately leaving Earth to the Terrans, who oppose this path for humanity. The first half of the book describes technologies which he believes will make it possible for computers to be billions or trillions of times more intelligent than humans. He predicts that as artificial intelligence improves and becomes progressively more human-like, differing views will begin to emerge regarding how far such research should be allowed to proceed. Cosmists will foresee the massive, truly astronomical potential of substrate-independent cognition, and will therefore advocate unlimited growth in the designated fields, in the hopes that "super intelligent" machines might one day colonise the universe. It is this "cosmic" view of history, in which the fate of one single species, on one single planet, is seen as insignificant next to the fate of the known universe, that gives the Cosmists their name.

 

Terrans, on the other hand, will have a more "terrestrial", Earth-centred view, in which the fate of the Earth and its species (like humanity) are seen as all-important. To Terrans, a future without humans is to be avoided at all costs, as it would represent the worst-case scenario. As such, Terrans will find themselves unable to ignore the possibility that super intelligent machines might one day cause the destruction of the human race: being immensely intelligent and cosmically inclined, these artilect machines may have no more moral or ethical difficulty in exterminating humanity than humans do in using medicines to cure diseases. So Terrans will see themselves as living during the closing of a window of opportunity to disable future artilects before they are built, after which humans will no longer have a say in the affairs of intelligent machines.

 

It is these two extreme ideologies which de Garis believes may herald a new world war, wherein one group with a 'grand plan' (the Cosmists) will be rabidly opposed by another which feels itself to be under deadly threat from that plan (the Terrans). The factions, he predicts, may eventually war to the death because of this, as the Terrans will come to view the Cosmists as "arch-monsters" when they begin seriously discussing acceptable risks, and the probabilities of large percentages of Earth-based life going extinct. In response to this, the Cosmists will come to view the Terrans as being reactionary extremists, and will stop treating them and their ideas seriously, further aggravating the situation, possibly beyond reconciliation.


I think there'd be something very satisfying about humans creating super-powered artificial life that goes out to explore the cosmos. It'd be like we as a species watching the kid we raised grow up to cure cancer, & being able to go live out our days in relaxation on a tropical island somewhere, knowing we did our part well. It would take a lot of the subconscious pressure off of us to become God, because we would have made God.


I think there'd be something very satisfying about humans creating super-powered artificial life that goes out to explore the cosmos. It'd be like we as a species watching the kid we raised grow up to cure cancer, & being able to go live out our days in relaxation on a tropical island somewhere, knowing we did our part well. It would take a lot of the subconscious pressure off of us to become God, because we would have made God.

Yeah, right. Humans creating super-powered artificial life would just mean we'd have newer, more ridiculous ways to destroy each other.

 

Then again, the possibility of deep space travel as some kind of conscious robot sounds pretty amazing. Dunno what we'd do for the thousands/millions of years it'll take to reach the nearest star though.


Reminds me of this segment from Mike Leigh's Naked:

Do you think that the amoeba ever dreamed that it would evolve into the frog? Of course it didn't.

 

And when that first frog shimmied out of the water and employed its vocal cords in order to attract a mate or to retard a predator do you think that that frog ever imagined that that incipient croak would evolve into all the languages of the world, into all the literature of the world? Of course it fucking didn't.

 

And just as that froggy could never possibly have conceived of Shakespeare so we can never possibly imagine our destiny.

 

I think, just like today, people will have wildly varying ideas of how they want to live their lives.

 

I might "go digital" if I was old, afraid of death, and the tech was perfect. I've been thinking along similar lines because I've been watching Kaiba and have been taking too many psychedelics, which made me feel connected to natural evolution. Would I still feel as connected when evolution happens by virtual or artificial means? I'm not sure. Is a world that can be shaped to become anything, that can be shaped to offer you maximum value or happiness, a good idea? I'm not sure about that either, and I don't think I ever will be.


Would I still feel as connected when evolution happens by virtual or artificial means? I'm not sure. Is a world that can be shaped to become anything, that can be shaped to offer you maximum value or happiness a good idea? I'm not sure about that either and I don't think I ever will.

I don't think a world of pure happiness and no conflict would be satisfying.


I don't think so either. But your life could be shaped to be balanced yet mostly awesome. I don't think I'd want a fabricated life. Yet how do you stop yourself from fabricating one when there are no rules? Maybe it ought to be a world that only operates using our laws of physics. But where's the fun in that?


I think the catch here is that you'd have no way of knowing beforehand if it's actually you who ends up in the computer world, or just a replica indistinguishable to everyone else on the planet.

 

What's the difference? :D


well, yeah, of course they'll both "be" you from an outsider's perspective. I'm saying which one will you experience being? Can that be predicted?

 

This is a nonsensical question. You'll be forked, and both entities will subjectively experience being "you" and would have equal validity of the claim of being the "real" you.

 

Stop trying to imagine that "you" exist outside of the natural world. You don't. The concept of experiencing being "you" is just one lifeform's way of experiencing being itself, and copying it would merely make there be two such lifeforms having the exact same experience. They're equally you. Neither has more claim to that than the other.


Wasn't this the plot of a really bad episode of The X-Files?

 

Edit: Yes Pete, yes it was - http://en.wikipedia....h_(The_X-Files)

 

Yeah, I like Gibson's first novel, but after that, Stephenson really took over writing decent cyberpunk, and that episode of The X-Files really didn't do Gibson any favours...


I don't think a world of pure happiness and no conflict would be satisfying.

 

Smith said you'd say that. Really, though, I'd be happy with discovering more and more about how the universe and everything in it works, and creating things. I needn't define a life's importance in terms of its suffering or overcoming oppressors or obstacles. Overcoming such things is important, but a world without tyranny would be a fine thing indeed.


Would I still feel as connected when evolution happens by virtual or artificial means? I'm not sure. Is a world that can be shaped to become anything, that can be shaped to offer you maximum value or happiness a good idea? I'm not sure about that either and I don't think I ever will.

I don't think a world of pure happiness and no conflict would be satisfying.

 

I'm pretty sure it would. You're just thinking like this because right now, if we don't get challenges, we feel unaccomplished and like we have no place in this world or something. This feeling could be removed with the technology. You could experience the ultimate happiness non-stop and you'd be ultimately satisfied.


there is no such thing as analog in simulated worlds, what would be the point?

 

but srsly, I would only do it if I was on the verge of death, and only if the technology was flawless. It would need to feel exactly like real life (important features: breathing, eating/drinking, natural/wild nature (not pre-programmed/unrealistic), animals, unpredictable weather, real stars/planets in the sky, real social interaction, drugs, sex, videogames separate from the virtual reality you'd be living in/non-immersive games, music, etc.) and I would need to be able to communicate/interact with the outside world somehow. Otherwise I would probably go crazy and be incredibly depressed, and it would be far worse than being dead after a few hundred years of that, so I would probably opt out even if I was dying.


well, yeah, of course they'll both "be" you from an outsider's perspective. I'm saying which one will you experience being? Can that be predicted?

 

This is a nonsensical question. You'll be forked, and both entities will subjectively experience being "you" and would have equal validity of the claim of being the "real" you.

 

Stop trying to imagine that "you" exist outside of the natural world. You don't. The concept of experiencing being "you" is just one lifeform's way of experiencing being itself, and copying it would merely make there be two such lifeforms having the exact same experience. They're equally you. Neither has more claim to that than the other.

I've seriously offended people when I've said that, to the point that they thought I was evil (and these were atheists, it wasn't about the soul etc.)... I still don't get why it's such an upsetting idea for some people.


well, yeah, of course they'll both "be" you from an outsider's perspective. I'm saying which one will you experience being? Can that be predicted?

 

This is a nonsensical question. You'll be forked, and both entities will subjectively experience being "you" and would have equal validity of the claim of being the "real" you.

 

Of course. Both will be subjectively and objectively you. But if you keep a journal starting before the copy is made, and continuing through the duplication and after it, there will now be two journals, one corresponding to each copy. Eventually their contents will change as the copies have different experiences.

 

So to rephrase my question: Can you predict the eventual contents of your journal?
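The journal thought experiment maps neatly onto what "forked" actually means in software. Here's a toy Python sketch (purely illustrative, not a claim about real mind uploading): a "person" is modeled as a dict with a journal, and the upload is a deep copy. The field names and journal entries are invented for the example.

```python
import copy

# A "person" reduced to some state plus a journal of experiences.
original = {"name": "you", "journal": ["entry written before the copy"]}

# "Uploading" modeled as a fork: a deep copy of the whole state.
upload = copy.deepcopy(original)

# At the instant of the fork, the two journals are identical...
assert original["journal"] == upload["journal"]

# ...but afterwards each copy records its own experiences,
# and the journals irreversibly diverge.
original["journal"].append("stayed in the biological body")
upload["journal"].append("woke up inside the simulation")

assert original["journal"] != upload["journal"]
```

Both dicts satisfy every test you could have run on the pre-copy state, which is the "equal validity" point; yet neither journal's future contents were predictable from the shared past alone, which is the rephrased question.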


Archived

This topic is now archived and is closed to further replies.
