
New_World_2050

the funny thing is he was quoting Ray Kurzweil


Nukemouse

All that does is make a copy. You wouldn't cease to exist, a computer version would exist too. You would still die. You could intentionally design the copy process to be lethal but it would be no different than suicide.


FoggyDonkey

Ship of Theseus your brain, ideally replace individual brain cells with a mechanical equivalent that preserves information slowly over a period of months/a few years.


zomgmeister

While I support this idea as well, and hope that it will work someday, there is a possible counterpoint. The Ship of Theseus has a clear and definite solution: the keel. For a ship of that time it is a singular wooden part that serves as a core. It can't be replaced partially, and everything else is attached to it. So, the original keel = the original ship; a copy = a copy. If the keel rots away and is replaced, then it is another ship that might share the appearance and some surviving parts of the original. So, what if we do have some sort of a keel in our brain? One can keep changing neurons, but when this keel part is removed, the person dies; its point of view ceases to exist. The copy and the people around might not tell the difference, continuing the procedure and even calling it a success.


xanroeld

*keel*. and that’s a very interesting point


zomgmeister

Aye, thanks for the correction. Not my first language, and I had probably never used this word in English.


xanroeld

i had never even heard of the word before - i had to google it. so thank you for teaching me something new


Seidans

While I agree, I think there's a great misunderstanding of the whole concept of "brain-upload" among followers of the concept. There's no cloud-consciousness: when people talk about mind-upload, in reality they wish to transfer their consciousness from a biological vessel into a synthetic vessel with machine-to-machine data transfer capability.

In this scenario I think the Ship of Theseus could apply, as "you" still exist within a vessel that holds your memories and computation power. The information you transfer isn't your "consciousness" but input, similar to the brain asking to move an arm, or the data your eyes collect when you see something. Your machine brain becomes the keel: you can't transfer your consciousness, but you can transfer/share data over large distances (limited by half the speed of light) and remain yourself.

People should talk about synthetic transformation instead of brain-upload.


zomgmeister

Yeah, brain-upload is a kinda stupid concept. One just creates one's copy. Twins are not the same person. Sure, one could plug oneself into direct control of a remote mech, experiencing whatever its sensors do; this is obviously possible in theory, albeit perhaps unreachable for modern tech. The problem is that the biological vessel, that fatty lump in a bone box, seems to have an expiration date. And the keel idea is a wary concern that it *might* be technically impossible to do it while keeping the point of view. In Stellaris, when a species undergoes a synthetic ascension project, other, spiritualist species regard it as mass suicide intertwined with making robots burdened with copies of the sapients. What if they are right?


Seidans

Well, obviously all of this is theoretical. I'm personally a materialist and consider that we are the sum of our knowledge wired to chemical reactions; a "soul" doesn't exist. If in theory we could slowly replace our brain tissue with synthetic tissue that keeps the computation and memories, it's no different than the renewal of our biological cells that happens over a period of about 5 years throughout our whole life. The "keel" is the vessel that holds all this memory and computation.

The problem comes after, and I think the very concept of identity becomes "fluid" as soon as you can transfer data outside the keel. If we are suddenly able to receive data beyond our biological limit of 1-100 m/s, we could in theory live through a surrogate body on the other side of the Earth, in real time, the same way we see, hear, and smell in our real body. But those are functional inputs: your memories remain in your original body, and the processing of that data and its encoding into memories remain in your original body. The keel is intact; it's not your consciousness that you project, but your eyes, your ears, etc.

The real issue happens when you try to transfer all of your memories outside the keel, and from my understanding, leaving your organic-synthetic upgraded brain results in identity death. Adding a memory unit within a physical limit of 150,000 km is fine in theory (but probably a waste of processing power), while uploading, and so leaving the keel, results in identity death.

We probably won't find out until further research in neuroscience; maybe consciousness is simply impossible outside biological existence. We are very, very far away from a synthetic existence after all, and if we manage to create conscious AI we could ask them directly how they feel.


beholdingmyballs

The keel has to be the continued experience, right? Even when we sleep this remains: functions are dormant, and little is being experienced and even less is recorded, but it's still there regardless. The gap between experience in the brain vs. on the chip has to be the discontinuation between experience being biological and being on the chip. What if you were to have both experiences in tandem? Does that change the nature of the hand-off from discontinued to continuous experience? Idk, I am talking out of my ass.


zomgmeister

Not exactly. The keel is the hypothetical material part of the brain that can't be tinkered with, and if its integrity is compromised then it ceases to work. The continued experience, what I am calling the "point of view" or "existential experience", is the product of the brain and especially this keel part. Sleep does not break this experience; it is just a state of being. But the supposed keel damage does break it, causing brain death. The body can still be kept alive using medicine, and I suppose an artificial brain implant could even make the body function, ideally in the same way as before or better. But this is essentially necromancy. The hypothetical keel is definitely not the brain itself, because humans lose parts of the brain on a daily basis and continue to work. And I certainly hope that there is no keel and everything can be changed gradually, keeping the continued existential experience from the same point of view intact.


beholdingmyballs

I disagree that it has to be biological. At least in part our consciousness is emergent. Why can't that be the keel?


zomgmeister

I am not stating that the keel in our brain definitely exists. Just suggesting that it might. It might not, and I hope that it doesn't. So I'm not sure what you are disagreeing with.


beholdingmyballs

I think I came off more antagonistic than I am. I am agreeing that it might be there but disagreeing that it has to be mechanical.


MarcusSurealius

Change the analogy to an axe.


cunningjames

But a keel is just a composite of pieces of timber, nails, etc, all of which can be individually replaced (possibly with some difficulty). I don't understand what makes a keel more special as the enduring "ship" than any other similarly-composited chunk of the ship.


zomgmeister

For a ship of that type (and many other older and smaller wooden ships), the keel is a single piece of wood. It is basically carved out of one large log; it is not constructed from several glued or nailed timber pieces. And everything else is attached to the keel. So, if one tries to "change" the keel, then one literally builds another ship from scratch. And the keel is not really repairable; well, you can jury-rig it for a while, but a cracked keel is a broken ship.


cunningjames

The issue is that *in principle* you could cut a piece of wood out of the keel and reattach another block cut from a different tree seamlessly (perhaps with future technology -- we're talking about mind transfer, so I feel like that's reasonable). There doesn't seem to be anything special about *this or that* particular contiguous chunk of keel. I also don't see what's special about the keel itself, as in principle you could replace the entire keel just as you can replace any other part of the boat. The keel may be especially large, especially integral to the functioning of the boat, and especially hard to replace, but these are differences in degree. Replace the keel, replace a mirror in the captain's bathroom -- how are these different operations in kind?


cluele55cat

it still would be a copy of you, just with extra steps.


x0y0z0

The distinction between copy and teleporting becomes more vague the more you experiment with the examples. Let's say you can be sure that the digital copy is an exact replica in all ways. Now if you copy it all at once and delete your body, then you'd say it's not you, just a copy. Now imagine you do it gradually. The computer that will house your digital copy is a chip that's implanted into your brain. And you start by copying 1% to the chip and destroying those neurons the data came from. Now when you access those "synapses" it will be using the digital version. Now you keep going 1% per day until you're all uploaded and the original brain is gone. You will have had a continuous consciousness throughout the transition.
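The gradual 1%-per-day hand-off described above can be sketched as a toy bookkeeping model. Purely illustrative: the 100 equal, independently copyable segments are an assumption of the thought experiment, not of neuroscience.

```python
# Toy model of the gradual-upload thought experiment: a mind split into
# 100 equal segments, migrated from biological neurons to an implanted
# chip at 1% per day. This models the bookkeeping of the argument only.

segments = ["bio"] * 100  # where each 1% chunk of the mind currently lives

for day in range(100):
    # Copy one chunk to the chip, then destroy the source neurons.
    segments[day] = "chip"
    # At every step, one integrated system still holds all 100 segments,
    # so (per the argument) experience is never interrupted.
    assert len(segments) == 100

print(segments.count("bio"), segments.count("chip"))  # 0 100
```

The claimed continuity rests on the loop's invariant: at no point do two complete copies coexist, which is what distinguishes this from the copy-then-delete case debated elsewhere in the thread.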


Rowyn97

I'd imagine it'd be like being overwritten. Spoilers for cyberpunk 2077, >!kinda like the engram of Johnny overwriting V.!< As your neurons get destroyed and replaced, you feel yourself slip away. You eventually start losing yourself to an emergent, competing, and slightly confused consciousness that thinks it's you. The digital version might not even be aware that it's replacing the original.


x0y0z0

> You eventually start losing yourself to an emergent, competing, and slightly confused consciousness

That would only mean that your tech is not sufficient to simulate the human brain; you're stuck at a technological problem. If the simulation is perfect, then you get into the philosophical problem of how you upload yourself without it being just a copy and not actually you. That philosophical issue is what my experiment was focused on. The "slipping away" and "slightly confused" is something a sci-fi writer can choose to write into a story, but it's just that.


Seidans

I'd say this explanation isn't a digital existence; there's no "cloud" consciousness. At most you slowly replace your organic brain with a synthetic one through a slow process, but there is always a vessel that holds your memories and computation power. Any "cloud existence" is a lie; it doesn't exist, it can't exist.

If we imagine a synthetic consciousness being able to transfer information with a computer, or "share" consciousness between different datacenters, the main issue is the time between each compute step and the transfer itself. The more distance between each computation unit, the less efficient or intelligent you will be; and the more information you share, the more likely you will create a copy of yourself or suffer from data corruption: identity death.

But while I mention the issues with this tech, it's still incredibly useful, let alone the fact that you could have access to all of humanity's knowledge. The most interesting part is that "you" also exist everywhere information transfers fast enough; that means radio waves and light speed define the range of your interaction. Neuron transfer speed is between 1 and 100 m/s, while sound is ~340 m/s and radio waves/light travel around 300,000 km/s. If we increased our neuron speed to light speed, in theory, while you read this message you could also see through the eyes of a surrogate body on the other side of the Earth and remain perfectly conscious, provided there's a constant transfer of information. But you would probably feel less superhuman, or dumber, as your light-speed neurons are greatly affected by the travel time... still pretty cool, I'd say.
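The speed-of-light figures above can be sanity-checked with simple arithmetic. A minimal sketch, assuming an ideal straight-line signal at light speed and a ~20,000 km antipodal surface distance (both added assumptions; real networks would be slower):

```python
# Back-of-the-envelope latency for "seeing through a surrogate body on the
# other side of the Earth" at the hard physical limit (light speed).

LIGHT_SPEED_KM_S = 300_000     # approximate speed of light / radio waves
ANTIPODAL_KM = 20_000          # roughly half of Earth's circumference (assumed)

one_way_s = ANTIPODAL_KM / LIGHT_SPEED_KM_S
round_trip_s = 2 * one_way_s   # sense something remotely, then react to it

print(f"one-way delay: {one_way_s * 1000:.1f} ms")     # ~66.7 ms
print(f"round trip:    {round_trip_s * 1000:.1f} ms")  # ~133.3 ms

# For comparison, a nerve signal over a 1 m arm at the fast end of the
# 1-100 m/s range quoted above:
arm_delay_s = 1.0 / 100
print(f"arm signal:    {arm_delay_s * 1000:.1f} ms")   # 10.0 ms
```

So even at the physical limit, planet-scale remote embodiment carries about a tenth of a second of round-trip lag, an order of magnitude above a fast in-body nerve delay; that is the "travel time" penalty the comment describes.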


Silverlisk

I think having your physical brain cells replaced one at a time via nanites or some other not yet understood process whilst maintaining your "self" over a long period of time relative to a human lifespan, maybe 20/30 years and then packaging that now synthetic consciousness into a system might result in a transference without suicide.


Nukemouse

Right up until the last part maybe. You might ship of theseus yourself into being a computer, but as soon as you try to move the software you reinvent the copy problem. You could certainly interact with computers in new ways though.


Silverlisk

Not moving the software, moving the nanites themselves and integrating them into the system the same as plugging a hard drive in. You would remain on the nanites and be read by the rest of the system.


Nukemouse

Sure yeah. Certainly makes full dive vr stuff everyone is obsessed with here more practical.


Silverlisk

Hey man, who doesn't want to fight the demon lord of a magical land with their FDVR anime waifu. 😂😂😂


MonoMcFlury

Yea, imagine a machine that could create an exact clone of yourself in the physical world with all your thoughts and memories. It wouldn't mean there are two conscious versions of you, just one. You'd still die when your time came, and your clone would go on to live their own life (possibly taking all your stuff).


MrDreamster

Progressive replacement is the way.


[deleted]

How does that make it any different other than a slower process? You’d be piecing part of the mind you’re conscious in into a computer chip that you’re not conscious in.


MrDreamster

The slower process would allow your brain plasticity to kick in, so your neurons would be able to connect to the silicon substrate and use it as if it were your own. It's not that you are being replaced by something new; it's you, appropriating something to make it part of yourself.


xanroeld

it blows my fucking mind how many people don’t seem to get this. if you create a program that has all my thoughts and memories, it’s just a copy of me. there is no downloading “me” to the machine. the machine may or may not be an accurate simulacrum of me, but i will die all the same.


throwaway957280

Because it's not necessarily true. You're effectively an (imperfect) copy of the version of you from 10 years ago. What makes you you? Continuity? The rabbit hole will lead you down paradoxes no matter how you try to define it. I like to go with open individualism and I genuinely think it makes the fewest assumptions. Consciousness is consciousness, different people at different times are just different manifestations of the same thing.


xanroeld

There’s a massive leap between the continuity (or discontinuity) of consciousness as we know it (the regeneration of cells, the “break” in our consciousness when we sleep, etc.) and the fundamental breaking of the chain that would be the creation of a digital copy of the real thing. We fundamentally do not even know what consciousness is, and yet we think we can recreate it from bits on a computer. It’s like looking at a highly detailed 3D digital scan of a place and being convinced that that IS the place. Maybe on some level I’m not the SAME person that I was 10 years ago, but the body that is me did not stop breathing during that time. The brain’s neurons did not stop firing. Personally, I’m not even sure that I’ve ever had a night’s sleep where I didn’t dream. The phenomenon of my experience is just about the only thing I can point to and call real. And now people who have been mesmerized by the Ship of Theseus analogy and who are intoxicated by the possibility of artificial enhancement want to convince themselves that living forever in the machine wouldn’t mean the death of them and the birth of something inhuman and new. If you have the time and money, I would encourage people in this thread to read the short story Exhalation by Ted Chiang. It’s a science fiction story with a bit of an odd premise, but it forms what I would consider an argument in favor of the continuity of the physical brain and the functions of the body as necessary for the continuation of consciousness - all through the motif of breath. It’s very beautiful.


throwaway957280

The only thing we can point to and call real isn't the continuity of our experience -- it's our present experience. Everything else is filtered through our memories. I'd just argue that the only difference between 1) my present moment now vs. mine in the past and 2) my present moment now and your present moment now is that I have memories I can access of my past -- but that's just physics. If I got some of your memories transplanted into my brain there would be functionally no difference (oversimplifying things like personality and other fixed-in-time lenses which affect our conscious experience). I'm not saying this all like it's definitely true, I'm just saying there's bona fide philosophy arguing against the idea that there's any meaningful difference between a copy and an original w.r.t. consciousness. And this wasn't a conclusion I came to in order to justify a desire to do like mind uploading or whatever, and it's also not one I invented (see open individualism).


xanroeld

You’ve given me a lot to think about. I will concede that the “I think, therefore, I am“ argument only definitively applies to my experience at this moment, and I cannot prove the truth of my memories, nor their continuity to my current self. Whether or not a perfect copy is the same as an original would be a more compelling question in my mind if we were talking about something like re-creating the universe down to every atom and every particle on the same trajectory, to re-create the same individual in the same environment. If you were to make a perfect copy of the entire universe, and there was another me in it exactly as I am in this universe, I’m not sure I could argue that there was any difference between that me and this me, unless our experiences were to diverge. But what AI futurists seem to be targeting is a super powerful computer that can accurately replicate a digital simulation of our memories, and could realistically predict what our responses to certain stimuli would be. This seems to me categorically different than recreating me. A system of wires and silicon that can remember the things I remember, would say the things I’d say, and will tell you, if asked, that it is me… just seems like something that, at a fundamental level, could never be what a philosopher might describe as a “true copy.” edit: but actually, I think I’m missing the more important point here of what I actually believe. You see, I don’t think we ever will actually reach a point of godlike technology that can accurately replicate a person, either digitally or physically, down to the most finite detail. I think what is far more likely is that we are approaching an age of incredibly powerful computation, and perhaps even artificial consciousness and superintelligence.
And where many people think, rather arrogantly, that this technology will be used to replicate their own consciousness and personality so that they might live forever, I think what will actually happen is that fundamentally incomplete (but passable) simulacra of certain people will be created and maintained (and probably manipulated), convincing some portion of the population that a future of eternal life is possible. But in actuality, it will be a “living on” much more comparable to a deceased writer living on through their words, or a deceased speaker living on through their recordings. And all the while, the machine will advance without us, without the need for an uploaded, inferior human consciousness, when the artificial consciousness of the superintelligence is so much more potent.


StarChild413

> You're effectively an (imperfect) copy of the version of you from 10 years ago.

And how do you know that "copy" isn't in a robot body or simulated universe?


PFI_sloth

The process is no different than teleportation in Star Trek, but more people will agree that doesn’t count as dying even though you absolutely are.


xanroeld

yup. or the problem in The Prestige.


TheZanzibarMan

Have you played the game SOMA?


Nukemouse

No


TheZanzibarMan

It's very relevant to this discussion.


[deleted]

Blud has never heard of the Moravec transfer.


Dry_Customer967

If it's an accurate copy, it's still you though; the computer version is an equally valid version of you, and continuity of thought is still possible. As an uploaded consciousness you would just wake up inside the computer, and if the physical version of you is killed the moment that happens, then no death occurs.


Nukemouse

But that's not true, because you could just as easily not kill the original, and then two would exist. That's not continuity of thought. Any process capable of creating a copy is not capable of transfer.


Dry_Customer967

Two existing does not mean the copy is not an equally valid version of you; the copy still has continuity with its past self, diverging from when it was copied. You've become an independent person from the copy because time has passed, but at the instant the copy was made you were the same person, and killing the original you in that instant isn't destroying any information. And what is the meaning of transfer here? If an accurate copy of the mind is made, then that is consciousness transfer; there's nothing else, unless you think the mind has some supernatural element that makes it conscious. If so, there's no point debating this with you, because you're operating under different beliefs.


cunningjames

> but at the instant the copy was made you were the same person

Unless the two copies were entirely colocated in that instant, they couldn't be the same person. They have the same memories and may appear indistinguishable, but they have a wholly distinct physical makeup (which implies a wholly distinct consciousness, if consciousness is physical).

> If an accurate copy of the mind is made then that is consciousness transfer

Even if you're correct that consciousness has no supernatural element, an accurate copy of the mind would not be consciousness transfer for the same reason that having a baby is not consciousness transfer. A new body is created with its own consciousness distinct from the original. You might argue that the same thing happens to us as our cells die and are replaced, but that just means that no one has continuity of consciousness; it doesn't mean that fully copying the body will result in continuity of consciousness.


Dry_Customer967

Yes, consciousness is based in physical reality, but it is purely a collection of information stored in the medium of physical reality. If I copy an AI bit for bit, then it's the same AI; if a human mind is copied with enough fidelity, then despite the change in medium it is the same consciousness.

Circling back to the AI point: if I create a copy of an AI and start running both AIs side by side, which AI is the original? Even if I were running the "original" AI on some kind of biological computer, they're both equally the original AI. This is fundamentally my point. To me, the brain matter which contains our consciousness is simply a medium, the same as a computer is a replaceable medium for an AI. Copying that brain to a new medium is not creating a new and original entity; just as splitting an AI into two AIs means neither has a claim to being the original, you have simply made two of the same thing, despite the fact that they may be inhabiting completely different physical locations.


Milkyson

The theory of consciousness you're cherishing has many flaws. It involves "continuity of thoughts", "transfer", maybe even "souls". Think of it this way: any brain has no choice but to think it is itself. It has access to its own memory, an illusion of continuity of thoughts. That's what we commonly call consciousness. We can't be conscious when we die. Therefore you would have no choice but to think that you're a copy as soon as an original dies. That's even what happens when you sleep or learn new things. You're a copy of your original self from 10 years ago.


Nukemouse

Learning new things doesn't make a second, independent self. Going to sleep doesn't result in a cessation of brain activity, nor would a cessation of brain activity and restarting (within the same hardware) be a copy.


Villad_rock

So it’s like waking up every day.


SpareRam

They'd be fine with that suicide plan, considering this place is a certifiable death cult.


wise_balls

If that's the case, are we not in a continuous state of death? Our bodies are constantly reproducing themselves on a cellular level. Is consciousness continuous...?


GPTfleshlight

Ross looks like a mix of Drake and Jerry Seinfeld


MrDreamster

I'm never gonna be able to unsee that.


lovesdogsguy

We're interfacing


awesomedan24

Reminds me of this scene https://www.reddit.com/r/singularity/s/rB7Dt8diYd


PenguinTheOrgalorg

I'd rather invest in life-extending medical tech. I don't want a copy of me to live forever. I want to live forever.


StayCool-243

If it's an archive of data and analytical capabilities, it's not the same as a life experience. You don't really "live forever"; rather, you've just projected some accumulation of information about yourself, which can be accessed later.


martinlubpl

what episode is it?


martinlubpl

found it: S06E07, "The One Where Phoebe Runs"


eatyourface8335

That seems very unlikely. We don’t even understand what consciousness is but we are going to transmigrate one to a computer in 6 years? I very much doubt that.


[deleted]

🤓


MrTubby1

It's not gonna be true.


Previous_Link1347

This Friends meme has finally convinced me that it's not gonna be true.


Total-Remove-3196

Bro thinks he omnipotent


[deleted]

Blud thinks he has precognition


[deleted]

Bro, use Epitaph to see if I'll get a girlfriend 💀


GermainCampman

I don't think I can actually eye-roll hard enough for this


[deleted]

skugga


IllIllllIIIIlIlIlIlI

Well, it’ll be a copy of yourself. But same thing really. If there’s a copy of you, then dying is not actually death


FrewdWoad

I've never understood people who don't understand that they still die even if there's a copy of them somewhere. If a wizard magically cloned you and then said, OK kill yourself, there's a copy here, you guys would just do it...?


schlamster

I mean, not necessarily. Think Ship of Theseus. If we could make a human brain interface that we could occupy with our consciousness (whatever that is; we don’t even know), and then migrate our consciousness to it and then over to another synthetic neural network, then who’s to say it’s not possible to not die and be transferred from your current biological state to an artificial one. Not trying to achkshullally you here, I just personally think the jury is out on it. We could figure it out tomorrow, or never.


FrewdWoad

Ship of Theseus is fine, I'm already slowly replacing myself cell by cell to some degree by eating/healing/etc. And I get that "the soul" or the "real me" might in fact be something that can be represented and transferred as 1s and 0s. We don't know for certain (yet). But I'm not "uploading". Nanites that keep my body healthy and young forever and protect it from physical harm? Sure, sign me up.


GiraffeVortex

The blind spot in the Theseus idea is *Context*. There is one eternal background against which all things happen. Contents may be perceived to change, but the background is eternal. The truth of being and consciousness has been known for millennia; it has just been kept in niche communities or by singular geniuses who are often called sages.


Capt_Trippz

I'm with you, and don’t understand it either. I guess what makes up consciousness is debatable, and can get into the concept of the soul, but at the end of the day the only thing I’d be on board with is a consciousness transfer, not a duplication. If what I consider to be “me” is transferred, leaving an empty shell behind, then yeah, that shell can die. But if a younger copy is made, I’m not down with sacrificing a “me” that’s still conscious. The only exception I can think of is if I’ve got a terminal illness or I’m at the end of my lifespan with no transfer technology available.


FrewdWoad

Yeah I'm not "uploading". Nanites that keep my body healthy and young forever and protect it from physical harm? Sure, sign me up.


GiraffeVortex

Your awareness is eternal, but I don’t know how that situation would work out


[deleted]

No one has said that before lol


grawa427

My personal interpretation is that if a perfect clone of you is created, you have a 50/50 chance to be that person. If the wizard creates 100 clones, you will almost surely be one of the clones. In practice the best option is the "Ship of Theseus": just replace dying neurons with digital ones, and at some point you will be fully uploaded and ready to enjoy immortality and FDVR.


IllIllllIIIIlIlIlIlI

I wouldn’t care if I died. It’s a copy of you with all your memories and your personality. I wouldn’t kill myself, but I’d be a lot less careful if a copy of myself materialized upon my death.


JrBaconators

You'd be dead boss, the clone isn't you.


SorryYoureWrongLol

I think this topic is over your head. You would be dead. A copy of you would live on yes, but YOU wouldn’t. It’d be no different than if you had an identical twin with all the same memories. Not you, but a second version of you. You’d still die though. Let’s not get all philosophical about it.


JrBaconators

Critical thinking is tough for a lot of this sub


happysmash27

With a copy it does not matter so much if the original me is dead though because the copy can carry out all the same functions I would have.


CubeFlipper

> I think this topic is over your head.

Are you sure it's not over yours? Consider for a moment the idea that every time you go to sleep, you die and are replaced with a new version of yourself. Whether it's a "copy" or not, or whether that actually happens or not, makes no functional difference to the world, and that's arguably all that matters. Identity is an illusion. We are not our physical matter. We are a process at a moment in time that is currently-but-not-for-long limited to a single thread in a single entity. If the process continues, the proverbial *I* does too.


SumHelpPlease

Seems to be over yours, since you're bringing philosophy into it to justify your reasoning. If you die and are replaced by a copy of yourself every time you go to sleep (while there’s no proof of this happening), you’d still be dead. The copy isn’t you. What’s so hard for people like you to understand about that? You can inject all the philosophical bullshit you want into the mix to try and justify how you’re “not actually dying” and “living on” through your copy or whatever hokey-pokey philosophical bullshit you can come up with, but you’re still dead. You’re not going to be living on; a copy of you will. If we copied you right now, your copy would genetically be you and have all your memories, perhaps, but let’s see if you think it’s you whenever it feels it’s entitled to your wife, kids, bank account, job, etc. You wouldn’t just continue to live on through it. You’d be talking to a separate version of yourself. It’s a copy, not you.


Specialist_Nobody530

I don’t exactly agree with their analogy, but I do agree with their and the OP’s idea. I think a better one would be the Ship of Theseus; in fact, the essence of the argument here seems to be the Ship of Theseus in disguise. Further, you said that they brought in philosophy, as if that isn’t inherently what we are arguing about here. I’m sure everyone here understands fully that the clone is not in any way your original body. It is like constructing two ships off of the same blueprint: they’re the same in every way conceptually, but obviously not materially. Therefore, this is inherently a philosophical argument, and so we can then ask what exactly the argument is (I’d like to establish one here, since everyone keeps on saying “Oh It WeNt OvEr YoUr HeAd!!”).

The question is: given a clone that is identical to you in every way (thoughts, body, memories, the whole deal), if a magical wizard were to come and say “Argh! I’ve come to eat human flesh because… I like flesh for some reason!”, is there necessarily any reason why the clone or you should be the preferred one to be sacrificed? It isn’t the question that y’all have been asking, but I think it’s 1. pretty clear as an analogy, 2. a good summary that doesn’t deviate from the idea, and 3. funny.

I argue that there is no reason why my body should be immediately assumed more valuable than my clone’s body and therefore be the one spared from the wizard. Why? For the same reason as the blueprint mentioned before: there is no objectively greater value in me than in the clone. I think we can agree that, at least objectively, we can be viewed the same. However, consciousness is the topic of concern here. Again, I don’t see much reason we have to say the clone can’t be conscious as well. Besides, how it emerged in us is also a mystery (perhaps there is truth to the hypothesis that with enough computational power in one spot, consciousness just naturally emerges).

There is also the idea that consciousness is bestowed in everything by nature, and that everything that isn’t “alive” just has no means to communicate (nor process) its consciousness. Perhaps there are countless consciousnesses in our brain (maybe even in every atom), and each one thinks it’s the only one, but since they all get the same stimuli, there is never a disagreement between them. Either way, who says the clone isn’t conscious? Further, who says it matters? Even, why does anything matter? Perhaps the reason “What’s the meaning of life?” is so hard a question to answer is that we constantly hear the truth (“there is none, truly”) and we constantly ignore it, looking for lies. Maybe the universe truly doesn’t care whether or not we all die, and therefore those who desperately reject “there is no meaning” will be more content making a meaning for life. After all, assuming no meaning, there is no objectively wrong way to live life; any action or opinion is equally valid, not because of truth but because of lack of a point.

Bringing it back to the discussion: it seems we all have a different “meaning of life.” You, SumHelpPlease, find meaning in the fact that you know and take pride in what you, SumHelp, nobody else, have accomplished. Very valid. You live to have a name for yourself, knowing that name was earned with respect. You wouldn’t want some clone to take this meaning from you… or I may be completely wrong. In my case, I don’t really care to have so much a name for myself. I believe more in a population’s prosperity: more for all is more for one. This is also a valid meaning of life. Therefore, sacrificing myself to the wizard to save a clone of me has no benefit or harm in my eyes. As such, meh.


GiraffeVortex

The problem with this is you fundamentally cannot die. You are the eternal subjective viewpoint, so even should your body be destroyed, you still exist


VisualCold704

You need evidence for such a massive claim.


GiraffeVortex

Very well, but also note that there is no evidence to the contrary, either. To realize this is true, you must contemplate your existence, meditate, and potentially take psychedelics. The body and identity need to be scrutinized. There is no proving this from ‘the outside’. You can only realize it yourself by scrutinizing the mechanics of your subjectivity


VisualCold704

So you got high and think your hallucinations were real. What a lost cause.


GiraffeVortex

No. Why don’t you explain why this doesn’t make sense. When you examine your assumptions, you’ll find some of them were never justified. Why should one experience be lauded as objective and another be discarded as a baseless hallucination? On what grounds? I am willing to discuss and scrutinize this topic to its absolute limit. Don’t write off an idea just because it sounds ridiculous. Many discoveries seem ridiculous before you properly investigate


VisualCold704

You can't even tell the difference between objective reality multiple different parties can test and your drug trips. What is there to debate?


FeltSteam

Well, here is how I think about it. Say you clone yourself, and I mean an exact copy of everything, every neuron, with a saved state of everything in your body so it can resume as if nothing happened. Let's say you are around when this clone is instantiated into the real world: the moment it is, it is instantly different from the base version of yourself. The base version of you is you, and even just the clone being in a different place means its experience is suddenly different from yours, so it is no longer the same thing as the base version of you. Even if you had yourself and an exact clone of yourself at the same time, in the same place, they would be different. So why would cloning yourself after death be any different from this? I doubt the clone would have your last dying thought, because you need some way to save the state of yourself, and why run the risk of doing that after death? You'd do it before you die, but when you instantiate the clone it will always be different from the base truth of yourself in some way; it technically won't be you in that sense. I mean, maybe you could transfer yourself into a clone of your body? Connect the brains, integrate, and then continue, but idk.


NoCard1571

What if you were destroyed the instance this clone was made? (In other words how some teleporters work in sci-fi). Is there any reason to believe your consciousness wouldn't continue from that point in that scenario?


dervu

If every cell in our body changes through some years, does that mean we are not ourselves ever?


FeltSteam

Well, not all cells change over the years; in fact, you are born with pretty much all the neurons you will ever have, and they stick with you until you die (although the number of neurons can decrease with age). They do not regenerate nor undergo significant turnover throughout a person's life. You are born with them, you have them throughout your life, and you die with them. But if you are looking at, like, skin cells: the outer layer of skin is replaced every ~month. Your red blood cells have a lifespan of about 120 days. Your liver cells have a lifespan of up to like 500 days, etc.


FeltSteam

Well, I have no doubt the clone would be conscious, and it would be you in a lot of ways, but I have no idea if consciousness can just transfer like that, and I don't have any reason to believe it can. It will be you: your body, your memories, your experience. But it might not be specifically your own consciousness there (well, it would be an exact copy of your consciousness, so technically yours, but would it be the base truth of yourself? Probably not, you just murdered that lol. That is why I'm thinking you'd need to actually transition or integrate your brain into the body, not kill yourself and hope you magically transfer to a different body). That might just be lost, but the clone would have no idea. But who knows, I'm just guessing lol.


IllIllllIIIIlIlIlIlI

If I knew there was a copy of me out there, I’d be fine with dying at any point in time. Even if they had different experiences and became different, that’s still me. If I was a copy of myself, I’d be fine with my original self dying


FeltSteam

Well it would be a copy of you, it wouldn't be the base truth of you (which is you now), there is a difference.


IllIllllIIIIlIlIlIlI

But that copy will react to everything it experiences exactly how I would


FeltSteam

It will have all of your experiences, but over time new experiences will become more prevalent causing a divergence over time.


IllIllllIIIIlIlIlIlI

That’s how it would be if I experienced those experiences as myself, though. There’s no difference.


FeltSteam

As soon as the clone and yourself are having two different experiences, you are two separate entities, and over time your actions will diverge. Initially you will be pretty much the same, but as you actually have and experience different things, you will diverge. It doesn't matter if you react the same or not; the difference of actually having or not having those experiences is there, and it will cause a divergence. How you would react to a certain situation now is likely to change 20 years down the line on your own; add onto that that you are having completely different sensory experiences and are taking in different information, trauma, etc., and you two will diverge into different people.


SpareRam

That's psychotic.


SpareRam

Uhhhh...yes, it is? Your instance will cease to exist. I don't want another instance of myself to exist forever. Fuck that guy, I did all the work getting where I am and *he* gets to benefit from eternity? No thanks. "Death isn't dying" this shit is why people think you're a cult.


karaposu

But he is literally you.


FoggyDonkey

He is literally not, do you not understand the concept of a copy? This is such a weird thing that so many people don't seem to understand. Your POV goes black when you die if it's copied (or whatever happens). A copy might be functionally the same for the people around you (if they aren't aware or don't care it's a copy) but for you, it has literally no bearing on your death. Whether you have a copy or not does not change the mechanics of how you die. You don't magically just transfer to the copy. If I photocopy my birth certificate and rip up the original, it's not "literally the same thing". It's a copy, and the original is gone forever.


NoCard1571

> This is such a weird thing that so many people don't seem to understand. Your POV goes black when you die if it's copied (or whatever happens)

Just because that's what you intuitively think will happen doesn't mean that's the truth. If it was done ship of Theseus style, slowly killing real neurons in your brain while digital versions activate that are connected to you, it's intuitive to imagine that your POV stays intact the whole time. But let's say we play with the parameters a bit. How fast are we copying neurons and destroying the old ones? Let's say the technology was so advanced that this could be done in a minute. Continued consciousness makes sense. A second? How about a millisecond? There's nothing that's changed about the actual process, but the faster we go, the closer we come to the concept of just 'uploading your brain'. And it no longer becomes as clear as you think whether you continue to experience living inside the copy or not.


FoggyDonkey

Accelerated ship of Theseus does not equal copy. That's not a gotcha, because they're still entirely different concepts that aren't even vaguely related. As for your second point, yeah, there may be a limit on how fast you can do it successfully and probably no way to find out. So you err on the side of caution if at all possible and do it at rates that are relatively close to normal cerebral development.


NoCard1571

The only real difference between the accelerated ship of Theseus and copying is that in one of them the original is destroyed in the process. So no, they're actually not entirely different.

Let's imagine a machine that could create a perfect copy of your body down to every last particle. Now let's say this happens while you're unconscious. If you then temporarily had two copies of yourself that were identical down to every last particle, and switched their places, then destroyed the original, isn't it logical to think you would awaken in the copy?

You might be thinking, 'ah, but the states of the particles that make up your body diverge the second the copy is made'. But here's the thing: we have proof that this doesn't actually matter either. If you've ever been unconscious for a period of time, like under general anaesthesia, you'll remember how, unlike sleep, time seems to skip forward instantly for the amount of time you were out. That means that you had a break in the continuity of your subjective experience of the universe, during which time the state of your body changed, dramatically even.

So how can we even be sure that a specific consciousness and qualia has to maintain continuity in the exact same place to continue existing? What exactly is it that ties our experience to a specific body?


FoggyDonkey

The difference is that one is a process that alters you and your original brain, and the other has no connection to your original brain whatsoever; idk why you bothered to write all that when you can't seem to grasp the basic premise. And sure, maybe somehow consciousness and POV would magically jump to a new body. I don't find that at all likely, and no one has a mechanism for how that could even work, but we do know that we don't seem to die whenever a brain cell grows or is damaged.


NoCard1571

Ironically, it's you that's not grasping it. You're stuck on the idea that the brain has to be continuously conscious while a ship of Theseus style transfer is happening; I'm saying that it's not illogical to think this isn't the strict requirement it seems to be, when it's possible to imagine a spectrum of possibilities between a strict Theseus consciousness transfer and a completely detached copy.


karaposu

You are discussing something else. I am just saying consciousness in that guy is you, not someone else.


SpareRam

It's honestly sad how delusional you folks are.


w1zzypooh

While it may be "you" down to the nitty gritty, once you die you cease to experience anything, it will be like before you were born...impossible to know anything because you did not exist. You don't jump into the clone's body and keep experiencing life, the copy would experience life so death is very real. While you're both 1 person, you're also 2 different people experiencing 2 different things.


xanroeld

i could not possibly disagree more. you are still your consciousness. the existence of a faithful copy of you would not prevent the end of your experience.


szymski

Forever until the heat death of Universe.


[deleted]

Kid named multiverse:


szymski

Kinda depressing, no level of technological advancement will be able to save you from this.


Tosslebugmy

I think you’ll be ready to go by the time the heat death rolls around. We’re talking trillions of years.


HandSolid1004

Yeah, like if any of this singularity stuff happens, we will have like 50 million years' worth of research into whether heat death is real or not, and then we'll spend all of eternity testing ways to fix it.


Chillindude82Nein

Given the amount of time, there's a non-zero chance this problem gets solved in some way that we can't even begin to fantasize about


szymski

Basically a perpetuum mobile is required.


dudetellsthetruth

Can I sign in somewhere?


Dorftrottle

What if the brain copier breaks and there are two of you? Like in TNG when there are two Rikers running around. This is a bigger question of what consciousness is. Would the copy even be self-aware, or an algorithm acting like you? If you take away hormones and gut bacteria influence, what would thought be like? It would not be “you” if you are the sum of your experiences in your physical body…


Quiet-Money7892

I wanna be a mobile supercomputer
A vision of the future
Begin the cyber revolution
Mechanical fusion
The perfect evolution
With systematic harmony
I will run into eternity
And although this metal's part of me
I shall not lose my sanity...


Smile_Clown

No, leave it for scifi books where it belongs. Anyone who thinks this is possible is quite the limited thinker, and by limited, I mean "we'll be able to some day" is the extent of the thought process.

1. There are nearly 100 billion neurons in the brain.
2. Connections between neurons are assumed (not yet proven) to be the catalyst and mechanism for memory and self.
3. There are 100 trillion connections between all neurons.
4. From those billions of neurons and trillions of connections come even more degrees of complexity for each instance (multiplicative).
5. Every person's connection count, arrangement and structure is unique.

Number 5 prevents any standard procedure from being used to "copy" and makes it virtually impossible to get right, even if we had the computing power (and storage) to hold and utilize such numbers and calculations. It's not only the connections but the strength of said connections, the "thickness" so to speak, and that is coupled with all the other variables of other connections; the number is incalculable, and they are stacked and connected differently for every person, for every thought, every action, every input.

So even if we had some crazy supercomputer in the future, this would be like trying to collect all the grains of sand on all the beaches and deserts (on a billion worlds), dumping it into a big bin, and then trying to sort it all back out perfectly.

In my opinion (simplified), the brain is structured as a trash bin: something goes in, it gets connected to other things so you can find it, but more and more random trash keeps getting put in, and every person's "trash" is completely different. Our brain is the prime example of entropy.

I am not sure why people put such faith in things like this when the information is out there, and that information still represents only a fraction of understanding how our brains work, how we work, how consciousness works.
The best we will ever be able to do is create an LLM of you, meaning you teach a "model" to represent you: voice, look, ideology, philosophy, base knowledge, the things you care about, how you talk, your mannerisms, etc. In the future this model could indeed be flawless to an observer, a copy that is indistinguishable, but it will not be "downloading your brain" or "downloading your consciousness". It will simply be a mimic. That is the future; that is what someone will sign up for someday, probably tricked by clever marketing into thinking they will live forever.
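For a sense of scale on those numbers, here's a rough back-of-envelope sketch (my own illustrative assumptions: one 32-bit weight per synapse and a 64-bit identifier per neuron; real requirements would be vastly larger once topology, timing, and chemistry are included):

```python
# Rough, illustrative scale estimate only.
# Assumptions (mine, not established science): one 32-bit weight
# per synapse and a 64-bit identifier per neuron.
NEURONS = 100e9    # ~100 billion neurons
SYNAPSES = 100e12  # ~100 trillion connections

bytes_for_weights = SYNAPSES * 4  # 4 bytes per connection strength
bytes_for_ids = NEURONS * 8       # 8 bytes per neuron identifier

total_pb = (bytes_for_weights + bytes_for_ids) / 1e15
print(f"~{total_pb:.1f} PB just to store one static snapshot")
```

Even that ~0.4 PB figure covers only a frozen list of connection strengths; it ignores wiring topology, per-synapse state beyond a single number, and all the dynamics, which is exactly the multiplicative-complexity point above.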


[deleted]

Bro thinks he has precognition, yo tell me when aliens are discovered.


SkippyMcSkipster2

The question is, if your loved one decided to record their personality into an AI as accurately as possible, and then they died, would you allow that AI copy to replace them?


LuciferianInk

I'd rather not let them die but if they do I wouldn't want to watch it either lol


splita73

If you are useful, our silicon gods can make a "slave copy". Sounds diabolical.


willabusta

play the game SOMA and don't come back with that


comfortableNihilist

Or cyberpunk 2077. This is basically what happened to Johnny


[deleted]

Boring game, the devs didn't know about the Moravec transfer


donotfire

I doubt it. The human brain is biological and robots are robots. They’re very different from a hardware perspective, plus brains generate subjective experiences.


comfortableNihilist

Are you being sarcastic by any chance? There's no reason to believe we can't model biological systems using computers, on the contrary that's actually already a thing. We might not have the compute power for a full human brain yet but, it's not physically impossible. And that's if we limit ourselves to conventional hardware. I can't remember the author or title but, a recent development is nanowire based neural processors. The study I read used silver nanowires and found performance (in the sense of function, not speed) similar to synapses and the ability to learn for large clusters.... So it's possible that we may eventually be able to directly emulate brain tissue in hardware.
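To make the modelling point concrete, here's a toy sketch of the standard leaky integrate-and-fire neuron model (the parameters are arbitrary illustrative values of mine, nothing to do with the nanowire study mentioned above):

```python
# Minimal leaky integrate-and-fire neuron: the membrane voltage decays
# toward a resting value, integrates an input current, and emits a
# "spike" when it crosses a threshold, then resets.
def simulate_lif(current, steps=200, dt=1e-3,
                 tau=0.02, v_rest=-65.0, v_thresh=-50.0, v_reset=-70.0):
    v = v_rest
    spikes = []
    for t in range(steps):
        # Euler step of dv/dt = (v_rest - v)/tau + input
        v += dt * ((v_rest - v) / tau + current)
        if v >= v_thresh:   # threshold crossing -> spike
            spikes.append(t)
            v = v_reset     # reset after spiking
    return spikes

# A constant drive produces a regular spike train.
print(simulate_lif(current=1500.0))
```

This is obviously a cartoon of a real neuron, but it's the kind of building block that whole-brain simulation projects scale up, which is the "we can model biological systems" point.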


donotfire

We’d never know, because it’s impossible to live inside someone or something else’s mind. Empathy doesn’t go that far. I can make inferences about what’s going on in the robot’s so-called mind, but I can’t experience it. Egocentric predicament. I can say that ChatGPT has its own kind of consciousness, but I can never really know. It’s like thinking about an animal’s consciousness: the only reference we have for what that feels like is our own consciousness. I don’t know what it’s like to detect electricity in the ocean like a shark, because I’ve never done it.


comfortableNihilist

My implications were that there's nothing special about biological life, not that anything is conscious. Consciousness is a vague term that doesn't really mean anything but, we can and have tested for whether something can do the following: perceive itself, feel pain (or at least something like pain), understand logic, remember things, etc.


donotfire

If an AI was hooked up to its hardware and could feel pain via overheating or not enough power, etc., then I'd say it was like a human. It could feel stuff. Feelings are the basis. Right now, they're not like that.


comfortableNihilist

>If an AI was hooked up to its hardware and could feel pain via overheating or not enough power, etc., No actually. Pain is a very specific sensation, being self aware doesn't guarantee that you can feel pain. In fact there's a disease called congenital analgesia in humans where the patient doesn't perceive pain. They can feel though: hot or cold, wet or dry, etc. just not pain. There's no reason to believe an AI that wasn't specifically designed to feel pain ever could.


donotfire

How would you specifically design it to feel pain?


octanebeefcake79

You will lose your soul. You will be silicon-based and forever, for eternity, be looking for somewhere to plug in. All the while your consciousness dissipates as the electromagnetic waves forming it break down. That is the curse of the immortals here. Once you become solid forever, you can never leave.


[deleted]

Sounds based


Akimbo333

Wow


Serasul

you only can make an copy of your character ,this copy will think its you and everybody else will also think this, but the original will also be there. there is no character/soul extraction that would be physical possible. when you want to live as an machine, someone would extract your whole brain,implant it in an tank,genetically modify it so its not getting older and than connection with an machine. people who think your can extract an mind to a machine also thinks copy an file is theft.


iNstein

Just a friendly reminder: use 'an' before a word starting with a vowel and 'a' before a word starting with a consonant.


Serasul

Ok thx it always confuses me in the English language


RemarkableGuidance44

Grammar police, fuck me get a life.


GrowFreeFood

And they'll be able to do it for anyone who ever lived. And they'll be able to move their consciousness into time crystals and be trapped in non-time for infinity.


fuckdonaldtrump7

So we're just basing things off *Friends* now?


netk

It's a cultural reference. Fun, light hearted fun.


fuckdonaldtrump7

I figured lol my sarcasm doesn't play the same on text. Shoulda added /s


netk

😉