A thought experiment:<p>One by one your neurons are replaced in place by their digital counterparts, nano-scale computers with equivalent functionality. After which neuron number are you no longer you? You remain conscious throughout, and the process may take as long as needed, with pauses for sleep.<p>After the replacement is complete, one by one these neurons are switched off, with their functionality offloaded to their clone instantiated in a computer. After which neuron number are you no longer you?<p>This mind upload thought experiment convinced me that as long as there are no sharp discontinuities in experience, it makes no sense to ask what happened to the "you". It also carries the implication that you are not your brain, but rather the abstract dynamical system instantiated in it.
Another thought experiment:<p>One by one your neurons are replaced by water. At which neuron are you no longer you?<p>The fact that there is no clear point in time to pinpoint a change doesn't imply that there is no such change... so your thought experiment doesn't really prove that the "you" is preserved.<p>In the end, the ship of Theseus both is and is not the original ship, and both things can be true at the same time.
When a person dies, not all of the cells in the body 'blink out' at once.<p>I've never died, but I imagine that a near-death person with 80, 60, 40, or 20 percent brain cell function is still in possession of qualia - lived experience of some sort.<p>I also imagine that this experience is diminished and likely otherworldly compared to my normal goings-on.<p>Finally, I imagine that these qualia 'fade to approximate black' as the living cell count trends to zero.<p>These neural-replacement thought experiments have never been convincing to me. What's proposed is that my brain dies - piecewise - and a new mecha-brain is born, piecewise. The period of interop between meat-me and mecha-me leaves me no less dead at the end, and only slightly less "partially dead" than in the normal death process ("more alive", or at least, "less bizarre", because my remaining living cells are receiving coherent inputs rather than random noise / silence).<p>Note that I'm not engaged in meat chauvinism here - I don't deny the potential consciousness of the mecha-me, but it's a different consciousness than mine.
Plenty of people (not tons, but many) experience loss of neurons through injury, disease, or aging. We generally consider the person to be the same person, even if their personality changes.<p>Likewise, as you grow in childhood, you create many, many neurons. Again, you are still you throughout the experience of growing up.<p>I suspect in your thought experiment you'd remain you throughout. Honestly, one community I'd bet does a lot of thinking about the self and what makes you "you" is the trans community, given their experience grappling with bodies and identity already.
My grandma got dementia, and she has deteriorated so much in the hospital that she is now unrecognizable. Yes, technically she is still my grandma, but at the same time she is not. I hope you understand what I am trying to get at.
Just wrote about this further down. I would be satisfied with identity retention even with ALL neurons replaced, if certain conditions were met.<p>I'll just copy/paste the relevant part so no one needs to follow the other comment* if they don't want to:<p>---<p>And finally, I would settle for a not-so-perfect copy-of-a-brain scenario. Magic "nanobots" replacing the neurons of a subject in vivo, gradually, over the span of a few days/weeks/months. The new neurons can be non-biological, but must work identically to the original ones, and obviously the connectome should be identical. The subject would be asked regularly if they feel OK with this process, and the part of the brain responsible for answering that question would be the last part to be replaced - otherwise it would be cheating, wouldn't it? On completion, I would assume it's the same person, and if it were me I would assume I'm the same person. This preserves continuity (of the self) to my personal satisfaction. Anything less than that is cryonics-level bull*.<p>---<p>Edit:<p>> clone instantiated in a computer.<p>Oh, and as far as clones go, I would assume it is the same identity. There would be no original or copy after this; all copies and the original are the same, like a string of bytes in memory. But... only at inception. Then, as each one accumulates more life experiences, the more they diverge and cannot be treated as the same person. And as such, I wouldn't let a clone like this anywhere near my imaginary partner for life.<p>* - <a href="https://news.ycombinator.com/item?id=47467300#47471320">https://news.ycombinator.com/item?id=47467300#47471320</a>
> one by one these neurons are switched off with their functionality offloaded to their clone instantiated in a computer.<p>We are not just brains in a jar. Our bodies provide rich sensory information and feedback loops without which we would definitely not be “us”.<p>To use the “Ship of Theseus” analogy: you are proposing to replace all the parts of the cockpit with new parts, but at the same time you are discarding the hull, the sails, and the oars. It would not be the same “ship” without those other things. You would need to replicate/simulate those parts very precisely too, otherwise you won’t have the same person.
>> This mind upload thought experiment convinced me…<p>Start at the color green and gradually add in some blue. As long as there are no sharp discontinuities there is no such thing as the color green.
During your sleep, we clone you fully, down to the brain connectome, and burn your original body.<p>On waking up, we show you a video of what happened. You feel like the same person as yesterday when you went to bed. Are you?
<a href="https://en.wikipedia.org/wiki/Chinese_room#Brain_replacement_scenario" rel="nofollow">https://en.wikipedia.org/wiki/Chinese_room#Brain_replacement...</a>
It seems impossible to me to determine whether a sense of self is an emergent phenomenon or an illusion without resorting to Descartes.
Glad to know I'm not the only one who thinks along these lines - and presumably, by extension, that teleportation by delete-and-rebuild mechanics would be mass murder.<p>You might no longer be you in the sense that you behave quite differently, but I remember being 16 and I don't behave like that either. The discontinuity is the point, as you eloquently put it.
> with equivalent functionality<p>This part here forces the answer. What is necessary for equivalent functionality? If you ask this question of someone who believes in immaterial souls, they can simply deny that equivalent functionality is possible in the first place.
Chip of Theseus
Permutation City, by Greg Egan. You don't even need the computer once instantiated!
Are you the same "you" that went to sleep last night? (Thought experiment: what if we die every night we go to bed? Seriously consider this for a moment.)<p>Are "you" a "you" at all?<p>What if "you" is just a prediction machine: predictive capability + history + planning + objectives + online learning? Your memories are just seasoning for the weights?<p>Personality, nostalgia, and frisson are just hallucinations of experience. Biochemical complexity feels alive and conscious in large numbers and at scale, but it's all simple physics.<p>Microtubule dynamic instability etc. etc. is just how evolution stumbled upon a runtime learning algorithm. It's not the only solution.
Re the first question, in case someone hasn't seen this yet:<p><a href="https://existentialcomics.com/comic/1" rel="nofollow">https://existentialcomics.com/comic/1</a><p>Subjectively, though, sleep 'feels' like it maintains continuity in a way. But of course that could just be a rationalization.
No amount of replacement removes your sense of self. "You" become a machine.<p>I think the main thing that makes this unrealistic is the scale. In theory it could also take place at a larger level of discretization: chunks of many neurons.<p>I think it also matters what the nature of the replacement is. Is it possible for you to retain memories and skills throughout this intrusive process? Presumably yes, if the computer has "equivalent functionality".