Uploading and Personal Identity, part 2
In Favor of Survival (The Optimistic View)
Earlier, we saw some problems with the thesis that you could survive being uploaded. Chalmers takes a different approach toward an optimistic view of survival, in an attempt to avoid those problems.
First, note that someone with an optimistic view must argue for TWO things:
(1) The result of uploading your mind would be conscious. (2) The result of uploading your mind would be you.
1. Uploads Will Be Conscious: Chalmers begins with the assumption that a nonbiological system could at least BEHAVE as if it were conscious. Even those who doubt that artificial intelligences are truly conscious generally admit that, in principle, we could program a robot to ACT as if it were conscious—i.e., have conversations, claim to see colors and shapes, say ‘ouch’ when it is harmed, and so on. (Though, note that some do deny this claim, stating for instance that there are algorithms in the human brain that are so complex that they simply cannot be translated computationally.)
Now imagine the following process:
Gradual Uploading: One by one, once a month, each of your neurons is replaced, either with a functionally identical silicon duplicate (a nanocircuit) or by being uploaded into a computer that is connected to your brain. (Chalmers writes, “The components might be replaced with silicon circuits in their original location, or with processes in a computer connected by some sort of transmission to a brain. It might take place over months or years, or over hours.”)
What will happen to you—to your consciousness? There are three possibilities:
(a) At some point, consciousness will suddenly disappear. (b) Consciousness will gradually fade throughout the process. (c) Full consciousness will remain present throughout the process.
(a) is Implausible: The idea that consciousness would suddenly disappear is odd. Consider an even simpler example: Imagine that your neurons are destroyed one by one. Is there really a sharp cut-off point where you are fully conscious, but the destruction of only ONE MORE brain cell would result in total loss of consciousness? That seems unlikely. Even less likely, then, is a sudden loss of consciousness when the neurons are merely REPLACED rather than DESTROYED.
(b) is Implausible: But Chalmers finds the idea that consciousness would slowly fade odd as well.
First, note that, by ‘fading’ consciousness, Chalmers means that either the quantity (i.e., the amount of information being processed) or the quality of the experience would be diminished (e.g., perhaps colors would become less vibrant, or become black and white, or the visual system would shut down while the auditory system remained, etc.).
Second, keep in mind that Chalmers has assumed that such a system would continue to BEHAVE as if it were conscious. So, if consciousness slowly fades, then what we have at stage 1 is you, the fully conscious individual; and what we have at the final stage is some artificial intelligence that BEHAVES as if it is conscious, but is not conscious at all (i.e., there is no “what it’s like” to BE that individual). Chalmers calls this kind of being a philosophical zombie. So, “On the fading view, these people will be wandering around with a highly degraded consciousness, although they will be functioning as always and swearing that nothing has changed.” He finds this highly implausible.
Conclusion: (c) is Most Likely: Chalmers thinks that what is most likely is that, during gradual uploading, consciousness will remain present throughout. Certainly, once people begin undergoing this process (which seems likely to happen in this century), we will interact with these individuals and they will SEEM conscious to the point that we will find it odd that anyone would ever doubt this.
[Though, note that, if he’s wrong, then a society where uploading is common would be something of a horror story, where individuals are murdered and replaced by silicon-based zombies, and no one will realize it!]
Brain Chauvinism: Ultimately, Chalmers thinks it is strange to believe that something other than organic brain matter (e.g., silicon) could never be conscious. For this belief requires something like “brain chauvinism” (a kind of unjustified discrimination akin to racism or sexism). Why should we think that only a carbon-based, organic BRAIN could be conscious? What makes it so special and unique?
2. Uploads Will Be Numerically Identical: Recall the story of BioDave and DigiDave. Building off of the above, Chalmers suggests that, during gradual uploading, not only is consciousness preserved, but so is personal identity.
Call the original Dave ‘Dave0’. When he has one neuron replaced during gradual uploading, call him ‘Dave1’. It would seem odd to claim that Dave no longer exists—as if a person would suddenly go out of existence if only a single neuron were destroyed or replaced! So, Dave0 = Dave1. But then, likewise, Dave1 = Dave2. You can probably see where this is going. This is a Ship-of-Theseus-style gradual replacement scenario.
Argument for Survival of Gradual Uploading
1. Gradual Replacement Principle: If you take a conscious person P1, and replace a single neuron by uploading it in order to obtain a conscious person, P2, then P1 = P2 (i.e., they are numerically one and the same person).
2. Transitivity of Identity: If A=B, and B=C, then A=C.
3. Therefore, P1 = Pn, where Pn is the person who has been fully uploaded.
Recall the Theseus case. The conclusion is entailed by premises 1 and 2 because, based on gradual replacement: while P1 = P2, it is ALSO the case that P2 = P3, and it is ALSO the case that P3 = P4, and so on down the line so that
P1 = P2 = P3 = P4 = … = Pn
So that, by transitivity,
P1 = Pn
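The logical skeleton of this argument can be made fully explicit. Here is a minimal sketch in the Lean proof assistant, where the names `Person`, `p`, and `step` are illustrative: `p i` stands for the person after i replacements, and `step` encodes the Gradual Replacement Principle. Lean verifies that, given transitivity of identity (built into equality), the conclusion follows by chaining the single-step identities:

```lean
-- Hypothetical formalization of the gradual-replacement argument.
-- `p i` is the person after i neuron replacements; `step` says each
-- single replacement preserves identity (the Gradual Replacement
-- Principle). Transitivity of `=` then yields p 0 = p n for any n.
variable {Person : Type}

theorem gradual_survival (p : Nat → Person)
    (step : ∀ i, p i = p (i + 1)) : ∀ n, p 0 = p n := by
  intro n
  induction n with
  | zero => rfl                          -- p 0 = p 0 trivially
  | succ k ih => exact ih.trans (step k) -- chain p 0 = p k with p k = p (k+1)
```

Note that the proof leans entirely on the premise `step`: deny that even one single-neuron replacement preserves identity, and the whole chain collapses. This is exactly where the Ship-of-Theseus-style objection below applies pressure.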
Toward Instant, Destructive Uploading: But Chalmers goes further. In our Gradual Uploading scenario, your (or Dave’s, or whoever’s) neurons are replaced one per month. But what happens if we speed up the process? What if they are replaced once a day, once an hour, once a minute, once a second, or even once a nanosecond?
Is there a point at which the replacement becomes so fast that it no longer preserves identity? Chalmers thinks positing such a sharp cut-off is counter-intuitive. So, the natural conclusion—since the limit of speeding up this process is simultaneity—is that you would survive being uploaded, where the original is destroyed during the process, even if that process were INSTANT!
Conclusion: Chalmers concludes that it is quite plausible that you would survive gradual uploading, and that there is also at least a prima facie intuitive case to be made for survival in instant, destructive uploading.
Note: If he is right, then this is remarkable news. For, it means that YOU in your lifetime may very well be faced with the option of becoming immortal.
Objection: You might be thinking, “Wait. Hold on. I believed that Theseus’s ship survived when the planks were replaced slowly, one by one, over time. But, speed this process up enough and I no longer believe it. Certainly, instant replacement involving destruction of the original planks and instantaneous replacement by all new planks would not result in the same ship!” The same could be said of instant destructive uploading. Something fishy is going on here.
[What do you think? First, how fast can the replacement in Theseus’s case be before the identity of the ship is no longer preserved? Second, is the Ship of Theseus case analogous to the Gradual Uploading case? Why or why not?]
Reconstructive Uploading: There is one more future technology worth considering. Several companies are already working on programs that can scan all of your texts, tweets, pictures, emails, and so on—in short, all of the data that we have of your life—and “reconstruct” you after death. People will be able to interact with this reconstruction, have conversations with it, etc., and it will respond as if it were you! [Note that this is the plot of Black Mirror, season 2, episode 1, “Be Right Back”.]
Question: Will this reconstructed digital persona be YOU? Chalmers notes that, if you accept that you survive instant destructive uploading, then you might also accept that you survive in reconstructive uploading. For, the only difference (if any) might be a time delay; e.g., the person dies (is destroyed) and then the upload/reconstruction occurs 10 years later rather than instantaneously.
[Is that right? Can the person that is YOU be boiled down to a bunch of behaviors culled from emails and pictures and videos, etc.? Or, rather, is there something special about the actual, complete set of synaptic connections in your brain that would need to be reconstructed as well?]