
Identity: Does it exist?

December 8, 2011

Our recent questions, directly or indirectly, have implied that animals in some sense share our human identity. Animals may have rights–to live, not to be eaten, not to suffer. If animals have rights within human society, does that imply that they are “part human,” or that animals and humans share a common philosophical identity? Do they have individual minds?

What happens once we upload our individual minds? Contrary to arguments against the Singularity, I believe mind upload is inevitable. We’ve already made digital bacteria, and arguably a cockroach. Humans aren’t far behind.

The British philosopher Derek Parfit argues that individual humans already have no individual identity as such; that identity or “selfhood” is not a unitary thing, but a continuum that can exist by degrees. For example, suppose the right and left halves of your brain are separated–we know what happens: it’s as if two conscious beings inhabit the same body. But now suppose that each half-brain could be transplanted into the brainless body of another person. Each thinks of itself as “I,” sharing the same memories and history of self. Yet there are now two physical persons, each departing from the other on a separate path through time. What happens? Are they both “you”? Or neither? Has the original dual-brained “you” ceased to exist? Or are there two of “you”?

Parfit’s point is precisely that there is no answer–that each of the new entities, and the original, has something of “you-ness,” and even that some “you-ness” continues to exist after your death. The mind uploaders think this will all somehow be ok, so long as “we can let go of Boolean logic.” Do you agree?

33 Comments
  1. Petter permalink
    December 9, 2011 3:21 am

    Identity in western culture has been too strictly limited to what happens inside our body. The expectations of our peers and the context they provide have not been defined as part of our identity, but in my opinion they should be. Several drug abusers and criminals work well in society when they are removed from their original context (e.g., school ships, going fishing in a remote area), but when they return they go back to their old habits. The context is part of their identity, and when relationships and expectations are changed, the behaviour and identity change radically too.
    The same goes for our electronic environments, and they are gradually increasing. So far most decisions originate from my physical body, but not all (Gmail’s spam filters make hundreds of decisions for me each day). As my electronic devices become more and more advanced I expect them to make more decisions and eventually be self-improving in their tasks. As I get older I might want to let those devices network with each other and do more and more without my biological self getting involved. Eventually my biological body will wither away and die, but by then I don’t think it’s inconceivable that an autonomous electronic identity can and will do most things that I can do now, and much else as well.

    • December 9, 2011 12:34 pm

      Fascinating–a new concept, to me anyhow. It would be a thousand times worse than having a deceased person’s address in a mailing list that keeps sending junk mail years after the person is gone! It might even become difficult to ascertain whether someone is living or dead, if we ever get so i-connected that we stop seeing each other face-to-face. One might have to get a warrant to certify someone as alive or dead–great fodder for one of the darker SF stories!

  2. frances permalink
    December 9, 2011 4:47 am

    I’m curious how one would upload (for want of a better term) body memory, in part covered by all the ~ceptions (proprioception, nociception) as well as muscle memory. What about the endocrine system? It seems peculiar (as well as old-fashioned) to me that there exists a neutral ‘me’ residing in a consciousness somewhere that can be separated from these other physiological parts of myself, and which can exist in a recognisable state, or at all, without them.

    • December 9, 2011 12:39 pm

      It’s all a matter of degree–if we use the two half-brains example, the ~ceptions are still there (or are in fact doubled), but for digital-upload personae, brain-synth OS’s would have to allow for and include the less blatant perceptions: muscle memory, the sense of time passing, etc.
      In fact, you have hit on a great challenge–drawing a border between mentation and sensation might lead to the conclusion that no set border exists–that a person is WHAT they are as much as what they THINK or FEEL they are.

  3. Jon permalink
    December 9, 2011 4:50 am

    This enthusiastic post is marred by a number of questionable assertions. Moving the meme of uploading persons, or here identities, from science fantasy to science should be accompanied by at least some attempt to explain the fundamental problems that suggest it isn’t theoretically possible. This is important, I believe, when writing for a general audience that may uncritically believe assertions, as opposed to facts or hypotheses so well buttressed, conceptually and perhaps experimentally, that they are clearly on the verge of acceptance as scientific theories (not theories in the unfortunate non-technical usage of the term).
    First, the concept of “copy” should be explained in view of at least two or three major problems. To begin with, explain why the Heisenberg Uncertainty Principle does not preclude making an identical copy of a given brain’s precise structures and content, down to the presumed level of specificity required to reproduce any acceptable degree of functioning mental activity that reasonably qualifies as a conscious mind…let alone an identical mind to the original.
    Second, explain how it is theoretically possible to avoid the no-cloning theorem, a familiar part of many modern quantum textbooks. Essentially, the (proven) theorem shows that it isn’t possible to copy an unknown quantum state. Any attempt not only destroys the quantum state, it does not produce a copy. There is a theoretical possibility of producing something similar to the original state, but not identical. This is a severe problem if you believe there are any quantum states at all involved in whatever processes and structures in the brain are necessary to the unknown generation of consciousness and mind. To rule out quantum states in processes, for instance at the synapses and their functions in concert with the neurotransmitters within the associated vesicles and the electron flows, would appear problematic. Given the vast number of quantum states likely to be present within the complex web of the brain’s biochemistry and electrical functioning, the multiplication of errors across each of that vast number of quantum states, were even a near-cloning of each quantum state possible, would result in a “scan” enormously different from the now-destroyed original.
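    (For readers who haven’t met it, the standard textbook argument is short, and nothing in it assumes anything about the brain. Suppose a single unitary operation U could clone two distinct unknown states, U(|ψ⟩⊗|0⟩) = |ψ⟩⊗|ψ⟩ and U(|φ⟩⊗|0⟩) = |φ⟩⊗|φ⟩. Since unitaries preserve inner products,

    ```latex
    \[
      \langle\psi\vert\phi\rangle \;=\; \langle\psi\vert\phi\rangle^{2}
      \quad\Longrightarrow\quad
      \langle\psi\vert\phi\rangle \in \{0,\, 1\},
    \]
    ```

    so the two states would have to be either identical or orthogonal; no single operation can copy an arbitrary unknown state.)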
    Another problem that is ignored, and that a previous reply here may have touched upon, is the concept of embodiment and its relation to mind, or more precisely to one’s concept of self…or “identity”. Consciousness and mental states are mediated by more than the brain alone, as various experiments have demonstrated in recent years.
    Lastly, the assertion that split-brain persons are two different identities that nonetheless share “the same” memories and other characteristics, and that they could be placed in two different bodies, is fraught with unexamined assumptions. Not least problematic is the question of the same memories. We are speaking of two hypothetically different selves, although experimentally it is differing knowledge and perception regarding the same event that have been shown in the lab. Accepting for discussion that this means there are two separate selves or identities, they clearly have different memories of the lab events, and by extension presumably of other events beyond the lab. Therefore, they clearly would not share the same memories!
    Reasonably, what we intend when toying with the idea of making copies of ourselves is that we would be creating another consciousness, another mind that is termed “identical” to our current, original self – however we may define that. However, despite much brain research and millennia of philosophizing, no one really knows how (embodied) brains generate or are endowed with consciousness, or how terms such as “mind” actually relate to the brain and to what is actually transpiring.

    • December 9, 2011 12:53 pm

      I agree, as far as you’ve gone–however, if the brain is more breadboard than quantized, the mere electron flow may be sufficient to create a digital copy of a brain. Also, you are too rigid about the syntax–‘duplication’ is not identity–a copier machine does not reproduce an exact copy of an object, merely the ‘significant digits’, i.e. the text and images.
      To ‘copy’ a mind can be interpreted in many ways–memory-only (as a ‘reference book’), personality only (entropisms, atropisms, opinions), or as a download meant for a cyborg-physical environment (a digital immortality, if you will). None of these would be true copies, but all such concepts may have their uses. Also, I feel that your concerns as to ‘destroying the original’ are unfounded. I for one would never volunteer for any such procedure, even if its reliability had been proven over years–this is the ‘Dr. McCoy’ effect–‘I have lived this long with my molecules left where they are–I see no reason to let a computer scramble them for me and reassemble them somewhere else!’

  4. December 9, 2011 10:12 am

    Sounds somewhat like holography: Cut a hologram in half, and you have a smaller or blurrier image of the whole.

    Not entirely accurate, though. The brain isn’t 100% redundant across both halves; most people would lose some capabilities with only a half-brain mind. [Others, say, politicians, might end up with a net benefit, but that’s another show.]

    • December 9, 2011 11:06 am

      Yes, I think the hologram is a good model. The separated left and right brains would each have a view of the same whole, but each view would be missing various pieces throughout.

  5. December 9, 2011 10:57 am

    A large number of fascinating points here–I’ll try to respond to some.

    Frances mentions the endocrine system and other aspects of the body that would not be “uploaded” with brain information. There are two ways to look at this:
    (1) Brain memory upload would not include your current perception of body state, only memory of (some) past states. It’s unclear how your future “virtual existence” would compare to the bodily existence, unless the network somehow mimics all the physiological inputs.
    (2) Perhaps the upload would have to simulate the entire body, not just the brain? I wonder how much more complicated that would be. Maybe a lot–or maybe not. The brain itself is the most complicated part of the body (we think).

  6. December 9, 2011 11:03 am

    Jon:
    (1) About copying, I agree that I don’t see an “exact” copy happening. However the brain is “copied” or simulated, it won’t be exact, it will be more like a photocopy, losing some information and introducing some. But doesn’t that happen to us today? Am I “exactly” the person I was twenty years ago? Or even twenty minutes ago, before I read your post? 😉

    One could argue that every moment we are “uploading” ourselves, copying ourselves into the next self–with loss and gain of information.

    (2) I am convinced that all of what we are is a function of molecular processes. As Picard said, in his defense of Data, “We are machines.” The more we learn about the brain and our senses, the more of what we know can be explained mechanistically as the inputs of cellular and molecular processes. By extension, eventually enough about these will be known to put together a simulation–a simulation good enough to know that it exists.

    • December 9, 2011 1:11 pm

      I see that I should have read the whole thing before replying–I repeated (but less cogently) many of your remarks, Joan. I know that, as an SF fan, I’ve seen authors’ conceptions of brain/computer connectedness and/or replication go from the simple to the sophisticated. Some stories posit a digital world-in-a-box, wherein the entire population of a world is sucked into a processor that simulates their entire existence (talk about needing to make backups). Some stories suggest that our Universe, and us and all of reality, IS a simulation being run by some ‘god of the multiverse’. Some stories focus on the personal feelings of people who have been translated into a mechanistic medium such as a cyborg–how they’ll feel about the unavoidable differences between their ‘real’ (original organic) existence and their new (synthetic) life–the return of lost limbs or lost brain functions, or the adjustment to having a vastly more powerful mind (with total recall, near-instantaneous computational power, and unlimited reference data and online connectivity)…
      When I saw the news, not too long ago, that a paraplegic had a chip implanted in his brain that enabled him to move a ‘mouse’ on a monitor display, I thought of it as congruent to that moment in 2001: A Space Odyssey, when the ape begins to use the big leg-bone as a club.
      On-board interfaces and memory upgrades are not far off–then it’s off to the races, with who knows what kind of society at the end…

    • February 1, 2012 5:31 am

      I’m glad nobody complains about the ethics of “constraining” human neurons, arguably the noblest cells in the human realm, to express themselves within the limits of a mouse brain. Now how about the reverse experiment: would you like to be “enhanced” by mouse neurons? By the way, this experiment will teach you new tricks, like the ability to recognize at once the unique signatures of thousands of CHEESES.

  7. December 9, 2011 1:20 pm

    Having had my own perception-suite and brain functionality altered by illness, in some ways permanently, I have a more flexible image of brain processes than people who have taken their minds for granted all along. When the mind works well, one has little awareness of it–when it begins to error-out and fail at key moments in one’s daily life, it prompts examination of the hitherto perfect brain. This is similar to the physical changes of aging–as young people, our bodies rarely do anything other than grow and improve in power and control–as older adults, we come to spend a lot of time noticing the changes that come from reduced telomere length.

  8. Jon permalink
    December 9, 2011 4:15 pm

    Thank you for the replies. These are interesting topics. The use of a Xerox or other paper copy is a nice try at a verbal analogy. However a moment’s reflection upon the science and math involved… which is my primary point… demonstrates the huge gap between a sheet of paper and a brain.
    You are correct, and we agree that duplicates – copies that relate to the concept of identity in the sense of preserving a unique mind, however generated or endowed – are not theoretically possible (outside of fiction). Consider the implications of a paper copy compared to a human brain (let alone the body, which we are now learning is intimately involved in the processes termed self and consciousness).
    By copying a person – uploading that person as opposed to cloning a body – we usually are asserting that it is first the information that is being copied. Information copied closely enough to be considered a faithful copy of the original, and recognizable enough both to the copy and to other persons as possessing the “same” mind, the same self, the same identity in the context of this discussion. Otherwise, we are discussing the creation of twins in the everyday sense. And twins are not the same person. Even the genetic-level copy in identical twins is far from producing a satisfactory backup, an immortal entity. Point being, the degree of exact, identical copying of information matters.
    Back to the sheet of paper. Calculate the number of bits of information potentially stored on one sheet of paper. You might make it astronomically easier by only considering the number of spaces on a standard typewritten page and the possible letters, numbers and typographical symbols that could occupy each space. It is a large number, and represents the number of bits of information we could store on one page of typing. Accuracy at this level is not especially demanding, although poor copies are clearly not “the same” as the original.
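    As a hedged back-of-the-envelope sketch of that page calculation (the character counts and symbol set below are assumptions, chosen only to show the scale):

    ```python
    import math

    # Rough information capacity of one typed page.
    # Assumed figures: ~80 characters per line, ~60 lines per page,
    # and ~95 printable symbols possible in each character position.
    chars_per_line = 80
    lines_per_page = 60
    symbols = 95

    positions = chars_per_line * lines_per_page        # ~4,800 character slots
    bits_per_position = math.log2(symbols)             # ~6.6 bits per slot
    page_bits = positions * bits_per_position          # ~31,500 bits, roughly 4 KB

    print(f"{positions} positions x {bits_per_position:.1f} bits ~ {page_bits:,.0f} bits per page")
    ```

    A few kilobytes per page, in other words: large for a verbal analogy, but nowhere near the brain.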
    Now back to the brain. We’ll leave out the body for simplicity. I think we would agree that the level of information important for a hypothetically copied human self – such that other persons, as well as the copy itself, might mistake it for the original – is significantly finer than that of typed letters scaled to a standard sheet of typing paper. It is orders of magnitude finer.
    Now we face the first problem I asked about, the impact of the Heisenberg Uncertainty Principle. We agree that there can be no exact copying of the information on the minute levels needed to capture the unique structures at the level of the billions of synapses and associated structures that contribute to making your brain different from your mother’s brain or anyone else’s.
    The barrier to making a recognizable copy of the information content, including the memories possessed by the original – which implies preserving the spatial and process relationships of all of that information – is that the errors in copying are not only vast, they pile up at least geometrically. Errors will enter not only into the copying of each bit of information regarding the structure, the location, and the orientation of that bit; the vital level of the web of interconnectivity that must define in part the uniqueness of your brain from any other will also depart from the original in leaps and bounds as the underlying errors – differences – multiply.
    In other words, miscopied bits on an order of magnitude far exceeding the number – billions, trillions – of neurons, let alone synaptic connections, let alone levels more fundamental, are simultaneously creating massive numbers of errors in the duplication of relationships within the brain’s vital structures that we need in order to suggest that we have “copied” a person’s self or original. Sadly, those relational errors also compound in mind-staggering numbers. You end up with data representing a brain, but unrecognizable as the “same” brain.
    The verbal analogy of copying a printed page or picture with copying a human brain and whatever it is that you identify as its associated consciousness, its self or selves, simply doesn’t work.
    Further, I seriously doubt that one can dismiss quantum states within the brain contributing to the nature of that brain’s identity with a consciousness or instantiated mind and awareness. Quantum mechanics cannot simply be asserted away. Words and verbal analogies are only potentially useful when the scale of the comparators is not only on the same level, but within a classical physics model and not on the quantum level. The no-cloning theorem eliminates any potential for duplicating, copying a person in the way fiction and speculation intend. Star Trek scripts are simply not usable!
    Lastly, the problem of consciousness, of mind, has been glossed over. One may prefer or believe that the mind is an emergent property of a brain, or one may believe that it is a fundamental entity, albeit non-physical. In either case, one is dealing with severe problems that no one today has understood, let alone measured. The emergent position is fraught with physical and philosophical issues that so far render it unconvincing. We do know that even on basic levels of experience, such as sensory inputs, the physics and biology of a certain wavelength is not “the same” as your experience of “red”. The mysteries of instantiation remain. Similarly, consciousness as a fundamental property in the universe raises knotty and serious problems, not least of which involve the points of contact with the physical world, presumably via the brain. How we could “invite” the “same” consciousness, following the destruction of the original, into a different person that is a non-duplicate is unknown. Similarly, if consciousness is emergent, the newly emerging consciousness, with a different set of memories in a different brain – significantly different due to the unavoidable piling up of copying errors – would be more than difficult to label essentially as “the same”. And if quantum states are necessary to the functioning of the brain and consciousness, then we are further barred from making reasonable facsimiles.
    A number of other problems exist, but I was hoping to show the need to be careful not to stray far from the implied math and underlying science when speculating about complex science issues. For instance, you can take a stab at calculating the quantity of information extant in your consciousness at any given moment by analyzing the field of vision (and similarly, hearing, taste, etc.) and what must be the minimum number of bits needed for your conscious perception of that frame. Human optical resolution is known, and the minimum information content in bits of the full field of view when staring at a single-color wall can be calculated. Visual consciousness can be calculated for at least a 30 to 50 frames per second bit rate. These parameters can begin to provide another angle for getting at the order of magnitude of information required to reproduce a mind – let alone a person.
    You can choose a copying error rate per bit, based on the physics involved in any of these approaches for attempting to copy a facsimile of a particular person. When applied to any calculated number of bits of information involved in a hypothetical copy of a person, you can see that the resulting “copy” (of what can even be theoretically copied) is simply not presentable as anything close to the original. In other words, you might at best come up with a Bizarro World version of the original, possessed of a cryptic, incomplete jumble of memories, different and perhaps flawed thought processes, fine-level structural errors within the neural processes, and possibly gross-level misshapen anatomy, if you believe in emergent consciousness and thus are attempting to create a duplicate of the original brain.
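    To show the kind of arithmetic I mean, here is a hedged sketch; every number in it (visual resolution, frame rate, synapse count, bits per synapse, error rate) is an assumed placeholder, chosen only to expose the orders of magnitude:

    ```python
    # Back-of-the-envelope estimates: (1) a crude visual-information rate,
    # and (2) the expected number of miscopied bits in a hypothetical brain scan.
    # All figures are assumptions for scale, not measurements.

    pixels = 10_000_000            # assumed effective resolution of the visual field
    bits_per_pixel = 8             # assumed intensity/color depth per point
    frames_per_second = 40         # within the 30-50 fps range mentioned above
    visual_bits_per_second = pixels * bits_per_pixel * frames_per_second
    print(f"visual stream: ~{visual_bits_per_second:.1e} bits/s")

    synapses = 1e14                # commonly cited rough order of magnitude
    bits_per_synapse = 10          # assumed bits for strength, position, orientation
    scan_bits = synapses * bits_per_synapse

    error_rate = 1e-9              # assumed (optimistic) per-bit copying error rate
    expected_errors = scan_bits * error_rate
    print(f"scan of ~{scan_bits:.1e} bits at {error_rate:.0e} errors/bit "
          f"-> ~{expected_errors:.1e} miscopied bits")
    ```

    Even with an error rate that optimistic, the absolute number of miscopied bits runs into the millions, before any compounding of the relational errors described above.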

    • December 9, 2011 10:00 pm

      You’re right that a lot of bizarre jumbled “copies” will result.
      But doesn’t life do that anyway? As we age, for instance, don’t we become deteriorated “copies” of our earlier selves?

      • Jon permalink
        December 10, 2011 4:21 pm

        Joan, yes, thanks. Many jumbled copies would result, if any means were available, theoretically, to attempt to upload a backup or replacement of oneself. However, you seem to conflate that non-existent process with your and our current, real, biological, holistic existence. So, no, I couldn’t agree that these two concepts are “the same”. One is an actual working biologically derived reality, while the other is an artificial hypothetical construct which analysis shows to be flawed on several levels. You can’t indict the actual entity, our reality, by loosely asserting that the hypothetical model’s flaws are solved because those same flaws apply to the real world, to human beings and their actual brains.

        I apologize for the length of my previous notes. The short version would be: first look carefully at the science; then actually do the math. Even back-of-the-envelope approximations will deliver a “face validity” check on a speculation or hypothesis…a sense of the scale of the problem, or other useful information, including in this case a sense of the two scales you are attempting to compare. Also, don’t mistake the attempt to somehow duplicate a brain and its contents artificially for the actual brain and person, with all of the (often unknown) processes that continue through time.

        The question of identity is a real and profound one. Asserting the uploading of one’s identity, or self, or person – all very fuzzy, ill-defined concepts – runs into the assertion of copying at least some aspect of the brain, and assumptions about consciousness and so forth. That considerably muddies the useful discussion of identity by introducing various unexamined assumptions that are often false and misleading.

        Oh! I have to mention one other assertion in the original post! It bugged me…we can upload “our selves” in the future because creating bacteria or something like that has been done! The question is one of identity, copying, etc., not constructing a life form or an artificial version of one. That is different from duplicating an elusive entity like identity, as well as consciousness (if, indeed, those two terms are separate).

  9. SFreader permalink
    December 9, 2011 5:49 pm

    The few real-life examples of an original and a ‘copy’/duplicate that I’m aware of (identical twins, conjoined twins, and clones – Dolly, and a few other animals) seem to point to the conclusion that any difference, no matter how trivial, results in a different person/personality. And, individuation increases with time/distance/experience.

    SciFi obviously has dealt with this notion many times and, depending on the author or script writer, reached different conclusions. For example, in the Star Trek ‘Second Chances’ episode, Riker becomes two persons due to a transporter malfunction: the Will who arrives on the Enterprise is deemed to be the true or original, while the Riker who remains/is abandoned on the planet is later rescued and renamed Thomas. This episode essentially takes the position that social consensus determines personhood – the individual in question does not have a say in determining their own identity even though they are granted personhood status. In contrast, Richard Morgan’s fiction often includes personality transfer across light years. The personality is electronically(?) encoded then beamed to an awaiting body (sleeve). In Morgan’s fiction, the person is the personality; the body is incidental. Robert Sawyer looked at this question in Terminal Experiment from a slightly different angle – trying to tease out what bits make us human/who we are by using three electronic variants of the same uploaded person. (The original human remained and acted as a control.)

    What most of the authors/writers seem to agree on is that the manner/mechanism of splitting/copying a personality is not the most relevant aspect in defining personhood. I agree with this and also feel that we should separate personality from person because ‘person’ is too all-encompassing. Plus, the technology speculated is likely to upload only the personality – that is, the brain activity and whatever might be recorded/encoded in the brain.

    Personality uploading is, I think, the flip side of cryogenics/cryonics. And, as cryonics may be feasible sooner, it will likely be the test case for evaluating and establishing the boundaries of personhood.

    What limitations would you set on uploading a personality? Would you immediately destroy the original? How many ‘lives’ does a person have a right to live concurrently? Wouldn’t you do an electronic back-up of any personality transfer just to be on the safe side — and if because of a glitch you ended up waking both, would you kill one off because only one John Doe legally existed at the beginning of the transfer?

  10. December 9, 2011 10:04 pm

    IMHO, most science fiction fails terribly to address the copying issue.
    If Riker becomes two persons by mistake, why does that never happen to anybody else? Why not ten or a hundred persons?

    I can’t imagine anyone really “destroying the original” — especially since we’ve agreed here that the two copies won’t be “exactly” the same. Indeed, the moment a copy is “born” it starts acquiring new experience that makes it a different individual.

    • SFreader permalink
      December 10, 2011 3:20 pm

      “… most science fiction fails to address the copying issue”

      I think that’s because most SF writers use ‘copying/uploading people’ as a trope/convenience. Going back to the Star Trek example, Gene Roddenberry didn’t have enough budget to construct shuttle sets, so he came up with a ‘transporter’ to get Enterprise crew to wherever the action was. It was only during ST-TNG, when budgets were more generous, that Roddenberry gave a nod of appreciation to the show’s hard-sci fans by elaborating on various technologies, including the transporter. BTW, the failure rate was touched upon in Realm of Fear, where Barclay sees ghostly worms – people who had been trapped during transport and never reached their intended destinations. Some safety statistics were also mentioned.

      From the other contributors’ comments above, it appears that a personality transfer/duplication process would indeed require a much deeper understanding of the hard sciences than most people – myself included – have. However, since most technology usually becomes widely distributed, even non-scientists would need to join in the discussion as it broadens to consequences such as social structures, personal/family relationships, work, environment, economics/trade, medicine/health, etc.

      Assuming that such a technology is feasible, at some stage people would want a test to prove that the person is who they are supposed to be. I’ve a couple of ideas but would be interested in hearing what the hard sci folk here would propose.

      • Jon permalink
        December 10, 2011 4:39 pm

        Might be good to clarify whether your post is intended to focus on questions of identity, or of copying one’s identity or self, or of science fiction methods.

        If copying is really at the root, then better to focus on SF and excuse the use of copying, which violates physics, just as FTL travel is excused as a tool needed for constructing a story. Or pick your own violation of the laws of physics (inertialess drives?) as a comparison. In other words, a device to aid the suspension of disbelief.

        On the other hand, if seriously discussing questions of identity, or of the possibility of copying a person for immortality or other reasons – travel, etc. – then I must reiterate that any assertion of the latter would violate several established laws of nature or physics. Better to discuss identity issues without falling into copying issues. In my opinion. Despite the fun of copying implications!

    • Jon permalink
      December 10, 2011 4:47 pm

      The issue is that in any attempt to scan the brain for the purposes in this discussion of copying, the original does get destroyed. In other words, it isn’t a legal or ethical question, rather a practical result of the process.

  11. December 10, 2011 8:39 pm

    Let’s look at some different models for “copying.”

    A perfect “copy” of a person, atom for atom, is obviously impossible, due to quantum constraints.

    On the other hand, a perfect copy of a data file is perfect (most of the time); our entire computing system depends on it. Do we agree on that?
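    To make that contrast concrete, here is a minimal sketch of how we routinely confirm that a digital copy is bit-for-bit identical to its original (the file names are placeholders); nothing analogous exists for copying an unknown quantum state:

    ```python
    import hashlib
    import shutil

    def sha256(path: str) -> str:
        """Return the SHA-256 digest of a file, read in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Copy a file, then verify the copy is bit-for-bit identical to the original.
    shutil.copyfile("original.dat", "copy.dat")   # placeholder file names
    assert sha256("original.dat") == sha256("copy.dat"), "copy differs from original"
    print("digital copy verified identical to the original")
    ```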

    So, does the “identity” of a person depend on the atomic level, with all the implied quantum weirdness? It cannot–otherwise, you would be a completely different person from who you were in the previous nanosecond. So is a person more like a defined data pattern than a quantum construct? Could an uploaded “person” be as similar to the original as you are to yourself a year ago?

  12. December 10, 2011 8:43 pm

    Hmm. I’ve got two takes on this issue.

    My first novel Scion of the Zodiac was really about identity and how you transmit it to colonize a new planet. The system I created assumed that physical uploading (creating a full record of your brain and body) is kind of silly. I postulated instead an intra-cranial computer system that logs your life. The system also taught itself to be you, so well that it could predict what you would do in all situations you’d been exposed to. The system thinks as you, and it may even think it is you. To create a download of yourself, you implant a copy of this system in the head of an infant clone of you (the scion of the title). As the child grows, the system teaches the child to be you, along with logging all of its life experiences. The system can continue indefinitely. The system can also be hacked (on which a story depends, if you’re interested in learning more).

    There are a couple of reasons I made this system. One was that I wanted to figure out what minimal uploading was necessary to ship a colony of humans to another star. The less space it takes to store a human, the more humans you can ship, and sending a bunch of expert systems is better than sending enormous, cell-by-cell records. Also, I think that reincarnation by building an adult human cell by cell is about as stupid as building a 30-year-old car with the engine running. Given a choice of how to reincarnate, I’d rather come back as a child anyway, because children are better at adapting to new worlds. Perhaps that’s why Jesus was born in Bethlehem, instead of arriving as an adult in a blaze of heavenly glory? Notice also that the angels never stay here long. (Christmas reference. Sorry.)

    The second answer is that I agree with the Buddhist belief that self is an illusion. I first really experienced this when I was doing research on Glomalean fungi. These are among the most common fungi on the planet, and they’ve got a bizarre cellular structure: they’re coenocytic, meaning there are few cell walls within their mycelial bodies, and many nuclei within each “cell.” Worse, the evidence when I was working on them seemed to say that the nuclei within this soup were genetically distinct (they have multiple distinct rDNA sequences per multinucleate spore; why is unclear).

    I tried to imagine myself as a little glomalean fungus: if you cut this fungus in half, there are now two of them, so being a physical individual is meaningless to them. Are they genetic individuals? Possibly not. What’s an individual in this fungus? It’s a meaningless question. And try visualizing yourself as such a being. The fact that they’re among the most common organisms on the planet strongly suggests to me that individuality is over-rated.

    If you look at what makes a human, we’re basically accumulations of history, memories, patterns, conditioning, and a strongly panicky reaction when anyone suggests that this conglomeration is anything other than one solid individual. The Buddhist point is that recognizing this reality is a way to happiness, or at least not suffering any more. There’s something to it.

    • December 10, 2011 9:07 pm

      Hm, I wonder what system could possibly teach a child to be “you” rather than follow its own incorrigible devices. 😉 Sounds worth reading though.

      Does your fungal example really show that self is meaningless–or distributed? What if “self” is distributed amongst your past and future selves, your community of interactors, and their past and future as well?

  13. SFreader permalink
    December 11, 2011 11:28 am

    I assume that if personality transfer (or duplication) technology were seriously considered, it would probably proceed in increments with each step of the process studied in detail.

    Has there been any research conducted among people awakening from extended comas (several years) that examined how their personal identities changed? Since the quantum states of their passive/dormant brains would have changed moment to moment – probably as much as anyone else’s – they could be good subjects for studying what happens when you put a brain on ‘hold’.

  14. SFreader permalink
    December 11, 2011 1:21 pm

    I’m now wondering just how much of the ‘brain’ actually contains an individual’s identity. Probably not all of it. If the brain is comparable to a computer – the most common analogy – then a good chunk of it is “platform”: the individual’s personality would be comparable to the content added over time – files and apps – just like in one’s laptop.

    This brings up another point. Transferring between computer platform versions is usually okay for maintaining data/content integrity. However, switching between generations/versions within the same platform (e.g. Windows XP to Windows 7) can sometimes be tricky, with some old/obsolete apps and their associated content files becoming lost or unusable due to incompatibility. I imagine that data loss/incompatibility might be worse switching platforms, i.e., Windows XP to Apple/Mac OS. So this suggests that identifying the critical compatibility connections/centers would be key and that new-gen “personality containers” would probably need to maintain and ensure compatibility of ‘brain/personality’ platforms for the future.

    This comparison to a PC also suggests that we might need a defrag function for our brain’s content to get all of the relevant “individual” bits into readily machine-identifiable and coherent chunks to make moving the personality faster/easier.

    A third potential need is a safe way to either move or update independent ‘apps’ – things that enable us to do something (i.e., procedural memory such as speaking English or playing the piano) but that may not be stored in the same data set as the experiential/autobiographical “personality” content.

    • December 12, 2011 1:07 am

      The system I’d thought up was basically a collection of data (readouts from neurons, correlated with a couple of tiny external cameras and mikes), plus annotations to turn the data file into a standard data format (think of it as consciousness markup language) for upload, data transfer, and conversion into a pure upload personality. I also invoked optogenetics, so the transfer of information to and from neurons could use light, rather than electrical impulses. Basically, I’m assuming that the critical, complex part is the interface between computer and brain.
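      Purely as an illustration of what one entry in such a lifelog format might look like, here is a hedged sketch; the field names are invented for this comment, not taken from the novel:

      ```python
      from dataclasses import dataclass, field, asdict
      import json
      import time

      @dataclass
      class LifelogRecord:
          """One hypothetical entry in the fictional 'consciousness markup' log."""
          timestamp: float                     # when this sample was taken
          neural_readout: list                 # raw values read from monitored neurons
          camera_frame_id: str                 # reference to an external camera frame
          audio_clip_id: str                   # reference to an external microphone clip
          annotations: dict = field(default_factory=dict)  # interpretive tags added later

      # Toy example record, serialized to a standard interchange format.
      record = LifelogRecord(
          timestamp=time.time(),
          neural_readout=[0.12, 0.87, 0.05],   # placeholder values
          camera_frame_id="cam0:frame:000001",
          audio_clip_id="mic0:clip:000001",
          annotations={"context": "making coffee"},
      )
      print(json.dumps(asdict(record), indent=2))
      ```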

      As for teaching a new clone, the system reverses, using the lifelog of the original to gradually teach the child’s neurons to form the connections that the previous copy had. This is as close as I could come to a memory transfer.

      Are they the same person? Yes and no. I’m already planning a sequel based on the fun you can have with a version tree of a particular individual, as clones make their own lives.

  15. December 11, 2011 4:16 pm

    Identity is an illusion–well, maybe…but our awareness is the sum of our sensory input, our physical activity, and the brain’s processing of these inputs, yes–but all filtered through what we think of as our consciousness, our ‘inner selves’–the part of us that talks to itself and makes conscious choices.

    But when we go to sleep, we let go–it collapses. When we awake, we do not spring out of bed and rush off to where we left off yesterday; we spend a significant amount of time re-assembling our identities, plugging back into our sensorium, location-orienting, sometimes even stretching parts of our body to facilitate the re-birth to awareness and activity.

    So, for starters, our identities are not open 24-7, they are cyclic. And they are re-brewed afresh after every sleep cycle. If we could study the process of awakening, we might just find a path to copying Identity by discovering how our natural brain’s Identity swims to the surface every morning.

    • December 12, 2011 1:51 am

      There’s a book on Zen Buddhism called Untraining Your Parrot. The idea with the parrot is that when you get in the habit of listening to that inner voice, it starts sounding a lot like a flock of trained parrots. Part of the realization that self is illusion comes when you realize that the part of you that’s aware of the voice (what people tend to call “me”) isn’t the same as the one that’s talking. It is a realization (as in an experience), not a concept.

      The idea that inside us there’s a little parrot screaming “how can you copy me? I’m unique!” is actually humorous. Isn’t it a great thing to train a parrot to say?

  16. December 12, 2011 1:13 pm

    I always imagine myself having to repeat a phrase over and over to train the parrot to say the phrase–I can’t imagine any joke would survive such abuse–perhaps that is why parrots tend to profanity–it’s the one thing that retains its efficacy with repetition.

    And, having deviated off topic, I’m not really into the Zen stuff, except where it enables self-control (I find Buddhism is excellent for self-analysis and self-awareness).

    My personal ‘zen’ is the duality thing–brain and body are separate but they are not separate, awarenesses are individual identities, but they are also a coherent group in many ways, from ‘family’ to ‘neighbor’ and all the way up to the group, ‘Species’. Matter and energy are two separate things, but they are the same thing in different forms…

    The part of me that’s aware of the voice isn’t the same as the one that’s talking, but both are a part of my Identity.

    The lack of absolute borders is as true in science as it is in consciousness–a thing with such gossamer, many-layered fabric as ‘Identity’ will remain outside of our ‘tech’ until we have made much greater progress in understanding ourselves.

    But I still say Identity is cyclic, and that the process of awakening from sleep, even if it isn’t a key to reproducing an identity, will still be part and parcel of the answer to that problem.

    • December 12, 2011 8:24 pm

      Identity may be “cyclic” in the sense that we have to keep replaying our memories over and over again, in order to keep them alive. Perhaps all our memories are like a piece of music that we have to practice playing over and over.

  17. December 17, 2011 3:02 pm

    It seems to me that we have to distinguish between “self” and “identity”. I agree that “self” is an illusion, in the sense that I define it as an experience that some organisms such as humans can have that fuses sense impressions, memories, and a narrative about their relation to the outside world into an illusion of an agent separate from all others and from the environment in which it lives. “Identity” is a tag we place on each agent we see (and trivially on ourselves) that attempts to define the borders between those agents and between the agents and their environments.

    But the identity of a human being is a very slippery thing. Clearly we need to include the nervous system as part of identity; we wouldn’t have cognition, sense impressions, memory, or interaction with the world without it. And we should probably include the endocrine system; emotions are a part of us. But what about the immune system, which affects the other systems, almost certainly in some ways we don’t understand yet? And what of the commensal (and largely symbiotic) bacterial colonies we each carry? Do they contribute to identity?

    These questions apply to the notion of mind-copying. How faithful a copy is one that doesn’t include one or more of those systems? And there are other questions of fidelity that arise when you ask what contributes to our identities as individuals. For instance, at what time scale do we need to consider an identity to be coherent? If it takes 100 milliseconds to record everything important about a person, is that record internally consistent? How about 100 seconds, or 100,000? There’s some evidence that our senses and cognition are smeared across a time window of several hundred milliseconds, but nervous potentials change on a time scale of low milliseconds or microseconds. On the other hand, endocrine secretions change levels on a time scale of hundreds of milliseconds to seconds.

    The answers to these questions will have an effect on whether copying of an identity is possible at all, and whether it can be sufficiently faithful that it might actually be useful for something.

    • December 17, 2011 4:36 pm

      The microbial community poses an interesting question–genetically, as well as behaviorally. Genetically, our microbes are estimated to contain 100 times the gene content of the human genome. And individuals do possess different microbial communities. Behaviorally, there is evidence that gut microbes influence the brain via the vagus nerve.

      The time scale raises another issue; and so does the question of location in space. In some cultures, people are aware of a very precise location of self, almost like an internal GPS.

      I think all this goes to show that no “perfect” copy of an individual can be made. But it also reminds us that, from moment to moment, we are making an infinite series of “imperfect” copies of who we were in the past moment. Could an electronic system create a copy that fits within the cloud of possible “imperfect” copies that are made by life process?

