
Mitochondrial Singularity

March 25, 2013

The singularity is the day when machine intelligence finally overtakes the human mind. But what if the singularity is already underway? And if it is–what does it look like?

Suppose it looks like mitochondria. Suppose we’re becoming the mitochondria of our machines.

How did mitochondria get to what they are today? The (now classic) theory of endosymbiosis began as a New-Age feminist plot by Lynn Margulis, a microscopist known for setting protist videos to rock music. Around one or two billion years ago, a bacterium much like Escherichia coli took up residence within a larger host microbe. Either the larger tried to eat the smaller (as amebas do), or the smaller tried to parasitize the larger (as tuberculosis bacteria do). One way or another, their microbial descendants reached a balance, in which the smaller bacterium gave something useful to the host, and vice versa. In fact, this sort of thing happens all the time today. If you coculture E. coli with amebas, an occasional ameba will evolve with bacteria perpetually inside–and the evolved bacteria can no longer grow outside. They are slipping down the evolutionary slide of endosymbiosis, to eventually become organelles.

But the price of endosymbiosis is evolutionary degeneration. Genetically, the mitochondrion has lost all but a handful of its 4,000-odd bacterial genes, down to 37 in humans. Most of these genes conduct respiration (obtaining energy to make ATP). From the standpoint of existence as an organism, that seems pathetic. The mitochondrion is a ghost of its former identity.

But is it so simple? Did mitochondria really stay around just for that one function? If those are all the genes that are left, then how do mitochondria contribute to tissue-specific processes such as apoptosis (programmed cell death), production of oxygen radicals, and even hormone synthesis?

Surprise–about 1,500 of those former mito genes are alive and well in the nuclear chromosomes. How did the genes get there? First, mitochondrial DNA replication is error-prone; errors accumulate there much faster than in the nuclear DNA. Second, DNA replication often duplicates genes–the leading way to evolve new functions. Suppose a duplicated copy of a mitochondrial gene ends up in the nucleus. Copied more faithfully there, it persists, while the mitochondrial original decays by mutation. Thus, over many generations, the mitochondria outsource their genes to the nucleus.

Is this starting to sound familiar? As Adam Gopnik writes, “We have been outsourcing our intelligence, and our humanity, to machines for centuries.” Ever since Adam and Eve put on clothes (arguably the first technology), we have manipulated parts of our environment to do things our bodies no longer have to do (like grow thick fur). We invented writing, printing and computers to store our memories. Most of us can no longer recall a seven-digit number long enough to punch it into a phone. Now we invent computers to beat us at chess and Jeopardy, and baby-seal robots to treat hospital patients.

As we invent each new computer task, we define it away as not “really” human. Memory used to be the mark of intelligence–before computers were invented. Now it’s just mechanical–but as Joshua Foer notes in Moonwalking with Einstein, memory is closely tied to imagination. Once we can no longer remember, how shall we imagine? And if all our empathy is outsourced to dementia-care robots that look and sound like baby seals, what will be left for us to feel? Poetry and music–don’t even mention them; computers already compose works that you can’t distinguish from human ones.

Yet we humans still turn the machines on and off (well… sometimes). The machines aren’t actually replacing us so much as extending us. That’s the world of my Frontera series. Humans still program the robots and shape the 4D virtual/real worlds we inhabit. But those worlds now shape us in turn. Small children exhibit new reflexes–instead of hugging their toys, they poke them and expect a response.

The real question is, what will be the essential human thing left that we contribute to the machines we inhabit? Will we look like the brainship of Anne McCaffrey’s The Ship Who Sang–or like the energy source of The Matrix? Mitochondria-hosting cells ushered in an extraordinary future of multicellular life forms, never possible before. Human-hosting machines may create an even more amazing future world. But if so, what essential contribution will remain human?

Note: Also posted on Charlie’s blog; reported by NBC News and HuffPost.

2 Comments
  1. March 25, 2013 1:41 pm

    In terms of academic disciplines, I think of the humanities and fine arts as most distinctly human. Computers can produce paintings, poetry, novels, or dance numbers based on algorithms, but I have yet to encounter a computer program that can make the divergent leaps of imagination or ingenious syntheses of information that humans can. Furthermore, I have yet to meet a computer that sets aside time in its day just to marvel at the humor in a comedy sketch or the bleak elegance in a musical number.

    In my mind, the pinnacle of intelligence as we understand it is the ability to produce AND to consume pieces of art. People worry about how computers and robots will take over their jobs with their relentless efficiency, but I anticipate that as artificial intelligence becomes more and more complex, it will serve as an equalizer between humans and robots: robots will not be satisfied with simple rote production, but will also want time to enjoy movies, draw pictures, and so on. Highly complex robots may feel pain, and may not be satisfied with limited warranty programs, wanting comprehensive healthcare instead. As these changes take place, I think it will become harder to differentiate between robots and humans, just as in the 21st century it is becoming harder to differentiate among the races and cultures of human beings.

  2. jonathan cole
    March 28, 2013 5:20 pm

    I think the only essential contribution that will remain purely human will be something we could call ‘the human experience’, which is directly experienced only by humans but can be quantified and then created by our computer helpers. As we use machines more and more to make things, we can still appreciate the things we make. Even if we do only 1% of the labor/design, using CAD software and a 3D printer, we will still admire ‘our’ work. So products and environments will still have to meet some standard of the human experience, and those created purely by machines, for machines, will probably feel repulsive to most humans. Our final essential contribution will be as admirer and judge of that which is created.
