Slimy balls rolling around in my skull!


Peter Watts has this short short story about a brain interface technology that allows people to merge their consciousness with other organisms -- and in this one, "Colony Creature", someone experiences what it is like to be an octopus, and is horrified by it.

“Those arms.” His Adam’s apple bobbed in his throat. “Those fucking crawly arms. You know, that thing they call the brain— it’s nothing, really. Ring of neurons around the esophagus, basically just a router. Most of the nervous system’s in the arms, and those arms… every one of them is awake…”

It's a good story, and I'm not knocking it. I think it's also important to recognize that the experience of being a non-human organism is probably fundamentally different than being a human.

However, while it's true that 2/3 of the neurons in an octopus are found in the arms, not the brain, I doubt that the experience of being an octopus is quite so distributed, or that it consists of eight independent conscious entities.

For example, your eyes are actually complex outpocketings of your brain -- each one contains at least half a billion neurons interconnected with some very intricate circuitry. Likewise, your enteric nervous system -- the neurons that drive the activity of your gut -- consists of about half a billion cells, too. These things are bigger and more complex than the nervous system of my larval zebrafish, which can swim and learn and eat and carry out goal-oriented behavior.

If a Vulcan mind-melded with a human, do you think that they would report that the human experience is all about these creepy autonomous orbs on their face, darting about and collecting photons, and that we've all got this colossal slimy worm hanging from our heads, writhing and squirming and demanding to be fed? It would make a good creepy story, but it's not an accurate picture of our conscious experience.

I'm not sure what consciousness is, but it's almost certainly got to be a synthetic process that integrates complex information from multiple sources, guts and sensory organs and spinal cord and multiple modalities, and is also built up from different processing centers within the brain. I would expect the same is true of the octopus.

Although…a talented writer could probably put together a good horrifying story of a person with a fractured brain, one who became aware of all the separate components of the inputs to the human experience as if they were independent intelligent agents, each with limited and specific needs, having conversations with a central 'self'. (Another problem with the Watts story: what is the octopus 'self' that the story-teller has become?) What do my eyeballs want? What is my amygdala muttering to itself? I know what my gonads want, and they better behave.



"I think it’s also important to recognize that the experience of being a non-human organism is probably fundamentally different than being a human."


By See Noevo (not verified) on 19 Mar 2016 #permalink

My favorite author.
He makes me wish I were a more cheerful person.

By Hank Roberts (not verified) on 20 Mar 2016 #permalink

PZ, if you don't know Jaron Lanier, look up his stuff: he's the computer scientist who invented virtual reality, he's a polymorphic Renaissance genius of high order (in the humanities as well as the sciences), and he also has a kind of sympathy for cephalopods that you might find has something in common with yours.

Where that's relevant to your story: in the early days of his work with VR, he envisioned one of its applications as simulating what it's like to be another person, or even something that isn't human. Such as a cephalopod. Or as he said at the time, "imagine playing the saxophone and becoming the saxophone." Metaphorically of course, and without mind-meld, but nonetheless a potentially insightful way to perceive the world.

He's also a major skeptic of The Singularity and all that pseudoscientific nonsense, but he has a very interesting take on the subject. In addition to strong skepticism about "upload," he says that interacting with AIs diminishes the uniqueness and individuality of persons, by giving them feedback that reinforces algorithmically-predictable behavior rather than thoughtful behavior.

If you don't know of his writings, look him up, and don't be scared off by the somewhat wild appearance of his website etc. The man is a major genius and you & he might have much common ground.

So back to the story: It's easy to write for "primal" emotions such as creepiness, sexuality, aggression, and so on. But far more interesting to write for the more complex and subtle emotions, and far more interesting to write about empathy with the real intent of conveying another person's (or another life form's) experience to the reader.

I'll admit that I'm more partial to mammals, but when I think about the experience of being a cephalopod, I envision it as having a completely new sensorium, in which the colors and textures of things are rich in a radically different way than we're used to. Consider the experience of crawling along the ocean floor on all fours, but instead of only your hands and feet, most of your body feels the direct contact with every shape and texture and surface. It would be entirely mundane on one hand, but could incite deep curiosity from the human side of the experience.

Envision how the distributed awareness we already have, such as the muscle memory we use while typing or walking or whatever, changes when it has more processing capability and can act semi-autonomously: "What did my fifth arm expect to find as it searched around in that burrow.... aha!, food!, excellent, and it felt like a small crustacean...."

Re. your last paragraph, that kind of multi-level awareness occurs naturally as a result of mindfulness meditation: with practice, you get good at noticing the origins of various threads of thought, and you can begin to observe multiple threads of thought that are running simultaneously. This is not difficult, it only takes routine practice, and it entails no supernaturalism or other "stuff" that doesn't comport with science.

Hypothetically, a human brain can support more than one locus of conscious thought at a given moment. Ordinarily we "choose one" or we find ourselves distracted, or we integrate them into our idea of a unitary self. But learning to discern among them, and notice how and where they interact, is a very interesting process to observe. Learning to observe that, might be useful for imagining how animals with other types of distributed nervous systems perceive the world.

Completely, absolutely off-topic, but speaking of slimy balls, I just reread the Lenski affair (no, Dr. Lenski wasn't the name that popped into my head when I saw "slimy balls," but you can guess whose name did). So beautiful, classic, a wonderful second response. Reading it should be an annual tradition.

Also liked the PNAS response. Schlafly had also written to the publishers of the paper complaining about the statistics, and part of their response was, "The issues raised by Mr. Schlafly are neither obscure nor subtle, but are part of everyday statistical analysis at a level too elementary to need rehearsal in the pages of PNAS." I'd say that's going to leave a mark, but after Dr. Lenski's response there'd not be much left of Andy for another mark to be left on.

By Dan Andrews (not verified) on 29 Mar 2016 #permalink