Over at the MIT Tech Review website, neuroscientist Ed Boyden argues for brain augmentation:
It’s arguably time for a discipline to emerge around the idea of human augmentation. At the MIT Media Lab, we are beginning to search for principles that govern the use of technology to augment human abilities–that make the idea of normal obsolete…
One argument in favor of going for optimality, and forgetting about normal, is that it’s becoming harder and harder to know what is normal. For example, it’s been demonstrated that two-thirds of all people have at least one copy of a DNA sequence that makes them more likely to become depressed after a stressful life event. The rest of all people, a minority of one-third, are more resilient to stress than the other two-thirds are. Thus, it could be argued that becoming depressed in response to stress is the normal state.
I agree that the complexity and diversity of biology make it very difficult to define “normal”. (It’s the equivalent of defining “authenticity” in music.) So I’m with Professor Boyden on that point. But I fail to see how the absence of clear norms makes a compelling case for human augmentation. Instead, I believe the case for brain augmentation should rest on one variable, and one variable only: is it good for us? This question isn’t easy to answer, but I believe the debate should be focused entirely on the efficacy of the possible treatment. We should ground our ethics in pragmatics. After all, we’ve been augmenting our brains for decades (reading this post is a subtle form of brain augmentation), so I don’t believe there’s any bright line separating the brain microchips of the future from the (largely ineffective) synaptic drugs of today. The basic questions should stay the same: does this intervention work? What are the side effects? And do they outweigh the benefits? The history of scientific attempts at brain augmentation should make us skeptical that a plethora of panaceas is just around the corner.
Hat Tip: Neurophilosophy