01110101011011100110 10010111011001100101

The history of information -- which is to say, the history of everything -- is littered with codes. Some are cryptic, designed to be understood by only a few, while others are made to be cracked. Numbers, for example, are symbols which translate the abstraction of mathematical information into a code we can understand. Language, too, is such an idea code. The Dewey Decimal System is a code for organizing all knowledge into ten distinct classes. Morse code broke meaning into short and long pulses of sound. HTML and other computer languages are codes which make the arrangement of graphics and color possible out of simple text.

In the last few decades, however, one quite unassuming code has dominated all the others: binary. Although, at its simplest, a slew of ones and zeros, binary code has proved capable of digitizing almost all media -- including our previous superstar codes, like language and mathematics -- into tiny particles of information called bits. Not to be confused with bytes, which I am generally uncertain about.

A bit is a simple unit; the DNA, if you will, of the body of information coursing around the world. The information one bit contains is represented in its state of being: either on or off, black or white, true or false, up or down. Whether the bit is in one state or the other -- something we represent, for convenience, as a 1 or a 0 -- determines, when arranged into patterns, the nature of the coded information. Binary code was originally used, quite logically, for numerical computing, but now these combinations of 1 and 0 can be used to code images, digital video, and audio. CDs are just audio waveforms sampled tens of thousands of times a second and translated into bit strings; the samples play back so rapidly that we experience them as continuous sound. I know: a huge suspension of disbelief is necessary to deal with it.
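
For the bit-curious, here is a rough sketch, in Python, of what that sampling might look like -- the 440 Hz tone is invented purely for illustration, but a real CD does something in this spirit forty-four thousand one hundred times per second, sixteen bits per sample, for each stereo channel:

```python
import math

# A toy stand-in for CD audio: sample a 440 Hz sine wave at the CD rate and
# render each sample as a sixteen-bit string.
SAMPLE_RATE = 44100   # samples per second on an audio CD
FREQUENCY = 440       # concert A, just to have something to sample

def sample_to_bits(value, bits=16):
    """Quantize a value in [-1.0, 1.0] to a signed integer and write out its bit pattern."""
    scale = 2 ** (bits - 1) - 1                                # 32767 steps above and below zero
    quantized = int(round(value * scale))
    return format(quantized & (2 ** bits - 1), f"0{bits}b")    # two's-complement bit pattern

for n in range(4):                                             # the first four samples of the wave
    amplitude = math.sin(2 * math.pi * FREQUENCY * n / SAMPLE_RATE)
    print(sample_to_bits(amplitude))
```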

When I was a pre-teen, one of my parents' friends made a joke, at a dinner party, about the candles cluttering the guest of honor's birthday cake. "It's in binary!" he tittered. The concept flew over my head at Mach 10. There were a great many candles -- more than necessary, since the cake recipient was old enough to have entered the non-literal period of representational birthday-candle arrangement. I kind of get it now, though -- for example, here is the sentence "It's in binary" translated into binary code:

010010010111010000100111011100110010000001
1010010110111000100000011000100110100101101
1100110000101110010011110010000110100001010
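
(A quick aside for the curious: one way to reproduce that string is a Python one-liner -- each character becomes its ASCII number written out as eight bits, and the final sixteen bits above are just the invisible carriage return and line feed riding along at the end of the sentence.)

```python
# Each character becomes its ASCII code point, written out as eight bits.
sentence = "It's in binary\r\n"   # the trailing \r\n supplies the last sixteen bits
print("".join(format(ord(char), "08b") for char in sentence))
```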

The era of digital information, of which we are in the throes, beats to a binary rhythm. Although we cannot see them, the ones and zeros of binary code dominate our lives because they are at the basis of the transferral of all media; it is, after all, much easier to send media around the world in the form of light-speed bits than it is to send physical objects -- newspapers, say. In abstract terms, consuming digital media is a navigation through a grid of numbers that disseminate information. It's a remarkably efficient and populist coding system, despite the fact that practically nobody -- well, those of us who aren't baton twirlers in the nerd parade, anyway -- knows how to interpret it directly.

If all of the world's information, however, can be distilled into a coding system that relies on polar opposition, what does this mean about the nature of the world itself? After all, the world as we experience it is a pretty "analog" place; things almost never fit into black and white categories. Nicholas Negroponte, one of the founders of MIT's seminal Media Lab, noted in his book Being Digital that the world "is not digital at all but continuous," since "nothing changes from one state to another without going through a transition."

Nighttime in Los Angeles is daytime in Shanghai, and there's an infinite number of incremental light gradations in between the two; it would be impossible to find the place on Earth where night ends and day begins. Are we losing some of the world's natural variability in the process of turning digital? In a manner of speaking, a bit does not allow room for fractions: it is wholly one thing or wholly the other. Of course, this is only a philosophical argument, but it's worthy of consideration, especially as the codes organizing the world become more complex. Binary opposition, conceptually, is something academics (especially those of the postmodern bent) crusade against. Derridean, feminist, and postcolonial scholars will all bend over backwards to deny the existence of binaries, which are quite rightly conceived as structurally derived notions that mirror the human inclination to think antagonistically.

Academics love/hate the idea of inherent antagonism, I imagine, because language is inherently antagonistic: every word is defined by what it does not mean. Ferdinand de Saussure, the "great" Swiss linguist, argued that language is a rule-bound system of oppositions that governs a closed and infinite set of operations. Famously, he wrote that "In language there are only differences...without positive terms." That is to say, every word (or sign) is different from all others, and hence is defined by a negative difference. If the word "dog" is not paired with its corresponding image (fluffy canine), then you might as well represent it with a "0." Binary. Feel me?

In any case, the interdisciplinary parallel is unnerving.

To me.

The scientific community, however, always finds ways to muddle such neat connections. It turns out there might be new coding systems, maybe more appropriate for this world of ours. Consider "quantum computing." It's amazing for plenty of reasons; for one, it's a practical application of Quantum Mechanics, the branch of modern physics that has so far refused to be reconciled with Einstein's General Relativity into a single "Theory of Everything," and which is generally acknowledged to be radically incomprehensible. For another thing, quantum computing hinges on a completely different basic unit of information than the binary bit: something called the qubit. Unlike the bit, a qubit can be 1, 0, or both at once. There goes the neighborhood, right?

Although explaining how quantum computing works would take much longer than my tolerance can afford me (not to mention that it would be squarely out of my league of understanding -- I'm still grappling, after all, with CDs), the essential gesture is this: in a quantum computer, qubits can exist in a superposition of all the classically allowed states of being -- all of the shades of gray at once. Because of this, the storage capacity of a collection of qubits increases exponentially: three qubits can hold 8 different numbers at once, four qubits can hold 16, and so on. I don't understand how it works in the slightest, but qubits are antibinary, and hence non-antagonistic.
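
Here is that counting in miniature -- a toy Python sketch, not real quantum programming, and only the bookkeeping half of the story: describing n qubits means keeping track of one amplitude for every classical pattern of n bits, which is 2^n numbers at once.

```python
from itertools import product

def classical_patterns(n_qubits):
    """Every classical bit-pattern an n-qubit register can be 'partly in' at once."""
    return ["".join(bits) for bits in product("01", repeat=n_qubits)]

for n in (1, 2, 3, 4):
    patterns = classical_patterns(n)
    # A quantum state assigns an amplitude (a shade of gray) to each of these patterns simultaneously.
    print(f"{n} qubit(s): {len(patterns)} amplitudes, one for each of {patterns}")
```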

The research, though concentrated, is pretty nebulous right now; quantum "computers" aren't even pieces of hardware so much as phenomena observed at the atomic level, phenomena which ostensibly obey the rules of quantum mechanics. However, quantum information will undeniably be a mainstay of future -- perhaps long-term future -- technology. The idea is that if computers are to become smaller, if nanotechnology is to exist (something I'm banking on, because it feels so "future"), then quantum technology has to come into the equation to supplement the binary world.

I leave you, then, with a question -- one which I'm honestly hoping to hear some ideas about. Is this code more or less of a faithful model of our endlessly variable "real" world?


I am trying to imagine how it would work if Radio Raheem from Do The Right Thing had big gold 1 and 0 rings on his two hands; then my mind kind of tunes out his speech about the battle between love & hate, and underneath is Hamlet's "To be, or not to be?" soliloquy instead, like if I could read lips AND switch off my ears when I felt like it, and it turned out he had actually been saying

To be or not to be; that is the question.
Whether 'tis nobler in the mind to suffer
The slings and arrows of outrageous fortune
Or to take arms against a sea of troubles,
And by opposing end them?

all along. imagine my surprise. unnerving is a good word for it.

See, the thing is (with apologies for maybe coming across like you don't already fully 'get' this), there ARE no "shades of grey*." Like you (& N. Negroponte) said, reality is continuous. {I'm sure I'm being too literal as usual, but it strikes me as a tad problematic to hold up the phenomenon of transition as an example of seamless continuity?} Given that, as you succinctly express, language is a code for ideas, I tend to conceptualize the process of digitizing information, for example, not expressly as 'encoding' but rather "translation" (though I don't grok binary in the slightest). [now you've got me wondering my place in the nerd parade! Cursing out a blown engine with grease on my hands in a sweltering hangar somewhere -- awesome float gathering dust for another year -- no doubt. Sigh.] And this, knowing full well, is not a 'real' difference. A dog can be fluffy, or short-haired; totally loving & loveable, or a vicious killer -- same dog! -- before & after a summer trim, and depending on whether you are a bonded friend or the neighborhood raccoon. So, YES: this Q-bert seems promising (having only met it through you. And thanks!).

I realized the other day that one difference between fractions and decimals is that fractions leave the work still to be done. Like, when you divide something and you get a remainder, say... seventeen into three; it's five and 2/3rds (whatever the hell two divided by three is). Though "cut it into three equal pieces; now take two of those" is a shitload more intuitive than "six-hundred-sixty-six-thousand-six-hundred-and-sixty-six millionths" or whatever, to any creature whose first language is neither binary nor hexadecimal, I'd wager; to say nothing of infinity! Ironically, part of the work that fractions leave unaccomplished is the realization that it will never be accomplished.

*these incremental gradations are something your mind DOES to reality so that it will fit in there, or so that you can refer to this or that part of the spectrum; which, you know, is useful a lot of times. Even the distinction Hamlet is on about -- between 'the quick and the dead' -- disappears when you take that step back into realizing that the passage of time is an illusion, albeit an extremely useful one.

On the subject of our thinking in shades-of-grey vs bit-wise "truth":

The various theories of what "truth" is point toward binary categorization: something is either "true" or "false". They are, by definition, absolute black-or-white values, not "grey", even when based on consensus theory.
There is a rather thorough analysis of the various philosophical theories on defining truth at Wikipedia:

http://en.wikipedia.org/wiki/Truth

Okay, so this is where Stephen Colbert is great at mocking the neo-cons' manipulation of language recently! He's started saying "truthiness" all the time, which I find very bitingly clever. Playing a deadpan conservative on his show, he uses "truthiness" to mock how neo-conservatives have been using the media to manipulate the English language, letting the word "truth" substitute in where the word "plausible" used to be correct. Or at least to confuse people into accepting less-than-true facts as assumed truths. "Truth" is a binary "true" or "false" with no in-between, but journalists on networks like Fox seem to intentionally blur that which "sounds true" (a.k.a. "is plausible") into being equivalent to "true" status.

Of course, this kind of language manipulation screams of propaganda, as fascists have always tried to publicly reclaim and redefine words in order to subvert people's thinking.

You could say that this is an attempt to make a binary (digital) value into a more arbitrary (analog) value.

To clarify, I meant "shades of grey" to represent continuous reality ("sea of troubles," in Hamlet's terms). I realize now that it is misleading -- no increments, rather, a wash of greys.

The distinction between encoding and translation is delicate; I've always found it easier to think of translation as the process of translating oneself (and one's accompanying world) into a new language, instead of trying to mold the language to one's world. I don't believe in semiotic invariance: putting the world's information into binary modifies it, but not to the point of it being irreconcilable.

The world is indeed not a binary place. There are an infinite number of transition states, just as there are an infinite number of slices to be taken when determining a derivative. We are therefore left, as far as I can see it, with a choice of how to represent the states of the world around us so that we can discuss, share, and explore them. Binary was chosen, more or less, because it is easy. On or off, high or low, 1 or 0 is an easy thing to represent physically in the hardware. (It also lends itself to some neat mathematical tricks, depending on how you represent decimal numbers.) The trick then, in my mind, is to represent more and more of these slices of the state of the world until we get as close to the real thing as we can. In theory, we can do anything with a general-purpose computer. The Church-Turing thesis claims that anything computable at all can be computed by such a machine; couple this with Moore's law and (as AI classes love to point out) eventually we'll be able to compute anything on a standard PC. Anything. I find it hard to believe. But at the same time, who would have thought that we could even have this ability to instantly share our thoughts across the world? The problem is mostly in the describing. Qubits (and other areas of quantum computing) will just give us more room and more tools to do the describing.
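
To put rough numbers on the "more slices" idea, here is a little Python sketch (the target value and the bit depths are arbitrary, picked only to illustrate): every extra bit you spend roughly halves how far the nearest representable slice can sit from the real thing.

```python
import math

# Approximate a continuous value with finer and finer binary "slices".
# The target and the bit depths are arbitrary; the point is the shrinking error.
target = math.pi / 4

for bits in (2, 4, 8, 16):
    slices = 2 ** bits - 1                      # evenly spaced steps between 0 and 1
    nearest = round(target * slices) / slices   # the closest representable slice
    print(f"{bits:2d} bits: nearest = {nearest:.8f}, error = {abs(target - nearest):.8f}")
```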

For a rather comprehensive treatise on the bit, computing, and general systems analysis (and how this relates to the universe in both macro- and microcosmic terms) see:

Operating Manual for Spaceship Earth
R. Buckminster Fuller

If you haven't read the Fuller, I think you'll love it.

I like that you are banking on nanotechnology because it "feels" future.
Also, the picture that you posted at the end of the entry helps me to understand the concept of binary code from a different perspective. The line drawing is simply lines and two colors -- black and white -- but I get the meaning and the idea that is being communicated.

oh, yeah, i think that binary code is an o.k. representation of the real world, but I can't imagine that technology will ever get so good that a CD played on a nice stereo could ever resonate like a live instrument -- and if it did, could I afford that nice stereo?

But is reality continuous? Don't things like Russell's paradox (http://en.wikipedia.org/wiki/Russell's_paradox), which upended the project of logically deriving mathematics from self-evident axioms, and the discretization of elementary physical phenomena described by quantum physics lead instead to the conclusion that reality is inherently and fundamentally discontinuous? (I'm getting my info from William Everdell's book The First Moderns.)

I guess what I mean is: because the fundamentals of physics (and the photons in quantum computers) exist in a cloud of probability rather than at continuous points in space, I'd say that it is more correct to call our "endlessly variable" world endlessly probable. And that's a pretty strange idea to wrap your head around.