world in a grain of sand

What would it take to capture our life, in full fidelity?
It may be less than you think.
I think.

The body has about 10^28 atoms, but to reproduce our existence it is not necessary to record the 10^29 or so bits, per dynamical time, needed to keep track of them.
We lose, add and move a lot of atoms around without it really affecting our existence.
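A quick sanity check in Python - the 70 kg body mass and the ~7 u mean atomic mass are my rough assumptions (the body is mostly hydrogen, oxygen and carbon by atom count):

AVOGADRO = 6.022e23          # atoms per mole
body_mass_g = 70e3           # assumed: a 70 kg adult
mean_atomic_mass = 7.0       # g/mol, assumed: mostly H, O and C by atom count

atoms = body_mass_g / mean_atomic_mass * AVOGADRO
print(f"atoms in body ~ {atoms:.0e}")  # ~6e27, i.e. of order 10^28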

Our primary sense of existence is the input to our senses - this is totally dominated by the optic nerve, which consists of about a million fibres linking a hundred million sensors to the brain, with a firing rate of order 10 Hz. It is not clear how many bits each firing carries, but it must be of order 10, so we'll call it 100 megabits per second.
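In code, for the record - the bits-per-firing figure is the shaky assumption:

fibres = 1e6          # ~a million optic nerve fibres
firing_rate_hz = 10   # firing rate of order 10 Hz
bits_per_spike = 10   # assumption: of order 10 bits per firing

bandwidth_bps = fibres * firing_rate_hz * bits_per_spike
print(f"optic nerve ~ {bandwidth_bps:.0e} bits/s")  # ~1e8, i.e. 100 megabits/s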

The other senses have orders of magnitude smaller bandwidths.

So, at 3×10^7 seconds per year, over 70 years we pick up about 2×10^17 bits of information.

In contrast, our output bandwidth is pitiful. Writing or speech is limited to about a bit per second, sustained, or about 2 gigabits total in a lifetime. Even blogging...
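The lifetime totals, with the one bit per second output rate as a rough sustained average:

lifetime_s = 3e7 * 70            # 3x10^7 seconds/year over 70 years, ~2.1e9 s

input_bits = 1e8 * lifetime_s    # sensory input at 100 megabits/s
output_bits = 1.0 * lifetime_s   # assumed: ~1 bit/s sustained output

print(f"lifetime input  ~ {input_bits:.0e} bits")   # ~2e17
print(f"lifetime output ~ {output_bits:.0e} bits")  # ~2e9, a few hundred megabytes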

But, there is more to existence than what we perceive.
We have about 100 billion brain cells, with an average of about 10,000 synapses each. The synapse is not a simple bit switch, so let's allow it 100 bits - it is conceivable that this is a gross underestimate, with each synapse equivalent to a local CPU, but I doubt it for a number of reasons.

This gives us a brain state that is about 10^17 bits long.
That is interesting - it suggests the brain state is matched to our I/O bandwidth, which is plausible: there is anecdotal evidence that people with perfect recall may saturate memory after a few decades - less than a life expectancy - suggesting memory overflow. For people with non-perfect recall, garbage collection, filtering and compression are important brain functions.
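The brain-state arithmetic, with the 100 bits per synapse as the allowance argued above:

neurons = 1e11             # ~100 billion brain cells
synapses_per_neuron = 1e4  # ~10,000 each
bits_per_synapse = 100     # the allowance above; possibly a gross underestimate

brain_state_bits = neurons * synapses_per_neuron * bits_per_synapse
print(f"brain state ~ {brain_state_bits:.0e} bits")  # ~1e17, matching the lifetime I/O total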

So, recording the brain state as well as all of its input doesn't change our requirements much.

What about the rest of our physiology? We have of order 10^14 cells in the body, and recording their position, type, key chemical gradients etc. should require less than 10^17 bits.
The genome itself is of course trivially small.
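In numbers - the thousand-bit-per-cell budget is an assumed allowance, and the genome figure assumes ~6 billion base pairs at 2 bits each:

cells = 1e14           # order 10^14 cells in the body
bits_per_cell = 1e3    # assumed budget for position, type and key gradients

body_bits = cells * bits_per_cell
genome_bits = 6e9 * 2  # ~6 billion base pairs at 2 bits each
print(f"body   ~ {body_bits:.0e} bits")    # ~1e17
print(f"genome ~ {genome_bits:.0e} bits")  # ~1e10 - trivially small by comparison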

So: all we are fits in less than 10^18 bits.
With redundant coding, that is a nice round exabyte. Each.
A million terabytes - which would cost a couple of hundred million dollars to store right now.
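Adding it up, assuming storage at roughly $200 per terabyte:

total_bits = 2e17 + 1e17 + 1e17   # input + brain state + body: < 1e18 bits
stored_bytes = 1e18               # a round exabyte once redundantly coded (~10x overhead)

terabytes = stored_bytes / 1e12
cost_usd = terabytes * 200        # assumed: ~$200 per terabyte right now
print(f"{terabytes:.0e} TB, ~${cost_usd:.0e}")  # ~1e6 TB, ~$2e8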

What sort of plausible near future storage compaction could we hope for?
A plausible compact storage form, not requiring any new physics, would be, say, bit storage on atomic spins - Carbon-13, in diamond form, would do nicely - with optical readout.
With added redundancy, in round numbers, this would require about 100 micrograms of Carbon-13 diamond.
With casing and I/O interfaces, we can call it a millimetre sliver.
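The mass estimate, assuming one Carbon-13 nuclear spin per stored bit and a coding redundancy of a factor of a few (5, say):

AVOGADRO = 6.022e23
stored_bits = 1e18
redundancy = 5                     # assumed coding overhead, a factor of a few

atoms = stored_bits * redundancy   # one C-13 nuclear spin per bit
mass_g = atoms / AVOGADRO * 13     # Carbon-13: 13 g/mol
volume_mm3 = mass_g / 3.5 * 1e3    # diamond density ~3.5 g/cm^3
print(f"~{mass_g * 1e6:.0f} micrograms, ~{volume_mm3:.2f} mm^3 bare")
# ~108 micrograms, ~0.03 mm^3: a cube ~0.3 mm on a side; with casing, a mm-scale sliver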

So, in principle, with foreseeable technology - all that is you could be stored on a millimeter sized sliver of diamond-13. A coarse sand grain. Diamond sand.
That is kinda cool.

To run you would require more - we're looking at 10^11 small CPUs with highly parallel I/O - maybe a million-bit-wide system running loosely parallel at an aggregate 10,000 terahertz. That is far above current processing capabilities, but not far above current grid systems, which also have not dissimilar structures. In some ways.
Our internal processing capacity is very large compared to our I/O - which is plausible, since a lot of our processing seems to involve running simulations of the real world or doing very wide pattern-matching searches.
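The aggregate rate falls out of the same neuron numbers, assuming one small CPU per neuron updating its synapses at ~10 Hz:

cpus = 1e11             # assumed: one small CPU per neuron
synapses_per_cpu = 1e4  # ~10,000 synapses each
update_rate_hz = 10     # ~10 Hz firing rate

ops_per_s = cpus * synapses_per_cpu * update_rate_hz
print(f"~{ops_per_s:.0e} synaptic updates/s")          # ~1e16 per second
print(f"~{ops_per_s / 1e12:.0f} terahertz aggregate")  # ~10,000 THz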

However, we're in an interesting bind. Expanding our I/O is certainly possible - 100 megapixel imaging is not that impressive, and we could certainly improve the bandwidth of some of the other senses, not to mention adding more. More senses, that is.

But we could not easily internalise additional I/O.
We have the processing power to handle more data internally - we do so already by generating remembered or synthesised real-world models - but we do not have the memory capacity to accept significantly increased I/O.
If we wanted higher resolution senses, we would have to also enhance our memory storage by orders of magnitude, and provide additional I/O to handle this storage.
Cooling might in principle be a problem as well - we may have to re-engineer blood flow, maybe add some active cooling...

Now, if we stick with current technology, as postulated, we get the immediate specs for what would be needed.
We need only a cubic centimetre of diamond to improve storage a thousandfold - and to go with that we would want 100 gigabit I/O, a thousand times the optic nerve's bandwidth, which would require only a billion or so neural fibres. That is not impossible with 100 billion brain neurons to hook up to, though we might need to improve the connectivity algorithm for the synapses - which might be the hardest thing of all.
Buffer it all through our diamond memory slivers, of course.
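The upgraded specs, under the same spin-storage assumptions as before:

AVOGADRO = 6.022e23
stored_bits = 1e18 * 1000          # a thousandfold storage upgrade: ~1e21 bits
redundancy = 5                     # same assumed coding overhead as before

mass_g = stored_bits * redundancy / AVOGADRO * 13
volume_cm3 = mass_g / 3.5
print(f"storage: ~{volume_cm3:.2f} cm^3 bare")  # ~0.03 cm^3; order a cm^3 packaged

io_bps = 1e11                      # 100 gigabit/s, 1000x the optic nerve
bits_per_fibre = 100               # 10 Hz x ~10 bits, as for the optic nerve
print(f"I/O: ~{io_bps / bits_per_fibre:.0e} neural fibres")  # ~1e9 fibres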



Sadly, if we ever do this, and we well might, then the first thing it will probably be used for is porn.
Still, it would be worth it for the spin-offs...


Charlie Stross looked at this from a slightly different angle a few months ago:
http://www.antipope.org/charlie/blog-static/2007/05/shaping_the_future…
He starts by working out the storage requirements for recording everything you see and hear. His numbers are back-of-an-envelope compatible with yours.

Of course, when we do this, we will start to redefine what 'human' is.

It's probably the most important technology we'll ever develop. It's also one possible answer to the Fermi paradox. Perhaps civilizations don't travel out, they travel further in.

This kind of relates to the work of British Telecom at Martlesham Heath. In their labs during the mid-1990s they hoped to upload - or should that be download? - someone to a computer with their "soul catcher" chip.

http://www.cs.man.ac.uk/~toby/writing/PCW/upload.htm

Incidentally, Chaz, I heard the phrase "backward envelope calculation" as opposed to back-of-an-envelope calculations. Not quite sure what the former would be...maybe something to do with retrograde motion in the postal system.

db

You people act as if this hadn't already been done to back up and store Dick Cheney.

That's why I coined and published the unit 1 Shannon = 1 mole of bits, 28 years ago, and mentioned using a mole of diamond as storage. I'd learned a lot of backenvelopology directly from Feynman in the 1960s.

"The Singularity" in Science Fiction / History / Futurology addresses futures where every blade of grass, and grain of sand, let alone every human, has a web page, or some snazzier equivalent.

There are ~ 10^18 square centimeters on the surface of the Earth. Dealing with global data at that resolution was addressed in:

Jonathan V. Post, "Quintillabit: Parameters of a Hyperlarge Database", VLDB 1980 [Sixth International Conference on Very Large Data Bases]: 156-158, October 1-3, 1980, Montreal, Quebec, Canada, Proceedings

Abstract
I am going to give some factual data about the well-researched year 1999, based on projections from current data. The figures: there are roughly 10^18, one quintillion bits of information in the world data base, split between the humans and the machines. Now when you talk about the implications of a quintillion bits you have to briefly mention size, cost, use, speed, the limits of quantum mechanics, the possibilities of industrial growth that make these figures possible.

I guess your conclusions about the number of bits in the total human brain state are rather dependent on the brain operating strictly digitally. Is this really so? The following suggests possibly (or I would say probably) not:

www.sciencealert.com.au/news/20082606-17564-2.html

I am also far from sure that the "anecdotal evidence" of memory saturation actually stands up to scrutiny. It is surely at least as likely that other effects, such as age-related decline in recall ability, are the cause, rather than saturation.

I think the whole approach, while very interesting, seems to take a very reductionist view of the human mind/personality. I recall a useful analogy: one can't reduce a neon sign to simply a network diagram; one must also take into account the context and the message (including the syntax and grammar of the language in which it is composed, and the meaning of any symbols used). Failure to consider these results in understanding only part of the sign.

So even if the brain is strictly digital (and I am not convinced that it is) I still think that what would be created by producing one of your diamond minds (and I like the sound of the idea) would only be part of the story in producing a truly human mind.

By Mark Lees (not verified) on 03 Jul 2008 #permalink

Some people "think" in a digital way, partitioning things into instinctual, either/or dichotomies. Friend or foe, black or white, naughty or nice. But then, of course, mental functions are holistic phenomena that cannot be reduced to atoms or cells, just like a book is so much more than ink on paper...

There is a 2002 book called "Altered Carbon" by Richard Morgan in which this sort of storage technology is in use. People have the device implanted in their head as a kid and it runs as a backup. If your body happens to be ruined somehow (quite often in this hardcore detective sci-fi story) you can put your "stack" into a new body.

http://en.wikipedia.org/wiki/Altered_Carbon

I had actually just finished reading "Woken Furies" - a sequel to Altered Carbon.
Also reading Hamilton's latest, "Commonwealth", where this tech (and more) is featured.

At some level it does not matter whether the brain is, strictly speaking, "digital" - as long as all parts of the brain obey the laws of physics we know within the standard model, a digitization of the brain is guaranteed to provide exactly the same information and information processing, if the digitization is fine-grained enough.
So my estimate is of "how fine" we would need to digitize the brain to reproduce it with fidelity. We know the brain is robust, because loss of individual neurons, for example, does not generally lead to a large change in brain state.
The estimate I made is plausible, I would argue, because it is broadly consistent: it gives us a brain whose memory is well matched to its I/O capability, consistent with equivalent digital systems. The brain is clearly overspec'd on processing power, but a large fraction of the human brain appears to deal with highly parallel pattern matching and with generating internal realizations of external events, including the possible mental states of others. That would explain the processing overcapacity.
