so here I am, catching up on stuff on the laptop, and being mildly distracted by the television
and I notice my duty cycle of glancing at the television is very low, and I ponder why
the bandwidth of the ears is only about 10 kbaud, and auditory input is rarely near maximum throughput
in contrast the eyes have several megapixels, with a scan rate of about 10 Hz, and even allowing for compression and redundancy the bandwidth has to be 1-10 megabaud - 2-3 orders of magnitude higher than the ears
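The estimate above can be sanity-checked with a rough back-of-envelope calculation; the pixel count, bit depth, and compression factors below are assumptions chosen to match the post's round numbers, not measurements:

```python
# Rough sketch of the visual-channel estimate above.
# Assumptions (not measurements): ~4 megapixels of effective resolution,
# ~10 Hz scan rate, 24 bits per pixel, and a 100-1000x allowance for
# compression and redundancy.

def eye_bandwidth_baud(pixels=4e6, scan_hz=10, bits_per_pixel=24,
                       compression=1000):
    """Raw pixel bit rate divided by an assumed compression factor."""
    return pixels * scan_hz * bits_per_pixel / compression

ear_baud = 10_000  # the post's figure: ~10 kbaud auditory channel

low = eye_bandwidth_baud(compression=1000)   # conservative compression
high = eye_bandwidth_baud(compression=100)   # generous compression

print(f"eye: {low / 1e6:.1f}-{high / 1e6:.1f} megabaud")
print(f"ratio to ears: {low / ear_baud:.0f}x-{high / ear_baud:.0f}x")
```

With those round figures the visual channel comes out at roughly 1-10 megabaud and the eye-to-ear ratio lands between about 100x and 1000x, consistent with the 2-3 orders of magnitude claimed above.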
so a lot of TV programming uses static, uniform colour slabs as part of its information-conveying presentation, with slow, low-information-content talking carrying a large fraction of the key information
yeah, I was watching local news and weather - but the contrast between, for example the animated precipitation maps, which had useful high density information, and the blather of the weatherman over static high/low temperature maps was very distinct.
Similarly, the contrast between some video footage and studio chatter was very striking.
The television is used very ineffectually, with short bursts of useful, high-density information alternating with long stretches of very low-density crap
life would be more interesting if television content providers consistently used the technology closer to its capacity
I've seen various statistics showing things like the average time between cuts decreasing, decade by decade. The results are usually framed as "kids these days have such short attention spans!" but perhaps they are exactly what you're hoping for. Certainly it is a bit startling to watch an old movie - Treasure of the Sierra Madre, say - and pay attention to this sort of thing. The shots are much longer, the scenes are more static, and the dialogue tends to be longer speeches rather than choppier sound bites.
On the other hand, I'm not sure that your estimate of, ah, ocular bandwidth is quite right. For one thing it's perfectly feasible to compress (using frankly very simplistic perceptual models) a two-and-a-half-hour movie into 700 MB, which is only 600 kbps. And even there, there is a lot more going on on screen than your eyes are capable of taking in. Computer screens need megapixels of resolution only because they have no control over where the eye looks. The area over which we actually have high-resolution vision is surprisingly small.
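The 700 MB figure above reduces to the quoted bitrate in a couple of lines of arithmetic (treating 700 MB as 700 x 8 megabits; MB vs MiB conventions shift the answer only slightly):

```python
# Sanity check: a 2.5-hour movie compressed into a 700 MB file.
movie_megabits = 700 * 8          # 700 MB expressed in megabits
runtime_s = 2.5 * 3600            # 2.5 hours in seconds

kbps = movie_megabits * 1000 / runtime_s  # megabits -> kilobits per second
print(f"{kbps:.0f} kbps")                 # ~622 kbps, i.e. roughly 600 kbps
```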
I'm also pretty sure that the bottleneck in visual bandwidth is not the eyes themselves, but our ability to process the information. As an example, reading speed is not all that much faster than the speed at which we can understand speech (and the difference is even less if you compare with the accelerated speech synthesizers used by the blind). That said, we are visual animals by comparison with, say, cats or dogs - a cat's hearing apparently extends up to 100 kHz and has an angular resolution something like a degree, and much more of their brain is devoted to processing what they hear.
You must be every conference presenter's nightmare.
Seems to me interesting television would provoke more thoughts rather than convey more information. If something is interesting enough, you might want to stop and ponder and perhaps even screen out other distracting messages. Neglecting content in favor of doing information theory can lead you astray!
@Lab Lemming - yes, yes I am...
Although, in retrospect, this post could be considered an object lesson in why impromptu posts after midnight are not always such a good idea.
@Eric - I agree that thought provocation is better than information overload - which is why I used the local weather as the example.
@Anne - I agree my estimate of the bandwidth is over-generous, but both the ocular and auditory channels are under-used, and to a first approximation the underload is comparable - conversation is less than 100 baud, compared to a peak capacity of 10 kbaud. Both systems do compression etc. - the ocular system clearly does pre-processing, including local smoothing and differencing.
I suspect the sustained throughput rate is limited by buffering at the brain interface, not the bandwidth or even the processing power - damned high-level caches always get you.