What biological organ does this machine resemble?
In leaping beyond the two- and four-core microprocessors that are being manufactured by Intel and its chief PC industry competitor, Advanced Micro Devices, Intel is following a design trend that is sweeping the computing world.
Already, computer networking companies and the makers of PC graphics cards are moving to processor designs that have hundreds of computing engines, but only for special applications.
For example, Cisco Systems now uses a chip called Metro with 192 cores in its high-end network routers. Last November Nvidia introduced its most powerful graphics processor, the GeForce 8800, which has 128 cores. The Intel demonstration suggests that the technology may come to dominate mainstream computing in the future.
The shift toward systems with hundreds or even thousands of computing cores is both an opportunity and a potential crisis, computer scientists said, because no one has proved how to program such chips for many applications.
Yes, that would be the human brain. Like these future Intel machines, the brain isn't simply an amorphous chunk of brute processing power. Instead, our mind is a highly interconnected network of different functional "modules," or what Intel engineers call "computing cores." We have a visual cortex (the graphics card), which is itself divided into five different regions (V1-V5), each responsible for its own slice of the visual world. We have specialized neural units for processing language, irregular verbs, faces, gambles, and the movement of different parts of our body. Like these future computers, our cortex recognizes that it's more efficient to let specialized compartments focus on their specialty: it's the neural equivalent of comparative advantage. It's about time computers imitated this highly useful biological adaptation.
Of course, figuring out the hardware is the easy part. As the article notes, programming the software is the hard part.
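To see why the software is the hard part, consider a toy sketch (in Python, using its standard multiprocessing module; this is my illustration, not anything Intel demonstrated). Work that divides into independent slices, like the visual cortex handing each region its own patch of the visual field, parallelizes across cores almost for free. The crisis the computer scientists describe arises for everything that doesn't divide so cleanly.

```python
from multiprocessing import Pool

def process_slice(pixels):
    """Toy 'specialist' task: each worker handles only its own slice of
    the data, much as V1-V5 each handle their slice of the visual world."""
    return sum(p * p for p in pixels)

def parallel_sum_of_squares(data, n_workers=4):
    # Split the input into one chunk per worker (one "core" per specialty).
    chunk = max(1, len(data) // n_workers)
    slices = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    # Farm the slices out to a pool of worker processes.
    with Pool(n_workers) as pool:
        partials = pool.map(process_slice, slices)
    # The easy case: independent partial results combine with a single sum,
    # and no shared state needs coordinating between the workers.
    return sum(partials)

if __name__ == "__main__":
    print(parallel_sum_of_squares(list(range(1000))))  # prints 332833500
```

This "embarrassingly parallel" pattern is the one case programmers already know how to handle; the open problem the article alludes to is programming hundreds of cores when the tasks must constantly share and synchronize intermediate results.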