A new way to skin the CAT (scan)

Many years ago -- about 40 to be exact -- I was working as the only medical doctor in a bioengineering research laboratory at a famous technological university. A lot of what was done there involved speech synthesis (including one of the first reading machines for the blind, which turned printed text into synthesized speech) and also some digital image processing. One of my interests was radiology (the medical specialty of reading x-rays) and I was doing some research in that area. X-rays of the chest were usually taken in threes: the "posterior-anterior" (PA) view (back to front), the lateral (through the side), and the oblique (taken while you stand at forty-five degrees to the film). The purpose of taking three films was to allow the radiologist to work out what was in front of or behind what. That disentangling was done by the world's most powerful and flexible computer: the human brain. It turned out we could do better with more data, but more data overwhelmed the brain computer.

By extending the idea to many more pictures at smaller angular steps and using a computer to reconstruct the spatial relationships, you wind up with what is now called a CAT scan. CAT stands for computerized axial tomography: axial because the pictures are taken around the body as the axis, and tomography because that is the word for views obtained in slices. The Greek word for slice is "tomos" (as in "atom," that which cannot be sliced further). In the course of figuring out how to do this we produced the first x-ray CAT scan in the literature. I recounted some of this story in another post.
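To make the reconstruction step concrete, here is a toy sketch in Python -- my own illustration, not the method we used back then. It simulates projections ("shadows") of a two-dimensional slice at many angles, then smears each projection back across the plane at its angle. Real scanners filter each projection first (so-called filtered back-projection), but even this crude, unfiltered version shows why many views at small angles recover what three views cannot:

    import numpy as np
    from scipy.ndimage import rotate

    def make_phantom(n=128):
        # A toy "slice": two disks of different density.
        y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
        img = ((x + 0.3)**2 + y**2 < 0.15).astype(float)
        img += 0.5 * ((x - 0.3)**2 + (y - 0.2)**2 < 0.05)
        return img

    def sinogram(img, angles):
        # One 1-D projection (column sums) per viewing angle.
        return [rotate(img, a, reshape=False, order=1).sum(axis=0)
                for a in angles]

    def back_project(projections, angles, n):
        # Smear each projection back across the plane at its angle.
        recon = np.zeros((n, n))
        for proj, a in zip(projections, angles):
            smear = np.tile(proj, (n, 1))        # constant along the beam
            recon += rotate(smear, -a, reshape=False, order=1)
        return recon / len(angles)

    n = 128
    angles = np.linspace(0.0, 180.0, 180, endpoint=False)  # 1-degree steps
    phantom = make_phantom(n)
    recon = back_project(sinogram(phantom, angles), angles, n)

With only three angles the result is mush; with 180 one-degree views the two disks emerge -- which is why a scanner circles the body taking hundreds of exposures.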

Now, of course, CAT scans are routine, but the machinery involved and the computational power required are still very impressive -- and expensive. CAT scans are just one of many applications that require huge computational power to render images, however. Consider video games. We now have specialized hardware -- video cards, built around graphics processing units (GPUs) -- dedicated to rendering graphics quickly and efficiently. But they aren't used for CAT scans.

Until now (geek alert):

[Embedded video -- source: Gizmodo]
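I don't know the details of what the people in the video actually implemented, but the reason a video card helps is easy to see: every angle's smear-and-add in the sketch above is independent of the others, exactly the kind of parallel arithmetic GPUs are built for. As a purely hypothetical illustration -- assuming a CUDA-capable card and the CuPy library, neither of which is mentioned in the video -- the same back-projection moves to the GPU almost unchanged:

    import cupy as cp
    from cupyx.scipy.ndimage import rotate as gpu_rotate

    def back_project_gpu(projections, angles, n):
        # Same smear-and-add as before, but every array operation
        # here executes on the graphics card rather than the CPU.
        recon = cp.zeros((n, n), dtype=cp.float32)
        for proj, a in zip(projections, angles):
            smear = cp.tile(cp.asarray(proj, dtype=cp.float32), (n, 1))
            recon += gpu_rotate(smear, -float(a), reshape=False, order=1)
        return cp.asnumpy(recon / len(angles))   # copy the result back to the CPU

The speedup comes from the thousands of simple arithmetic units on the card all smearing pixels at once.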


As this shows, it is of great value to have the freedom to tinker with computer hardware. Geeks can do amazing things when they can re-write, re-wire, re-code, or otherwise alter the ordinary functionality of commercial computer hardware and software.

Will Vista put an end to the freedom to tinker? Last year, when Microsoft released Vista, I read that it also began essentially forcing hardware manufacturers (particularly of A/V devices) to sign agreements stating that they would no longer disclose detailed hardware information to the public. Vista is designed around DRM from the bottom up, and in order for its complex copy-protection procedures to work, any connected hardware must be in on the game. Obviously, knowing enough about the functionality of connected hardware might allow someone to write their own driver, or otherwise modify the interaction of the device with the OS, possibly circumventing the copy protection. Microsoft basically said to hardware manufacturers, "you don't have to sign the non-disclosure agreements, but if you don't, we won't certify your drivers, and therefore they won't work with Vista."

You might be thinking, "there's always Linux (or Mac)," but given Microsoft's near-monopoly on the desktop, how likely is it that hardware manufacturers will build two versions of their products, one complying with Microsoft's demands and another for the open-source community? Microsoft's attempt to shut down the flow of hardware information for copy-protection purposes, essentially reducing the PC to a DRM-laden multimedia appliance, could affect all computer hardware.

Apparently much of the DRM technology in Vista is dormant at the moment. The system is literally designed with the ability to encrypt data flowing across the machine's own internal buses, all to keep the user from circumventing the DRM. And that's only the tip of the iceberg. These features lie latent because current CPUs can't yet spare the processing power they demand.

Here's more, in Peter Gutmann's detailed analysis of Vista's content protection:
http://www.cypherpunks.to/~peter/vista.pdf

Very, very IMPRESSIVE!!!!

Imagine what other applications multiple GPUs could be used for.

Weather and climate simulation...

Nuclear weapons research...

All on the cheap!!

And as the third participant in the video demonstrates, there's always DOOM III for your spare-time amusement.