Why I think Google's Project Glass lacks vision

(A rant hammered out in bed. Not spellchecked, fact checked, or cross checked.)

So a few weeks ago the world swooned over the latest exciting tech announcement from Google, Project Glass. But I have mixed feelings, which in my time-honored style I am only just now getting down on paper. I watched the video, and I sat there and thought: really? Is that it?

It's perplexing to me that a technology with as much potential as Project Glass should be introduced as nothing more than a glorified smartphone interface welded in front of your face, as if there were some pressing need to furnish hip, winsome 20-something New Yorkers with yet another technological marvel. But I guess earnest videos addressing what augmented reality might do for, say, those suffering age-related visual decline don't really get frontpaged on Reddit. (That's not fair. Reddit users would totally frontpage that, the soft-hearted dorks.) What I'm saying is, practically every developed country is facing a generation in which the old outnumber the young: a market both larger and more in need of AR prostheses than ukulele-strumming Brooklynites. It seems odd to me to ignore that application.

A reader told me that Google were attempting to "colonise the visual field". I can see how that makes sense from a depressingly commercial point of view, and it's plausible enough, but it strikes me that Google have often seen further than anyone else. Maybe, like so much of their output, they're building the sandbox and inviting us to play in it. Perhaps it's not Google's lack of vision that I'm disappointed by.

So, money-where-my-mouth-is time: what master plans do I have for Project Glass? The most obvious thing is this: if commercialised, Google goggles would represent the first serious advance in eye prosthetics since we began grinding lenses out of glass plates and hanging them on our noses. (OK, perhaps the first since retinal implants. They're neat too.) Who on Earth could be so crushingly dull as to see that only as a way of shrinking the distance between your computer screen and your eye? Off the top of my head, some applications of augmented reality glasses:

1. Correcting colour blindness. You're never going to be able to replicate perfect colour vision. But for the millions out there like myself who exhibit red-green colour blindness, it would be pretty easy to add a filter that helped us distinguish one muddy hue from another (a toy sketch of the idea follows this list). For those suffering total achromatopsia, you could add edge detection or even textured overlays to help them too.

2. I hate to break it to you, but you're going to get old. And when that happens, your night vision will consist of varying clouds of dark fog. What's more, your spatial perception will be out of whack. But you're not the only one who could benefit from filters that boost night vision and identify collision hazards (there's another toy sketch after this list): truck drivers would get a kick out of that too.

3. Why stop at being the best you can be? Why not be superhuman? Humans have the brain space to perceive tetrachromatic vision; we just don't have the eyes for it. Why not add a camera that can sense UV, or infra-red, or polarised light? I imagine firemen would love being able to tell where a fire was burning in a smoke-filled room. I'll leave it to you to suggest how surgeons might benefit from an overlay that highlighted contrast-filled veins, say.
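
For item 1, here's roughly what I mean, as a toy Python sketch rather than anything resembling a real daltonisation algorithm: it just nudges the confusable red and green hues apart in a single camera frame. The filename and the hue bands are invented for illustration.

```python
# Toy red-green assist filter: crudely rotate the hues a deuteranomalous
# viewer tends to confuse so they land further apart. Not a proper
# daltonisation algorithm, just the napkin version of the idea.
import colorsys

import numpy as np
from PIL import Image

def red_green_assist(frame, shift=0.15):
    """Nudge reds towards magenta and greens towards cyan in an RGB frame."""
    out = frame.astype(float) / 255.0
    height, width, _ = out.shape
    for y in range(height):
        for x in range(width):
            r, g, b = out[y, x]
            hue, light, sat = colorsys.rgb_to_hls(r, g, b)
            if hue < 0.16 or hue > 0.92:      # red-ish: push towards magenta
                hue = (hue - shift) % 1.0
            elif hue < 0.40:                  # green-ish: push towards cyan
                hue = hue + shift
            out[y, x] = colorsys.hls_to_rgb(hue, light, sat)
    return (out * 255).astype(np.uint8)

# "street_scene.jpg" stands in for a single frame off the glasses' camera.
frame = np.array(Image.open("street_scene.jpg").convert("RGB"))
Image.fromarray(red_green_assist(frame)).save("street_scene_assisted.jpg")
```

A real version would run per frame on the GPU and work in a proper perceptual colour space, but the principle is just that: remap the hues the wearer can't separate onto ones they can.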
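
For item 2, the brightness half really is napkin-simple; spotting collision hazards is a much heavier computer-vision problem, so this sketch only does the boost. The gamma value and the synthetic frame are invented for illustration.

```python
# Toy low-light boost: lift the shadows with a gamma curve (gamma < 1) so
# near-black murk becomes legible grey. Not a real night-vision pipeline.
import numpy as np

def night_boost(frame, gamma=0.45):
    """Brighten dark regions of an RGB frame by applying gamma < 1."""
    normalised = frame.astype(float) / 255.0
    return (np.power(normalised, gamma) * 255).astype(np.uint8)

# A synthetic dim frame: pixel values around 20/255, i.e. clouds of dark fog.
dim = np.full((480, 640, 3), 20, dtype=np.uint8)
print(night_boost(dim).max())  # roughly 81: the fog becomes a readable grey
```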

People go on about augmented reality as if it were nothing more than adding an iPad HUD to everything they saw. This strikes me as lacking in imagination. I mean, hello, there are at least four other senses you can start tinkering with. My pops has hearing aids that automatically shift the soundscape into the frequencies he hears best. What's that, if not augmented reality? When he picks up the phone, the aid detects it and pipes the sound to both his ears. That's seriously freaking cool.
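
Just to make that concrete (and this is a sketch, not how my pops's aids actually work): frequency shaping is essentially a per-band equaliser driven by an audiogram. Something like the following, with an invented gain profile, NumPy doing the maths offline where a real aid would use dedicated low-latency DSP, and a 4 kHz test tone standing in for speech.

```python
# Sketch of per-band frequency shaping: boost the bands a listener hears
# poorly. The gain profile below is invented, not a real audiogram.
import numpy as np

SAMPLE_RATE = 16_000  # Hz

def shape_spectrum(samples, gain_bands):
    """Apply per-band gains (in dB) to a mono float buffer via the FFT."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    gains = np.ones_like(freqs)
    for (lo, hi), db in gain_bands:
        gains[(freqs >= lo) & (freqs < hi)] = 10 ** (db / 20.0)
    return np.fft.irfft(spectrum * gains, n=len(samples))

# Invented profile: leave the lows alone, boost the highs that fade with age.
profile = [((250, 1000), 0.0), ((1000, 3000), 6.0), ((3000, 8000), 12.0)]

tone = np.sin(2 * np.pi * 4000 * np.arange(SAMPLE_RATE) / SAMPLE_RATE)
louder = shape_spectrum(tone, profile)
print(f"peak before: {tone.max():.2f}, after: {louder.max():.2f}")  # ~1.00 -> ~3.98
```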

What if diabetics heard alarm bells, literally, when their blood sugar levels dropped too low? (A lady friend suggested that there are certain bodily processes whose onset women might like a subtle warning of, too.) Why not choose to hear when a bus is approaching the stop outside your house? What if you could taste high blood pressure, or smell a smoke alarm?
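
The blood-sugar idea is really just sensory substitution: poll a sensor, map a dangerous reading to a sound. A back-of-the-envelope sketch, where read_glucose() and the polling loop are stand-ins I've invented; a real continuous glucose monitor would push readings over Bluetooth rather than be polled like this.

```python
# Toy "hear your blood sugar" monitor: a low reading becomes an audible alert.
# read_glucose() is a stand-in for a real continuous glucose monitor.
import random
import time

LOW_MMOL_PER_L = 4.0  # a commonly cited hypoglycaemia threshold

def read_glucose():
    """Pretend sensor: returns a blood glucose reading in mmol/L."""
    return random.uniform(3.0, 9.0)

def alert(reading):
    # On the glasses this would be a chime in the earpiece; here, a terminal bell.
    print(f"\aLow blood sugar: {reading:.1f} mmol/L. Eat something.")

def monitor(poll_seconds=60, cycles=5):
    for _ in range(cycles):
        reading = read_glucose()
        if reading < LOW_MMOL_PER_L:
            alert(reading)
        time.sleep(poll_seconds)

monitor(poll_seconds=1, cycles=5)
```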

That, to me, is the true excitement of augmented reality. Not just adding content to our field of vision, but a step towards some kind of technological synaesthesia.

Lol, sorry, but that still is ridiculously distant from actual augmented reality.

Actual AR looks like this:
Imagine you go to the biggest public place in your city, and you see a tower standing in the middle. It is 3D (not stereo 2D like in the cinemas, but actual 3D, so your eyes have to focus on the depth too), it is lit and shaded using lighting information from the environment (think ZBrush’s material creator plus light-direction detection), and the system internally has a full 3D model of the environment, built from the same radar/laser/whatever technology that, e.g., self-driving cars use.

Then you go up to it, open a casing with your hands, and push a big red button. (Ideally feeling the pressure. But that’s still a bit far out, since it requires a robotic exoskeleton to push against you.)

After you push the button, the whole environment switches over to a night look (even though it’s still day), with stars, lights and everything. Hovering disco balls appear out of nowhere, music starts (with a reverb rendered from the 3D model of the place), the floor tiles become lights that animate to the rhythm, and (simulated) people start to dance, without colliding with the real people at the place.

THAT is fuckin’ AUGMENTED REALITY!

And we already have all the technology to do it. (Hell, most of it has been done since the 80s. It was just too bulky and expensive.)

What Google has done is retarded shit.

By Evi1M4chine (not verified) on 01 Jul 2012 #permalink

P.S.: Give me the financing, and I WILL do what I described. Because that’s the only thing holding me back. (Small-minded imaginationless backwards retards who can’t see the advantages it would give them to finance this. They could take over the world, but nooooo…)

By Evi1M4chine (not verified) on 01 Jul 2012 #permalink