Catching up on posting the column from Sunday:
TechMan believes that the way we communicate with computers is the key to when they will finally be everywhere.
For a while, the betting money was on voice recognition. The user was going to be able to talk to a computer and, no matter how many beers he or she had consumed, the computer would understand what was being asked of it. Despite the lengthy dialogue between HAL and Dave in "2001: A Space Odyssey," voice recognition never caught on.
It seems clear now that the breakthrough sense is touch. Touch interfaces for computers are starting to pop up everywhere. Touchscreens first appeared in the late 1960s, and Apple's PowerBook 500, introduced in 1994, had the first mass-market touchpad.
At the beginning, a particular part of the screen had to be pressed, often with a stylus. This is called simple touch. TechMan bought an early PalmPilot soon after it came out in 1996, and it used simple touch. Numerous other PDAs (personal digital assistants) imitated the PalmPilot, but they soon fell out of favor because of difficulties with their interfaces and the rise of smartphones that could do the same things plus make phone calls. However, simple touch is still used in ATMs, kiosks and lots of other places.
Then came multitouch (Apple Inc. has trademarked the term Multi-Touch, so TechMan will use an alternate spelling to avoid having to use the little trademark symbol), where any part of the screen is touch-sensitive. Multitouch also supports gestures, such as pinching to make an image smaller or swiping to go to the next screen.
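For the curious, the math behind a pinch gesture is surprisingly simple: track two fingertips and compare how far apart they are now with how far apart they started. Here is a minimal sketch in Python (the function name and coordinate format are illustrative, not any vendor's actual API):

```python
import math

def pinch_scale(start_points, end_points):
    """Return the zoom factor implied by a two-finger pinch.

    Each argument is a pair of (x, y) fingertip coordinates. The
    scale is the ratio of the fingers' final distance to their
    starting distance: greater than 1 means the fingers spread
    apart (zoom in), less than 1 means they pinched together
    (zoom out).
    """
    def distance(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    return distance(end_points) / distance(start_points)

# Fingers start 100 pixels apart and spread to 200 pixels apart,
# so the image should be drawn at twice its previous size.
print(pinch_scale([(0, 0), (100, 0)], [(0, 0), (200, 0)]))  # → 2.0
```

Real touch systems layer timing, thresholds and smoothing on top of this, but at heart a pinch is just that ratio applied to whatever is on screen.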
The first multitouch display was developed in 1982 at the University of Toronto, but it wasn't until 1999 that a company called FingerWorks brought the technology to market with the iGesture pad and the TouchStream keyboard. Apple Inc. bought FingerWorks in 2005.
In 2007, using FingerWorks technology, Apple brought out the iPhone, the first phone using multitouch, and the iPod Touch. Fueled by Apple's juggernaut marketing machine, multitouch took off and has been accelerating ever since.
Multitouch phones have been pouring onto the market, including the Storm from BlackBerry; the G1, also known as the Google phone; and the recently released Palm Pre.
Also in 2007, Microsoft brought out the Microsoft Surface, a computer embedded in a table with a large, flat, touch-responsive top that uses small cameras, rather than finger pressure or heat, to sense touch.
And even larger touch surfaces are being developed. You may recall seeing CNN's "Magic Wall," used by John King to report on the presidential election. It comes from a company called Perceptive Pixel.
Meanwhile, computer makers have jumped on the bandwagon. Asus includes a multitouch touchpad on its Eee PC 900 netbook, and Dell has a similar device on its Inspiron Mini line of netbooks. HP makes a TouchSmart line of desktops with touchscreens.
Microsoft's Windows 7 operating system, due on the market before Christmas, will support multitouch. Microsoft and others have invested $24 million in N-Trig Ltd., which will make hardware that takes advantage of Windows 7's multitouch support.
But what's coming next could really make using a computer almost second nature.
At the recent E3 gaming show, Microsoft demonstrated Project Natal, a camera device paired with software that senses movement and sound. It allows gamers to have full-body interaction with computer games without a controller. When such devices hit the market (and we don't know when that will be), they could take the need to touch out of multitouch.
Interestingly, in November at the Intel Research Pittsburgh lab on the campus of Carnegie Mellon University, TechMan saw a similar gesture-controlled game featuring Tetris projected on the wall and a player controlling the falling blocks by waving his arms.
So in the future, when you want to communicate with your computer, you can just reach out and touch it. Or maybe just wave at it.