Multi-touch pressure-sensitive interface

A quick Google search says that I'm several months behind the times (it was posted to /. in Oct 2006), but today was the first I'd heard about it, so I thought I'd pass it along.

Jeff Han (and others?) is working on a new touch-screen-like device that supports more natural human gestures for interacting with a computer. For instance: move a picture by dragging it, resize it by grabbing two corners and stretching it, scale the "desktop" by selecting two points on it and stretching it, bring up a keyboard and type directly on the device itself (you can also scale and place the keyboard interactively), ditto pan/rotate/zoom mapping applications, etc.
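The two-finger resize gesture comes down to simple arithmetic: the object scales by the ratio of the new distance between the fingers to the old one, anchored at the midpoint between them. Han's actual code isn't public, so here's just a minimal sketch of that idea, assuming touch points are plain (x, y) tuples:

```python
import math

def pinch_scale(old_a, old_b, new_a, new_b):
    """Scale factor implied by a two-finger pinch: ratio of the new
    finger-to-finger distance to the old one."""
    return math.dist(new_a, new_b) / math.dist(old_a, old_b)

def pinch_midpoint(a, b):
    """Midpoint between the two touch points, a natural anchor
    to scale the object around so it stays under the fingers."""
    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

# Fingers start 2 units apart and spread to 4 units apart:
# the object doubles in size (scale factor 2.0).
s = pinch_scale((0, 0), (2, 0), (0, 0), (4, 0))
m = pinch_midpoint((0, 0), (4, 0))
```

A real implementation would also derive a rotation angle from the change in direction of the finger-to-finger vector, which is how the pan/rotate/zoom map gestures in the demo presumably work.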

Here's a link to the Flash video of the keynote that was presented at the 2006 Technology Entertainment Design (TED) conference. See also the researcher's web page.