Jeff Han, a research scientist at NYU, demonstrated a prototype of a gesture-driven user interface at TED 2006. The “interface-free” UI (I hate terms like that) is shown off in several demos, most of which lean toward data visualization in interesting ways. Apparently, the device

...is force-sensitive, and provides unprecedented resolution and scalability, allowing us to create sophisticated multi-point widgets for applications large enough to accommodate both hands and multiple users.

That’s pretty cool, to say the least. A ten-minute video can be found on YouTube.