At the risk of going too far off topic and without intending to be too harsh: at least in the video, that interface looks nice, but there is too much lag between when a tap/gesture is made and when the interface reflects it. This breaks the direct manipulation metaphor and is, I think, a showstopper. The pinch to zoom example was particularly off-putting: how can I know how much I'm zooming when the interface doesn't track my gesture? I cannot imagine being happy with pinching, waiting a second to see what's happened, then pinching and waiting to adjust (repeat until I get it right or am too frustrated).
I don't think there's too much danger in going off-topic if the conversation is interesting and has some substance.
Yeah, I think lag in touchscreen interaction is a huge problem, and it's largely a result of hardware. We've had to ship on Atom-based hardware with 2003-era DX9 graphics. Even the latest Intel chips with DX10 are horridly slow when it comes to graphics.
The other side is the actual touchscreen hardware. The machine used in the video is an HP TouchSmart, which uses an optical touchscreen panel. These panels have a latency of around 100-200ms, and that's before we even start processing the touch event information. Capacitive touch sensors are much better, but they're expensive to manufacture above about 10 inches.
In the end, lag and accuracy are limited by the low-cost hardware OEMs use, and there will always be some trade-off between hardware cost and performance.