We have not pushed the code further than is visible in the video.
I was using the IntuiFace project as an advanced test-bed for our TUIO "contact" and "object" server.
The IntuiFace code I employed was unchanged from Seb's, so we get the object location and orientation effects, but I didn't add support for touches/contacts in parallel.
I can see from other applications how objects/shapes on the touchscreen can drive control/menu widgets. Location, rotation and finger interactions on these "physical widgets" permit very rapid, flexible and intuitive control of the program/presentation.
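As a rough sketch of what handling objects and contacts in parallel would involve, here is a minimal parser for the TUIO 1.1 "set" messages of the /tuio/2Dobj (object) and /tuio/2Dcur (contact/cursor) profiles. The argument layouts follow the TUIO 1.1 specification; the function names and the dictionary fields are my own illustration, not part of IntuiFace or any TUIO library:

```python
# Hypothetical parsers for TUIO 1.1 "set" messages (argument order per the
# TUIO 1.1 spec). A client would register these for the /tuio/2Dobj and
# /tuio/2Dcur OSC addresses to track objects and contacts in parallel.

def parse_2dobj_set(args):
    """Parse a /tuio/2Dobj 'set' argument list:
    set s i x y a X Y A m r
    (session id, class/marker id, position, angle,
     velocities, rotation velocity, accelerations)."""
    cmd, s, i, x, y, a, X, Y, A, m, r = args
    assert cmd == "set"
    return {"session_id": s, "class_id": i, "x": x, "y": y, "angle": a,
            "vel": (X, Y), "rot_vel": A, "motion_accel": m, "rot_accel": r}

def parse_2dcur_set(args):
    """Parse a /tuio/2Dcur 'set' argument list:
    set s x y X Y m
    (session id, position, velocities, motion acceleration)."""
    cmd, s, x, y, X, Y, m = args
    assert cmd == "set"
    return {"session_id": s, "x": x, "y": y,
            "vel": (X, Y), "motion_accel": m}

# Example: a fiducial with class id 4 at screen centre, rotated ~90 degrees.
obj = parse_2dobj_set(["set", 12, 4, 0.5, 0.5, 1.5708,
                       0.0, 0.0, 0.0, 0.0, 0.0])
print(obj["class_id"], obj["angle"])
```

With both parsers wired up, an object's angle could drive a rotary widget while the contact positions drive its buttons, which is the kind of "physical widget" interaction described above.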