Tracking accuracy upgraded with multi-cam setup

We recently upgraded the input tracking of the useTable with two Gigabit Ethernet cameras from uEye. Each camera delivers a resolution of 1280 x 1024 pixels, and both camera images are stitched together into a total tracked image of 1280 x 2048 pixels. Stitching and tracking are handled by CCV 1.5. The higher resolution noticeably improves the detection of our complex fiducial patterns.
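
As a rough illustration of the stitching step (a minimal sketch only, not the actual CCV 1.5 pipeline; the frame-grabbing code and array shapes are assumptions), the two camera frames are simply stacked vertically into one tracking image:

```python
import numpy as np

# Hypothetical sketch: each uEye camera is assumed to deliver an
# 8-bit grayscale frame of 1280 x 1024 pixels.
CAM_WIDTH, CAM_HEIGHT = 1280, 1024

def stitch_frames(top: np.ndarray, bottom: np.ndarray) -> np.ndarray:
    """Stack the two camera frames vertically into one 1280 x 2048 tracking image."""
    assert top.shape == bottom.shape == (CAM_HEIGHT, CAM_WIDTH)
    return np.vstack((top, bottom))  # resulting shape: (2048, 1280)

# Dummy frames standing in for the two camera feeds:
frame_a = np.zeros((CAM_HEIGHT, CAM_WIDTH), dtype=np.uint8)
frame_b = np.zeros((CAM_HEIGHT, CAM_WIDTH), dtype=np.uint8)
stitched = stitch_frames(frame_a, frame_b)
print(stitched.shape)  # (2048, 1280)
```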

useTable Workshop

In February, 12 pupils from a local school visited the C-LAB in order to participate in the first useTable workshop. During the one-day event they designed different application ideas for collaborative multitouch setups. They also tested their programming skills in a multitouch ant-rugby contest.

Student tutor team playing ant-rugby

dSensingNI presented at TEI Conference in Kingston, ON

The dSensingNI framework was presented at the Sixth International Conference on Tangible, Embedded and Embodied Interaction (TEI 2012) in Kingston, ON, Canada. It allows the use of a depth-sensing camera (e.g. Kinect) to track multitouch gestures and tangible object movements in 3D space. We use this approach for advanced interaction techniques on the useTable. For more information and videos please visit the dSensingNI project website at www.dsensingni.net.
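
To give a feel for the general idea of depth-based touch sensing (a generic sketch only, not the dSensingNI implementation; the millimetre values and thresholds below are assumptions), touch candidates can be found by comparing a live depth frame against a background depth map of the empty table surface:

```python
import numpy as np

# Illustrative sketch: assumes a Kinect-style depth frame in millimetres and a
# previously captured background depth map of the empty table surface.
TOUCH_MIN_MM, TOUCH_MAX_MM = 5, 25   # hypothetical height band above the surface

def touch_mask(depth: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Mark pixels that lie slightly above the table surface, i.e. fingertips
    within the touch band."""
    height_above = background - depth            # positive where something is above the table
    return (height_above > TOUCH_MIN_MM) & (height_above < TOUCH_MAX_MM)

# Dummy data standing in for real depth frames:
bg = np.full((480, 640), 1200, dtype=np.int32)   # empty table ~1.2 m from the camera
frame = bg.copy()
frame[200:210, 300:310] -= 15                    # a fingertip 15 mm above the surface
print(touch_mask(frame, bg).sum())               # number of candidate touch pixels
```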

Presentation of dSensingNI at TEI2012