T(ether) offers real-time, 1:1 hand and face tracking that can foster collaboration across several devices.
What if virtual interactivity vis-à-vis the iPad or other touch-screen devices were more spatially aware? That is, how would it be enhanced if it not only let the user manipulate objects but supplemented that manipulation with hand and face tracking?
T(ether) is a new screen display that reads the movement of both the user’s hands and face, letting them move objects around in real time. It creates a 1:1 relationship between gestures in real space and those in the digital environment. The display also works across multiple devices, enhancing collaboration between individuals: more than one person can enter and manipulate the same virtual environment at once through their own device. Although the virtual world does not overlap real space (it simply tracks gestures in the former and translates them to the latter), it is appealing to imagine the technology being used in conjunction with augmented reality. Since AR is mostly confined to individual devices, T(ether) could conflate multiple overlaid worlds, further blurring the line between the real and the virtual.
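To make the 1:1 relationship concrete: a one-to-one mapping means a hand displacement in real space is applied to the grabbed virtual object with no gain or scaling. The sketch below is purely illustrative (the function and type names are hypothetical, not part of T(ether)'s actual implementation) and assumes hand positions arrive as 3D coordinates in metres.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def hand_displacement(hand_now: Vec3, grab_origin: Vec3) -> Vec3:
    """1:1 mapping: a 5 cm hand movement in real space moves the
    grabbed virtual object exactly 5 cm in the virtual scene
    (scale factor of 1, no gain applied)."""
    return Vec3(hand_now.x - grab_origin.x,
                hand_now.y - grab_origin.y,
                hand_now.z - grab_origin.z)

# Hypothetical frame: the hand has moved 5 cm to the right of where
# it first grabbed the object, so the object moves 5 cm as well.
grab = Vec3(0.10, 0.20, 0.50)
now = Vec3(0.15, 0.20, 0.50)
delta = hand_displacement(now, grab)
print(delta.x, delta.y, delta.z)
```

Because the mapping preserves distances exactly, each collaborator's device can apply the same displacement to a shared object without any per-device calibration of scale.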