Advancing Digital Scholarship with Touch Surfaces and Large-Format Interactive Display Walls

Description

This project comprises a multi-stage program of research, implementation, and evaluation of collaborative, interactive, large-screen, gesture-driven displays used to enhance a wide range of scholarly activities and creative expression. Although the project includes research topics such as seamless imaging, touch-enabled computing, parallel rendering, design methodologies, and intelligent networking, our main focus is camera-based interaction: studying how to track people's locations, their features, hand-held objects, and hand gestures, and using this information to trigger actions and to render imagery and sound appropriately, enabling a compelling multi-user experience with the computer system.
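To make the interaction loop concrete, the sketch below shows, in Python with OpenCV, the general shape of such a camera-driven pipeline: grab frames from a camera, detect a person-related feature (here a face, standing in for the project's richer tracking of locations, objects, and gestures), and pass the result to an action hook that would drive rendering and sound on the wall. This is an illustrative assumption, not the project's actual system; the detector choice and the `trigger_action` hook are hypothetical.

```python
# Minimal sketch of a camera-based interaction loop (illustrative only).
# Assumes a single webcam and the OpenCV library.
import cv2

def trigger_action(region):
    """Hypothetical hook: map a detected region to a display/sound action."""
    x, y, w, h = region
    print(f"viewer detected at ({x}, {y}), size {w}x{h} -> update rendering")

def main():
    # The Haar cascade face detector bundled with OpenCV stands in here for
    # the project's person/feature/gesture tracking.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cam = cv2.VideoCapture(0)  # first attached camera
    try:
        while True:
            ok, frame = cam.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Each detected region would drive imagery/sound on the wall.
            for region in detector.detectMultiScale(gray, 1.1, 5):
                trigger_action(region)
            cv2.imshow("preview", frame)
            if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
                break
    finally:
        cam.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```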
As an initial accomplishment, we have constructed the first version of our scalable, high-resolution display wall system at the LEMS laboratory in order to conduct the early stages of our research, with the support of a seed grant awarded by Brown University.