Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '14), ACM, New York, NY, 4107-4116
Multi-touch and tangible interfaces provide unique opportunities for enhancing learning and discovery with big data. However, existing interaction techniques are limited when manipulating large data sets. Our goal is to define novel interaction techniques for multi-touch and tangible interfaces that support the construction of complex queries over big data. In this paper, we present results from a study investigating the use of gestural interaction with active tokens for manipulating large data sets. In particular, we studied user expectations of a hybrid tangible and gestural language for this design space. Our main results include a vocabulary of user-defined gestures for interaction with active tokens, which extends beyond familiar multi-touch gestures; a characterization of the design space of gestural interaction with active tokens; and insight into participants' mental models, including common metaphors. We also present implications for the design of multi-touch and tangible interfaces with active tokens.