Error-proof, High-performance, and Context-aware Gestures for Interactive Text Edition. Proceedings of the 2013 annual conference extended abstracts on Human factors in computing systems (CHI EA), 2013, pp. 1227-1232. [A*]
We present a straightforward solution for incorporating text-editing gestures into mixed-initiative user interfaces (MIUIs). Our approach provides (1) disambiguation from handwritten text, (2) editing context, (3) virtually perfect accuracy, and (4) a trivial implementation. An evaluation study with 32 e-pen users showed that our approach is suitable for production-ready environments. In addition, performance tests on a desktop PC and on a mobile device revealed that gestures are recognized very quickly (0.1 ms on average). Taken together, these results suggest that our approach can help developers deploy simple yet effective, high-performance text-editing gestures.