Author: Parra González, Luis Otto
Date accessioned: 2018-01-11
Date available: 2018-01-11
Date issued: 2015-09-07
ISBN: 9781450334631
URI: https://www.scopus.com/inward/record.uri?eid=2-s2.0-84960101148&doi=10.1145%2f2829875.2829931&partnerID=40&md5=7f7fa4102644bf7df9ca7dc4fd18ac0b
URI: http://dspace.ucuenca.edu.ec/handle/123456789/29043
Abstract: Technological advances in touch-based devices now allow users to interact with information systems in new ways, with gesture-based interaction being a popular recent addition. Many daily tasks can be performed on mobile devices and desktop computers using multi-stroke gestures. Scaling this type of interaction up to larger information systems and software tools is difficult because gesture definitions are platform-specific and the interaction is often hard-coded in the source code, which hinders its analysis, validation and reuse. To address this problem, we propose gestUI, a model-driven approach to the development of multi-stroke gesture-based user interfaces. The approach supports modelling gestures, automatically generating gesture catalogues for different gesture-recognition platforms, and user-testing the gestures. A model transformation automatically generates the user interface components that support this type of interaction for desktop applications (further transformations are under development). We applied our proposal to two cases: a form-based information system and a CASE tool. We include details of the underlying software technology to pave the way for other research in this area.
Language: en-US
Keywords: Customised Gesture; Gesture-Based Interaction; Model-Driven Engineering; User Interface
Title: Including multi-stroke gesture-based interaction in user interfaces using a model-driven method
Type: Article
DOI: 10.1145/2829875.2829931
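
Note: the record above does not detail how gestUI represents gestures or generates platform-specific catalogues. As a rough, hedged illustration only, the sketch below shows one way a platform-independent multi-stroke gesture model could be expressed and serialised to a simple textual catalogue entry; all class and field names (GestureModel, Stroke, toCatalogueEntry, etc.) are hypothetical and are not taken from the gestUI tool or paper.

```java
import java.util.List;

/** Hypothetical platform-independent model of a multi-stroke gesture. */
public class GestureModel {

    /** A single sampled point of a stroke. */
    record Point(double x, double y) {}

    /** One stroke: an ordered sequence of sampled points. */
    record Stroke(List<Point> points) {}

    /** A named gesture composed of one or more strokes. */
    record Gesture(String name, List<Stroke> strokes) {}

    /**
     * Emits a simple textual catalogue entry. A real generator would instead
     * target the input format of a specific gesture-recognition platform.
     */
    static String toCatalogueEntry(Gesture g) {
        StringBuilder sb = new StringBuilder("gesture " + g.name() + "\n");
        for (Stroke s : g.strokes()) {
            sb.append("  stroke:");
            for (Point p : s.points()) {
                sb.append(" (").append(p.x()).append(",").append(p.y()).append(")");
            }
            sb.append("\n");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Example: a two-stroke "plus" gesture defined by its sampled points.
        Gesture plus = new Gesture("plus", List.of(
                new Stroke(List.of(new Point(0, 5), new Point(10, 5))),  // horizontal stroke
                new Stroke(List.of(new Point(5, 0), new Point(5, 10)))   // vertical stroke
        ));
        System.out.print(toCatalogueEntry(plus));
    }
}
```

In a model-driven setting such as the one the abstract describes, a gesture model of this kind would be the source of model transformations that produce both the recognizer-specific catalogue and the user interface components, rather than being hand-written per platform.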