Author: Parra González, Luis Otto
Dates: 2018-01-11; 2018-01-11; 2014-04-07
Identifier: 16130073
URI: https://www.scopus.com/inward/record.uri?eid=2-s2.0-84924991846&partnerID=40&md5=8811528151ba9a07b310c614bddeb568
URI: http://dspace.ucuenca.edu.ec/handle/123456789/22051

Abstract: Currently, several software development suites include tools for user interface design and implementation (mainly through source-code programming). Some of these tools are multi-platform and multi-style; that is, they allow the specification of devices (e.g., computer, notebook, smartphone) and of user interaction styles (e.g., based on gestures, voice, mouse and keyboard). Among these styles, gesture-based interaction is neglected, despite the proliferation of gesture-recognizing devices. Given the variety of human-computer interaction styles currently available, information on these styles must be included in the software requirements specification to obtain a complete specification prior to code generation. In this paper, we propose the design of a model-driven method and tool that allows specifying gesture-based interactions and then generates a gesture-based interface requirements specification. We intend our proposal to be interoperable with existing methods and tools. The research method follows design science, and we plan to validate our proposals by means of technical action-research.

Language: en-US
Keywords: Gesture-Based Interaction; Gesture-Based Interface Requirements Specification; Model-Driven Method; Requirements Engineering; User Interface
Title: A model-driven method for gesture-based interface requirements specification
Type: Article