Several software development suites currently include
tools for user interface design and implementation (mainly through
source-code programming). Some of these tools are multi-platform and multi-style; that is,
they allow the specification of target devices, e.g. computer, notebook, smartphone, and
of user interaction styles, e.g. based on gestures, voice, or mouse and keyboard.
Among these styles, gesture-based interaction is neglected, despite the proliferation
of gesture-recognizing devices. Given the variety of styles of human--computer
interaction currently available, it is necessary to include information
on these styles in the software requirements specification to obtain a complete specification
prior to code generation. In this paper, we propose the design of a
model-driven method and tool that allows specifying gesture-based interactions
and then generates a gesture-based interface requirements specification. We intend
our proposal to be interoperable with existing methods and tools. Our research
method follows design science, and we plan to validate our proposals by
means of technical action research.