Tap the ShapeTones: Exploring the Effects of Crossmodal Congruence in an Audio-Visual Interface
Oussama Metatla, Nuno Correia, Fiore Martin, Nick Bryan-Kinns & Tony Stockman. 2016.
Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems
Abstract
There is growing interest in the application of crossmodal perception to interface design. However, most research has focused on task performance measures and often ignored user experience and engagement. We present an examination of crossmodal congruence in terms of performance and engagement in the context of a memory task of audio, visual, and audio-visual stimuli. Participants in a first study showed improved performance when using a visual congruent mapping that was cancelled by the addition of audio to the baseline conditions, and a subjective preference for the audio-visual stimulus that was not reflected in the objective data. Based on these findings, we designed an audio-visual memory game to examine the effects of crossmodal congruence on user experience and engagement. Results showed higher engagement levels with congruent displays with some reported preference for potential challenge and enjoyment that an incongruent display may support, particularly for increased task complexity.
Citation
Metatla, O., Correia, N. N., Martin, F., Bryan-Kinns, N., & Stockman, T. (2016). Tap the ShapeTones: Exploring the effects of crossmodal congruence in an audio-visual interface. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16), pp. 1055–1066. New York, NY, USA: Association for Computing Machinery. https://doi.org/10.1145/2858036.2858456
BibTeX
@inproceedings{10.1145/2858036.2858456,
  author    = {Metatla, Oussama and Correia, Nuno N. and Martin, Fiore and Bryan-Kinns, Nick and Stockman, Tony},
  title     = {Tap the ShapeTones: Exploring the Effects of Crossmodal Congruence in an Audio-Visual Interface},
  year      = {2016},
  isbn      = {9781450333627},
  publisher = {Association for Computing Machinery},
  address   = {New York, NY, USA},
  url       = {https://doi.org/10.1145/2858036.2858456},
  doi       = {10.1145/2858036.2858456},
  abstract  = {There is growing interest in the application of crossmodal perception to interface design. However, most research has focused on task performance measures and often ignored user experience and engagement. We present an examination of crossmodal congruence in terms of performance and engagement in the context of a memory task of audio, visual, and audio-visual stimuli. Participants in a first study showed improved performance when using a visual congruent mapping that was cancelled by the addition of audio to the baseline conditions, and a subjective preference for the audio-visual stimulus that was not reflected in the objective data. Based on these findings, we designed an audio-visual memory game to examine the effects of crossmodal congruence on user experience and engagement. Results showed higher engagement levels with congruent displays with some reported preference for potential challenge and enjoyment that an incongruent display may support, particularly for increased task complexity.},
  booktitle = {Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems},
  pages     = {1055--1066},
  numpages  = {12},
  keywords  = {audio-visual display, crossmodal congruence, games, spatial mappings, user engagement, user experience},
  location  = {San Jose, California, USA},
  series    = {CHI '16}
}