Multimodality in Gaming
All games, regardless of form, evoke a range of senses that, in combination, construct the game experience. Games engage players through a variety of modalities, such as audio, visual, and haptic cues. This combination of modalities is filtered through the player's cognitive perception as a holistic experience; Theme 3 therefore builds on the idea that modalities interact both with one another and with the player. The term "multimodality" is crucial here because it accounts for the fact that separate components work together to construct the holistic player experience and immersion in a game. Research in Theme 3, Multimodality in Gaming, investigates exactly how these modes of engagement interact with one another and how they shape the totality of the player experience.
Researchers theorize that a greater understanding of how games affect human sensation, perception, and cognition will contribute to the development of more immersive and evocative play experiences. An understanding of these physiological processes will also allow researchers to gain deeper insight into other questions about games user research, game mechanics, game design, and even gaming culture. A core question driving research in Theme 3 is therefore:
How can we leverage the human perceptual system to improve the game experience and game design?
Dr. Chris Joslin, leader of Theme 3, directed an interdisciplinary research team on this topic, with each member focusing on a particular game modality. Notably, Dr. Joslin's work focused on the integration of computer graphics and spatial audio, while Dr. Karen Collins, a contributor and Canada Research Chair in Interactive Audio, co-led several projects exploring how sound interacts with other modalities in games. Other team members each concentrated on a specific domain, such as haptic inputs and outputs, fidelity, aesthetic elements of games, or cross-modal perception and interaction, and integrated their research with that of others in the theme.
Two goals informed the directions of IMMERSe research on multimodality in gaming: the first was to better understand players' responses to stimuli in games; the second was to apply multimodality research findings to game design and the development of more immersive games. An example of research toward the first goal is a 2014 project conducted by IMMERSe researchers Dixon, Harrigan, Santesso, Graydon, Fugelsang, and Collins, "The Impact of Sound in Modern Multiline Video Slot Machine Play." In this study, they examined the psychological and physiological impacts of sound in slot machine games, particularly the sounds used to signal a game "win." They found that the sounds accompanying multiline slot machines affected participants' arousal both psychophysically and psychologically. "The modality of the game sound, and its corresponding arousal effect, was leveraged to positively reinforce the behavior of the player, encouraging them to continue playing even if they were losing" (Collins et al.). This study generated valuable insights into the effects of game sound on human sensation, perception, and cognition, as well as into factors contributing to gambling behavior.
With regard to the second goal, IMMERSe researchers made significant contributions toward practices for effective and immersive game design. Findings presented at Ed-Media: World Conference on Educational Media and Technology, the ACM CHI Conference on Human Factors in Computing Systems, and IEEE CSSE: 20th International Symposium on Computer Science and Software Engineering exemplify research toward designing and developing more immersive games. These findings included research on frameworks for multimodal education systems (GhasemAghaei, Arya, & Biddle), models for affective motion (Etemad & Arya), and design practices for multimodal mathematical learning (GhasemAghaei, Arya, & Biddle).
A seminal work in this theme was written by Jason Hawreliak, co-founder of First Person Scholar and Assistant Professor of Game Studies at Brock University. In his book "Multimodal Semiotics and Rhetoric in Videogames," he discusses the rhetorical and semiotic frameworks that allow us to understand multimodality in games (Hawreliak).
Works Cited

Collins, Karen, et al. "The Impact of Sound in Modern Multiline Video Slot Machine Play." Journal of Gambling Studies, Springer, 2013, www.ncbi.nlm.nih.gov/pmc/articles/PMC4225056/.
GhasemAghaei, Reza, et al. "Design Practices for Multimodal Affective Mathematical Learning." In IEEE CSSE: 20th International Symposium on Computer Science and Software Engineering, Tabriz, Iran, 2015.
GhasemAghaei, Reza, et al. "The made framework: Multimodal software for affective education." In Ed-Media: World Conference on Educational Media and Technology, Montreal, Canada, 2015. Association for the Advancement of Computing in Education.
Hawreliak, Jason. Multimodal Semiotics and Rhetoric in Videogames. Routledge, 2018.
Etemad, Ali, et al. "Mining Expert-driven Models for Affective Motion." ACM CHI Workshop on Gesture-based Interaction Design: Communication and Cognition, 2014.