How do gestural interactions support visuospatial cognition in STEM learning?
British Journal of Educational Technology
Published online on February 17, 2026
Abstract
Existing literature shows that touchscreen devices can support learning of visuospatially rich STEM content. However, the mechanisms by which touchscreen devices support cognition in learning remain unclear. This study examined how gestural interactions afforded by touchscreen devices support visuospatial cognition in STEM learning by comparing touch-based gestural input with mouse input. This quasi-experimental study was conducted during the implementation of an elementary cryptology and cybersecurity curriculum in afterschool settings. Data included measures of student performance, including accuracy, response time and interaction efficiency, across two visuospatial learning activities. Students' cognitive individual differences, including verbal and visuospatial working memory, visual processing speed and cognitive inhibitory control, were measured. A learning test was administered as a pretest and a posttest. Results show that in the simpler learning task, students using touch-based gestural input demonstrated higher interaction efficiency, longer response times and more errors. In the more challenging task, differences between the experimental and comparison groups were not significant. Working memory capacity moderated the accuracy of students' learning performance, with the influence of visuospatial and verbal working memory capacity varying based on the type of representation used in each activity. Fitts' throughput mediated the association between input type and learning outcomes of the curriculum.
This study advances our understanding of the cognitive mechanisms underlying the use of touchscreen devices in STEM learning by demonstrating that efficient gestural interaction supports visuospatial cognition and learning.

Practitioner notes

What is already known about this topic

Visuospatial cognition plays a critical role in learning STEM content.
Visuospatial ability predicts performance across a wide range of STEM learning tasks.
Visuospatial ability can be improved through training.
Touchscreen devices can support student performance on visuospatial tasks.
Touchscreen devices can improve learning and motivation.

What this paper adds

This study explores cognitive mechanisms underlying the use of touchscreen devices in STEM learning.
Fitts' throughput is introduced as a measure of how efficiently learners interact with virtual representations on the interface.
Learners who interact more efficiently with virtual representations on the interface tend to achieve better learning outcomes in a visuospatially rich, elementary-level cryptology and cybersecurity curriculum.
Touch-based gestural interactions enhance visuospatial learning through improved interaction efficiency.
Working memory capacity moderates learning accuracy, with visuospatial and verbal working memory showing varying interaction effects depending on the representation used in digital activities.

Implications for practice and policy

Teachers can encourage and guide students to use gestures in both digital and unplugged learning environments, for both abstract and concrete STEM tasks.
Designers should create intuitive and responsive gesture-based interfaces that align with students' natural interactions and learning needs.
Educators may differentiate instruction based on students' cognitive profiles by tailoring activities to maximize engagement and comprehension.
Designers could develop adaptable tools that accommodate diverse cognitive abilities, ensuring equitable access to learning opportunities.
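For readers unfamiliar with the measure, Fitts' throughput is conventionally computed (following the Shannon formulation used in ISO 9241-9 style evaluations) as the index of difficulty of a pointing movement divided by the movement time. The study's exact computation may differ; the sketch below is a minimal illustration, with the function name, pixel units and example values chosen here for demonstration only:

```python
import math

def fitts_throughput(distance: float, width: float, movement_time: float) -> float:
    """Throughput in bits/s under the Shannon formulation of Fitts' law.

    distance: distance to the target (e.g. pixels)
    width: target width along the axis of movement (same units as distance)
    movement_time: time taken to acquire the target, in seconds
    """
    index_of_difficulty = math.log2(distance / width + 1)  # bits
    return index_of_difficulty / movement_time             # bits per second

# Example: a 300 px movement to a 20 px target completed in 1 s
# has an index of difficulty of log2(16) = 4 bits, so throughput = 4 bits/s.
print(fitts_throughput(300, 20, 1.0))
```

Higher throughput indicates that a learner covers more pointing difficulty per unit time, which is why it can serve as an interaction-efficiency measure when comparing touch and mouse input.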