Programming touch and full-body interaction with a remotely controlled robot in a secondary education STEM course
Contemporary research has introduced educational robotics into the classroom, but few studies have examined the effects of alternative embodied interaction modalities on computational thinking and science education. Twenty-six middle school students were asked to program interfaces for controlling the heading and speed of a robot using two types of embodied interaction modalities. We compared touch and full-body gestures with autonomous control, which requires no embodied interaction. We assessed the development of the students' computational thinking skills by analyzing the projects they created during a problem-solving task and examined their understanding of science concepts related to kinematics. We found that novice students preferred full-body interfaces, while advanced students moved toward more disembodied and abstract computational thinking. These findings might help target computing and science education activities to students of the appropriate age and ability level.
Merkouris, A. and Chorianopoulos, K. 2018. Programming touch and full-body interaction with a remotely controlled robot in a secondary education STEM course. Proceedings of the 22nd Pan-Hellenic Conference on Informatics, ACM, 225–229.