Programming Embodied Interactions with a Remotely Controlled Educational Robot
Contemporary research has explored educational robotics, but it has not examined the development of computational thinking in the context of programming embodied interactions. Beyond the robot’s goal and how it will interact with its environment, another important consideration is whether and how the user will physically interact with the robot. We recruited 36 middle school students for a six-session robotics curriculum designed to develop their computational thinking. Participants developed interfaces for the remote control of a robot using interaction styles ranging from low to high embodiment, such as touch, speech, and hand and full-body gestures. We measured students’ perception of computing, examined their computational practices, and assessed the development of their computational thinking skills by analyzing the sophistication of the projects they created during a problem-solving task. We found that students who programmed combinations of low-embodiment interfaces, or interfaces with no embodiment, produced more sophisticated projects and adopted more sophisticated computational practices than those who programmed full-body interfaces. These findings suggest a possible tradeoff between the appeal and the cognitive benefit of rich embodied interaction with a remotely controlled robot. In future work, educational robotics research and competitions might be complemented with a hybrid approach that blends traditional autonomous robot movement with student enactment.
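The abstract does not specify the students’ programming platform or robot API, so the following is only a loose, hypothetical sketch of what “programming an embodied interaction” for remote control can look like: mapping a touch press, a recognized spoken word, or a device-tilt gesture to drive commands. The names `DriveCommand` and `command_from_event` are illustrative, not from the study.

```python
# Hypothetical sketch (not the study's platform or API): translating embodied
# input events of increasing embodiment -- touch, speech, tilt gesture -- into
# drive commands for a remotely controlled robot.

from dataclasses import dataclass


@dataclass
class DriveCommand:
    left_speed: int   # -100..100, percent of full motor power
    right_speed: int  # -100..100, percent of full motor power


def command_from_event(event_type: str, value) -> DriveCommand:
    """Translate a single user-interface event into a robot drive command."""
    if event_type == "touch":  # on-screen arrow buttons (no embodiment)
        return {
            "up": DriveCommand(80, 80),
            "down": DriveCommand(-80, -80),
            "left": DriveCommand(-50, 50),
            "right": DriveCommand(50, -50),
        }.get(value, DriveCommand(0, 0))
    if event_type == "speech":  # recognized keyword (low embodiment)
        alias = {"forward": "up", "back": "down"}.get(value, value)
        return command_from_event("touch", alias)
    if event_type == "tilt":  # (pitch, roll) in degrees from a handheld device
        pitch, roll = value
        forward = max(-100, min(100, int(pitch * 2)))
        turn = max(-100, min(100, int(roll * 2)))
        return DriveCommand(forward - turn, forward + turn)
    return DriveCommand(0, 0)  # unknown event: stop the robot


if __name__ == "__main__":
    # Example events, one per interaction style.
    for event in [("touch", "up"), ("speech", "back"), ("tilt", (20.0, -10.0))]:
        print(event, "->", command_from_event(*event))
```

In a student project, the resulting `DriveCommand` would be sent to the robot over whatever wireless link the platform provides; the interesting design decision the curriculum targets is the mapping itself, i.e., how much bodily movement the interface demands from the user.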
Merkouris, A. and Chorianopoulos, K. 2019. Programming Embodied Interactions with a Remotely Controlled Educational Robot. ACM Trans. Comput. Educ. 19, 4, 40:1–40:19.