Using the Engagement Profile to Design an Engaging Robotic Teaching Assistant for Students
<p>Figure 1. Basic concept: a robot teaching assistant can support learning in a master’s-level engineering course.</p>
<p>Figure 2. The dimensions of the Engagement Profile explained with short definitions, adapted to the teaching case. To assign a value to a property, find the number adjacent to the phrases that fit best.</p>
<p>Figure 3. Classroom setup.</p>
<p>Figure 4. The Engagement Profile for the learning experience with the robotic teaching assistant. The green areas show the preferred values; the blue hatches show the Engagement Profile of the robotic teaching assistant during the first week.</p>
<p>Figure 5. Some examples of the robot’s capabilities being demonstrated. (<b>a</b>) Reading: students gathered around the robot during a free exploration session to participate in quizzes. (<b>b</b>) Greetings: the robot smiled when saying hello and closed its eyes as a metaphor for going to sleep when saying goodbye. (<b>c</b>) Remote operation: a former student and the second author, in Norway, speak through the robot. (<b>d</b>) Clarification: the robot shows some different kinds of graphs while the teacher speaks. (<b>e</b>) Alerting: the robot looks at the teacher while issuing a reminder. (<b>f</b>) Motion: the robot takes a kit from the teacher to hand to students.</p>
<p>Figure 6. Weekly change-diagrams. Colour code: blue = stay; green = increase; red = decrease.</p>
<p>Figure 7. Some examples of lessons learned. (<b>a</b>) Reading: the robot’s monitor showing students waving to vote on learning activities is mostly dark; students described feeling shy about waving out of fear that they would act against the wishes of others. (<b>b</b>) Greeting: we explored designing different waving behaviors, as the robot cannot move its arm like a human, but the robot’s farewell wave at the first lecture was described by a student as resembling a Nazi salute. (<b>c</b>) Clarification: the robot’s speech recognition was not robust enough to hear some of the students at the back of the room during quizzes. (<b>d</b>) Alerting: a student reminded the class about an outing five minutes ahead of schedule, so the robot’s reminder was not used. (<b>e</b>) Remote operation: some pauses used to allow the robot to more easily distinguish its own movements from students’ movements interfered with handing out kits to the students. (<b>f</b>) Motion: the robot was too slow when handing out an attendance sheet on the first day and dropped the sheet.</p>
Abstract
1. Introduction
2. Related Work
3. Materials and Methods
3.1. Robot Teaching Assistant Capabilities
3.2. The DEIS Course
3.3. The Baxter Robot
3.4. The Engagement Profile
4. Study
4.1. Setup of the Study
4.2. Description of the Experiment
4.2.1. Day 1
- C1
- Reading: quiz content was split between the lecture slides and the robot, based on the assumption that the slides would be better for clearly communicating some complex information such as equations, while the robot would be more interesting in general for communicating simpler content such as quiz questions. Six slides in the lecture presentation were allocated for quizzes. On reaching a quiz slide, the teacher pressed a button on the GUI to trigger the robot to ask questions and show a puzzled face on its display. Quiz topics included computational logic and time complexity. For example, one quiz slide showed a deterministic and non-deterministic state machine and some strings; the robot asked the students to consider which strings would be accepted by each.
- C2
- Greeting: the robot was set up to introduce itself at the beginning of class, stating its name, describing its role as teaching assistant, and priming students’ expectations that it was a work in progress, while waving a hand and smiling; at the end of class it said thank you and goodbye, again waving.
- C3
- Alerting: the robot looked toward the teacher and stated that the teacher had forgotten to explain a topic.
- C4
- Remote operation: students were invited during a break to teleoperate the robot using a handheld controller.
- C5
- Clarification: the robot automatically showed some example images in its display based on recognizing keywords spoken by the teacher: specifically, the names of some common charts, such as ’Venn diagram’, ’histogram’, ’pie chart’, and ’Gantt chart’.
- C6
- Motion: the robot took an attendance sheet from the teacher in its gripper and moved forward to hand it to the nearest student.
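The clarification behaviour (C5) amounts to spotting known chart names in the teacher’s recognized speech and selecting a matching image for the robot’s display. A minimal Python sketch of that matching step follows; it assumes a speech recognizer already yields a text transcript, and the image file paths are illustrative, not taken from the study.

```python
from typing import Optional

# Keyword-to-image mapping for the clarification behaviour (C5).
# The chart names come from the study; the image paths are assumptions.
CHART_IMAGES = {
    "venn diagram": "images/venn.png",
    "histogram": "images/histogram.png",
    "pie chart": "images/pie.png",
    "gantt chart": "images/gantt.png",
}

def select_clarification_image(transcript: str) -> Optional[str]:
    """Return the image path for the first known chart name in the transcript."""
    text = transcript.lower()
    for keyword, image in CHART_IMAGES.items():
        if keyword in text:
            return image
    return None
```

In a live system this function would be called on each transcript chunk produced by the recognizer, and a hit would trigger updating the robot’s display.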
4.2.2. Day 2
- C1
- Reading: six quizzes were conducted, on circuits, connectors, computers, math, and programming languages.
- C2
- Greeting: the robot greeted the class at the beginning and said goodbye at the end of the lecture.
- C3
- Alerting: the robot advised the teacher at one point that a description was not clear.
- C4
- Remote operation: the students listened to a former master’s student describe her experience by speaking remotely through the robot.
- C5
- Clarification: the robot recognized keywords spoken by the students and displayed related images on its display.
- C6
- Motion: The robot shook hands with students who wished to do so during a break.
4.2.3. Day 3
- C1
- Reading: quizzes on math, pattern recognition, and statistics.
- C2
- Greeting: hello, bye (2).
- C3
- Alerting: the robot alerted the class that it was time to go for a short outing to a workshop with tools.
- C4
- Remote operation: video conference with remote person (the second author in Oslo, Norway).
- C5
- Clarification: the robot described some extra resources.
- C6
- Motion: during the break, handshakes and face recognition.
4.2.4. Day 4
- C1
- Reading: sensors, actuators, computer vision, summary (6).
- C2
- Greeting: hello (storyline, roles), bye (2).
- C3
- Alerting: time to change topics or switch between listening to lectures and exploring.
- C4
- Remote operation: make the robot’s gaze follow people moving left to right or vice versa.
- C5
- Clarification: showing quiz questions and answers on the robot’s display.
- C6
- Motion: the robot handed out robot kits to a representative from each project group.
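The gaze-following demonstration (C4 on Day 4) can be reduced to mapping the horizontal position of a detected face in the camera image to a head pan angle, so the gaze tracks a person moving left or right. The sketch below shows this mapping under assumed values for the camera width and pan limit; the function name and constants are illustrative.

```python
# Gaze-following sketch: map a detected face's pixel column to a head pan
# angle. IMAGE_WIDTH and MAX_PAN_RAD are assumed values, not from the study.

IMAGE_WIDTH = 640    # assumed camera resolution (pixels)
MAX_PAN_RAD = 1.0    # assumed symmetric head pan limit (radians)

def pan_angle_for_face(face_center_x: float) -> float:
    """Linearly map pixel column [0, IMAGE_WIDTH] to pan [-MAX_PAN_RAD, MAX_PAN_RAD]."""
    # Normalize to [-1, 1]: -1 = left image edge, +1 = right image edge.
    normalized = (face_center_x - IMAGE_WIDTH / 2) / (IMAGE_WIDTH / 2)
    normalized = max(-1.0, min(1.0, normalized))
    return normalized * MAX_PAN_RAD
```

A face detector (e.g., an OpenCV cascade classifier, which the study’s software stack could support) would supply `face_center_x` each frame, and the computed angle would be sent to the robot’s head joint.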
4.3. Observations
Day 1
Day 2
Day 3
Day 4
Afterwards
5. Discussion
5.1. Limitations
5.2. Future Work
6. Conclusions
Supplementary Materials
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Abbreviations
ASD | Autism spectrum disorder |
Ci | Capability i; see Table 2 |
DEIS | Design of Embedded and Intelligent Systems (course acronym at Halmstad University) |
EDM | Educational data mining |
ISDD | Department of Intelligent Systems and Digital Design (at Halmstad University, Sweden) |
ITS | Intelligent tutoring system |
HRI | Human-robot interaction |
LA | Learning analytics |
NAO | a humanoid robot created by the company Softbank Robotics |
ROS | Robot operating system |
SOLO | Structure of observed learning outcomes |
OpenCV | Open source computer vision library |
SVM | Support vector machine |
References
- Harasim, L. Learning Theory and Online Technologies, 1st ed.; Routledge: New York, NY, USA, 2011. [Google Scholar]
- Säljö, R. Digital tools and challenges to institutional traditions of learning: Technologies, social memory and the performative nature of learning. J. Comput. Assist. Learn. 2010, 26, 53–64. [Google Scholar] [CrossRef]
- Entwistle, N. Teaching for Understanding at University: Deep Approaches and Distinctive Ways of Thinking; Universities into the 21st Century; Palgrave Macmillan: Basingstoke, UK, 2009. [Google Scholar]
- Druin, A.; Hendler, J.A. Robots for Kids: Exploring New Technologies for Learning; Morgan Kaufmann: Burlington, MA, USA, 2000. [Google Scholar]
- Fong, T.W.; Nourbakhsh, I.; Dautenhahn, K. A Survey of Socially Interactive Robots: Concepts, Design, and Applications; Technical Report CMU-RI-TR-02-29; Robotics Institute, Carnegie Mellon University: Pittsburgh, PA, USA, 2002. [Google Scholar]
- Steinfeld, A.; Fong, T.; Kaber, D.; Lewis, M.; Scholtz, J.; Schultz, A.; Goodrich, M. Common Metrics for Human-robot Interaction. In Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-robot Interaction, Salt Lake City, UT, USA, 2–3 March 2006; pp. 33–40. [Google Scholar] [CrossRef]
- Han, J.; Jo, M.; Jones, V.; Jo, J.H. Comparative Study on the Educational Use of Home Robots for Children. J. Media Res. 2008, 4. [Google Scholar] [CrossRef]
- Leyzberg, D.; Spaulding, S.; Toneva, M.; Scassellati, B. The Physical Presence of a Robot Tutor Increases Cognitive Learning Gains. In Proceedings of the 34th Annual Meeting of the Cognitive Science Society, CogSci 2012, Sapporo, Japan, 1–4 August 2012; pp. 1882–1887. [Google Scholar]
- Belpaeme, T.; Kennedy, J.; Baxter, P.; Vogt, P.; Krahmer, E.; Kopp, S.; Bergmann, K.; Leseman, P.; Küntay, A.C.; Göksun, T.; et al. L2TOR—Second Language Tutoring using Social Robots. In Proceedings of the ICSR 2015 Workshop on Educational Robotics (WONDER) 2015, Paris, France, 26–30 October 2015. [Google Scholar]
- Belpaeme, T.; Kennedy, J.; Ramachandran, A.; Scassellati, B.; Tanaka, F. Social robots for education: A review. Sci. Robot. 2018, 3. [Google Scholar] [CrossRef]
- Biggs, J.B.; Tang, C. Teaching for Quality Learning at University; Open University Press/Mc Graw-Hill Education: New York, NY, USA, 2007. [Google Scholar]
- Avanzato, R. Collaborative mobile robot design in an introductory programming course for engineers. In Proceedings of the 1998 ASEE Annual Conference, Seattle, WA, USA, 28 June–1 July 1998. [Google Scholar]
- Klassner, F.; Anderson, S.D. MindStorms: Not Just for K-12 Anymore. IEEE Robot. Autom. Mag. 2003, 10, 12–18. [Google Scholar] [CrossRef]
- Fernandes, E.; Fermé, E.; Oliveira, R. Using Robots to Learn Functions in Math Class. In Proceedings of the Seventeenth International Commission on Mathematical Instruction (ICMI) Study Conference “Technology Revisited”, Hanoi, Vietnam, 3–8 December 2006; pp. 152–159. [Google Scholar]
- Church, W.; Ford, T.; Perova, N.; Rogers, C. Physics with robotics: Using Lego Mindstorms in high school education. In Proceedings of the Advancement of Artificial Intelligence Spring Symposium, Palo Alto, CA, USA, 22–24 March 2010; pp. 47–49. [Google Scholar]
- Castledine, A.; Chalmers, C. LEGO Robotics: An Authentic Problem Solving Tool? Des. Technol. Educ. 2011, 16, 19–27. [Google Scholar]
- Fels, D.; Waalen, J.; Zhai, S.; Weiss, P. Telepresence Under Exceptional Circumstances: Enriching the Connection to School for Sick Children. In Proceedings of the IFIP INTERACT01: Human-Computer Interaction, Tokyo, Japan, 9–13 July 2001; pp. 617–624. [Google Scholar]
- Yamazaki, R.; Nishio, S.; Ogawa, K.; Ishiguro, H.; Matsumura, K.; Koda, K.; Fujinami, T. How Does Telenoid Affect the Communication Between Children in Classroom Setting? In CHI ’12 Extended Abstracts on Human Factors in Computing Systems; ACM: New York, NY, USA, 2012; pp. 351–366. [Google Scholar]
- Børsting, J.; Culén, A.L. A Robot Avatar: Easier Access to Education and Reduction in Isolation? In Proceedings of the International Conference on E-Health 2016, IADIS, Funchal, Portugal, 1–3 July 2016; pp. 34–44. [Google Scholar]
- Kanda, T.; Hirano, T.; Eaton, D.; Ishiguro, H. Interactive Robots as Social Partners and Peer Tutors for Children: A Field Trial. Hum.-Comput. Interact. 2004, 19, 61–84. [Google Scholar] [Green Version]
- Kanda, T.; Ishiguro, H. Communication Robots for Elementary Schools. In Proceedings of the AISB’05 Symposium Robot Companions: Hard Problems and Open Challenges in Robot-Human Interaction, Hatfield, UK, 12–15 April 2005; pp. 54–63. [Google Scholar]
- Shiomi, M.; Kanda, T.; Howley, I.; Hayashi, K.; Hagita, N. Can a Social Robot Stimulate Science Curiosity in Classrooms? Int. J. Soc. Robot. 2015, 7, 641–652. [Google Scholar] [CrossRef]
- Köse, H.; Uluer, P.; Akalın, N.; Yorgancı, R.; Özkul, A.; Ince, G. The effect of embodiment in sign language tutoring with assistive humanoid robots. Int. J. Soc. Robot. 2015, 7, 537–548. [Google Scholar] [CrossRef]
- Zaga, C.; Lohse, M.; Truong, K.; Evers, V. The Effect of a Robot’s Social Character on Children Task Engagement: Peer Versus Tutor. In Proceedings of the 7th International Conference on Social Robotics, ICSR 2015, Paris, France, 26–30 October 2015; pp. 704–713. [Google Scholar] [CrossRef]
- Goetz, J.; Kiesler, S.; Powers, A. Matching robot appearance and behavior to tasks to improve human-robot cooperation. In Proceedings of the 12th IEEE International Workshop on Robot and Human Interactive Communication, ROMAN 2003, Millbrae, CA, USA, 31 October–2 November 2003; pp. 55–60. [Google Scholar]
- Alves-Oliveira, P.; Sequeira, P.; Paiva, A. The role that an educational robot plays. In Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2016), New York, NY, USA, 26–31 August 2016; pp. 817–822. [Google Scholar]
- Kennedy, J.; Baxter, P.; Belpaeme, T. The robot who tried too hard: Social behaviour of a robot tutor can negatively affect child learning. In Proceedings of the 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2015), Portland, OR, USA, 2–5 March 2015; pp. 67–74. [Google Scholar]
- Baxter, P.; Ashurst, E.; Read, R.; Kennedy, J.; Belpaeme, T. Robot education peers in a situated primary school study: Personalisation promotes child learning. PLoS ONE 2017, 12, e0178126. [Google Scholar] [CrossRef]
- Chang, C.W.; Lee, J.H.; Chao, P.Y.; Wang, C.Y.; Chen, G.D. Exploring the Possibility of Using Humanoid Robots as Instructional Tools for Teaching a Second Language in Primary School. Educ. Technol. Soc. 2010, 13, 13–24. [Google Scholar]
- Alemi, M.; Meghdari, A.; Ghazisaedy, M. Employing Humanoid Robots for Teaching English Language in Iranian Junior High-Schools. Int. J. Humanoid Robot. 2014, 11, 1450022. [Google Scholar] [CrossRef]
- Fuglerud, K.S.; Solheim, I. The use of social robots for supporting language training of children. In Proceedings of the Universal Design and Higher Education in Transformation Congress (UDHEIT2018), Dublin Castle, Ireland, 30 October–2 November 2018; pp. 1–8. [Google Scholar] [CrossRef]
- Kanero, J.; Geçkin, V.; Oranç, C.; Mamus, E.; Köntay, A.C.; Göksun, T. Social Robots for Early Language Learning: Current Evidence and Future Directions. Child Dev. Perspect. 2018. [Google Scholar] [CrossRef]
- Kanda, T.; Nabe, S.; Hiraki, K.; Ishiguro, H.; Hagita, N. Human friendship estimation model for communication robots. Auton. Robots 2008, 24, 135–145. [Google Scholar] [CrossRef]
- Saerbeck, M.; Schut, T.; Bartneck, C.; Janse, M.D. Expressive Robots in Education: Varying the Degree of Social Supportive Behavior of a Robotic Tutor. In Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI2010), Atlanta, GA, USA, 10–15 April 2010; pp. 1613–1622. [Google Scholar]
- Alemi, M.; Meghdari, A.; Ghazisaedy, M. The effect of employing humanoid robots for teaching English on students’ anxiety and attitude. In Proceedings of the 2014 Second RSI/ISM International Conference on Robotics and Mechatronics (ICRoM), Nottingham, UK, 6–7 July 2014; pp. 754–759. [Google Scholar] [CrossRef]
- Cooper, M.; Keating, D.; Harwin, W.; Dautenhahn, K. Robots in the classroom-tools for accessible education. In Assistive Technology on the Threshold of the New Millennium, Assistive Technology Research Series; IOS Press: Amsterdam, The Netherlands, 1999; Volume 4, pp. 448–452. [Google Scholar]
- Goel, A.K.; Joyner, D.A. Design of an Online Course on Knowledge-Based AI. In Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence, Phoenix, AZ, USA, 12–17 February 2016; pp. 4089–4094. [Google Scholar]
- Maderer, J. Artificial Intelligence Course Creates AI Teaching Assistant: Students Didn’t Know Their TA Was a Computer. Georgia Tech News Center. 9 May 2016. Available online: https://www.news.gatech.edu/2016/05/09/artificial-intelligence-course-creates-ai-teaching-assistant (accessed on 12 March 2019).
- Shin, N.; Kim, S. Learning about, from, and with Robots: Students’ Perspectives. In Proceedings of the 16th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2007), Jeju Island, Korea, 26–29 August 2007; pp. 1040–1045. [Google Scholar] [CrossRef]
- Cramer, H.; Mentis, H.M.; Fernaeus, Y. Serious work on playful experiences: A preliminary set of challenges. In Proceedings of the CSCW 2010 Fun, Seriously Workshop, Savannah, GA, USA, 6–10 February 2010; Available online: https://pdfs.semanticscholar.org/ae58/923f3925f8f1a558b73a6fe307e5a5562522.pdf (accessed on 12 March 2019).
- Knowles, M.S. The Modern Practice of Adult Education: Andragogy versus Pedagogy; Cambridge Adult Education; Prentice Hall Regents: Englewood Cliffs, NJ, USA, 1970. [Google Scholar]
- Rayner, G.M.; Burke da Silva, K. Building pedagogical bridges between secondary and tertiary biology: A multi-institutional, national endeavour. In Proceedings of the STEM 2014—International Science, Technology, Engineering and Mathematics in Education Conference, Vancouver, BC, Canada, 12–15 July 2014; pp. 1–6. [Google Scholar]
- Brščić, D.; Kidokoro, H.; Suehiro, Y.; Kanda, T. Escaping from Children’s Abuse of Social Robots. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI 2015), Portland, OR, USA, 2–5 March 2015; pp. 59–66. [Google Scholar] [CrossRef]
- Salvini, P.; Ciaravella, G.; Yu, W.; Ferri, G.; Manzi, A.; Mazzolai, B.; Laschi, C.; Oh, S.R.; Dario, P. How safe are service robots in urban environments? Bullying a robot. In Proceedings of the 19th International Symposium in Robot and Human Interactive Communication, Viareggio, Italy, 12–15 September 2010; pp. 1–7. [Google Scholar] [CrossRef]
- Spink, A.; Cole, C.; Waller, M. Multitasking behavior. Annu. Rev. Inf. Sci. Technol. 2008, 42, 93–118. [Google Scholar] [CrossRef]
- Engelberg, D.; Seffah, A. A framework for rapid mid-fidelity prototyping of web sites. In Proceedings of the IFIP World Computer Congress, Poznan, Poland, 25–30 August 2002; pp. 203–215. [Google Scholar]
- Biggs, J.B.; Collis, K.F. Evaluating the Quality of Learning: The SOLO Taxonomy (Structure of the Observed Learning Outcome); Academic Press: Cambridge, MA, USA, 1982. [Google Scholar]
- SoftBank Robotics. Find out More about NAO. Available online: https://www.ald.softbankrobotics.com/en/robots/nao/find-out-more-about-nao (accessed on 12 March 2019).
- Open Source Robotics Foundation. What Is a TurtleBot? 2018. Available online: https://www.turtlebot.com (accessed on 12 March 2019).
- Rae, I.; Takayama, L.; Mutlu, B. The influence of height in robot-mediated communication. In Proceedings of the 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Tokyo, Japan, 4–6 March 2013; pp. 1–8. [Google Scholar] [CrossRef]
- Kim, T.; Hong, H.; Magerko, B. Designing for Persuasion: Toward Ambient Eco-Visualization for Awareness. In Persuasive Technology: 5th International Conference, PERSUASIVE 2010, Copenhagen, Denmark, 7–10 June 2010; Ploug, T., Hasle, P., Oinas-Kukkonen, H., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; pp. 106–116. [Google Scholar]
- OpenCV Team. OpenCV Library. 2018. Available online: https://opencv.org (accessed on 12 March 2019).
- CMU Sphinx Developers. CMU Sphinx/Pocketsphinx. 2018. Available online: https://github.com/cmusphinx/pocketsphinx (accessed on 12 March 2019).
- Open Source Robotics Foundation. ROS—Robot Operating System. 2018. Available online: https://www.ros.org (accessed on 12 March 2019).
- The Glossary of Education Reform. Student Engagement. Interactive Web Pages. 2016. Available online: http://edglossary.org/student-engagement/ (accessed on 12 March 2019).
- Leister, W.; Tjøstheim, I.; Joryd, G.; de Brisis, M.; Lauritzsen, S.; Reisæter, S. An Evaluation-Driven Design Process for Exhibitions. Multimodal Technol. Interact. 2017, 1, 25. [Google Scholar] [CrossRef]
- Leister, W.; Tjøstheim, I.; Joryd, G.; Schulz, T.; Larssen, A.; de Brisis, M. Assessing Visitor Engagement in Science Centres and Museums. J. Adv. Life Sci. 2016, 8, 49–63. [Google Scholar]
- Ocampo-Agudelo, J.; Maya, J.; Roldán, A. A Tool for the Design of Experience-Centred Exhibits in Science Centres. Poster at Science Centre World Summit—SCWS2017, Tokyo, Japan, 15–17 November 2017. [Google Scholar] [CrossRef]
- Google Inc. About Google Forms. 2018. Available online: https://www.google.com/forms/about/ (accessed on 12 March 2019).
- Gray, H.M.; Gray, K.; Wegner, D.M. Dimensions of mind perception. Science 2007, 315, 619. [Google Scholar] [CrossRef]
- Arnett, J.J. Are college students adults? Their conceptions of the transition to adulthood. J. Adult Dev. 1994, 1, 213–224. [Google Scholar] [CrossRef]
- Mundorf, N.; Brownell, W. Media preferences of older and younger adults. Gerontologist 1990, 30, 685–691. [Google Scholar] [CrossRef]
- Bolton, K.; Kuteeva, M. English as an academic language at a Swedish university: Parallel language use and the ‘threat’ of English. J. Multiling. Multicult. Dev. 2012, 33, 429–447. [Google Scholar] [CrossRef]
- Bartneck, C.; Nomura, T.; Kanda, T.; Suzuki, T.; Kato, K. A cross-cultural study on attitudes towards robots. In Proceedings of the HCI International, Las Vegas, NV, USA, 22–27 July 2005. [Google Scholar]
- Kuo, I.H.; Rabindran, J.M.; Broadbent, E.; Lee, Y.I.; Kerse, N.; Stafford, R.; MacDonald, B.A. Age and gender factors in user acceptance of healthcare robots. In Proceedings of the RO-MAN 2009—The 18th IEEE International Symposium on Robot and Human Interactive Communication, Toyama, Japan, 27 September–2 October 2009; pp. 214–219. [Google Scholar]
- Furnham, A. Response bias, social desirability and dissimulation. Personal. Individ. Differ. 1986, 7, 385–400. [Google Scholar] [CrossRef]
- Sabanovic, S.; Michalowski, M.P.; Simmons, R. Robots in the wild: Observing human-robot social interaction outside the lab. In Proceedings of the 9th IEEE International Workshop on Advanced Motion Control, Istanbul, Turkey, 27–29 March 2006; pp. 596–601. [Google Scholar]
- Salter, T.; Werry, I.; Michaud, F. Going into the wild in child–robot interaction studies: Issues in social robotic development. Intell. Serv. Robot. 2008, 1, 93–108. [Google Scholar] [CrossRef]
- Strauss, A.; Corbin, J. Grounded theory methodology. Handb. Qual. Res. 1994, 17, 273–285. [Google Scholar]
- Rosenthal, R. The file drawer problem and tolerance for null results. Psychol. Bull. 1979, 86, 638. [Google Scholar] [CrossRef]
- Slater, S.; Joksimović, S.; Kovanovic, V.; Baker, R.S.; Gasevic, D. Tools for educational data mining: A review. J. Educ. Behav. Stat. 2017, 42, 85–106. [Google Scholar] [CrossRef]
- Piech, C.; Huang, J.; Nguyen, A.; Phulsuksombati, M.; Sahami, M.; Guibas, L. Learning program embeddings to propagate feedback on student code. arXiv, 2015; arXiv:1505.05969. [Google Scholar]
- Dutt, A.; Ismail, M.A.; Herawan, T. A systematic review on educational data mining. IEEE Access 2017, 5, 15991–16005. [Google Scholar] [CrossRef]
- Piech, C.; Bassen, J.; Huang, J.; Ganguli, S.; Sahami, M.; Guibas, L.J.; Sohl-Dickstein, J. Deep knowledge tracing. In Proceedings of the Advances in Neural Information Processing Systems, Montreal, QC, Canada, 7–12 December 2015; pp. 505–513. [Google Scholar]
- Rothfuss, J.; Ferreira, F.; Aksoy, E.E.; Zhou, Y.; Asfour, T. Deep episodic memory: Encoding, recalling, and predicting episodic experiences for robot action execution. IEEE Robot. Autom. Lett. 2018, 3, 4007–4014. [Google Scholar] [CrossRef]
- Heenan, B.; Greenberg, S.; Aghel-Manesh, S.; Sharlin, E. Designing Social Greetings in Human Robot Interaction. In Proceedings of the 2014 Conference on Designing Interactive Systems, Vancouver, BC, Canada, 21–25 June 2014; pp. 855–864. [Google Scholar] [CrossRef]
- Glas, D.F.; Wada, K.; Shiomi, M.; Kanda, T.; Ishiguro, H.; Hagita, N. Personal Greetings: Personalizing Robot Utterances Based on Novelty of Observed Behavior. Int. J. Soc. Robot. 2017, 9, 181–198. [Google Scholar] [CrossRef]
- Pollack, M.E.; Brown, L.; Colbry, D.; Orosz, C.; Peintner, B.; Ramakrishnan, S.; Engberg, S.; Matthews, J.T.; Dunbar-Jacob, J.; McCarthy, C.E. Pearl: A Mobile Robotic Assistant for the Elderly. In Proceedings of the AAAI Workshop on Automation as Caregiver, Edmonton, AB, Canada, 28–29 July 2002. [Google Scholar]
- Schindler, M.; Lilienthal, A. Eye-Tracking For Studying Mathematical Difficulties: Also in Inclusive Settings. In Proceedings of the Annual Meeting of the International Group for the Psychology of Mathematics Education (PME-42), Umeå, Sweden, 3–8 July 2018; Volume 4, pp. 115–122. [Google Scholar]
- Gupta, M.; Bendersky, M. Information retrieval with verbose queries. Found. Trends Inf. Retr. 2015, 9, 209–354. [Google Scholar] [CrossRef]
- Cakmak, M.; Srinivasa, S.S.; Lee, M.K.; Kiesler, S.; Forlizzi, J. Using Spatial and Temporal Contrast for Fluent Robot-Human Hand-Overs. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2011), Lausanne, Switzerland, 8–11 March 2011; pp. 489–496. [Google Scholar]
- Koay, K.; Sisbot, E.; Syrdal, D.; Walters, M.; Dautenhahn, K.; Alami, R. Exploratory Study of a Robot Approaching a Person in the Context of Handing Over an Object. In Proceedings of the AAAI-Spring Symposium 2007: SS07, Multidisciplinary Collaboration for Socially Assistive Robotics, Palo Alto, CA, USA, 26–28 March 2007; pp. 18–24. [Google Scholar]
- Shi, C.; Shiomi, M.; Smith, C.; Kanda, T.; Ishiguro, H. A model of distributional handing interaction for a mobile robot. In Proceedings of the Robotics: Science and Systems, Berlin, Germany, 24–28 June 2013. [Google Scholar]
- Schulz, T.; Herstad, J.; Tørresen, J. Classifying Human and Robot Movement at Home and Implementing Robot Movement Using the Slow in, Slow out Animation Principle. Int. J. Adv. Intell. Syst. 2018, 11, 234–244. [Google Scholar]
- Ogawa, K.; Taura, K.; Ishiguro, H. Possibilities of Androids as poetry-reciting agent. In Proceedings of the 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, Paris, France, 9–13 September 2012; pp. 565–570. [Google Scholar] [CrossRef]
Reference | Benefits | Robot | Capabilities | Class | Outcomes |
---|---|---|---|---|---|
Role: Tutor outside class: | |||||
[20,22,33] | Prevent bullying, provide friendship | Robovie | Recognize children, quizzes, gaze, entertaining | Elementary school language and science classes in Japan, 1–2 months | Better retention, some increased curiosity |
[34] | - | iCat | Read, gaze, feedback w/facial expressions, nod, shake head, idling | 16 10–11 year old elementary school children in the Netherlands, 1 h | Social behaviors facilitated learning |
Role: Avatar: | |||||
[19] | Tele-teaching, avoid loneliness | AV1 | Remote operation, avatar | 9 12–16 year old adolescents at school in Norway | Users provided positive feedback |
[18] | Change social interaction, expand human capabilities | Telenoid | Convey operator’s voice, arms move | 28 9–10 year old elementary school children in Japan, 2 days | Limitations of robot had positive effects on collaboration |
Role: Teaching Assistant: | |||||
[29] | Repeatability, digitization, fantastic appearance, different voices | RoboSapiens | Read (feedback), remote control | Elementary school for five weeks, in Taiwan | Students were motivated, and suggestions for improvement were made |
[35] | Repeatability, AI, sensors | NAO | Read words, pantomime, entertainment (sing, dance). | 12 year old students in Iran | Students learned faster and more words, compared to control |
[31] | Second language learning tool | NAO | Listen, repeat, feedback | pre-school children in Norway; children with autism (ASD) | Increased participation and involvement |
Capability | Description
---|---
Reading | to orally present material such as quizzes.
Greeting | to greet the students.
Alerting | to alert the teacher.
Remote operation | to facilitate communication with persons at remote places.
Clarification | to present extra material on request.
Motion | to perform physical tasks by means of locomotion and object manipulation.
Capability | Preferred Range | Suitable Range
---|---|---
Reading | |
Quiz | |
Greeting | |
Remote operation | |
Clarification | |
Alerting | |
Motion | - | -
classroom setting | |
one robot in front | |
use of robot in general | |
Dimension | Value | Description
---|---|---
C | 2 | Competition with robot. The students will discuss and respond to quiz questions in pairs in front of the class. Some students might implicitly perceive themselves to be competing with others to some extent, but in general the students will work together as a class to answer the robot’s questions.
N | 2 | Limited narrative structure. The robot follows a simple storyline: introducing itself, explaining why it is participating and what it will do, conducting its task, and saying goodbye.
I | 2 | Limited interactivity. The students will respond to the robot’s quizzes, but the responses will not change how the interaction proceeds.
P | 0 | Look only. The students will also get the chance to pilot the robot via a controller during the break if they wish, and may receive handouts from the robot, but in general the students will mostly look only.
U | 1 | Linear chronology. The robot will give quizzes in a predefined sequence during the lecture. The users can affect how many quizzes are given by the time they take to answer (lectures last only two hours, so if time runs out, quizzes can be given at a later date).
S | 3 | One student, others cheer and engage. The robot will conduct social behaviors aimed at the group, greeting and quizzing.
A | 1 | Immediate feedback. Answers to quizzes will generally be given very soon after students respond. We do not currently plan to give scores, to avoid having some students worry about losing face, although scores could be a fun way to motivate some students.
E | 0 | Defined view. The students will investigate topics through a standard lecture view, and also from an applied view when participating in quizzes, but both perspectives are predefined and the robot will only be involved with the applied/quiz perspective.
Question | Formulation |
---|---|
Should there be more or less competition between groups and participants in the learning experience? | |
Should the storyline and roles in the learning experience be more evident or less evident in the learning experience? | |
Should there be more or less feedback on the choices you made in the learning experience? |
Should there be more or less physical activity in the learning experience? | |
Should the learning experience be more or less influenced by what you did during the experience? | |
Should more or less be done in a group (as opposed to individually) during the learning experience? |
Should there be more or less feedback on how well you are doing during the learning experience? | |
Should there be more or less possibilities to go in depth with extra content to explore on your own? |
Question | Formulation |
---|---|
I liked the learning experience. | |
The learning experience was engaging. | |
I learned much during the learning experience. | |
I recommend the learning experience to other students. | |
I would like to have this type of learning experience for future course content. |
Question | Formulation
---|---
1 | The ability to read material (e.g., giving quizzes) will be helpful for a robot teaching assistant.
2 | The ability to greet people (e.g., saying hello and goodbye at the start and end of a class) will be helpful for a robot teaching assistant.
3 | The ability to alert the teacher (e.g., if the teacher has forgotten to mention something or an explanation is unclear) will be helpful for a robot teaching assistant.
4 | The ability to be remotely controlled (e.g., for people who cannot attend class due to illness or travel) will be helpful for a robot teaching assistant.
5 | The ability to provide additional information (e.g., visualizing data, or adding information about topics which the teacher or students are discussing) will be helpful for a robot teaching assistant.
6 | The ability to interact physically with people (e.g., fetching and handing out materials, handshakes) will be helpful for a robot teaching assistant.
| | Day 1 | Day 2 | Day 3 | Day 4 |
|---|---|---|---|---|
| | Week 1 | Week 2 | Week 3 | |
Design | lecture with six behaviours | sound increased | voting via waving, links | explore sessions |
Implementation | | | | |
Reading | basic | basic | on request | for small groups | |
Greeting | outline/roles | basic | basic | outline/roles | |
Remote operation | locomotion | recording | video conference | gaze | |
Clarification | teacher | students | extra material | teacher and students | |
Alerting | omission | clarification | event | switch topic | |
Motion | sheet | handshake | handshake | robot kits | |
Feedback | good, more volume | more exploration, user control, physical, social interaction | more narrative, exploration, control, awareness | more narrative, social interaction, awareness, exploration |
| | Like | Engage | Learn | Recmd. | Again | Read | Greet | Alert | Remote | Content | Interact |
|---|---|---|---|---|---|---|---|---|---|---|---|
Week 1 | | | | | | | | | | | |
mean | 3.8 | 3.4 | 3.4 | 3.7 | 4.1 | 3.7 | 3.7 | 4.1 | 4.0 | 4.0 | 4.0 |
median | 4.0 | 4.0 | 3.0 | 4.0 | 4.0 | 3.0 | 3.0 | 5.0 | 4.0 | 4.0 | 4.0 |
90% | 5.0 | 4.2 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0
10 % | 2.0 | 2.0 | 2.0 | 2.0 | 3.0 | 2.8 | 2.8 | 2.8 | 2.8 | 2.8 | 2.8 |
variance | 1.4 | 1.0 | 1.3 | 1.5 | 0.6 | 1.3 | 1.3 | 1.4 | 1.0 | 1.0 | 1.3 |
positive | 67% | 56% | 44% | 56% | 78% | 44% | 44% | 67% | 78% | 78% | 67% |
neutral | 11% | 22% | 33% | 22% | 22% | 44% | 44% | 22% | 11% | 11% | 22% |
negative | 22% | 22% | 22% | 22% | 0% | 11% | 11% | 11% | 11% | 11% | 11% |
Week 2 | | | | | | | | | | | |
mean | 4.6 | 4.4 | 4.2 | 4.6 | 4.6 | 4.2 | 4.4 | 4.4 | 4.2 | 4.6 | 4.2 |
median | 5.0 | 4.0 | 4.0 | 5.0 | 5.0 | 4.0 | 4.0 | 5.0 | 5.0 | 5.0 | 4.0 |
90% | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 4.6 |
10% | 4.0 | 4.0 | 3.4 | 4.0 | 4.0 | 3.4 | 4.0 | 3.2 | 2.8 | 4.0 | 4.0 |
variance | 0.3 | 0.3 | 0.7 | 0.3 | 0.3 | 0.7 | 0.3 | 1.8 | 1.7 | 0.3 | 0.2 |
positive | 100% | 100% | 80% | 100% | 100% | 80% | 100% | 80% | 80% | 100% | 100% |
neutral | 0% | 0% | 20% | 0% | 0% | 20% | 0% | 0% | 0% | 0% | 0% |
negative | 0% | 0% | 0% | 0% | 0% | 0% | 0% | 20% | 20% | 0% | 0% |
Week 3 | | | | | | | | | | | |
mean | 4.4 | 4.2 | 4.0 | 4.3 | 4.3 | 4.0 | 3.9 | 4.4 | 4.3 | 4.3 | 4.1 |
median | 4.5 | 4.0 | 4.0 | 4.0 | 4.0 | 4.0 | 4.0 | 5.0 | 4.0 | 4.5 | 4.0 |
90% | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 |
10% | 4.0 | 4.0 | 3.0 | 3.1 | 3.1 | 3.0 | 3.0 | 3.1 | 3.1 | 3.1 | 3.0 |
variance | 0.5 | 0.4 | 0.6 | 0.6 | 0.6 | 0.6 | 1.4 | 0.6 | 0.6 | 0.6 | 0.6 |
positive | 92% | 92% | 75% | 83% | 83% | 75% | 75% | 83% | 83% | 83% | 75% |
neutral | 8% | 8% | 25% | 17% | 17% | 25% | 17% | 17% | 17% | 17% | 25% |
negative | 0% | 0% | 0% | 0% | 0% | 0% | 8% | 0% | 0% | 0% | 0% |
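The per-week summary rows above (mean, median, 10th/90th percentiles, variance, and the shares of positive, neutral, and negative ratings) can be reproduced from raw 5-point Likert responses. A minimal sketch, assuming linear-interpolation percentiles, sample variance, and the common coding of 4–5 as positive, 3 as neutral, and 1–2 as negative; the function name and example data are illustrative, not from the paper:

```python
import statistics

def summarize_likert(ratings):
    """Summarize a list of integer 1-5 Likert ratings."""
    n = len(ratings)
    s = sorted(ratings)

    def percentile(p):
        # Linear interpolation between the two closest ranks
        k = (n - 1) * p
        f = int(k)
        c = min(f + 1, n - 1)
        return s[f] + (s[c] - s[f]) * (k - f)

    return {
        "mean": statistics.mean(ratings),
        "median": statistics.median(ratings),
        "p90": percentile(0.90),
        "p10": percentile(0.10),
        "variance": statistics.variance(ratings),  # sample variance
        "positive": sum(r >= 4 for r in ratings) / n,
        "neutral": sum(r == 3 for r in ratings) / n,
        "negative": sum(r <= 2 for r in ratings) / n,
    }

# Made-up responses from five students
print(summarize_likert([5, 5, 4, 5, 4]))
# mean 4.6, median 5, variance 0.3, positive 1.0
```

Under these assumptions, five responses of `[5, 5, 4, 5, 4]` yield the same mean (4.6), median (5.0), variance (0.3), and 100% positive share as the week 2 "Like" column, though the paper does not state its exact percentile method.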
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Cooney, M.; Leister, W. Using the Engagement Profile to Design an Engaging Robotic Teaching Assistant for Students. Robotics 2019, 8, 21. https://doi.org/10.3390/robotics8010021