Researchers at Bielefeld University’s Cluster of Excellence Cognitive Interaction Technology (CITEC) are investigating what a robot needs to be able to do in order to teach a second language to preschool children. Since 2016, they have been researching whether and how social robots are suitable for teaching language. This research is part of the international project L2TOR, which is funded under the European Commission’s Horizon 2020 research program. A preliminary finding, according to the researchers: robots can motivate children to learn and help them retain more vocabulary. On 13 July 2017, the Bielefeld project team demonstrated how the robot supports preschool kids in their learning.
“In our project, we are working on robot-child pedagogy for learning a second language with researchers from the fields of computer science, educational sciences, and linguistics,” says Prof. Dr. Stefan Kopp, who heads the “Cognitive Systems and Social Interaction” research group at the Cluster of Excellence. He is also responsible for the German sub-project of L2TOR (pronounced “El Tutor”). The abbreviation stands for “Second Language Tutoring Using Social Robots.” Key questions in the project include: “How can – and should – robots react in teaching-learning situations? For which learning tasks is a robot best suited to provide support? And what does the robot have to master to do this?” says Professor Kopp. One project partner is SoftBank Robotics (formerly Aldebaran Robotics) of Paris. The company manufactures the humanoid robot Nao, which is combined with a tablet PC for language instruction.
The consortium consists of five universities and two companies, and is researching how children aged four to six react to the robot as a language trainer. Kopp sees several reasons why using robots as supportive language companions at this age makes sense: “Supporting children in their early language development at preschools has become increasingly important. Part of this is the fact that many preschools want to teach English. What’s more, many children are growing up in bilingual households, particularly children from immigrant and refugee families.”
The German team is working together with several preschools in Bielefeld and the surrounding area. Ongoing tests, for example, involve practicing nouns and prepositions in connection with English terms. The Bielefeld robot, called Robin, gives instructions for exercises shown on a tablet PC: “Setze den ‘dog’ hinter den Baum” (place the ‘dog’ behind the tree). The child chooses from among several animals. If the child moves the dog correctly behind the tree, the robot praises the child, and the next exercise follows. “At first, we assumed that the interaction between children and robots would be difficult,” says Dr. Kirsten Bergmann, who is working on the study. “The opposite is true. The kids are focused and handle the tasks well.”
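The exercise flow described above — robot instruction, tablet action, feedback — can be sketched as a simple check-and-respond loop. All names here (`Exercise`, `check_answer`) are illustrative, not taken from the L2TOR software:

```python
# Hypothetical sketch of one vocabulary-exercise turn, assuming a
# tablet reports which object the child moved and where.
from dataclasses import dataclass

@dataclass
class Exercise:
    instruction: str      # spoken by the robot, mixing native language and L2
    target_object: str    # the English word being practiced, e.g. "dog"
    target_position: str  # the preposition being practiced, e.g. "behind the tree"

def check_answer(exercise: Exercise, chosen_object: str, chosen_position: str) -> str:
    """Return the robot's feedback for the child's tablet action."""
    if (chosen_object == exercise.target_object
            and chosen_position == exercise.target_position):
        return "Great job!"  # praise; the next exercise would follow
    if chosen_object != exercise.target_object:
        return f"That was not the {exercise.target_object}. Try again!"
    return f"Almost! Try placing it {exercise.target_position}."

ex = Exercise(
    instruction="Setze den 'dog' hinter den Baum",  # "Place the 'dog' behind the tree"
    target_object="dog",
    target_position="behind the tree",
)
print(check_answer(ex, "dog", "behind the tree"))  # → Great job!
print(check_answer(ex, "cat", "behind the tree"))
```

In the real system the instruction is spoken aloud and the feedback is delivered as praise and gesture, but the underlying turn structure is this simple compare-and-respond step.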
On this project, the Bielefeld researchers are working on interaction management in particular. “The robot should understand what is going on with the child standing in front of it, and adjust its behavior accordingly,” says Stefan Kopp. To measure attention and motivation, the system records, for instance, where a child is looking, how quickly they react, and how many mistakes they make. “Our previous studies show that in most cases, children are very focused on the questions and pick up on the tasks given by the robots.”
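Combining the signals mentioned above — gaze, reaction time, and mistakes — into an estimate the robot can act on might look as follows. The weights, thresholds, and function names are invented for illustration; the project's actual model is not described in this article:

```python
# Hypothetical sketch: fuse three observed signals into an engagement
# estimate, then pick a coarse adaptation strategy from it.

def engagement_score(gaze_on_task: float, reaction_time_s: float, error_rate: float) -> float:
    """Combine three observations into a 0..1 engagement estimate.

    gaze_on_task:    fraction of time the child looks at robot/tablet (0..1)
    reaction_time_s: average seconds from instruction to first action
    error_rate:      fraction of incorrect answers (0..1)
    """
    # Slower reactions count against engagement; cap at 10 s.
    speed = max(0.0, 1.0 - min(reaction_time_s, 10.0) / 10.0)
    accuracy = 1.0 - error_rate
    # Weighted sum; the weights here are arbitrary placeholders.
    return round(0.5 * gaze_on_task + 0.3 * speed + 0.2 * accuracy, 2)

def adapt_behavior(score: float) -> str:
    """Pick a coarse robot strategy from the engagement estimate."""
    if score < 0.4:
        return "re-engage: simpler task, encouraging prompt"
    if score < 0.7:
        return "continue at current difficulty"
    return "increase difficulty"

print(adapt_behavior(engagement_score(0.9, 2.0, 0.1)))  # → increase difficulty
```

The point of the sketch is the structure, not the numbers: observed behavior is reduced to a scalar estimate, which then gates how the robot adapts its teaching to the individual child.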
The project has also been able to provide evidence that a robot’s hand and arm gestures can have a positive impact on learning. In cooperation with project partners from Tilburg University in the Netherlands, the robot learning system was recently tested with 80 children. In the experiment, the robot described animals to the children in their native language, and they had to select on a tablet which animal the robot was talking about. In one group, the robot described the animals using only words. In the other group, the robot used gestures: for “chicken,” it flapped its arms, and for “monkey,” it scratched its head. The gestures appear to have a beneficial impact: “Children from the group in which gestures were used remembered more English words a week later than the children from the control group,” says Bergmann.

Normally, the robot speaks with a high-pitched voice, which brings to mind animated characters like the Smurfs. In an experiment, the researchers tested whether vocal pitch influences learning. The robot would speak either with a higher voice or, by contrast, with a deeper voice. “When it comes to learning success, however, a robot’s vocal pitch does not play a role,” reports Bergmann. In addition, monotone intonation, similar to the pronunciation heard from a GPS system, does not appear to have a negative effect. “Kids don’t pronounce vocabulary robotically, but rather with an accent from their native language.”
Beginning in January 2018, the Bielefeld team, along with project partners in the Netherlands and Turkey, will test how well the L2TOR system works in a large, simultaneous study with approximately 400 children over several weeks. “We assume that it can be helpful for robots in preschools to give structured language instruction in which the robot is always adapting to the progress of individual children,” says Stefan Kopp. “With this capability, the robot should enrich the daily routine at the preschool. However, in no way is the robot meant to become a new caregiver. In the beginning, a preschool teacher will always be present during language training.”
Source: Bielefeld University