According to a study published in the journal Biological Psychology, infants as young as 6 to 8 months old already understand the significance of eye contact, even from robots designed with human-like features.
The Rising Use of Robots Prompts Questions
While much research has focused on how babies engage with humans, the rising use of robots in childcare and education prompts questions about how infants perceive these machines. The study emphasizes the need to understand infants’ interactions with technology.
The study's authors note that infants react to a robot's gaze much as they respond to human gaze. This suggests that infants recognize the social significance of gaze and can read meaningful behavior from these machines.
There is a need for ongoing research into the role robots may play in early socialization, as well as further studies on the long-term effects of infants interacting with this technology.
“Humanoid robots are becoming increasingly common in social environments, and people are suddenly expected to engage in social interactions with these artificial agents. We are interested in how the human brain understands the ‘sociality’ of artificial humanoid robots,” said study author Samuli Linnunsalo, a doctoral researcher at Tampere University and member of the Human Information Processing Laboratory.
“We believe that, to fully explore people’s instinctive interpretations of humanoid robots’ sociality, it is necessary to use physiological measures to investigate their responses to robots’ nonverbal social cues, such as eye contact. After finding initial evidence that adult humans’ psychophysiological responses to eye contact with a humanoid robot were similar to their responses to eye contact with a human, we sought to investigate whether young infants react similarly to a humanoid robot’s and a human’s eye gaze. This was particularly interesting to us because infants do not have knowledge of the humanoid robots’ purpose as social interaction partners, nor do they understand that people are expected to treat humanoid robots as social agents.”
Study Examines Infants’ Responses to Human and Robot Interaction
According to a release, the study included 114 infants aged 6 to 8 months. Researchers brought the infants to a laboratory where they were exposed to three types of stimuli: a human, a humanoid robot named Nao, and a non-human object, specifically a vase. To create a more realistic experience for the infants, the researchers used live stimuli instead of videos or images.
The human and the robot were each presented to the infants either looking directly at them (direct gaze) or looking away (averted gaze). To keep the infants engaged, the researchers created a controlled setup with an interactive introduction for both the human and the robot: the robot introduced itself and mimicked natural social gestures such as nodding and hand movements, while the vase served as a non-interactive control object.