Research suggests that robots can improve psychological well-being in the workplace, but their appearance matters. According to the researchers, how effective people perceive a machine to be depends largely on what the robot looks like.
Researchers at the University of Cambridge conducted a study at a technology consultancy firm using two different robotic wellness trainers.
Twenty-six employees took part in weekly well-being sessions led by robots for four weeks.
Although the robots had identical voices, facial expressions, and session scenarios, their physical appearance influenced how humans interacted with them.
Those who did the well-being exercises with the toy-like robot said they felt more connected to their “trainer” than people who worked with the humanoid robot.
The researchers say perceptions of robots are shaped by popular culture, where the only limit to what robots can do is the imagination. But robots encountered in the real world often fall short of those expectations.
According to the researchers, because the toy-like robot looks simpler, people may have had lower expectations and found it easier to talk to and connect with.
Those who worked with the humanoid robot found that their expectations did not match reality because the robot was unable to conduct interactive conversations.
Researchers worked with local technology firm Cambridge Consultants to design and implement a workplace wellness program using robots.
Over a four-week period, employees were guided through four different well-being exercises by one of two robots: the QTrobot (QT) or the Misty II robot (Misty).
QT is a child-like humanoid robot about 90 cm tall, while Misty is a 36 cm tall toy-like robot.
Both have on-screen faces that can be programmed with different facial expressions.
Dr Micol Spitale, the paper’s first author, said: “Maybe because the Misty robot is more toy-like, it lived up to their expectations.
“But because QT is more humanoid, they expected it to behave like a human, and perhaps that’s why participants who worked with QT were a bit disappointed.”
After talking to various well-being coaches, the researchers programmed the robots with a coach-like personality characterised by high openness and conscientiousness.
Professor Hatice Gunes from Cambridge’s Department of Computer Science and Technology, who led the research, said: “The most common response we received from participants was that their expectations of the robot did not match reality.
“We programmed the robots with a script, but participants hoped there would be more interactivity.
“It is extremely difficult to create a robot capable of natural conversation. New developments in large language models could be really beneficial in this regard.”
Co-author Minja Axelsson said: “Our perception of how robots should look or behave may be holding back the uptake of robotics in areas where they could be useful.”
The findings were presented at the ACM/IEEE International Conference on Human-Robot Interaction in Stockholm.