In a lab in Genoa, Italy, volunteers teamed up with iCub, a child-sized humanoid robot, to slice a bar of soap using a steel wire. The task itself was simple. What surprised researchers was what happened afterward: the participants’ brains began treating iCub’s hand as if it were their own.
The study was led by experts at the Istituto Italiano di Tecnologia (IIT) and Brown University. The results suggest that collaboration with a robot does more than complete a task. It changes how our brains map the space around us.
Our brains keep a constantly updating map of the body that helps us move without thinking about each step.
Swinging a tennis racket or reaching for a glass of water feels natural because the brain treats the racket or the glass as part of the body for that moment. This map is flexible. It expands and reshapes when new tools or objects prove useful.
The team wanted to know if a humanoid robot could fit into that same map. Could our brain adopt a robot hand the way it adopts a familiar tool? That was the puzzle.
The experiment built on what scientists call the near-hand effect. When a hand is near an object, the brain pays more attention to it. It prepares to act.
Earlier studies showed that this effect can even extend to another person’s hand, but only after working together. The researchers asked if the same could happen with a robot.
Giulia Scorza Azzarà, a PhD student at IIT and first author of the study, designed the test. Volunteers and iCub cut soap together using a steel wire. Afterward, the participants completed a Posner cueing task, a reaction-time test in which images flash on a screen.
The results were striking. People responded faster when images appeared near iCub’s hand. Their brains had started to treat the robot’s hand almost like their own.
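For readers curious about how such an effect is measured, here is a minimal sketch in Python of the kind of comparison a near-hand analysis involves. The reaction-time numbers below are invented for illustration only and are not the study's data; the sketch simply shows how a "near hand" advantage would appear as a difference in mean response times.

```python
import random
import statistics

random.seed(42)  # reproducible illustration

# Hypothetical parameters: reaction times (in milliseconds) are drawn
# from normal distributions, with the "near hand" condition assumed
# slightly faster, mimicking the near-hand effect described above.
N_TRIALS = 200
near_rts = [random.gauss(mu=420, sigma=40) for _ in range(N_TRIALS)]
far_rts = [random.gauss(mu=445, sigma=40) for _ in range(N_TRIALS)]

near_mean = statistics.mean(near_rts)
far_mean = statistics.mean(far_rts)

print(f"Mean RT near the hand:     {near_mean:.1f} ms")
print(f"Mean RT far from the hand: {far_mean:.1f} ms")
print(f"Near-hand advantage:       {far_mean - near_mean:.1f} ms")
```

A faster average response to targets appearing near the hand is the signature researchers look for; in the study, that advantage emerged for iCub's hand only after the shared task.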
This effect didn’t happen automatically. A control study showed that when the robot’s hand was simply placed near the screen, without any shared task, there was no change in attention.
Unlike a fake rubber hand, which can trick the brain through sight alone, the robot's hand needed active collaboration to matter.
In other words, the brain prioritizes the space near another's hand, human or robot, only once that hand becomes relevant to accomplishing a shared goal.
The way iCub moved changed how people responded. When the robot’s movements were wide, fluid, and well synchronized, the effect grew stronger.
Distance also mattered. When the robot's hand entered a person's personal space, the brain was more likely to accept it into its map. Engagement and closeness turned out to be powerful ingredients.
The study also measured how people felt about their robotic partner. Questionnaires revealed that the more competent, likable, or animated iCub seemed, the stronger the effect became.
When participants thought of the robot as capable of feeling or acting intelligently, their brains gave more weight to its presence. The more human-like and competent the robot was perceived to be, the more impact its hand had on attentional prioritization after collaboration.
What forms here is something called a joint body schema. During teamwork, our brains do not just map our own limbs. They also start to include a partner’s movements.
Athletes, dancers, and musicians rely on this unconscious merging all the time. The new study shows that robots can be part of it too.
The researchers note that iCub’s behavior in the study was pre-programmed. It could not adapt in real time the way humans do.
Future robots will need to adjust their actions more flexibly, responding to the pace and style of their human partners. Adaptability could make the connection feel even more natural.
Future tests may also explore whether the same effect happens with non-humanoid objects, like a stick or another simple tool. That would show whether human-like design is essential, or whether collaboration itself is enough to reshape attention.
The applications are clear. Robots that people can accept as extensions of themselves could transform rehabilitation and assistive care.
Patients recovering from injury may work with robots that help restore movement by literally feeling like part of their bodies. Virtual reality systems could integrate robotic partners that guide attention and enhance immersion.
This project is part of the ERC-funded wHiSPER initiative, coordinated by IIT’s CONTACT unit. It reveals a powerful idea: working with robots can change how we see ourselves.
Slicing soap may seem trivial, but the shift it triggered shows just how ready our brains are to make space for new partners – even mechanical ones.
The study is published in the journal iScience.