Why robots are being trained in self-awareness

Robots that can pass cognitive tests, such as recognising themselves in a mirror, and that are being programmed with a human-like sense of time show how machines are being shaped to play a bigger part in our everyday lives.

In 2016, for the first time, the number of robots in homes, the military, shops and hospitals surpassed the number used in industry. Instead of being concentrated in factories, robots are a growing presence in people’s homes and lives – a trend that is likely to accelerate as they become more sophisticated and ‘sentient’.

‘If we take the robot out of the factory and into a house, we want safety,’ said Dr Pablo Lanillos, an assistant professor at Radboud University in the Netherlands.

And for machines to safely interact with people, they need to be more like humans, experts like Dr Lanillos say.

He has designed an algorithm that enables robots to recognise themselves, in a similar way to humans.

A major distinction between humans and robots is that our senses are faulty, feeding misleading information into our brains. ‘We have really imprecise proprioception (awareness of our body’s position and movement). For example, our muscles have sensors that are not precise versus robots, which have very precise sensors,’ he said.

The human brain takes this imprecise information to guide our movements and understanding of the world.

Robots are not used to dealing with uncertainty in the same way.

‘In real situations, there are errors, differences between the world and the model of the world that the robot has,’ Dr Lanillos said. ‘The problem we have in robots is that when you change any condition, the robot starts to fail.’
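
One standard way to cope with such mismatches, and a useful way to picture the ‘imprecise but robust’ perception Dr Lanillos describes, is to weight each source of information by its precision. The sketch below is purely illustrative, not SELFCEPTION’s actual algorithm: it fuses an imprecise model prediction with a more reliable sensor reading by inverse-variance weighting, so that neither the model nor the sensor is trusted blindly.

```python
# Illustrative only: precision-weighted fusion of a model prediction and a
# sensor reading. All numbers are invented for the example.

def fuse(prediction, pred_var, reading, read_var):
    """Combine two noisy estimates by inverse-variance weighting."""
    w_pred = 1.0 / pred_var   # precision of the internal model
    w_read = 1.0 / read_var   # precision of the sensor
    estimate = (w_pred * prediction + w_read * reading) / (w_pred + w_read)
    variance = 1.0 / (w_pred + w_read)
    return estimate, variance

# The robot's model predicts the arm is at 0.50 m but is imprecise;
# the joint sensor reads 0.56 m and is more reliable.
estimate, variance = fuse(prediction=0.50, pred_var=0.04,
                          reading=0.56, read_var=0.01)
print(f"fused estimate: {estimate:.3f} m")  # ~0.548 m, nearer the sensor
# The imprecise model still pulls the estimate slightly, so a slightly
# wrong model or a single bad reading degrades gracefully.
```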

At age two, humans can tell the difference between their bodies and other objects in the world. But this computation, which a two-year-old brain performs easily, is very complicated for machines and makes it difficult for them to navigate the world.

Recognise

The algorithm that Dr Lanillos and colleagues developed in a project called SELFCEPTION enables three different robots to distinguish their ‘bodies’ from other objects.

Their test robots included one composed of arms covered with tactile skin, another with known sensory inaccuracies, and a commercial model. They wanted to see how the robots would respond, given their different ways of collecting ‘sensory’ information.

One test the algorithm-aided robots passed was the rubber hand illusion, originally used on humans. ‘We put a plastic hand in front of you, cover your real hand, and then start to stimulate your covered hand and the fake hand that you can see,’ Dr Lanillos said.

Within minutes, people begin to think that the fake hand is their hand.

The goal was to deceive a robot with the same illusion that confuses humans. This is a measure of how well multiple sensors are integrated and of how well the robot can adapt to unexpected situations. Dr Lanillos and his colleagues made a robot experience the fake hand as its own hand, similar to the way a human brain would.
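
The same precision-weighting idea gives one intuition for why the illusion can work on such a robot: a sharp, synchronised visual cue at the fake hand simply outweighs imprecise proprioception at the real one. A minimal sketch, with all variances invented for illustration:

```python
# Hypothetical numbers: the real hand is at 0.00 m but proprioception is
# vague; synchronised touch and vision place a hand at 0.15 m, sharply.
prop_pos, prop_var = 0.00, 0.09   # real hand, imprecise proprioception
vis_pos, vis_var = 0.15, 0.01     # fake hand, precise vision

w_prop, w_vis = 1.0 / prop_var, 1.0 / vis_var   # precisions
felt = (w_prop * prop_pos + w_vis * vis_pos) / (w_prop + w_vis)
print(f"felt hand position: {felt:.3f} m")   # ~0.135 m, near the fake hand
```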

The second test was the mirror test, which was originally proposed by primatologists. In this exercise, a red dot is put on an animal’s or a person’s forehead, and then they look at themselves in a mirror. Humans, and some animals such as great apes, try to rub the red dot off their own face rather than off the mirror.

The test is a way to determine how self-aware an animal or person is. Human children are usually able to pass the test by their second birthday.

The team trained a robot to ‘recognise’ itself in the mirror by connecting the movement of limbs in the reflection with its own limbs. Now they are trying to get a robot to rub off the red dot.
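
The article does not detail how that connection is made, but one simple, hypothetical way to tag motion seen in the mirror as ‘self’ is to check how strongly it correlates with the robot’s own motor commands:

```python
import numpy as np

rng = np.random.default_rng(0)

# The robot's commanded joint velocities over 100 time steps.
commands = rng.normal(size=100)

# Motion tracked in the mirror: the reflection follows the commands
# (plus sensor noise); a bystander's motion does not.
reflection = commands + 0.2 * rng.normal(size=100)
bystander = rng.normal(size=100)

def is_self(observed, commanded, threshold=0.8):
    """Tag observed motion as 'self' if it tracks the motor commands.
    The threshold is an invented value for this toy example."""
    return np.corrcoef(commanded, observed)[0, 1] > threshold

print(is_self(reflection, commands))  # True  -> that moving thing is me
print(is_self(bystander, commands))   # False -> someone else
```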

The next step in this research is to integrate more sensors in the robot – and increase the information it computes – to improve its perception of the world. A human has about 130 million receptors in their retina alone, and 3,000 touch receptors in each fingertip, says Dr Lanillos. Dealing with large quantities of data is one of the crucial challenges in robotics. ‘Solving how to combine all this information in a meaningful way will improve body awareness and world understanding,’ he said.

Improving the way robots perceive time can also help them operate in a more human way, allowing them to integrate more easily into people’s lives. This is particularly important for assistance robots, which will interact with people and have to co-operate with them to achieve tasks. These include service robots, which have been suggested as one way to help care for the elderly.

‘(Humans’) behaviour, our interaction with the world, depends on our perception of time,’ said Anil Seth, co-director of the Sackler Centre for Consciousness Science at the University of Sussex, UK. ‘Having a good sense of time is important for any complex behaviour.’

Sense of time

Prof. Seth collaborated on a project called TimeStorm, which examined how humans perceive time and how to use this knowledge to give machines a sense of time, too.

Inserting a clock into a robot would not give it temporal awareness, according to Prof. Seth. ‘Humans – or animals – don’t perceive time by having a clock in our heads,’ he said. There are biases and distortions in how humans perceive time, he says.

Warrick Roseboom, a cognitive scientist also at the University of Sussex who spearheaded the university’s TimeStorm efforts, created a series of experiments to quantify how people experienced the passage of time.

‘We asked humans to watch different videos of a few seconds up to about a minute and tell us how long they thought the video was,’ Roseboom said. The videos were first-person perspectives of everyday tasks, such as walking around campus or sitting in a cafe. Subjects experienced time differently from the actual duration, depending on how busy the scene was.

Using this information, the researchers built a system based on deep learning that could mimic the human subjects’ perception of the video durations. ‘It worked really well,’ said Prof. Seth. ‘And we were able to predict quite accurately how humans would perceive duration in our system.’
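
The team’s actual model was a deep network trained on the human judgements; as a loose, hypothetical simplification of the same idea, one can accumulate how much each frame of a clip differs from the previous one, so that busier footage accrues more ‘subjective’ time:

```python
import numpy as np

def perceived_duration(frames, gain=0.05):
    """Toy subjective-duration estimate: accumulate frame-to-frame change,
    so busy scenes 'feel' longer. 'gain' is an invented constant, not a
    parameter fitted to the Sussex data."""
    changes = [np.mean(np.abs(frames[i] - frames[i - 1]))
               for i in range(1, len(frames))]
    return gain * float(np.sum(changes))

rng = np.random.default_rng(1)

# Two fake 60-frame greyscale clips of identical physical length:
static_cafe = rng.random((16, 16)) + 0.01 * rng.normal(size=(60, 16, 16))
busy_campus = rng.random((60, 16, 16))  # every frame very different

print(perceived_duration(static_cafe))  # small: little change per frame
print(perceived_duration(busy_campus))  # larger: the clip 'feels' longer
```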

A major focus of the project was to investigate and demonstrate machines and humans working alongside each other with the same expectations of time.

The researchers were able to do this by demonstrating robots assisting in meal preparation, such as serving food according to people’s preferences, a task which requires an understanding of human time perception, planning and remembering what has already been done.

TimeStorm’s follow-up project, Entiment, created software that companies can use to programme robots with a sense of time for applications such as meal preparation and wiping down tables.

In the last 10 years, the field of robot awareness has made significant progress, Dr Lanillos says, and the next decade will see even more advances, with robots becoming increasingly self-aware.

‘I’m not saying that the robot will be as aware as a human is aware, in a reflexive way, but it will be able to adapt its body to the world.’

The research in this article was funded by the EU.

Published on Horizon
