Research reveals people's biases toward coloured robots correspond with their racial prejudices

People have similar automatic biases towards darker-coloured robots as they do toward people with darker skin colour, according to new research from the University of Canterbury (UC).

[Image: University of Canterbury. Source: Wikimedia Commons - Greg O'Beirne]

The new research paper, Robots and racism, is being presented on Thursday in Chicago at an international conference on human-robot interaction.

Most robots currently being sold or developed are either stylised with white material or have a metallic appearance, according to the research paper.

The paper was an international collaboration between four universities, and the research team included members of different nationalities and ethnicities.

"In this research, we examined if people automatically ascribe a race to robots such that we might say that some robots are 'White' while others are 'Asian' or 'Black'," the researchers said.

Researchers conducted a replication of the classic social psychological 'shooter bias' experiment, which shows that people from many backgrounds are quicker to shoot at armed black people than at armed white people, while also refraining more quickly from shooting unarmed white people than unarmed black people.

"This result should be troubling for people working in social robotics given the profound lack of diversity in the robots available," UC human-robot interaction expert associate professor Christoph Bartneck said.

"This lack of racial diversity amongst social robots may be anticipated to produce all of the problematic outcomes associated with a lack of racial diversity in other fields," he said.

"We hope the research might inspire reflection on the social and historical forces that have brought what is now quite a racially diverse community of engineers to, seemingly without recognising it, design robots that are easily identified by those outside this community as being almost entirely 'White'," he said.
