Abstract
Autonomous robots increasingly perform functions that are potentially hazardous and could injure people (e.g., autonomous driving). When such harms occur, questions of responsibility arise, but autonomy complicates the issue: insofar as robots seem to control their own behaviour, where should blame be assigned? Across three experiments, we examined whether robots involved in harm are assigned agency and, consequently, blame. In Studies 1 and 2, people assigned more agency to machines involved in accidents when they were described as 'autonomous robots' (vs. 'machines') and, in turn, blamed them more, across a variety of contexts. In Study 2, robots and machines were assigned similar experience, and we found no evidence that experience played a role in blaming robots over machines. In Study 3, people assigned more agency and blame to a more (vs. less) sophisticated military robot involved in a civilian fatality. Humans responsible for the robots' safe operation, however, were blamed similarly whether harms involved a robot (vs. a machine; Study 1) or a more (vs. less; Study 3) sophisticated robot. These findings suggest that people spontaneously conceptualise robots' autonomy as humanlike agency and, consequently, consider them blameworthy agents.
| Original language | English |
|---|---|
| Article number | 104582 |
| Number of pages | 8 |
| Journal | Journal of Experimental Social Psychology |
| Volume | 111 |
| Early online date | 19 Dec 2023 |
| DOIs | |
| Publication status | Published - 1 Mar 2024 |
Keywords
- Autonomy
- Blame
- Mind perception
- Robots
ASJC Scopus subject areas
- Social Psychology
- Sociology and Political Science