The investigation, documented in The Journal of Experimental Social Psychology, aims to guide legislative discourse as robotic technologies evolve. "With robots increasingly taking on complex roles with minimal human oversight, their involvement in activities posing a risk to human safety, such as autonomous driving or military applications, prompts a crucial debate on accountability for harm caused by such autonomous entities," Dr. Dawtry remarked.
This exploration into the psychology of blame allocation is timely, given robots' expanding capabilities and their integration into areas fraught with ethical and legal complexity, such as autonomous weaponry and human rights. The study examines how the framing of robots' capabilities influences perceptions of their blameworthiness in harmful incidents.
In one segment of the study involving over 400 participants, a scenario was presented where an armed humanoid robot, engaged in a counter-terrorism operation, accidentally causes the death of a teenager. The findings revealed a propensity to assign more blame to the robot when it was described using terms that emphasized its advanced features, despite identical outcomes.
Furthermore, the research highlighted that the mere designation of a device as an 'autonomous robot', as opposed to a 'machine', significantly impacts the assignment of agency and blame. "Our findings illuminate the subtle yet profound impact of linguistic framing on the attribution of autonomy and blame to robots. This could have far-reaching implications as robots gain actual sophistication or are portrayed as more advanced than they are," Dr. Dawtry concluded.
Research Report: Hazardous machinery: The assignment of agency and blame to robots versus non-autonomous machines
Related Links
University of Essex