The Role of Decision Authority and Stated Social Intent as Predictors of Trust in Autonomous Robots

Top Cogn Sci. 2022 Jan 27. doi: 10.1111/tops.12601. Online ahead of print.

Abstract

Prior research has demonstrated that trust in robots and robot performance are two important factors influencing human-autonomy teaming. However, other factors, such as a robot's perceived intent and its decision authority, may also shape users' perceptions and use of autonomous systems. The current study experimentally examined participants' trust in an autonomous security robot (ASR), perceived trustworthiness of the ASR, and desire to use an ASR that varied in levels of decision authority and benevolence. Participants (N = 340) were recruited from Amazon Mechanical Turk. Results revealed that participants reported greater trust in the ASR when the robot was described as having benevolent intent rather than self-protective intent. Several interactions between decision authority and intent emerged when predicting the trust process, suggesting that intent may matter most when the robot has discretion in executing that intent. Participants also reported a greater desire to use the ASR in a military context than in a public context. These findings suggest that as robots become more prevalent in jobs paired with humans, factors such as transparency about a robot's intent and its decision authority will influence users' trust and perceptions of the robot's trustworthiness.

Keywords: Autonomous robots; Decision authority; Human-machine teams; Trust; Trust in human-robot interaction.