Abstract
This paper describes IEEE P7001, a new draft standard on transparency of autonomous systems. In the paper, we outline the development and structure of the draft standard. We present the rationale for transparency as a measurable, testable property. We outline five stakeholder groups: users, the general public and bystanders, safety certification agencies, incident/accident investigators, and lawyers/expert witnesses, and we explain the thinking behind the normative definitions of “levels” of transparency for each stakeholder group in P7001. The paper illustrates the application of P7001 through worked examples of both specification and assessment of fictional autonomous systems.
| Original language | English |
|---|---|
| Article number | 665729 |
| Journal | Frontiers in Robotics and AI |
| Volume | 8 |
| DOIs | |
| Publication status | Published - 26 Jul 2021 |
Bibliographical note
Funding Information: The work described in this paper was in part supported by EPSRC grants EP/L024845/1 (Verifiable Autonomy) (LD and AW), EP/V026801/1 and EP/V026682/1 (UKRI Trustworthy Autonomous Systems: Verifiability Node (LD) and Node on Trust (HH)), EP/S005099/1 (RoboTIPS: Developing Responsible Robots for the Digital Economy) (AW), and ESRC grant ES/S001832/1 (Driverless Futures) (AW). AT is funded by the Knut and Alice Wallenberg Foundation, grant agreement 2020.0221, and by the European Union’s Horizon 2020 research and innovation program under grant agreement No 825619.
Publisher Copyright:
© 2021 Winfield, Booth, Dennis, Egawa, Hastie, Jacobs, Muttram, Olszewska, Rajabiyazdi, Theodorou, Underwood, Wortham and Watson.
Keywords
- Robotics
- Artificial Intelligence
- Transparency
- Standard