Trustworthy AI

a cooperative approach

Authors

Jacob Livingston Slosser, Birgit Aasa, Henrik Palmer Olsen

DOI:

https://doi.org/10.26116/techreg.2023.006

Keywords:

Artificial Intelligence, Trustworthy AI, Artificial Intelligence Act, Automated Decision Making, Administrative Law

Abstract

In response to the increased demand for trustworthy AI, the EU has proposed harmonized rules on artificial intelligence (AI Act) and a directive on adapting non-contractual civil liability rules to AI (AI Liability Directive). However, the concept of trustworthy AI remains unspecific, covering a range of desired characteristics such as safety, transparency, and accountability. Trustworthiness requires a specific contextual setting involving human interaction with AI technology, and merely involving humans in decision processes does not guarantee trustworthy outcomes. In this paper, the authors argue for an informed notion of what it means for a system to be trustworthy and examine the concept of trust, highlighting its reliance on a specific relationship between humans that cannot be straightforwardly transmuted into a relationship between humans and machines. They outline a trust-based model for a cooperative approach to AI and provide an example of what that might look like.


Author Biographies

Jacob Livingston Slosser, University of Copenhagen

Assistant Professor

iCourts - Danish National Research Foundation’s Centre of Excellence for International Courts
Faculty of Law

Birgit Aasa, University of Copenhagen

Postdoctoral Researcher

iCourts - Danish National Research Foundation’s Centre of Excellence for International Courts

Faculty of Law

Henrik Palmer Olsen, University of Copenhagen

Professor of Jurisprudence, University of Copenhagen, Faculty of Law, iCourts - the Danish National Research Foundation’s Centre of Excellence for International Courts; Research fellow 2023-2024 at IEA-Paris.



Published

2023-10-27

How to Cite

Slosser, J. L., Aasa, B., & Olsen, H. P. (2023). Trustworthy AI: a cooperative approach. Technology and Regulation, 2023, 58–68. https://doi.org/10.26116/techreg.2023.006


Section

Articles
Received 2023-03-07
Accepted 2023-09-05
Published 2023-10-27