AI, Law and Human Responsibility

Authors

  • Gregor Noll

Abstract

What do algorithmic technologies do to the law, how do they alter lawyers' work on legal issues, and how do they affect the allocation of legal responsibility? If it turns out that algorithmic technologies make it harder to identify a responsible subject, can we do something about it? These are the questions I try to answer in this article. After mapping how AI affects the law and the legal profession, I inquire into the factors distinguishing legal normativity from the normativity expressed in algorithmic technologies. I conclude that law and the cybernetic basis of AI conflict with each other in a way that is beyond remedy. AI fundamentally undermines lawyers' ability to attribute responsibility, as humans and algorithmic technology amalgamate in practice. I propose that the lawmaker impose strict responsibility on certain forms of AI to avoid a loss of accountability during the period in which traditional law and cybernetic normativity overlap.

Published

2021-12-01

How to Cite

Noll, G. (2021). AI, Law and Human Responsibility. Stockholm Intellectual Property Law Review, (2), 48–55. Retrieved from https://publicera.kb.se/siplr/article/view/14095

Section

Original Articles