https://techreg.org/issue/feed
Technology and Regulation
2021-12-03T11:24:47+01:00
Ronald Leenes r.e.leenes@tilburguniversity.edu
Open Journal Systems

Technology and Regulation (TechReg) is a new interdisciplinary journal of law, technology and society. TechReg provides an open-access platform for disseminating original research on the legal and regulatory challenges posed by existing and emerging technologies.

The Editor-in-Chief is Professor Ronald Leenes of Tilburg Law School. Our Editorial Board Committee (https://techreg.org/index.php/techreg/about/editorialTeam) comprises a distinguished panel of international experts in law, regulation, technology and society across different disciplines and domains.

TechReg aspires to become the leading outlet for scholarly research on technology and regulation, and has been conceived to be as accessible as possible for both authors and readers.

https://techreg.org/article/view/10982
Keeping up with cryptocurrencies
2021-09-03T10:25:54+02:00
Lauren Fahy l.a.fahy@uu.nl, Scott Douglas s.c.douglas@uu.nl, Judith van Erp j.g.vanerp@uu.nl

Invented in 2008 with Bitcoin, cryptocurrencies represent a radical technological innovation in finance and banking, one which threatened to disrupt the existing regulatory regimes governing those sectors. This article examines, from a reputation management perspective, how regulatory agencies framed their response. Through a content analysis, we compare communications from financial conduct regulators in the UK, US, and Australia. Despite the risks, challenges, and uncertainties involved in cryptocurrency supervision, we find that regulators treat the technology as an opportunity to bolster their reputation in the immediate wake of the Global Financial Crisis. Regulators frame their response to cryptocurrencies in ways that reinforce the agency’s ingenuity and societal importance. We discuss differences in framing between agencies, illustrating how historical, political, and legal differences between regulators can shape their responses to radical innovations.

2021-03-31T00:00:00+02:00
Copyright (c) 2021 Lauren Fahy, Scott Douglas, Judith van Erp

https://techreg.org/article/view/10983
Not Hardcoding but Softcoding Data Protection
2021-09-03T10:31:59+02:00
Aurelia Tamò-Larrieux aurelia.tamo@gmail.com, Simon Mayer simon.mayer@unisg.ch, Zaïra Zihlmann zaira.zihlmann@unilu.ch

The delegation of decisions to machines has revived the debate on whether and how technology should and can embed fundamental legal values within its design. While these debates have predominantly taken place within the philosophical and legal communities, the computer science community has been eager to provide tools to overcome some of the challenges that arise from ‘hardwiring’ law into code. What has emerged are different approaches to code that adapts to legal parameters. In this article, we discuss the translational, system-related, and moral issues raised by implementing legal principles in software. While our findings focus on data protection law, they apply to the interlinking of code and law across legal domains. These issues point towards the need to rethink our current approach to design-oriented regulation and to prefer ‘soft’ implementations, where decision parameters are decoupled from program code and can be inspected and modified by users, over ‘hard’ approaches, where decisions are taken by opaque pieces of program code.

2021-05-06T00:00:00+02:00
Copyright (c) 2021 Aurelia Tamò-Larrieux, Simon Mayer, Zaïra Zihlmann
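To make the ‘hard’ versus ‘soft’ distinction concrete, here is a minimal sketch in Python; the parameter names, file format, and 30-day retention limit are invented for the example and are not taken from the article. The hard-coded variant buries the legal parameter inside program logic, while the soft-coded variant reads its decision parameters from a configuration that users can inspect and modify.

```python
# Illustrative sketch only: contrasts a hard-coded data-retention check with a
# soft-coded one whose decision parameters live outside the program code.
import json
from typing import Optional

# 'Hard' approach: the legal parameter is baked into the code and is opaque
# to anyone who cannot read (or change) the source.
def may_retain_hardcoded(record_age_days: int) -> bool:
    return record_age_days <= 30  # why 30? on what legal basis? the code does not say

# 'Soft' approach: decision parameters are decoupled from the code and kept in
# a document that data subjects, auditors, or controllers can inspect and edit.
DEFAULT_POLICY = {
    "retention_limit_days": 30,
    "rationale": "example storage-limitation policy (hypothetical)",
}

def load_policy(path: Optional[str] = None) -> dict:
    """Load decision parameters from a user-inspectable JSON file, or use defaults."""
    if path is None:
        return dict(DEFAULT_POLICY)
    with open(path, encoding="utf-8") as f:
        return json.load(f)

def may_retain_softcoded(record_age_days: int, policy: dict) -> bool:
    return record_age_days <= policy["retention_limit_days"]

if __name__ == "__main__":
    policy = load_policy()  # or load_policy("retention_policy.json")
    print(may_retain_hardcoded(45))          # False, for reasons hidden in the code
    print(may_retain_softcoded(45, policy))  # False, for reasons visible in the policy
```

Because the policy object is ordinary data, it can be versioned, logged, and shown to the people affected, which is the kind of inspectability and modifiability the abstract associates with ‘soft’ implementations.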
https://techreg.org/article/view/10985
On the legal responsibility of artificially intelligent agents
2021-09-03T10:43:13+02:00
Antonia Waltermann antonia.waltermann@maastrichtuniversity.nl

This paper tackles three misconceptions in discussions of the legal responsibility of artificially intelligent entities, namely that they

(a) cannot be held legally responsible for their actions, because they do not have the prerequisite characteristics to be ‘real agents’ and therefore cannot ‘really’ act;

(b) should not be held legally responsible for their actions, because they do not have the prerequisite characteristics to be ‘real agents’ and therefore cannot ‘really’ act;

(c) should not be held legally responsible for their actions, because doing so would allow other (human or corporate) agents to ‘hide’ behind the AI and thereby escape responsibility, even though they are the ones who should be held responsible.

(a) is a misconception not only because (positive) law is a social construct, but also because there is no such thing as ‘real’ agency. The latter is also the reason why (b) is misconceived. The arguments against misconceptions (a) and (b) imply that legal responsibility can be constructed in different ways, including ways that hold both artificially intelligent and other (human or corporate) agents responsible (misconception (c)). Accordingly, this paper concludes that there is more flexibility in the construction of responsibility of artificially intelligent entities than is at times assumed. This offers more freedom to law- and policymakers, but also requires openness, creativity, and a clear normative vision of the aims they want to achieve.

2021-07-12T00:00:00+02:00
Copyright (c) 2021 Antonia Waltermann

https://techreg.org/article/view/10986
Reviving Purpose Limitation and Data Minimisation in Data-Driven Systems
2021-09-03T10:48:04+02:00
Michele Finck michele.finck@uni-tuebingen.de, Asia J. Biega asia.biega@acm.org

This paper determines whether the two core data protection principles of data minimisation and purpose limitation can be meaningfully implemented in data-driven systems. While contemporary data processing practices appear to stand at odds with these principles, we demonstrate that systems could technically use much less data than they currently do. This observation is the starting point for our detailed techno-legal analysis, which uncovers obstacles that stand in the way of meaningful implementation and compliance, and exemplifies unexpected trade-offs that emerge when data protection law is applied in practice. Our analysis seeks to inform debates about the impact of data protection on the development of artificial intelligence in the European Union, offering practical action points for data controllers, regulators, and researchers.

2021-12-07T00:00:00+01:00
Copyright (c) 2021 Michele Finck, Asia J. Biega
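As a purely illustrative reading of the claim that systems could use much less data than they currently do, the following Python sketch (not the authors’ experimental setup; the dataset, model, and stopping threshold are arbitrary) trains a classifier on growing fractions of the training data and stops once extra data no longer yields a meaningful accuracy gain.

```python
# Toy illustration of performance-based data minimisation: keep adding training
# data only while it measurably improves the model. The dataset, model, and
# 0.5-point threshold are arbitrary choices made for this example.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

MIN_GAIN = 0.005  # stop once an extra slice of data adds < 0.5 points of accuracy
fractions = [0.05, 0.1, 0.2, 0.4, 0.6, 0.8, 1.0]
previous_acc = 0.0

for frac in fractions:
    n = max(1, int(frac * len(X_train)))
    model = LogisticRegression(max_iter=2000)
    model.fit(X_train[:n], y_train[:n])
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{frac:>4.0%} of training data ({n:4d} rows): accuracy {acc:.3f}")
    if acc - previous_acc < MIN_GAIN:
        print(f"Gain below threshold; {frac:.0%} of the data is arguably 'enough'.")
        break
    previous_acc = acc
```

The point of the sketch is only that ‘how much data is necessary’ can be probed empirically; the paper’s techno-legal analysis concerns the obstacles and trade-offs such operationalisations run into in practice.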
https://techreg.org/article/view/11001
The right of access to personal data: A genealogy
2021-09-03T12:10:12+02:00
René Mahieu rene.mahieu@vub.be

In this paper, I analyze several traditions of data protection to uncover the theoretical justification they provide for the right of access to personal data. Contrary to what is argued in most recent literature, I do not find support for the claim that the right follows from the German tradition of “informational self-determination” or Westin’s idea of “privacy as control”. Instead, two other, less well-known theories of data protection offer a direct justification for the right of access. First, the American scholars Westin and Baker developed the “due process” view, according to which access helps to expose error and bias in decision-making, thereby contributing to correct decisions and allowing the people who are affected to be involved in the decision-making. Second, in what I call the “power reversal” view of access, the Italian legal scholar Rodotà argues that, particularly when seen from a collective point of view, the right enables social control over the processing of personal data and serves as a counterbalance to the centers of power by subjecting them to democratic accountability.

2021-08-20T00:00:00+02:00
Copyright (c) 2021

https://techreg.org/article/view/11320
Special Issue: Should Data Drive Private Law?
2021-12-03T11:24:47+01:00
Vanessa Mak v.mak@law.leidenuniv.nl, Catalina Goanta catalina.goanta@maastrichtuniversity.nl, Monika Leszczynska monika.leszczynska@maastrichtuniversity.nl

People differ with respect to their preferences, personalities, cognitive abilities, and attitudes. Yet the way in which private law has evolved over the past centuries sacrifices this heterogeneity for the sake of the legal certainty that flows from generalizations and typifications. The law distinguishes between groups of individuals, such as consumers and professionals, or even between average and vulnerable consumers. These groups are, however, based on conspicuous features that are taken to justify differential treatment. For instance, determining a profile of the average consumer requires a context, such as a given industry or age group, and reflects specific considerations such as the consumer’s skill in retrieving information about a transaction.

Decades of research in psychology and behavioral economics have generated a tremendous amount of knowledge about people’s behavior, creating typologies of their personality traits, intertemporal and social preferences, and cognitive skills. Big Data analysis has since shown that these types can be predictive of people’s behavior, as well as of their informational needs and other specific characteristics. Recently, legal scholars have proposed that the insights generated by this research could be embedded in private law through granular legal rules, for instance by introducing different default rules or privacy disclosures depending on people’s personality traits or preferences.

This special issue tackles the question of whether and how data shapes private law. The development of new technologies enables the collection and processing of both personal and non-personal data at an unprecedented scale.
The implications of this phenomenon for private law are twofold. On the one hand, the use of data in interactions between individuals may require adjustments to, or a reconceptualization of, private law rules and principles. On the other hand, data might also be used by legislators to help create new private law rules, as well as to develop consumer empowerment tools that strengthen consumers’ position when transacting with businesses.

Taking these different perspectives, the papers included in this special issue explore the implications of data for private law. The first article, by Antonio Davola, addresses the question of how the law deals with the use of data by businesses in their interactions with consumers. Davola analyzes existing private law rules on defective consent and argues that these rules could offer protection to consumers when they are targeted by businesses’ personalized commercial practices. He juxtaposes this solution with those provided by consumer law and data protection regulation, as well as competition law.

Exploring the second perspective – how data can be used in the development of private law – Fabiana di Porto, Tatjana Grote, Gabriele Volpi, and Riccardo Invernizzi demonstrate how data can be relied on in the legislative process. Di Porto et al. propose an automated text analysis method to extract information from contributions submitted by stakeholders during public consultations. Specifically, the authors compare the use and understanding of core terms by the various stakeholder groups consulted on the proposals for the Digital Markets Act and the Digital Services Act (see the sketch below).

Further papers in the special issue will be announced soon.

2021-12-03T00:00:00+01:00
Copyright (c) 2021 Vanessa Mak, Catalina Goanta, Monika Leszczynska
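As a rough, hypothetical illustration of the kind of automated consultation analysis described for di Porto et al. above, the Python sketch below counts how often a handful of core terms appear in invented submissions from different stakeholder groups and normalises the counts for comparison; the submissions, groups, and term list are made up, and the authors’ actual method is more elaborate.

```python
# Toy sketch of comparing core-term usage across stakeholder groups in a public
# consultation. The submissions, groups, and term list are invented for the
# example and are not taken from di Porto et al.'s study.
import re
from collections import Counter

submissions = {
    "industry": [
        "Gatekeepers need legal certainty; interoperability mandates may harm innovation.",
        "Innovation and proportionate obligations should guide the Digital Markets Act.",
    ],
    "civil_society": [
        "Gatekeepers must provide interoperability and transparency to protect users.",
        "Transparency of recommender systems is essential for fundamental rights.",
    ],
}

CORE_TERMS = ["gatekeeper", "interoperability", "transparency", "innovation"]

def relative_term_use(texts):
    """Occurrences of each core term per 1,000 tokens of a group's submissions."""
    tokens = re.findall(r"[a-z]+", " ".join(texts).lower())
    counts = Counter()
    for token in tokens:
        for term in CORE_TERMS:
            if token.startswith(term):  # crude stemming: 'gatekeepers' -> 'gatekeeper'
                counts[term] += 1
    return {t: round(1000 * counts[t] / max(len(tokens), 1), 1) for t in CORE_TERMS}

for group, texts in submissions.items():
    print(group, relative_term_use(texts))
```

Scaled up to real consultation responses, the same comparison would show whether, say, ‘gatekeeper’ or ‘interoperability’ carries different weight for industry and civil-society respondents, which is the kind of divergence in use and understanding the authors examine.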