Technology and Regulation https://techreg.org/ <p><strong>Technology and Regulation</strong> (TechReg) is a new interdisciplinary journal of law, technology and society. TechReg provides an <strong>open-access</strong> platform for disseminating original research on the <strong>legal and regulatory challenges</strong> posed by <strong>existing and emerging technologies</strong>.</p> <p>The Editor-in-Chief is Professor Ronald Leenes of the Tilburg Law School. Our <a href="https://techreg.org/index.php/techreg/about/editorialTeam"><strong>Editorial Board Committee</strong></a> comprises a distinguished panel of international experts in law, regulation, technology and society across different disciplines and domains.</p> <p>TechReg aspires to become the leading outlet for scholarly research on technology and regulation topics, and has been conceived to be as accessible as possible for both authors and readers.</p> Open Press TiU en-US Technology and Regulation 2666-139X <p>Submissions are published under a Creative Commons BY-NC-ND license.</p> A Brief History of Data Protection by Design https://techreg.org/article/view/13807 <p>Article 25(1) of the General Data Protection Regulation (“GDPR”) is the first provision that comes to mind when discussing data protection by design. Yet, the origins of that concept can be traced back to an idea that was already solidly established in the software engineering community before its adoption. Besides, the GDPR is not the first binding piece of legislation that incorporates such an obligation. This paper unravels the history of data protection by design by delving into its technical roots and outlining the national and EU initiatives that have preceded the GDPR. 
Such a retrospective provides the necessary background to understand the implications and scope of its current manifestation in the text of the Regulation.</p> Pierre Dewitte Copyright (c) 2023 Pierre Dewitte https://creativecommons.org/licenses/by-nc-nd/4.0 2023-10-25 2023-10-25 2023 80 94 10.26116/techreg.2023.008 The Law and Political Economy of Online Visibility https://techreg.org/article/view/15790 <p>The paper critically assesses the regulation of social media recommendations in the EU’s 2022 Digital Services Act (DSA), drawing on Sarah Banet-Weiser’s economies of visibility theory. Banet-Weiser calls attention not only to injustices in the distribution of visibility between users, but also to the political implications of organising online media as an economy, in which individuals compete for visibility in a market structured by corporate platforms. DSA provisions on recommendations focus on enhancing user choice, protecting creators’ market access, and encouraging technocratic responses to particular negative externalities, such as promotion of disinformation. Ultimately, then, the DSA aims to enhance the functioning of existing economies of visibility, rather than more fundamentally reforming a social media market in which visibility is allocated based on commercial value.</p> Rachel Griffin Copyright (c) 2023 Rachel Griffin https://creativecommons.org/licenses/by-nc-nd/4.0 2023-10-25 2023-10-25 2023 69 79 10.26116/techreg.2023.007 Trustworthy AI https://techreg.org/article/view/13806 <p><em>The EU has proposed harmonized rules on artificial intelligence (AI Act) and a directive on adapting non-contractual civil liability rules to AI (AI liability directive) due to increased demand for trustworthy AI. However, the concept of trustworthy AI is unspecific, covering various desired characteristics such as safety, transparency, and accountability. 
Trustworthiness requires a specific contextual setting that involves human interaction with AI technology, and simply involving humans in decision processes does not guarantee trustworthy outcomes. In this paper, the authors argue for an informed notion of what it means for a system to be trustworthy and examine the concept of trust, highlighting its reliance on a specific relationship between humans that cannot be strictly transmuted into a relationship between humans and machines. They outline a trust-based model for a cooperative approach to AI and provide an example of what that might look like.</em></p> Jacob Livingston Slosser Birgit Aasa Henrik Palmer Olsen Copyright (c) 2023 Jacob Livingston Slosser, Birgit Aasa, Henrik Palmer Olsen https://creativecommons.org/licenses/by-nc-nd/4.0 2023-10-27 2023-10-27 2023 58 68 10.26116/techreg.2023.006 All Rise for the Honourable Robot Judge? https://techreg.org/article/view/17979 <div class="page" title="Page 1"> <div class="layoutArea"> <div class="column"> <p>There is a rich literature on the challenges that AI poses to the legal order. But to what extent might such systems also offer part of the solution? China, which has among the least developed rules to regulate conduct by AI systems, is at the forefront of using that same technology in the courtroom. This is a double-edged sword, however, as its use implies a view of law that is instrumental, with parties to proceedings treated as means rather than ends. That, in turn, raises fundamental questions about the nature of law and authority: at base, whether law is reducible to code that can optimize the human condition, or if it must remain a site of contestation, of politics, and inextricably linked to institutions that are themselves accountable to a public. 
For many of the questions raised, the rational answer will be sufficient; but for others, what the answer is may be less important than how and why it was reached, and whom an affected population can hold to account for its consequences.</p> <p>This contribution is followed by comments by Lyria Bennett Moses and Ugo Pagallo.</p> </div> </div> </div> Simon Chesterman Lyria Bennett Moses Ugo Pagallo Copyright (c) 2023 Simon Chesterman, Lyria Bennett Moses, Ugo Pagallo https://creativecommons.org/licenses/by-nc-nd/4.0 2023-10-03 2023-10-03 2023 45 57 10.26116/techreg.2023.005 Cookies and EU Law: History, Future Regulation and Critique https://techreg.org/article/view/17184 <div><span lang="EN-GB">Cookies and similar technologies can be used to track the online behaviour of internet users and can pose risks to their privacy and other fundamental rights. The use of cookies and similar technologies is therefore regulated by EU law. The article describes the history of EU law regulating cookies, analyses its current form and application to different technologies, and describes the proposals for the ePrivacy Regulation. Based on the analysis, it provides a critique of both the current law and the proposals and suggests ways forward in the regulation of cookies and similar technologies.</span></div> Jan Tomisek Copyright (c) 2023 Jan Tomisek https://creativecommons.org/licenses/by-nc-nd/4.0 2023-10-01 2023-10-01 2023 35 44 10.26116/techreg.2023.004 Harmed While Anonymous https://techreg.org/article/view/13829 <p>Data law and policy assume that harms to individuals can result only from personal data processing. Conversely, generation and use of non-personal data supposedly create new value while presenting no risk to individual interests or fundamental rights. Consequently, the law treats these two categories differently, constraining generation, use, and sharing of the former while incentivizing the latter. This article challenges this assumption. 
It proposes to divide data-related harms into two high-level categories: unwanted disclosure and detrimental use. It demonstrates how the personal/non-personal data distinction prevents unwanted disclosure but fails to capture, and unintentionally enables, detrimental use of data. As a remedy, the article proposes a new concept – data about humans – and illustrates how it could advance data law and policy.</p> Przemysław Pałka Copyright (c) 2023 Przemysław Pałka https://creativecommons.org/licenses/by-nc-nd/4.0 2023-09-27 2023-09-27 2023 22 34 10.26116/techreg.2023.003 How Decisions by Apple and Google obstruct App Privacy https://techreg.org/article/view/13254 <div class="page" title="Page 1"> <div class="layoutArea"> <div class="column"> <p>Ample past research has highlighted that privacy problems are widespread in mobile apps and can have disproportionate impacts on individuals. However, doing such research, especially through automated methods, remains hard and has become an arms race with those who engage in invasive data practices. This paper analyses how decisions by Apple and Google, the makers of the two primary app ecosystems (iOS and Android), currently hold back (automated) app privacy research and thereby create systemic risks that have previously not been systematically documented. 
Such an analysis is timely and pertinent since the newly enacted EU Digital Services Act (DSA) obliges Very Large Online Platforms to enable ‘vetted researchers’ to study systemic risks (Article 40) and to put in place reasonable, proportionate and effective mitigation measures against systemic risks (Article 35).</p> </div> </div> </div> Konrad Kollnig Nigel Shadbolt Copyright (c) 2023 Konrad Kollnig, Nigel Shadbolt https://creativecommons.org/licenses/by-nc-nd/4.0 2023-10-02 2023-10-02 2023 10 21 10.26116/techreg.2023.002 A Right of Social Dialogue on Automated Decision-Making: From Workers’ Right to Autonomous Right https://techreg.org/article/view/13258 <p>An emerging tool in the movement for platform workers’ rights is the right not to be subject to automated decision-making. In its most advanced formulation to date in art 22 of the EU General Data Protection Regulation 2016, this right includes ‘the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision’. Among other things, art 22 forms part of the groundwork of the December 2021 European Commission Proposal for a Directive on Improving Working Conditions in Platform Work, with its mantra of promotion of ‘social dialogue on algorithmic management’. In this article, we argue that art 22 and now the Directive offer an important tool for responding to the mechanistic working conditions of platform work. 
More broadly, we suggest that a right of social dialogue regarding automated decision-making, which art 22 represents, has the potential to serve as a signal achievement in the history of data rights developing to allow democratic involvement in decisions that affect people’s lives under modern industrial conditions.</p> Damian Clifford Jake Goldenfein Aitor Jimenez Megan Richardson Copyright (c) 2023 Damian Clifford, Jake Goldenfein, Aitor Jimenez, Megan Richardson https://creativecommons.org/licenses/by-nc-nd/4.0 2023-06-23 2023-06-23 2023 1 9 10.26116/techreg.2023.001