Once upon a time, in legal circles, there was just one conversation—Law 1.0. This was—and still is—a conversation about how settled legal principles should be applied to a particular set of facts. Where the facts include novel technological elements—for example, autonomous vehicles, drones, automated transactions, 3D printing, musical composition by artificial intelligence, in vitro fertilization, and so on—we might ask how the general principles of the law apply. Characteristically, Law 1.0 is a conversation for advocates, attorneys, and judges, with the courtroom as the venue for making determinative rulings on the particular questions raised.
However, with the progressive industrialization of the common law world, and with its countries becoming ever more technologically sophisticated, Law 1.0 was joined and partially overtaken by a quite different conversation—Law 2.0. This is a conversation about regulatory policy and purpose—not legal principle—and about how best to serve some particular policy or purpose. To the extent that new technologies raise concerns about human health and safety, environmental hazard, or financial risk, or to the extent that they impact privacy, human dignity, and the like, they will be viewed as problematic and in need of regulation. Accordingly, the typical approach of Law 2.0 is to consider whether existing rules are fit for regulatory purposes and, if not, how they need to be revised to render them fit for purpose. Law 2.0 differs from Law 1.0 in content as well as form: the parties to the conversation—lawmakers, policy-makers, and regulators—and the political venues in which it takes place are quite different.
Today, a third conversation, Law 3.0, is taking shape. Again, this is a conversation that joins Law 1.0 and Law 2.0 but also partially eclipses them. In Law 3.0, the questions, as in Law 2.0, are regulatory and instrumental (relative to specific policies and purposes), but for the first time, rather than presenting a problem, technologies are now viewed as part of the solution. For example, in response to concerns about unauthorized drones near airfields, the Law 3.0 response is to consider not only revising the rules so that the exclusion zones are extended, but also the employment of technologies such as geo-blocking or geo-fencing that make it more difficult for drones to get near airfields.
In a Law 3.0 conversation, “technological measures” should be understood broadly as including the architecture of spaces, the automation of processes, the coding of products, and, possibly, the coding of persons. Currently, it is not clear who should be part of this conversation, nor is it clear where it should take place. However, it is a conversation that needs to bring together lawyers, politicians, technologists, and the public; and it needs to be accessible. Law 3.0 is such a transformative conversation that it needs to be more transparent than ever and open to everyone.
Topics of Conversation

From the many possibilities, I will highlight just two key topics for the Law 3.0 conversation—one concerning regulatory legitimacy, the other the preparedness of our global institutions for their responsibilities.
First, Law 3.0 is extremely instrumental in its focus. The emphasis is on what works. In Law 1.0, there is little attention to what works, but there is a residual sense that the general principles of the common law are reasonable and legitimate. What urgently needs to be brought to the Law 3.0 conversation is a modern discourse of reasonableness and legitimacy. I suggest three questions that we should ask to test the legitimacy and reasonableness of any proposed use or application of technology—including, as in Law 3.0, its use as a regulatory tool.
The first and paramount question is whether the application or use is compatible with the preconditions of human social existence. This rests on the simple insight that plurality needs a platform. That platform is found in the essential conditions for human social existence. Without the conditions for humans to exist—just to exist—there are no humans to debate and contest their purposes and priorities. Without conditions that are conducive to agency, humans will not be able to define themselves as the individuals that they are and the communities that they aspire to be. In the context of Law 3.0, regulators need to be particularly sensitive to the risk that technological measures do so much regulatory work that they crowd out the space that humans need to exercise autonomy, to display moral virtue, to take responsibility, and to show trust in and respect for others. Moreover, commitments to liberty mean little if, in practice, humans find themselves in technologically managed environments where they have no option other than to work with the affordances given by the technical measures.
The second question is whether the application or use is compatible with the fundamental values of the particular community—with the values, so to speak, that give the community its distinctive identity and that make it the particular community that it is. Unlike the first question, the second question allows for a plurality of answers provided always that those answers are not incompatible with the global commons.
Finally, the third question is whether regulators have reached a reasonable accommodation of whatever plurality of views—for example, views about the importance of innovation and the balance of benefits and risks—they identify in their community through consultative and deliberative processes. As with the second question, the accommodations reached might legitimately vary from one community to another, but plurality at this level must be compatible with both the global commons and the community’s constitutive values.
Putting this another way, the idea is that Law 3.0 conversations should be constrained by the principle that no technological instruments should be applied for regulatory purposes unless they meet the terms of a triple license, namely: a global commons license, a community license, and a social license.
International Cooperation

Clearly, some of the matters implicated in the triple license require international coordination and stewardship, and this leads to the second topic—namely, whether our international institutions are themselves fit for purpose.
This part of the conversation should start with a reality check. First, while all members of the United Nations are formally equal, the actuality is that some are more equal than others. Crucially, the veto power of the permanent members of the Security Council invites the subordination of collective well-being to shorter-term national priorities. Second, the makers and subjects of international law have different amounts of power and influence, different intentions, different levels of commitment to collective responsibilities, and different degrees of civilization. Third, there are both functioning states and failed states. Amongst the former, while many states are good citizens of the international order who respect the rule of international law, there are also superpowers who play by their own rules and rogue states who play by no rules. If the regulatory stewards were drawn from the good citizens, that might be fine insofar as an agency so populated would be focused on the right questions and motivated by concern for the common interest of humanity. However, all states need to be on board with the responsibilities of global stewardship. Finally, where the missions of international agencies include a number of objectives (such as trade, human rights, and environmental concerns), or where there is a dominant objective (such as the control of narcotics), value commitments to human rights will sometimes be overridden or even treated as irrelevant.
From any perspective, this is all highly problematic. It is one thing for the international community to unite around what it takes to be its shared prudential interests and, in so doing, to give less weight to its interest in certain aspirational values, but respect for the essential conditions of human social existence should never be neglected, collateralized, nullified, or otherwise compromised. Tinkering with institutional structures and their rules is not sufficient; what is necessary for Law 3.0 to flourish is a common commitment to discharging the responsibilities of global stewardship.
Welcome, then, to the conversation that is Law 3.0. It might or might not end well. Employing technological tools for regulatory purposes could be catastrophic. One thing is for sure: if Law 3.0 is not the end of us, it will not be too long before it is overtaken by an even more intensive technological conversation, Law 4.0.
. . .
Roger Brownsword holds professorial positions in Law at King’s College London (where he is Director of TELOS) and at Bournemouth University. He is also an honorary Professor in Law at the University of Sheffield and a visiting professor at City University Hong Kong. His many publications include Contract Law: Themes for the Twenty-First Century (OUP, 2006), Rights, Regulation and the Technological Revolution (OUP, 2008), and, most recently, Law 3.0: Rules, Regulation and Technology (Routledge, 2020). He is the founding general editor (with Han Somsen) of Law, Innovation and Technology, an editorial board member of several international journals including the Modern Law Review, and a specialist adviser to parliamentary committees. Dr. Brownsword has been a member of various working parties, most recently the Royal Society Working Party on Machine Learning.