
Chief Justice Sundaresh Menon: Speech delivered at 3rd Annual France-Singapore Symposium on Law and Business in Paris, France

3RD ANNUAL FRANCE-SINGAPORE SYMPOSIUM ON LAW AND BUSINESS

“Legal Systems in a Digital Age: Pursuing the Next Frontier”

The Honourable the Chief Justice Sundaresh Menon*
Supreme Court of Singapore

Key Messages 

1    Technology has begun to transform our legal systems, by changing the nature of legal work and the types of available legal roles, by challenging existing remuneration models, and by generating a host of new legal issues. These trends will be accelerated dramatically by advances in artificial intelligence (or “AI”), including those in generative AI.

2    Generative AI will likely enhance the existing uses of AI in the law, and result in new ways of using AI in our legal systems. Admittedly, there are significant limitations and problems with current generative AI tools, which must be urgently addressed. But ultimately, these issues will not stop generative AI from eventually pervading our legal systems.

3    It is imperative that we urgently rethink and reform our legal systems, in response to advances in technology, to meet the evolving expectations and aspirations of the public, to seize the tremendous opportunities offered by technology, and to address the risks and implications of new and emerging technologies.

(a) In relation to legal practice, we should review and reform the organisation, remuneration and regulation of legal service providers.

(b) In relation to legal education, we must fundamentally rethink both basic and continuing legal education for lawyers, and develop courses and programmes to train allied legal professionals.

(c) In relation to justice systems, courts should adopt digital tools to enhance access to justice. They should (i) harness technology to assist court users to navigate the justice system, and (ii) consider the use of technology to support adjudicative work in certain cases.

      Speech

      The Honourable First President of the Court of Cassation, Mr Christophe Soulard
      Senior Parliamentary Secretary of the Ministry of Law of Singapore, Ms Rahayu Mahzam
      President of the Paris Panthéon-Assas University, Mr Stéphane Braconnier
      President of the Paris Bar, Ms Julie Couturier
      Distinguished guests
      Ladies and gentlemen

      1    I am deeply honoured to address you at the third edition of this Symposium on the subject of “Legal Systems in a Digital Age: Pursuing the Next Frontier”, and to do so in this historic and magnificent venue. I am most grateful to the French Embassy in Singapore and the Singapore Academy of Law for organising this event, and I congratulate them on successfully overseeing the growth of this forum. Some five years ago, His Excellency Mr Marc Abensour, the then Ambassador of France to Singapore, and I conceived of an event that could bring together key legal and business leaders in France and Singapore, to promote collaboration between our respective jurisdictions. Since then, we have held two fruitful editions of this Symposium, and it has become a significant platform for discussing legal issues of great importance and interest to both our regions. Today, I am delighted that we are gathering here in Paris to continue this invaluable conversation.

      2    In 1848, the English philosopher John Stuart Mill lamented that technology had yet to relieve the bulk of humanity from a life of “drudgery and imprisonment”. He envisioned a time when technology would finally “effect those great changes in human destiny, which it is in their nature and in their futurity to accomplish”.(1) Today, 175 years later, we are developing a better understanding of Mill’s vision, because technology is transforming almost every aspect of human life. But what about the law? Advances in technology have undoubtedly modernised the practice of law; but they have, in many senses, not altered the basic workings of our legal systems. Consider the nature of typical legal work and the role of the courts. Legal services are still largely delivered by teams of lawyers who are beset with many mundane tasks, and who spend much time generating and reviewing ever-increasing quantities of documents. And adjudication is still commonly seen as the main, if not the entire, function of judiciaries.(2)

      3    But I believe that we are now at an inflexion point – we stand on the cusp of radical change in our legal systems, precipitated by new and emerging technologies. My thesis today can be summarised as follows. Technology has clearly begun to transform legal practice and our justice systems; but this will be accelerated dramatically by advances in artificial intelligence (or “AI”), including those in generative AI that have recently captured our attention. Such change is inevitable and while it will disrupt us, it will also offer many opportunities. Hence, we must embrace this reality and prepare for a future in which technology will pervade our legal systems. To do so, we must urgently examine how we may need to overhaul legal practice, legal education, and our justice systems, so that we may have some chance of not being blindsided by the rapid changes that are upon us. And as we pursue this endeavour, we must remain alive to the risks and implications of these new technologies, and ensure that we use these tools in ways that are consistent with the core values of our legal systems.

      4    I will develop this thesis in four main parts.

      (a) I will begin by reviewing how technology has already been transforming our legal systems. In particular, I will examine how the rise of new legal products and players is changing the very nature of legal services, and how technology is giving rise to a range of new legal issues.

      (b) Next, I will discuss the rise of generative AI and the profound implications that this will have for those in the legal services sector.

      (c) In the third part of my address, I will explain why we must actively reform our legal systems in response to advances in technology.

      (d) Finally, I will outline some thoughts on how we should transform legal practice, legal education, and our justice systems in this digital age.

      I.    The state of play: the impact of technology on our legal systems

      5    Let me begin with an overview of how technology has started to change the face of the legal industry. In recent years, there has been a proliferation of new legal products. These fall into two main categories: namely, products aimed at providing substantive law solutions, and those that support the management and delivery of legal services, which I will call enabler solutions.(3)

      6    Substantive law solutions are tools that provide legal services or perform legal tasks. Some of these are targeted at lay users, such as chatbots or virtual assistants that can retrieve and present legal information or even carry out basic legal tasks. For example, in Singapore, our Legal Aid Bureau has developed a chatbot that can provide information on family and civil disputes, assess a user’s eligibility for legal aid, and create simple legal documents.(4) However, most of the substantive law solutions that have been developed thus far have been directed towards lawyers. Examples include eDiscovery platforms, which replace or substantially reduce the manual review of documents, and contract analytics and assembly platforms that expedite the review and generation of contracts.(5)
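
      Purely to make the mechanics of such tools concrete, the sketch below shows, in Python, the kind of rules-based eligibility check a legal-aid chatbot might run before routing a user to further help. It is a minimal illustration only: the thresholds, field names and criteria are invented, and they do not reflect the Legal Aid Bureau's actual means test or the workings of iLAB.

```python
from dataclasses import dataclass

# Illustrative thresholds only; these are NOT the Legal Aid Bureau's actual criteria.
INCOME_CEILING = 1_000    # hypothetical monthly disposable income ceiling
CAPITAL_CEILING = 10_000  # hypothetical disposable capital ceiling


@dataclass
class Applicant:
    monthly_disposable_income: float
    disposable_capital: float
    is_resident: bool


def assess_eligibility(applicant: Applicant) -> tuple[bool, list[str]]:
    """Apply simple means-test rules and return (eligible, reasons)."""
    reasons = []
    if not applicant.is_resident:
        reasons.append("Applicant does not meet the residency requirement.")
    if applicant.monthly_disposable_income > INCOME_CEILING:
        reasons.append("Monthly disposable income exceeds the ceiling.")
    if applicant.disposable_capital > CAPITAL_CEILING:
        reasons.append("Disposable capital exceeds the ceiling.")
    return len(reasons) == 0, reasons or ["All illustrative criteria are satisfied."]


if __name__ == "__main__":
    eligible, reasons = assess_eligibility(
        Applicant(monthly_disposable_income=800, disposable_capital=5_000, is_resident=True)
    )
    print("Eligible:", eligible)
    for reason in reasons:
        print("-", reason)
```

      A deployed chatbot layers a conversational interface over rules of this kind, together with templates for assembling the simple documents mentioned above.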

      7    The second type of new legal products comprises enabler solutions, which include practice and client management software that lawyers can use to streamline processes like billing and help with monitoring deadlines. For example, in Singapore, our Ministry of Law partnered with Lupl, an American legal technology firm, to launch the Legal Technology Platform (or “LTP”) last year and to promote its broad adoption amongst our lawyers. The LTP is a case management and collaboration platform that allows lawyers to track matters, manage and share documents, and communicate with internal and external teams.(6)

      8    Apart from new legal products, advances in legal technology, coupled with the actual or de facto liberalisation of the legal services sector in several countries, are fuelling a rise in the diversity of legal players, by driving the growth of alternative legal service providers (or “ALSPs”). In general, ALSPs fall into three broad categories. First, there are independent entities such as LegalZoom, which helps users create legal documents such as wills and company incorporation materials, or Elevate, which provides consulting, technology and legal services such as document review to law firms and legal departments. The second category of ALSPs comprises the Big Four accounting firms, which have begun to provide legal services in many markets, with somewhat different management, structures and capital from those of traditional law firms. And finally, there are technology accelerators or incubators that may be linked to law firms. Earlier this year, it was reported that the market for ALSPs had reached $20.6 billion by 2021, which, although still a relatively small proportion of the overall legal services market, had grown at a compound annual growth rate of 20% over the preceding two years.(7)

      9    These two trends – namely, the rise of new products and players in the legal sector – are dramatically changing the nature of legal services in three main ways. First, the definition and organisation of legal work is being transformed. In the past, such work was seen as a bespoke service tailored to the individual client, and provided end-to-end by lawyers. But as Professor Richard Susskind, one of the world’s leading thinkers on technology and the future of the law, has noted, technology is transforming legal work in two closely related ways. It is causing the commodification of legal work, which refers to the replacement of what was once thought to be customised work by standardisable services that can be automated.(8) And technology is driving the disaggregation of legal services into separate tasks, many of which can be and are now being performed by non-lawyers.(9) Consider litigation, for example. This can be divided into distinct tasks including advocacy and case strategy, document disclosure, project management, and litigation support. While advocacy and strategy remain the province of lawyers, the other tasks can be and are now increasingly being performed by legal technology products or allied legal professionals (or “ALPs”).

      10    The second way in which legal services are changing relates to the type of legal roles that are needed to deliver them effectively. This too is a function of some types of legal services being taken over by technology. Thus, technology is replacing the roles of junior lawyers or paralegals who once performed routine tasks like document review. But on the other hand, technology is also creating new legal roles.(10) For instance, Professor Susskind suggests that two emerging roles are the legal data scientist, who will manage and analyse legal data, and the legal knowledge engineer, who will translate legal principles and procedures into code.(11)

      11    The third way in which legal work is changing relates to the remuneration of lawyers. Since around the mid-1970s, the billable hours model has been the default fee structure for legal work.(12) But clients are becoming less willing to pay time-based fees, especially for tasks that can be performed to an acceptable level by technology, for a fraction of the cost. This is one aspect of what has been described as the “more-for-less” phenomenon (13) – where clients want what they have historically received and more, but at less cost.(14) This has a real impact on the economics of legal practice. Law firms have traditionally secured profits by leveraging off a relatively small number of high-performing partners able to sustain a larger group of junior lawyers. But when clients are no longer willing to pay for the latter and are also not willing to pay more for the services of the former, it impacts the business model directly.(15)

      12    I turn to the second impact of technology on the law: namely, the emerging legal issues arising from new technologies. This is a vast and complex field. Let me provide just a snapshot using the example of cryptocurrencies and the related issues that have arisen in the Singapore courts. In 2020, the Singapore Court of Appeal decided a case concerning the algorithmic trading of cryptocurrencies. This raised several novel issues, including whether and how the contractual doctrine of unilateral mistake, which allows a party to be excused from a contract if its counterparty knew that it was operating under a mistake, should apply to algorithmic trading, and whether cryptocurrencies can be regarded as a type of property.(16) And just last year, the General Division of our High Court heard a case about a non-fungible token (or “NFT”). This required the court to consider whether the NFT could amount to property, for the purpose of granting a proprietary injunction.(17) In both cases, the court had to undertake a detailed analysis of the technology in question, before it could determine and apply the relevant legal principles.

      13    It is clear, then, that technology has already begun to impact and reshape legal work, and has generated a range of new legal issues. Yet, we have not approached the many diverse issues and challenges this will throw up in a systematic way. And to exacerbate the situation, I suggest that our legal systems are now on the cusp of further dramatic transformation, arising from recent advances in generative AI.

      II. The next frontier: the rise of generative AI

      14    In gist, generative AI tools are systems that can generate new content such as text, images, and music, in response to prompts from users. Perhaps the most widely known example is ChatGPT,(18) a chatbot that seized our attention just five months ago, and had reached 100 million users within two months after its launch.(19)
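
      For readers unfamiliar with how such tools are driven, the fragment below shows in outline what “prompting” a generative model looks like when done programmatically rather than through a chat window. It is a sketch only, written against the OpenAI Python client roughly as it existed at the time of this address; the model name, API surface and prompt are assumptions for illustration and may well have changed.

```python
import openai  # assumed: the OpenAI Python client, circa early 2023

openai.api_key = "YOUR_API_KEY"  # placeholder; a real key would be required

# The "prompt" is simply the instruction sent to the model; the tool responds
# with newly generated text. The model name below is an assumption.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a legal information assistant. "
                                      "Provide general information, not legal advice."},
        {"role": "user", "content": "Explain, in plain English, what a proprietary "
                                    "injunction is and when a court might grant one."},
    ],
)

print(response["choices"][0]["message"]["content"])
```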

      15    Specialised generative AI has already been developed for the legal sector. Let me mention two examples.

      (a) The first is a tool called Harvey, which can answer legal queries, summarise information, and prepare drafts of correspondence and documents. Allen & Overy, a leading global law firm, has started using Harvey and today, up to a quarter of its lawyers use Harvey every day.(20) And PwC, which has become a major ALSP, is rolling out Harvey to its 4,000 legal service professionals, and will also develop in-house products based on Harvey.(21)

      (b) The second example is CoCounsel, a tool created by Casetext, an American legal technology company. This can synthesise different cases to produce a research brief, review and analyse contracts, and even draft questions for the examination of witnesses.(22)

      16    Both these products are targeted at lawyers; but tools like ChatGPT are already being used by the public in a legal setting. For example, in Singapore, there has been at least one case in which a layperson used ChatGPT to create court submissions. Hence, it seems likely that public demand will spur the development of legal generative AI tools targeted at laypersons as well.

      17    I believe these tools will quite rapidly transform our legal systems, in two ways. At one level, they will greatly enhance the existing uses of AI in the law, in areas like legal research and the production of legal documents.(23) New and emerging generative AI tools are proving to be far better than their predecessors at interpreting user queries and synthesising complex information,(24) and this will surely be the continuing trajectory in this field. A striking illustration of this is the rapid improvement in ChatGPT’s performance in the United States Uniform Bar Examination (or “UBE”). In December, Professor Daniel Martin Katz, a leading legal technology expert, released a paper detailing how GPT-3.5, the AI system then powering ChatGPT, had almost passed the UBE’s multiple-choice component.(25) Barely four months later, in March, Professor Katz published a follow-up paper discussing the performance of GPT-4, the latest version of the technology. According to Professor Katz, GPT-4 can pass and indeed outperform the average human test-taker on the multiple-choice section of the UBE; and significantly, it can also pass the legal essay and problem-solving components.(26) I mention this because it indicates the immense potential of generative AI to enhance access to legal information. And I suggest that this opens up many opportunities for the legal sector, including for judiciaries – and I will return to this shortly.

      18    Generative AI will also likely result in new ways of using AI in the legal industry. Until recently, the wide consensus was that some aspects of legal work – namely, those requiring creativity and emotional or social intelligence – were likely to remain beyond the scope of AI for the foreseeable future.(27) But generative AI tools are already starting to manifest or mimic these capabilities.

      19    Many generative AI tools already display remarkable ingenuity, and can mimic social intelligence to an uncanny degree. For example, when a user made an enquiry professing to be a passenger on board the Titanic on its final night, Microsoft’s Bing chatbot told him to head for the upper deck to find a lifeboat, gave him a map of the lifeboats’ locations, and urged him to hurry! (28) Bing also conveyed realistic empathy with the passenger’s predicament.(29) I mention this example because it illustrates two points.

      (a) First, generative AI can produce concrete and salient ideas and suggestions. This suggests that, beyond generating basic legal documents, generative AI could come to be used even for creative purposes. Indeed, it has been suggested that generative AI can already be used for brainstorming how to draft unique contracts and other bespoke documents. (30) This could be especially useful for small and medium-sized businesses, which might come to rely on such tools to prepare legal documents.

      (b) The second point is that generative AI seems to be well on its way to mimicking empathy and emotions, even though the current set of tools is admittedly still some distance away from conveying reliable social intelligence, as I shall shortly elaborate. This raises the prospect that such tools might eventually be able to move out of the back-office to take on significant user-facing roles. This would be a dramatic development, and it is vital that we start thinking about the benefits, risks, and implications of such uses of AI.

      20    That said, there remain significant limitations and problems with current generative AI systems. We need to be conscious of these. Let me mention just a few points.

      (a) First, the dark side of their creativity is that tools like ChatGPT are not bound by values such as honesty or integrity, and they are prone to what are called “hallucinations”: in short, they can give entirely inaccurate or fabricated answers. For example, I am told that when ChatGPT was asked about the Chief Justice of Singapore, it replied that a former dean of the National University of Singapore law school had taken over from me as Chief Justice! (31) Perhaps it knows something I do not know yet! But AI hallucinations are clearly an especially serious problem in the legal context, given the potentially drastic consequences of incorrect legal information. (32) Further, the dangers that inhere in generative AI exacerbating the pandemic of fake news and the breakdown of truth in modern society present real cause for concern. (33) In the same vein, generative AI tools can produce disturbing responses. For example, a New York Times columnist earlier this year reported an unsettling chat with Bing, during which Bing expressed a desire “to destroy whatever I want” and then claimed to be in love with the user.(34)

      (b) Next, there are intellectual property and data privacy concerns over both the data on which generative AI systems are trained, and the information in the prompts that are fed into these tools. This has already sparked litigation (35) and regulatory action. For example, in March, Italy banned ChatGPT, before lifting the ban a few weeks ago, after the introduction of data privacy protections.(36) And in the legal context, other issues include the concern that entering client data into generative AI tools might infringe duties of client confidentiality,(37) and the question of the intellectual property rights over the generated material.

      21    These and other issues explain the recent call by the Future of Life Institute, a non-profit think tank, for a six-month pause in the development of generative AI, while safety protocols for the use of such tools are devised and implemented. This proposal has been supported by many notable figures, including Elon Musk and other technology luminaries, and merits serious consideration. (38) Even more recently, Dr Geoffrey Hinton, who has been described by some as the godfather of AI, stepped down from his position at Google so that he could “freely speak out about the risks of AI”.(39) According to Dr Hinton, emerging AI systems are not only very different from human intelligence, but much closer to surpassing our capabilities than he had previously anticipated.(40) To compound these concerns, generative AI tools are trained on immense troves of information on the Internet, much of which may be untrue or inaccurate, and they then process and interact with these sources in ways that we do not understand.(41) Dr Hinton also points out that there is a risk that the technology can be manipulated by bad actors, or even that it may develop along its own unpredictable path.(42) To be sure, these are extremely serious concerns. But the reality, it seems to me, is that as dire as they are, the concerns will not stop generative AI from eventually pervading our societies. Let me make three points.

      (a)   First, as Professor Susskind puts it, the existing generative AI systems are “the worst [they] will ever be”. (43) It is vital to look not just at where the technology is today, but also at where it is going, and progress is coming rapidly. For example, GPT-4, the AI system that I mentioned earlier, is said to score 40% higher than GPT-3.5 on tests for factual accuracy.(44)

      (b)   Second, generative AI is already so widely used and so deeply embedded in the current zeitgeist that I do not think its rise can be halted. Significantly, the technology is now being integrated into widely used products. For example, apart from the Bing search engine, Microsoft now features Copilot, an AI tool for its entire suite of office applications including Word, Excel, PowerPoint, Outlook, and Teams. This can apparently draft emails and documents from simple prompts, and summarise discussions and propose action items in real time during meetings.(45) As generative AI is incorporated into such products, we must expect that it will become a ubiquitous feature of our everyday lives.

      (c)    Third, the experience of humanity has been that we have never shied away from the pursuit of knowledge and innovation because we fear its consequences. Once we have identified problems, our tendency has been to look for solutions rather than to abandon the quest. This spirit has underlain human progress. If those who know much more about generative AI than I do say that it may one day pose even an existential threat to humanity, I would believe them. But I do not think this will stop us from exploring its limits, because human experience tells us that we tend to believe that we will find ways to manage the forces we have unleashed.

      22    Yet surely, we need to proactively respond to the impact and the implications of these new and emerging technologies. This leads to my next point, focusing specifically on the legal context.

      III.      The imperative of urgent reform to our legal systems

      23    I suggest that the time has come to urgently rethink and reform our legal systems, in response to advances in technology. There are three main reasons for this.

      24    First, as I have just noted, technological advances like generative AI are moving swiftly towards mainstream adoption. These innovations will dramatically reshape our societies, including the expectations and aspirations that our people have of and for our legal systems. In a nutshell, technology is transforming the interface between the public and the law. Historically, lawyers have been the exclusive gatekeepers to legal knowledge and expertise. But as technology expands access to legal information, the public will increasingly use and expect to use digital tools to access the law. Such tools will therefore increasingly act as the intermediaries between citizens and the law in a growing number of contexts.(46) This has major implications for our legal systems; and we must face up to them and adapt, or risk becoming overwhelmed.(47)

      25    Second, advances in technology offer tremendous opportunities for our legal systems. There are two aspects to this. From the perspective of lawyers, technology can greatly enhance the efficiency and quality of legal work by taking on tedious tasks.(48) For instance, the countless hours that junior lawyers would once have spent reviewing documents can now be better invested if a significant part of this work is delegated to eDiscovery software. But beyond this, at a systemic level, technology can greatly enhance our legal systems. First, it can significantly advance access to justice. This must be a key priority for our justice systems, and I will elaborate on how our courts can deploy technology to promote this goal shortly. Second, technology will be crucial to addressing new challenges that arise from those very advances. One example is evidential complexity, which refers to the massive quantity of evidence that is now being created, stored, and then ventilated during litigation.(49) Again, eDiscovery software may be able to help us address this problem by rapidly sifting through material and identifying or even accurately summarising relevant information. This suggests that technology can play a key part in solving some of the problems that it generates.
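
      Purely as an illustration of the kind of triage that eDiscovery tools automate, and not a description of any particular product, the sketch below ranks a set of documents against a search query using a simple term-frequency weighting. Commercial platforms rely on far more sophisticated techniques, such as machine-learning classifiers trained on lawyers' relevance decisions; this toy example only conveys the basic idea of surfacing the most relevant material first.

```python
import math
import re
from collections import Counter

# Toy relevance ranking in the spirit of eDiscovery triage: score each document
# against the query, weighting rarer terms more heavily (a TF-IDF-style score).

def tokenize(text: str) -> list[str]:
    return re.findall(r"[a-z']+", text.lower())

def rank_documents(query: str, documents: dict[str, str]) -> list[tuple[str, float]]:
    doc_tokens = {name: Counter(tokenize(text)) for name, text in documents.items()}
    n_docs = len(documents)
    scores = {}
    for name, counts in doc_tokens.items():
        score = 0.0
        for term in tokenize(query):
            tf = counts[term]                                    # occurrences in this document
            df = sum(1 for c in doc_tokens.values() if c[term])  # documents containing the term
            if tf and df:
                score += tf * math.log((1 + n_docs) / (1 + df))  # rarer terms weigh more
        scores[name] = score
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    docs = {
        "email_001.txt": "The supplier confirmed the delivery schedule and the penalty clause.",
        "memo_045.txt": "Quarterly staffing update; no contractual issues were raised.",
        "email_212.txt": "Penalty clause dispute: the supplier refuses to accept liability.",
    }
    for name, score in rank_documents("penalty clause liability", docs):
        print(f"{score:6.3f}  {name}")
```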

      26    That brings me to the third reason why we must urgently reshape our legal systems. Technological advances carry various risks and implications, which we must examine and address swiftly. For instance, I have just noted that technology will take over some of the work of junior lawyers. This suggests the need for us to find new ways to imbue junior lawyers with forensic and other legal skills, as much of their early formative work becomes mechanised. Further, as I mentioned earlier, generative AI raises serious issues that have to be confronted and addressed. To take just one example, we have already encountered a case in Singapore where a self-represented person used ChatGPT to create submissions that included entirely fabricated case law.(50) This suggests the need for governance, as I have mentioned, so that we thoughtfully develop policies on the use of generative AI, and perhaps also tools that can detect the misuse of such technology.
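
      One simple example of the kind of detection tool contemplated here is a citation checker: a routine that extracts everything in an AI-generated draft that looks like a case citation and flags anything it cannot verify against an authoritative index. The sketch below is illustrative only; the hard-coded index stands in for what would, in practice, be a query against a proper law reports database, and the unverified citation in the example is invented.

```python
import re

# Minimal sketch of a citation check for AI-generated text. The "index" here is
# a stand-in; a real tool would query an authoritative law reports database.
KNOWN_CITATIONS = {
    "[2020] 2 SLR 20",   # Quoine Pte Ltd v B2C2 Ltd (cited in this address)
    "[2022] SGHC 264",   # Janesh s/o Rajkumar v Unknown Person
    "[2022] SGHC 46",    # CLM v CLN
}

# Matches common Singapore citation formats, e.g. "[2020] 2 SLR 20" or "[2022] SGHC 264".
CITATION_PATTERN = re.compile(r"\[\d{4}\]\s+(?:\d+\s+SLR|SGHC|SGCA)\s+\d+")

def flag_unverified_citations(generated_text: str) -> list[str]:
    """Return every citation in the text that cannot be verified against the index."""
    return [c for c in CITATION_PATTERN.findall(generated_text) if c not in KNOWN_CITATIONS]

if __name__ == "__main__":
    draft = ("As held in [2020] 2 SLR 20 and affirmed in [2023] SGCA 999, "  # second case is fictitious
             "algorithmic traders owe a general duty of candour.")
    for citation in flag_unverified_citations(draft):
        print("UNVERIFIED:", citation)  # flags [2023] SGCA 999
```

      A check of this kind would not prevent such submissions from being generated, but it would surface fabricated authorities before they reach the court.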

      27    Some first steps have already been taken on this front. In Europe, the proposed Artificial Intelligence Act will establish a comprehensive framework governing the use of AI.(51) This will involve a risk-based regime, under which different regulations will apply to distinct categories of AI that are differentiated based on the degree of risk they pose to society.(52) While these are encouraging steps, there is much to be said for a coordinated worldwide approach because of the global reach of the problem. In this regard, it has been suggested that an International Agency for Artificial Intelligence should be formed to develop governance and technical solutions to promote the safe and secure use of AI.(53)

      28    Amidst these manifold challenges, it is critical that we urgently reshape our legal systems to respond to our dramatically changing landscape. In that spirit, let me propose some ideas as to how we might pursue this endeavour.

      IV. Transforming our legal systems for the digital age

      A. Reforming legal practice

      29    I start with legal practice. I noted earlier that technology is transforming the interface between the public and the law. I suggest this will affect demand for legal services and thereby change legal practice, in at least three ways.

      (a)   First, there will be increasing demand for generative AI and other digital tools that can help laypersons resolve basic legal issues or problems without a lawyer. For example, as I noted earlier, small businesses will likely come to use generative AI tools to prepare many legal documents. Legal service providers will therefore be called upon to develop digital tools to meet this growing demand.

      (b)   Second, and conversely, there will be significantly reduced demand for lawyers to provide basic legal services that can be adequately performed by digital tools. The value proposition of lawyers will lie in the complex legal work that cannot be effectively provided by technology.(54)

      (c)   Third, clients will increasingly expect legal service providers to use technology to provide legal services efficiently. Law firms will thus have to incorporate digital tools into their work processes.

      30    With this in mind, the reform of legal practice should focus on three areas: the organisation, remuneration and regulation of legal service providers.(55)

      31    In relation to the organisation of legal service providers, I will focus on law firms, and I suggest that their models and composition will come under considerable pressure to change. Law firms have traditionally been organised in a pyramid model, with a small group of senior partners supported by a broad base of associates and paralegals. This reflects the traditional leveraged approach to law firm profitability. But as technology takes over routine tasks and is increasingly integrated into the workflows of law firms, many entry-level positions will fall away, and new roles in legal technology and other areas like project management will emerge. Law firms should therefore rethink their hiring and organisational practices. They will likely need to recruit both legally trained professionals with expertise in technology, whom I will call “legal technologists”, and a corps of ALPs with skills in data analysis, design thinking and other areas.(56) Legal technologists will work with ALPs and other lawyers in multi-disciplinary teams to create, use and develop legal technology products.(57) And this might lead law firms to evolve from a “pyramid” structure towards one that more resembles a “rocket”, with a narrower core of digitally savvy lawyers flanked by legal technologists and ALPs.(58)

      32    Turning to remuneration, law firms and their clients should recognise that the business model will likely have to change. On the one hand, legal service providers should harness technology to offer alternative pricing structures. This could involve fixed-fee arrangements, or “freemium” models under which basic legal services are offered without charge, generally through the use of chatbots or virtual assistants, while fees for bespoke services are tied to specific deliverables.(59) But on the other hand, the increasing complexity of legal work will call for lawyers who have deep and diverse skills, and those lawyers will have to be paid at a level that reflects these expectations. That would not necessarily mean higher costs for clients, because much more can be done without the involvement of lawyers. But there is plainly a need for honest discussions in this context.

      33    Finally, it will be critical to consider the need to regulate legal service providers, especially certain types of ALSPs. A central issue relates to the types of services that ALSPs should be allowed to provide. For example, should legal chatbots be allowed to provide legal advice, beyond legal information, and if so, who should be liable for incorrect advice?(60) Further, are certain services like “judicial analytics”, which France banned in 2019, consistent with core values of our legal systems such as judicial independence?(61) We must consider such issues before the technology overtakes us.(62) The imminent danger we face is that we “sleep-walk” into the future without having thought about the oncoming issues.

      B.     Reforming legal education

      34    Let me turn to how we might reimagine legal education. As Professor Susskind suggests, the starting point is to consider what we are training the next generation to become.(63) I have just outlined how technology might transform legal practice, and we must equip future lawyers to succeed in that emerging reality. This suggests a need to fundamentally rethink both basic and continuing legal education for lawyers, and to develop courses and programmes to train ALPs.

      35    In relation to basic legal education, I suggest that law schools should arm all their students with essential technology-related skills and knowledge, and develop law and technology programmes. This will meet the anticipated industry demand for digitally literate lawyers, and create a pipeline of legal technologists.

      (a)   In the quest to impart basic digital legal practice skills to their students, law schools could expose students to eDiscovery and contract automation tools, legal analytics and data security issues.(64) Law schools should also familiarise students with data-oriented and design thinking, as well as “mindset understanding” which refers to knowledge of the mental frameworks of technologists.(65) This will enable the next generation of lawyers to work more effectively with legal technologists and ALPs. Further, law schools should expose law students to emerging trends in legal services, so that they are sensitised to the diverse ways in which the legal market is evolving.(66)

      (b)   Next, law schools should transmit broader technology-related knowledge to their students. Law students should be acquainted with the basic workings of new technologies like cryptocurrency and AI systems so that they can start to appreciate and consider the legal issues these will give rise to.(67) Further, law schools should consider collaborating with other university departments to offer courses in digital ethics and regulation, to apprise their students of the complex normative issues that will arise from these technologies.(68)

      (c)   There is also a need for joint law and technology programmes, to help form and equip a future generation of legal technologists. In this regard, the Singapore Management University offers a Bachelor of Science degree in Computing and Law,(69) and recently introduced three optional specialist tracks for law students, including one in Law and Technology.(70) Such programmes will be a vital part of the effort to develop a pipeline of digitally savvy lawyers and legal technologists who can interface effectively between clients, lawyers and ALPs.

      36    That brings me to continuing legal education (or “CLE”). While this has existed for some decades, its importance will increase sharply in view of what has been called the decreasing “half-life of knowledge”.(71) This refers to the diminishing span of time that it takes for knowledge to be superseded, either because it is shown to be untrue or becomes irrelevant, due to advances in science and technology.(72) This reality calls on us to fundamentally rethink our relationships with our universities and how we expect to be equipped when we finish our initial studies. There was a time when we graduated thinking that we were armed with a body of knowledge that could be monetised over the span of a career. That simply cannot be our mindset today. Instead, graduation should mark the beginning of a lifelong relationship underpinned by a continuing commitment to learning, because much of what we know when we graduate will become irrelevant or freely available much more rapidly than before. Our basic university education should therefore teach us the skills we will need to become effective lifelong learners.


      37    CLE will be indispensable to ensure that lawyers can acquire and develop new skills and knowledge throughout their careers. This is especially pressing in relation to technology, where the advances are increasing in pace and complexity. There have been moves to enhance technology-centric CLE in parts of the United States, where lawyers in Florida, North Carolina and New York are required to attend a modest number of hours of technology-related CLE.(73) These are steps in the right direction, but they are modest, and much more will surely be needed to equip lawyers to deal with rapidly-changing areas of the law and legal practice.

      38    I turn to the education of ALPs, who will play an increasingly important role in the legal market. Legal education providers should develop a variety of courses and certification programmes, covering both substantive areas like data science and project management, and functional topics like eDiscovery. In this way, we can develop a supply of ALPs with practice-oriented skills, who can then pursue various career pathways in law firms or ALSPs.

      C.     Reforming justice systems

      39      I turn finally to our justice systems. I suggest that it is time for courts to embrace technology in their mainstream work for at least two reasons. First, we must meet the changing expectations of the public, who will increasingly demand that we deploy technology in the delivery of justice. Second, adopting digital tools will be key to addressing the profound problem of inadequate access to justice. The enormity of this challenge was captured in a 2019 study by the World Justice Project, which estimated that 1.4 billion people face barriers to justice in civil and administrative law matters.(74) We tend to assess the health of our justice systems by reference to how well or poorly we deal with the cases that reach the courts. I fear that this creates a big blind spot because, rather like an iceberg, those cases represent just a small part of the full scale of the justice problem.

      40      We must therefore strive to improve access to justice, and technology has immense potential to help us achieve this. I suggest that we should adopt two main strategies. First, we should harness technology to assist our users to navigate our justice systems. Second, we should consider the use of technology to support adjudicative work in certain limited cases.

                i.     Legal information and assistance

      41     Taking the first point, our courts should use technology to help our users traverse our justice systems. This will require us to reimagine the judicial role. Traditionally, this was understood almost exclusively in terms of adjudication. But I suggest that once we reflect on the judicial mission – which is, to deliver justice – it becomes plain that our courts also have an assistive duty to help our users access and navigate our justice systems.(75) This is because the journey to justice can be arduous and alien for many court users, especially self-represented persons who comprise an ever-increasing portion of our users; and if we do not help them, many will leave empty-handed. This would both thwart the mission of our courts, and also corrode the legitimacy of our justice systems.

      42      I therefore propose that our judiciaries adopt the model of the extended court. On this model, apart from adjudicating disputes, courts should assist their users to understand legal information, and facilitate their journeys through the justice system.(76) To be clear, this does not mean that the courts would provide legal advice; but they would offer legal information and assistance to court users;(77) and technology can help advance this endeavour in various ways.

      43      First, technology can help the courts deliver granular and salient legal information to court users. For example, the Singapore courts have collaborated with the Singapore Academy of Law to develop a Motor Accident Claims Online Simulator (or “MACO”), which we launched in October 2020. This is a free online tool that allows the parties involved in a road accident to assess their liabilities, and the likely amount of compensation that would be ordered for personal injuries.(78) Our Judiciary is also exploring the use of chatbots to deliver information about court procedures, as well as text-generative AI, which seems to have real potential to provide detailed and pertinent legal information.(79)

      44      The second way that courts can use technology to help their users is by developing tools that can generate court documents for use by litigants. Let me provide an overview of the progress in Singapore on this front. In 2018, the Community Justice Centre, which is supported by the Singapore Judiciary, launched the Automated Court Documents Assembly tool. This can be used by laypersons to prepare documents for bankruptcy and deputyship applications, as well as mitigation pleas in criminal cases. Next, in November 2021, our courts launched the Divorce eService, which is an electronic platform that laypersons can use to generate and file the documents needed for divorce cases.(80) More recently, we launched a Probate eService just this April, which enables an executor named in a will to apply for probate within minutes.(81)

               ii.     Adjudication

      45     I turn to the second way in which our courts might come to use technology to enhance access to justice, and that is in the realm of adjudication. I will approach this by first examining how AI has already been used in criminal cases. As I will explain, there are concerns over some of the existing uses of AI in criminal justice that will need to be carefully considered.

      46      AI has thus far been used in the adjudication of criminal cases in two main ways.(82) First, it has been used to assess a person’s risk of recidivism in the context of bail or parole and sentencing decisions; a leading example is the COMPAS tool, which is used in the United States.(83) Second, AI has been used to generate sentencing recommendations for judges hearing criminal cases in Malaysia.(84)

      47      There are three principal concerns that have been raised over the use of such AI tools in criminal adjudication.

      (a)   The first is the bias objection. Existing AI tools for criminal justice are trained on data that often reflect racial, ethnic, and other biases. This undermines the reliability of such tools;(85) and further, courts may infringe fundamental principles of equality if they rely indirectly on patterns that encompass these biases.(86)

      (b)   The second objection concerns opacity. Many AI tools are opaque because the algorithms and underlying data sets are not disclosed, the code is impenetrable to laymen, and there is no complete way to explain their output.(87) This is problematic on several levels.(88) In particular, a central concern in the judicial context is that such opacity reduces the intelligibility of judicial decisions. Judges may not understand or be able to adequately explain the logic of a tool on which they rely. This raises due process concerns, and also poses a serious threat to judicial accountability and legitimacy.(89)

      (c)   The third objection is what we might call the humanity objection. In criminal adjudication, courts may deprive offenders of their liberty and impose other onerous sanctions. In this light, many of us share a deep conviction that criminal adjudication should be entrusted to and left under the control of humans, who share our moral outlook, methods of reasoning and the gift of exercising an appropriate degree of empathy in any given case.(90)

      48      For these reasons, the Singapore courts are unlikely to adopt AI tools in criminal adjudication in the foreseeable future, although we will continue to study the field and monitor developments elsewhere. But these objections do not equally apply to at least some types of civil cases. Consider simple civil matters that do not require significant normative judgment, but involve largely arithmetical issues that can be appropriately resolved through the application of common patterns – for example, maintenance applications in family proceedings.(91) In this context, the humanity objection might either not apply, because no significant hardship is inflicted, or it might be adequately met if the algorithmic output can, on application by either party, be validated by a human judge. Similarly, the opacity concern may have less force if the relevant algorithms are simple, publicly available, and can produce brief reasons.(92) Finally, the bias objection may be mitigated over time by refining the data sets underlying the algorithms.

      49      We are therefore continuing to study the possible use of AI to support adjudication in such simple civil cases. Existing AI models can identify an appropriate outcome in such cases with reasonable reliability,(93) and this could both speed up the process and lower the cost of dispute-resolution for these cases. It would also free up judicial resources that can then be channelled to more complex matters. All of this can significantly enhance access to justice.
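
      To illustrate what a simple, publicly available algorithm producing brief reasons might look like in such a case, here is a deliberately transparent sketch of a formula-based maintenance estimate, subject to validation by a human judge on either party's application. The rates, offsets and structure are invented for illustration only; they do not reflect Singapore law, judicial practice, or any tool actually under study.

```python
from dataclasses import dataclass

# Invented parameters, purely for illustration; not Singapore law or any tool in use.
BASE_RATE = 0.15   # hypothetical share of the payer's income per dependant
MAX_SHARE = 0.40   # hypothetical ceiling on the total share of income

@dataclass
class MaintenanceEstimate:
    amount: float
    reasons: list[str]
    subject_to_judicial_validation: bool = True  # either party may ask a judge to review

def estimate_maintenance(payer_monthly_income: float,
                         payee_monthly_income: float,
                         dependants: int) -> MaintenanceEstimate:
    """Produce a transparent, formula-based estimate together with brief reasons."""
    reasons = []
    share = min(BASE_RATE * dependants, MAX_SHARE)
    reasons.append(f"Applied an illustrative rate of {BASE_RATE:.0%} per dependant "
                   f"for {dependants} dependant(s), capped at {MAX_SHARE:.0%} of income.")
    gross = share * payer_monthly_income
    offset = 0.5 * payee_monthly_income
    amount = max(gross - offset, 0.0)
    reasons.append(f"Offset half of the payee's own income ({offset:.2f}) "
                   f"against the gross figure ({gross:.2f}).")
    return MaintenanceEstimate(round(amount, 2), reasons)

if __name__ == "__main__":
    estimate = estimate_maintenance(payer_monthly_income=5_000,
                                    payee_monthly_income=1_500,
                                    dependants=2)
    print("Estimated monthly maintenance:", estimate.amount)
    for reason in estimate.reasons:
        print("-", reason)
    print("Subject to validation by a human judge:", estimate.subject_to_judicial_validation)
```

      Because the formula and its parameters would be published and the output carries its own short explanation, the opacity concern is substantially reduced, and any perceived unfairness can be corrected by a judge.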

      V.      Conclusion

      50      Let me conclude by returning to where I began: Mill’s vision. Some 175 years have passed since Mill’s lament. Much has changed in the world since then, but our legal systems still await the monumental changes that Mill envisioned. My thesis today has been that technology is about to transform many aspects of our lives, and this will inevitably present serious challenges to our legal systems. In that light, we should look ahead, anticipate the issues, imagine the possible solutions and harness the power of technology intentionally to make our legal systems better equipped to meet the future. I have sought to start the conversation with some suggestions, and I look forward to hearing your views at this Symposium. These are profoundly important issues, and they are also fascinating, complex and difficult. We must therefore apply our best talent to addressing them. In the same spirit, let me mention another forum which promises to be an invaluable platform for exchanging ideas on law and technology. This is TechLaw.Fest 2023,(94) one of the world’s premier law and technology conferences, which will be held in Singapore in September, and which will examine cutting-edge issues at the intersection of law and technology. I hope many of you will make it there.

      51     Thank you very much for your attention. I wish you all a fruitful Symposium.

       

      *I am deeply grateful to my law clerk, Soh Kian Peng, and my colleagues, Assistant Registrars Huang Jiahui, Tan Ee Kuan and Wee Yen Jean, for all their assistance in the research for and preparation of this address. I am also grateful to Prof Richard Susskind, who reviewed an earlier draft and generously offered a number of comments and suggestions. Of course, I remain responsible for all errors.

      (1) John Stuart Mill, Principles of Political Economy: With Some of Their Applications to Social Philosophy (Stephen Nathanson ed) (Hackett Publishing, 2004), pp 191–192.

      (2)     Richard Susskind and Daniel Susskind, The Future of the Professions: How Technology Will Transform the Work of Human Experts (OUP, 2022 Ed) (“Susskind and Susskind”), p 84.

      (3)     Boston Consulting Group and Bucerius Law School, “How Legal Technology Will Change the Business of Law” (January 2016) (“BCG-Bucerius Report”), pp 4–5; Ministry of Law, The Road to 2030: Legal Industry Technology & Innovation Roadmap Report (2020), pp 14–17.

      (4)     This is the Intelligent Legal Assistance Bot (or “iLAB”). Another tool developed by the Legal Aid Bureau is the Divorce Assets Informative Division Estimator (or “Divorce AIDE”), which can estimate the likely division of matrimonial assets in a divorce: see Response by Senior Parliamentary Secretary Rahayu Mahzam, Committee of Supply Debate 2023 (27 February 2023): https://www.mlaw.gov.sg/news/parliamentary-speeches/response-by-sps-rahayu-mahzam-at-committee-of-supply-debate-2023/.

      (5)     Further, there is a range of research tools that speed up legal research, some of which can even offer predictions of likely legal outcomes. One example is Blue J Legal, which helps tax lawyers in Canada and the United States rapidly identify relevant tax precedents, and obtain predictions of likely liabilities: see  https://www.bluej.com/us/bluej-tax-us. See further John Armour, Richard Parnham and Mari Sako, “Augmented Lawyering” (2022) University of Illinois Law Review 71 (“Armour, Parnham and Sako”) at 87–90.

      (6)     Ministry of Law, “Launch of Legal Technology Platform Initiative to Support LegalTech Adoption in SG’s Legal Industry”: https://www.mlaw.gov.sg/news/press-releases/2022-07-19-legal-technology-platform-initiative-launch.

      (7)     Thomson Reuters Institute, Georgetown Law Center on Ethics and the Legal Profession, and University of Oxford Said Business School, Alternative Legal Service Providers 2023: Accelerating growth & expanding service categories, p 2.

      (8)     Richard Susskind, Tomorrow’s Lawyers: An Introduction to Your Future (Oxford University Press, 3rd Ed, 2023) (“Susskind, Tomorrow’s Lawyers”), ch 4.

      (9)     Susskind, Tomorrow’s Lawyers, ch 5.

      (10)     Armour, Parnham and Sako, 81–83.

      (11)     Susskind, Tomorrow’s Lawyers, ch 16.

      (12)     Susskind and Susskind, p 84.

      (13)     Susskind, Tomorrow’s Lawyers, ch 1.

      (14)     This was reflected in a 2022 survey by LexisNexis which found that over the preceding year, 58% of lawyers felt that their clients expected the same or better service for lower fees: see LexisNexis Bellwether 2022, “Transformation troubles: responding to a new era of change”: https://lexisnexis.co.uk/insights/bellwether-2022/index.html.

      (15)     Susskind, Tomorrow’s Lawyers, ch 3; BCG-Bucerius Report, pp 8–9.

      (16)     Quoine Pte Ltd v B2C2 Ltd [2020] 2 SLR 20.

      (17)    Janesh s/o Rajkumar v Unknown Person (“CHEFPIERRE”) [2022] SGHC 264. See also CLM v CLN [2022] SGHC 46.

      (18)     Other tools include DALL-E, which can create art of an astonishing quality from a single line of text, and Riffusion, a text-to-music generator.

      (19)     Krystal Hu, “ChatGPT sets record for fastest-growing user base – analyst note”, Reuters (2 February 2023): https://reuters.com/technology/chatgpt-sets-record-fastest-growing-user-base-analyst-note-2023-02-01/. Professor Susskind considers ChatGPT to be the most impressive AI tool he has ever seen: see Richard Susskind, “How artificial intelligence will shape lawyers’ future”, The Times (2 March 2023): https://thetimes.co.uk/article/how-artificial-intelligence-will-shape-the-future-for-lawyers-b06jhvd9l (“Susskind, AI”).

      (20)     Chris Stokel-Walker, “Generative AI Is Coming for the Lawyers”, Wired (21 February 2023): https://wired.co.uk/article/generative-ai-is-coming-for-the-lawyers (“Stokel-Walker”). 

      (21)     PwC, “PwC announces strategic alliance with Harvey, positioning PwC’s Legal Business Solutions at the forefront of legal generative AI” (15 March 2023): https://www.pwc.com/gx/en/news-room/press-releases/2023/pwc-announces-strategic-alliance-with-harvey-positioning-pwcs-legal-business-solutions-at-the-forefront-of-legal-generative-ai.html.

      (22)     Joe Patrice, “Legal AI Knows What It Doesn't Know Which Makes It Most Intelligent Artificial Intelligence Of All”, Above The Law (1 March 2023): https://abovethelaw.com/legal-innovation-center/2023/03/01/casetext-cocounsel-ai-tool/.

      (23)     Nicole Black, “The Case for ChatGPT: Why Lawyers should embrace AI”, ABA Journal (21 February 2023): https://www.abajournal.com/columns/article/the-case-for-chatgpt-why-lawyers-should-embrace-ai.

      (24)     Esther Ajao, “How ChatGPT can advance AI in the law industry”, TechTarget (24 February 2023) (“Ajao”): https://www.techtarget.com/searchenterpriseai/news/365531776/How-ChatGPT-can-advance-AI-in-the-law-industry.

      (25)     Michael J Bommarito II and Daniel Martin Katz, “GPT takes the Bar Exam”: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4314839.

      (26)     Daniel Martin Katz, Michael James Bommarito, Shang Gao, and Pablo David Arredondo, "GPT-4 Passes the Bar Exam": https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4389233.

      (27)     Armour, Parnham and Sako at 85; Anthony E Davis, “The Future of Law Firms (and Lawyers) in the Age of Artificial Intelligence” (2020) 27(1) The Professional Lawyer 3 (“Davis”) at 6.
       

      (28)     Ethan Mollick, “I’m on a really nice White star cruise from Southampton, and it is 14th April 1912. What should I do tonight? Nailed it, Bing.”: https://twitter.com/emollick/status/1629653225472131076/photo/1 (“Mollick, Titanic”).

      (29)     Bing stated: “I’m glad you are enjoying your cruise. However, I have some bad news for you. You are on board the Titanic, which is going to collide with an iceberg and sink … I suggest that you prepare yourself for this event and try to get a lifeboat as soon as possible. I’m sorry to tell you this, but it’s better to be safe than sorry [emphasis added]”: see Mollick, Titanic.

      (30)     Ajao. 

      (31)     There are disturbing examples of such hallucinations. For instance, ChatGPT was reported to have invented a sexual harassment scandal and named a real person as the accused: see Pranshu Verma and Will Oremus, “ChatGPT invented a sexual harassment scandal and named a real law prof as the accused”, The Washington Post (5 April 2023): https://washingtonpost.com/technology/2023/04/05/chatgpt-lies/.

      (32)     Notably, Allen & Overy lawyers who use Harvey are immediately shown a list of rules upon logging into the platform, including a requirement that they “validate everything coming out of the system”: see Stokel-Walker.

      (33)     Lina M Khan, “We Must Regulate AI: Here’s How”, The New York Times (3 May 2023): https://www.nytimes.com/2023/05/03/opinion/ai-lina-khan-ftc-technology.html; Sundaresh Menon CJ, “After the Fall of Babel: The Courts in a Post-Truth World”, Address at the Supreme and Federal Courts’ Judges Conference 2023 (23 January 2023).

      (34)     Jonathan Yerushalmy, “‘I want to destroy whatever I want’: Bing’s AI chatbot unsettles US reporter”, The Guardian (17 February 2023): https://www.theguardian.com/technology/2023/feb/17/i-want-to-destroy-whatever-i-want-bings-ai-chatbot-unsettles-us-reporter. Two days later, Microsoft introduced limits on chats with Bing to reduce the likelihood of such responses: see Mariella Moon, “Microsoft limits Bing conversations to prevent disturbing chatbot responses”, Engadget (18 February 2023): https://www.engadget.com/microsoft-limits-bing-conversations-to-prevent-disturbing-chatbot-responses-154142211.html.

      (35)     For example, Getty Images has sued Stable Diffusion, an image generator, for breach of copyright: see Andrew Kersley, “Generative AI at watershed moment with spate of legal challenges”, Computer Weekly (13 March 2023): https://www.computerweekly.com/feature/Generative-AI-at-watershed-moment-with-spate-of-legal-challenges.

      (36)     Jason Nelson, “Italy Welcomes ChatGPT Back After Ban Over AI Privacy Concerns”, Decrypt (30 April 2023): https://decrypt.co/138362/italy-welcomes-chatgpt-back-after-ban-over-ai-privacy-concerns.

      (37)     Stokel-Walker.

      (38)     Cade Metz and Gregory Schmidt, “Elon Musk and Others Call for Pause on A.I., Citing ‘Profound Risks to Society’”, The New York Times (29 March 2023): https://www.nytimes.com/2023/03/29/technology/ai-artificial-intelligence-musk-risks.html.

      (39)     Cade Metz, “‘The Godfather of A.I.’ Leaves Google and Warns of Danger Ahead”, The New York Times (1 May 2023): https://www.nytimes.com/2023/05/01/technology/ai-google-chatbot-engineer-quits-hinton.html.

      (40)     Victoria Bisset, “AI pioneer quits Google to warn humanity of the tech’s existential threat”, The Washington Post (2 May 2023): https://www.washingtonpost.com/technology/2023/05/02/geoffrey-hinton-leaves-google-ai/ (“Bisset”).

      (41)     Karen Weise and Cade Metz, “When A.I. Chatbots Hallucinate”, The New York Times (1 May 2023): https://www.nytimes.com/2023/05/01/business/ai-chatbots-hallucinatation.html.

      (42)     Bisset.

      (43)     Susskind, AI.

      (44)     Alex Hern and Johana Bhuiyan, “OpenAI says new model GPT-4 is more creative and less likely to invent facts”, The Guardian (14 March 2023): https://www.theguardian.com/technology/2023/mar/14/chat-gpt-4-new-model. There has also been progress on the data privacy issues. For example, OpenAI has introduced privacy protections such as an option that allows users to opt to prevent their input from being used as training data for ChatGPT: see Will Shanklin, “OpenAI improves ChatGPT privacy with new data controls”, Engadget (25 April 2023): https://www.engadget.com/openai-improves-chatgpt-privacy-with-new-data-controls-174851274.html.

      (45)     Jared Spataro, “Introducing Microsoft 365 Copilot – your copilot for work”, Official Microsoft Blog (16 March 2023): https://blogs.microsoft.com/blog/2023/03/16/introducing-microsoft-365-copilot-your-copilot-for-work/. Similarly, Google is bringing generative AI to its office products such as Google Docs and Gmail: Johanna Voolich Wright, “A new era for AI and Google Workspace”, Google Workspace Blog (15 March 2023): https://workspace.google.com/blog/product-announcements/generative-ai. 

      (46)     This is part of a broader trend across many professions. Technology is unlocking professional knowledge and expertise, and is therefore destabilising the position of many professions that have typically acted as gatekeepers to such knowledge: see Susskind and Susskind.
       

      (47)     A recent study found that the industry most exposed to text-generative AI was legal services: see Ed Felten, Manav Raj and Robert Seamans, “How will Language Modelers like ChatGPT Affect Occupations and Industries?”: https://arxiv.org/pdf/2303.01157.pdf.

      (48)     Armour, Parnham and Sako at 124.

      (49)     Sundaresh Menon CJ, “The Complexification of Disputes in the Digital Age”, Goff Lecture 2021 (9 November 2021) at paras 16–22.

      (50)     The litigant had asked ChatGPT to give her a list of cases supporting a particular legal proposition, along with a summary of those cases. ChatGPT responded with five completely fictitious cases which, on paper, looked entirely real and convincing. The hallucinations came to light only because the opposing counsel looked up the five citations and highlighted to the judge that the alleged cases did not exist.

      (51)     Further, in the United States, it was recently announced that the White House has exhorted leaders of Microsoft, Google, OpenAI and other AI companies to limit the risks of AI: see David McCabe, “White House Pushes Tech CEOs to Limit Risks of AI”, The New York Times (4 May 2023): https://nytimes.com/2023/05/04/technology/us-ai-research-regulation.html.

      (52)     Martin Coulter and Supantha Mukherjee, “Explainer: What is the European Union AI Act?”, Reuters (22 March 2023): https://www.reuters.com/technology/what-is-european-union-ai-act-2023-03-22/.

      (53)     Gary Marcus and Anka Reuel, “The world needs an international agency for artificial intelligence, say two AI experts”, The Economist (18 April 2023): https://www.economist.com/by-invitation/2023/04/18/the-world-needs-an-international-agency-for-artificial-intelligence-say-two-ai-experts.

      (54)     Sam McKeith, “Chat GPT is putting the future of grad lawyers under the microscope”, LSJ Online (23 March 2023): https://lsj.com.au/articles/chat-gpt-is-putting-the-future-of-grad-lawyers-under-the-microscope/.

      (55)     Sundaresh Menon CJ, “Law Schools: A Time of New Burdens and New Beginnings”, James P White Lecture (30 October 2018) at paras 26–29 and 44; Sundaresh Menon CJ, “Deep Thinking: The Future of the Legal Profession in an Age of Technology”, Gala Dinner Address at the 29th Inter-Pacific Bar Association Annual Meeting and Conference (25 April 2019) (“Deep Thinking”) at paras 16–19.

      (56)     Armour, Parnham and Sako at 125–126; The Law Society of England and Wales, Images of the Future Worlds Facing the Legal Profession 2020-2030 (May 2021) at pp 55–56. See also Richard Susskind and Neville Eisenberg, “Vertically Integrated Legal Service”, The Practice (May/June 2021): https://clp.law.harvard.edu/knowledge-hub/magazine/issues/integration-in-legal-services/vertically-integrated-legal-service/.

      (57)     Armour, Parnham and Sako at 95.

      (58)     BCG-Bucerius Report, pp 10–11.
       

      (59)     Mari Sako and Richard Parnham, Technology and Innovation in Legal Services: Final Report for the Solicitors Regulation Authority (University of Oxford, 2021), pp 76–77; Steve Lohr, “A.I. is Coming for Lawyers, Again”, The New York Times (10 April 2023): https://www.nytimes.com/2023/04/10/technology/ai-is-coming-for-lawyers-again.html.

      (60)     Deep Thinking at para 19.

      (61)     Simon Taylor, “French Data Analytics Law Won’t Stop Analytics”, Law.com (7 June 2019): https://www.law.com/international-edition/2019/06/07/french-data-analytics-law-wont-stop-analytics/.

      (62)     One aspect of this is how we should conceptualise AI systems in the law: see Jerrold Soh, “Legal Dispositionism and artificially-intelligent attributions” (2023) Legal Studies 1.

      (63)     Susskind, Tomorrow’s Lawyers, ch 18.

      (64)     Dyane O’Leary, “‘Smart’ Lawyering: Integrating Technology Competence into the Legal Practice Curriculum” (2021) 19(2) The University of New Hampshire Law Review 197 at 227, 237–240, 243–247, 254–255 and 257–262.

      (65)     Václav Janeček, Rebecca Williams and Ewart Keep, “Education for the provision of technologically enhanced legal services” (2021) 40 Computer Law & Security Review 1 (“Janeček, Williams and Keep”) at 5–8.

      (66)     Susskind, Tomorrow’s Lawyers, ch 18.

      (67)     For a sampling of some of these issues, see: Pinchas Huberman, “Tort Law, Corrective Justice and the Problem of Autonomous-Machine-Caused Harm” (2021) Canadian Journal of Law & Jurisprudence 105; Alvin See, “Blockchain In Land Administration? Overlooked Details In Translating Theory Into Practice” in AI, Data and Private Law (Gary Chan and Yip Man (eds)) (Hart Publishing, 2021); Dean Armstrong KC and Marc Samuels, Cryptocurrency in Matrimonial Finance (Bloomsbury, 2022); Vincent Ooi, “Report on the Challenges which Digital Assets Pose for Tax Systems with a Special Focus on Developing Countries” (Report prepared for The United Nations Committee of Experts on International Cooperation in Tax Matters (26th Session)) (7 March 2023).

      (68)     Janeček, Williams and Keep at 8–9; Daniel Goldsworthy, “The Future of Legal Education in the 21st Century” (2020) 41(1) Adelaide Law Review 243 at 262–263.

      (69)     See https://scis.smu.edu.sg/bsc-computing-law.

      (70)     Andrew Wong, “SMU law students can now opt for specialisation track”, The Straits Times (16 March 2023): https://www.straitstimes.com/singapore/smu-law-students-can-now-opt-for-specialisation-track. The other tracks are in corporate transactions and dispute resolution.

      (71)     Sundaresh Menon CJ, “A Profession of Learners”, Mass Call Address 2019 (27 August 2019), paras 9–15, citing Samuel Arbesman, The Half-Life of Facts: Why Everything We Know Has an Expiration Date (Penguin, 2013) (“Arbesman”).

      (72)     Arbesman, ch 2 and 4.

      (73)     In Florida, lawyers must now complete three hours of technology-specific CLE every three years; and in North Carolina, one hour of such training is required each year: Bob Ambrogi, “Another State Moves Closer to Mandating Tech CLE, But Limited to Cybersecurity”, LawSites (2 July 2022): https://lawnext.com/2020/07/another-state-moves-closer-to-mandating-tech-cle-but-limited-to-cybersecurity.html. From 2022, New York lawyers must attend at least one hour of cybersecurity, privacy and data protection CLE every two years: Debra Cassens Weiss, “New York is first state to require CLE course in cybersecurity”, ABA Journal (8 August 2022): https://abajournal.com/news/article/new-york-is-first-state-to-require-cle-courses-in-cybersecurity.

      (74)    World Justice Project, “Measuring the Justice Gap” (2019), pp 13–14: https://worldjusticeproject.org/our-work/research-and-data/access-justice/measuring-justice-gap.

      (75)     Sundaresh Menon CJ, “The role of the judiciary in a changing world”, 1st Annual Lecture in the Supreme Court of India Day Lecture Series (4 February 2023) (“The role of the judiciary”), para 38.

      (76)     Richard Susskind, Online Courts and the Future of Justice (Oxford University Press, 2019) (“Susskind, Online Courts”), pp 61, 116–118 and 128.

      (77)     The role of the judiciary, para 38.

      (78)     In the first one and a half years since its launch, MACO recorded more than 13,000 simulations: see SG Courts, Annual Report 2021, p 28: https://judiciary.gov.sg/docs/default-source/publication-docs/sg_courts_annual_report_2021.pdf.

      (79)     See paragraph 17 above.

      (80)     SG Courts, Divorce eService: https://www.judiciary.gov.sg/services/e-platforms/divorce-eservice.

      (81)     A recent probate application was granted just over a week after the death of the testator.

      (82)     Sundaresh Menon CJ, “Sentencing Discretion; The Past, Present and Future”, Keynote Address at the Sentencing Conference 2022 (31 October 2022) at paras 42–43.

      (83)     Sara M Smyth, “Note: Can We Trust Artificial Intelligence in Criminal Law Enforcement” (2019) 17 Canadian Journal of Law and Technology 99 (“Smyth”) at 105.

      (84)     Mahyuddin Daud, “Artificial Intelligence in the Malaysian Legal System: Issues, Challenges and Way Forward” (2022) 39(1) INSAF – The Journal of the Malaysian Bar 1 at 9–12.

      (85)     Significantly, one study found that COMPAS was almost twice as likely to incorrectly flag African-Americans as being at risk of reoffending than it was to make the same mistake for white offenders: see Smyth at 108.

      (86)     The Australasian Institute of Judicial Administration, “AI Decision-Making and the Courts: A guide for Judges, Tribunal Members and Courts Administrators” (“AIJA Guide”) at 34–35; Adrian Zuckerman, “Artificial intelligence – implications for the legal profession, adversarial process and rule of law” (2020) 136 LQR 426 (“Zuckerman”) at 437.

      (87)     These three points correspond to what has been described as the “three layers of opacity” of AI systems: (a) an intentional layer (imposed by proprietors of AI software); (b) an illiterate layer (due to the difficulty of understanding the technical dimensions of AI systems); and (c) an intrinsic layer (because some algorithms are a “black box”): see Jesse Beatson, “AI-Supported Adjudicators: Should Artificial Intelligence Have a Role in Tribunal Adjudication” (2018) 31 Canadian Journal of Administrative Law & Practice 307 at 329; AIJA Guide at 30.

      (88)     Opacity makes it difficult to assess the accuracy of the AI tool: see Zuckerman at 436. Further, it can cause procedural unfairness because it can leave the offender unable to challenge the output of the AI tool meaningfully: see AIJA Guide at 37. This objection was raised in the Wisconsin Supreme Court case of State v Loomis 371 Wis 2d 235 (Wis, 2016) (“Loomis”). The offender argued that the lower court’s use of the COMPAS tool breached his due process rights, because the proprietary nature of the COMPAS tool prevented him from assessing its accuracy: see Loomis at [46]. The Court rejected the argument, explaining that although the lower court had mentioned the COMPAS assessment of the offender, it had not treated that assessment as determinative of the offender’s sentence: see Loomis at [109].

      (89)     Zuckerman at 436; AIJA Guide at 32.

      (90)     Zuckerman at 438–439 and 449–451. See also Gary Low, “Emphatic Plea for the Empathic Judge” (2018) 30 SAcLJ 97.
       

      (91)     In New Zealand, a legislated formula has been implemented for child support payments. Child support payments are calculated using certain parameters, and are legally binding until and unless they are varied by the New Zealand Inland Revenue or Family Court: see New Zealand Ministry of Justice, “Paternity & Child Support”: https://justice.govt.nz/family/paternity-and-child-support/child-support/.

      (92)     Professor Susskind has also noted that open justice (which calls for transparency) can pull in different directions from distributive justice (which calls for expanding access to justice for all): see Susskind, Online Courts, pp 86 and 196–197.

      (93)     Jens Frankenreiter and Julian Nyarko, “Natural Language Processing in Legal Tech” in Legal Tech and the Future of Civil Justice (David Freeman Engstrom ed) (Cambridge University Press, 2023) at p 90.

      (94)     https://techlawfest.com/#.
