
Precedent (Australian Lawyers Alliance)



Sourdin, Tania; Li, Bin; Hinds, Tom --- "Humans and justice machines: Emergent legal technologies and justice apps" [2020] PrecedentAULA 6; (2020) 156 Precedent 20


HUMANS AND JUSTICE MACHINES

EMERGENT LEGAL TECHNOLOGIES AND JUSTICE APPS

By Professor Tania Sourdin, Dr Bin Li and Tom Hinds

Changing and emerging legal technologies have had, and will continue to have, a significant impact on the evolution of the justice system by assisting to inform, support and advise participants (supportive technologies), by replacing activities and functions that were previously carried out by humans (replacement technologies), and by significantly changing processes (disruptive technologies).[1] Many of the technologies being developed have overlapping purposes and functions.

While there are already a range of applications or ‘apps’ that support better access to justice and enable participants to be better informed and engaged in the justice sector, there are clear risks that arise where human oversight of processes and decision-making is removed.[2] The use of automated frameworks for decision-making in the justice sector (beyond administrative decision-making) is of particular concern.

Some newer technologies, such as apps, facilitate the ‘quick’ and ‘cheap’ resolution of disputes, and increase access to legal assistance and resources by vulnerable social groups. Legal apps may be designed for use by lawyers or the general public.[3] Apps targeting lawyers often promote more efficient legal service delivery and can help to streamline legal research.[4] Apps targeting the general public – sometimes termed ‘DTP’ (direct-to-public) apps[5] – can make legal services easier to access and allow users to engage with self-help processes,[6] help users to identify legal issues, guide users through legal choices, assist with the drafting and filing of legal documents, and provide referrals to legal service providers.[7] More sophisticated legal ‘chatbot’ or ‘robolawyer’ apps can offer recommendations or solutions based on conditional and causal decision logic trees and, in some cases, more advanced artificial intelligence (AI) techniques.[8] Other apps, such as Adieu, can provide a platform for referral, enabling people to access legal and other services in circumstances where it is unlikely that these services would otherwise be accessed.[9] Such approaches raise significant issues relating to the unbundling of legal services, as well as to lawyer and client innovation readiness, which are explored further below.
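At their simplest, the conditional decision logic trees that underpin these chatbots are branching question-and-answer structures: each node poses a question, each answer selects a branch, and each leaf holds generic guidance or a referral. The following Python sketch is purely illustrative – the questions, answers and outcomes are invented for this example, are not drawn from any real app, and do not constitute legal advice:

```python
# A node is either a (question, {answer: subtree}) pair or a leaf string
# containing generic guidance or a referral. All content is hypothetical.
TREE = (
    "Is your dispute about a parking or traffic fine?",
    {
        "yes": (
            "Was the signage at the location unclear or missing?",
            {
                "yes": "You may have grounds to challenge the fine.",
                "no": "Paying promptly may reduce the penalty; seek advice if unsure.",
            },
        ),
        "no": "This tool cannot assist; here is a referral to a legal service provider.",
    },
)


def traverse(node, answers):
    """Walk the tree, consuming one answer per question; return the leaf reached."""
    for answer in answers:
        if isinstance(node, str):  # already at a leaf; ignore surplus answers
            break
        _question, branches = node
        node = branches[answer]
    return node
```

A real app would layer input validation, free-text interpretation and (in more advanced tools) machine-learned classification on top of such a skeleton, but the core ‘decision logic tree’ remains this deterministic branching structure.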

TECHNOLOGY AND JUSTICE

The meaning of ‘justice’ is notoriously elusive.[10] Justice in court-based decision-making and alternative dispute resolution (ADR) may encompass the notion that there is ‘a set of principles for assigning rights and duties which determine the benefits and burdens of social cooperation’.[11] Other notions of justice are linked to perceptions of fairness, either in the quality of the outcome or the process through which an outcome is delivered.[12] Essentially, though, there are different philosophical meanings given to ‘justice’.[13] Litigation supporters may argue that justice can only take place within the courts, where a judge is able to apply the rule of law. Those who participate in ADR may adopt a ‘broader’ view of justice: that justice can be seen to exist ‘in the relationships that exist between people and in their ethical values’.[14] How technologies support justice must be considered in the context of these differing meanings of ‘justice’, and, to some extent, the context within which justice is sought.

For example, ‘supportive technologies’ such as phone and web-based justice apps have a pre-litigation and educative placement in the system and may support justice in the context of education about key principles and values. In response to budgetary drivers as well as consumer appetite,[15] there has been significant growth in these types of apps over the past five years. Some are linked to court or tribunal systems, such as eBRAM in Hong Kong[16] and Guided Resolution (linked to the NSW Civil and Administrative Tribunal).[17] While some justice apps have been designed for lawyers, such as legal research assistant Ailira,[18] those targeting consumers use legal ‘chatbot’ or ‘robolawyer’ designs and offer legal advice based on causal decision logic trees and, in some cases, more sophisticated AI techniques.[19] For example, DoNotPay is an AI chatbot app[20] which, as of June 2016, had helped more than 250,000 people to challenge traffic and parking tickets across the UK and US, with a 40 per cent success rate.[21] Many of these technologies promote the ‘just’ resolution of disputes by enabling a ‘weaker’ or disadvantaged party to access appropriate legal information and advice, addressing to some extent the power imbalances that often exist in disputes.[22]

‘Replacement technologies’ such as consumer online dispute resolution (ODR) can also benefit disputing parties and support justice in terms of procedural or substantive justice. ODR can include facilitative processes such as online mediation, advisory processes such as online case appraisal, and determinative processes such as online arbitration or adjudication. In 2016, British Columbia became the first jurisdiction to fully integrate ODR into a formal tribunal system with the Civil Resolution Tribunal (CRT). The CRT provides tailored legal information, tools and resources to help parties resolve their disputes.[23] If the dispute is not resolved, a facilitator is assigned, and if the parties still cannot reach agreement, the dispute proceeds to adjudication by a tribunal member. As of July 2019, the CRT reported 10,461 completed disputes, with only 1,717 of these disputes resorting to adjudication.[24] ODR can save travel time and avoid disbursements, as well as contribute to faster resolution of disputes compared with traditional litigation processes and traditional forms of ADR.[25] A further benefit of ODR is that it can provide access to justice in relatively sparsely populated but vast countries such as Australia, where face-to-face interactions are difficult and where ‘postcode’ justice[26] – essentially, the geographical exclusion of people as a result of their distance from justice services – is an issue. Given that disputes can include international, national and local interactions, ODR has the potential to exert significant influence on the justice system.[27]

Accordingly, both ‘supportive’ and ‘replacement’ technologies can improve access to justice and, to some extent, empower parties to resolve disputes.

In contrast, the extent to which ‘disruptive’ technologies are able to incorporate justice values has been questioned. The use of AI technologies to determine liability issues in civil cases and penalties in criminal cases, including sentences of imprisonment, has been problematic. Issues relating to the transparency of decision-making, algorithmic bias and enforceability have arisen. Specifically, as outlined by Cruz, implicit biases in AI formulas can skew results in a way that negatively impacts minority individuals.[28] This is because the designers of AI programs typically come from very similar backgrounds: they are usually highly educated, cisgender men – most of whom are Caucasian or Asian – and their preferences do not necessarily reflect the beliefs, experiences, and preferences of marginalised communities.[29]

An example of this bias has been seen in the US, where AI technologies are used to assist judicial officers with sentencing.[30] One such AI tool, the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), was used in the sentencing of an offender, and examined, in State v Loomis.[31] The offender had pleaded guilty to two charges relating to a drive-by shooting. COMPAS recommended the maximum sentence, which the offender received. Later testing of the tool established that black defendants were twice as likely as white defendants to be labelled high risk, and that white defendants who had been categorised as lower risk were more likely to go on to reoffend.[32] In the Loomis case, COMPAS’s software was proprietary, which prevented its examination. Such algorithmic biases in the delivery of ‘justice’ have led to calls for greater scrutiny and transparency of the designers and developers of these technologies.[33]

The impact of AI on the justice system is significant as it has the capacity to be blended with existing administrative and judicial decision-making processes.[34] There have been calls to keep human decision-makers ‘in the loop’, and for careful scrutiny of automation that applies strict criteria rather than the exercise of discretion or a value-based judgment.[35] Professor Zeleznikow has warned against ODR being fully automated, and recommended that such systems should aim to support decision-making rather than usurp this function.[36] Zalnieriute, Moses and Williams have also noted the risk that automation can compromise individual due process rights by undermining the ability of a party to challenge a decision affecting them.[37] It is important, therefore, to ensure that automated processes do not prevent parties from accessing or assessing the information used to make the decision. This has also been emphasised by Sourdin, Li and Burke, who refer to the risk that the processes used to reach an outcome, particularly by more ‘disruptive’ technologies involving developed AI, may be less visible to the parties. The authors note the transparency and natural justice issues that can arise as a result of this.[38]

The importance of an ethical approach to legal tech

Ethical issues have been the subject of many recent discussions about the use of technology in legal contexts.[39] As with the definition of ‘justice’, legal minds may differ on the meaning of ‘ethical’. It is, however, generally accepted that it pertains to right and wrong conduct.[40] In particular, ethics can be regarded as personal or professional ‘values’ in different contexts. In respect of an ethical approach to technological innovations in the justice system, it can be argued that innovations should align with the objectives of justice processes – such as the ‘just’ resolution of legal disputes. Therefore, both the placement of technological innovations in the justice sector and the types of innovations will have an impact on the ethical considerations relating to achieving the objectives of fairness and justice.[41] For example, the Law Council of Australia has emphasised, in a recent submission, the need to pay particular attention to the values of lawfulness, fairness, rationality and intelligibility where AI is used in administrative decision-making.[42]

Legal technologies will inevitably generate ethical discussions about issues such as:

• How is data management, including collection, access, privacy and confidentiality, affected where legal technologies are used?

• How can ‘justice machines’ and platforms be evaluated by reference to an ethical framework?

• What impact does the digital divide have on the delivery of ‘justice’?

• Are lawyers, judicial officers and policymakers equipped and ready to participate in the inevitable ethical reshaping of the justice system that will result from technological change? This is particularly relevant given that digital innovation, AI and other technological changes are currently largely corporate driven.[43] Arguably, corporate drivers in this sector may have little regard to societal good.[44]

• Is the range of ethical issues that arises in relation to legal technological innovation adequately reflected in the current legislative and regulatory frameworks?[45]

Some of these points are expanded on below, particularly in the context of the attainment of justice.

There are issues arising out of the automation and technological disruption of executive decision-making. An example of this is the automation of welfare decisions through the Australian Centrelink debt recovery scheme (robodebt scheme). In a recent landmark ruling on the robodebt scheme, the Federal Court of Australia confirmed that an automated administrative decision was unlawful.[46] Another example is in the immigration context. While the Australian government has automated decisions for certain types of visas,[47] Hungary, Greece and Latvia are testing an immigration system that is even more technology-based, with a virtual border guard programmed with ‘deception detection technology’ that scrutinises the gestures of interviewees to determine if they are lying.[48] However, this seemingly sophisticated technology to assist with executive decision-making has been challenged by some scientists as ‘pseudoscience’.[49] Although neither example arises in a judicial context, both illustrate the questions surrounding the reliability and trustworthiness of automation technology in making decisions on behalf of human professionals. In legal settings, automated decisions may be vested with legal authority through legislation;[50] however, issues around the design and effectiveness of the technology remain, as in the administrative context.

Further, because both participation and transparency of decision-making may be reduced, perceptions relating to the impact of AI on procedural justice as well as substantive justice may be adversely affected. This is most relevant where evaluative and judicial adjudication processes are involved, and where the interplay between ethical considerations and the attainment of justice is most in focus.[51] For example, it has been argued that the role of lawyers as ‘justice’ providers would be undermined by the automation of decisions. Remus and Levy note that automated options are not appropriate substitutes for professional family lawyers, particularly in the case of vulnerable clients and children.[52] According to Bell, family law involves emotional work on the part of the lawyer because clients usually seek, and require, more than ‘pure’ or mechanistic legal advice.[53] That is, family law is often seen as necessitating skills that are not strictly technical or legal, but rather fall into the category of ‘life skills’ acquired through experience rather than formal training.[54] Kulp has also noted that parties are often interested not just in the dispute at a ‘technical’ level, but also at a ‘personal’ level.[55] Kulp argues that if a party having their ‘day in court’ is a substantial and desired outcome, then a process that involves little or no face-to-face contact, as in most ODR, might be considered inadequate.[56]

Despite these views, a study currently being undertaken by the authors of this article into the Adieu platform and the Lumi chatbot has demonstrated that for many, an effective justice platform, a well-equipped chatbot and a sound automated referral system can support people involved in family disputes. Although the evaluation of these services is ongoing, early feedback from the more than 500 people who have used the system suggests that these processes can support resolution through the provision of well-targeted referral (where required), tailored information support and sophisticated chatbot interactions.

Another ethical consideration in developing legal technologies is the digital divide and how this may affect the delivery of ‘justice’ – which should be one of the core purposes of technological innovation in law. In Australia, it has been assumed that people will be able to access ODR options via the internet,[57] justice apps and other web or mobile-based legal technologies. This assumption, however, may not be correct for many people. People on low incomes, elderly people, people with disabilities, Indigenous people, and those from non-English speaking backgrounds can face particular difficulties accessing technology,[58] and the question arises as to whether these legal technologies further deprive vulnerable people of the opportunity to access information and support. In this regard, however, the authors note that some assumptions made about the digital divide may not be correct. For example, in the current Adieu research, more than 50 per cent of the divorce app’s users had been in a relationship lasting more than 20 years. This preliminary finding suggests that age may no longer be as relevant a factor in the context of the digital divide.

The production and improvement of justice apps is a significant recent development. This is partly because apps may support those who are comfortable with smartphones and mobile devices but intimidated by, or unable to access, computers.[59] Mobile phones are increasingly the dominant means by which people access the internet. In terms of internet connectivity, the Australian Bureau of Statistics (ABS) has noted that in 2016–17, 87 per cent of persons were internet users, with mobile or smartphones used by 91 per cent of connected households.[60] App usage is also growing: in 2017, there were almost 200 billion mobile app downloads, an increase of 32 per cent from the previous year.[61] While app usage has increased as a result of the influx of new apps, more surprising is the way in which apps have changed the technological appetite of consumers.[62] Consumers are spending more time on apps than on any other media platform. A 2017 survey of American consumers found that 50 per cent of all digital media usage time was spent on apps, equating to 2.3 hours a day.[63]

Given these high levels of smartphone saturation and the uptake of handheld access to the internet among consumers, justice apps hold the promise of being able to improve access to justice for the ‘missing middle’[64] in civil law disputes. However, while smartphone-enabled internet access is rising, the potential of justice apps to improve access for disadvantaged people will depend in part on closing the digital divide. The last time the ABS asked households why they were not connected to the internet, the two most significant responses were expense and a lack of knowledge about how to use the internet.[65]

CONCLUSION

Given the complicated nature of decision-making and competing considerations that must be contemplated, ethical technological change appears easiest at the lower levels of ‘supportive’ and ‘replacement’ technologies; though the issues of digital literacy and cost of technology require consideration if an aim of the legal sector is to improve access to justice for disadvantaged people. The more ‘disruptive’ technologies, such as AI developments, may raise quite different issues in terms of ‘justice’. As justice machines become more capable, questions surrounding how and why decisions are made become more complex. While policymakers around the world are recognising the need for ethical frameworks in the expansion of technology, foundational to such an expansion is the readiness of policymakers to adapt, implement and regulate 21st century machine-administered justice. This is particularly the case as technological change shapes justice in two key and interrelated ways: access to systems of justice and the dispensation of justice within those systems.

Professor Tania Sourdin is Professor of Law and Dean of the University of Newcastle Law School, Australia. She thanks Jacqueline Meredith, a Senior Researcher at the University of Newcastle, for assistance with this article. Parts of this article are drawn from T Sourdin, Alternative Dispute Resolution, Thomson Reuters, 6th ed, 2020 (forthcoming); T Sourdin, B Li and T Burke, ‘Just, quick and cheap? Civil dispute resolution and technology’, Macquarie Law Journal, Vol. 19, 2019, 17; T Sourdin, ‘Judge v robot? Artificial intelligence and judicial decision-making’, University of New South Wales Law Journal, Vol. 41(4), 2018, 1114; and T Sourdin, ‘Justice and technological innovation’, Journal of Judicial Administration, Vol. 25, 2015, 96. EMAIL Tania.Sourdin@Newcastle.edu.au.

Dr Bin Li is a lecturer in law at the University of Newcastle Law School, Australia. EMAIL bin.li@newcastle.edu.au.

Tom Hinds is a researcher and penultimate LLB (Hons) student at the University of Newcastle Law School, Australia.


[1] For a detailed discussion, see T Sourdin, ‘Justice in the age of technology: “The rise of machines is upon us”’, Precedent, issue 139, 2017, 4–9.

[2] See P Cashman and E Ginnivan, ‘Digital justice: Online resolution of minor civil disputes and the use of digital technology in complex litigation and class actions’, Macquarie Law Journal, Vol. 19, 2019, 39–79.

[3] J McGill, S Bouclin and A Salyzyn, ‘Mobile and web-based legal apps: Opportunities, risks and information gaps’, Canadian Journal of Law and Technology, Vol. 15, 2017, 229 at 237.

[4] Ibid, 238.

[5] T Scassa et al, ‘Developing privacy best practices for direct-to-public legal apps: Observations and lessons learned’, Canadian Journal of Law and Technology, Vol. 18(1), 2020 (forthcoming).

[6] McGill et al, above note 3, 239–40.

[7] See generally S Cruz, ‘Coding for cultural competency: Expanding access to justice with technology’, Tennessee Law Review, Vol. 86, 2019, 347 at 361.

[8] J Bennett et al, ‘Current state of automated legal advice tools’ (Discussion Paper No. 1, The University of Melbourne, April 2018) 26, <https://networkedsociety.unimelb.edu.au/__data/assets/pdf_file/0020/2761013/2018-NSI-CurrentStateofALAT.pdf>. See also ibid, 364.

[9] Professor Sourdin is currently reviewing the Adieu app and the Lumi chatbot – see <https://www.adieu.ai/lumi/>.

[10] M Kirby, ‘Attaining universal justice: Realities beyond dreams’, Dictum – Victoria Law School Journal, Vol. 1, 2011, 7, <http://classic.austlii.edu.au/au/journals/DICTUMVicLawSJl/2011/3.html>.

[11] J Rawls, A Theory of Justice, Harvard University Press, 2005 (reprint), 5.

[12] T Sourdin, B Li and T Burke, ‘Just, quick and cheap? Civil dispute resolution and technology’, Macquarie Law Journal, Vol. 19, 2019, 17 at 23.

[13] T Sourdin, ‘The role of the courts in the new justice system’, Yearbook on Arbitration and Mediation, Vol. 7, 95 at 98–9, <https://elibrary.law.psu.edu/arbitrationlawreview/vol7/iss1/11/>.

[14] Ibid.

[15] Deloitte, Behaviour Unlimited Mobile Consumer Survey 2018: The Australian Cut (Report, 2018) 6, <https://www2.deloitte.com/au/mobile-consumer-survey>; Comscore, The 2017 US Mobile App Report (Report, 2017) 7, <https://www.comscore.com/Insights/Presentations-and-Whitepapers/2017/The-2017-US-Mobile-App-Report>; Comscore, The 2015 US Mobile App Report (Report, 2015) 6, <https://www.comscore.com/layout/set/popup/Request/Presentations/2015/The-2015-US-Mobile-App-Report>.

[16] eBRAM, eBRAM (online) <http://ebram.org/>.

[17] Guided Resolution, Guided Resolution (online) <https://www.guidedresolution.com/company/news>.

[18] Ailira, Ailira (online) <https://www.ailira.com/>.

[19] Bennett et al, above note 8, 26.

[20] Donotpay, DoNotPay (2019) <https://donotpay.com/>.

[21] Bennett et al, above note 8, 38.

[22] Sourdin et al, above note 12, 27.

[23] Cashman et al, above note 2, 44.

[24] Civil Resolution Tribunal, CRT Statistics Snapshot – July 2019 (2019) <https://civilresolutionbc.ca/crt-statistics-snapshot-july-2019/>.

[25] Sourdin et al, above note 12, 26.

[26] See generally R Coverdale, ‘Postcode justice: Rural and regional disadvantage in the administration of the law’, Deakin Law Review, Vol. 16, 2011, 155–87, <https://ojs.deakin.edu.au/index.php/dlr/article/view/98/98>.

[27] Sourdin et al, above note 12, 36.

[28] Cruz, above note 7, 369–71.

[29] Ibid.

[30] Electronic Privacy Information Center, Algorithms in the criminal justice system, <https://epic.org/algorithmic-transparency/crim-justice/>.

[31] State v Loomis, 371 Wis 2d 235 (2016).

[32] J Angwin et al, ‘Machine bias’, ProPublica (online), 23 May 2016, <https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing>.

[33] L Moses and A Collyer, ‘When and how should we invite artificial intelligence tools to assist with the administration of law? A note from America’, Australian Law Journal, Vol. 93, 2019, 176 at 180–1.

[34] M Perry, ‘iDecide: Administrative decision-making in the digital world’, Australian Law Journal, Vol. 91, 2017, 29 at 33.

[35] Ibid, 33–4.

[36] J Zeleznikow, ‘Can artificial intelligence and online dispute resolution enhance efficiency and effectiveness in courts’, International Journal for Court Administration, Vol. 8(2), 2017, 30 at 39–41.

[37] M Zalnieriute, L Bennett Moses and G Williams, ‘The rule of law and automation of government decision‐making’, Modern Law Review, Vol. 82(3), 2019, 425 at 449.

[38] Sourdin et al, above note 12, 23–4.

[39] For example, D Simshaw, ‘Ethical issues in robo-lawyering: The need for guidance on developing and using artificial intelligence in the practice of law’, Hastings Law Journal, 2018, <https://ssrn.com/abstract=3308168>; see also M Bailes, President-elect, Law Council of Australia, ‘The law and legal technology – Our changing work practices’ (Speech, Australian Young Lawyers’ Conference, Sydney, 20 October 2017) <https://www.lawcouncil.asn.au/docs/41755a8c-dcbd-e711-93fb-005056be13b5/The%20Law%20and%20Legal%20Technology%20%E2%80%93%20Our%20Changing%20Work%20Practices.pdf>.

[40] See J McKenzie, Legal Services Commissioner NSW, ‘Legal ethics – What are they today?’ (Discussion paper, Holding Redlich, 16 February 2017) <http://www.olsc.nsw.gov.au/Documents/Legal%20Ethics%20What%20are%20they%20today%20Discusson%20Paper%20160217.pdf>.

[41] Sourdin et al, above note 12, 25.

[42] Law Council of Australia, Submission to Department of Industry, Innovation and Science, Artificial Intelligence: Australia’s Ethics Framework, 28 June 2019, 33.

[43] See KPMG, KPMG 2019 Enterprise AI Survey (online) <https://advisory.kpmg.us/content/dam/advisory/en/pdfs/2019/8-ai-trends-transforming-the-enterprise.pdf>; Carnegie Endowment for International Peace, AI Global Surveillance (AIGS) Index (online) <https://carnegieendowment.org/2019/09/17/global-expansion-of-ai-surveillance-pub-79847>.

[44] C Cath et al, ‘Artificial intelligence and the “good society”: The US, EU, and UK approach’, Science and Engineering Ethics, Vol. 24(2), 2017, 505.

[45] D Bindman, ‘Lawtech sector given public funding boost’, Legal Futures (online), 4 November 2019, <https://www.legalfutures.co.uk/latest-news/lawtech-sector-given-public-funding-boost>.

[46] See <https://www.comcourts.gov.au/file/Federal/P/VID611/2019/3859485/event/30114114/document/1513665>.

[47] P Papadopoulos, 'Digital transformation and visa decisions: An insight into the promise and pitfalls' (Speech, 2018 AIAL National Administrative Law Conference, 28 September 2018) 4.

[48] M Perry ‘iDecide: Digital pathways to decision’ (Speech, CPD Immigration Law Conference, Canberra, 21–23 March 2019) <https://www.fedcourt.gov.au/digital-law-library/judges-speeches/justice-perry/perry-j-20190321#_ftnref11>, citing AlgorithmWatch, Automating society: Taking stock of automated decision making in the EU (Report, January 2019) 37–8.

[49] See D Boffey, ‘EU border “lie detector” system criticised as pseudoscience’, The Guardian (online), 2 November 2018, <https://www.theguardian.com/world/2018/nov/02/eu-border-lie-detection-system-criticised-as-pseudoscience>.

[50] See T Sourdin, ‘Judge v robot? Artificial intelligence and judicial decision-making’, University of New South Wales Law Journal, Vol. 41(4), 2018, 1126–7, citing the Therapeutic Goods Act 1989 (Cth), s7C(2).

[51] Sourdin et al, above note 12, 25.

[52] D Remus and F Levy, ‘Can robots be lawyers: Computers, lawyers, and the practice of law’, Georgetown Journal of Legal Ethics, Vol. 30(3), 2017, 501.

[53] F Bell, ‘Family law, access to justice, and automation’, Macquarie Law Journal, Vol. 19, 2019, 103 at 131–2.

[54] Ibid, 109.

[55] HS Kulp, ‘Future justice? Online dispute resolution and access to justice’, Just Court ADR (online blog), 8 August 2011, <http://blog.aboutrsi.org/2011/policy/future-justice-online-dispute-resolution-and-access-to-justice/>.

[56] Ibid.

[57] Bell, above note 53.

[58] L Toohey et al, ‘Meeting the access to civil justice challenge: Digital inclusion, algorithmic justice, and human-centred design’, Macquarie Law Journal, Vol. 19, 2019, 133.

[59] See J Dysart, ‘20 apps to help provide easier access to legal help’, American Bar Association Journal (online), 1 April 2015, <http://www.abajournal.com/magazine/article/20_apps_providing_easier_access_to_legal_help>.

[60] Australian Bureau of Statistics, Household Use of Information Technology, Australia, 2016–17 (Catalogue No 8146.0, 28 March 2018).

[61] K Tao and P Edmunds, ‘Mobile apps and global markets’, Theoretical Economics Letters, Vol. 8, 2018, 1510 at 1511.

[62] Comscore, The 2017 US Mobile App Report (Report, 2017) 7; cf Comscore, The 2015 US Mobile App Report (Report, 2015) 6.

[63] Comscore, The 2017 US Mobile App Report (Report, 2017) 7.

[64] Productivity Commission, Access to justice arrangements (Inquiry Report No. 72, Canberra, 2014) 20, <https://www.pc.gov.au/inquiries/completed/access-justice/report/access-justice-volume1.pdf>.

[65] See J Thomas, CK Wilson and S Park, ‘Australia’s digital divide is not going away’, The Conversation (online), 29 March 2018, <https://theconversation.com/australias-digital-divide-is-not-going-away-91834>.


URL: http://www.austlii.edu.au/au/journals/PrecedentAULA/2020/6.html