Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 - Initial Scrutiny [2024] AUSStaCSBSD 203 (9 October 2024)


Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024[132]

Purpose
The bill seeks to amend the Broadcasting Services Act 1992 to combat misinformation and disinformation via new requirements on digital communications platform providers. To ensure compliance with these new requirements, the bill also seeks to expand the Australian Communications and Media Authority’s regulatory powers, including powers to make rules, set standards, approve codes and impose reporting conditions. The bill also introduces consequential and transitional amendments across the Australian Communications and Media Authority Act 2005 and the Online Safety Act 2021 to insert definitions and references to the provisions created by the bill.
Portfolio
Communications
Introduced
House of Representatives on 12 September 2024
Bill status
Before the House of Representatives

Significant matters in delegated legislation[133]

1.108 This bill seeks to amend the Broadcasting Services Act 1992 to introduce a new Schedule 9 that would impose requirements on certain digital communications platform providers[134] (providers) relating to misinformation and disinformation. These providers would be required to make specified information publicly available and comply with any requirements set out in digital platform rules. These rules would be made by the Australian Communications and Media Authority (ACMA) and would include rules relating to:

• risk management;[135]

• media literacy plans;[136]

• complaints and dispute handling processes.[137]

1.109 A provider who contravenes the digital platform rules would be subject to a civil penalty of up to 5,000 penalty units for a body corporate (currently $1.565 million) or 1,000 penalty units for a non-body corporate (currently $313,000).[138]
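As an aside on the arithmetic, these dollar figures are a straight conversion at the Commonwealth penalty unit value that the digest’s amounts imply (assumed here to be $313 per unit):

\[
5{,}000 \times \$313 = \$1{,}565{,}000 \qquad\qquad 1{,}000 \times \$313 = \$313{,}000
\]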

1.110 As such, the regulation of these matters largely falls to the rules, with very little set out in the bill itself. Where a bill includes significant matters in delegated legislation, the committee expects the explanatory memorandum to the bill to address why it is appropriate to include the relevant matters in delegated legislation and whether there is sufficient guidance on the face of the primary legislation to appropriately limit the matters being left to delegated legislation. A legislative instrument made by the executive is not subject to the full range of parliamentary scrutiny inherent in bringing forward proposed legislation in the form of a bill.

1.111 In this regard, the detailed and thorough explanatory memorandum has set out why, in some instances, it may be appropriate to leave certain matters to delegated legislation. For example, in relation to proposed section 17, which requires the provider to publish certain information that meets the requirements of the rules, the explanatory memorandum explains the necessity of having the flexibility to consider how the system is operating in practice and to respond to the evolving risk landscape.[139] The committee appreciates that the area of digital technology is rapidly changing and in such cases delegated legislation may be more appropriate to respond to this challenge.

1.112 However, it is not clear why all of the detail regarding risk management assessments, media literacy plans and complaint handling processes is to be left to the rules. The bill provides only that the rules ‘may require’ certain broad matters, with few limits on what the rules may provide. The explanatory memorandum does not explain why, for example, matters such as a complaints and dispute handling process for misinformation complaints should be set out entirely in delegated legislation. The committee considers that the ability of persons to complain about the operation of this scheme is integral to the scheme, and as such considers that further detail regarding this should be included on the face of the legislation. As it stands, no such rules are required to be made, meaning that, in theory, there could be no legislative requirement for providers to establish a complaints and dispute handling process, or such processes may be required only in limited circumstances. It is not clear why such a requirement could not be provided for in the bill itself.

1.113 Moreover, the bill provides that applications may be made to the Administrative Review Tribunal for review of decisions of the ACMA made under the rules, so long as those rules provide that the decision is a reviewable decision.[140] The explanatory memorandum explains:

As some, but not all, decisions the ACMA may empower itself to make under those provisions may be appropriate for merits review, subsection 204(4A) would enable the ACMA, in developing rules, to provide for merits review where it is appropriate. For example, it is likely the ACMA would provide for merits review where its decision would affect the interests of a person, but that it may not be necessary to do so where decisions would be of a procedural or preliminary nature, would have no appropriate remedy or would have such limited impact that the costs of review cannot be justified.[141]

1.114 The committee considers that, generally, administrative decisions that will, or are likely to, affect the interests of a person should be subject to independent merits review unless a sound justification is provided. The committee understands that it is not possible at this stage to determine which decisions are appropriate for merits review, as the content of the ACMA’s decision-making power is not yet clear because it will be provided for in the rules. The committee considers this exemplifies the problems with leaving significant matters to be dealt with in delegated legislation. Even so, the committee considers it would be possible for the bill to provide that all decisions made by the ACMA under the rules be subject to merits review, with the option for the ACMA to specifically exclude certain decisions in the rules. This would then require the ACMA to consider each decision and justify each opt-out, including allowing for parliamentary oversight of any decision to exclude merits review.

1.115 The committee therefore seeks the minister’s advice as to:

why it is considered necessary and appropriate to leave to the rules all detail regarding risk management, media literacy plans and complaints;

why there is no requirement to make digital platform rules regarding complaints and dispute handling processes for misinformation complaints;

whether further detail could be included on the face of the primary legislation, noting the importance of parliamentary scrutiny; and

whether the bill could provide that all of the ACMA’s decisions made under the rules are subject to merits review, unless the ACMA specifically excludes merits review in individual cases.

Privacy
Significant matters in delegated legislation[142]

1.116 The bill also provides the ACMA with the power to make rules placing record keeping and reporting requirements on providers in relation to misinformation and disinformation.[143] Proposed section 30 states that the rules may require providers to make and retain records relating to misinformation or disinformation and to measures implemented by providers to respond to it (the ACMA may also require providers to give records to the ACMA if necessary for it to perform its monitoring and compliance functions).[144] The bill provides that before making such rules the ACMA must consider the privacy of end-users, and that the rules must not require providers to make or retain records of the content of ‘private messages’ or of VoIP communications (non-recorded real-time voice communication using the internet). A private message is a message between two end-users, or to a number of end-users that does not exceed the number specified in the rules (or, if no number is specified, 1,000).[145] Failure to comply with the rules would be subject to a civil penalty (up to 5,000 penalty units for a body corporate or 1,000 penalty units for a non-body corporate).[146]

1.117 Enabling rules to be made that specify the collection, use or disclosure of personal information may impact on the right to privacy. As such, the committee expects the explanatory materials accompanying the bill to contain a clear explanation justifying why this is appropriate and what safeguards are in place to protect personal information. In addition, these are significant matters being left to the rules and, as set out above, the committee expects the explanatory memorandum to the bill to address why it is appropriate to include the relevant matters in delegated legislation. In this instance, the explanatory materials accompanying the bill have provided a detailed analysis of the privacy implications of this measure.

1.118 The statement of compatibility states that it is possible that, in making such a rule, the ACMA could effectively require providers to make and retain records that include personal information. It states that the objective behind this, and behind the power to gather information:

is to enable the ACMA to collect data regarding the spread of misinformation and disinformation, so as to enable it to assess the steps being taken by digital communications platform providers to manage the risk of misinformation and disinformation on their platforms [and] are also aimed at enabling the ACMA to publish information about the prevalence and nature of misinformation and disinformation on digital communications platforms, and about the steps being taken by digital communications platform providers to prevent and respond to misinformation and disinformation. This in turn is aimed at empowering end-users to identify misinformation and disinformation on digital communications platforms.[147]

1.119 The explanatory memorandum expands on the safeguards that are available, noting that the ACMA must consider the privacy of end-users before making a rule in relation to records:

This requirement would be particularly important if the ACMA were to make a digital platform rule for the purpose of this clause requiring digital communications platform providers to make and retain records containing personal information, as that term is defined in section 6 of the Privacy Act. This might arise, for example, if the ACMA were to make a digital platform rule requiring a digital communications platform provider to make and retain records of examples of misinformation and disinformation posted by individual end-users that have been removed from the digital communications platform.

When considering the privacy of end-users before making a digital platform rule in relation to records, the ACMA would be expected to consider the extent to which particular records are necessary and reasonable for the purpose of regulating misinformation and disinformation. For example, it is expected that the ACMA would consider the extent to which it may be feasible to use de-identified records to achieve the objectives stated in the legislation, and should ensure that if digital communications platform providers are required to retain records of personal information, these are only required to be retained for the period of time reasonably necessary to achieve those objectives. Any risks to the privacy of end-users would also be minimised by the fact that the rules would not be permitted to require digital communications platform providers to make or retain records of the content of private messages or VoIP communications (subclause 30(3) ...). In addition, the ACMA must comply with the requirements of the Privacy Act when dealing with personal information, including Australian Privacy Principle 11 (about security of personal information).[148]

1.120 The requirement that the ACMA must consider the privacy of end-users before making a rule in relation to records is an important safeguard. However, the committee notes that this would require the ACMA only to ‘consider’ privacy, rather than require the ACMA to make rules that are consistent with the right to privacy. Moreover, it is not clear why the protections the explanatory memorandum states the ACMA would be expected to consider are not set out on the face of the bill itself. It is unclear to the committee why, for example, the length of time for which such records should be retained cannot be included in the bill.

1.121 A further important safeguard is that the rules cannot require providers to make or retain records of the content of private messages. What qualifies as a private message, however, is to be determined by the rules (or, if the rules are silent on this, the number of end-users is 1,000). The rules may make the definition extremely wide (for example, messages sent to 2,000 people may be considered private) or extremely narrow (for example, messages sent between ten people or fewer may be considered private, meaning messages sent to 11 or more people would not be considered private). The explanatory memorandum sets out why the rules should be able to set the number of recipients:

Allowing the maximum number of end-users to whom a private message may be sent to be specified in the digital platform rules, as opposed to in Schedule 9, allows the determination of this number to be informed by information made available to the ACMA pursuant to the operation of other provisions in Schedule 9. It would be expected, for example, that the determination of the maximum number of recipients that may receive a private message – with the result that such messages would not be subject to record keeping and reporting obligations (clause 30), the ACMA’s information gathering powers (clauses 33 and 34) or misinformation codes or standards – may be informed by information on misinformation complaints, made available by digital communications platform providers pursuant to any digital platform rules made under paragraph 25(2)(c) and any additional information regarding misinformation and disinformation on digital communications platforms obtained by the ACMA pursuant to clauses 33 and 34.[149]

1.122 While the committee acknowledges this explanation, it is unclear why the bill cannot set a minimum number of end-users to ensure the rules are not empowered to set an overly narrow number of end-users and therefore undermine this important privacy protection.

1.123 The committee seeks the minister’s advice as to:

why it is considered necessary and appropriate to leave to the rules all details regarding record keeping relating to misinformation or disinformation;

why privacy protections specified in the explanatory memorandum, such as the expectations regarding de-identification and that records should only be retained for as long as is reasonably necessary, are not included in the bill itself; and

why the bill does not contain a minimum number of end-users as to what constitutes a ‘private message’ (noting that if the rules set a low number, important privacy protections would not apply to such messages).

Freedom of expression
Significant matters in delegated legislation [150]

1.124 The bill specifies that providers in the digital platform industry may develop misinformation codes. If the ACMA is satisfied that a body or association represents a particular section of the digital platform industry, the ACMA may request that it develop a misinformation code.[151] The ACMA may make a misinformation standard if such a request is not complied with; the ACMA considers a particular section of the industry is not represented by a body or association; a code is not providing adequate protection; or there are exceptional and urgent circumstances.[152]

1.125 The bill does not set out what must be in such codes or standards. Instead, it provides examples of matters that may be included, depending on which section of the digital platform industry is involved. Examples of what might be in codes or standards include:

• preventing or responding to misinformation or disinformation (including that which constitutes an act of foreign interference);

• preventing advertising that constitutes misinformation or disinformation;

• supporting fact checking;

• giving information to end-users about the source of political or issues-based advertising, improving media literacy of end-users, and allowing end-users to detect and report misinformation or disinformation; and

• policies and procedures for receiving and handling reports and complaints from end-users.[153]

1.126 The ACMA may approve a code developed by industry if the ACMA is satisfied that there has been appropriate consultation, that the code requires participants to implement measures to prevent or respond to misinformation or disinformation, and that the code enables assessment of compliance with those measures.[154] In addition, the ACMA may only approve a code, or make a standard, if the ACMA is satisfied that:

• it is reasonably appropriate and adapted to achieving the purpose of providing adequate protection for the Australian community from serious harm caused or contributed to by misinformation or disinformation; and

• it goes no further than is reasonably necessary to give that protection.[155]

1.127 Once the ACMA has approved a code, or made a standard, the code or standard would be a disallowable legislative instrument.[156]

1.128 If providers did not comply with a code or standard they would be subject to significant civil penalties. For non-compliance with a code, a body corporate provider could face a civil penalty of up to 10,000 penalty units (or $3.13 million) or up to two per cent of its annual turnover, whichever is greater. This increases to 25,000 penalty units (or $7.825 million) or five per cent of annual turnover for non-compliance with a standard.[157]
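To illustrate the ‘whichever is greater’ formulation, consider a hypothetical body corporate provider with an annual turnover of $500 million (a figure assumed purely for illustration, using the same $313 penalty unit value as above). For non-compliance with a code, the maximum penalty would be:

\[
\max(10{,}000 \times \$313,\ 0.02 \times \$500\text{m}) = \max(\$3.13\text{m},\ \$10\text{m}) = \$10\text{m}
\]

and for non-compliance with a standard:

\[
\max(25{,}000 \times \$313,\ 0.05 \times \$500\text{m}) = \max(\$7.825\text{m},\ \$25\text{m}) = \$25\text{m}
\]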

1.129 As such, providers could face substantial penalties if they were not to comply with a code or standard requiring them to prevent or respond to misinformation or disinformation on their platforms. While the bill does not itself empower the ACMA to directly regulate content on the internet, providers are incentivised (by the threat of substantial penalties) to remove content on their platforms that might constitute misinformation or disinformation. This impacts on freedom of expression and, depending on how it is applied in practice, may unduly trespass on individual rights and liberties.

1.130 The statement of compatibility recognises this, noting:

These measures could feasibly incentivise digital communications platform providers to take an overly cautious approach to the regulation of content that could be regarded as misinformation and disinformation – or in other words, they could have a ‘chilling effect.’[158]

1.131 It goes on to say that the measures are aimed at addressing the risk that misinformation and disinformation could cause or contribute to serious harm, and argues that the measures are focused on systems and processes rather than the regulation of actual content, and that safeguards exist in the form of exemptions for certain content and privacy protections.[159]

1.132 The potential impact on freedom of expression relates to how providers will interpret their obligations under relevant codes or standards. Whether freedom of expression is adequately protected will depend on a mixture of how robust the free speech protections in the codes or standards are, and how they are applied in practice.

1.133 The breadth of the definition of what constitutes misinformation or disinformation is particularly relevant to this. Misinformation is defined in the bill to mean dissemination of content to end-users in Australia if:

• the content contains information that is reasonably verifiable as false, misleading or deceptive;

• the provision of content is reasonably likely to cause or contribute to serious harm, which is defined exhaustively and relates to harm to the electoral process; public health; physical injury; imminent damage to critical infrastructure or disruption of emergency services; or imminent harm to the economy, where the harm has significant and far-reaching consequences for the community or severe consequences for an individual; and

• the dissemination is not of content that is parody or satire, professional news content, or content disseminated for any academic, artistic, scientific or religious purpose.[160]

1.134 Disinformation is defined in the same way as misinformation, with the addition that there must be grounds to suspect that the person disseminating the content intends it to deceive another person, or that the dissemination involves ‘inauthentic behaviour’.[161]

1.135 The committee considers it important that an exhaustive definition of ‘serious harm’ is provided for in the bill, with a high threshold, particularly by reference to the need for the harm to have significant and far-reaching consequences for the community or severe consequences for an individual. The bill also sets out that in determining whether content is reasonably likely to cause or contribute to serious harm, regard must be had to a range of factors, including the circumstances in which the content is disseminated; the subject matter of the information; its potential reach and speed of dissemination; and the author and purpose of the dissemination. The bill provides that the minister may, by legislative instrument, determine a matter which may be considered as part of this test. The explanatory memorandum provides a useful justification for including this matter in delegated legislation, noting that ‘it is possible that in light of evolutions in technology, the minister may determine that there is another factor that is so significant that it should be explicitly prescribed as a matter to be considered’.[162]

1.136 The breadth of the exceptions is also relevant when considering the limit on freedom of expression. In this regard, the bill excludes the dissemination of professional news content from what constitutes misinformation or disinformation. The bill defines this as news content produced by a person who publishes it in a range of formats and who is subject to specified Australian editorial standards, or to analogous rules or standards relating to the provision of quality journalism, where the person has editorial independence.[163] The explanatory memorandum states that for rules or standards to be considered analogous:

internal editorial standards should include, at a minimum:

• a mechanism for accepting, adjudicating and notifying complainants of the outcome of complaints about news content, and

• standards relating to the accuracy and impartiality of news content.[164]

1.137 While the definitions of misinformation and disinformation require that the content be provided to one or more end-users in Australia, the person posting the content does not need to be in Australia. As such, it is likely that a significant amount of news content will be produced by persons located overseas, and it is unclear how providers, and particularly individual fact-checkers, would be able to ascertain whether the person who produced the content was subject to appropriate editorial standards. If this is interpreted overly narrowly, there is the potential for news content produced by journalists from countries without established journalistic rules or standards to be blocked, despite the content reporting important news. The explanatory memorandum is silent on how journalistic content from overseas countries will be treated in practice.

1.138 Ultimately, whether these measures will unduly trespass on the right to freedom of expression will depend on the processes by which each industry participant determines what individual content constitutes misinformation or disinformation. The burden of determining whether particular content is reasonably verifiable as false, misleading or deceptive will likely fall on individual fact-checkers. While the explanatory memorandum gives useful examples of some matters that could be considered in determining whether content is reasonably verifiable (including expert opinions, multiple reliable and independent sources, and similar complaints),[165] a significant burden would appear to rest on the fact-checker to be able to assess this. Similarly, fact-checkers would need to assess, in relation to disinformation, whether there are grounds to suspect that a person intends the content to deceive another person. Again, the explanatory memorandum provides useful examples of what this might include, such as where similar complaints have been made, or where content is a doctored image or false content using the logos of trusted sources.[166] But much will depend on how individual fact-checkers apply this in practice. This is why reviewing how these measures are working in practice to protect freedom of expression is vitally important. The bill provides for a review of the scheme every three years, which includes assessing the impact of the scheme on freedom of expression.[167] In the interim, it will be important for industry participants to be transparent about their processes. Yet the detail of this will be set out in the codes and standards, rather than in primary legislation. The committee considers that without knowing the detail of what will be included in these codes or standards, and noting there is no requirement as to what must be included, it is difficult to adequately assess whether this measure may unduly trespass on rights and liberties.

1.139 Finally, the committee notes that the approach of this proposed scheme is to incentivise providers to remove content assessed to be misinformation or disinformation. Substantial penalties apply if inappropriate content is not adequately managed. Yet no penalty applies if providers go too far in limiting freedom of expression. If providers were to take the view that all content relating to a particular contentious topic should be blocked, there is nothing in the legislation that would prevent them from doing so. The bill seeks, to some degree, to address this by requiring the ACMA, when approving codes or making standards, to be satisfied that the code or standard is reasonably appropriate and adapted to achieving the purpose of protecting the community from serious harm and goes no further than is reasonably necessary.[168] The explanatory memorandum explains the basis for these provisions:

this requirement is aimed at ensuring that the power conferred on the ACMA is wholly valid, by making clear on the face of the legislation that the power it confers cannot be exercised in a way that would transgress the constitutional limits imposed by the implied freedom of political communication, which the High Court of Australia has recognised as impliedly protected by the Australian Constitution.[169] Freedom of political communication in this context means people’s ability to communicate ‘information and opinions about matters relevant to the exercise and discharge of governmental powers and functions on their behalf’.[170] ... It means that before determining a standard, the ACMA must carefully consider the way in which each of the measures contained in the standard burden the implied freedom of political communication, and whether in all the circumstances, the burden imposed by the standard overall is reasonable and not excessive.[171]

1.140 The committee considers that requiring the ACMA to consider whether measures in a code or standard would burden the implied freedom of political communication is an important protection. However, the committee notes that the right to freedom of expression is broader than just that of political communication. It applies to other types of speech that may not involve discussion of political matters. Given the only oversight of what private providers do in response to this scheme is through enforcement of these codes and standards, requiring those codes and standards to appropriately balance the right to freedom of expression is essential. As such, it is not clear to the committee why the bill does not explicitly require the ACMA to consider whether the right to freedom of expression is appropriately balanced before approving a code or making a standard.

1.141 The committee considers the explanatory materials accompanying this bill to be of particularly high quality, especially in providing examples of how key aspects of the proposed scheme are likely to apply in practice. The committee notes, however, that the scheme has the potential to have a chilling effect on freedom of expression, as it incentivises providers to remove content that might constitute misinformation or disinformation, while there is no incentive for providers to respect the right to freedom of expression.

1.142 Noting the above comments, the committee seeks the minister’s advice as to:

whether the definition of ‘professional news content’ is overly narrow in requiring that the person producing the content be bound by specific editorial standards, and how this is likely to operate in practice in relation to journalists producing content in countries that may not have analogous standards;

why it is considered necessary and appropriate to leave to codes and standards all processes by which participants in a digital platform industry are to prevent or respond to misinformation or disinformation, including why there is no requirement as to what such a code or standard must contain; and

whether the bill could be amended to require the ACMA to be satisfied that a misinformation code or standard appropriately balances the importance of protecting the community from serious harm with the right to freedom of expression.


[132] This entry can be cited as: Senate Standing Committee for the Scrutiny of Bills, Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024, Scrutiny Digest 13 of 2024; [2024] AUSStaCSBSD 203.

[133] Schedule 1, item 2, proposed Schedule 9, Division 2, Subdivisions B-D. The committee draws senators’ attention to these Subdivisions pursuant to Senate standing order 24(1)(a)(iv) and (v).

[134] Schedule 1, item 2, proposed sections 5 and 7 set out the providers who would be bound by these requirements, as being those who provide a digital communications platform, which is a digital service that is a connective media service; a content aggregation service; an internet search engine service; a media sharing service; or a kind of digital service determined by legislative instrument, but does not include an internet carriage service, SMS service or MMS service.

[135] Schedule 1, item 2, proposed Subdivision B, Division 2, Part 2, Schedule 9.

[136] Schedule 1, item 2, proposed Subdivision C, Division 2, Part 2, Schedule 9.

[137] Schedule 1, item 2, proposed Subdivision D, Division 2, Part 2, Schedule 9.

[138] Schedule 1, item 2, proposed sections 20, 23 and 26 together with Schedule 2, item 20, proposed subsection 205F(5E).

[139] Explanatory memorandum, p. 67.

[140] Schedule 2, item 15.

[141] Explanatory memorandum, pp. 137–138.

[142] Schedule 1, item 2, proposed section 2, definition of ‘private message’ and section 30. The committee draws senators’ attention to these provisions pursuant to Senate standing order 24(1)(a)(i) and (iv).

[143] Schedule 1, item 2, proposed sections 30–32.

[144] Schedule 1, item 2, proposed section 34.

[145] Schedule 1, item 2, proposed section 2, definition of ‘private message’.

[146] Schedule 1, item 2, proposed section 31 together with Schedule 2, item 20, proposed subsection 205F(5E).

[147] Statement of compatibility, p. 16.

[148] Explanatory memorandum, p. 86.

[149] Explanatory memorandum, p. 27.

[150] Schedule 1, item 2, proposed Division 4. The committee draws senators’ attention to this provision pursuant to Senate standing order 24(1)(a)(i) and (iv).

[151] Schedule 1, item 2, proposed section 48.

[152] Schedule 1, item 2, proposed sections 55–59.

[153] Schedule 1, item 2, proposed sections 55–59.

[154] Schedule 1, item 2, proposed section 47.

[155] Schedule 1, item 2, proposed sections 47 and 54.

[156] Schedule 1, item 2, proposed subsections 47(6), 55(2), 56(2), 57(3), 58(3), and 59(2).

[157] Schedule 1, item 2, proposed sections 52 and 62 together with Schedule 2, item 20, proposed subsections 205F(5G) and (5H). Note that a non-body corporate would face up to 2,000 penalty units for non-compliance with a code and up to 5,000 penalty units for non-compliance with a standard.

[158] Statement of compatibility, p. 18.

[159] Statement of compatibility, p. 19.

[160] Schedule 1, item 2, proposed subsection 13(1) and sections 14 and 16.

[161] Schedule 1, item 2, proposed subsection 13(2) and section 15.

[162] Explanatory memorandum, p. 46.

[163] Schedule 1, item 2, proposed subsection 16(2).

[164] Explanatory memorandum, p. 64.

[165] Explanatory memorandum, p. 44.

[166] Explanatory memorandum, p. 45.

[167] Schedule 1, item 2, proposed section 70.

[168] Schedule 1, item 2, proposed subparagraphs 47(1)(d)(ii) and (iv), 50(1)(d)(ii) and (iv), section 54 and subsection 60(2).

[169] Nationwide News Pty Ltd v Wills [1992] HCA 46; (1992) 177 CLR 1; Australian Capital Television Pty Ltd v The Commonwealth (1992) 177 CLR 106.

[170] Nationwide News Pty Ltd v Wills [1992] HCA 46; (1992) 177 CLR 1, 72 (Deane and Toohey JJ).

[171] Explanatory memorandum, pp. 114–115.

