Australian Senate Standing Committee for the Scrutiny of Bills, Scrutiny Digests
Online Safety Amendment (Social Media Minimum Age) Bill 2024[68]
Purpose: This bill seeks to amend the Online Safety Act 2021 to impose a civil penalty obligation on certain providers (once specified by the minister) to prevent Australian children aged under 16 from having accounts with that provider.
Portfolio: Communications
Introduced: House of Representatives on 21 November 2024
Bill status: Before the House of Representatives
1.64 This bill provides that providers of certain kinds of social media platforms must take reasonable steps to prevent children under 16[70] from having accounts. It would do so by applying a civil penalty to these providers if they do not take reasonable steps to prevent children under 16 from having accounts with their social media platforms.[71] The civil penalty would be up to 30,000 penalty units (currently $9.9 million, or $49.5 million for bodies corporate).[72]
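A minimal worked check of these figures, assuming the current Commonwealth penalty unit value of $330 and the five-fold maximum for bodies corporate under the provision cited at footnote [72]:

\[
30{,}000 \times \$330 = \$9{,}900{,}000
\qquad
5 \times \$9{,}900{,}000 = \$49{,}500{,}000
\]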
1.65 At the outset, the committee notes with concern, from a scrutiny perspective, the speed with which this bill is anticipated to pass the Parliament. The bill was introduced in the House of Representatives on 21 November 2024. While senators had an opportunity to hear from submitters in a hearing of the Senate Standing Committee on Environment and Communications on 25 November, the committee notes that the truncated time between introduction of the bill and the hearing may have diminished the ability of senators to scrutinise the bill to the fullest extent.
1.66 The committee notes that the timeline for the passage of the bill also diminishes the ability of this committee to undertake its usual scrutiny process, including to engage in meaningful dialogue with the executive to address any concerns.
1.67 The committee notes that the standing orders of both houses of the Parliament with respect to legislation are designed to provide members of the Parliament with sufficient time to consider and reconsider the proposals contained in bills. The committee is of the view that truncated parliamentary processes by their nature limit parliamentary scrutiny and debate. This is of particular concern in relation to bills that may trespass on personal rights and liberties.
1.68 The committee is also concerned that much of the detail of this legislative scheme has not yet been developed. In part this is due to significant reliance on delegated legislation, but it is also the case that the online platforms are delegated the task of developing the mechanisms and procedures to verify a user’s age. The social media platforms on which the obligation would be imposed are defined as electronic services (accessible in Australia) that have as their sole or significant purpose enabling online social interaction between two or more end-users; allow end-users to post material and link to, or interact with, other end-users; and satisfy ‘such other conditions (if any) as are set out in the legislative rules’. The rules may specify the inclusion or exclusion of any electronic services for the purposes of this definition.[73]
1.69 As such, much of the detail as to which social media platforms will be covered by this scheme may be set out in delegated legislation. The committee's view is that matters that may be significant to the operation of a legislative scheme should be included in primary legislation unless sound justification for the use of delegated legislation is provided. The committee notes that leaving such a broad range of matters to delegated legislation gives the minister a broad power to determine the scope and operation of significant aspects of the bill.
1.70 In this instance, the explanatory memorandum states that the bill ‘casts a wide net’ in terms of which platforms are captured, but provides that ‘flexibility to reduce the scope or further target the definition will be available through legislative rules’.[74] It goes on to state that the use of delegated legislation allows the government ‘to be responsive to changes and evolutions in the social media ecosystem’.[75] It states that in the first instance, the government proposes to make rules to exclude messaging apps, online gaming services and services with the primary purpose of supporting the health and education of end-users.[76] The committee acknowledges that the social media landscape is likely to quickly evolve and the use of delegated legislation in these circumstances may be necessary. However, the committee notes that the question of which social media providers are covered by this scheme is a significant matter that is worthy of proper parliamentary consideration. In this regard, if the government’s intention is that messaging apps, online gaming services and services supporting health and education are not to be captured by this bill, it is not clear to the committee why this cannot be included in the bill currently before the Parliament.
1.71 Further, the committee notes that the mechanisms and processes for age assurance or verification will be determined by the relevant social media platforms. The bill provides that one of the functions of the eSafety Commissioner is to formulate written guidelines for the taking of reasonable steps to prevent children under 16 from having relevant accounts, but that these guidelines are not legislative instruments.[77] The explanatory memorandum notes that these guidelines will not be binding but ‘are intended to give regulated entities practical guidance’ on how they can meet the minimum age obligation, ‘including by setting out what age assurance methodologies could satisfy the obligation’.[78] The committee notes that as these guidelines are not legislative instruments, Parliament would have no oversight of the content of the guidelines. The committee considers parliamentary scrutiny over the operation of this scheme would be improved if these guidelines were to be made binding by way of a disallowable legislative instrument.
1.72 Finally, the committee’s concerns regarding parliamentary oversight are triggered by proposed section 63E. This provision provides that the civil penalty to be imposed on certain social media platforms for failure to take reasonable steps to prevent children under 16 from having accounts will only take effect if the minister specifies, by notifiable instrument, a day for it to take effect. While there is a minimum period before the minister could bring the civil penalty into effect (namely, not before 12 months after the Act commences), it is open-ended as to when the penalty provision may come into effect. The explanatory memorandum explains the reason for this as follows:
This flexibility reflects the novel nature of the Bill, and the inherent uncertainties with taking forward world-leading legislation. It also provides for flexibility for the Commissioner to establish the necessary guidance and enforcement framework with an appropriate lead-time for effective operation of the law. It is the Government’s intention to give effect to the minimum age obligation as soon as practicable, balancing the need to act quickly to minimise risks of harm to young Australians online, with realistic timeframes for regulatory compliance.[79]
1.73 While the committee acknowledges the likely need for greater consultation and time to consider how this legislation is to be implemented, it is concerned that by providing an open-ended ability for the minister to determine when the social media minimum age requirements are to commence, Parliament has no oversight of whether these requirements will ultimately ever commence. While the committee acknowledges the government’s intention to give effect to these obligations as soon as practicable, as a matter of law, there is no requirement that these obligations need ever commence should the executive government choose not to proceed. The committee’s view is that legislation should not give the executive unfettered control over whether or when provisions in an Act passed by the Parliament should come into force.
1.74 The committee notes with concern the anticipated speed at which this bill may pass the Parliament. While the procedure to be followed in the passage of legislation is ultimately a matter for each house of the Parliament, the committee reiterates its consistent scrutiny view that legislation, particularly where it may trespass on personal rights and liberties, should be subject to thorough parliamentary scrutiny.
1.75 The committee is also concerned that the bill leaves key elements of the scheme to delegated legislation, namely which social media providers will, or will not, be subject to the scheme. The committee notes that the mechanisms and processes for the verification of a user’s age are left entirely to providers, and any guidance from the eSafety Commissioner will be non-binding and not subject to parliamentary oversight.
1.76 The committee is particularly concerned that the bill gives the executive unfettered control over whether and when the substantive provision of the bill will come into force and considers proposed section 63E should be amended to provide a specified day by which section 63D must come into force.
1.77 The committee otherwise draws these scrutiny concerns to the attention of senators and leaves this matter to the Senate as a whole.
1.78 The committee also draws this matter to the attention of the Senate Standing Committee for the Scrutiny of Delegated Legislation.
1.79 As set out above, the bill leaves to the providers of certain social media platforms the method by which they are to comply with their obligation to prevent children under 16 from having accounts. As the explanatory memorandum states, while the bill does not prescribe the reasonable steps a platform must take, ‘it is expected that at a minimum, the obligation will require platforms to implement some form of age assurance’. The explanatory memorandum then provides:
Section 63D would not preclude a platform from contracting with a third party to undertake age assurance on its behalf. Similarly, it would be open to a platform to enter into an agreement with app distribution services or device manufacturers, to allow for user information to be shared for age assurance purposes (subject to the consent of users and compliance with Australian privacy laws).
In addition to age assurance, compliance with the minimum age obligation is also likely to require platforms to implement systems and procedures to monitor and respond to age-restricted users circumventing age assurance.[81]
1.80 To avoid breach of the civil penalty provision, providers will need to implement measures to ascertain a user’s age. Presumably, this will require providers to subject all users, not just those under 16, to age verification or assurance processes, whether undertaken directly or via a third party. Requiring all people in Australia who have, or wish to have, a social media account to provide evidence of their age appears to limit the right to privacy. It may also have a chilling effect on freedom of expression to the extent users may be deterred from accessing social media accounts if they have concerns as to how their private information may be treated.
1.81 In this regard, the bill provides some safeguards to protect the privacy of information collected for the purposes of verifying a person’s age. Proposed section 63F provides that if an entity holds personal information collected for these purposes, the use or disclosure of that information is taken to be in breach of the Privacy Act 1988 (Privacy Act) (and therefore subject to civil penalty provisions) unless the use or disclosure:
• is for the purpose of determining whether the user is a child under 16;[82] or
• is with the voluntary, informed, current, specific and unambiguous consent of the individual;[83] or
• is in circumstances where certain aspects of the Australian Privacy Principles apply:[84]
    • when required or authorised by law or a court or tribunal order;
    • when necessary to lessen or prevent a serious threat to life, health or safety;
    • when the entity has reason to suspect that unlawful activity or misconduct of a serious nature relating to its functions or activities has been engaged in and it is necessary to take appropriate action in relation to the matter;
    • when the entity reasonably believes it necessary to locate a missing person;
    • when reasonably necessary for the establishment, exercise or defence of a legal or equitable claim; for a confidential dispute resolution process; for diplomatic or consular functions; or for any war or warlike operations, peacekeeping, humanitarian assistance etc;
    • for organisations where a permitted health situation exists (including for the provision of health services and research); and
    • when the entity reasonably believes it is reasonably necessary for one or more enforcement related activities by an enforcement body (such as the police, the immigration department or corruption commissions).
1.82 Further, if an entity holds personal information collected for the purposes of assessing a person’s age, the entity must destroy the information after using or disclosing it for the purpose for which it was collected. A failure to do this is taken to be an interference with privacy under the Privacy Act.
1.83 The explanatory memorandum states that the bill introduces robust privacy protections ‘including prohibiting platforms from using information collected for age assurance purposes for any other purpose, unless explicitly agreed to by the individual’.[85] However, the committee notes that this is not how the bill is drafted, given the exceptions allowing use and disclosure when certain aspects of the Australian Privacy Principles apply (as set out above). The explanatory memorandum provides no explanation of why these exceptions have been included. The committee is concerned that the broad exceptions to these privacy protections undermine the effectiveness of the proposed safeguards.
1.84 The committee notes that the privacy of all Australians wishing to access social media will be affected by leaving age verification methods and processes to relevant social media providers. If users are deterred from accessing social media accounts because of concerns regarding the collection of their private information, this may also have an impact on the right to freedom of expression.
1.85 The committee considers the right to privacy would be better protected if the broad exceptions as to when personal information collected for age assurance purposes may be used or disclosed were removed,[86] so that the bill reflects the explanatory memorandum’s statement that such information can only be used for the purposes for which it was collected, unless explicitly agreed to by the individual.
1.86 The committee otherwise draws these scrutiny concerns to the attention of senators and leaves this matter to the Senate as a whole.
[68] This entry can be cited as: Senate Standing Committee for the Scrutiny of Bills, Online Safety Amendment (Social Media Minimum Age) Bill 2024, Scrutiny Digest 15 of 2024; [2024] AUSStaCSBSD 235.
[69] Schedule 1. The committee draws senators’ attention to this Schedule pursuant to Senate standing order 24(1)(a)(iv) and (v).
[70] Being children ordinarily resident in Australia, see the definition of ‘age-restricted user’ in item 2 and existing definition of ‘Australian child’ in the Online Safety Act 2021, section 5.
[71] Schedule 1, item 7, proposed section 63D.
[72] See Regulatory Powers (Standard Provisions) Act 2014, subsection 82(5).
[73] Schedule 1, item 7, proposed section 63C.
[74] Explanatory memorandum, p. 3.
[75] Explanatory memorandum, p. 3.
[76] Explanatory memorandum, p. 4.
[77] Schedule 1, items 6 and 6.
[78] Explanatory memorandum, p. 18.
[79] Explanatory memorandum, p. 5.
[80] Schedule 1. The committee draws senators’ attention to this Schedule pursuant to Senate standing order 24(1)(a)(i).
[81] Explanatory memorandum, pp. 21–22.
[82] Schedule 1, item 7, proposed subparagraph 63F(1)(b)(i).
[83] Schedule 1, item 7, proposed subparagraph 63F(1)(b)(iii) and subsection 63F(2).
[84] Schedule 1, item 7, proposed subparagraph 63F(1)(b)(ii) when read together with paragraphs 6.2(b), (c), (d) and (e) of the Australian Privacy Principles and the Privacy Act 1988, sections 16A and 16B.
[85] Explanatory memorandum, p. 7, emphasis added. See also statement of compatibility, p. 13. Compare this to the explanatory memorandum, p. 24 which briefly sets out the Australian Privacy Principle exceptions (without explanation).
[86] Namely, Schedule 1, item 7, proposed subparagraph 63F(1)(b)(ii).