MELBOURNE, Australia (AP) — Australia will require social media platforms to take steps to protect users from online harms, including bullying, predatory behavior and algorithms that promote harmful content, the government announced Thursday.
“The Digital Duty of Care will place the onus on digital platforms to proactively keep Australians safe and better prevent online harms,” Communications Minister Michelle Rowland said in a statement.
The proposed changes to the Online Safety Act will be outlined in Parliament next week, alongside legislation that would ban children younger than 16 from major social media platforms including X, Instagram, Facebook and TikTok.
Critics of that legislation argue that excluding children from social media could reduce the platforms’ incentive to make their services safer.
Research has increasingly implicated social media in rising rates of self-harm and suicide among young people, as well as in eating disorders linked to bullying and exposure to unrealistic body images.
Rowland noted that holding tech companies legally accountable for their users’ safety is an approach already adopted by Britain and the European Union.
Under the proposed framework, digital businesses would be required to take reasonable, proactive steps to mitigate foreseeable harms on their platforms and services. The duty of care would be underpinned by a risk assessment and risk mitigation approach and by safety-by-design principles, Rowland said.
Legislating a duty of care means services cannot simply “set and forget.” Platforms will be obligated to continually identify and address emerging risks as technology evolves, she said.
The categories of harm to be addressed in the legislation include risks to young people’s mental well-being and the promotion of harmful practices and illegal activities.
The government has yet to say when the duty of care legislation will be presented to Parliament or what penalties will apply for violations.
The Digital Industry Group Inc. (DIGI), an advocate for the digital sector in Australia, welcomed the government’s plan to “future-proof” the Online Safety Act. “DIGI’s members together represent some of the safest sections of the Internet, and their work to keep people safe on their services never stops,” DIGI managing director Sunita Bose said.
“While we await further details about this announcement, DIGI’s members will continue to deliver safety-by-design on their services and work constructively with the government to keep Australians safe online,” Bose added.
Swinburne University digital media expert Belinda Barnet called the proposed duty of care a “great idea.”
“It’s quite pioneering to expect that platforms that host Australian users would have a duty of care responsibility in terms of the content they show and the experiences they offer,” Barnet said.
“It’s making the platforms take responsibility, and that just simply doesn’t happen at the moment. There’s an assumption that they’re a neutral third party. They’re not responsible for the impact of that content,” she added.
**Interview with Communications Minister Michelle Rowland on Australia’s Digital Duty of Care Legislation**
**Interviewer:** Thank you for joining us today, Minister Rowland. The Australian government has recently announced the Digital Duty of Care aimed at enhancing online safety. Could you expand on what this initiative entails?
**Michelle Rowland:** Thank you for having me. The Digital Duty of Care is a significant step towards protecting our citizens in the digital landscape. It mandates social media platforms to proactively implement measures that mitigate online harms, such as bullying, predatory behavior, and harmful content driven by algorithms. Essentially, we are placing the responsibility on these platforms to ensure a safer online environment.
**Interviewer:** It sounds like a comprehensive approach. Alongside this, you’ve introduced legislation that will restrict access to major social media platforms for children under 16. Why is this age limit being implemented?
**Michelle Rowland:** We are seeing alarming trends in mental health issues among young people, often linked to social media exposure. This age limit is part of our strategy to protect children from potential harms. We want to ensure that they are not exposed to environments that can lead to bullying, self-harm, or unhealthy body image perceptions.
**Interviewer:** Critics have raised concerns that this exclusion could lessen the motivation for platforms to create safer environments. How do you respond to those concerns?
**Michelle Rowland:** That’s a valid point and one we’ve considered. However, our objective is to initiate a shift in the responsibility held by tech companies. By legislating a duty of care, we are emphasizing the need for these platforms to continually assess and enhance their safety measures. We believe that by establishing clear expectations, we can encourage companies to prioritize user safety, even if children are restricted from accessing their services.
**Interviewer:** The legislation aligns with efforts seen in Britain and the European Union. Why do you think there is a growing global consensus on these issues now?
**Michelle Rowland:** The mental well-being of our youth is an international concern that transcends borders. As we gather more evidence linking social media with mental health issues, it becomes increasingly clear that coordinated action is required. We are learning from the experiences of other nations and taking concrete steps to address these challenges comprehensively.
**Interviewer:** Under the new framework, what type of actions will social media companies be required to perform?
**Michelle Rowland:** Companies will need to conduct thorough risk assessments and take proactive steps to mitigate identified risks. This includes implementing safety-by-design principles, which means incorporating safety features from the outset of platform design rather than adjusting afterward. The legislation will enforce that they cannot simply ignore these responsibilities.
**Interviewer:** Thank you, Minister Rowland, for providing insight into this vital issue. It sounds like a crucial initiative for making online spaces safer for all Australians, especially children.
**Michelle Rowland:** Thank you for having me. It’s a pivotal time for digital safety, and we are committed to seeing these changes for the betterment of our society.