Revamping Safety: The Impact of Instagram’s New Policies on Teen Accounts

Big changes are coming soon to Instagram in how the platform protects minors.

Meta announced this Tuesday, September 19, the creation of "Teenage Accounts," intended to better protect underage users from the dangers linked to Instagram, an application that many associations and authorities accuse of harming the mental health of young people. "This is a significant update, designed to give parents peace of mind," Antigone Davis, the Californian group's vice-president in charge of safety issues, told AFP.

In practice, users aged 13 to 17 will now have private accounts by default, with safeguards on who can contact them and what content they can see. Teenagers under 16 who want a public profile and fewer restrictions, for example because they hope to become influencers, will have to get their parents' permission. This applies whether they are already registered or new to the platform.

"This is a fundamental change to make sure we really do things right," the executive emphasizes. Adults will be able to supervise their children's activity on the social network and act accordingly, including by blocking the application. The parent company of Facebook, Instagram, WhatsApp and Messenger is also tightening its rules on age.

"We know that teens may lie about their age, particularly to try to circumvent these protections," notes Antigone Davis. Now, if a teenager tries to change their date of birth, "we will ask them to prove their age."

Age

Pressure has been mounting for a year against the world's second-largest digital advertising company and its competitors. Last October, some forty US states filed a complaint against Meta's platforms, accusing them of harming the "mental and physical health of youth" through the risks of addiction, cyberbullying and eating disorders.

From Washington to Canberra, elected officials are working on bills to better protect children online. Australia is expected to soon set a minimum age of between 14 and 16 for using social networks. Meta currently refuses to verify the age of all its users, in the name of privacy.

"If we detect that someone has definitely lied about their age, we intervene," says Antigone Davis, "but we don't want to force 3 billion people to provide an ID." According to the executive, it would be simpler and more effective if age verification took place at the level of smartphones' mobile operating systems, i.e. Android or iOS.

"They have significant information about the age of users," she argues, and could therefore "share it with all the apps that teenagers use."

Victims

"It's hard to know to what extent Instagram's announcement will satisfy the authorities," responded Casey Newton, author of the specialist newsletter Platformer. The concern has reached such proportions that the US Surgeon General recently called for social networks to be required to display warnings about the dangers they pose to minors, like the prevention messages on cigarette packages.

"Instagram is addictive. The app leads children into vicious circles, where they are shown not what they want to see, but what they can't look away from," says Matthew Bergman. In 2021, the lawyer founded an organization to defend the "victims of social networks" in court. It represents, among others, 200 parents whose children took their own lives "after being encouraged to do so by videos recommended by Instagram or TikTok."

Matthew Bergman also cites the many cases in which young girls have developed serious eating disorders. Meta already prevents the promotion of extreme diets on its platforms, among other measures taken in recent years. "These are small steps in the right direction, but there is so much more to do," the lawyer says.

According to him, it would be enough for these companies to make their platforms less addictive, "and therefore a little less profitable," without losing what makes them valuable to users for communicating or exploring their interests. Testifying before Congress at the end of January, Meta boss Mark Zuckerberg offered a rare apology to the parents of victims, saying he was "sorry for everything you've been through."
