
Decoding how different countries are regulating social media for teenagers

To deal with a problem, you need to address its root cause. While Australia has banned social media for users under 16, and several countries are considering similar bans, the key lies in how well they regulate social media in the first place. We look at how countries, including India, have been regulating social media for teenagers.

Shamita Islur

Australia’s decision to ban social media for teenagers under 16 has sparked global discussions, with several countries examining similar measures. The bold and controversial move is now serving as a catalyst for broader debates about the mental health implications of social media and the need for stricter regulations. 

For example, Swedish police have observed gangs using social media to recruit teenagers to commit murders and bombings. Police data is said to have revealed 93 children under 15 connected to planning murders in the first seven months of 2024, which the country’s law enforcement noted is three times the figure from the previous year. To prevent this, Sweden is reportedly considering a ban. 

Similarly, the United Kingdom could also consider banning social media use for teenagers under 16. According to reports, the country has commissioned research to assess the impact of online platforms on youths. By next summer, the U.K.'s Online Safety Act aims to bring in protections for children to ensure their online experiences are appropriate for their age.

When Australia enacted a nationwide social media ban for teenagers under 16, it all began with a conversation at home. The catalyst? A book titled The Anxious Generation by U.S. social psychologist Jonathan Haidt. After reading the book, Annabel Malinauskas, wife of South Australia Premier Peter Malinauskas, urged him to take action. Studies cited by the Australian government highlight that teenagers spending over three hours daily on platforms like Instagram or TikTok are twice as likely to report symptoms of mental health issues. Prime Minister Anthony Albanese stated, “Social media is doing harm to our kids, and I’m calling time on it”.

While the ban will be fully implemented in the coming year, data from an Ipsos survey shows that it’s not just Australians who support banning social media for children and young teens. According to the report, two-thirds of respondents across the 30 countries surveyed said the same, with 73% of Indians surveyed supporting it.

While protecting teenagers is a priority, leaders have highlighted that such restrictions could sever teens from positive digital spaces. Additionally, such bans could infringe on children’s rights to digital access and expression. International frameworks like the United Nations Convention on the Rights of the Child emphasise children’s right to participate in the digital world. General Comment No. 25, adopted in March 2021, established children as rights holders in the digital environment. 

While there’s an increasing focus on getting children off social media, doing so could hinder their ability to access information and seek connections with one another, especially for those who are vulnerable. As countries draft plans to restrict teens’ social media access, the question arises: how have they regulated social media usage for teens so far? 

Regulations by countries 

Countries have adopted varied approaches to regulating social media for teenagers. 

India

The country is currently enhancing its regulatory framework for social media use by teenagers, particularly under the Digital Personal Data Protection Act, 2023. The law introduces several measures aimed at protecting minors online:

Parental consent for minors: Users under the age of 18 are considered children and will require verifiable parental consent to use social media platforms. Platforms may need to authenticate the identities of both minors and their parents through Aadhaar-based checks or DigiLocker integration. 

Restrictions on targeting children: The law prohibits tracking, monitoring, or targeting children with personalised advertisements, aiming to reduce harm from such practices.

Age verification systems: The government is working on implementing age verification mechanisms across platforms. This may include app store-level validations to ensure that users meet age requirements.

United States 

Federal Regulations

Kids Online Safety and Privacy Act (KOSPA): This legislation combines elements of the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act. It requires platforms to implement stricter privacy protections for minors, such as limiting data collection and improving safety tools for parents. It has passed the U.S. Senate but awaits further action in the House of Representatives.

Children’s Online Privacy Protection Act (COPPA): Originally enacted in 1998 and updated in 2013, COPPA restricts data collection from children under 13 without parental consent. Efforts are ongoing to update it to address modern online platforms.

State-Level Initiatives

California Age-Appropriate Design Code Act: Effective from July 2024, this law mandates platforms to set their systems to default with the highest privacy settings for users under 18. It prohibits features like targeted advertising for minors and encourages platforms to prioritise child safety.

Utah’s social media regulation laws: Utah has passed legislation requiring parental consent for minors to create social media accounts. It also imposes a curfew on minor accounts and mandates access to activity data for parents.

Arkansas’ Social Media Safety Act: This law requires social media platforms to verify users' ages and obtain parental consent for minors, aiming to prevent unsupervised access.

Additionally, many states, including Texas and Louisiana, are introducing measures to limit harmful content and implement stronger safeguards for minors online. 

European Union

Digital Services Act (DSA): The DSA requires large online platforms with over 45 million users in the EU to ensure the safety of minors. This includes avoiding personalised ads for children, banning manipulative designs (like infinite scrolling), and ensuring that their algorithms do not recommend harmful content to young users. This law also mandates regular impact assessments on the effects of social media on minors. 

There has been growing support for stricter age limits on social media usage. The Danish Prime Minister, along with other EU leaders, has proposed raising the minimum age for social media use to 15. Moreover, under the EU’s General Data Protection Regulation, parental consent is required for the processing of personal data of children under 16, though member states may set a lower threshold.

China

Screen time limits: The Chinese government has enforced strict limits on the amount of time minors under 18 years old can spend on social media and gaming platforms. In 2021, it introduced new rules limiting children to just one hour of online gaming on weekends and holidays, significantly reducing the previous limit of three hours during holidays.

Youth mode on platforms: Platforms like ByteDance's Douyin (the Chinese version of TikTok) have implemented youth mode restrictions for users under 14. This mode allows a maximum of 40 minutes of screen time per day, enforcing usage limits between 6 a.m. and 10 p.m. It also includes educational content designed to enrich minors' experiences.

Norway

The Norwegian government plans to raise the minimum age for social media access from 13 to 15. Currently, children under 13 are not allowed to use social media platforms. This move aims to better protect children from harmful online content. Prime Minister Jonas Gahr Støre emphasised the importance of safeguarding young individuals in the digital space. 

The proposed regulations would require social media companies to implement age verification mechanisms, potentially using digital ID systems like BankID, which are already used for other age-restricted services in Norway.

France

In France, legislative measures mandate social media platforms to authenticate the age of users and seek parental consent for those under 15. This legislation addresses concerns about cyberstalking, harmful content, and social media addiction, especially since many children as young as 8 are accessing platforms like Instagram and Snapchat.

The country is pushing for a broader European framework to set a minimum age of 15 for social media use. The French law allows parents to request the suspension of their children’s accounts and includes provisions for limiting screen time. Social media companies found in violation could face fines of up to 1% of their global revenue.

Germany

The country has key laws governing youth protection including the German Interstate Treaty on the Protection of Minors in the Media (JMStV), the German Protection of Young Persons Act (JuSchG), and the German Criminal Code (StGB). These laws aim to balance the protection of minors with fundamental freedoms like freedom of expression.

Key provisions:

Protection from harmful content: Content that can impede the development of minors, like violent or sexually inappropriate material, must be restricted or age-verified. Platforms need to block harmful content from minors, such as media that may foster violent or socio-ethical disorientation.

Youth Media Protection Act: This law introduces measures that require platforms to implement safer settings, such as child-friendly terms of service and private profiles, ensuring that users' data and activities are not publicly accessible.

Increased participation of minors: There have been reforms made to the Youth Protection Act to emphasise involving children in the decision-making process. Minors will be represented on advisory boards, ensuring that their perspectives are considered in regulatory decisions.

Challenges ahead

Despite these efforts, there is a growing fear that teenagers will turn to less-regulated platforms or create fake accounts to bypass restrictions. While social media platforms like Meta are working on technologies that can detect users’ ages, it is unclear how effective these would be. Australian PM Albanese himself was sceptical about the ban being fully effective, given that alcohol restrictions have failed to prevent underage drinking. Experts like Dr. Jean Twenge, author of iGen, argue that the core issue lies in the addictive design of these platforms, which requires structural changes. 

Teenagers seek social media for various reasons, but many of them include feeling a sense of freedom to express themselves, learn new things, and stay connected with their friends. When the issue itself lies in the mechanisms of social media platforms, it would be unfair to remove access from teenagers without ensuring that these platforms make space for safer online communities. 

Decode is a weekly series where we will be decoding what’s happening in the world of social media and technology.
