The Online Safety Toolkit: Report: Mobile App Age Limits are Unsafe and Unenforced

Over the past several months, we’ve worked to help decipher online safety concerns and tools that are difficult to navigate. We’ve often emphasized that proper use of parental settings can build a shield against unwanted content and experiences. One of the cornerstones of this approach is setting kids’ devices and accounts to their proper age so that they can’t download apps that aren’t appropriate. Unfortunately, it seems these foundation-level safety features are less reliable than we had hoped.

The Canadian Centre for Child Protection (C3P) has just released a thorough report investigating mobile app age limits in the Apple App Store and Google Play Store. The findings are not positive for children’s safety. Here are a few key points, and we encourage readers to check out the entire report.

Key Findings:

  • For the same app, there can be significant differences in age limits between the app’s own terms of service, the Apple App Store, and Google Play. Example: Kik, a popular app that is notoriously unsafe for kids and offers no parental settings, is rated 13+ in its terms of service, 17+ in the App Store, and “Teen” in Google Play (which equates to 13+).
  • Even when an account is set up for a 7-year-old, Google’s default content filtering is “Teen” (13+), meaning the child would be able to download age-inappropriate apps, even in violation of those apps’ terms of service.
  • In both the App Store and Google Play, analysts were able to get around parental settings simply by repeatedly attempting to bypass them over several hours.
  • With one extra click, the App Store allows accounts of 13-year-olds to download apps rated 17+, including adult dating, sex, and live camera streaming apps. In a couple of instances, analysts were able to do this even from accounts set up for an 11-year-old.
  • Through web searches, analysts easily found workarounds for Google Play’s age limits, enabling the account of a 13-year-old to download any app.
  • For both age-13 and age-11 accounts, both app marketplaces promoted apps rated above the account’s age, including apps with mature content.


[A portion of the Report Findings. Full report here.]

This report is alarming for online safety advocates. Underpinning all of this are brief, vague, and unenforced systems for determining an app’s rating in the first place. Ratings rely heavily on short questionnaires self-reported by app developers. By placing rating responsibility on developers, who have such a prominent conflict of interest, app marketplaces neglect their commitment to creating safer environments for youth.

As to what can be done, the report includes a few suggestions:

  • Transparency on how ratings are determined and why companies have chosen non-standard ages like 17+ (instead of 18+)
  • App age limits should be enforced! Currently both Apple and Google have loopholes or workarounds that allow youth to view and/or download apps that are beyond their age settings.
  • Standardize age ratings across platforms. There is no justification for the same app being accessible four years earlier simply because one child has an Android device and another has an Apple device, yet at present this is the case.

Hidden in plain sight, these loopholes are major weaknesses in online safety efforts, and closing them requires additional learning and skill from parents until tech companies take the necessary steps. For guidance on some of the steps to take, check out our 5-page guide, Keeping Up with Apps and Parental Settings.

“Sending Nudes:” LGBTQ+ Youth Trends and Attitudes

Behaviors around sharing explicit images vary by gender, age range, culture, ethnicity, household income, and more. But one of the clearest distinctions that emerges in the research is the difference between LGBTQ+ youth and non-LGBTQ+ youth. Last month we looked at some general points from the recent Thorn report. This month we’ve consolidated a few points on LGBTQ+ youth attitudes and behaviors, with some notes about the figures:

Self-reported rates of having shared one’s own image:

LGBTQ+ Youth (age 9-17)
2019 to 2020: 21% to 32% (+11)

Non LGBTQ+ Youth (age 9-17)
2019 to 2020: 8% to 13% (+5)

With the exception of girls aged 13-17, every demographic saw an increase from 2019 to 2020 in youth sharing their own explicit images. The largest increases were among kids aged 9-10, boys across all ages, and LGBTQ+ youth.

Rates of youth who agree sharing nudes is normal for their peers:

LGBTQ+ Youth (age 9-17)
2019 to 2020: 39% to 34% (-5)

Non LGBTQ+ Youth (age 9-17)
2019 to 2020: 25% to 27% (+2)

Note: Encouragingly, the rate of LGBTQ+ youth considering “sending nudes” to be normal decreased between 2019 and 2020, but it remains above that of non-LGBTQ+ youth.

Rates of youth who have engaged in non-consensual resharing of another’s image (“yes” responses only; a majority selected “prefer not to say”):

LGBTQ+ Youth (age 9-17)
2019 to 2020: 12% to 5% (-7)

Non LGBTQ+ Youth (age 9-17)
2019 to 2020: 8% to 8% (no change)

Note: Here we see another positive trend among LGBTQ+ youth, indicating a decrease in non-consensual resharing.

For some LGBTQ+ youth, the internet and social media provide a place for identity development, exploration, and validation that may not be available in their non-virtual lives. In some cases, these limited options can lead to unsafe environments and behaviors, which may help explain some of the differing results.

As researchers continue to collect this information, clearer trends will emerge, making us better able to understand what youth are experiencing and how we can guide their decisions toward online safety.

Sextortion Revisited

In the first edition of the Online Safety Toolkit Newsletter, we learned that sextortion was on the rise. Unfortunately, that concern was very real, and in recent months FBI offices across the country have been issuing warnings that teen boys are increasingly being targeted.

Sextortion is a crime of online exploitation in which children or adults are coerced or blackmailed by a criminal seeking to acquire explicit content, pursue sex, or obtain money. While the issue is not new, we are currently seeing the financial-scam variant of sextortion become standard practice. Increasingly, these scams are directed at children, and particularly at young boys.

In this version of the crime, perpetrators capitalize on the developing brain, manipulating boys into risky decisions and then blackmailing them by threatening to leak the photos if they do not send payment. Sadly, in March a 17-year-old boy died by suicide just hours after receiving sextortion demands, and he is far from the only one.

It’s vital to talk to kids about the risks of online behavior, about concerning trends, and about how they can always come to you, even if they feel they’ve made a bad decision. The harms of sextortion are amplified by the secrecy victims feel they must keep. Below is a good infographic from the Internet Crimes Against Children Task Force, giving an overview of the issue and how you might approach discussing it with kids.

[Preventing Sextortion infographic]

View and Download PDF

Plugged In: Project Arachnid – A Tool for Removing Child Sexual Abuse Material

Created by the Canadian Centre for Child Protection (C3P), Project Arachnid partners with child abuse hotlines and organizations around the world to find and remove child sexual abuse images. It also collects data on the proliferation of this material and on whether companies respond proactively to removal requests. Below is a 1.5-minute video explaining how Project Arachnid works, followed by a link to another 2-minute video on how tech companies fall short on child safety.

Click here to watch the 2-minute report from the project (only viewable via the YouTube link).

The Project Arachnid report discussed in the video concluded with eight suggestions to make tech companies more compliant and proactive:

  1. Enact and impose a duty of care, along with financial penalties for non-compliance or failure to fulfill a required duty of care.
  2. Impose certain legal/contractual obligations in the terms of service for electronic service providers and their downstream customers.
  3. Require automated, proactive content detection for platforms with user-generated content.
  4. Set standards for content that may not be criminal but remains severely harmful or abusive to minors.
  5. Mandate human content moderation standards.
  6. Set requirements for proof of subject or participant consent and uploader verification.
  7. Establish platform design standards that reduce risk and promote safety.
  8. Establish standards for user-reporting mechanisms and content removal obligations.