The Online Safety Toolkit:
Report: Mobile App Age Limits are Unsafe and Unenforced
Over the past several months, we’ve worked to help decipher online safety concerns and tools that are difficult to navigate. We’ve often emphasized that proper use of parental settings can build a shield against unwanted content and experiences. One of the cornerstones of this approach is setting kids’ devices and accounts to their proper age, so that they won’t be able to download apps that aren’t appropriate. Unfortunately, it seems that these foundation-level safety features are less reliable than we had hoped.
The Canadian Centre for Child Protection (C3P) has just released a thorough report investigating mobile app age limits in the Apple App Store and Google Play Store. The findings are not positive for children’s safety. Here are a few key points, and we encourage readers to check out the entire report.
- For the same app, age limits differ significantly between the app’s own terms of service, the App Store, and Google Play. Example: Kik, a popular app that is notoriously unsafe for kids and offers no parental settings, is rated 13+ in its terms of service, 17+ in the App Store, and “Teen” in Google Play (which equates to 13+).
- Even when an account is set up for a 7-year-old, Google’s default content filtering is “Teen” (13+), meaning the child would be able to download age-inappropriate apps, even in violation of apps’ terms of service.
- In both the App Store and Google Play, analysts were able to get around parental settings simply by attempting to bypass them repeatedly over several hours.
- With one extra click, the App Store allows accounts of 13-year-olds to download apps rated 17+, including adult dating, sex, and live camera streaming apps. In a couple of instances, analysts were able to do this even from accounts set up for an 11-year-old.
- Through web searches, analysts easily found workarounds for Google Play’s age limits, enabling the account of a 13-year-old to download any app.
- For both age-13 and age-11 accounts, both app marketplaces promoted apps rated above the account’s age, including mature content.
[A portion of the Report Findings. Full report here.]
This report is alarming for online safety advocates. Underpinning all of this are brief, vague, and unenforced systems for determining an app’s rating in the first place. Ratings rely heavily on short questionnaires self-reported by app developers. By placing rating responsibility on developers, who have such a prominent conflict of interest, app marketplaces neglect their commitment to creating safer environments for youth.
As to what can be done, the report includes a few suggestions:
- Transparency on how ratings are determined and why companies have chosen non-standard ages like 17+ (instead of 18+)
- App age limits should be enforced! Currently both Apple and Google have loopholes or workarounds that allow youth to view and/or download apps that are beyond their age settings.
- Standardize age ratings across platforms. There is no justification for the same app being accessible four years earlier simply because one child has an Android device and another has an Apple device, yet at present this is the case.
Hidden in plain sight, these loopholes are major weaknesses in online safety efforts, and until tech companies take the necessary steps, filling them requires additional learning and skill from parents. For guidance on some of the steps to take, check out our 5-page guide, Keeping Up with Apps and Parental Settings.
“Sending Nudes:” LGBTQ+ Youth Trends and Attitudes
Behaviors around sharing explicit images vary by gender, age range, culture, ethnicity, household income, and more. But one of the clearest distinctions that emerges in research is between the experiences of LGBTQ+ youth and non-LGBTQ+ youth. Last month we looked at some general points from the recent Thorn report. This month we’ve consolidated a few points on LGBTQ+ youth attitudes and behaviors, with a few notes about the figures:
Self-reported rates of having shared one’s own image:
LGBTQ+ Youth (age 9-17)
2019 to 2020: 21% to 32% (+11)
Non LGBTQ+ Youth (age 9-17)
2019 to 2020: 8% to 13% (+5)
With the exception of girls aged 13-17, every demographic of youth showed an increase in sharing their own explicit images from 2019 to 2020. The largest increases appear among kids aged 9-10, boys across age ranges, and LGBTQ+ youth.
Rates of youth who agree sharing nudes is normal for their peers:
LGBTQ+ Youth (age 9-17)
2019 to 2020: 39% to 34% (-5)
Non LGBTQ+ Youth (age 9-17)
2019 to 2020: 25% to 27% (+2)
Note: Encouragingly, the rate of LGBTQ+ youth considering “sending nudes” to be normal decreased between 2019 and 2020, but remains above that of non-LGBTQ+ youth.
Rates who have engaged in non-consensual resharing of another’s image (“yes” responses only, majority selected “prefer not to say”):
LGBTQ+ Youth (age 9-17)
2019 to 2020: 12% to 5% (-7)
Non LGBTQ+ Youth (age 9-17)
2019 to 2020: 8% to 8% (no change)
Note: Here we see another positive trend among LGBTQ+ youth, indicating a decrease in non-consensual resharing.
For some LGBTQ+ youth, the internet and social media provide a place for identity development, exploration, and validation that may not be available in their non-virtual lives. In some cases, these limited options can lead to unsafe environments and behaviors, which may help explain some of the differing results.
As researchers continue to collect this information, clearer trends will emerge, making us better able to understand what youth are experiencing and how we can guide their decisions toward online safety.
In the first edition of the Online Safety Toolkit Newsletter we learned that sextortion was on the rise. Unfortunately, that concern was very real, and in recent months FBI offices across the country have issued warnings that teen boys are increasingly being targeted.
Sextortion is a crime of online exploitation in which children or adults are coerced or blackmailed by a criminal seeking explicit content, sex, or money. While the issue is not new, the financial-scam variant of sextortion is becoming standardized, and these scams are increasingly directed at children, particularly young boys.
In this version of the crime, perpetrators capitalize on the developing brain, manipulating boys into risky decisions and then blackmailing them by threatening to leak the photos if they do not send payment. Sadly, in March a 17-year-old boy died by suicide just hours after receiving sextortion demands, and he is far from the only one.
It’s vital to talk to kids about the risks of online behavior, about concerning trends, and about how they can always come to you, even if they feel they’ve made a bad decision. The harms of sextortion are amplified by the secrecy victims feel they must keep. Below is a good infographic from the Internet Crimes Against Children Task Force, giving an overview of the issue and how you might approach discussing it with kids.
Plugged In: Project Arachnid – A Tool for Removing Child Sexual Abuse Material
Created by the Canadian Centre for Child Protection (C3P), Project Arachnid partners with child abuse hotlines and organizations around the world to find and remove child sexual abuse images. It also collects data on the proliferation of this material and on whether companies respond proactively to removal requests. Below is a 1.5-minute video explaining how Project Arachnid works, followed by a link to a 2-minute video on how tech companies fall short on child safety.
Click here to watch a 2-minute report from the project (only viewable via the YouTube link).
The Project Arachnid report discussed in the video concluded with eight suggestions to make tech companies more compliant and proactive:
- Enact and impose a duty of care, along with financial penalties for non-compliance or failure to fulfill a required duty of care.
- Impose certain legal/contractual obligations in the terms of service for electronic service providers and their downstream customers.
- Require automated, proactive content detection for platforms with user-generated content.
- Set standards for content that may not be criminal but remains severely harmful or abusive to minors.
- Mandate human content moderation standards.
- Set requirements for proof of subject or participant consent and uploader verification.
- Establish platform design standards that reduce risk and promote safety.
- Establish standards for user-reporting mechanisms and content removal obligations.
The Online Safety Toolkit: Guidesheet: 7 Steps for Deciding what Apps to Allow
Use this series of questions to help decide, in collaboration with your kids, when to allow or block apps and games. Let them know it is not a checklist resulting only in “yes” or “no,” but that there may be settings and parameters that need to be in place before proceeding.
New to the Newsletter? Sign up here.
Plugged In: What is the Metaverse?
Virtual reality, augmented reality, the metaverse… Parents have varying degrees of familiarity with these ideas, but nobody truly knows how they will play out or the overall effect they might have in the lives of growing children.
In this 13-minute video, human rights advocate and virtual reality expert Brittan Heller talks about the future of the metaverse, virtual and augmented reality, and how society can make these technologies safer for youth — and everyone.
“Sending Nudes:” What are Kids’ Behaviors and Perceptions?
Online trends shift faster than they can be tracked and studied, especially since COVID-19 began. Some of the most helpful research in recent years has been conducted by Thorn, a nonprofit working to eliminate online child sexual abuse. They have now released two reports on youth behaviors and attitudes related to self-generated child sexual abuse material (SG-CSAM), informally known as “sending nudes.” Below we summarize some of the valuable findings from their reports, which documented attitudes and behaviors in 2019 and 2020.
We encourage readers to check out the full report, where data is presented in two age ranges (9-12, 13-17) for more precision.
“…(1) sexting is becoming viewed as a ‘normal’ activity among peers; (2) coercion plays a critical role and exponentially increases risk to the victim; and (3) attitudes of blame and shame can compound the harms of online threats and unintentionally isolate young people.”
Specific Data Points:
% of minors (9-17) who agree it is normal for kids their age to “share nudes”.
Note: while the combined total remained steady, younger youth showed a significant increase while teens showed a slight decrease.
% of minors (9-17) who have shared their own explicit image
Note: Both age ranges (9-12 and 13-17) showed increased sharing, with reports from 9-12-year-olds more than doubling (from 6% to 14%).
Largest increases in production and sharing of images:
% of 9-10-year-olds who have shared their own image
2020: 32% (LGBTQ+ youth were almost 2.5x more likely to share than non-LGBTQ+ youth)
Among youth who have sent a nude photo or image (2020 data):
50% sent images to someone they had never met offline
41% sent images to someone over the age of 18
Rates of sharing nude images and videos by household income (2019 data only):
$50k – $75k: 10%
$75k – $100k: 15%
$100k – $150k: 17%
Note: These data suggest a possible link between household income and experience with sharing nudes, with sharing increasing at higher income levels.
Reasons why youth who considered sending a nude image or video decided not to:
Note: When speaking with kids and developing curriculum, addressing youth concerns directly is essential. For example, the illegality of image sharing is shown not to weigh heavily on their minds, and therefore may not be the best primary emphasis.
Perceptions of blame when a nude has been re-shared with a broader audience:
Note: This is encouraging data that suggests that education around coercion and non-consensual sharing is having an impact. The less shame attached to these situations, the better able kids are to get help.
Taken together, these and other data points paint a complex and shifting picture of youth attitudes and behavior. While the details fluctuate, kids’ sense of “a new normal” seems to hold steady. As we continue to work to keep them safe and raise informed digital citizens, keeping up with their experiences and views is essential to effective communication.
Going Beyond “Public and Permanent”
When encouraging kids to practice safe behaviors, whether on or offline, the most effective approach depends on a variety of factors, such as kids’ priorities and motivations, current norms, as well as the current practices recommended by experts.
With time, we as a society learned that “stranger danger” poses less of a risk to children than does danger and maltreatment from people they know and trust, and we continue to adjust our messaging accordingly.
When it comes to sharing explicit images online, dominant practice has been to emphasize to kids that anything they share online can become “public and permanent” and therefore they should behave as if it will. This is not bad advice, but it first arose in a pre-social media, pre-mobile device environment. With the advent of smartphones and non-stop connectivity, the ease and frequency of sharing private content has increased dramatically, as has the likelihood that a private image will then be non-consensually shared with others or posted online.
Given how often kids are sharing images and having those images non-consensually spread further, some experts are questioning whether “public and permanent” is still the best framing. When a child – who through typical adolescent lapses in judgment may have shared an image of themselves – is told that those images are public and permanent, it may simply add to the shame, despair, and fear they already feel about the situation. In reality, there are often actions that can be taken to prevent or minimize the spread of explicit images, but kids may be less likely to pursue that help if they have been presented with a black-and-white version of online risk and safety, or fear punishment for seeking help.
Ultimately, we need laws, language, and education that better acknowledge the complex range of experiences that fall within “sexting” and online risky behavior. When we recognize and speak to these different circumstances, we are better able to emphasize the danger of “public and permanent” without that becoming a disempowering idea to those kids who have already experienced coercion or the betrayal of having images shared non-consensually.
The Online Safety Toolkit: Resource Packet – Keeping Up with Apps and Parental Tools
Parents and caregivers regularly tell us that they feel overwhelmed by the task of understanding and safeguarding kids’ online activity. Our new, 5-page resource packet, linked below, is a great starting point for any adult wanting to take steps toward more effective online safety practices.
Plugged In: Current Legislation Related to Online Safety
On March 1st, 2022, children’s online safety was included in the State of the Union address. Nearly 25 years after the Children’s Online Privacy Protection Act (COPPA) of 1998, four primary bills are now under consideration in the US.
Kids Online Safety Act (KOSA)
KOSA would require commercial online platforms to “prevent and mitigate the risks of physical, emotional, developmental, or material harms” posed to minors using the platform.
Children and Teens’ Online Privacy Protection Act
The Children and Teens’ Online Privacy Protection Act (CTOPPA) expands upon the original COPPA legislation and would extend the protections of COPPA from children up to age 13 to children up to age 16. This has been put forth by Senator Ed Markey, who authored the original COPPA legislation.
Protecting the Information of our Vulnerable Children and Youth Act
Also called the Kids PRIVCY Act, this legislation is similar to CTOPPA, but extends COPPA protections to children up to age 17.
The California Age-Appropriate Design Code Act
California is considering the California Age-Appropriate Design Code Act, which would impose increased requirements for companies to verify age and consider children’s safety and privacy. This aligns with a movement in the United Kingdom toward requiring all pornography sites to include age verification that goes beyond self-confirmation, the current norm.
Read more, and read the legislation text, here.