The Online Safety Toolkit: Understanding Hidden and Vault Apps

There are two basic ways of hiding activity on mobile devices: using built-in features to hide apps, or using third-party “vault apps” designed to hide specific content. If you are concerned that your kids might be using their devices unsafely, knowing how to look for both is especially valuable.

As always, one of the best things you can do is an initial setup of parental tools and age settings, as that will reduce access to unsafe digital environments in the first place. For steps to set up device, app, game, and streaming settings, check out our Guide.

Hidden Apps:
Hidden apps are normal mobile applications downloaded to a device and then removed from the standard menus and home screens so they aren’t easily found. Some of the apps kids might hide, and their reasons for hiding them, include:

  • Dating apps – If age settings are in place, this is less likely, but as we recently shared, app store age limits do not work well.
  • Explicit apps – related to pornography or adult chatting.
  • Streaming apps – These can range from gaming-related streams (Twitch) to overtly sexual video streaming apps.
  • Games – Any game a kid wants to play and knows they aren’t supposed to is a prime candidate for hiding.
  • To experiment – Some kids may not be using these apps unsafely, but just want to use the hiding options for fun and to feel sneaky.
  • To hide vault apps – Vault apps exist to hide explicit media, and kids may also hide the vault app itself.

Vault Apps:

Vault apps are applications downloaded to mobile devices and designed for the sole purpose of hiding other content. Kids who are creating or exchanging explicit content will often hide those images within a vault app so that they don’t appear in the standard media library on the device.

Things to know about vault apps:

  • They often have a misleading icon. Calculator icons are the most popular, but there are dozens of these apps. Any app that uses the word “lock” or lock imagery should be treated with suspicion.
  • Some vault apps are available for download to any age user.
  • Some have decoy passwords/folders. This means kids can make it look like they’ve unlocked the app, but they are still keeping their real content hidden.

For a step-by-step guide to finding hidden apps on Android devices, click here.

For a step-by-step guide to finding hidden apps on iOS devices, click here.

How Much Privacy? How Much Monitoring?

Many parents and caregivers worry about risks online, have heard the true stories of what can go wrong, and still don’t quite know how to get a footing in the dizzying world of apps, clouds, online games, devices, routers, browsers, and so on.

When the details are daunting, we look for simpler solutions, such as not allowing devices, taking them overnight, shutting off web routers, inspecting phones regularly, etc. These are all effective methods for reducing risk, but they may also be too broad a solution for some families, or for specific (older) children within the same family.

A Family Discussion:

Taking time as a family to identify what level of safety practices and online monitoring you want to put in place helps make the process more manageable and effective. Including kids in this discussion lets you learn more about what they value from their time online and gives them a window into your reasoning in setting limits. This is especially valuable for kids who are used to having free rein online and are now having safety settings placed on them for the first time.

A Range of Settings:

Think through the spectrum of control and monitoring options to decide where you want to land. Here are some examples of how this might vary even within one family.

8 year-old child:

  • Has a dedicated tablet with age-appropriate parental tools in place.
  • Only allowed to use at home and in shared spaces.
  • No passcode or passcode known by parents.
  • Optional third-party apps for detailed monitoring (e.g., flagging inappropriate language across all apps).

12 year-old child:

  • Has a phone with age-appropriate parental tools in place.
  • Allowed to use device outside and in private, but not overnight.
  • Allowed to have a passcode, but must open and share phone when asked.
  • Optional third-party apps for detailed monitoring.

17 year-old child:

  • Has a phone with age-appropriate parental tools in place.
  • Allowed to keep device at all times (optional settings for limitations on usage, location tracking).
  • Allowed to lock the device with private passcode.
  • Broad-level monitoring (e.g., viewing which apps are installed, overall usage, etc.).

The best solutions to reducing online risks are those that are tailored to the specific situation, family, and child. Understanding the spectrum of tools and options can be more effective (and less of a battle!) than trying to implement blanket solutions. Turning online safety and wellbeing into ongoing discussions with your family is one of the best things you can do. Our free PDF resources can be helpful in planning and implementing your family’s approach:

Social Media in My Life – Interactive Chart
Deciding What Apps to Allow – 7-Step Guide
Keeping Up With Apps and Parental Tools

Plugged In: NoFiltr

NoFiltr Video: Can You Trust Someone You’ve Met Online?

Discussing sex, abuse, and safety with kids can be hard and intimidating. NoFiltr videos offer candid, light-hearted, and important discussions about online safety. Check out this Father & Son Edition:

The Online Safety Toolkit: Report: Mobile App Age Limits are Unsafe and Unenforced

Over the past several months, we’ve worked to help decipher online safety concerns and tools that are difficult to navigate. We’ve often emphasized that proper use of parental settings can build a shield against unwanted content and experiences. One of the cornerstones of this approach is setting kids’ devices and accounts to their proper age, so that they can’t download apps that aren’t appropriate. Unfortunately, it seems that these foundation-level safety features are less reliable than we had hoped.

The Canadian Center for Child Protection (C3P) has just released a thorough report investigating mobile app age limits in the Apple App Store and Google Play Store. The findings are not positive for children’s safety. Here are a few key points, and we encourage readers to check out the entire report.

Key Findings:

  • For the same app, there are significant differences in age limits between the app’s own terms of service, the App Store, and Google Play. Example: Kik, a popular app that is notoriously unsafe for kids and allows no parental settings, is rated 13+ in its terms of service, 17+ in the App Store, and “Teen” in Google Play (which equates to 13+).
  • Even when set up as a 7-year-old’s account, Google’s default content filtering is “Teen” (13+), meaning the child would be able to download age-inappropriate apps, even in violation of apps’ terms of service.
  • In both the App Store and Google Play, analysts were able to get around parental settings simply by trying to bypass them multiple times over several hours.
  • With one extra click, the App Store allows the accounts of 13-year-olds to download apps rated 17+, including adult dating, sex, and live camera streaming apps. In a couple of instances, analysts were able to do this from accounts set up for an 11-year-old.
  • Through web searches, analysts easily found workarounds for Google Play’s age limits, enabling the account of a 13-year-old to download any app.
  • For both age 13 and age 11 accounts, both app marketplaces promoted apps rated above the account’s age, including mature content.


[A portion of the Report Findings. Full report here.]

This report is alarming for online safety advocates. Underpinning all of this are brief, vague, unenforced systems for determining an app’s rating in the first place. Ratings rely heavily on short questionnaires self-reported by app developers. By placing rating responsibility on developers, who have such a prominent conflict of interest, app marketplaces neglect their commitment to creating safer environments for youth.

As to what can be done, the report includes a few suggestions:

  • Transparency on how ratings are determined and why companies have chosen non-standard ages like 17+ (instead of 18+)
  • App age limits should be enforced! Currently both Apple and Google have loopholes or workarounds that allow youth to view and/or download apps that are beyond their age settings.
  • Standardize age ratings across platforms. There is no justification for the same app being accessible 4 years earlier simply because one child has an Android device and another has an Apple device, yet at present this is the case.

Hidden in plain sight, these loopholes are major weaknesses in online safety efforts, and until tech companies take the necessary steps, filling them requires additional learning and skill from parents. For guidance on some of the steps to take, check out our 5-page guide, Keeping Up with Apps and Parental Settings.

“Sending Nudes:” LGBTQ+ Youth Trends and Attitudes

Behaviors around sharing explicit images vary by gender, age range, culture, ethnicity, household income, and more. But one of the clearest distinctions to emerge in research is the difference in experience between LGBTQ+ youth and non-LGBTQ+ youth. Last month we looked at some general points from the recent Thorn Report. This month we’ve consolidated a few points on LGBTQ+ youth attitudes and behaviors, with a few notes about the figures:

Self-reported rates of having shared one’s own image:

LGBTQ+ Youth (age 9-17)
2019 to 2020: 21% to 32% (+11)

Non LGBTQ+ Youth (age 9-17)
2019 to 2020: 8% to 13% (+5)

With the exception of girls aged 13-17, rates of youth sharing their own explicit images increased in every demographic from 2019 to 2020. The largest increases appear among kids aged 9-10, boys across all ages, and LGBTQ+ youth.

Rates of youth who agree sharing nudes is normal for their peers:

LGBTQ+ Youth (age 9-17)
2019 to 2020: 39% to 34% (-5)

Non LGBTQ+ Youth (age 9-17)
2019 to 2020: 25% to 27% (+2)

Note: Encouragingly, the rate of LGBTQ+ youth who consider “sending nudes” to be normal decreased between 2019 and 2020, but it remains above that of non-LGBTQ+ youth.

Rates of youth who have engaged in non-consensual resharing of another’s image (“yes” responses only; the majority selected “prefer not to say”):

LGBTQ+ Youth (age 9-17)
2019 to 2020: 12% to 5% (-7)

Non LGBTQ+ Youth (age 9-17)
2019 to 2020: 8% to 8% (-)

Note: Here we see another positive trend among LGBTQ+ youth, indicating a decrease in non-consensual resharing.

For some LGBTQ+ youth, the internet and social media provide a place for identity development, exploration, and validation that may not be available in their offline lives. In some cases, these limited options can contribute to unsafe environments and behaviors, which may help explain some of the differences in the results.

As researchers continue to collect this information, clearer trends will emerge, making us better able to understand what youth are experiencing and how we can guide their decisions toward online safety.

Sextortion Revisited

In the first edition of the Online Safety Toolkit Newsletter, we learned that sextortion was on the rise. Unfortunately, that concern was very real, and in recent months FBI offices across the country have been issuing warnings that teen boys are increasingly being targeted.

Sextortion is a crime of online exploitation in which children or adults are coerced or blackmailed by a criminal seeking to acquire explicit content, pursue sex, or obtain money. While the issue is not new, the financial-scam version of sextortion is becoming standardized. Increasingly, these scams are directed at children, and particularly young boys.

In this version of the crime, perpetrators capitalize on the developing brain, manipulating boys into risky decisions and then blackmailing them by threatening to leak the photos if they do not send payment. Sadly, in March a 17-year-old boy died by suicide just hours after receiving sextortion demands, and he is far from the only one.

It’s vital to talk to kids about the risks of online behavior, about concerning trends, and about how they can always come to you, even if they feel they’ve made a bad decision. The harms of sextortion are amplified by the secrecy victims feel they must keep. Below is a good infographic from the Internet Crimes Against Children Task Force, giving an overview of the issue and how you might approach discussing it with kids.

Preventing Sextortion infographic

View and Download PDF

Plugged In: Project Arachnid – A Tool for Removing Child Sexual Abuse Material

Created by the Canadian Center for Child Protection (C3P), Project Arachnid partners with child abuse hotlines and organizations around the world to find and remove child sexual abuse images. It also collects data on the proliferation of this material and on whether companies respond proactively to removal requests. Below is a 1.5-minute video explaining how Project Arachnid works, followed by a link to another 2-minute video on how tech companies fall short on child safety.

Click here to watch the 2-minute report from the project (only viewable via the YouTube link).

The Project Arachnid Report discussed in the video concluded with 8 suggestions to make tech companies more compliant and proactive:

  1. Enact and impose a duty of care, along with financial penalties for non-compliance or failure to fulfill a required duty of care.
  2. Impose certain legal/contractual obligations in the terms of service for electronic service providers and their downstream customers.
  3. Require automated, proactive content detection for platforms with user-generated content.
  4. Set standards for content that may not be criminal but remains severely harmful or abusive to minors.
  5. Mandate human content moderation standards.
  6. Set requirements for proof of subject or participant consent and uploader verification.
  7. Establish platform design standards that reduce risk and promote safety.
  8. Establish standards for user-reporting mechanisms and content removal obligations.

The Online Safety Toolkit: Guidesheet: 7 Steps for Deciding What Apps to Allow

Use this series of questions to help decide, in collaboration with your kids, whether to allow or block apps and games. Let them know it is not a checklist that results in a simple “yes” or “no,” but that there may be settings and parameters that need to be in place before proceeding.

7 Steps to Help You Decide What Apps to Allow

View and download PDF.

New to the Newsletter? Sign up here.

Plugged In: What is the Metaverse?

Virtual reality, augmented reality, the metaverse… Parents have varying degrees of familiarity with these ideas, but nobody truly knows how they will play out or the overall effect they might have in the lives of growing children.

In this 13-minute video, human rights advocate and virtual reality expert Brittan Heller talks about the future of the metaverse, virtual and augmented reality, and how society can make these technologies safer for youth and for everyone else.

Ask Me Anything: Virtual Reality Expert Brittan Heller from ConnectSafely on Vimeo.

“Sending Nudes:” What are Kids’ Behaviors and Perceptions?

Online trends shift faster than they can be tracked and studied, especially since COVID-19 began. Some of the most helpful research in recent years has been conducted by Thorn, a nonprofit working to eliminate online child sexual abuse. They have now released two reports on youth behaviors and attitudes related to self-generated child sexual abuse material (SG-CSAM), informally known as “sending nudes.” Below we summarize some of the valuable findings from their reports, which documented attitudes and behaviors in 2019 and 2020.

We encourage readers to check out the full report, where data is presented in two age ranges (9-12, 13-17) for more precision.

Summary Findings:

“…(1) sexting is becoming viewed as a ‘normal’ activity among peers; (2) coercion plays a critical role and exponentially increases risk to the victim; and (3) attitudes of blame and shame can compound the harms of online threats and unintentionally isolate young people.”

Specific Data Points:

% of minors (9-17) who agree it is normal for kids their age to “share nudes”.
2019: 27%
2020: 28%

Note: While the combined total remained steady, younger youth showed a significant increase while teens showed a slight decrease.

% of minors (9-17) who have shared their own explicit image
2019: 11%
2020: 17%

Note: Both age ranges (9-12 and 13-17) showed increased sharing, with reports from 9-12-year-olds more than doubling (from 6% to 14%).

Largest increases in production and sharing of images:
% of 9-10-year-olds who have shared their own image
2019: 3%
2020: 15%

LGBTQ+ Youth:
2019: 21%
2020: 32% (almost 2.5x more likely than non-LGBTQ+ youth)

Among youth who have sent a nude photo or image (2020 data):
50% sent images to someone they had never met offline
41% sent images to someone over the age of 18

Rates of sharing nude images and videos by household income (2019 data only):
<$50k: 9%
$50k – $75k: 10%
$75k – $100k: 15%
$100k – $150k: 17%
>$150k: 25%

Note: There is a possible link between household income and experience with sharing nudes, with sharing increasing as household income rises.

Reasons why youth who considered sending a nude image or video decided not to:

Reasons for not sharing a nude photo table

Note: When speaking with kids and developing curriculum, addressing youth concerns directly is essential. For example, the illegality of image sharing does not appear to weigh heavily on their minds, and therefore may not be the best primary emphasis.

Perceptions of blame when a nude has been re-shared with a broader audience:

Perceptions of blame image

Note: This is encouraging data that suggests that education around coercion and non-consensual sharing is having an impact. The less shame attached to these situations, the better able kids are to get help.

Taken together, these and other data points paint a complex and shifting picture of youth attitudes and behavior. While the details fluctuate, kids’ sense of “a new normal” seems to hold steady. As we continue to work to keep them safe and raise informed digital citizens, keeping up with their experiences and views is essential to effective communication.