The Online Safety Toolkit: New Supervision Tools in Instagram and Snapchat

Before we can dive into the specific features rolled out by Instagram and Snapchat, it’s important to understand the basic distinction between parental supervision features and parental setting features.

App Supervision vs Settings

The new tools in Snapchat and Instagram only offer supervision. That means they allow some degree of monitoring activity, but they do not offer tools to limit or determine how kids interact on their platforms.

The most important, and unfortunate, thing to know about the new settings in both Instagram and Snapchat is that they only work under specific conditions:

  • For each app, a parent must make their own account, forcing them into the platform ecosystem.
  • For the settings to work, parents have to link to their child’s account, and children have to agree to the link. Children can also turn supervision off without parental permission.

Because these settings cannot be locked, they are most effective in conjunction with the app restriction settings built into devices. A parent might then allow Snapchat or Instagram on the condition that parental supervision remain on; if a child turns it off, the consequence could be losing access to that app. With device settings properly configured, a parent can remotely restrict their child from keeping or downloading an entire app. (Download our Guide to settings in devices, apps, games, and streaming services.)

Overview of Features:

Snapchat’s new Family Center:

Snapchat Family Center introduces the following features:

  • See who a child has sent messages, images, or videos to in the past 7 days. The content of communication remains private.
  • See a complete list of a kid’s existing friends. In the coming weeks, Snapchat says it will add a feature to easily view newly added friends.
  • Easily and confidentially report any concerning accounts to a 24/7 Trust and Safety team to investigate.
  • Access key explainers about how to use these tools, resources for important conversation starters, and additional tips.
  • Kids who have opted in to Family Center are able to see what their parents see, with a mirrored view of features.

Click here for a more detailed discussion and instruction video for Snapchat Family Center features.

Instagram’s new Family Center:

Instagram Family Center introduces the following features (note, if Instagram is used across multiple devices, these settings will apply to them all):

  • Set a time limit for use each day.
  • Set scheduled breaks that limit a teen’s use during select days and hours.
  • See how much time is spent on the app across all devices.
  • See how much time is spent on the app each day on average, across all devices.
  • See how much time was spent on the app each specific day for the last week.
  • See which accounts a child is following.
  • See which accounts are following a child.
  • Kids will be able to see a preview of what their parents see while supervising, and can notify parents after reporting something on Instagram.

Click here for a more detailed discussion of Instagram Family Center features.

PimEyes: The Facial Recognition Search Engine

Recent investigative reporting from The Intercept has revealed major child safety concerns related to PimEyes, the facial recognition search engine.

Previously, PimEyes received backlash for including search result images drawn from social media accounts and other sites not usually indexed by search engines. While those sources have been removed from results, plenty of the remaining results still link to potentially identifying information. In its investigation, The Intercept found PimEyes results that included pictures of children on charity sites or in videos from local news stories: details that could help someone trying to locate a child, whether a predator or an abusive, non-custodial parent.

More alarmingly, The Intercept reports that one 16-year-old used PimEyes to find “revenge porn” images of her that an ex had posted online. These images are child pornography, and yet they appear in PimEyes search results (blurred out and marked as potentially explicit, unless the user pays for premium features).

While PimEyes claims to have some systems in place for monitoring and addressing harmful use of the platform, it is clear these systems are minimal and may not function as intended.

By contrast, the child abuse prevention nonprofit Thorn has also used facial recognition technology to identify images connected to child sex trafficking and take action to address it.

What does all this mean for parents?

Parents may want to use PimEyes to search for images of their own family to see where they appear online and whether they are connected to identifying information. Unfortunately, viewing the URLs where the images are hosted requires paying for a premium account.

But most importantly, parents need to be thoughtful about what they and their children post online. These need to become ongoing, age-appropriate discussions with kids so that they understand the risks, the reasons for parental limits, and whom to go to if something goes wrong.

Read the full Intercept report: Facial Recognition Search Engine Pulls Up Potentially Explicit Photos of Kids