The Online Safety Toolkit: The TikTok Edition – Parental Settings

TikTok’s meteoric rise to become one of the most downloaded mobile apps ever has left many parents concerned and feeling unprepared to guide their kids toward safe and healthy usage. As is the norm, the app has also been slow to prioritize child safety. Thankfully, there are some new features and resources that begin closing safety loopholes.

The New Safety Center:

TikTok’s safety resources and guides live within its Safety Center, where you can find valuable guides on issues ranging from misinformation to bullying to our focus: preventing child sexual abuse.

The Preventing child sexual abuse on TikTok page provides links and steps for reporting child sexual abuse material (CSAM), as well as links for those who have seen or sought out such material.

The Guardian’s Guide is a good starting point for parents and caregivers to understand the risks and how “Family Pairing” works.

The Specifics of Safety Settings

Most important, though, are the Safety and Privacy Controls, which fall into three types:

Account Settings cover privacy, public visibility, and discoverability. Accounts for users under 16 default to private, while accounts for users 16 and over default to public. Account settings also include:

  • Discoverability: Decide whether an account is public or private and whether it will be suggested to other users*
  • Personalized ads
  • Download your data
  • Family Pairing

Community Controls are the core of the safety settings, allowing you to:

  • Set who can view a video
  • Set whether anyone, friends only, or no one can send direct messages*
  • Set who can comment on content (under age 16 can have friends or no one; over 16 can have friends, no one, or everyone able to comment)*
  • Apply comment filters through automatic detection or setting specific keywords
  • Block an account
  • Set whether people can see what videos a user has liked*
  • Set whether your videos can be downloaded
  • Set whether others can use “Stitch” and “Duet” features, which are only available to users over 16.

Content Controls determine what can be viewed and accessed, such as:

  • Restricted Mode to limit exposure to content that may be inappropriate*
  • Screen Time Management to limit usage and require a code to override*
  • Search function – determine whether or not a user can search for videos*

*These settings can be determined by a parent using “Family Pairing”

NOTE: Family Pairing only applies to the TikTok app; if a teen uses the desktop or mobile website, only settings applied to those browsers or devices will apply. As with many apps, this is a significant loophole.

For a convenient video overview of setting up Family Pairing, watch below. Or visit this page for written step-by-step instructions.

TikTok Live Streaming Age Changes

TikTok’s interface and design changes have far-reaching impact. In addition to new safety tools, TikTok will soon take steps toward safer live-streaming.

In a move to address existing risks to children, TikTok is making two big changes to its live streaming (LIVE) feature, which has seen growing popularity.

  • Current rule: Livestreaming is restricted to users with 1,000 followers or more and who are 16 and older.
  • New rule beginning Nov. 23rd: Livestreaming is restricted to only users 18 and older.
  • Adults-Only content: TikTok will allow creators to tag content that is better suited for viewers 18 and up. According to the announcement, adults-only content will still not allow nudity, pornography, or any content that violates the existing rules. The adults-only distinction is intended for comedy or other content that may not be appropriate for minors but is not sexually explicit or violent (e.g., adult humor).

Many safety features hinge on accounts being set to the accurate age. Making sure your kids’ devices and apps reflect their accurate ages is crucial to utilizing the safety features that do exist, even though accurate ages alone are not fully sufficient.

The Online Safety Toolkit: “Behind Their Screens” and The Evolution of Online Safety

This month, we’re zooming out to share one of the most important and recent developments in online safety.

In the new book Behind Their Screens: What Teens Are Facing and Adults Are Missing, authors and researchers Emily Weinstein and Carrie James share the findings of over 10 years of studying youth online, including bringing youth into the research process itself. (Official book website) The result is a rich and nuanced picture of how we might more effectively accomplish safety goals (many of which are more aligned with kids’ own goals than we imagine).

Any parent or professional who works with kids will find value in this book, but we also want to share some takeaways as they pertain to child sexual abuse prevention:

There is no one type of internet usage:

Apps that are used for risky behaviors are also used for positive ones. This is easy to see in an example like YouTube, where kids can learn and grow but can also potentially be exploited. Even apps like Discord, a messaging service that is too often used to exchange child sexual abuse material (CSAM), have diverse uses. Teens have used private Discord chats to support each other academically and emotionally, especially during the pandemic.

Kids are not unaware or unconcerned:

When we approach kids as adversaries in the mission to keep them safe, we fail to recognize all the ways kids already do think and worry about their safety and online wellbeing. From screen time, to predators, to leaked photos, and much more, some kids think about and navigate these risks daily. Knowing that can make them collaborators in their safety.

There are different motives and outcomes to sexting:

While all sexting is risky, it is not all equally so, and it does not always lead to feelings of harm, even among adults reflecting in hindsight. The same spectrum of motives that leads adults to engage in this risky behavior also motivates some kids. When sexting has been experienced as positive, worst-case messages and stories may detract from adults’ credibility or influence.

The details are essential:

Emotionally and legally, the specifics of an online exchange of explicit images matter. Whether images were consensual, coerced, or shared without permission; whether a child was the sender, recipient, or forwarder: the risks, outcomes, and best actions vary depending on each individual situation.

We are under-equipping kids for their circumstances:

Because sweeping messages and warnings have not fully spoken to kids’ more nuanced experiences, they are left to solve specific problems alone. The result is, for example, TikTok videos explaining steps to take to secretly embed the name of the person receiving a nude image, in case the sender needs to prove someone forwarded the image in the future. While we wish many of these situations did not exist, it is still in everyone’s best interest if adults and kids come together to tackle them.

NoFiltr: Resources for Kids, Informed by Kids

NoFiltr is a project started by Thorn, a nonprofit on the cutting edge of online abuse prevention messaging and technology. Thorn has comprehensive discussion guides and resources for parents, and through NoFiltr, it now has resources for kids.

Try sending your teen a link to NoFiltr’s TikTok page, YouTube channel, or website.

Resources include quizzes on how to spot different types of manipulative behavior, how to respond when someone’s explicit images are shared, and more.

No matter what, sometimes things go wrong or crises arise. Kids do not always want to bring these problems to parents, so it’s essential they know of multiple forms of support. NoFiltr’s text support line is a great one to keep in mind: text NOFILTR to 741741 for immediate assistance.

At No More Stolen Childhoods, we’ve been learning how essential it is to bring young people further into the conversations about their safety. NoFiltr is doing just that via feedback forms and by establishing a Youth Innovation Council for young people around the world who recognize a need to help other kids stay safer online.

Also connected to Thorn is another important project for parents and kids to be aware of: stopsextortion.com. There you can find resources, information, and support around the growing phenomenon of sextortion scams.

Concerning Apps – Amazon’s Encrypted Messaging Service: Wickr Me

A Hub of Abuse Material:
More and more often, apps are adopting end-to-end encryption that makes illegal activity, like the exchange of child sexual abuse material (CSAM), hard to monitor, report, and prosecute. One of the larger apps known to be a hub of CSAM is Wickr Me (usually just referred to as Wickr), which began in 2015 and was bought by Amazon in 2021.

There is no good reason for a kid – and rarely for an adult – to communicate using Wickr. Because it is a closed communication system, people generally exchange Wickr usernames in more public online settings before moving to private communication within Wickr. This move from public settings (games, YouTube comments, a subreddit thread) to private ones (Kik, Telegram, Wickr) is a key step in the sequence of manipulation and child sexual abuse that plays out online.

Settings and Blocking Access:
There are no safety settings within Wickr and no way for a parent to monitor any activity. If a kid is using Wickr, it warrants a serious discussion: why they use it, what they are doing and finding valuable, and, if their needs *are* appropriate, what other apps and approaches might fill those needs (like WhatsApp).

Because Wickr is listed in the Google Play Store as “E for Everyone” and in the Apple App Store as “12+,” age settings on devices are insufficient to block access. Parents wanting to prevent Wickr need to use parental settings to block the app individually. Check out our Guidesheet for more info and links to step-by-step guides.

Privacy and Safety:
While apps like WhatsApp are also encrypted, many require identifying information when an account is created. Wickr does not, creating an anonymous environment more conducive to criminal behavior. And while there is valid tension between maintaining digital privacy and ensuring child protection, WhatsApp greatly increased its reports of child abuse material when it began monitoring for unencrypted signs of abuse, such as usernames and profile photos. So far, Wickr seems disinclined to take proactive steps on child safety, despite the web containing thousands upon thousands of coded references to child sexual abuse material paired with Wickr usernames, appearing in places like Tumblr, Reddit, and Twitter.

The Online Safety Toolkit: iPhone’s New Safety Check Feature

Broadly speaking, Safety Check has two main uses: Emergency Reset and Managing Sharing and Access. We’ll explain both:

When to Consider an Emergency Reset:

Emergency Reset refers only to privacy and security; it is not a reset of your phone or its general settings.

If someone (child or adult) is in danger or at risk of harm from a person with whom they have shared data such as their location, photos, calendars, or contacts, an Emergency Reset cuts those virtual ties so that the other person can no longer access those shared items. It is important to note that the other person may become aware that their access has been removed. Emergency Reset also lets you review and edit your Apple ID security if there is concern that someone could still access the account directly.

For teens, this feature may be relevant if there has been a falling out with a friend or significant other and there are concerns about what that person might do with shared photos or knowledge of someone’s location.

To Do an Emergency Reset:

  • Go to Settings > Privacy & Security > Safety Check.
  • Tap Emergency Reset, then follow the onscreen instructions.
    Progress is saved as you go.

To read more about Emergency Reset: https://support.apple.com/guide/personal-safety/stop-sharing-with-people-and-apps-ips16ea6f2fe/1.0/web/1.0#ips671ae37be

For a full list of apps and information that can be unshared using Safety Check: https://support.apple.com/guide/personal-safety/how-safety-check-works-ips2aad835e1/1.0/web/1.0

When to Consider a Safety Check:

When you want to review shared data settings in more detail, use Manage Sharing & Access to review and reset information you’re sharing with people, review and reset the information that apps have access to, and update your device and Apple ID security.

  • Go to Settings > Privacy & Security > Safety Check.
  • Tap Manage Sharing & Access.
    Progress is saved as you go.
  • Do one of the following to stop sharing information with other people:
    • Tap People, select people in the list, review the information shared with people, then decide which information you want to stop sharing with selected people.
    • Tap Information, select apps in the list, review the information shared with people, then decide which information you want to stop sharing with selected people.
  • Do one of the following to stop sharing information with other apps:
    • Tap Apps, select apps in the list, review the information shared with them, then decide which information you want to stop sharing with the selected apps.
    • Tap Information, select the information being shared in the list, review the information shared with apps, then decide which information you want to stop sharing with the selected apps.
  • Tap Continue, then do any of the following:
    • Review and remove devices signed into your account.
    • Review and update trusted phone numbers.
    • Change your Apple ID password.
    • Update your emergency contacts.
    • Update your device passcode, or your Face ID or Touch ID information.
  • Click Done.

When you’ve finished, confirm that you stopped the intended sharing and reset the intended settings. See Apple’s Verify you’ve stopped sharing.

Full page guide: https://support.apple.com/guide/personal-safety/stop-sharing-with-people-and-apps-ips16ea6f2fe/1.0/web/1.0#ips54d90c122

Abuse via Instagram: The Jenkins Family’s Experience

In this important 6.5-minute video, the Jenkins family shares their experience of manipulation and child sexual abuse carried out via Instagram on their daughter’s tablet.

<iframe width="560" height="315" src="https://www.youtube.com/embed/bzhhzD51p6M" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>

The Online Safety Toolkit: New Supervision Tools in Instagram and Snapchat

Before diving into the specific features rolled out by Instagram and Snapchat, it’s important to understand the basic distinction between parental supervision features and parental settings features.

App Supervision vs Settings

The new tools in Snapchat and Instagram only offer supervision. That means they allow some degree of monitoring activity, but they do not offer tools to limit or determine how kids interact on their platforms.

The most important, and unfortunate, thing to know about the new tools in both Instagram and Snapchat is that they only work under specific conditions:

  • For each app, a parent must make their own account, forcing them into the platform ecosystem.
  • For the tools to work, parents have to link to their child’s account, and children have to agree to the link. Children can also turn supervision off without parental permission.

Because these settings cannot be locked, they are most effective in conjunction with app-restriction settings on the devices themselves. A parent might then allow Snapchat or Instagram on the condition that parental supervision remains on. If a child turns it off, the consequence could be losing access to that app. If device settings are properly set up, a parent can remotely prevent their child from keeping or downloading an entire app. (Download our Guide to settings in devices, apps, games, and streaming services.)

Overview of Features:

Snapchat’s new Family Center:

Snapchat Family Center introduces the following features:

  • See who a child has sent messages, images, or videos to in the past 7 days. The content of communication remains private.
  • See a complete list of a kid’s existing friends. In the coming weeks, Snapchat says it will add a feature to easily view newly added friends.
  • Easily and confidentially report any concerning accounts to a 24/7 Trust and Safety team to investigate.
  • Access key explainers about how to use these tools, resources for important conversation starters, and additional tips.
  • Kids who have opted in to Family Center are able to see what their parents see, with a mirrored view of features.

Click here for a more detailed discussion and instruction video for Snapchat Family Center features.

Instagram’s new Family Center:

Instagram Family Center introduces the following features (note, if Instagram is used across multiple devices, these settings will apply to them all):

  • Set a time limit for use each day.
  • Set scheduled breaks that limit a teen’s use during select days and hours.
  • See how much time is spent on the app across all devices.
  • See how much time is spent on the app each day on average, across all devices.
  • See how much time was spent on the app each specific day for the last week.
  • See which accounts a child is following.
  • See which accounts are following a child.
  • Kids will be able to see a preview of what their parents see while supervising, and can notify parents after reporting something on Instagram.

Click here for a more detailed discussion of Instagram Family Center features.