
Do you post videos of your children dancing on social media? You might want to read this disturbing investigation before you hit upload next time

Are fun videos of dancing tweens giving us a blind spot to the dangers?

TikTok and Instagram Reels are the social media platforms known for the light-hearted dancing and memes that have brought an element of seemingly harmless fun into the dark days of Covid for many of us. But, as Anne-Marie Tomchak reports, are we nonchalantly uploading videos of our children having fun while ignoring the darker child safety issues these platforms present?

When 35-year-old Krystal* in Somerset opened an Instagram account in 2015 to document her then 6-year-old daughter Edie’s journey into dance and gymnastics, she had no idea of the horrors that would transpire. The account quickly grew to 17,000 followers and Krystal’s daughter, now 11, excelled in her sport to become an elite gymnast. Things seemed to be progressing nicely.

“I was quite naive at the time,” admits Krystal. “I didn’t know that when you set it up it’s immediately open. It’s not private. You have to go into the settings. And when notifications came in I couldn’t see all the individual likes and comments because of the size of the account.” About a year ago, a friend suggested that the account should be private because of child safety concerns, so Krystal changed the settings and removed any followers who seemed ‘dodgy’. But it was already too late. In June 2020 she found out that an image of her daughter had been published on a Russian porn website. “It was a photo of Edie in a pink leotard taken from the days before the Instagram account was private,” says Krystal. “The comments on the photo were shocking and I recognised images of other kids from the dance world on there too.”


Krystal took screenshots and immediately alerted the police about the stolen images. Starting that night, she spent 24 hours removing every single one of the 17,000 followers from her daughter’s page. “I was hysterical and in tears. I literally just sat there tapping, tapping, tapping. It needed to be done,” she says. “If people really want to follow they can send another request.” Sure enough, the requests started coming back in. Krystal’s new policy was to vet every single account before accepting it. She also decided to check her follower list regularly, closely inspecting each account’s profile photo and posts, and looking at who else it follows. “Some accounts are initially set up to look legit and a part of the gymnast community, but they are really lurkers who just watch and don’t post,” adds Krystal. “You need to keep checking them. I’m constantly reporting and blocking.”

What happened to Krystal and her daughter is something that parents have become increasingly aware of, and that experts and activists are vociferously highlighting. The platforms are beginning to take notice. Two months after a Glamour UK investigation revealed that child predators were harvesting images of kids on Instagram, the platform introduced a new child safety feature that allows users to report content involving a child. Reported accounts are flagged as a priority to a child safety specialist. Previously, this reporting option existed only within the nudity category. A spokesperson for Facebook (which owns Instagram) told us: “Now, we're offering a new option for people to report entire accounts that might endanger children.” The spokesperson added: “We prohibit content and accounts that put young people in danger. We use artificial intelligence technology and reports from our community to keep child exploitation off Instagram.”

India*, an executive assistant in London in her 30s, is one of the people who has been actively lobbying for these changes. She runs the @pd_protect Instagram page, which reports and spotlights alarming content. In the space of four weeks she reported 7,000 accounts to Instagram, and she says the platform has really listened: “I initially had a handwritten list of 600 accounts which I reported. Instagram investigated and removed every single one of those. I’m trying to work with them rather than against them,” she says. Having had some success with Instagram, India is now turning her attention to TikTok. She’s keen to emphasise that she’s not trying to shame parents, but says they need to be more aware. “There are parents of child gymnasts and dancers who are doing the splits and dressed in leotards. Many of them are focused on followers and sponsorship deals. If they limit the number of followers, their sponsorship deals are at risk,” explains India. “The nature of TikTok is to encourage girls to dance. You can have a private profile but it won’t get as many followers or likes that way. The idea of a following or money being more important than child safety is a subject that makes people uncomfortable.”


As the mum of a child gymnast, Krystal is only too aware of the pressure to build a following. “I’ve placed a big emphasis on reassuring my daughter not to be fooled by big followings. If you have five followers that’s ok,” she says. Krystal is critical of dance companies that look to partner specifically with child dancers who have large followings. “They are not looking at the quality of the followers,” she says, before adding that “loads of mums we know are in denial and turning a blind eye because of all the free stuff (like leotards, dance shoes and other products and merchandise) that comes with having a big account.” Like India, Krystal is now also turning her attention to TikTok, particularly given the popularity of dancing on the platform and the younger demographic using it. “It took 6 months before I agreed to let my daughter use TikTok. Some people post some really funny videos on there and some are really creative. But I took it off her phone recently as I didn’t want her watching a particularly upsetting video.”

The video that Krystal is referring to was a livestream, originally posted on Facebook, showing a man dying by suicide. It went viral and was reposted across other social media platforms, including TikTok, where it ended up being promoted in the ‘For You’ feed and was seen by thousands of unsuspecting viewers, including kids who were left traumatised. The ‘For You’ feed is similar to Instagram’s Explore tab: you see content from creators you don’t necessarily follow, because the algorithm has learned from your activity and serves up popular content it predicts you’ll like.

That video served as a lesson in how harmful content can end up being seen by children even when they’re not actively seeking it out. But for anyone who has been keeping across child safety online, it is nothing new. YouTube has been grappling with the manipulation of its algorithms for years. Violent content has been served alongside Peppa Pig and Paw Patrol videos, for example, encountered unintentionally through the search function or simply through autoplay, where one video plays directly after another. I’ve witnessed this first hand, when my nieces (then aged three and four) sat on the living room sofa glued to an iPhone watching My Little Pony on YouTube. The next video that came up initially looked and sounded the same. But a few minutes in, the voiceovers changed to a kinky, babyish tone. At this point I grabbed the phone off my nieces before they saw what happened next: the ponies started violently stabbing one another. Although they didn’t see the blood and gore that followed, these innocent little girls were still shaken by the experience, which would have been worse if an adult hadn’t been in the room to monitor their viewing.

Why would anyone target children via kids’ content in this way? The most obvious reason is simple: money. The ad revenue generated by video content can be sizeable, and given that nursery rhymes and other kids’ content are among the most watched things on YouTube, it doesn’t take a genius to figure it out. But that doesn’t explain how disturbing videos have still managed to creep onto YouTube Kids, which is supposed to filter out content that isn’t suitable for children.

In January 2020, YouTube made changes to how kids’ content is published on the platform, with big financial implications. Creators are now required to label their videos as ‘made for kids’, which means a range of features, including targeted ads and comments, is no longer available on them. These changes were lamented by some creators who knew their income would take a hit, but they were devised after the company was fined almost $200 million for breaking child privacy laws in the US. Here in the UK, a legal case against Google (which owns YouTube) has just begun over allegedly breaching the privacy of under-13s by collecting data without parental consent.

“YouTube has effectively admitted that some of their systems have not adequately protected children,” says online safety expert John Carr OBE, one of the world's leading authorities on children's and young people's use of digital technologies. He’s currently carrying out research on TikTok, which he describes as a magnet for paedophiles. “YouTube is massive and dominant,” he says, “but TikTok is coming.”

“I don’t think TikTok is a safe space for kids. The truth is it is completely inappropriate for children,” says Carr, before describing some of the age-inappropriate content he has encountered on the app while registered as a 13-year-old for research purposes. “Parents think TikTok is about crazy dances. In the real world we have all sorts of things that kids are not meant to see. But on TikTok there are videos showing three teenage girls talking about fucking each other’s boyfriends, or a mother describing her daughter as a little c*** who she should’ve strangled at birth.” Carr believes that TikTok needs to vastly improve its age verification process. “There are some excellent programmes available,” he says, “but the companies are not afraid of the regulators.”

TikTok has a range of features to protect young users, such as family pairing (which allows parents to link to a child’s account), a restricted mode (which hides potentially inappropriate content and limits messaging on the app) and a function to manage screen time. A TikTok spokesperson told Glamour: “Our Terms of Service make clear that users must be at least 13 years or older to use TikTok, and it has a 12+ App Store rating so parents can simply block it from their child’s phone using device-based controls. If we learn that someone under 13 is using the platform, we disable their account permanently.” Earlier this year TikTok also limited direct messages to accounts registered to users aged 16 or over. But the reality is that its user base is much younger than the age limit of 13: according to one study, almost half of British children use it.

John Carr’s point about age verification on TikTok still stands: it falls short. There is no proof of age required; all you need is an email address or mobile number. I registered as a 13-year-old girl on the app and within minutes it was serving me weight-loss content. Glamour asked TikTok if its age verification was robust enough. A TikTok spokesperson said: “We know the industry as a whole needs to do more work on age verification, and we are committed to working with peers, regulators and key stakeholders to find an industry-wide solution to ensure that only those who meet minimum age requirements use platforms like ours. Keeping our users safe is our top priority.”

The prominence of dancing on TikTok, or more specifically the type of dancing that’s part of TikTok culture, is something that parents like Krystal are wary of. “I watched the movie Cuties on Netflix recently and, as much as it disgusted me, it also reminded me of what you’d see on TikTok. The dancing is very similar, with all of these young girls grinding.” Cuties has attracted criticism, and even led some people to cancel their Netflix subscriptions, over its sexualised depiction of young girls and its stereotypes of black bodies and Muslim women.

A poster for the film showing the scantily clad tween protagonists in provocative poses was widely criticised and led Netflix to issue an apology. The film is a coming-of-age story about an 11-year-old from a conservative Senegalese family who joins a sassy dance troupe to rebel against her background. In an op-ed for the Washington Post, the film’s director and writer Maïmouna Doucouré (who is herself French-Senegalese) said that Cuties was a story about modern girlhood and the confusion that young girls experience during puberty in the digital age. She was inspired to write the script after speaking to a group of 11-year-old dancers at a community event in Paris who told her that “the sexier a woman is on Instagram or TikTok, the more likes she gets.” “They tried to imitate that sexuality in the belief that it would make them more popular,” wrote Doucouré. “They construct their self-esteem based on social media likes and the number of followers they have.”

In the world of ‘cuties’ on Instagram and TikTok, the list of things to protect kids from can seem overwhelming. Safeguarding children from grooming, cyberbullying, commercial exploitation and violent content is no small task. But there are lots of resources available online, from guides on what parents need to know about TikTok to blogs from activists like India with general advice on keeping children safe online. There is also child protection software that can be installed on children’s devices, such as SafeToNet, an app that uses language processing and artificial intelligence to detect warning signs in language or behaviour online. Parents receive an alert if anything in their child’s phone activity raises a red flag, but they cannot read the messages themselves, so the child’s privacy is not infringed upon. Founder Sharon Pursey says the app can help guide a child and disrupt potentially harmful conversations; the alert provides an extra layer of support so that parents can initiate a dialogue with their child. The company has also released a set of steps parents can follow to keep their kids safe online, such as making sure camera phones are not used in the bedroom or bathroom. “Children are so savvy online,” continues Pursey. “We’ve had cases where kids as young as six are sexting as they learn from their older siblings who are watching porn.”

Amid all the uncertainty about what exactly is going on when a child uses a phone during lockdown, one thing is clear: dialogue between parents and children is imperative. Cutting kids off from spaces like TikTok is not necessarily the answer. Supervision and building knowledge and trust over time (depending on the age of the child) are key parts of the process. Technology and the content consumed online are influencing children in so many ways, informing how they see themselves and the world around them. But that influence is not without risks, and there is no way of sugar-coating what those risks are. Perhaps John Carr said it best: “Families and children are spending more time online during lockdown, parents are busy working from home and this has presented predators with a golden opportunity. Everything is happening more. The pandemic has put the child protection issue on steroids.”

*Some interviewee details have been redacted.

This originally appeared on GLAMOUR UK | Anne-Marie Tomchak
