In 10 minutes on TikTok I saw self-harm, girls offering sex, boys wielding knives and potentially deadly challenges

IT’S the app that's taking school playgrounds across Britain by storm, with more than a billion downloads worldwide.

And at first glance TikTok looks harmless enough, thrilling fans with a flurry of cute animal videos and clips of their favourite celebs.


But the seemingly innocent video-sharing platform is hiding a much more sinister side.

Lurking just a few swipes away, a steady stream of drugs, predatory messages and animal cruelty is shockingly easy to find.

That's why Sun Online is today launching its TikTok Time Bomb series to make sure parents are aware of the risks their kids are being exposed to, and what they can do to better protect them.

We also want TikTok to better moderate its content so that it's not being left to kids to protect themselves online.

This week, our campaign will shine a light on the dangerous side of the app and how its lax security and moderation have allowed it to become a magnet for paedophiles, profanity, crime, violence and extremism.


We've seen kids as young as eight being groomed on TikTok, while other creeps take advantage of young girls posting increasingly sexualised content.

Disturbingly, more than a quarter of parents admit they are clueless such content even exists.

Many are allowing kids, some as young as THREE, to swipe through video after video, while a quick straw poll on Facebook reveals that the lower age limit of 13 is being completely ignored.

The app has already been blamed for several deaths and there is concern within the tech industry about how easy it is for TikTok users to lie about their age.

Ray Walsh, a digital privacy expert at ProPrivacy, tells Sun Online: "TikTok uses AI to serve users content depending on their previous likes and this can lead the algorithm to serve suggestive, abusive, violent, or sexually explicit content."

Indeed, within 10 minutes of signing up to find out what it's all about, I was watching content I wouldn’t show to my 18-year-old son, let alone a primary school child.

Abuse in seconds

The minimum age is meant to be 13 but when I signed up to the app there was no age verification process, so it was easy to submit a false date of birth.

As soon as I clicked into the app, a video started on autoplay – in this case a girl dancing – and from there the user can like, share and swipe up for more content.

As I swiped on my first visit, the fourth video I came across was a gender fluid teen discussing their sexuality – before it ended with an older woman shouting “shut up you c***”.

The same vile rant cropped up in numerous videos – including one aimed at 17-year-old eco-campaigner Greta Thunberg.

But that was just the tip of the iceberg.


TikTok time bomb

TikTok has spread like digital wildfire, snapping up over 1.5 billion users since its global launch three years ago — including millions in the UK. 

On the surface, the world's fastest-growing social media platform shows short clips of users lip-syncing to songs or showing off dance moves, but there's a far more sinister side.

It’s become a magnet for paedophiles as well as a hotbed for violent and extremist content, with TikTok predators exploiting the platform's young user base and lax security to prey on the vulnerable.

We've seen kids as young as eight being groomed on TikTok, while other creeps take advantage of young girls posting sexualised content of themselves on the platform.

And that's especially worrying on a site which is attracting millions more children every year, with 53 per cent of kids now owning a smartphone by the age of seven.

That's why we launched our TikTok Time Bomb series — to make sure parents are aware of the risks their kids are being exposed to, and what they can do to better protect them. 

Everyone agrees social media can be a force for good, but it has to be done the right way and with proper controls in place.

We want TikTok to better moderate its content so that it's not being left to kids to protect themselves online. 

Dancing 12-year-olds sing about sex

Another worrying theme is the over-sexualisation of many girls featured – not to mention the anonymous paedos lurking on the site.

In one video, two girls who looked around 12 sang a lewd song about men “trying to get their d*** in my pants".

Another saw a teenager sending the message “Do you want head?” to everyone in her address book, while a different clip had boys demonstrating which “fingers they use” in a sex act.

There were numerous references to child abuse too, including one with a lad walking into a bedroom with the caption: “Me when I get bored of my girl and remember her little brother."

A sinister voiceover then laughed, “You thought I forgot about you, little boy,” as he reached for his flies.

Meanwhile, another made a tasteless gag about “Madeleine McCann’s kidnapper”.


Self-harm and boys wielding knives

A further dig down the video timeline revealed a clip by school pupils joking about saving “the new kid from jumping” to his death.

And shockingly, just a swipe away showed a girl aged around seven or eight revealing self-harm cuts on her arms.

Teenagers brandishing knives were commonplace too, with one clip captioned “knife training” showing a lad throwing a 10-inch carving knife around a kitchen.

In another, a balaclava-clad lad talked about Iran winning World War 3 and stabbed at the screen with a long blade.


Minutes later, I swiped through to a video with the hashtag 'Spiceworld', which showed two men driving past a zombified homeless woman – apparently on drugs – before cruelly shouting to make her jump.

Another, with 142,000 likes, showed a man agitated and shaking, high on drugs, and a third showed another spaced-out addict in Nottingham.

There were also numerous "comic" references to how the world looks on the drug ketamine, while videos about "my first joint" and a girl asking pals "shall we get a bag" – a reference to cocaine – also cropped up.

Risking lives for TikTok fame

The app is a hotbed for viral challenges – ranging from the fun to the outright dangerous.

Before long I stumbled across footage of two girls dancing in the middle of a busy dual carriageway in a bid to go viral.

Horrifyingly, the caption read: “We got hit by a car doing this so make us famous.”

Numerous other “challenges” can be found with ease, including one which sees girls inviting schoolmates to “Kiss or slap” them.

“The real appeal of TikTok for youngsters lies in its 'gamification'," says Walsh. "The app encourages users to not just comment or like content but to respond with their own content.

"This can encourage young users to behave in ways which they otherwise wouldn’t in order to record it for the app.

"A current trend encourages young children to swear in front of their parents, and record their reactions.

"Other with potentially more serious consequences could spring up at any moment."

Take control of TikTok – change these settings now

Parents should do the following immediately…

Go private:

  • Head into Settings > Privacy and Safety and look for the Discoverability heading at the top.
  • Under that you'll see a setting called Private Account. Toggle this on.
  • By default, TikTok recommends your page to lots of other users to improve video circulation.
  • With Private Account switched on, your profile will no longer be recommended to other users.

Shut out weirdos:

  • In Privacy and Safety > Safety, you can prevent other users from interacting with you.
  • Most of the settings are on Everyone by default, but can be changed to Friends or Off.
  • You can control who can comment on your videos, who can Duet or React to them, who can see which videos you've liked, and who can send you messages.

Restricted Mode ON:

  • Restricted Mode tries to limit age-inappropriate content from appearing for children.
  • It's not perfect – it relies on automated scanning systems, so some dodgy content will inevitably slip through.
  • It's also possible to set a passcode to prevent your child from changing this setting later on.
  • You'll find this in Settings > Digital Wellbeing > Screen Time Management.

Smacking dogs

One such trend called “#Clap your hands” proves particularly alarming.

This sees pet owners slapping their clearly distressed dogs to the tune of ‘If You’re Happy and You Know It’.

While I saw one dog snarl as it was slapped on the bottom, a tiny puppy was also held in a man's arms while being repeatedly smacked in the face.

So many of the videos I've come across make for distressing viewing, and I'm horrified to think that innocent kids could watch them so easily.

"The best way to monitor your child's activity on TikTok, is simply to join the app yourself," says Walsh.

"Becoming one of their chosen friends meaning you can see what content they upload and who else they are interacting with."

A spokesperson from TikTok, which last week announced a new feature to allow parents to control what their kids view, says: "Promoting a positive and safe app environment for our users is a top priority for TikTok.

"We use both technologies and human moderation teams to identify, review and remove dangerous or abusive content. We have investigated every individual case that has been raised and removed all content that violates our Community Guidelines.

"We have a number of protective measures in place to reduce the opportunity for misuse and we're constantly evolving our measures to further strengthen safety on TikTok.

"While our protections won't catch every instance of inappropriate content, we continue to rapidly expand our content moderation teams and improve our technologies and policies so that TikTok can remain a place for positive creative expression."
