LONDON:
Snapchat is removing only a few dozen children in the UK from its platform each month, compared with the tens of thousands blocked by rival TikTok, according to internal data the companies shared with Britain’s media regulator Ofcom and seen by Reuters.
Social media platforms such as Meta’s Instagram, ByteDance’s TikTok, and Snap Inc.’s Snapchat require users to be at least 13 years old. These restrictions are intended to protect the privacy and safety of minors.
Ahead of Britain’s planned Online Safety Bill, aimed at protecting social media users from harmful content such as child sexual abuse material, Ofcom asked TikTok and Snapchat how many suspected under-13 users they had removed from their platforms over the course of a year.
According to the data seen by Reuters, TikTok told Ofcom that between April 2021 and April 2022 it blocked an average of around 180,000 suspected underage accounts in the UK every month, about 2 million over that 12-month period.
Over the same period, Snapchat disclosed that it had removed approximately 60 accounts per month, or only around 700 in total.
A Snap spokesperson told Reuters the figures misrepresented the scale of the work the company does to keep under-13s off its platform. The spokesperson declined to provide additional context or to detail the specific measures the company takes.
“We take these obligations very seriously and every month in the UK we block and delete tens of thousands of attempts from minors to create a Snapchat account,” said a Snap spokesperson.
Ofcom’s own research suggests the two platforms are similarly popular with underage users. Children are also more likely to set up their own Snapchat account, rather than use a parent’s, compared with TikTok.
“There is no way that Snapchat is blocking only a fraction of the number of children that TikTok is,” said a source inside Snapchat, speaking on condition of anonymity.
Snapchat blocks users from signing up with a birth date that puts them under the age of 13. Reuters could not establish what measures the company has in place to remove underage users once they have gained access, and the spokesperson did not elaborate.
Ofcom told Reuters that reviewing the steps video sharing companies are taking to protect children online remains a key area of focus, and that the regulator, which works independently of the government, will report its findings later this year.
Currently, social media companies are responsible for setting age limits on their platforms. However, under the long-awaited Online Safety Bill, they will be required by law to enforce those limits, and to show how they are doing so, for example through age-verification technology.
Companies that fail to uphold their terms of service could be fined up to 10% of their annual turnover.
In 2022, Ofcom research found that 60% of children aged between eight and eleven had at least one social media account, often created by giving a false date of birth. The regulator also found Snapchat to be the most popular app among underage social media users.
RISKS FOR YOUNG CHILDREN
Social media poses serious risks to young children, child safety advocates say.
According to recent figures published by the NSPCC (National Society for the Prevention of Cruelty to Children), Snapchat accounted for 43% of recorded cases in which social media was used to distribute indecent images of children.
Richard Collard, assistant head of child online safety at the NSPCC, said it was “very alarming” that Snapchat appeared to be removing so few underage users.
Snapchat “needs to take stronger action to ensure that young children are not using the platform, and that older children are protected from harm,” he said.
The United Kingdom, like the European Union and other countries, is seeking ways to protect social media users, especially children, from harmful content without damaging free speech.
Enforcing age restrictions is expected to be a key part of the Online Safety Bill, along with requiring companies to remove content that is illegal or prohibited under their terms of service.
A TikTok spokesperson said its figures spoke to the strength of the company’s efforts to remove suspected underage users.
“TikTok is a 13+ platform and we have measures in place to enforce our minimum age requirements, both at the point of sign-up and by continuously removing suspected under-age users from our platform,” they said.