TikTok, Twitter and Instagram called dangerous for children in the UK

The British organization 5Rights, which specializes in protecting children's digital rights, has filed a complaint with the Information Commissioner's Office (ICO), accusing dozens of large technology companies of systematically exposing children to danger online and violating the UK's recently enacted Children's Code.
The complaint was filed by Baroness Beeban Kidron, chair of 5Rights and a member of the House of Lords, who originally proposed the Code.
The charity conducted its own investigation and leveled accusations against dozens of services, including popular platforms such as TikTok, Snap, Twitter and Instagram, as well as lesser-known ones such as Omegle, Monkey and Kik.
The document states that the mobile apps released by these platforms use unfair design choices and incentives that nudge children into sharing their location and receiving personalized advertising based on that data. According to the authors of the study, potentially dangerous material is published on these platforms, including content about eating disorders, self-harm and suicide.
The apps also fail to check users' ages carefully enough before granting access to age-restricted features, such as video chats with strangers. In addition, gaming apps were found to share large amounts of data with third parties, from advertising companies such as Google to food delivery services such as Grubhub and Uber, and social platforms from Pinterest to Facebook.
The UK's Age Appropriate Design Code (also known as the Children's Code) came into force at the beginning of September, after a one-year grace period during which companies were expected to prepare. By some accounts, it is considered groundbreaking. Violations of the Code carry the same consequences as violations of the European General Data Protection Regulation (GDPR), including a fine of up to 4% of the offending company's global turnover.
Members of the US Congress have called on American tech giants to voluntarily comply with the Code's requirements with respect to American children. As a result, the largest social platforms, including YouTube, Instagram and TikTok, have made some changes to their services, but many problems remain. The complaint describes in detail the alleged violations on 102 platforms, which, the authors argue, points to their systemic nature.

For research purposes, the authors registered Android and iOS devices as belonging to children aged 8, 13 and 15. Using an iCloud account registered to a 15-year-old, they were able to download 16 dating apps with an 18+ age rating from Apple's App Store, including Tinder, Happn, Find Me A Freak and Bumble; pressing an "OK" button was enough to confirm that the user had reached the required age.
Dozens of apps, such as Omegle, allow users to communicate with strangers by text or video. These apps request a date of birth to confirm that the user is at least 13, and parental consent is required for anyone under 18, yet there is no mechanism to verify that a parent has actually given that consent. One teenager told the study's authors that as a child he spent a lot of time on Omegle, often talking with adults, and encountered explicit content on the platform.
The investigation also found that recommendation algorithms surface harmful material or endanger children's safety, for example by suggesting unfamiliar adults as friends. Twitter's search makes it easy to find content for dangerous queries such as "self-mutilation". Underage users have unrestricted access to the results of such searches and to the hashtags associated with them. These results include images and even instructions that minors should not be able to view, which is not only prohibited by British law but also contradicts the platform's own rules.
Finally, according to the 5Rights investigation, the Monkey video chat app uses meme-style pop-ups to encourage users to share their location, which is then used to find other users in the same area, including unfamiliar adults. Snapchat makes the accounts of users under 16 private on Snap Maps by default, but the study's authors believe the app still nudges users toward allowing their location to be used for targeted advertising.
