Children have been left in the “absurd” situation of having to take on tech giants themselves over damaging algorithms, a leading peer has warned as ministers decided not to activate “class action” rights.
Baroness Kidron, a crossbench peer and leading children’s rights campaigner, said the move would leave children “unprotected” as individual families would struggle to take on the might of Silicon Valley behemoths.
Her comments come after the Government decided not to activate article 80(2) of the Data Protection Act 2018, which would allow charities to sue tech companies on behalf of children over misuse of their personal information.
Social media and tech giants have previously been accused of misusing the data they collect on children to then expose them to dangerous content, such as self-harm images or inappropriate ads.
However, ministers said they did not see the need for the new rights as they felt the current and incoming data laws, which are enforced by the Information Commissioner’s Office (ICO), provide enough protection.
The ICO can levy fines running into the billions on companies that misuse users’ personal information; however, none of the money goes to the victims themselves.
Proposals to activate 80(2) were also “strongly opposed” by tech companies and businesses who warned they could “lead to an acceleration in claims that is detrimental to firms, customers, the ICO, and the courts”.
Following the decision, Baroness Kidron, who is also chair of the children’s rights charity 5Rights, said it undermined the Government’s pledge to make the UK “world-leading” for online protections.
She said: “It is absurd to suggest that we should leave it to children to understand the opaque and complex system of data processing.
“This is about making sure that children’s social media profiles are private by default, preventing tech firms sharing a child’s real-time location and making them available to predators, stopping them sharing children’s data for commercial purposes – and importantly – preventing the insidious processing of a child’s data to recommend detrimental material such as self-harm or violent sexual content.”
The decision comes as tech companies are facing legal action around the world for the way their apps collect and use people’s data to target them with ads and content.
Last week, TikTok settled a class action lawsuit for $92 million (£66 million) in the US over allegations it was unlawfully using the facial recognition data of its users.
A Government spokesperson said: “Children and vulnerable people are at the heart of our commitment to make the UK the safest place to be online.
“Our current data protection laws already offer strong protections for people including children and other vulnerable groups, and we continue to assist them in exercising their rights.
“The ICO has a range of powers to investigate data breaches under the current laws and is already capable of forcing data controllers to take decisive action where needed.”