
Google lets under 18s request that their pictures be removed from search results


Google will allow under 18s or their parents to request that their pictures be removed from search results: Images will still show up if there is ‘compelling public interest or newsworthiness’

  • Google launched a feature that lets minors under 18 request images of themselves be taken down from Search
  • Parents and guardians of the minor can also make the same request
  • Google activated a new help page for these requests, where users provide links to the images they wish to have removed 

Google rolled out a new safety feature on Wednesday that lets minors under 18 request images of themselves be removed from search results. 

The tech giant launched a help page for such requests that not only lets minors request the removal of images, but also prevents information about them from appearing in Search and on specific websites – parents and guardians are also allowed to submit requests.

However, there are circumstances in which a person’s request may not be granted – the exception being ‘cases of compelling public interest or newsworthiness,’ Google shared in a statement.

Users must also be under 18 for Google to approve the request, meaning anyone over that age cannot apply to have images removed that were taken when they were a teenager.


Google first announced plans for the feature in August, adding it would be activated in the coming weeks.

It comes as major online platforms have long been under scrutiny from lawmakers and regulators over their sites’ impact on the safety, privacy and wellbeing of younger users.

Facebook and Instagram are currently under fire after a whistleblower shared how the platforms harm children.

Apple also announced in August that it would join the movement to protect minors by scanning users’ photos for child sexual abuse material.


However, the move was not welcomed with open arms – critics called it an invasion of privacy, and the backlash forced Apple to delay the launch. 

Mindy Brooks, Google’s general manager for kids and families, wrote in an August blog post: ‘Some countries are implementing regulations in this area, and as we comply with these regulations, we’re looking at ways to develop consistent product experiences and user controls for kids and teens globally.’

To request removal of an image, the help form requires users to provide URLs for the specific content.

Users will then receive an automated email confirmation after submitting their request, after which Google will review the request by ‘gathering more information.’

Following that, a notification of action taken will be sent. 

‘If the request doesn’t meet the requirements for removal, we’ll also include a brief explanation. If your request is denied and later you have additional materials to support your case, you can re-submit your request,’ according to Google.

Google’s new feature also helps users report child sexual abuse imagery to the National Center for Missing and Exploited Children, or to an organization from a list based on the user’s location.

And the company will also review requests in the tragic situation of a child who has died before reaching the age of 18.


