Google introduced a new safety feature Wednesday that allows minors under 18 to request that images of themselves be removed from search results.
The tech giant has created a help page where minors can request the removal of their information, preventing it from appearing in Search and on specific websites. Parents and guardians may also submit requests on their behalf.
However, a request may not be granted in cases of ‘compelling public interest or newsworthiness,’ Google said in a statement.
The feature does not extend to users who are 18 or older: adults cannot use it to request the removal of images taken when they were teenagers.
Google first announced plans to activate the feature in August.
This comes as regulators and lawmakers have been closely monitoring major online platforms for their impact on the safety, privacy, and well-being of younger users.
Instagram is currently under fire after a whistleblower revealed how Instagram and Facebook harm children.
Apple also announced in August that it would join the movement to protect minors by scanning users’ photos for child sexual abuse material.
However, the move was not welcomed with open arms: critics called it an invasion of privacy, and the backlash forced Apple to delay the launch.
In an August blog post, Mindy Brooks, Google’s general manager of kids and families, wrote: ‘Some countries have implemented regulations in this area. As we comply, we’re exploring ways to create consistent product experiences and user control for kids, teens, and all people around the world.’
To request the removal of an image, users must provide the URL of each image.
After submitting a request, users will receive an automated email confirmation, and Google will then review the request, which may involve ‘gathering more information.’
A notification of the action taken will then be sent.
‘If the request doesn’t meet the requirements for removal, we’ll also include a brief explanation. If your request is denied and later you have additional materials to support your case, you can re-submit your request,’ according to Google.
Google’s new feature also helps users report child sexual abuse imagery to the National Center for Missing and Exploited Children or to a regional organization from a provided list, based on the user’s location.
The company will also review removal requests made on behalf of a child who died before the age of 18.