Days after Apple introduced child safety measures on its platform, tech giant Google has also announced a slew of changes for YouTube, Search and its other apps to make them safer for kids. The latest change will enable parents to have their children's images removed from Google search results.
“In the coming weeks, we’ll introduce a new policy that enables anyone under the age of 18, or their parent or guardian, to request the removal of their images from Google Image results. Of course, removing an image from Search doesn’t remove it from the web, but we believe this change will help give young people more control of their images online,” the tech giant said in a statement.
Many of the updates are dedicated to YouTube and YouTube Kids. The key change affects young creators aged 13 to 17, whose default upload setting will become the most private option available. A video can then only be seen by select users unless the creator changes it to public.
As part of the new measures, Google will also remove “overly commercial content” from the children’s version of YouTube and change what kind of adverts can be targeted at under-18s. Like Facebook, Google is also limiting what kind of advertising activities can be performed using data from kids’ accounts.
Google will also stop allowing children under the age of 18 with Supervised Accounts to turn on location history on their devices.
In the coming months, the search engine giant will turn SafeSearch on for existing users under 18 and make it the default setting for teens setting up new accounts. SafeSearch helps filter out explicit results when enabled and is already on by default for all signed-in users under 13 whose accounts are managed by Family Link.
The tech giant will also launch a new safety section on Google Play that will let parents know which apps follow Google’s Families policies. Apps will be required to disclose in greater detail how they use the data they collect, making it easier for parents to decide whether an app is right for their child before it is downloaded. This is similar to Apple’s App Tracking Transparency measures, which require app developers to disclose what kind of user data they access and use.