Google set to let minors remove their images from search

Tech giant Google announced policy changes on Tuesday, August 10th, 2021, aimed at safeguarding people under the age of 18 from abuse on its Search and YouTube platforms. Minors, or their parents, will be able to request that their photos be removed from Google's Image Search feature, a significant development given that the company has historically taken a hands-off approach to policing its search results.

Videos published to YouTube by underage users will be set to private by default, meaning they can only be viewed by the uploader and anyone they choose, though the setting can be changed as needed. For minors, the platform will also turn off autoplay by default and switch on digital well-being tools, such as bedtime reminders and prompts to take a break after extended viewing.

The reforms come at a time when tech corporations are facing growing scrutiny over how they safeguard children. Apple generated outrage last week when it said it would scan iPhones for child exploitation images as they are uploaded to the company's iCloud storage service; some privacy advocates are concerned the change opens the door to surveillance and abuse.

Google does not allow children under 13 to create standard accounts, but it does provide some products, such as YouTube Kids, for children to use under parental supervision. The company also said that SafeSearch, which filters out explicit search results, will be turned on automatically for signed-in users under the age of eighteen. In addition, Google's Location History setting will remain off, with no option to enable it, for all users under 18; previously that restriction applied only to supervised accounts.

Google also promises to strengthen controls that keep age-sensitive ads off teenagers' screens and will block ads that target them based on their age, gender, or interests. The move comes as the UK prepares to implement new restrictions in September. This is not the first time Google has changed its search standards to combat abuse: the company said in June that it would tweak its search algorithms to target websites that publish unsubstantiated and defamatory posts about people.

The Age-Appropriate Design Code, developed by the UK's Information Commissioner's Office (ICO), requires tech companies to build digital services that are safe for children from the start.

“These steps are only part of what is expected and necessary,” said Baroness Kidron, chair of the 5Rights Foundation, “but they establish beyond doubt that it is possible to build the digital world that young people deserve and that when government takes action, the tech sector can and will change.”

Critics had previously accused Google of flouting the Children's Online Privacy Protection Act, or COPPA, the federal law that governs the collection of personal data from children under the age of 13 on websites. In 2019, the US Federal Trade Commission fined the company $170 million and imposed new requirements over YouTube's COPPA violations. In response, the video site made significant changes to how children's content is handled, including reducing the amount of data it collects from views of that content.
