
Maximizing Parental Control on Social Media Platforms

A year ago, big social media companies came under fire for failing to protect teens and younger children from adult content and strangers. This caught the attention of lawmakers, and platforms including Facebook, Snapchat and TikTok faced a series of congressional hearings over how they expose minors to adult and harmful content that damages teenagers' mental health, and over the lack of effective parental control features for managing what teens see.

During Facebook's hearings, a whistleblower introduced the Facebook Papers, internal documents detailing Instagram's impact on teens, and parent company Meta vowed to change swiftly. Major platforms have since made notable changes: some adjusted their algorithms so that teenagers are not exposed to sensitive content by default, and more parental control tools have been introduced.

Despite these changes, lawmakers and other stakeholders were not satisfied, describing the measures as limited. In their view, the solutions on offer need to go further to effectively protect teens and younger children across social media platforms.

Senator Richard Blumenthal said that despite the congressional hearings prompted by the revelation of the Facebook Papers, social media platforms have taken only small, slow steps. "The technology is evolving and far-reaching; we need rules to ensure teenagers' safety on these platforms," he added.

Several experts also agree that social media platforms are not doing enough to limit teenagers' exposure to harmful content online, among other problems. Michela Menting, digital security director at ABI Research, said that social media companies "are offering very little to curb the ills on their platforms."

Alexandra Hamlet, a clinical psychologist, recalled being invited to a meeting on how Instagram could be improved for teenagers and younger users. "I haven't seen any of our ideas being executed," she said, adding that social media companies need to keep improving parental controls and protecting teens and younger children from harmful content and strangers. The company did not respond to this criticism and declined to comment.

Here is how parental controls work on each of the major social media platforms:

Facebook

Facebook has a Safety Centre that provides supervision tools and resources, such as articles and advice published on the platform. Meta spokesperson Liza Crenshaw said the vision for the Family Centre is "to give parents and guardians the ability to help their children control and manage their experiences across Meta technologies, from one place."

The Safety Centre also includes a guide to Meta's emerging VR parental supervision tools from ConnectSafety, a non-profit organization that helps younger users stay safe online and assists parents in discussing virtual reality with their kids. Parents and guardians can see who their kids have blocked and can approve a teen's request to download or purchase an application.

Instagram

Instagram's first step after heavy criticism was to release a version of the app for kids under 13, aimed at making the experience safer for younger users. The platform also has an education centre for parents and guardians, with articles and other resources on keeping teens safe. Parents and guardians can see how much time their kids spend on the platform and set time limits, see who their kids follow and who follows them back, and receive a notification when the teen changes their account settings.

Instagram also has a feature that encourages users to take a break from the app after a certain amount of time, prompting them to write something down, take a deep breath, or listen to a song.

TikTok

TikTok recently changed how content is filtered so that potentially problematic videos are easier to identify. It also introduced a tool intended to help people decide how long they want to spend on the platform: users can schedule screen-time breaks and view an analysis of how they use the app. TikTok also has a Family Pairing hub, which lets parents link their accounts to their teens' accounts and set parental controls, such as restricting live videos and direct messages, disabling push notifications after a certain time, and limiting the content the teen is exposed to.

Snapchat 

A few months ago, Snapchat launched a parent hub designed to give parents more insight into how their teenagers use the app. Parents can see who their teen has chatted with most recently without seeing the content of the conversations.

To limit teenagers' exposure on the platform, Snapchat has made a few changes, such as barring teenagers from having public accounts and requiring users to be mutual friends before they can chat. Parents or guardians can also choose to be notified when a teenager adds a new friend, and Snapchat's location tool lets parents see where their teenagers are, even after the app is closed.
