Owing to regulatory concerns, TikTok has long been working to prevent the younger audience from viewing inappropriate content on its platform.

In this pursuit, the company has now come up with a new "borderline suggestive model" that can automatically detect borderline content and hide it from the app's recommendations to young audiences. This could be sexually suggestive content that doesn't violate the platform's policies but is still inappropriate for younger people to see.

TikTok's Borderline Suggestive Model

Whether over its security practices or the content circulating on its platform, TikTok has faced enormous heat in the last couple of years to manage it all properly. One concern is protecting young users on the platform, who may be exposed to explicit content that can badly influence them.

To address this, TikTok has announced a number of tools in recent months. From age restrictions to content labels that identify explicit content and prevent young people from watching it, TikTok has tried it all. Yet some form of mature content always sneaks through.

Thus, the company has now come up with a new system called the "borderline suggestive model" that can automatically identify "sexually explicit, suggestive, or borderline content." These could be videos that don't explicitly break the platform's rules but may not be appropriate for younger users to watch.

A similar system has long existed on Instagram, where the platform automatically detects borderline content and hides it from young users' recommendations. Though it's a good move from both companies, automated systems have historically struggled to detect mature content reliably.

So we should wait to see how effective TikTok's model is. Though the company didn't share details on how the system works, it claims to have "prevented teen accounts from viewing over 1 million overtly sexually suggestive videos."
