Owing to regulatory concerns, TikTok has long been working to prevent its younger users from viewing inappropriate content on the platform.
In this pursuit, the company has now come up with a new "borderline suggestive model" that can automatically detect borderline content and hide it from the app's recommendations to young users. This could be sexually suggestive content that doesn't outright violate the platform's policies but is still inappropriate for younger people to see.
TikTok's Borderline Suggestive Model
Whether over its security practices or the content circulating on it, TikTok has faced enormous heat in the last couple of years to manage its platform properly. One pressing concern is protecting young users, who may be exposed to explicit content that can badly influence them.
Well, to address this, TikTok has announced a number of tools in recent months, from age restrictions to content labels that identify explicit content and keep young people from watching it. Yet some form of mature content always sneaks through.
Thus, the company has now come up with a new system called the "borderline suggestive model" that can automatically identify "sexually explicit, suggestive, or borderline content." These could be videos that don't explicitly break the platform's rules but may not be appropriate for younger users to watch.
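To make the idea concrete, here is a minimal sketch of how such age-gated filtering could work, assuming a classifier that assigns each video a suggestiveness score. TikTok hasn't published how its model actually operates, so every name, threshold, and interface below is hypothetical and purely illustrative.

```python
# Conceptual sketch only: TikTok has not disclosed how its "borderline
# suggestive model" works. The classifier score, threshold, and age cutoff
# below are hypothetical placeholders for illustration.

from dataclasses import dataclass
from typing import List


@dataclass
class Video:
    video_id: str
    suggestiveness_score: float  # hypothetical classifier output in [0.0, 1.0]


# Hypothetical score above which a video counts as "borderline suggestive"
BORDERLINE_THRESHOLD = 0.6


def filter_recommendations(candidates: List[Video], viewer_age: int) -> List[Video]:
    """Drop borderline-scored videos from the recommendation feed of under-18 viewers."""
    if viewer_age >= 18:
        return candidates
    return [v for v in candidates if v.suggestiveness_score < BORDERLINE_THRESHOLD]


# Example: a teen account's candidate feed loses the high-scoring video.
feed = [Video("a1", 0.1), Video("b2", 0.8), Video("c3", 0.4)]
print([v.video_id for v in filter_recommendations(feed, viewer_age=15)])  # ['a1', 'c3']
```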
A similar feature has long been present on Instagram, where the platform automatically detects borderline content and keeps it out of young users' recommendations. Though it's a good move from both companies, automated systems have long struggled to reliably detect mature content.
So we'll have to wait and see how effective TikTok's model is. Though the company hasn't shared details on its accuracy, it claims to have already "prevented teen accounts from viewing over 1 million overtly sexually suggestive videos."