Owing to regulatory concerns, TikTok has long worked to keep younger users from viewing inappropriate content on its platform.
In this pursuit, the company has now introduced a new "borderline suggestive model" that automatically detects borderline content and hides it from the app's recommendations to young users. This covers sexually suggestive content that doesn't violate platform policies but is still inappropriate for younger people to see.
TikTok’s Borderline Suggestive Model
Whether over its security practices or the content circulating on it, TikTok has faced enormous heat in the last couple of years to manage its platform properly. One challenge is protecting young users, who may be exposed to explicit content that can badly influence them.
To address this, TikTok has announced a number of tools in recent months, from age restrictions to content labels that flag explicit content and keep young people from watching it. Yet some form of mature content always sneaks through.
Thus, the company has now come up with a new system called the "borderline suggestive model," which can automatically identify "sexually explicit, suggestive, or borderline content." These could be videos that don't explicitly break the platform's rules but may not be appropriate for younger users to watch.
Instagram has long had a similar feature, automatically detecting borderline content and hiding it from young users' recommendations. Though it's a good move from both companies, automated systems have long struggled to detect mature content reliably.
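At a high level, systems like these amount to a recommendation-time filter: a classifier scores each candidate video, and flagged videos are dropped from feeds shown to teen accounts. Neither TikTok nor Instagram has published implementation details, so everything in the sketch below (the `Video` class, the `suggestive_score` field, and the threshold value) is invented purely for illustration.

```python
# Illustrative sketch only: TikTok has not published how its
# "borderline suggestive model" works. All names, fields, and
# thresholds here are hypothetical.
from dataclasses import dataclass


@dataclass
class Video:
    video_id: str
    suggestive_score: float  # hypothetical classifier output in [0, 1]


BORDERLINE_THRESHOLD = 0.6  # assumed cutoff, not a real TikTok value


def filter_for_teen_feed(candidates: list[Video], viewer_is_teen: bool) -> list[Video]:
    """Drop videos the (hypothetical) model flags as borderline for teen viewers."""
    if not viewer_is_teen:
        return candidates
    return [v for v in candidates if v.suggestive_score < BORDERLINE_THRESHOLD]


feed = filter_for_teen_feed(
    [Video("a", 0.1), Video("b", 0.9)], viewer_is_teen=True
)
print([v.video_id for v in feed])  # the high-score video is filtered out
```

The key design point such a filter implies is that flagged videos are not removed from the platform, only excluded from recommendations to certain audiences, which matches how both companies describe handling content that doesn't outright break the rules.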
So we should wait and see how effective TikTok's model is. Though the company didn't share details on how the system works, it claims to have "prevented teen accounts from viewing over 1 million overtly sexually suggestive videos."