Ofcom is introducing new rules to regulate content on video-sharing platforms including TikTok, Snapchat, and Twitch.
According to the BBC, Ofcom is requiring the apps to ensure content related to terrorism, child sexual abuse and racism does not appear.
The move followed Ofcom research finding that a third of users have encountered hateful content on such sites. The regulator has therefore set out penalties for any app that breaches the new guidelines.
Any VSP that fails to adhere to the rules will either be fined or suspended entirely in the most serious cases.
What exactly are the rules?
According to the BBC, the VSPs will have to:
- provide and effectively enforce clear rules for uploading content
- make the reporting and complaints process easier
- restrict access to adult sites with robust age-verification
Ofcom has promised a report next year on whether the platforms in scope – 18 in total – are taking the appropriate steps.
The apps covered by the new guidelines are those falling within UK jurisdiction that meet the “VSP” definition set out in existing legal criteria.
According to a report by the IWF (Internet Watch Foundation), there was a 77% increase in the amount of “self-generated” abuse content in 2020.
Ofcom’s job will not involve assessing individual videos, and the regulator acknowledges that it is impossible to police all content on the internet.