UK: Ofcom issues new guidelines for video sharing platforms to tackle “harmful” content
Ofcom has issued new guidance for video sharing platforms to better protect users from harmful content.
This is separate from the Online Safety Bill (formerly the Online Harms Bill), which is still taking shape, but since Ofcom will also be the online harms regulator, the guidance may offer some useful insight into Ofcom’s approach.
Video sharing platforms (VSPs) are a type of online video service where users can upload and share videos with other users of the platform. They allow people to engage with a wide range of social content and features.
VSPs established in the UK (including some big names such as Snapchat, TikTok, Twitch and Vimeo) are required by law to take measures to protect those under the age of 18 from potentially harmful video content, and to protect all users from videos likely to incite violence or hatred, as well as from certain types of criminal content.
Ofcom says its best practice guide is designed to help businesses understand their new obligations and decide how best to protect their users from this type of harmful material. It has also published a short explanatory guide for industry on the new video sharing platform framework.
Ofcom’s role is to enforce the rules set out in legislation and to hold VSPs to account. Unlike its role in regulating broadcast content, when regulating VSPs Ofcom’s role is not to assess individual videos. Clearly, the sheer volume of online content means it is impossible to prevent all harm.
Instead, the rules focus on the measures VSPs must take, as appropriate, to protect their users – and they have considerable flexibility in how they do so. To help them meet their user protection obligations, Ofcom’s guidance expects VSPs to:
- Provide clear rules for uploading content. Uploading content related to terrorism, child pornography or racism is a criminal offense. Platforms must have clear and visible terms and conditions that prohibit this – and enforce them effectively.
- Have easy reporting and complaint processes. VSPs should implement tools that make it easy for users to report harmful videos. They must say how quickly they will respond and be transparent about any action taken. Providers should also offer a formal route for users to raise concerns with the platform and challenge its decisions. Ofcom considers this vital to protecting the rights and interests of users who upload and share content.
- Restrict access to adult content. VSPs that host pornographic material must have robust age verification in place to prevent those under the age of 18 from accessing it.
In all of this, Ofcom must also balance these requirements against users’ right to freedom of expression.
Ofcom has also defined five priorities for the coming year, which are:
- working with VSPs to reduce the risk of child pornography;
- tackling hate and terror online;
- ensuring an age-appropriate experience on platforms popular with under-18s;
- laying the groundwork for age verification on adult sites; and
- ensuring that VSPs’ processes for reporting harmful content are effective.
The content of this article is intended to provide a general guide on the subject. Specialist advice should be sought regarding your particular situation.