Ofcom issues new guidance for video sharing platforms to tackle “harmful” content

Ofcom has issued new guidance for video sharing platforms to better protect users from harmful content.

This is separate from the Online Safety Bill (formerly the Online Harms Bill), which is still taking shape, but since Ofcom will also be the online harms regulator, the guidance may offer useful insight into Ofcom’s approach.

Video sharing platforms (VSPs) are a type of online video service that lets users upload and share videos with other users of the platform. They allow people to engage with a wide range of content and social features.

VSPs established in the UK (including some big names such as Snapchat, TikTok, Twitch and Vimeo) are required by law to take measures to protect under-18s from potentially harmful video content, and to protect all users from videos likely to incite violence or hatred and from certain types of criminal content.

Ofcom says its best-practice guidance is designed to help companies understand their new obligations and decide how best to protect their users from this type of harmful material. It has also published a short explanatory guide for industry on the new video sharing platform framework.

Regulatory approach

Ofcom’s role is to enforce the rules set out in the legislation and to hold VSPs to account. Unlike its role in regulating broadcast content, Ofcom’s role in regulating VSPs is not to assess individual videos; the sheer volume of online content makes it impossible to prevent every harm.

Instead, the laws focus on the measures VSPs must take, where appropriate, to protect their users – and platforms have a fair amount of flexibility in how they do so. To help them meet their user-protection obligations, Ofcom’s guidance expects VSPs to:

  • Provide clear rules around uploading content. Uploading content relating to terrorism, child sexual abuse material or racism is a criminal offence. Platforms must have clear and visible terms and conditions prohibiting such content – and enforce them effectively.
  • Have easy reporting and complaints processes. VSPs should implement tools that make it easy for users to flag harmful videos. They should say how quickly they will respond and be transparent about any action taken. Providers should also give users a formal route to raise concerns with the platform and to challenge its decisions. Ofcom considers this vital for protecting the rights and interests of users who upload and share content.
  • Restrict access to adult content. VSPs that host pornographic material must have robust age verification in place to prevent under-18s from accessing it.

In all of this, Ofcom must also balance these protections against users’ right to freedom of expression.

Looking ahead

Ofcom has also set out five priorities for the year ahead:

  1. working with VSPs to reduce the risk of child sexual abuse material;
  2. tackling online hate and terror;
  3. ensuring an age-appropriate experience on platforms popular with under-18s;
  4. laying the groundwork for age verification on adult sites; and
  5. ensuring that VSPs’ processes for reporting harmful content are effective.

The content of this article is intended to provide a general guide on the subject. Specialist advice should be sought regarding your particular situation.
