
Europeans put 19 big tech outfits on notice

26 April 2023


New online content regulations

The European Commission will require 19 large online platforms and search engines to comply with new online content regulations starting on August 25, European officials said.

The EC specified which companies must comply with the rules for the first time, announcing today that it "adopted the first designation decisions under the Digital Services Act."

Five of the 19 platforms are run by Google -- YouTube, Google Search, the Google Play app and digital media store, Google Maps, and Google Shopping.

Facebook and Instagram are on the list, as are Amazon's online store, Apple's App Store, Microsoft's Bing search engine, TikTok, Twitter, and Wikipedia. These platforms were designated because they each reported having over 45 million monthly active users in the EU as of February 17. The other listed platforms are Alibaba AliExpress, Booking.com, LinkedIn, Pinterest, Snapchat, and German online retailer Zalando.

Companies have four months to comply with the full set of new obligations and could face fines of up to six per cent of a provider's global annual revenue. One new rule is a ban on advertisements that target users based on sensitive data such as ethnic origin, political opinions, or sexual orientation. There are also new content moderation requirements, transparency rules, and protections for minors.

Under the new rules "targeted advertising based on profiling towards children is no longer permitted," the EC said.

Companies will have to provide their first annual risk assessment on August 25, and their risk mitigation plans will be subject to independent audits and oversight by the European Commission.

"Platforms will have to identify, analyse and mitigate a wide array of systemic risks ranging from how illegal content and disinformation can be amplified on their services, to the impact on the freedom of expression and media freedom. Specific risks around gender-based violence online and the protection of minors online and their mental health must be assessed and mitigated," the EC said.

The new requirements for the 19 platforms include:

  • Users will get clear information on why they are recommended certain information and will have the right to opt out from recommendation systems based on profiling;
  • Users will be able to report illegal content easily and platforms have to process such reports diligently;
  • Platforms need to label all ads and inform users on who is promoting them;
  • Platforms need to provide an easily understandable, plain-language summary of their terms and conditions, in the languages of the Member States where they operate.

Platforms will be required to "analyse their specific risks, and put in place mitigation measures -- for instance, to address the spread of disinformation and inauthentic use of their service," the EC said. They will also "have to redesign their systems to ensure a high level of privacy, security, and safety to minors."

 
