
The European Commission has released new guidelines aimed at online platforms like X, TikTok, and Facebook, urging them to address risks to elections and combat voter disinformation. These guidelines are part of the Digital Services Act and are targeted at platforms with over 45 million active users in the EU. While the guidelines are not legally binding, platforms that do not comply could face fines of up to 6% of their global turnover. The move is part of an effort to push Big Tech companies to take more responsibility for upholding democratic values and addressing the spread of misleading content online, particularly in relation to elections.

One key focus of the guidelines is the need to address election-related risks, harmful AI content, and misleading political advertising. Platforms are expected to implement measures to tackle these issues and to cooperate with authorities, experts, and civil society organizations on emerging threats. The rise of generative AI and deepfake content has raised alarm about the potential for misinformation to undermine the integrity of elections. The guidelines also address recommender systems, which can amplify divisive or harmful content, and call on platforms to give users more control over their feeds.

In preparation for the upcoming elections to the European Parliament in June, some platforms such as Google, Meta, and TikTok have already taken steps to combat misinformation. The European Commission plans to test the new guidelines with relevant platforms in April to ensure they are effective. With 370 million eligible voters across 27 member states, the linguistic complexity of the EU presents a challenge, as platforms may struggle to have content moderators proficient in all 24 official languages. This complexity makes European elections particularly vulnerable to misinformation and interference.

The guidelines come amid a global surge in elections, with over 2 billion voters expected to go to the polls this year. While complying with the Digital Services Act may be costly for platforms, the potential impact on election integrity and democratic processes is significant. A senior EU official behind the guidelines noted that once platforms have built these safeguards for the EU, the additional cost of applying them elsewhere is relatively low, suggesting they could be adopted worldwide. This coordinated effort by the European Commission signals a shift towards holding online platforms more accountable for their role in safeguarding democratic processes.

Overall, the new guidelines from the European Commission aim to address growing concerns around election integrity and voter disinformation on online platforms. By targeting Very Large Online Platforms and Very Large Online Search Engines, the Commission hopes to push Big Tech companies to take more responsibility for tackling harmful content. The upcoming stress tests will help assess the effectiveness of these guidelines ahead of the European Parliament elections, underscoring the linguistic and regulatory challenges platforms must overcome to combat misinformation and protect voters. As elections take place around the world this year, the guidelines could set a precedent for platform responsibility in safeguarding democratic values beyond the EU.
