
SAFEGUARDING DEMOCRATIC IDEALS IN THE DIGITAL AGE

According to the October 2024 Eurobarometer, more than 80% of Europeans agree that news or information that misrepresents reality, or is outright false, is a problem for democracy. At a time when information spreads faster than ever, democratic ideals are increasingly vulnerable to persistent attacks by actors seeking to distort public opinion. How this threat should be addressed has produced sharply divergent views, particularly at the intersection of regulation and freedom of expression, posing a serious dilemma for decision-makers.


But to understand this threat, we must first identify the core values democracy represents: citizen participation, individual freedoms, separation of powers, the rule of law, accountability, equality, and transparency, among others. These elements are crucial to sustaining a democratic system, maintaining its legitimacy, and securing people's trust. Before the digital age, debate over these values took place in newspapers, on radio and television, and in physical forums, which limited the spread and visibility of anti-democratic ideas. Once social media and new applications began to grow, the exchange of ideas expanded dramatically, as these platforms enabled mass participation and real-time interaction.


Despite the opportunities offered by new technologies, serious concerns have emerged regarding their impact on democratic life. Opaque algorithms, used by the main digital platforms, influence what information users are exposed to, often without their awareness. Furthermore, disinformation campaigns, information overload, and the loss of shared references have weakened citizens' ability to engage in properly informed public debate. These dynamics are reinforcing polarization, undermining trust in institutions, and distorting perceptions of democratic processes.


Recent tactics include the use of generative AI and deepfakes, bot farms deployed to conduct coordinated attacks, and microtargeting of specific segments of society as a political weapon to manipulate elections, polarize public opinion, and create chaos. The annulment of the Romanian presidential elections in late 2024 was a highly contested case: the Romanian intelligence services identified patterns of external interference through platforms like TikTok and Telegram, involving over 25,000 fake accounts. The debate over whether Russian agents or domestic political parties were truly behind the operations highlights one of the greatest challenges of our times: when misinformation is spread by opaque networks and coordinated messages, truth becomes difficult to corroborate and even harder to defend. This ambiguity is not neutral. It directly benefits the actors responsible, granting them impunity, deepening institutional distrust, and ultimately eroding confidence in democratic systems themselves.


While some social media platforms offer fact-checking tools, their effectiveness has been deliberately reduced. In the case of X, Community Notes allow users to add context to misleading posts, flag content created by AI, and raise awareness about fake news. These notes once played an essential role in limiting the spread of misinformation. However, after Elon Musk publicly claimed they were being used by governments as a manipulation tool, the platform's algorithms were modified to reduce their visibility and delay their appearance. With this first line of defense weakened, the spread of fake news goes largely unchecked, and the competition between viral content and verified information becomes unequal, weakening public debate, a cornerstone of democracy.



Moreover, European institutions have made significant efforts to deliver a collective response to this problem, such as the Digital Services Act (DSA), approved by the European Parliament in 2022. It is the first legislative initiative to impose legal obligations on digital platforms to ensure accountability and transparency. Nonetheless, an open letter to EU policy-makers, signed by more than 50 European expert organizations, argues that the Act does not directly tackle the core of the problem “due to the tendency in Brussels to frame the question of legislating on disinformation as a simple trade-off between free expression and total control of our online environment.” This reflects how the current regulatory approach often oversimplifies the issue, frequently resulting in either under-regulation or self-censorship, rather than achieving a balanced framework that protects democratic debate and combats manipulation.


To counter digital misinformation, solutions must take a cross-cutting approach that combines technological, political, and social measures. In the technological domain, digital platforms should be subject to mandatory algorithmic audits by independent external agents. Additionally, a tracking mechanism must be implemented to identify the origins of paid advertisements, clearly indicating who pays, to whom, and for what purpose. On the political front, governments should establish international cooperation mechanisms against Foreign Information Manipulation and Interference (FIMI), strengthening alliances through intelligence-sharing and agreeing on collective responses to large-scale transnational disinformation campaigns. Beyond cooperation, governments must also adopt binding rules that include accountability and sanctions for non-compliance, while avoiding censorship or other forms of speech suppression. In the social sphere, policy should aim not to control citizens but to empower them. Investing in educational programs that develop critical thinking will equip future generations to interpret reality, assess sources, and detect manipulative patterns. It is not about telling them what to think, but about teaching them how to think independently. This must be complemented by institutional support for independent media and fact-checking journalists. Many of these organizations lack the resources to carry out their work, yet they are essential to verifying the vast amount of circulating information and amplifying the diversity of trustworthy voices.


With social platforms that disincentivize verification, foreign campaigns that produce misleading content, and incomplete legislation, democratic ideals are suffering a silent but very real erosion. Fighting disinformation is not merely a technical issue; it is a civic, institutional, and cultural necessity. It requires platforms to assume responsibility, authorities to legislate with determination, and citizens to exercise critical thinking as an integral part of democratic practice. It is not about restricting the digital public sphere but about strengthening it. Not about imposing truths, but about creating the conditions in which truth has a fair chance. Because if this historic moment shows us anything, it is that democracy needs more than to be proclaimed: it needs to be defended. And that defense, today, begins in the digital world.




 
 
 


© 2025 Pan-European Foundation.
