
EDMO Secretary-General Paula Gori Addresses First EU Parliamentary Democracy Forum in Brussels

On 19 November 2025, EDMO Secretary-General Paula Gori joined the European Parliament’s inaugural EU Parliamentary Democracy Forum as the scene-setter for the third session of the day, titled “Building Resilience”. The recording of the live-stream can be watched here (starting at 17:18:01).

Speech delivered by Paula Gori to the EU Parliamentary Democracy Forum

(check against delivery)

Technology is the application of science to the practical aims of our lives.

Take washing machines, for example. They were created to solve the issue of cleaning our clothes, which before their invention took human time and effort.

We could wash clothes before (and in many places it is still done by hand), but the washing machine allows the clothes to be washed without any time investment on our part.

The time we save can be devoted to something else: taking a walk, reading the news, working, spending time with our loved ones.

The washing machine is somehow a neutral technology. It washes our clothes.

It does not tell us whether it likes them, it does not force us to stop buying yellow socks, nor does it decide to clean only the clothes it prefers.

Technological development started with objects, but it then came to include services and infrastructures: communication devices, the internet, social media platforms and search engines, AI, generative AI. You name it.

And all of a sudden, we moved from gaining time with a washing machine, to moving public discourse on private online platforms.

We moved from buying our selection of newspapers and switching on the radio or TV to decide which show we wanted to watch, to an online environment that gives us access to an almost unlimited amount of information.

This new environment offers an almost unlimited amount of information and at the same time leaves information voids. This is already a good recipe for information manipulation.

Why? Because we are looking for information and, either because there is too much of it or because it is missing, we do not find it. And this happens in a space notoriously full of information, which leads to cognitive overload.

Let’s not forget: any decision we make, including voting, is based on the information we have at our disposal.

Algorithms are needed to manage this vast and dense environment. The information we access depends on them.

However, they are not neutral. They reward engagement. And engagement is driven by emotions such as anger, fear, frustration. The more such emotions, the more engagement, the more advertising, the more profit.

Therefore, we are often offered content we have not asked for and are not interested in, to the detriment of the content we would actually like to consume. We have lost agency.

We are also flooded with disinformation and illegal content, to the point that one may wonder if these technologies are really uniting us or on the contrary dividing us.

We end up in echo chambers and private actors compete to get our attention, in what is called the attention economy.

And the paradox is that while they fight for our time, they in parallel take too much of it by creating addiction.

As a consequence, our habits have changed. We spend less time reflecting, less time focusing, and have moved to a series of very short bursts of attention.

Our relationship with questions is also changing. We no longer look for evidence to build answers; we get them in the blink of an eye from GenAI chatbots.

Such tools, however, come with hallucinations; they can be manipulated and are trained to provide an answer, no matter what.

Meanwhile, our critical skills have worsened, our attention span has decreased, and we are too lazy to spend additional time questioning the response (or, indeed, the quality of our prompt).

On the other hand, malign actors play with this system: they manipulate us and profit from this whole structure. These include external actors that threaten our security, our values, our democracy.

And let me quickly go back to emotions. Emotions come before thinking. This is what saves us as animals. If I’m in danger, fear is what makes me run away, thinking comes after. If we apply this to our online behaviors, it is easy to see how this whole system impacts our critical skills.

Today, in this hemicycle, we are talking about democracy, and democracy is based on the fact that all citizens are on a level playing field.

And if public discourse has moved online, then this space needs to be safe. It should not be uneven. Again, we all deserve a level playing field.

Be it a social network, a video application or an AI chatbot, it should guarantee information integrity, it should be safe, it should be tested against manipulation, it should be truly democratic.

Let’s move on to solutions. Technology advances at high speed, way faster than regulation.

This is why I believe that the risk-based approach that we have adopted in the EU, for example with the DSA and the AI Act, is the right one.

It can adapt to new technologies, because the focus is on the risks. In the future we may need some updates, but the core is there.

It would also be important that tech developers work hand in hand with experts in the field of ethics, law, anthropology, sociology, environment, health.

There is no limit to technological development, but we must ensure that it takes the direction of safety of humankind and democracy.

Technological development should reflect our values by design.

Regulation is prevention, and it needs to be enforced. Yes, some features may be improved, others may deserve additional reflections, but we have a solid scheme, which puts the EU at the forefront.

Preparedness and resilience are equally key. They ensure that we are all equipped with the skills needed in case the threat becomes too high, where regulation cannot cope, or when malign actors find new loopholes to attack us.

Resilience and preparedness must be structured as a lifelong learning process for citizens and as a lifelong investment for the EU.

The equation is easy: if technological development moves fast, so do potential threats.

As such, resilience and preparedness programmes need to be regularly updated to keep pace. They must address all potential risks. They must ensure that, for each changing scenario, we are equipped with specific tools.

Only a well-funded, whole-of-society, evidence-based approach can deliver this.

To conclude, we all deserve a safe online space that guarantees information integrity, that prevents manipulation, that respects our fundamental values and that allows us to implement new forms of participatory democracy, accountability and transparency.

We have the rules and we are building preparedness. We do not lack creativity in the EU, and with courage and investment we can create an online environment that is attractive to citizens and, at the same time, safe and fair.

We need technology to be based on our values, not to undermine them.

We need technology to foster democracy, not to put it in danger.