Members of the EDMO Task Force on Disinformation on the War in Ukraine submit feedback to the EC call for evidence on the provisions in the DSA related to data access

The European Digital Media Observatory (EDMO) established a Taskforce on Disinformation on the War in Ukraine in March 2022. Chaired by Dr. Claire Wardle, the taskforce includes 18 members representing academia, journalism, media and civil society from 10 countries across the EU, acting in their personal capacity. The war in Ukraine shows that disinformation plays an active role both in the conflict itself and in undermining citizens' right to access information. The taskforce has a strong academic and research focus, yet its objective is to provide actionable insights that are relevant for policy- and decision-makers, for public and private stakeholders, and for the public at large.

Based on the taskforce's work and its 10 recommendations on disinformation and the war in Ukraine, some members have come together to identify the data access challenges that need to be addressed in the current call for evidence. These include the following:

  1. A future intermediary body should be created to facilitate data access and research, with members representing key stakeholders, including people working within large online platforms, academia, and civil society. It will ensure data privacy and GDPR compliance, and facilitate the creation and implementation of data sharing standards. The EDMO Working Group for the Creation of an Independent Intermediary Body to Support Research on Digital Platforms will play a pivotal role in this process.
  2. Any data disclosure infrastructure should be designed with a range of user types in mind, including academic researchers, fact-checkers, journalists, policy-makers, public health professionals, and civil society. The technical and methodological expertise of those who will be relied upon to audit platform data differs significantly, as do their needs and skills.
  3. Data access should be free and comprehensive to enable computational research and monitoring at scale. API-based access should allow retrieval of sufficiently large volumes of data to enable longitudinal, large-scale monitoring of compliance. This is especially critical around global emergencies such as the COVID-19 pandemic and the war in Ukraine, when big platforms such as Facebook, YouTube, Twitter and TikTok carry tens of millions of posts a day.
  4. Data sharing for compliance by platforms should include sizeable and representative samples of moderated, demonetized and removed content (e.g. online abuse or disinformation), so researchers can verify, based on these samples, whether platforms are enforcing their policies correctly and that legitimate content and accounts are not being silenced due to biases or errors in algorithmic and/or human moderation.
  5. Cross-platform structured data sharing standards and, ideally, data access APIs need to be created and adopted by platforms. Multi-stakeholder action is needed to define and implement these, in order to enable quantitative comparative studies on key cross-platform compliance issues such as the spread of disinformation, political ad campaigning during elections, and online abuse (an illustrative record schema is sketched after this list).
  6. Data disclosure should meet holistic and flexible guidelines, considering the varied stakeholders and systemic risks of a service. This should include an industry-independent process of mapping stakeholders, identifying systemic risks, and then developing measurements to understand those risks.
  7. There should be a modular approach to researcher access to data. This would fulfil three high-level objectives: a) creating a code of conduct for platforms and researchers; b) implementing the standard to vet research requests and assess platform access in practice; c) creating a remedies mechanism to mediate disputes that arise in specific cases.
  8. There should be an external advisory board responsible for writing and upholding an ethical framework. This framework would consider ethical questions related to conflicts of interest, unexpected outcomes of data sharing, long-term data storage, and ongoing funding for the wider ecosystem working on these data sharing provisions. To minimise administrative overheads, once researchers are cleared for data access, they should not have to re-apply on a per-project basis.
  9. An appeals system needs to be built into any data sharing mechanism, to ensure adequate provision of a timely appeals process and recourse to an independent EU-level or national ombuds entity that can help arbitrate cases where platforms and researchers cannot reach an agreement.
  10. Data sharing needs to capture the nuances occurring in Eastern European languages. It also needs to be complemented by a data processing infrastructure, shared know-how, and open-source data cleaning, harmonisation, transformation, and analysis tools (a minimal harmonisation sketch follows this list). These will enable researchers from less-resourced EU countries (e.g. Central and especially Eastern Europe) to carry out effective monitoring, as platforms are currently faring worst there in terms of effective enforcement of their policies against online abuse and disinformation.
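
To make point 5 more concrete, the sketch below shows in Python what a single record in a hypothetical cross-platform data sharing standard for moderation actions could look like, together with an adapter that maps a platform-specific export row into it. The schema, every field name, the adapter function and the example payload are assumptions made purely for illustration; no such standard currently exists or is mandated.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical shared record for a single moderation action, illustrating what a
# cross-platform data sharing standard could capture. All field names are
# illustrative assumptions, not an agreed or existing schema.
@dataclass
class ModerationRecord:
    platform: str                          # e.g. "facebook", "youtube", "tiktok"
    content_id: str                        # platform-internal identifier of the item
    action: str                            # e.g. "removed", "demonetized", "downranked"
    policy: str                            # policy invoked, e.g. "disinformation"
    language: Optional[str]                # language tag, e.g. "uk", "bg", "pl"
    decided_by: str                        # "automated", "human" or "hybrid"
    decided_at: datetime                   # timestamp of the moderation decision
    appeal_outcome: Optional[str] = None   # e.g. "upheld", "reinstated"; None if no appeal

def from_hypothetical_platform_payload(raw: dict) -> ModerationRecord:
    """Map one platform-specific export row into the shared record.

    The input keys below are invented for this example; in practice each
    platform adapter would translate its own export format into the common schema.
    """
    return ModerationRecord(
        platform=raw["source"],
        content_id=str(raw["item_id"]),
        action=raw["enforcement_action"].lower(),
        policy=raw.get("policy_area", "unspecified"),
        language=raw.get("detected_language"),
        decided_by="automated" if raw.get("automated_decision") else "human",
        decided_at=datetime.fromtimestamp(raw["decision_ts"], tz=timezone.utc),
        appeal_outcome=raw.get("appeal_result"),
    )

if __name__ == "__main__":
    # Invented example payload; once mapped, records from different platforms
    # share one shape and can be compared, e.g. automated removals per language.
    sample = {
        "source": "exampleplatform",
        "item_id": 12345,
        "enforcement_action": "REMOVED",
        "policy_area": "disinformation",
        "detected_language": "uk",
        "automated_decision": True,
        "decision_ts": 1672531200,
    }
    print(from_hypothetical_platform_payload(sample))
```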
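
In the same spirit, the open-source cleaning and harmonisation tools called for in point 10 could start from very small building blocks. The sketch below uses only the Python standard library to normalise Unicode and roughly tag the dominant script of a text, so that content in Cyrillic-script languages (e.g. Ukrainian or Bulgarian) can be routed to the appropriate language-specific tooling. The specific normalisation steps are illustrative assumptions, not a prescribed pipeline.

```python
import unicodedata

def clean_text(text: str) -> str:
    """Normalise Unicode so visually identical strings compare equal across sources."""
    text = unicodedata.normalize("NFC", text)   # compose accents/diacritics consistently
    text = text.replace("\u00a0", " ")          # turn non-breaking spaces into plain spaces
    return " ".join(text.split())               # collapse runs of whitespace

def dominant_script(text: str) -> str:
    """Return a rough script tag ("CYRILLIC", "LATIN", ...) for routing language tooling."""
    counts = {}
    for ch in text:
        if ch.isalpha():
            # Unicode character names start with the script, e.g. "CYRILLIC SMALL LETTER A"
            script = unicodedata.name(ch, "UNKNOWN").split(" ")[0]
            counts[script] = counts.get(script, 0) + 1
    return max(counts, key=counts.get) if counts else "UNKNOWN"

if __name__ == "__main__":
    sample = "Приклад  допису з\u00a0дезінформацією"   # "Example post with disinformation" (Ukrainian)
    print(clean_text(sample))       # -> "Приклад допису з дезінформацією"
    print(dominant_script(sample))  # -> "CYRILLIC"
```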

Members of the EDMO Taskforce on Disinformation on the War in Ukraine signing this contribution:

  • Alina Bargaoanu | EDMO Advisory Council
  • Anja Bechmann | NORDIS and EDMO
  • Kalina Bontcheva | University of Sheffield, EDMO Advisory Council
  • Carlos Hernández-Echevarría | Maldita.es
  • Roman Imielski | Gazeta Wyborcza
  • Gianni Riotta | Luiss Guido Carli University, IDMO, EDMO Advisory Council
  • Jochen Spangenberg | EDMO Advisory Council
  • Rebekah Tromble | George Washington University, EDMO Advisory Council
  • Claire Wardle | Information Futures Lab, Brown University, US
  • Richard Woods | The Global Disinformation Index