- Series: Studio Europa Maastricht Policy Brief Collection
- Institution: Studio Europa Maastricht | Maastricht University
DOI: 10.26481/mup.rep.sem.2501
The content of this work is licensed under a Creative Commons BY 4.0 International License.
© 2025 Studio Europa Maastricht | Maastricht University
Description
Digitalisation has reshaped society, offering convenience and connectivity while raising concerns about misinformation, privacy violations, online safety, and AI governance. With the Digital Services Act (DSA) now in full effect in the EU, how can regulation protect democracy, fundamental rights, and well-being in the digital age?
The Policy Brief Collection on Digitalisation, featuring contributions from UM experts across various disciplines, examines the impact of the DSA on online platforms, AI technologies, and citizens. This collection brings together critical reflections on the opportunities and challenges of digitalisation, providing key policy recommendations for a fair and safe digital future.
Publication details and metadata
Title
SEM Policy Brief Collection: Digitalisation
Subtitle
EU Digital Services Act
Series
Studio Europa Maastricht Policy Brief Collection
Institution
Studio Europa Maastricht | Maastricht University
Editor
Dr Philippe Verduyn (ORCID) – Maastricht University (ROR)
Contributors
Prof. Sally Wyatt, Prof. Tsjalling Swierstra, Dr Katleen Gabriels, Emma Prebreza, Prof. Jan-Willem van Prooijen, Dr Jessica Alleva, Dr Visara Urovi, Dr Thomas Frissen, Dr Konstantia Zarkogianni, Prof. Dominik Mahr, Dr Jonas Heller, Dr Tim Hilken
DOI (digital version)
https://doi.org/10.26481/mup.rep.sem.2501
Copyright and licensing
© 2025 Studio Europa Maastricht | Maastricht University – CC BY
The content of this work is licensed under a Creative Commons BY 4.0 International License.
Publication Type and Language
Report – English – Version 1
Publication date
18 March 2025
Subject
digitalisation, policy brief
Keywords
EU regulation, Digital Services Act, tech regulation, digitalisation, disinformation, social media, online platforms, AI, policy
Citation for this work
Verduyn, Philippe (Ed.). (2025). SEM Policy Brief Collection: Digitalisation. Maastricht University Press. https://doi.org/10.26481/mup.rep.sem.2501
Table of contents
1. Introduction to the policy brief collection on digitalisation
- Page: 3
- Editor: Dr Philippe Verduyn (ORCID)
- Copyright: the author and Studio Europa Maastricht | Maastricht University
- License: CC BY 4.0 International
Verduyn, P. (2025). Introduction to the policy brief collection on digitalisation. In Verduyn, P. (Ed.), SEM Policy Brief Collection: Digitalisation (pp. 3-5). Maastricht University Press. https://umlib.nl/mup.rep.sem.2501.1
2. Helping (non-)users of the digital services in the Digital Services Act
Digital services are becoming increasingly central to EU member states’ economies and administrative functions. The Digital Services Act (DSA), which came into effect in early 2024, aims to hold large online platforms accountable for the content posted and shared with millions of residents and citizens within the EU. Its key provisions do not yet officially cover smaller providers. This means that providers operating primarily in smaller language communities are exempt from the provisions of the DSA, though they are recommended to follow its guidelines.
This has consequences not only for people as consumers of services but also for people as citizens. The DSA uses many terms to describe people, including recipient, consumer, person, child, citizen and user. What is striking about this list is the emphasis on people as individuals and on their relationships to private business. Apart from citizen and person, terms that capture people using digital services as patients, passengers and audiences are absent. These collective interests and public values must also be considered.
The interests of non-users of digital services should also be taken into account. People might not use online platforms for various reasons, including physical, cognitive and socioeconomic limitations. Non-use might also stem from fear of harm, especially for women who are subject to misogynistic abuse. Policy-makers need to take non-users’ needs and protections seriously.
The DSA is a long, complex document. The EU should provide summaries of its policy documents that are readable by large population segments. This is also important for ensuring democratic accountability and engagement.
- Page: 6
- Author: Prof. Sally Wyatt (ORCID)
- Copyright: the author and Studio Europa Maastricht | Maastricht University
- License: CC BY 4.0 International
Wyatt, S. (2025). Helping (non-)users of the digital services in the Digital Services Act. In Verduyn, P. (Ed.), SEM Policy Brief Collection: Digitalisation (pp. 6-10). Maastricht University Press. https://umlib.nl/mup.rep.sem.2501.2
3. The Digital Services Act: not enough to protect democracy against populism
An essential goal of the Digital Services Act is to mitigate democratic risks posed by large online platforms. It aims to curb manipulation and misinformation and to protect fundamental civic rights, such as freedom of expression, media freedom, pluralism and protection against discrimination. However, the DSA is insufficient to protect public deliberation – the heart of democracy – against populism. Populism is an anti-democratic political programme which is partly facilitated by digital media. The DSA is rightly hesitant to limit the freedom of speech for politicians, but this severely restricts what it can do. It is essential to acknowledge the limitations of the DSA in this respect to develop complementary policies to fight populism.
Digital media facilitate populism in at least three respects. First, by accelerating the production and consumption of news, New Social Media (NSM) make careful fact-checking impossible and feed forms of impatience that are hard to reconcile with democratic procedures. Second, NSM play into our psychological biases, favouring emotionally charged news over blander, more complex information. Third, NSM do away with the gatekeepers and editorial filters that guarantee civic norms of democratic deliberation. These three factors pave the road for populism.
The problem is that these democratic risks are intrinsically linked to features of NSM that are positively related to the democratic process. Acceleration allows the public to respond quickly, in real-time if need be, to problems and threats. Democratic activity/citizenship can also be enhanced by appealing to emotions; democracy should be passionate. The absence of gatekeepers allows otherwise marginalised voices direct access to the public agora. The so-called bubbles can enhance democracy by offering easy entrance points for people occupying marginal positions, who can then build their confidence and power in a safe environment.
- Page: 11
- Author: Prof. Tsjalling Swierstra (ORCID)
- Copyright: the author and Studio Europa Maastricht | Maastricht University
- License: CC BY 4.0 International
Swierstra, T. (2025). The Digital Services Act: not enough to protect democracy against populism. In Verduyn, P. (Ed.), SEM Policy Brief Collection: Digitalisation (pp. 11-17). Maastricht University Press. https://umlib.nl/mup.rep.sem.2501.3
4. How teenagers’ lifeworlds are shaped with snaps, streaks and social surveillance
With five million active users in the Netherlands, Snapchat is integral to the digital lifeworld of many people, including minors. This paper critically examines Snapchat’s design choices and their implications for underage users. Snap Inc. employs algorithm-driven content curation, live location tracking and gamified interactions to maximise user engagement. While effective at increasing activity, these strategies often prioritise engagement metrics over the well-being of young users who are particularly sensitive to social feedback, validation and rejection.
This paper’s focus on a platform heavily used by minors aligns with the objectives of the Digital Services Act (DSA), which aims to better protect minors in the EU. In addition, Snapchat falls under the Very Large Online Platforms (VLOPs) category of the DSA. Our paper argues that the responsibility of social media companies should extend beyond profit maximisation to include the well-being of their users, especially minors. In 2024, the European Commission acted against ByteDance (TikTok), Meta (Instagram and Facebook) and X Corp. (X, formerly Twitter) under the DSA for using so-called dark patterns: design techniques that mislead users and prompt certain behaviours. Our analysis indicates that the DSA may not yet offer sufficient protection, partly due to Snapchat’s lack of age-verification measures.
While not all young users are equally affected, raising awareness about these practices is crucial. This paper advocates for comprehensive education programmes to help young people navigate social media responsibly and critically. It recommends that policymakers enhance protections and urges parents and schools to guide young users proactively. Creating environments where young people can openly discuss their online experiences and learn to manage them effectively is essential for their benefit and well-being.
- Page: 18
- Authors: Dr Katleen Gabriels (ORCID), Emma Prebreza
- Copyright: the authors and Studio Europa Maastricht | Maastricht University
- License: CC BY 4.0 International
Gabriels, K. & Prebreza, E. (2025). How teenagers’ lifeworlds are shaped with snaps, streaks and social surveillance. In Verduyn, P. (Ed.), SEM Policy Brief Collection: Digitalisation (pp. 18-27). Maastricht University Press. https://umlib.nl/mup.rep.sem.2501.4
5. Reducing misinformation and conspiracy theories on social media
Policy-makers often focus on algorithms to reduce the spread of misinformation and conspiracy theories online. However, the main reason misinformation and conspiracy theories proliferate on social media is that human users decide to share them. One of the Digital Services Act’s main goals is to compel digital service providers, specifically Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), to enhance measures against online misinformation. To do so effectively, it is crucial to understand people’s motivations for being active on social media.
People often share false information to serve their identity needs and appease their in-group. For example, false information that disparages political opponents may gain so-called likes and other forms of social approval from like-minded others. Research suggests that believing and sharing misinformation is often not due to incompetence or the intention to mislead others purposefully; instead, it is due mainly to people’s attention being focused on social connections instead of accuracy. Shifting people’s focus on the possible accuracy or inaccuracy of information can reduce their belief in misinformation and their willingness to share it.
This policy brief reviews interventions that successfully shift people’s focus to accuracy. One intervention is warning labels that are presented simultaneously with misinformation. Such warning labels can be quite effective if implemented correctly. Moreover, interventions either before (prebunking) or after (debunking) encountering misinformation can be effective, although the effects of these interventions tend to be small and decrease over time. Even though emotions and identity needs are the primary reasons people believe misinformation and conspiracy theories, raising public awareness of possible inaccuracies and rationally refuting and correcting such false information makes a difference for many citizens.
- Page: 28
- Author: Prof. Jan-Willem van Prooijen (ORCID)
- Copyright: the author and Studio Europa Maastricht | Maastricht University
- License: CC BY 4.0 International
van Prooijen, J.-W. (2025). Reducing misinformation and conspiracy theories on social media. In Verduyn, P. (Ed.), SEM Policy Brief Collection: Digitalisation (pp. 28-34). Maastricht University Press. https://umlib.nl/mup.rep.sem.2501.5
6. Digital media and how we think and feel about our body: minimising the bad, maximising the good
The rapid digitalisation of society has transformed how individuals view the world and themselves. One area where this transformation is particularly clear is body image, with digital media playing a role in how people think and feel about their bodies and what they consider to be beautiful or not.
This policy brief first describes body image and how it relates to physical and mental health. Next, it explores the complex relationship between digital media and body image based on decades of research, including how and why digital media can negatively affect body image and how and why digital media can positively affect body image. This knowledge is used to create the Body Image Decision Tool for Digital Media, which stakeholders can apply to help determine the impact of digital content on body image. Next, applications to the Digital Services Act and additional considerations are outlined. This policy brief concludes with policy recommendations that will guide efforts towards minimising negative body image and optimising positive body image for a greater number of people.
- Page: 35
- Author: Dr Jessica Alleva (ORCID)
- Copyright: the author and Studio Europa Maastricht | Maastricht University
- License: CC BY 4.0 International
Alleva, J. (2025). Digital media and how we think and feel about our body: minimising the bad, maximising the good. In Verduyn, P. (Ed.), SEM Policy Brief Collection: Digitalisation (pp. 35-43). Maastricht University Press. https://umlib.nl/mup.rep.sem.2501.6
7. Transparency of personal health data sharing and the Digital Services Act
Personal health data boosts healthcare research, yet big corporations control much of it. This raises transparency and trust issues as customers have little or no information on how their data is collected, stored, and reused.
The Digital Services Act (DSA) aims to enhance digital service safety, accountability and transparency, particularly for large platforms. However, its application to digital health services requires further elaboration. Digital health services handle personal health data and require transparent practices to ensure privacy and trust. In this article, we explore new opportunities to shed light on these services and their use of personal health records via the DSA’s transparency reporting.
Concerns over digital health services include giant tech companies dominating the health data market, the potential for data breaches and the inability to leverage these data at a societal level. A standardised transparency report would support interoperability and improve user trust. It must clarify key customer questions on data usage, collection, processing and sharing. Users should be able to identify the use of artificial intelligence (AI) and how AI decisions may impact them, understand secondary data use details, and decide on secondary data sharing.
In conclusion, standardising transparency reporting can become essential for digital health services under the DSA to combat data monopolies, ensure informed user consent and support innovation.
- Page: 44
- Author: Dr Visara Urovi (ORCID)
- Copyright: the author and Studio Europa Maastricht | Maastricht University
- License: CC BY 4.0 International
Urovi, V. (2025). Transparency of personal health data sharing and the Digital Services Act. In Verduyn, P. (Ed.), SEM Policy Brief Collection: Digitalisation (pp. 44-53). Maastricht University Press. https://umlib.nl/mup.rep.sem.2501.7
8. Synthetic media and reality engineering: policy solutions for the EU
In this policy brief, I explore synthetic media: digital artefacts created entirely with generative artificial intelligence (GenAI). These can be visual, auditory, audiovisual or textual, such as deepfake videos or output from large language models (LLMs), such as ChatGPT. Due to the rapid developments in GenAI technology, it is becoming increasingly easy for anyone to engineer a ‘reality’ with synthetic media. Unlike traditional forms of forgery, synthetic media require no source and are algorithmically crafted. This makes them powerful tools for both creativity and deception.
Synthetic media are everywhere, from viral social media hoaxes to malicious deepfake campaigns. In 2024, fake images of celebrities at the Met Gala fooled millions, while realistic-sounding deepfake robocalls tried to disrupt primary elections in the United States. Such misuses fuel a growing social epistemic crisis, eroding trust in democratic processes by blurring the line between fact and fiction. The real threat lies not just in the convincing nature of synthetic media but in their rapid spread across digital platforms, particularly very large online platforms (VLOPs). Understanding these dynamics is essential to mitigating harm at both individual and societal levels.
This brief offers several policy recommendations to address these challenges under the EU Digital Services Act (DSA) and AI Act.
Key proposals are:
- Fortifying investments in digital forensics for early detection of harmful, deceptive media.
- Holding social media platforms accountable for enabling and amplifying synthetic media.
- Building public resilience through psychological inoculation strategies.
These policies address synthetic media, their enablers and their societal impact responsibly, without throwing the baby out with the bathwater. As such, the policy recommendations support safeguarding creativity and democratic integrity while fortifying trust and safety in the digital age.
- Page: 54
- Author: Dr Thomas Frissen (ORCID)
- Copyright: the author and Studio Europa Maastricht | Maastricht University
- License: CC BY 4.0 International
Frissen, T. (2025). Synthetic media and reality engineering: policy solutions for the EU. In Verduyn, P. (Ed.), SEM Policy Brief Collection: Digitalisation (pp. 54-61). Maastricht University Press. https://umlib.nl/mup.rep.sem.2501.8
9. The AI risks in very large online platforms and search engines
This policy brief focuses on the integration of AI technologies in Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), which serve more than 45 million users monthly under the Digital Services Act (DSA). It highlights the key AI technologies, including recommender systems, information retrieval systems, and generative AI, which shape user experiences and impact the dissemination of information. Recommender systems leverage user data to deliver personalised content, while information retrieval systems prioritise relevant documents in response to user queries. Generative AI, a transformative technology, enriches digital content but introduces risks such as hallucinations and the proliferation of misinformation, including deepfakes. Despite the benefits of AI integration, VLOPs and VLOSEs face significant systemic risks. These include privacy and security vulnerabilities, threats to user autonomy, dissemination of harmful content and the addictive nature of these platforms.
The brief discusses the growing concerns about AI-driven rabbit holes that steer users toward extreme content and the potential for generative AI to spread disinformation. Addressing these risks requires a human-centred approach to AI regulation, emphasising ethical design, transparency, user control and human oversight. The policy brief calls for improved training of AI systems with diverse, high-quality data, collaboration among stakeholders, and implementation of explainability techniques to support user control and human oversight. It underscores the need for ongoing policy development to harmonise the DSA with the AI Act, ensuring that platforms and AI systems operate within a legal framework that protects individuals and society while enabling technological innovation.
- Page: 62
- Author: Dr Konstantia Zarkogianni (ORCID)
- Copyright: the author and Studio Europa Maastricht | Maastricht University
- License: CC BY 4.0 International
Zarkogianni, K. (2025). The AI risks in very large online platforms and search engines. In Verduyn, P. (Ed.), SEM Policy Brief Collection: Digitalisation (pp. 62-68). Maastricht University Press. https://umlib.nl/mup.rep.sem.2501.9
10. Immersion and regulation: extended reality technologies, their impact on innovation and policy recommendations
Extended Reality (XR) technologies, encompassing Augmented Reality (AR) and Virtual Reality (VR), are poised to revolutionise digital interactions across various sectors, from retail and education to entertainment and healthcare. As these immersive technologies rapidly evolve, they present unprecedented opportunities and novel challenges for citizens, businesses and policy-makers. This policy brief examines the current landscape of XR technologies, their potential impacts on citizens and society and the regulatory implications surrounding their development and implementation.
XR offers significant benefits, including enhanced access to services, improved learning experiences and new forms of creative expression. However, it also raises concerns about privacy, data protection and potential adverse psychological effects, such as addiction and difficulties distinguishing between virtual and authentic experiences. The Digital Services Act (DSA) and Digital Markets Act (DMA) provide a regulatory framework that both supports and potentially hinders XR innovation.
The DSA and DMA are expected to have a mixed but positive long-term impact on XR innovation. While compliance efforts may slow down innovation for smaller XR developers due to the complexity of content moderation and data protection, fair competition, enhanced transparency, and interoperability are likely to foster innovation, increase user trust and attract more users over time, though challenges around real-time moderation and achieving true interoperability remain.
- Page: 69
- Authors: Prof. Dominik Mahr (ORCID), Dr Jonas Heller (ORCID), Dr Tim Hilken (ORCID)
- Copyright: the authors and Studio Europa Maastricht | Maastricht University
- License: CC BY 4.0 International
Mahr, D., Heller, J. & Hilken, T. (2025). Immersion and regulation: extended reality technologies, their impact on innovation and policy recommendations. In Verduyn, P. (Ed.), SEM Policy Brief Collection: Digitalisation (pp. 69-76). Maastricht University Press. https://umlib.nl/mup.rep.sem.2501.10