
The Digital Services Act is a Lightning Rod for Debate

Ramsha Jahangir / Feb 15, 2026

The Tech Policy Press podcast is available via your favorite podcast service.

This week marks the second DSA and Platform Regulation conference in Amsterdam, where participants will revisit the Digital Services Act (DSA) two years after it entered full effect across the European Union. Over that period, the law has been tested by national elections, geopolitical tensions, high-profile enforcement actions, and the rapid rise of generative AI. It has become both a benchmark for platform accountability and a political lightning rod.

I spoke with members of the DSA Observatory, which is organizing the conference, to take stock. What have these first years of enforcement clarified? Where does opacity remain? And what does it mean to conduct DSA research in today’s political climate? Guests include:

  • John Albert, Associate Researcher, DSA Observatory
  • Paddy Leerssen, Postdoctoral Researcher, University of Amsterdam and DSA Observatory
  • Magdalena Jozwiak, Associate Researcher, DSA Observatory

What follows is a lightly edited transcript of the discussion.

Ramsha Jahangir:

Two years ago, right after the Digital Services Act entered full effect, nearly 200 researchers, civil society members, regulators, and legal experts gathered in Amsterdam for the DSA and platform regulation conference. That convening captured a moment of anticipation mixed with uncertainty. We assembled to ask what the DSA might become. Today, we return to ask what it has become, and at what cost. The past two years have tested the DSA in ways both anticipated and unforeseen. The European Commission has launched over a dozen investigations. Digital services coordinators have begun enforcement for non-VLOPs, and two rounds of risk assessment and audit cycles are complete.

We have watched the DSA navigate elections, geopolitical tensions, and the rapid emergence of generative AI. We have also watched it become a political lightning rod, criticized as censorship by some and as toothless regulation by others. But this conference is fundamentally about the people in this room: the researchers, advocates and civil society members who are not merely observing the DSA's implementation, but actively shaping our understanding of it. This community has since grown substantially, but so have the risks. Researchers now face legal threats, including, most recently, visa sanctions against some civil society groups, as well as barriers to data access.

So, ahead of the conference, I'm joined today by the organizers at the DSA Observatory to understand: what do these initial years of DSA enforcement actually look like? What have we learned, and what's still a black box? And most importantly, what does it cost to do this work right now, in this political climate?

John Albert:

My name's John Albert. I'm an associate researcher at the University of Amsterdam Institute for Information Law, and I work on a project called the DSA Observatory.

Paddy Leerssen:

Hi, my name is Paddy Leerssen. I'm a postdoc at the University of Amsterdam. I also work at the DSA Observatory, interested in all things DSA and social media regulation.

Magdalena Jozwiak:

My name is Magdalena Jozwiak, and I also work at the DSA Observatory as an associate researcher, focusing mainly on the issue of systemic risks.

Ramsha Jahangir:

Thank you so much for joining, guys. The last time we talked DSA at this length was at the conference two years ago. As far as I remember, the inaugural DSA conference brought together nearly 40 paper presentations over two days. So, I'm curious how DSA research has evolved since then, and more specifically, when you were looking at conference submissions, where is research heavily concentrated, and what critical areas remain under-researched?

John Albert:

So, I wasn't actually at the first conference myself, but my understanding, in terms of the field expanding, is that we've got something like double the number of submissions. So, the field has definitely grown in a couple of years. There's a lot of interest in the DSA, and a lot of academics coming up for whom this is a subject that wasn't available to them before. Now it's sort of a blue ocean. There was definitely a lot of interest in the DSA's risk-based approach. I'd say the majority of the submissions we got were looking at this. It is a very broad, multifaceted topic, also covering specific areas, whether it's the potential for election influence or manipulation, or systemic risks to minors.

And then we had also a range of other subjects like out-of-court dispute settlement. There was quite a lot of interest, deceptive design as well as how the DSA interacts with other regulations in areas like AI, digital markets. So, it's really quite broad in terms of the types of submissions that we got. And in the end, we accepted a hundred papers. So, we were able to really grow the conference over the two days and the number of papers we're able to showcase.

Paddy Leerssen:

And one thing that's really exciting, two years down the line, one thing that's really different about the research, is that we have decisions, we have events, we have legal developments to reflect on. That's the case for systemic risks, where we first basically just had the law itself, with its, as we know, rather unspecific concept of risks, and very abstract discussions about what that could mean. And now we have a wealth, you could say, of all these different platform reports, Commission decisions, et cetera. Now, whether we're actually any closer to having a definition of systemic risks, that's something we can continue to disagree about.

And this is true not just for systemic risks, but for basically any aspect of the DSA. Take ODS bodies or trusted flaggers: back then, we didn't have any, and it wasn't even really clear if anyone was willing or prepared to take on that job. And we've seen that, absolutely, yes, there are. We have quite a lot now, more than I would've predicted then in any case. And we're seeing private litigation. So, there's just a lot more empirical substance to discuss, and that's really what I'm looking forward to doing.

Ramsha Jahangir:

Absolutely. As you mentioned, we've seen some investigations underway. But when I look at my notes from the last conference, there were a lot of early assumptions about how slow or fast these investigations would be. As you already pointed out on systemic risks, there was a lot of conversation about those being vague and not clearly defined in the DSA. And then also the role of risk assessments and audits: whether they would really deliver something concrete or not.

So, Magda, I would actually come to you, because the last podcast we had was right around when the first cycle of risk assessments and audits was published, and we talked about this question then as well. What do we know about the most prominent systemic risks, and do you think we know enough?

Magdalena Jozwiak:

I wouldn't say we know enough. The concept is still quite vague I think. What we do have are decisions from the commission, both decisions to open investigations that are shedding some light on what commission considers to be understanding of systemic risk that is accurate under the DSA. And then we have some decisions that are actually about systemic risk.

So, the recent decision in the TikTok case, where the Commission made this kind of finding for the first time. It is only a first decision, and we don't know yet how the investigation will fully end, but we already have a preliminary opinion about systemic risks in the context of infinite scroll on TikTok, how it's addictive for users, especially minors and vulnerable users. That's very substantive, and it's a very broad understanding of how to interpret the concept of risk, which is quite encouraging, I would say. When we talked last time, there was real doubt about whether the Commission would be bold and thoroughly enforce the risk provisions.

And I think we have found out now that the Commission is willing to take on this role as an enforcer that is quite thorough. So, that is quite a positive development. Also looking at decisions to open investigations: we have a similar investigation now pending against Meta that also concerns rabbit holes and this kind of addictive design. So, here we go. We already know that TikTok is first in line, but we also have Meta, which has similar services, and in both cases this is the kind of fundamental design that affects the business model of the platforms. I'm not sure we would have thought before that this was actually going to go this way.

Also, in the pending investigation against X, we see that the Commission was willing to address the design and functioning of X's services in the context of the right to dignity, which was very surprising to me, because even in human rights research, the right to dignity is always this kind of vague concept that fills in for a value-laden assessment. It's rarely ever used in such a precise manner. But the Commission was willing to actually base the opening of one of the proceedings on the right to dignity, in the context of how X was not consulting enough with minority groups, users and society.

So, that is quite a good path that the Commission has taken, and in that respect I'm quite positive, although the reports remain vague. The first round of reports didn't shed much light on the concepts, and similarly the second round, but that was in a way to be expected, and I think we have to give it a little more time.

John Albert:

Yeah, I think, I mean it's really good points to make Magda, and just to build on that and to your question, do we know more about systemic risks I think was the question though, and something we definitely know more about is how platforms are managing systemic risks. So, whether this gets to the question of how risky is it, what's the extent of it? At least we get some insight through these reports, the systemic risk reports and audit reports and so forth, how they're managing them.

And it's interesting if you look at the audits at least, which have a very procedural kind of approach to this, a lot of the auditors find that when it comes to systemic risk management platforms by and large don't have a very coherent approach, let's say, to tackling that they're not necessarily tying their risk mitigations to the risks that they identified in a coherent way. At the same time, auditors are finding platforms by and large to be compliant as well. So, there isn't a good definition of what compliance means in this context. And so, auditors are erring sort of on the side of, what's the word? The side of caution. There we go.

Ramsha Jahangir:

Yeah. And this actually goes back to the discussion at the last conference. This is what we feared with the audits: the way they're designed, some of the weakness in them, but also the flexibility, really leaves a lot of room and gives a lot of power to auditors to decide what's good or not. I also want to touch on national and private enforcement, because that's a part of the DSA that I don't think gets discussed enough. How is that shaping up, and what are some of the pressing challenges?

Paddy Leerssen:

Well, I think private enforcement is one of those topics that wasn't really much on the radar at the last conference, but it has become, I think, quite significant. At a certain point, maybe a year ago, people were concerned that the Commission wasn't being as proactive in its enforcement. There was a lot of speculation about it: were they caving to US pressure? And like Magda said, I think recent developments have shown that that isn't necessarily the case, right?

The Commission does seem to be taking its task seriously and proactively enforcing the DSA, but a public regulator like that can only do so much, especially on something as big and complex as the DSA. Private enforcement, in all these various forms, can potentially fill some gaps, right? Whether that's users or other interested parties enforcing their own rights, or a CSO, or maybe even something like a class action, things like that are being discussed. And we have had some very interesting developments in that space. To my knowledge, there have now been cases in the Netherlands and in Germany, but I think there's more work to be done.

And that's something we will actually want to do at the DSA Observatory in monitoring that across Europe. So, for instance, we've had litigation from Democracy reporting international versus X regarding their access to data and the right to scrape. Well, in any case, whether that's through APIs or to scraping to study that platform. In the Netherlands, we've had Danny Mekich who's a legal expert representing himself contesting shadow-bans under the DSA instance, bits of freedom in a more recent case, which is still ongoing, litigating against Meta regarding recommender system design.

So, I could go on, but these are, I think, interesting complements to the public enforcement framework, which are very exciting to follow. And one thing that may not be on everyone's radar as much is how, let's say, commercial parties might also be engaging with the DSA, right? Generally, I would say the whole e-commerce aspect of the DSA, things that tend to be very significant in private practice, like Google Maps reviews: what is the DSA doing there, is there litigation happening? That's something we don't necessarily have a good view on, and it requires more research. So, that's on private enforcement.

You also asked about national-level enforcement. For those who don't know, the European Commission is primarily tasked with supervision of systemic risks and the other obligations for the large platforms with more than 45 million users, while the national regulators regulate the smaller platforms and have a supplementary, secondary competence for the large platforms based in their jurisdiction. So, it's less of a central task than the Commission's, but there are all sorts of smaller services that do seem significant, whether it's a Discord or hosting infrastructure. There are lots of significant issues there.

How that is developing is another area where I think we can and should learn more as academics. It's also very difficult, given the language barrier, to get a good sense of what's going on. But what I will note is that, very recently, we're seeing national governments take on the platform regulation issue again. I think that's quite a significant development, right? You've probably seen the announcement from Pedro Sanchez's government in Spain, where they are developing ideas about restrictions for minors, which were pioneered in Australia but are being taken up in Spain, also in France, in Greece.

And many other countries in Europe are proposing similar things, as well as what seem like revived ideas about intermediary liability. That instantly raises questions: how does that interact with the DSA? Are they not satisfied with the DSA? Are they trying to reassert competence at the national level? And I'll just mention also France raiding the offices of X. So, we're seeing, I think, signals or developments at the national level pushing against the DSA framework. There's a tension there, which I expect is going to be a significant topic of debate in the coming year.

Ramsha Jahangir:

Yeah, I want to come to the geopolitical side of it. Of course, this podcast would be incomplete without that. But before that, something you pointed out: the tension between national versus Commission-level enforcement will be key. As we've seen in Poland's case, for instance, the president basically vetoed a bill that was supposed to help with national enforcement of the DSA. And what I've been curious about as well is whether, within the Parliament, in the Council, and within national member states, the DSA is as popular as it is at the Commission level. As outsiders looking at it, are those conversations happening within member states as well?

Magdalena Jozwiak:

I'm just maybe want to say a couple words being Polish. The Polish case, that is clearly a political decision by the president. So, this bill that was implementing DSA was non-negotiated between NGOs in Poland, Panoptykon, specifically with the government. And initially, indeed, there were some concerns about the possibility of censorship because actually, it was DSA plus a law that not only implemented DSA but extra provisions, but eventually the version that was submitted for the signature of the president was well protected against any forms of censorship. It was actually quite a good compromise that was negotiated. And then veto was very political decision.

We have cohabitation in Poland, so censorship is quite a political topic now. But apart from that, what came to my mind was that I recently read statistics showing there is a lot of support among citizens for the Commission pushing back against big platforms at the national level. There was a poll taken in Poland, France, Italy and Spain indicating high popular support for the Commission enforcing the DSA. So, I think that is also something governments have to take into account.

So, in general, it seems to me that the tides are shifting, in the sense that the DSA is also a tool pushing back against certain geopolitical pressures from across the ocean, and people are picking it up and supporting it.

Ramsha Jahangir:

The other thing is the protection of minors, right? With the Grok scandal that we've seen this time, there are a lot of calls for a ban on X and other platforms, which again is in tension with what the DSA offers. So, do you have thoughts on that? With these calls becoming louder and louder at the national level, do we see support for the DSA going away in the future?

John Albert:

I will first just say something kind of boring and procedural, because that's the frame in which I've been focusing on this, looking at all of these reports: the indication we have is that when it comes to the basic process side of compliance, X hasn't been doing a great job. Part of this is that it seems they didn't do a proper risk assessment prior to unveiling Grok. Also, if you look at their first audit report, they were the only company to get a negative opinion, despite how risk-averse the auditors might have been in that instance. So, it seems as though they were failing at the basics of compliance there.

So, that's the procedural hook, let's say, that enforcement might look at when it comes to X. But getting to the deeper issue of the actual harm, I think this is maybe a more interesting question as well.

Paddy Leerssen:

The question being whether there's a demand for a ban on X in Europe. Well, I'm not a social scientist, so I can't speak to what the average person might think of something like this. And not to speak on the Commission's behalf, but my guess would be that they wouldn't be at all eager to pursue it. The room for that under the DSA itself is not broad; it's really the last possible resort. And there are also fundamental rights to take into account: the bar for blocking an entire website, especially a large social media platform, is very high in fundamental rights terms, right?

There's also the logic that many regulators would have that actually when you're issuing fines you failed, which we can disagree on sometimes actually having timely fines and certain issues can be very strategic. But broadly speaking, I think what the regulator wants is to reach commitments, reach compliance, and this would be actually acknowledging that you can't realize that kind of compliance and in that sense admitting failure. So, that's how I think most regulators would view such a radical measure like banning an entire social media platform.

Ramsha Jahangir:

We'll see. We'll find out.

Paddy Leerssen:

Yeah. And maybe it's just some final points. I think that's one way of interpreting these developments at national level, what we're seeing from the Sanchez government in Spain, et cetera, where there is clearly a lot of dissatisfaction with how this website is working. And I think from what I've seen that the attempts to delegitimize or undermine the Europe's regulatory framework, even its sovereignty you might say, that are coming from the US government and from their allies like Elon Musk are not well-received in Europe. And with all that dissatisfaction, these are signals I think from the national government said something needs to happen.

Ramsha Jahangir:

And you've already mentioned US and Elon Musk's allies, which brings me the question of how is all this political pressure affecting the researchers working in this space? That's something I've been thinking about a lot. In the last conference, David Kaye had mentioned this concern that there was this fear that the DSA's built-in review process could be weaponized by political actors who could frame the law as a failed experiment. And now we've seen this play out in reality, including through the most recent report by the Republican-led US House Judiciary report, which has explicitly called out some of the groups working in this space, whether civil society or researchers.

So, how is this affecting the field? And also, going back to organizing this conference, do you see people voicing those risks? How is this shaping research?

Paddy Leerssen:

Well, I think this pressure you're describing from the US is really a tragic development, and counterproductive in fact. One of the main things that civil society and researchers were trying to do in the first edition of the conference, one of the main questions being directed at the Commission people there, was: how are you going to preserve free speech? What safeguards are you going to put in place? And I and others have been arguing that something like a trusted flagger requires accountability as well, right? And these are the kinds of things the Commission is also saying: the DSA is designed to protect free speech. Before, you had no protections against the platform.

And that's the discourse we had. And in fact, the way that now US Congress is coming in with their own version of a free speech criticism, which is totally overstated and lacks any kind of nuance or any kind of good faith interpretation that's having the opposite of the intended effect. Now, nobody is really interested in criticizing the DSA for fear of lending legitimacy to these totally propagandistic exaggerated narratives. So, you get kind a rally around the flag of effect. Yes, let's have a conversation about free speech and the DSA, we've been having it for many years, but not in these terms. Let's be realistic about it.

I think the effect that it's been having this criticism. Rhe other risk, of course, is that there might even be a chilling effect on researchers. The criticism come from the US is in such aggressive terms, the things also like the fees are banned for civil society are clearly intended to intimidate as well. I will say that the brunt of that is being borne by civil society groups more so than academic researchers, let's say. Whether it's bits of freedom being called out the Netherlands for attending a session on election safety or whether it's trusted factors in Germany being kind of witch hunted, they are really bearing the brunt of that.

I hope that it doesn't extend to academics anytime soon, but I think David Kaye's words continues to be wise today. We should be prepared for that eventuality.

Ramsha Jahangir:

Are people more interested in studying the way geopolitics is affecting platform regulation in general? Based on some of the proposals you've received, are you seeing that interest go up as well?

Magdalena Jozwiak:

Oh, it's hard to say if it's going up. But we do have two panels: one focused on the DSA beyond Europe and one focused on geopolitical developments. So, clearly there is some interest. We had enough good papers to form two panels that actually extend the DSA beyond just its provisions and application within the EU.

John Albert:

Yeah, we talked about the Brussels effect before, and now we talk about the Trump effect.

Magdalena Jozwiak:

That's true. We actually have a presentation on that.

Paddy Leerssen:

Yeah. And I think a related strain, which is also very much dominant in the US right now, is this idea of tech oligarchy, a term which Julie Cohen, a very influential law professor, has been writing about, grappling with this really new kind of political and economic structure, which is maybe originating in the US but definitely has its spillovers across the world, and into tech regulation in Europe as well.

John Albert:

We buried the lede, which is that Paddy will be giving a keynote at the conference. So, we'll get to hear more about this.

Ramsha Jahangir:

More on this. So, looking ahead, what questions feel more urgent this year than the last time? And also what blind spots remain in DSA implementation? So, maybe each of you could go.

John Albert:

I mean, I would say, just building on this conversation, the political environment is the elephant in the room, so to speak, but it's also being addressed quite explicitly in the conference. In terms of what's being under-addressed: seeing all of these submissions come in, I was a bit surprised that there wasn't more research looking at the porn platforms. That could just be a normal or expected thing given the subject; if you're an academic starting out, you might not want to dive headlong into it. But nevertheless, now that we have three designated very large online porn platforms, you would think there's a lot to look at there.

There was also relatively less research looking at marketplaces. Again, it could have something to do with the way we framed the call for papers, where the orientation is really around issues that concern speech more than other subjects. But definitely there's a lot covered by the DSA to talk about beyond that.

Magdalena Jozwiak:

Maybe just building on that: I thought that what is perhaps a blind spot is research on gender-based violence, which is one of the categories of systemic risks. And well, it's not that it isn't researched at all. The CDT had a very good report on that in connection with porn platforms. But I do think there is room for some kind of cross-cutting research looking at non-discrimination and fundamental rights, and this kind of illegal content, in the context of the new directive coming up in 2027 criminalizing some forms of gender-based violence that are facilitated by technology and making them illegal forms of expression.

So, I think there is a lot there, and it's a big topic that perhaps also requires a multidisciplinary approach. There is a lot to do in this field. We do have some presentations connected to this topic at the conference, but I have a feeling there is much more to talk about still.

Ramsha Jahangir:

One thing I think of when you share this: is it also because the Commission has sort of neglected these systemic risks?

Magdalena Jozwiak:

Right. The Commission has focused a lot on minors, and there were guidelines and there's so much happening in that space. But in the Grok proceedings, gender-based violence was expressly mentioned as a form of systemic risk that the Commission is going to look at in this context. So, maybe this has to do with how the Commission prioritizes its work. But apart from the fact that the Commission has certain priorities, I think for researchers it's simply fascinating to be able to look into platforms and how they are approaching this issue.

So, we have the reports already; that's empirical material we can look into to see what platforms are doing in that field. So, yeah, I think that would be a great entry point for any researcher thinking about looking at the DSA.

Ramsha Jahangir:

All right, I think we've covered most things and hopefully, the conference discussions will spark more ideas and research in these directions. One thing also to mention is that in the coming weeks right around the conference, TechPolicy press is very pleased to partner with the DSA Observatory and we will be publishing a series of articles focusing on some of these large themes under DSA implementation that we've discussed today. So, keep an eye out and those of you who will be at the conference, I'll hope to see you there and have a good day and thank you all.

Paddy Leerssen:

Thank you.

Magdalena Jozwiak:

Thank you.

John Albert:

Thank you.

Authors

Ramsha Jahangir
Ramsha Jahangir is a Senior Editor at Tech Policy Press. Previously, she led Policy and Communications at the Global Network Initiative (GNI), which she now occasionally represents as a Senior Fellow on a range of issues related to human rights and tech policy. As an award-winning journalist and Tec...
