Ivan Sigal 0:00 for Canada, and we recently published a paper on social media interoperability here in Canada, to be presented in Ottawa, I think, at an event next month. 0:10 And also other types of digital sovereignty work. 0:12 So there are other policy folks here, so if that's your jam, come find me, come find us. Speaker B 2:04 Good. Ivan Sigal 2:04 And Ivan, take it away. Speaker C 2:09 Good afternoon, everybody. 2:11 How are you all doing today? 2:12 How are you doing? 2:14 A little— not too tired? 2:15 No food coma? 2:16 No? 2:19 So today I'm going to talk about— actually, my notes are not— sorry, one moment, please. 2:38 OK, apologies, technical glitch. 2:43 Today I'm going to talk a little bit about— maybe we could call it the elephant in the room, I don't know. 2:51 We're going to talk about authoritarianism, and about the fact that openness as a concept— and the challenges that the AT Protocol was built to contest— wasn't necessarily designed with the idea that the country in which it was built might itself become an adversary. 3:17 For context, I spent most of my career working in countries where people did not assume that their governments were supportive of their rights and freedoms, 3:30 and do not assume that a state is a benevolent actor in the service of their interests. 3:38 And I think everybody knows very well the Silicon Valley vision and version of the history of the internet. 3:46 It's worth noting that many people never experienced that version of the internet, 3:53 and that the idea of the internet as a space for political contestation emerged coterminous with the idea of the internet itself. 4:02 So TamilNet, the Tamil nationalist organizing newspaper and website, launched in 1997, the same year that the New York Times went online.
4:16 Chiapas revolutionaries were using Tripod, an early hosting service, in the early '90s to organize, as were Malaysian independence activists. 4:25 So we have a whole alternative history of the use of the internet for political activism and resistance. 4:37 Today, something like 25 countries out of 185 are full democracies, 4:46 and only 75 or so have democratic forms at all. 4:51 So most people in the world have a very different relationship with their governments, a different understanding of what regulatory authority can do, and a different understanding of what openness means in the context of their societies. 5:08 And I want to use that as a basis for trying to establish some basic terms about what we mean by authoritarianism. 5:19 Digital authoritarianism is a term that's been in vogue for the past 5 or 6 years, and it describes, very basically, the use of technology to advance repressive political interests. 5:29 A slightly older and, I think, more apt term is networked authoritarianism, which emphasizes the idea that people can be co-opted into supporting authoritarian power by participating in information systems. 5:41 Apologies if a lot of this is a bit dense— and maybe you all know this already— but I feel it's helpful to establish some grounding terminology and definitions. 5:57 In this space, the epistemic challenges— the idea that our information spaces are broken and fragmented, as we heard so eloquently this morning— are not solely an artifact of our technological affordances and underlying financial incentives. 6:16 In many cases, this is the result of intentional efforts by people with power to shape narrative, to persuade and influence, and to simultaneously narrow spaces for expression and political action through regulatory and technological approaches. 6:32 So it isn't happening organically as a result of market artifacts.
6:36 That's also important, but it's not only that. 6:39 And in defining the threat, we can also define the fields of contestation. 6:47 And I'm not going to read this to you, you can read it yourself. 6:58 But in general, it's useful to give authoritarianism a tight and narrow definition so that we can talk about it. 7:02 There are many definitions of authoritarianism. 7:04 This is the one that I'm working with, and I find it useful because it allows us to talk about authoritarian practices and not just authoritarian states. 7:17 So you can have authoritarianism as a practice that exists within a non-authoritarian state— a state that is technically categorized by political scientists as not authoritarian. 7:28 And things like appropriate citizen oversight, or the use of accountability, transparency, and legitimacy in the application of technology, are key to this definition. 7:41 So even in countries that are formally democratic, you might find evidence of authoritarian practices that significantly affect our ability to express ourselves, mobilize, or build movements. 8:00 And many of us in this room, I'm guessing, also live in countries where there is the potential or actual presence of democratic backsliding. 8:10 That's a term that I hope everybody understands, but basically it means when democratic governments are suddenly no longer democratic. 8:19 And what we're seeing in our geopolitical space is potentially the end of the post-World War II liberal era, and the emergence of a transactional geopolitics that threatens to undermine a lot of the basic premises for what we mean by openness in the international order. 8:42 And that idea of openness in the international order is twinned to the idea of openness in information spaces.
8:50 So when we, building open decentralized technologies, use language to define what we mean by openness, we are also inherently talking about this other history. 9:04 And if that history is now being challenged, what does it mean for us in this room? 9:11 More specifically, on the left, this is what authoritarianism does to us. 9:18 Pluralistic ignorance— meaning I don't know what you're thinking, you don't know what I'm thinking, and we have no idea that we might have some ideas in common— social isolation, and epistemic vertigo, which means that we have no basis for common knowledge. 9:38 These are the challenges that our information spaces are creating for us. 9:42 And again, we heard a lot about that this morning, so I don't necessarily feel a need to repeat what was said. 9:49 And then our goals on the right— and this is a set of normative claims; I'm for these things, so I'm not going to pretend that I'm not— shared reality, collective action, information integrity, values that support more democratic participation, more shared power. 10:07 That's what's at stake. 10:15 So, I am the executive chair of the newly formed Modal Foundation, which is the legal entity behind EuroSky. 10:26 And I've been in this role and working in this space for about 8 months. 10:31 Before that, I spent a huge chunk of my life working in countries and contexts that were politically contested. 10:40 And I spent the last decade working on research into information ecosystems. 10:45 The last big project I did was called the Unfreedom Monitor, a 20-country study of the relationship between technology, regulatory structures, and narrative power. 10:59 And the essence of that idea is that regulatory and technical systems are being used to narrow the space for expression and political participation.
11:12 And the authorities in many countries are using that resultantly narrowed space to amplify a set of narratives that try to convince people to give up their rights and freedoms in exchange for their safety. 11:29 And I found that to be true in many different political contexts in countries around the world. 11:34 And this was a participatory research project. 11:36 About 30 of us worked on it over a 2-year period. 11:40 Very briefly, looking at this topic, authoritarian strategies involve coercion, cooptation, and persuasion. 11:49 So I'm not going to define them too carefully. 11:52 I think you understand what they mean. 11:53 Fields of contestation, I already mentioned. 11:55 And then the primary actors: 11:57 states, corporations, and publics. 11:59 And we found in this research that countries have become very adept at remixing these different fields in order to create fields of repression. 12:16 And it's very hard to contest. 12:18 So I'm gonna give one simple example that many of you are probably aware of. 12:23 That's the Twitter Files. 12:24 Everybody know what the Twitter Files was? 12:26 Anyone not know? 12:26 The Twitter Files was an information operation by a group of— I'll call them political operatives or information operatives— who claimed that the Biden administration in the United States was censoring expression around vaccine communications and trying to force the social media companies to remove anti-vax narratives and misinformation and disinformation. 13:00 It was untrue, but it crossed all of these fields in its architecture, and it was very, very successful as an information operation. 13:08 It targeted information integrity researchers, and it destroyed the field of information research in the United States as a result.
13:18 These were the people who were exposing truths on the platforms, studying the platforms, and identifying systemic problems on the platforms. 13:26 They did this by creating a false narrative of state censorship— 13:32 that's narrative one— by having a political ally take over Twitter and change the algorithm to change what we could see, by spinning up a false congressional investigation, and by suing the individual researchers who were involved in doing the research in the first place. 13:50 And finally, by pressuring the institutions and funders of this research with threats of even greater losses if they didn't defund this work. 13:59 And that was both state entities and private entities that were funding this type of research. 14:06 And the result is a decimated field of analysis, less knowledge, more social isolation, and more epistemic vertigo. 14:19 Very briefly, this is the Unfreedom Monitor project. 14:23 I mentioned it already, so I'm just going to very briefly talk through some themes in terms of understanding how we think about the system. 14:34 4 big categories of information space: access, data governance, information, and speech. 14:42 These are all contested areas where the battle over rights and freedoms in information spaces is happening. 14:53 Social controls that governments and states are applying through those fields: freedom restrictions, information manipulation, internet control, surveillance, system attacks, and technology controls. 15:08 I can come back to these later if you want me to. 15:09 I'm going to get on to open social protocols fast, so I'm just going to zip through these so you have a sense— this is published documentation, and I can direct you to it if you want to read it later. 15:21 And lastly, the types of concrete harms that people experience.
15:27 Intimidation, judicial, physical, and technical— these are broad, extensible categories of the things that happen to individuals or groups as a result of trying to express themselves or participate in civic life. 15:46 OK, open social as a field of contestation. 15:53 I have 2 more slides only, and then we'll hopefully have about 10 minutes to talk. 16:01 Open social technologies like ATproto and everything that we're doing here today have the potential to offer a structural response to the primary levers of control that we're experiencing. 16:13 So for coercion, resistance. 16:16 For cooption, exit. 16:18 For persuasion, algorithmic agency. 16:21 And we have a bunch of potential tools available to us that the kinds of technologies we're building enable— that allow us or help us to act. 16:31 And these are here: shared truths and public data. 16:36 That's a core backbone of what it means to build open public information as a ballast or counterweight to attempts to create isolation. 16:49 Information and identity provenance. 16:51 Data ownership; decentralization as a mechanism for resilience— economic, technological, and informational resilience; cooperative models that allow us to share resources and drive costs down; and the idea that we can actually use different states and their sovereignty as strategic depth for resistance. 17:12 So we have the option of other jurisdictions to support actions. 17:18 There are other affordances and other things that could be built. 17:21 And that's kind of gonna be the question in the room. 17:26 The counterweight is: what does this kind of exposure— reliance on an open system— expose us to? 17:34 One is that provenance becomes evidence. 17:37 Immutable provenance of information and identity also creates risk. 17:43 Public data and context remix.
17:45 So once you put something out into a public network, somebody else can take it and put it somewhere else, out of your control. 17:52 And that's a feature in our system when we have a guarantee of a non-authoritarian state that is actually available to adjudicate your rights in a way that is legitimate and equitable. 18:10 And when that's not true, it becomes a serious risk. 18:16 We've already seen that states are quite adept at identifying vulnerable targets within networks of activists— within networks of people who express themselves or use open technologies— as a way of breaking collective action: picking off vulnerable individuals or groups, making examples of them in ways that silence or suppress or create fear that suffuses civic networks. 18:40 And resource scarcity, because a lot of us are enthusiasts, a lot of us are doing this out of passion, and you can do that until you're hit by a SLAPP lawsuit, or by any kind of serious legal counterweight that can absorb years of your life and all of your savings. 18:59 That's commonly called lawfare. 19:02 Narrative power and market pressure— those are the forces that are often arrayed against open networks. 19:08 That's the end of what I have to say. 19:11 I hope that will help us think about this moment and create a frame for it. 19:18 And I'd be super happy to hear what others have to say about this. 19:22 We have about 5 minutes. 19:24 And also, I will say, this is a publicly streamed conversation, so please don't share anything sensitive. 19:30 Very happy to talk about this. 19:32 Anything that shouldn't be public, offline. 19:36 Thank you. 19:49 I see one hand over here. 19:52 Oh, right there first. Speaker D 19:54 Yeah, very interesting. 19:55 Thanks for structuring all this. 19:59 These tools that you mentioned— could you connect them up with Bluesky or the ATproto system?
20:05 Which of these are available already? 20:08 What needs to be done in the ecosystem to foster some of these other things? 20:12 So if you could just try to bridge that gap, I would appreciate it. Speaker C 20:17 You mean this one, the Tools for Resistance one? 20:20 Yeah. 20:22 I mean, it's not only about Bluesky, but there are some things in the Bluesky system that are amenable to that, that I would put on the pro side. 20:31 So algorithmic agency— the ability to build your own feed, to ensure that the information you're seeing and sharing moves in a very direct and unmediated way, in a way that monolithic platforms don't allow— affords or facilitates information integrity. 20:48 It foregrounds provenance and individual connectivity, which diminishes social and informational fragmentation. 20:59 So that's a great example of how this is a benefit. 21:03 Another one is public data, because public information shared widely at a social level— especially information that is validated at the provenance level— is a core part of being able to push back at a social scale against authoritarian framings that try to strip information out of our civic space and make knowledge a political tool instead of a basis for shared facticity. 21:35 Yeah. 21:36 There could be— I mean, I think a lot of what's happening in the ATproto science community— I don't know exactly about this, honestly. 21:44 I mean, scientific epistemics is kind of the highest level of knowledge in terms of its rigor. 21:49 But of course there's also journalistic epistemics, there's legal epistemics— there are all of these different standards of knowledge. 21:58 And the ability to create and manage different levels of standards of knowledge within any kind of knowledge-sorting space I find to be incredibly valuable and useful. 22:08 And I think there's a lot we could do there.
22:10 And once we get to permissioned data, that also adds a really interesting layer here, because then we aren't necessarily at risk of putting everything out into the open all the time in ways that might rebound on people as individuals in a network. Speaker D 22:22 Just to summarize, do you think most of the levers and features are there? 22:26 It's a question of using them? 22:28 It's not like it's missing a lot that needs to be added? Speaker C 22:31 That's actually one of the questions for the room. 22:33 I mean, I have ideas about that, but I imagine there's probably more that could be done, more that could be built. 22:39 And there probably are vulnerabilities that I haven't identified. 22:42 I'm sure I didn't hit all of them. 22:45 These are the ones that are most evident to me. 22:49 Yeah, I have another question here, please, or comment. Speaker B 22:52 Yeah, I just had a question. 22:54 I'm really curious about the research on authoritarian practices that's being done. 23:00 And I was curious to know if that research was applied to companies or corporations, in the sense of how there might be authoritarian practices happening through corporate levers? Speaker C 23:12 I'm curious about that. 23:14 Very often what we see is some kind of combinatorial approach. 23:18 So a state will either co-opt or coerce a company, or those companies will become contract providers for technologies that are being used for repression— so the spyware industry is a great example, right? 23:33 Quite a profitable industry. 23:34 And spyware used narrowly, to target things like sexual predators with appropriate legal oversight, is not necessarily an authoritarian tool. 23:47 So the idea that we have "authoritarian technologies" is actually not quite the way I would phrase it.
23:53 But when that technology is used to surveil people en masse, without oversight, then those companies are facilitating authoritarianism. 24:05 The other piece of this— I can think of a great example in Myanmar from a few years ago, when the government forced the sale of a telecommunications company owned by a Norwegian company, and through that sale got access to all of the personal data of the tens of millions of people whose cell phone networks were provisioned through that company. 24:32 So they forced a sale to a military company and then got access to the personal data of about half the country as a result. 24:39 So that tends to be how the dynamics work. 24:47 Do we have time for one more question, or are we good? 24:48 Or a comment, if there is one. 24:53 If not, I'll show you. 24:55 Great. Speaker E 25:05 Well, first, thanks for this talk. 25:06 I think it's super interesting and very, very relevant and timely. 25:09 One of the things we talked a lot about, at least in your talk, is how open systems can provide these tools for resistance, or at least provide a playground for developing and sustaining tools of resistance. 25:20 I was wondering if you can speak a little bit more about the different types of risks that open systems also bring, 25:25 and whether we've seen any evidence of that. 25:28 Like, for example, we know that there have been long-running information campaigns. 25:33 Does it become easier to track them because it's public record, or are there other novel ways of manipulating information campaigns because it is an open network that kind of builds assumed trust? 25:46 If that makes sense. Speaker C 25:46 Yeah. 25:48 One of the things I was mulling is if and when there might be some kind of systemic attack against ATproto.
25:54 And to me, the measure is— I came up with a couple of answers. 25:59 The first is that this space becomes important enough at a social level that it becomes a particular threat to a group in power. 26:08 And we've already started to see some intimations of that on the Bluesky network in the past year. 26:13 So that's one thing that I keep my eyes out for. 26:16 So trolling or provocations— like the Trump administration signs up for a Bluesky account and then starts intentionally breaking the terms of service, hoping to get kicked off the network, so that they could then use that as an example or an opportunity to bring a lawsuit for censorship against the company. 26:38 Another one would be at the level of narrative power, which is the systematic attempt to pigeonhole the vast and quite differentiated set of communities that are on Bluesky as one unified ideological political class that is not worth listening to. 26:58 So we've seen that again and again. 26:59 Oh, this is just a liberal pocket, 27:02 and so you can just safely ignore it, because you're not gonna have anything other than an echo chamber here, 27:06 in the Bluesky context. 27:09 A third would be using it as an intentional provocation— I can imagine an external state actor, say the Iranian government, coming onto the platform en masse, and/or loading up somebody's PDS with pornography and then creating a case that collapses the whole thing. 27:34 So using the judicial system against itself. 27:36 And there's another— I don't want to give the other side too many options, but there are a lot of ways you could red-team this, in ways that are actually really important to think about.
27:46 And I would love to see some kind of careful and strategic thinking about how to protect, firstly, the people on the network, and secondly, the network itself. 27:56 Thank you. Ivan Sigal 27:57 Excellent. 27:57 Thank you very much. 27:58 A round of applause for Ivan Sigal. 28:07 Certainly food for thought for those of us in Canada who have been working on digital sovereignty here and thinking about our public options— how not to wrap ourselves in the flag, not get too close to power, and make sure that it's public: for us and by us. 28:22 Okay, let's get Sebastian up here to give us a little more context, and then hopefully we can wrap up 5 minutes before the hour and get you back to the business