Speaker B 1:17 All right, can people hear me? 1:28 Okay, great. 1:29 And I have a clicker — can I click? 1:32 Cool. 1:33 Okay, so, for those of you not aware, Northsky Social is a cooperative space established in BC, Canada, here in Vancouver. 1:44 Our priority is building a community for Two-Spirit and LGBTQI+ people and marginalized identities. 1:52 I want to put the emphasis on that second part, as I feel that over the past year we've been pigeonholed into the queer community. 2:00 But there's quite a bit more to it; I myself am Sudanese American. 2:04 Both of my parents are from Sudan, and I was born and raised in America, in Washington State. 2:10 So about 9 hours away from here. 2:14 So if we look at where Bluesky started — I need to walk up front here, sorry — 2:18 Bluesky started sending out invites around April, May 2023. 2:26 Over time we saw rapid growth, 2:28 and it's now at around 45 million users. 2:31 It was at the beginning of last year that Northsky started to assemble and establish itself. 2:38 If we look at the reason for that, there were quite a few issues that happened. 2:44 One of the larger incidents early on was when Aveta, a Black community organizer, discovered that you could easily create handles containing the N-word and other racial slurs. 2:56 Bluesky did not respond to this for multiple weeks. 3:00 Developers were left to bear the burden of communicating and trying to explain what was happening. 3:06 Users were really confused about what the code changes were. 3:09 Were they trying to hide things, mask things? 3:10 It was a big hot mess. 3:13 Later on there was a series of issues, like the hammer incident.
3:17 One user said, hey, I want to hit this person with a hammer. 3:21 That resulted in introducing the feature to block people. 3:25 Throughout all of this, we also saw a lot of issues around racism, transphobia, and discrimination. 3:31 A lot of it early on was due to Bluesky not having trust and safety — T&S — as a priority. 3:42 Then we saw these issues accelerate as they grew their moderation services and hired more people, which in a way allowed them to do a better job of moderating; but because they didn't understand the context of minorities, they ended up targeting minorities as a consequence. 4:04 Then we saw the Jesse Singal incident. 4:06 After Trump was elected, a bunch of people moved over, and Singal, a known bad actor, started to harass people, and people started panicking about him. 4:16 Over the course of 2 days he was suspended, unsuspended, suspended again, and unsuspended. 4:22 The reasoning given was that he was screenshotting people and sharing them on Twitter, which is a separate platform, 4:28 so "we don't act on that" — but we've seen a lot of people suspended for doing the same things. 4:35 And then, just after Trump's inauguration ceremony, every single major social media platform removed all protections for LGBT people and minorities. 4:48 They removed them from classification under hate speech. 4:53 So it kind of showed that— oh, my screen just died. 4:56 Let me fix that very quickly. 5:01 Is it back yet? 5:04 There it is. 5:07 And so over the past year, throughout all of this, we saw increased targeting, especially of transgender people. 5:14 The Charlie Kirk incident — everybody's familiar with that.
5:19 The US government sent a letter to every single major social media platform saying: we expect you to handle the Charlie Kirk discourse appropriately. 5:28 If you do not, we'll revoke your Section 230 status. 5:31 Section 230 essentially treats social media platforms like common carriers: 5:37 if it's revoked, you as a platform are suddenly responsible for whatever your users say. 5:44 What we really saw, every single time, is that minorities are discarded as soon as anything becomes difficult. 5:54 So when Northsky started to assemble, we began with an idea: what would it mean if we had a space where we could speak freely? 6:05 What if we could really celebrate our successes and our challenges, provide for and support one another, and not have people shit all over it, quite simply? 6:19 We had a lot of intense discussions about it, 6:21 and it came down to this: it's not really about an individual. 6:29 It's about caring for the community so that individuals can thrive in it. 6:33 And as a consequence, we encourage community dynamics that support those individuals. 6:38 The goal is a self-reinforcing positive environment: you as a person contribute to the community, and the community in turn contributes to your success and protection. 6:51 That's where it landed: how do we focus on the needs of the people — the common needs — while also providing them agency at the same time? 7:00 That way we're not forcing you to abide by rules; we're giving you the ability to participate in them.
7:08 When we started to talk about that, the question within the community became: how do we de-escalate people from fight or flight? 7:20 I'd say every single person in Northsky Social, the members and volunteers — we have all been subjected to different types of trauma, whether racial, sexual, or gender-based. 7:33 We've all been targeted by this. 7:38 That means we have to create spaces that allow for decompression, because when you're existing in a space, whether online or in real life, you're constantly being targeted and harassed. 7:48 There is never any point where you can process that in a healthy manner. 7:54 So that's where we started looking at what it actually means to moderate — 8:00 to do moderation, to do trust and safety, in this context. 8:04 Restorative justice was the core tenet as we started writing everything out. 8:13 The other part was human-to-human moderation, where our actions aren't opaque. 8:21 One of the first moderation actions we took was when we started migrating people over and one user made a post that was a clear violation of our guidelines. 8:31 Rather than just using Ozone to take it down, we notified them: hey, you've made this post. 8:37 It violates the guidelines for these reasons. 8:39 Can you please remove it? 8:41 They immediately deleted it. 8:43 That's where we gave them the agency — the choice — to participate in building a healthy community.
8:52 So we say: let's check in with you. 8:57 Let's see how you're doing. 8:58 The other part of our focus is how we handle intra-community issues, within the community itself. 9:07 In a way, I'd say it's easier to moderate the challenges targeting your community than to moderate what's happening within it. 9:19 And when we talk about these spaces, there's also the issue of protecting them from external influences; on the AT Protocol in particular, a lot of challenges come with everything being public. 9:31 These are two examples from earlier this year: in one, a person took a picture of a woman and proceeded to use AI to dress it up differently. 9:40 In the other, a political activist was targeted 9:43 for his activism, saying: I have to delete all my content because we're all being targeted by this. 9:48 Let me click that again. 9:51 Yeah. 9:52 And then there are the AI tech bros who see this firehose where all the data is public and think: I can use it for training purposes. 10:02 Not commenting on yesterday's announcement. 10:05 And when this happened, they were like, oh my God, look at this amazing thing I just published — and everybody freaks out. 10:12 Fun fact: I emailed Hugging Face's privacy group asking if they could remove the dataset for GDPR violations. 10:19 I never got a response. 10:23 By the way, this is an illustration my daughter made for me for this talk. 10:28 She is watching this right now. 10:35 So really, we want a community space where people can experience joy together and discuss not just their challenges but their successes.
10:47 They can create together. 10:49 But when you start creating these spaces, you always have people outside of them who want to invade, disrupt, or even be nefarious. 10:59 I think it was last week: 11:01 Rudy made a post talking about the hustle, and somebody responded, "I do not support hustle culture." 11:08 And Rudy was like, oh, I'm sorry, I didn't code switch. 11:13 It's those types of actions, where somebody blindly walks into the space. 11:18 That's one of the issues we always see happening when you do community building. 11:25 So early on, the discussion was: okay, we have feeds, which are great. 11:33 They provide a lot of flexibility. 11:35 You can create a feed that has specific content. 11:38 Blacksky has their community feed, as an example. 11:42 At Northsky, we created our own as well. 11:45 But at the same time, people outside can still access it. 11:48 You can do some logic within a feed, but it doesn't solve the public side of things. 11:54 That's when we started to shift away from just talking about feeds toward creating spaces. 12:02 So: what does that look like? 12:03 How do users interact with these spaces? 12:08 Sorry, I'm also a bit spotty because I just wrote all this 2 days ago. 12:12 What are the new ways you can create interactions within these spaces? 12:16 It's about breaking away from a single feed toward a space with different ways of interacting with the space itself. 12:27 And in what ways does that result in a positive environment, too?
12:34 And this is where Stratos, our private data solution, came out of that. 12:43 When I originally wrote the proposal, there were no announcements yet. 12:48 I think Nick had written some things, Paul had written some things, but it was all very heavily theoretical. 12:53 This is where we said: let's do private data in a way that's permissioned, 12:57 where you can allow multiple spaces to exist 13:03 but have some governance in there too. 13:07 Having the ability to moderate this was one of the critical aspects. 13:14 When I published the first proposal, one of the common pieces of feedback was: why don't you do end-to-end encryption? 13:22 I thought, okay, let me actually think about this. 13:24 So I looked at the MLS spec, I read it, I talked to Ian from Pyrgos, and it's quite something. 13:32 It doesn't work for this use case. 13:33 With end-to-end encryption, say I'm talking to Nick: nobody else can take part in it. 13:39 Then I add another person, and now they can take part, 13:41 but they don't see any of the messages from before, because everything is encrypted under the keys of that point in time. 13:46 So if you want to moderate it, whoever the moderator is — the moderation service — has to be a party to the conversation. 13:51 There's always a third party that can view everything. 13:54 At that point, encryption doesn't work. 13:57 And also: how do you make it easy to search, to browse, to access it however you wish? 14:04 That was one of the big things. 14:06 And also: how do we make it verifiable? 14:08 We didn't want to break away from the protocol at the same time.
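The point about newly added members not seeing earlier messages can be illustrated with a toy sketch — this is not MLS, just a minimal stand-in using per-epoch AES-GCM keys to show why a joiner (or a late-added moderation service) cannot decrypt ciphertext from before their join epoch:

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Toy model: each group "epoch" has its own symmetric key. Adding a member
// rotates to a fresh epoch, and the joiner only receives keys from their
// join epoch onward — so prior ciphertext stays unreadable to them.

function encrypt(key: Buffer, text: string) {
  const iv = randomBytes(12);
  const c = createCipheriv("aes-256-gcm", key, iv);
  const body = Buffer.concat([c.update(text, "utf8"), c.final()]);
  return { iv, body, tag: c.getAuthTag() };
}

function decrypt(key: Buffer, msg: { iv: Buffer; body: Buffer; tag: Buffer }) {
  const d = createDecipheriv("aes-256-gcm", key, msg.iv);
  d.setAuthTag(msg.tag);
  return Buffer.concat([d.update(msg.body), d.final()]).toString("utf8");
}

const epoch1 = randomBytes(32);
const old = encrypt(epoch1, "message before the new member joined");

// A member is added: the group rotates to epoch 2; the joiner holds only that key.
const epoch2 = randomBytes(32);
const joinerKeys = [epoch2];

// The joiner cannot decrypt the epoch-1 ciphertext with any key they hold:
// GCM authentication fails and decryption throws.
let readable = false;
for (const k of joinerKeys) {
  try { decrypt(k, old); readable = true; } catch { /* wrong key */ }
}
console.log(readable);           // false
console.log(decrypt(epoch1, old)); // original members can still read it
```

This is why server-side moderation of E2EE content requires the moderation service to be a standing party holding every epoch key — exactly the "third party that can view everything" problem described above.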
14:12 So we made the intentional decision: let's stick with how AT Proto works, in a way that others can adopt. 14:19 That was the critical aspect: as community builders, we don't want to build a completely unique platform. 14:29 We want to build something that can be adopted and grown together with other individuals. 14:36 I feel like I'm rushing through this, but let's find out. 14:39 So that's where we talk about spaces versus feeds, 14:43 and we decided on this hydration model. 14:46 The smoke-and-mirrors approach, as it's written up, is that on the PDS you have a public record 14:54 that has a reference to the private one. 14:57 An app view or application can see the public record, follow the reference, and pull the private one in. 15:05 You end up with your private data at the same time. 15:07 That allows us compatibility with AT Proto while also serving the need for data privacy. 15:15 And now for the gutsy part. 15:17 There's a demo. 15:20 Luca has been begging me for the last 3 days to record this. 15:25 But I have refused. 15:28 Because I like doing things live. 15:34 So this is — please show a post. 15:39 Don't terrify me right now. 15:42 Here we go. 15:42 This is a web app 15:45 I threw together for demo purposes, 15:48 where the idea is to have your feeds. 15:52 Does this have a laser pointer on it? 15:53 Yes, it does. 15:54 Nice. 15:56 So you can see Amelia wrote some posts on here this morning for fun. 16:01 What you can see right now is an app view feed.
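The hydration model described above can be sketched in a few lines. This is a hedged illustration, not the Stratos lexicons: the record shapes, the `boundary` field, and the `hydrate` function are hypothetical names standing in for "public stub on the PDS references a private body served by a permissioned service":

```typescript
// Hypothetical sketch of the public-stub / private-body hydration model.
type PublicStub = {
  $type: string;        // a public lexicon type stored in the user's PDS repo
  privateRef: string;   // opaque reference to the private record
  boundary: string;     // which permissioned space the body lives in
};

type PrivateRecord = { text: string; createdAt: string };

// Stand-ins for the permissioned private-data service's state.
const privateStore = new Map<string, PrivateRecord>();
const membership = new Map<string, Set<string>>(); // boundary -> member DIDs

function hydrate(viewerDid: string, stub: PublicStub): PrivateRecord | null {
  // Everyone can see the public stub; the private body is only resolved
  // when the viewer belongs to the stub's boundary.
  const members = membership.get(stub.boundary);
  if (!members?.has(viewerDid)) return null;
  return privateStore.get(stub.privateRef) ?? null;
}

// Demo data.
membership.set("bees", new Set(["did:plc:alice"]));
privateStore.set("ref-001", { text: "hello, boundary!", createdAt: "2026-01-01T00:00:00Z" });
const stub: PublicStub = { $type: "app.example.stub", privateRef: "ref-001", boundary: "bees" };

console.log(hydrate("did:plc:alice", stub)?.text); // member sees the body
console.log(hydrate("did:plc:mallory", stub));     // non-member gets null, sees only the stub
```

The design point is that the PDS repo stays fully AT Proto-compatible — only an opaque reference is public — while the body lives behind the permission check.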
16:03 It's serving all the private content. 16:07 For the sake of this demo, all it's doing is serving the data for all the boundaries I have access to. 16:12 We use boundaries, as mentioned: 16:13 a boundary is a permissioned space. 16:17 On the authorization side, I send a request; it finds all my boundaries 16:22 and returns 16:23 all the data associated with them, 16:25 which allows me to see it. 16:27 And I see here — I'm jumping past it. 16:32 Sorry, let me jump back really quickly. 16:34 By the way, there's documentation. 16:41 The idea is that we have attestation. 16:44 Here we go. 16:47 When a user enrolls in the service, we create a public record 16:52 saying: this is your service, here are your boundaries, and here's the signing key we create for you. 16:59 We write that back to the public record, and that enables the discovery aspect. 17:04 With the attestation, you can verify it at the same time. 17:09 It's a secp256k1 key for the service and a P-256 key for each user. 17:15 Every user has their own key. 17:16 That's where we say "verifiable": you know what your public key is, and you can always validate 17:23 that your records belong to you. 17:25 That was about building the trust model at the same time. 17:28 Okay. 17:29 Back behind the desk. 17:33 So, if you look here, 17:37 we have a nice little enrollment record where I can see that. 17:41 This is one Stratos service. 17:44 These are the two boundaries I have access to. 17:47 And down below is the attestation providing verification of it. 17:53 And then this nice little thing that also shows it. 17:57 By the way, I forked pdsls and added Stratos support to it.
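The per-user signing key and attestation check can be sketched as follows. This is a simplified assumption of the flow, not the actual Stratos code: canonicalization and key-encoding details (multikey formats, DID document publication) are glossed over, and the record shape is hypothetical — the point is just "sign the enrollment record with the user's P-256 key; anyone holding the advertised public key can verify it":

```typescript
import { generateKeyPairSync, sign, verify } from "node:crypto";

// At enrollment, the service generates a P-256 keypair for the user
// ("prime256v1" is Node's name for the P-256 curve).
const { publicKey, privateKey } = generateKeyPairSync("ec", { namedCurve: "prime256v1" });

// The public enrollment record would carry the public key, so any client
// can later check that a record genuinely belongs to this user.
const record = JSON.stringify({ did: "did:plc:alice", boundaries: ["bees", "const"] });
const signature = sign("sha256", Buffer.from(record), privateKey);

// Client-side verification against the advertised public key.
const ok = verify("sha256", Buffer.from(record), publicKey, signature);
console.log(ok); // true

// A tampered record fails verification.
const forged = record.replace("alice", "mallory");
console.log(verify("sha256", Buffer.from(forged), publicKey, signature)); // false
```

The same shape works for the service's secp256k1 key (`namedCurve: "secp256k1"`), which is what lets a forked client like pdsls do purely client-side checks.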
18:01 It fully works. 18:05 It shows the verification: it does client-side checks 18:08 against the service's DID web document to validate it, and shows that yes, this is verified. 18:16 Then in the demo app, you can see up here it does a little check. 18:21 This is all verification. 18:23 The service also validates: yes, you are enrolled. 18:28 And here you see the domains a person has access to, plus the additional ones that are available. 18:37 That then allows you to write a little private post. 18:45 Yes, there we go. 18:48 And then here we go. 18:50 Now I'm going to — sorry, actually, let me do this really fast. 18:56 This is a user who's already enrolled in the service, 19:01 so I'm going to log out and log in with a different one. 19:12 Oh. 19:19 There we go. 19:21 Full OAuth. 19:22 Scoped enrollment — only the Stratos lexicons get access, of course. 19:28 Go ahead and authorize that. 19:31 And I'm logged in. 19:33 You can see here that I don't have access to anything. 19:36 I'm not enrolled. 19:37 I only see my public records. 19:40 Right now the service is designed so you can have open enrollment alongside exclusive domains. 19:47 The "const" domain is open, so I can freely enroll in that one. 19:50 Let's do that again. 19:56 This is currently necessary because the enrollment process generates a long-lived OAuth token. 20:03 That way, behind the scenes, whenever you create a private record, it also writes the public one for you, on your behalf. 20:09 It's OAuth delegation — or, as they call it, OAuth wizardry. 20:13 Because this whole setup is insane, honestly.
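The delegated-write step just described can be sketched in miniature. Everything here is a hypothetical stand-in — the token vault, the scope string, and the in-memory "repo" are illustrative, not the real OAuth flow — but it shows the shape: a long-lived token stored at enrollment lets the service write the public stub into the user's own repo whenever they create a private record:

```typescript
// Hypothetical sketch of the enrollment-token / delegated-write flow.
type Token = { did: string; scopes: string[] };
const vault = new Map<string, Token>();          // did -> stored enrollment token
const privateBodies = new Map<string, string>(); // privateRef -> body (service-side only)
const pdsRepo: { did: string; privateRef: string }[] = []; // public stubs in user repos

function enroll(did: string) {
  // Scoped enrollment: only the (hypothetical) Stratos lexicons are granted.
  vault.set(did, { did, scopes: ["repo:app.example.stratos.*"] });
}

function createPrivatePost(did: string, text: string): string | null {
  const token = vault.get(did);
  if (!token) return null;                       // not enrolled: no delegated write possible
  const privateRef = `ref-${privateBodies.size}`;
  privateBodies.set(privateRef, text);           // private body stays on the service
  pdsRepo.push({ did: token.did, privateRef });  // public stub written on the user's behalf
  return privateRef;
}

enroll("did:plc:alice");
console.log(createPrivatePost("did:plc:alice", "hi"));  // "ref-0"
console.log(createPrivatePost("did:plc:bob", "nope"));  // null: bob never enrolled
```

This is why enrollment has to happen before posting works in the demo: without the stored token, the service has nothing to write the public stub with.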
20:18 Oh, and Amelia is now showing off that she's posting at the same time. 20:30 So you see here, I only have access to the open enrollment domain. 20:36 And then there's, of course, always going to be an admin interface. 20:38 Let me get that — there we go. 20:44 I'm going to add myself to the bees domain. 20:48 There we go. 20:53 That was the wrong DID. 20:57 Hang on a second, let me find that one. 21:06 I really like doing live demos, because something always goes wrong. 21:12 Let me delete that. 21:13 It's just written slightly differently. 21:15 Yes. 21:18 I also appreciate showing that this actually works. 21:22 There we go. 21:23 Back to the demo app. 21:24 I refresh, 21:26 and now I see bees. 21:28 And there. 21:38 And actually, there isn't really anything in there — 21:43 there was — 21:44 that's because I post a lot more on this account. 21:52 But if I go back to the previous one — 22:00 back in there. 22:04 Yeah, we added it. 22:06 And it's taking a while to populate. 22:08 Oh, I don't have any content there. 22:13 Right — because I cleared this out for the demo, just in case. 22:16 But anyway, anybody can log into this app. 22:19 I'll tear it down in a day or two. 22:23 That's the concept behind it: we want to allow people to adopt this and integrate it in any way they wish. 22:33 Because the whole reason I built this is that when I met Boris back in November at EuroSky in Berlin, I said: let's talk about what's happened with the private data working group. 22:47 And what I learned is that nobody could decide on an approach; everyone was disagreeing. 22:52 So I basically told Boris: I'm going to prove it's possible
22:55 by having a demo at ATmosphereConf. 22:59 Yeah. 23:08 And it's not to say that it's not hard. 23:11 I just like really difficult things. 23:14 That's really what it comes down to. 23:15 And I'd say this is just one approach; 23:19 there are quite a few others. 23:20 I posted the proposal in January, and within a couple of weeks there were 6 more proposals and announcements out there. 23:26 It was a bit crazy, but they all have different requirements. 23:32 One group wants more of a private social media space. 23:37 Another wants the ability to restrict data between your device and its target. 23:44 Bluesky's approach is more: how do we give more agency to the users with primitives? 23:49 And for us, it's: how do we provide spaces for a community, with the ability to do the governance to take care of and support that community at the same time? 24:01 Let me jump back to this thingy. 24:03 I do have a couple of other slides. 24:05 Let me see. 24:07 I really want to give shoutouts to a few individuals throughout all of this. 24:13 Like Nick, 24:15 for example — I hopped on quite a few calls with him, because I'd say the most difficult part was thinking about how this actually works at scale. 24:26 It's one thing to say, okay, we're going to have some private data and it sits around; it's another to handle 24:31 thousands of people using this at the same time. 24:36 This is where Nick was really instrumental, talking me through hashing algorithms, which I'd only briefly read about before. 24:42 He gave me a long list; 24:43 I spent an entire week reading research papers. 24:46 Yeah.
24:46 And Louis — I was working on this locally, then deployed it to the cloud to test some things and suddenly had horrific issues. 24:55 I think it was last week that I tagged him on the Discord: Louis, how are you doing Postgres with Tranquil? 25:03 And he provided some advice there. 25:04 Also Faye, of course, for listening to me complain about issues nonstop. 25:09 And Amelia — Amelia is Amelia, honestly. 25:15 What does she not know? 25:17 And then Boris, because he made me challenge myself with this. 25:25 It really comes back to this: private data is a very important thing in many contexts. 25:35 For us, it's providing a private space for communities. 25:40 For others, it's providing a private space for research — 25:44 A2 science was a common topic that came up for them on Friday. 25:48 This is where we say: let's build in the open, in a way that can be adopted. 25:54 If you want to learn more, you can look at the documentation I published on Friday morning, because I have horrible jet lag and I just kept writing things. 26:07 And so thank you, everybody. Evelyn Osman 26:18 Excellent work. 26:18 Thank you very much. 26:20 Thank you for the great presentation, the great demo. 26:22 We have a few minutes left for questions from the crowd. 26:25 We've got a big crowd here, too. 26:27 Anybody got anything to ask? 26:32 All right, you did such a great job — 26:34 all the questions are answered. 26:35 Excellent. 26:36 All right, well, everybody, that brings us to the end of our media and civics sub-track, pre-lunchtime. 26:42 Thank you very much for coming. 26:43 I'll let you all go and enjoy some lunch, and we'll see you afterwards. 26:47 Thank you very much.