FINAL EP 51 HEAT INITIATIVE
Rosalia Rivera and Sarah Gardner discuss the need to hold tech giants accountable for child safety, particularly in the context of Apple's failure to detect and remove child sexual abuse material from its platforms. They emphasize the urgency of this issue and the need for collective action to ensure the safety of children online. Rivera and Gardner also discuss Apple's role in addressing child sexual abuse, the tension between privacy and child safety, and the importance of holding companies accountable for their actions or inaction.
Action Items
[ ] Share the Heat Initiative campaign and podcast with one other person to spread awareness
[ ] Sign the Heat Initiative petition on their website calling on Apple to make the requested safety changes
[ ] Support passage of the Kids Online Safety Act and contact members of Congress
[ ] Share the Heat Initiative's social media content and sign the open letter supporting their demands along with other organizations
[ ] Continue coalition-building and campaigning efforts against Apple
Here are the links from the show:
Sarah Gardner 00:00
There are some failings on Apple's part that are very, very basic online safety standards. So in iMessage, they don't have a report button for kids to be able to report abuse, unlike pretty much any other messaging service that a child would find themselves on. Something that's not as obvious, but is playing a big role, is how people are uploading images and videos into these apps, especially abusive ones, that are then used to groom kids, right, or to share abusive content with other people who they shouldn't be sharing it with. And that is where you start to talk about the photo sharing and storage level, the cloud services. So when you ask what are the pipes and infrastructure that allow people to spread this type of content, you actually need to pull back from any one app and look at that infrastructure. Apple is playing a very critical role in allowing child sexual abuse to spread because it does not detect it. We've got to take a look at the device makers and the cloud storage services in order to fully address that issue.
Rosalia Rivera 01:14
Welcome to About Consent, a podcast that sparks conversations about prevention, healing and justice to end child sexual abuse, while also creating consent culture now and for generations to come. This is a safe, shame-free and judgment-free zone where both survivors and those who support survivors are welcome. I'm your host, Rosalia Rivera.
Rosalia Rivera 01:42
I am dancing in my chair. Welcome to Episode 51 of the About Consent podcast. This week, I have the amazing Sarah Gardner. I wish there was like a background that was like, yay! You know, like when you're in a stadium full of people and everyone's cheering, like you say somebody's name to introduce them to the stage and everybody starts clapping. I wish that I could do that for Sarah, because that's how I feel about her. Sarah Gardner, who is my guest today, is the co-founder and chief executive officer of the Heat Initiative, a collective effort of concerned child safety experts and advocates encouraging leading technology companies to detect and eradicate child sexual abuse material on their platforms. In 2023, she launched Heat's multimillion-dollar effort to hold Apple accountable for its inaction in removing child sexual abuse material from iCloud. Sarah previously spent 10 years at Thorn, an organization that builds technology to combat online child sexual abuse material, where she was integral in its growth from a small startup effort to a multimillion-dollar nonprofit. As Vice President of External Affairs, she helped develop a plan to eliminate child sexual abuse material from the internet, which spurred a $63 million investment in the organization through the TED Audacious Prize. Sarah also worked at Free the Slaves, an organization empowering local organizations to end modern forms of slavery. She lives in Los Angeles with her husband, three kids and her bulldog. As someone who has had the privilege of connecting with Sarah through, for example, organizations like SAGE and the Brave Movement, and working now with Heat on its advocacy and campaigns, I've gotten to know her a bit better. And I have to say she's just truly a good-hearted person who has a genuine interest and commitment in this issue. And to me, those are the heroes in the world, the ones doing the work day after day, year after year, decade after decade. And I am just really grateful that she is doing this work and that she has agreed to come on the podcast to talk about what Heat is doing, why Apple, and you're going to hear all about it. This was a really great episode, really breaking down why every parent needs to push this forward, why we need to make sure that Apple does better. In today's interview, you're going to hear about why we need to make sure that we are advocating and raising awareness about any technology company, especially one that is so quote-unquote reputable and makes so much money year after year from its users and consumers but is not doing right by them. And so we need to, not necessarily just expose them, but ask them, invite them, push them to do better. So join me for today's episode with Sarah Gardner so that you can understand why Apple, why we all need to take action, and how we can do that. Let's get into the show. Sarah, thanks for joining me today. I'm excited to talk about this really important piece of online safety activism that Heat is doing, this campaign that I think every parent should know about. So thanks for joining me and taking the time to be with me today. Thank you
Sarah Gardner 05:34
so much for having me, I'm so happy to be here. And I'm really excited, especially, to have the opportunity to share some things with the audience that you've built. I think parents do play a critical role in the changes that we want to see. And so I'm glad I can hopefully share some nuggets with them so that they can walk away with more information, feeling more empowered to make those changes.
Rosalia Rivera 05:56
Let's just jump right into it and talk about the fact that, first of all, parents have a lot of power. But they're also overwhelmed and under-informed, because the media really drives the narrative. And the narrative up to this point has been that social media companies are where it's at in terms of where all the danger is, and where all of the focus should be in terms of legislation, and, you know, making sure that parents know how to keep their kids safe, and all these pieces. And I think up to now, you know, that's all relevant and important and necessary. But there's this other company no one's really paying attention to that is actually a really big player, but most people don't know about it. And so, because everyone's looking at social media companies, why is Heat looking at Apple? Great
Sarah Gardner 06:48
question. And I think the answer is that it's an "and," not an "either/or." I always think of social media as the tip of the spear, where kids are interacting, especially with potential strangers, or people who are not their friends, or that they know in real life. And so there is, unfortunately, a lot of harm that is happening in those spaces. And I think when it happens, and the kid goes to show the parent, you know, you're in Snapchat or you're in Instagram, right, so the focus has really been there. And that's critical, so all of that focus should remain and needs to stay there. One of the things that we are thinking about, though, is how do you holistically change what's happening across these apps? And something that's not as obvious, but is playing a big role, is how people are uploading images and videos into these apps, especially abusive ones, that are then used to groom kids, right, or to share abusive content with other people who they shouldn't be sharing it with. And that is where you start to talk about the photo sharing and storage level, the cloud services. So when you ask what are the pipes and infrastructure that allow people to spread this type of content, you actually need to pull back from any one app and look at that infrastructure. Apple is playing a very critical role in allowing child sexual abuse to spread because it does not detect it across its iCloud services. So that's just one of the more technical reasons. Philosophically and strategically, we chose Apple for two reasons. One, they've been by far the most negligent and absent from child safety conversations over the last 15 to 20 years. So whereas Meta, Google, Microsoft, X, you know, they have a lot of things they have to change and fix, but they also have built very robust teams to handle this issue. Apple has not; it has not invested in the same way or prioritized online child safety in this way. So the most valuable company in the world kind of trying to behave as if they're not part of the problem is a problem. But the second reason is, they're also great innovators and seen as very reputable, right? So if Apple makes these changes, if they start detecting child sexual abuse at scale, if they allow users, especially child users, to report abusive content, and if they require safer apps, that will set the standard and likely change the whole field for the better, because other companies will say, well, Apple's doing it, so I can do it too. Our focus on Apple is about going kind of upstream to the intervention points that will systematically make all the apps safer, because no one can upload child sexual abuse material into any application. So it's sort of taking a slightly different approach.
Rosalia Rivera 10:15
Yeah. So really going, like you said, closer to the source, upstream. And I love that term, because it always makes me think of a parable, I guess, where there are these kids that are drowning, and people keep finding these kids that they're pulling out of the river, and more and more kids are coming, you know. And so someone finally goes, why don't we look upstream? Figure out where they're falling into the stream, to prevent them from even falling in in the first place, right? And so I just love the idea of this, because, like you said, these devices, right, are really part of what facilitates this happening in the first place. It's where the apps get downloaded. And we know that kids at younger and younger ages, unfortunately, are getting phones. A lot of parents aren't educated. And I mean, there are so many pieces to this puzzle. And definitely I can see how Apple plays a huge role in being part of not just the problem, but bigger than that, the solution, right, they can be part of the solution. And I guess the other aspect of what you were talking about, I think, that's really important is that when we think of social media apps, most parents, I think, are starting to understand how an offender can access them, right? They contact them, and then they get a hold of communications with them, where they start to groom the child and ask them, or coerce them or trick them into giving them some kind of self-generated content, right, which we know is called child sexual abuse material, whether it's self-generated or procured in a different way. But then, now, an offender has that content, and they can upload it to the cloud and distribute it among their networks or use it to groom or abuse other minors. And this just compounds the issue, right? And on top of it, if Apple is not doing anything to take it down, remove it or detect it, then that is now re-victimizing the child who was used to produce it. So just, you know, to give a grounded, practical understanding for parents to see how this all interplays, I think it's important that we recognize how big of a player Apple is in this whole picture, right, in this whole process.
Sarah Gardner 12:37
Yeah, that reminds me of something that I think is important to share, which is, Apple's whole ethos is privacy, right? It's part of their commercials around privacy. I mean, I think there's one commercial that's literally just like "Apple, comma, privacy, period," right? And privacy is not necessarily inherently good for children. When I say that, what I mean is, Apple's sort of tenet of, we're completely hands off, we're not part of any type of moderation stream, we're not looking at anything you're doing, it works for, you know, not selling users' data, and creating very private environments in terms of your data being bought and sold, etc. However, for kids, just like in real life, we wouldn't necessarily say, oh yeah, go play in a room with a strange adult you just met, with all the windows shut and the door locked, for five hours. Why would we want to recreate that online? And so we need to make sure that Apple is also thinking about child safety as much as they're thinking about privacy, and reconciling those two tenets together. And I think that's something that most consumers of Apple products don't know. They know, oh, I like Apple, because it's a very private company. But what they don't realize is, how does that then extend into my children's lives? And actually, what types of safety mechanisms do I want in place to ensure that they're safe, versus them being in environments that are completely uncontrollable or have very little oversight?
Rosalia Rivera 14:28
Right, right, exactly. I love that framing. And for parents to understand that, I think, is really key. Because, you know, if you are listening and you are a parent, and you're considering getting an iPhone, this is something you really need to consider and think about. And also, I think a lot of the rhetoric that comes back, that I hear often, not just from companies like Apple but all the social media companies, is that it's all on parents to do this, right, to do the regulating and monitoring. But they are facilitating; they're making it so easy for bad actors to have the capacity to do bad things, that it just makes it that much harder for a parent who may not even be tech savvy. You know, there are so many updates that are constantly happening. And I think Apple kind of prides themselves on, hey, you know, we're always updating things. But even like you said, some basic things aren't even in place, like reporting, if a child gets sent some kind of thing. And now they're like, oh, but you know, you can blur it. But you can also easily just say, okay, well, let me view it anyway. And if you have a child who doesn't understand the risks that they're taking, like, how does that actually help? So definitely, there's a lot more that can be done. And I guess one of the things that really frustrates me about this, and why I wanted to talk to you as an expert in this, is because Apple has the technology, and they have the ability and the capacity to actually create the mechanisms that could create a child-safe space, like a child-friendly space. They actually went ahead and implemented it initially, I think this was like two or three years ago, and then they immediately pulled it back. So I was wondering if you can speak to that. Like, why did they do that? Why didn't they continue, you know, moving forward with something that was so beneficial for its users? It's
Sarah Gardner 16:27
a really important moment to point out, because I believe that there are a lot of people at Apple who do want to make these changes. And that was the energy and the driving force behind them innovating, like you said, to create a very privacy-forward solution that did allow for the detection of known child sexual abuse content. And just to spend a moment on that for the listeners: when we're talking about CSAM, child sexual abuse material, we mean images and videos of children being abused. And specifically, what we're asking them to do is to find images and videos that have already been seen before, most of them thousands of times. It's a fingerprinting technology that would allow them to find those. And like you said, that is a basic standard that Google and Meta and X and Microsoft and everyone else has in place in some way. So we're asking for the bare minimum here. But I think they did recognize that they didn't want child sexual abuse material in iCloud. It's against their terms of service; you are not, in theory, allowed to store it there. How they're enforcing that, we don't know, because clearly they're not. But I do want to say that clearly there are people there that do want to do something about it, and they were the ones that designed that solution. What I would say about their decision to retract it is, it came out in the media that they were going to do this. They had announced several different features at once, which really confused people, and the media picked up on this line of, "we're going to scan your phones," which is really inaccurate for how the technology works. It makes it seem like they were going to look at every single photo, when that was not at all how the system worked or was designed. Instead of standing behind that solution and saying, this was a very principled solution, we made this decision, we're going to continue forward, they felt perhaps that that slew of media stories was too negative, and they pulled back. Our argument at the time was, you know, you're one of the most valuable, if not the most valuable company in the world; you can withstand this, stand by your principles, get through this. So I think that points to the fact that the solution was perhaps not on firm footing internally. And so part of our campaign at the Heat Initiative is to both put pressure on, but also inspire, those who are at Apple still: the community of child safety advocates, but also the American public, wants you to do this. And that's what our polling says. I think it's upwards of 86% of Americans who believe all tech companies have a duty to detect, report and remove child abuse content. Most Americans think companies are already doing this at scale, which is an interesting challenge at times, actually, for us to be pointing out something that's so obvious. So I think where we need to focus now is, in a world where companies are always navigating multiple priorities, how do we make this their top priority? And speaking to the parents for a moment, like you were describing too, if you're thinking about buying your kid's first iPhone, because they're in that age range, our argument would be, wait until Apple does these three things to make the phone safe for your kids.
And when Apple starts to see that argument, too, we hope that that actually becomes enough of a pressure point that they're like, okay, this has now risen to a place where we need to revisit this. Yes,
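To make the "fingerprinting" idea Sarah describes above a little more concrete, here is a minimal sketch of matching uploads against a list of known fingerprints. It is illustrative only: the function names and hash values below are hypothetical, and real detection systems (PhotoDNA-style perceptual hashing, for example) match visually similar images even after resizing or re-encoding, unlike the plain cryptographic hash used here.

# Minimal sketch of "known-content" fingerprint matching (illustrative only).
# Real systems use perceptual hashes rather than SHA-256, which only matches
# byte-identical files; all names and values here are assumptions.
import hashlib
from pathlib import Path

# Hypothetical fingerprint list of previously verified abusive images, of the
# kind maintained by a clearinghouse such as NCMEC (the value below is made up).
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(path: Path) -> str:
    """Return a fingerprint (here, a SHA-256 digest) of the file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_content(path: Path) -> bool:
    """Check an upload against the known-fingerprint list before storing it."""
    return fingerprint(path) in KNOWN_FINGERPRINTS

if __name__ == "__main__":
    # Hypothetical usage: screen a file at upload time and flag any match
    # for human review and reporting rather than silently storing it.
    upload = Path("upload.jpg")
    if upload.exists() and matches_known_content(upload):
        print("Match against known content: flag for review and reporting.")

The point of the sketch is only that detection of known material compares new uploads against fingerprints of content that has already been verified, rather than inspecting or interpreting every photo a user stores.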
Rosalia Rivera 20:22
yes, yes. Yeah. And I want to stay on that for a second, because I think that's really what speaks to Apple, right. And what's frustrating is, I could see that they caved to this pressure from the media taking what they did in the wrong direction, and then saying, oh no, that's somehow going to hurt our bottom line, which it sounds like is really where that came from, the negative press or pressure and misunderstanding of what they were trying to do. And it's so sad that they didn't stand on, hey, this is actually really important, and this matters, and the technology is actually safe for your content, it will keep privacy at the forefront and also protect kids. Just like you said, it's not about "or," it's about "and": we can do this, and we can also do this. And what's really surprising is that there are other companies that are doing it. Like you said, there are social media companies that are already reporting, detecting, removing, so it's not impossible. And for a company with "Think Different" as their slogan, now they're just being indifferent about it. And we cannot allow them to do that, right? We can't allow them to just be indifferent to this issue and continue to stay under the radar. Because I think until your campaign through the Heat Initiative really started shining a light on this, they've managed to stay under the radar and really pretend like there's nothing to see here, you know. And you had actually put out recently on Instagram a statistic that came from NCMEC, and this was really interesting for me. I want listeners to really understand what these numbers mean. So NCMEC, the National Center for Missing and Exploited Children, came out with their CyberTipline data. And when you dug into it, you found that Apple was doing very little reporting. But when people look at the numbers, they're seeing that Facebook and Meta have reported hundreds of thousands of pieces of, you know, child sexual abuse material that they have found on their platforms. And then you look at Apple, and it was like 200 and something. And most people go, oh, wow, like it's not happening on Apple. And I just think that that's such an important data point to really unpack and help people understand what that actually means. So can you share more about that? Yeah, absolutely.
Sarah Gardner 23:01
And just one thing, too, going back to your point about them sort of abandoning this issue, deciding not to move forward, and that kind of driving you: I just wanted to say that was also what drove me to leave my previous job and go do this. And that's why Heat is also meant to represent all of us advocates in the space who saw that moment in time, of one of the most valuable companies in the world saying something mattered, and then changing its mind, and nothing happening because of that. What felt off was realizing our field and our movement didn't have the muscle to continue to apply that pressure. So I just wanted to say out loud that even though Heat is leading on the campaign, we see ourselves as representing so many of the organizations and interests of parents and survivors, and also young people, right, because this is their content that they will not be able to get removed or taken down. So just to plus-one that, and to get to your answer on that one. You know, the reports in this field, it's a very tricky, nuanced subject, but I liken it to what it looked like in the climate field a few decades ago with emissions, right? So before, companies were all doing things, and they were emitting pollution, and no one was regulating it, right? And the way the NCMEC reporting works is it's looking at who's essentially reporting on their emissions. And so Apple is just like a company that would not even be reporting at all, right? So it's like, do you want the companies where you actually can track and see what they're doing and how well they're addressing it? Or is there still, like, the rogue factory that's not even engaged or reporting at all, right? And so it is very much a case of, they're just not looking for it, not that it's not there. And one tricky thing about the numbers in general, something we used to also say at my old job, is that this content is so agnostic, in the sense, unfortunately, that predators and people who are sharing this content exist everywhere, and will use any platform available to them to share and trade this type of material. So it's really not any particular platform's fault for it being there in the first place. What we should be judging them on is the investment that they're making in figuring out how to remove it, how much investment that is, and whether they're really putting the weight behind making sure that platform is safe for kids and also removing content that they don't want kids to come across. And to do that at scale can be expensive. It is a cost center. So right now companies are evaluating it thinking, I'm losing money to make it safer, so what is the least amount of safety I can do, right, that gets me over the finish line? And I think what we'll see over the next 10 to 15 years is actually more a model like climate, because now we have regulatory practices so that we can monitor them and judge them across a series of metrics. And we need that for our field, to say, okay, this company is doing the 12 things we need them to do, therefore they get an A, right, versus a company doing two out of 12. And that was also one of the reasons, you asked why Apple: Apple has notoriously been absent from the child safety conversation and space over the last few decades. They've made some moves more recently to appear as if they're more engaged.
And they have rolled out some product design features, like you mentioned, the communication safety feature that didn't exist before, to help kids who are texting navigate potential nudes and so forth. But on the whole they've been absent. And one final note: if you look at the Australian eSafety report that came out in 2022, where Australia kind of did what I mentioned before of evaluating every company next to each other, just going back to the reporting function and the fact they don't allow for reporting out of iMessage, there was one page in that report where it was like, all the other major companies, you know, respond to a report in a day, half a day, two hours, three hours. And then at the bottom, there was a different category: Apple and Omegle, the two companies where you have to go on their website, find the email and email it. And Omegle is now out of business. Yeah. Negligent practices in this space. So, like, who the companies are that you're being benchmarked against should be an indicator of how much investment you've made.
Rosalia Rivera 28:10
Right. Right. Wow, I didn't realize that. And that's fascinating, in terms of how they are one of the richest companies in the world, making the smallest investment, like, what's the least that I need to do? That should not be the ethos when you're thinking about child safety practices. Yes, it's the bare minimum, right? Which is, to be honest, like, and I don't like vilifying people or companies, because I know it's not a monolith, and there are good people in these companies. But at the same time, it's like, why is leadership not doing more, you know? That's the question that keeps coming back. And I just want to remind parents, too, and this is one of the reasons why I really want parents to listen to this, understand the issue and realize why it's so important to take action. I'll just give an example for, you know, parents who are thinking, well, I'm going to give my child a phone because I need to communicate with them, and I'm not going to allow them to have any social media, and they aren't doing online gaming. Like, they're being really practical, right? We have to also remember that it's not always people who are strangers online that can abuse children. So even if your child has a phone, and they have a friend who is not a safe friend, or they have a family member who has their contact, anyone with their number can still text them, so the child may still have this communication happening, and there's nothing that's really protecting them. And if somebody, you know, the friend or cousin who's older, sends them an image, or gets them to send something, this cycle can still happen, right? So it's really these devices and the apps that need to work together. Like you said, even with the App Store, what apps are they downloading that Apple has just allowed to be on there, that now they can download onto either their iPad or their phone? There are so many pieces to this that I think parents need to be aware of, and why Apple needs to take better action. And then if they don't take better action, what are the consequences, right? Like, we're talking about children that are being re-victimized by the content of them that's being reshared, that they're being blackmailed with, and they can't now obtain justice because law enforcement can't even get their hands on it. Like, what are the impacts of not detecting, removing, you know, and reporting this stuff? When a company can report it, now there's somebody that can take action and say, we're going to go after this person and not allow them to do it to other children. So there are so many implications. I think that's why it was so important to just say: when you see these low numbers coming from Apple, it's not because this stuff isn't happening on their devices or on their cloud, iCloud. And it's not because they're doing the most; it's actually because they're doing the least. So we need them to do better.
Sarah Gardner 31:11
Yeah, and look, they'll say, well, we're a hardware company, we're not social media. And that's fair, that is different than, like, Instagram. But exactly to your point about iMessage, I mean, you would probably know better even than me the statistics around child abuse and it being someone that the kid knows, or is in their general network, versus, like, a complete stranger. Obviously, both occur. But once the person has the kid's number, then they can communicate with them on the phone. And it reminded me of two cases I wanted to mention. One was two girls who were groomed and abused by their basketball coach, and that all happened in iMessage. They were teenagers, and you wonder, had they been able to easily report, or been encouraged to, would that have come out earlier? It went on for a few years until the dad figured it out. It's those types of things that all happen on iMessage. And then a second example of what the real-life implications of this are: you know, there are people who store and share known child sexual abuse content. And when those accounts are flagged, oftentimes law enforcement will go into them and find new content. So that person is abusing a child in their life as well as consuming. We wouldn't get to that person without the detection of the known content. So when people talk about, oh, it's harmless if the victim has already been identified, there are so many reasons why that is wildly inaccurate, not least of which is the fact that, and this is really honestly what it boils down to, and what keeps me motivated and energized around this, is I do not believe that some man using an Apple device can possess an abuse video of a three-year-old and that that is his data. Like, I just fundamentally disagree with that. I don't think that's his data. I think it's evidence of an abuse of a child. I think as a stand-up company, you acknowledge that, and in your own environment that you run and protect and have people pay for, it's your obligation then to find that and remove it and remove the person from the platform and report it. So that is really what's at the crux of the debate. Yeah, I
Rosalia Rivera 33:45
really love that reframe, that this isn't just private data, this is criminal evidence that you're protecting. Again, that's such an important point to highlight, and for people to have that idea burned in their mind: this is not private data, this is criminal evidence, is really what it is. So why are we allowing people to protect that criminal evidence? I think that's really important. So, I mean, one of my questions was going to be, you know, why should this matter to parents? And I think we just answered that. I think that really drives the point home, and also why I think parents should be speaking out about this more. And, you know, I think all of it matters. Like, yes, the social media companies should be doing more. But I think putting the heat on Apple, and I love the name that you chose for the organization, it makes so much sense. We are a collective community of parents and stakeholders in this issue; whether you are a parent or not, there must be a child in your life, right? I think everyone has a child that they know in their life. And it ranges across all ages, like you said, you know, abuse of a three-year-old. I think a lot of people think it's only older teens, and, you know, sometimes it's self-generated and they shouldn't be doing it. Well, it's not that they just woke up and went, hey, let me go and do this today. There are definitely bad actors behind it. And companies like these are really, ultimately, protecting them if they're not doing more about it. So what can be done legislatively? Or, for those who are listening, what can they do to support this campaign? I know you had a petition on the website at some point. So can you share more about that? Oh,
Sarah Gardner 35:30
yes, thank you for that question so much. And I'm just really excited at the idea, too, that there are so many more parents that will know about this. And I want to make sure, I know you also are sensitive to this, you know, I'm not anti-technology. I'm not even, like, anti-iPhone. But I do believe that years from now we'll look back and we'll think about kids on Instagram, like younger kids, an 11-year-old on Instagram will feel like a kid in a car without a seatbelt. And I have that same reflection with the phone, like, will my kids look at even me with my phone and think that was like the smoking of our time, like, they couldn't put their phones down? And so this is all more about correction in moderation and basic safety principles, not an all-or-nothing type of situation. And I never want parents to be fearful. But I do think this is a particular instance of corporate negligence that needs to be righted. So, things people can do. One is, if you're listening to this, I would ask you to bring this up in conversation with one other person in your life. To your point at the very beginning, social media is kind of dominating the conversation right now, and that's good; it needs to. But part of our challenge with this campaign is that because Apple has so successfully kind of pulled back and hidden a bit and stayed out of the conversation, people are not aware, and it's very counterintuitive, almost. And so mentioning this, pointing them in the direction of learning more about it, is super helpful, just truly getting the word out, like, think about the phone as well as the apps. The second thing I would say is, you mentioned on our website we do have a petition, where you can send an email. It's one email, and it's all anonymized and automated, so that you can send an email to 20 of the leadership team members at Apple saying that you want these changes: you want them to create reporting for kids, you want them to shore up the App Store safety standards, and we want them detecting known child sexual abuse content. They've received 18,000 emails so far using that mechanism. So that's a really big one, because that will also help us when we're negotiating, when they say, well, nobody wants this. We can point to that and say, actually, a lot of people do want this. And so that's another way people can help. And on the legislative side, there's so much activity right now. There are a few bills that are right at the cusp of passing. KOSA is one of those, the Kids Online Safety Act. I would encourage people to look that up, and if they find that it aligns with their value system, contact their member of Congress or their senator. So yeah, thank you for that question, because we do need so many more people engaged and interested in learning more about this.
Rosalia Rivera 38:22
Yeah, yeah. And those are great calls to action, easy enough to do. I'm definitely going to post the links for both the petition and KOSA in the show notes for the episode, so for those listening, for easy access, you can just click on those links. And definitely, yes, talk to one other person. Find a way to, you know, open the conversation on this. And instead of the small talk, we can make it an interesting talk, especially if you know that your kid's friend just got a phone: hey, did you happen to know... and share some information from here, or just share this episode, right? I think that there is definitely momentum that we're seeing with parents, and that's why I want to encourage them to remember that they have a lot of power. And we are always that front line. We can't just hand over the power to other people. Because, as we can see, and I think the term that you used really drives it home for me, you said corporate negligence. It's not about demonizing technology, it's about being more conscious with our use of it, and making sure that the companies that are creating these apps or technology, for minors who are also using it, like, they have to think about all the different users, right? We have to make it safe, and it's non-negotiable. That part of it, I don't think, is negotiable. And the other piece, I know that you didn't say this, but I'm going to say it: I think it is important for parents to say, until this device is safe, I'm going to abstain from getting it for my kid. Like, we just got a flip phone, you know. We had thought about getting the Apple Watch, and when I heard about this campaign, and I didn't realize all of these things, right, so thank you for the education on this topic, we abstained from that. So I encourage other parents to do the same and let Apple know: hey, until you do these things, we are not going to support the company. So, you know, think different or be indifferent. What's it going to be?
Sarah Gardner 40:31
That is incredible. I'm definitely stealing that. So, yeah. Well,
Rosalia Rivera 40:38
thank you so much. And I know that there may be other listeners who aren't parents who want to, you know, contribute somehow to Heat or to volunteer. Is there a way for them to do that? Or are you just inviting them to the calls to action that we just shared? Because I know that there are some people who, you know, either are part of the Brave Movement or other organizations. What can they do to support the Heat Initiative?
Sarah Gardner 41:02
Oh, that's so awesome. I mean, besides the ones I mentioned, I would definitely say share our content on social media and kind of get the word out there. And there's an open letter that's signed by, I think, 16 other organizations who are supporting our demands, and it includes movements like the Brave Movement and others. So I would really also recommend supporting those organizations financially and with time and volunteer energy, because, like I said before, even though we're driving this campaign against Apple, we really do see ourselves more as a conduit for all of those organizations who stand with us in these demands. So yeah, I love
Rosalia Rivera 41:50
that coalition building. Yeah. Wonderful, wonderful. Well, Sarah, thank you so much for your commitment to this issue, and for the work that you've done in the past with Thorn, which, you know, continues to be an amazing organization as well. I just want to give them a shout-out, because I know they've been really leading in this work for a long time as well, and there are lots of good people there, you know, that continue to fight this fight. So I want to invite listeners to definitely share the content from Heat. And I know even Thorn, just to give them another quick shout-out, they also have Thorn for Parents now, and I know that you helped with that. So, you know, there are lots of resources out there. I just want to remind parents, there is a lot that we can do together, and we also shouldn't be the only ones doing it alone. So this is why this campaign, I think, matters so much, and why we need to hold these types of companies accountable for the work that they need to do. So follow Heat's social channels, join the email list to stay up to date with the campaign and upcoming events. I know that I'm definitely going to continue to do that, so I want to invite everyone to do that. Sarah, thank you so much for joining me today.
Sarah Gardner 42:59
Thank you so much. I had such a great time.