The outsized abuse is meant to silence us. It’s meant to intimidate us, and they hope that by doing this we will feel ashamed, we will feel scared for ourselves, for our safety, for our family, and we will remove ourselves from the public conversation.
Gendered digital hate, harassment and abuse happens every day. How in the world does one survive as a woman, girl or gender-diverse person online?
I’m Andrea Gunraj from the Canadian Women’s Foundation.
Welcome to Alright, Now What? a podcast from the Canadian Women’s Foundation. We put an intersectional feminist lens on stories that make you wonder, “why is this still happening?” We explore systemic roots and strategies for change that will move us closer to the goal of gender justice.
The work of the Canadian Women’s Foundation and our partners takes place on traditional First Nations, Métis and Inuit territories. We are grateful for the opportunity to meet and work on this land. However, we recognize that land acknowledgements are not enough. We need to pursue truth, reconciliation, decolonization, and allyship in an ongoing effort to make right with all our relations.
Facebook, Twitch, Discord, Tinder, X, Instagram, TikTok, YouTube, Reddit, LinkedIn, Zoom: digital life is real life and digital spaces aren’t safe for everyone.
Whether you’re on social media, streaming platforms, dating, messaging and meeting apps, or on game sites, if you’re a woman, girl, or Two-Spirit, trans or non-binary person, you’re at greater risk of hate, harassment and violence. One in five women experience online harassment in Canada. Young women, racialized women and 2SLGBTQIA+ people are amongst those who face higher risks. Canada’s rise in hate crimes is in large part due to increased hate in digital spaces against women, 2SLGBTQIA+ people, and targeted ethnic and religious groups. Gendered digital hate, harassment and abuse happens every day. It’s pervasive, urgent and growing. You deserve to be safe and free from harm.
Over coming months, we’re delving into this with leading experts and content creators, releasing in-depth episodes every single week. We’ll talk about the problem and what we can do to change it. We’ll offer practical tips to help you in your digital life, and we’ll talk about what it means to take back the tech for all of us. There are no easy answers or quick fixes, but one thing is clear. Power, safety, support and rights to thrive today and tomorrow won’t be our reality until we’re safe everywhere, including digital spaces.
Today I’m with Nina Jankowicz, author and internationally recognized expert on disinformation and democratization. She’s Vice President at the Center for Information Resilience and has advised governments, international organizations and tech companies. She led the US Disinformation Governance Board and has had fellowships at the Wilson Center researching the effects of disinformation on women and freedom of expression around the world.
A note about content: this episode addresses gender-based violence.
I am by trade a disinformation researcher and author. Something important about me is that I have been the subject of a national, and in some cases even international, online hate campaign because of a role that I held in the Biden administration for a very short time combating disinformation. On a more positive note, I’ve published two books, one on Russian disinformation, called How to Lose the Information War, and the second one on being a woman online, which is called How to Be a Woman Online and takes readers through kind of a play-by-play of how to be secure and safe online, but also how to deal with some of the things that we encounter as women online that are unique to our experience, that I can confirm my husband and male colleagues don’t deal with.
Even though I’ve been, you know, a professional in Washington, DC for over 10 years now, I’m sad to admit, I still like to do fun things, and though I recently became a mom, one of my favorite pastimes in my spare time is participating in musical theater. This is something my haters online love to make fun of, but I am not, I am not easily dissuaded from my nerdy hobbies, so keep it coming haters, keep it coming.
Your book, How to Be a Woman Online, is great and it’s a practical read. How did it even start?
The book, How to Be a Woman Online, actually stemmed from some broader research that I did at the Wilson Center with a group of women who were interested in unpacking gendered disinformation and online abuse in the context of the 2020 US election.
There had been a lot of work done looking at specific platforms or specific individual women, women versus men, but there hadn’t been a lot that was: A) Longitudinal, in terms of more than just a specific couple of days online, and also B) cross-platform. And so we did this pilot study comparing 13 women, mostly in the United States, but we did include three women from Canada, the UK and New Zealand, across various political parties, age groups, and racial- and gender-diverse folks.
What we found was really staggering: over 330,000 pieces of gendered abuse or disinformation across a two-month period targeted at these 13 candidates – short period of time, wide look at platforms, and a very small amount of subjects. And I think it confirmed what we all knew, but we didn’t quite know how visceral and how bad the problem was until we had to look individually at each of these pieces of abuse to kind of categorize them.
We found also that so much of the abuse that we see online goes undetected and a lot of the stuff that we found, we wouldn’t have found if we hadn’t done such in-depth research looking at kind of the conversation around these individual candidates for office, and understanding the nicknames that abusers gave them, some of which would be kind of undetectable if you weren’t searching for them.
So this is where kind of my skepticism about artificial intelligence comes in, right? Like, because a lot of people are saying: “Oo, you know, AI can solve abuse, it can find it so easily, it can categorize it, it can, you know, penalize the abusers and take human content moderation out of the equation”. Without humans looking at this stuff, we would have missed probably, I don’t know, I can’t give you a specific number, but a good majority of it. And so we called this malign creativity, because a lot of the abusers were very purposely using what we call memetic language, or even images, in order to avoid detection.
For instance, using images where they could use text so that the classifiers that were looking through content online didn’t find what they were posting about. Instead of calling someone “Jews” for instance, they could write “Juice” J-U-I-C-E, and the AI wouldn’t find it.
This is the type of thing that women are dealing with so much. So we’ll report something and it goes completely undealt with because, generally the computer on the other side doesn’t understand the implication, and sometimes the people don’t even understand the implication.
The problem is really much worse than many people realize, because we are missing so much of the conversation in the research that gets done when we just put a search term like “whore” or “rape” into the search box.
Let me ask you a basic question. Why does hate and harassment and abuse happen online? What’s the goal of it?
In my experience, both my personal experience and my research experience, talking to women through focus groups, but also looking at the quantitative data, I think that people really like to try to push women out of public life.
The outsized abuse is meant to silence us. It’s meant to intimidate us, and they hope that by doing this we will feel ashamed, we will feel scared for ourselves, for our safety, for our family, and we will remove ourselves from the public conversation. Because when we looked across these many different platforms, six different platforms during the malign creativity study, we found the large majority of the abuse was on Twitter. And the reason that it was on Twitter is not just because it’s a popular platform, it is because, we hypothesize, people like to yell at their targets, not just about their targets. If they’re on a platform like Parler or Truth Social, let’s say, it doesn’t have the impact that it has on the individual, the woman at the receiving end seeing it. It’s not as fun to talk about somebody behind their back. It’s fun to shout directly at them. And I think we’re going to see that shifting as, you know, people congregate around one platform in the post-Twitter era.
People went as far during the abuse campaign against me or during the height of it, I would say it’s still going on, people went as far as to comment on pictures of a cat that is long dead that I had posted on Facebook in a public kind of photography forum. They were in a Facebook group or on a Facebook page, and they saw and searched for my name on Facebook and saw this and commented on pictures of this dead cat so that they could get to me.
Similarly, you know, they’d send emails to me to my various addresses at the Wilson Center or at Syracuse University, where I’m an adjunct professor. All sorts of things. And the idea, again, is to intimidate us.
It’s really shocking to me, but I think the Internet does facilitate that, because these folks almost certainly would not have the courage to say something like that to your face if it had societal and social repercussions for them.
Your book includes a vivid scenario that talks about how we tend to downplay abuse in digital spaces. Why don’t we take it seriously?
This is something I think about a lot, and there is a broad-based, I would say, minimization of the experience of women online: “It can’t possibly have an effect on women beyond when they shut down that social media platform or walk away from their device or computer”. And that really discounts A) women’s role in society, but B) the psychological experiences that happen on the Internet. And anybody who has, you know, seen a disturbing video online, even if it has nothing to do with you, should understand that that can have a very visceral effect on your psyche and on your physical and emotional well-being. And now imagine if you’re being sent death and rape threats, or even just visceral, vitriolic harassment from hundreds, or thousands, or tens of thousands of people over a short period of time.
It has absolutely an effect on the way that you move about in the world, particularly if that harassment or those threats are coupled with doxing (when your personal information is released), and that is something that’s happened to me. I’ve also had the unfortunate experience of being recognized in public by people who, I don’t want to say wish me harm, but certainly people who weren’t fans of me, who then posted my real-time location on their Twitter profiles. One of the guys had tens of thousands of followers and he posted a picture of me.
I think people don’t realize that, again, especially in the post-pandemic era, when we rely on the Internet for so much, our social connection, our careers, that it is just absurd to say to people, to women in particular, that you can’t use the Internet. Like, that’s what you’re asking of us when you say “don’t feed the trolls”. OK, let me just shut down all of my social media accounts, completely remove myself from the conversation, whether that’s a public conversation or whether I’m just somebody trying to live my life and connect with my family and friends online. You’re asking us to remove ourselves from that key part of public life, to not use the affordances that make our lives easier on the Internet, because of the potential harm that it does.
Essentially what it boils down to is “stay at home and raise your kids”, right? Because if you’re not engaging in the public conversation, if you’re not connecting with people, then what are you doing? And that’s really scary to me.
Our policymakers also discount that. They have not yet bridged the gap between what we see online and kind of the harms caused, and the laws that we have for things like stalking, other types of abuse, you know, I’ve brought cases and opened protective orders against people who have harmed me online and I’m using statutes that are related to people who have been in domestic violence situations. And I’m lucky that I have judges who can, you know, see that connection, but that’s not always the case.
Just earlier today I was talking to a researcher in India who, through an organization called IT for Change, is working on a judicial toolkit so that they can help judges in India interpret the legal code there through a feminist lens. They’ve also done training with law enforcement officers. We need more of that in the United States, frankly, and probably in Canada too.
We’re just really lagging behind in our understanding of the impact that online abuse, harassment, and other types of victimization have on people’s offline existence.
In the research you’ve done in this area, what are hidden issues that really stand out to you most?
Women with intersecting identities, so who are, you know, racially diverse, ethnically diverse, religiously diverse, or sexually diverse, experience compounded abuse. Myself as a white woman, I’ve experienced some pretty terrible stuff, but if I had been a woman of colour or a queer woman, I would have gotten way, way, way worse. In particular, with Black women in the United States, abusers skip over all the quote unquote niceties of abuse and they go straight for the violent stuff. They go straight to doxing and targeting people’s families.
You’ve got a banner right behind you that says, “Until all of us have made it, none of us have made it”. And I think it’s really important to give voice to those experiences and the fact that we often overlook intersectionality in our conversations about online abuse against women. So that’s the first thing.
The second thing that I do think is pretty shocking and surprising is how little political will there is among the social media companies where this abuse is taking place to stop the abuse. It is really disconcerting to me that they don’t want to do more to make it a safe place for 50% of their users.
As a result of this inaction from tech companies, we have a severe gender disparity on the Internet. Women don’t use the Internet as much as men or to the same extent. So on Twitter, and this is an old statistic now, but men got retweeted twice as often as women. On Reddit, the so-called front page of the Internet, 75% of users are men. And there’s a reason for that: it’s because there’s some of the most disgusting, misogynistic diatribes I’ve ever seen on Reddit. And what that means, again, is that women are just removing themselves from the conversation. They are self-selecting and self-censoring out.
Your book shares some practical tips on what we can personally do to protect ourselves and each other in hostile digital spaces. Can you share some top tips with us now?
The first and most important thing is to think about how you secure your data.
In some ways I regret that it’s the first practical chapter of my book because it’s boring. It’s just talking about how to change your privacy settings and things like that. But it matters: having good passwords, having two-factor authentication on your accounts, and thinking about how you want to remove your personal data from the Internet. There are various services that you can subscribe to that do this for you. You can also do it on your own, it’s just quite time consuming to write these removal requests and learn how to do it for each of the platforms that your personal data might be present on. I’m talking about ad exchanges and things like this, which are more prevalent in the US than they are in Canada, though it even affects Europeans, even with GDPR. So it’s a thing that we all have to think about.
That is the sort of stuff that will keep you physically safe. It will keep people from impersonating you. It will keep your personal information, and anything else you don’t want public, off of the Internet, and it will make sure that it is more difficult to dox you. Not impossible, unfortunately, because of public records. I find it very ironic that, you know, all our transparency in the West leads to people with authoritarian tendencies using it against us, but that will make it more difficult for people to dox you and put you and your family at risk.
The other stuff I think that’s really important is understanding the policies of the platforms that you are choosing to have an account on. Recently, a friend of mine who’s an author was accidentally doxed, in a long and unfortunate way, by a local blog. It was a sordid tale. But they kind of refused to take it down and didn’t understand the implications of having that picture on the Internet.
And so, I had to walk her through, like, how do we report this tweet. She wasn’t aware of the reporting guidelines, and so one of the things that I advocate for in my book is for everybody to understand, again, what are the things that don’t fly on a platform? How do you know if something crosses the line, and then what do you do when that line is crossed? And I’m shocked but not surprised, again, that so many of the women that I speak to across a variety of different fields don’t know how to report stuff online or don’t take the time to do it. And that might be, again, because of the lack of political will that we spoke about. They know that it’s not going to go anywhere, or have a feeling that it’s not going to go anywhere. But it does send an important signal to the platform that, whether intentional or not, this is an account that has broken the rules. It gives them data about what to look for in the future, so that they can crack down on the people who are using that malign creativity and trying to get around the rules that we talked about before.
In most cases, it doesn’t take very long, so there are ways that you can do that. You can have other people do it for you, but it’s something that I think is part of being a good online bystander on the Internet. And that’s not just when it’s happening to you, it’s about when it’s happening to people that you see as well.
This one is a bit more proactive and not just defensive. But it’s about creating the sort of environment that you want to see online, and this was interesting to see on Threads when people started joining Threads en masse. People were talking about, you know, not reblogging or retweeting, for lack of a better word, the content of the bad actors who were trying to get on Threads. They were like, let’s not do that here, let’s not amplify them, block them immediately, that’s going to give important information to Threads. And it was music to my ears to see that people finally were understanding how the oxygen of amplification works on the Internet.
If you want to see more diverse women sharing content online, support them, share their content, tell them how much it means to you. And if you see the baddies doing horrible stuff, actively report them, block them, don’t engage with them. All of that sends really important signals, again, to the platforms about the sorts of things that people are finding useful online.
And then I think just calling out bad behavior when we see it, and sometimes you know this can be done in a way that incentivizes further abuse. And I’m not a big fan of that, so what I like to do, and I talk about this in the book, is shame people but not do it in a way that amplifies them further. So I tend to take a screenshot. I will blur out, or if I’m on Instagram kind of X out using the drawing tool, their avatar and their username, and then I make fun of them.
Which is kind of cathartic for me, but it doesn’t give them any amplification, like I said. It shames them and points out how ridiculous their behavior is, and also, it’s kind of empowering for me, and for other women who are looking through this stuff and saying like, wow.
I had written a newsletter about my dog who recently died, and it was like an emotional piece about, you know, this being that I spent 11 years with, 11 really important years, and I got so much positive feedback from it, of people who really identified with it. And then one man who just wrote, “you are embarrassing”. And I thought that that was really interesting. It was kind of, you know: “Am I the embarrassing one? You’re the one that’s commenting on somebody’s post about their dead dog. I guess you’re really not a dog person, right?” That sort of thing I think can just really flip the script on like: “I’m not intimidated by you. I’m not going to give you what you want. I’m not going to take the bait, but I am going to call you out for your bad behavior.”
Let me ask you this: What needs to change on a systems level, decision making and policy and practice?
One of the things that I think needs to happen is better oversight of the platforms. And I’m not just talking about a government agency that is telling them what people can and can’t post. I don’t think that that should happen. But what I do think should happen is some sort of, you know, we have our federal commissions here that deal with elections or communications, I think there should be a federal Internet commission or a federal social media commission that acts as a clearing house for data access. So it reviews algorithmic decisions that are being made, reviews content moderation decisions, acts as a vetting organization that can decide if researchers get access to certain data so that they can dig through it and make public some of the decisions that are being made by platforms, or the effects that they’re having on people. Because right now the way that we’re operating is entirely based on whistleblowers and journalists and researchers who are doing things that aren’t always in line with, you know, standard data practices or the terms of service of the specific platforms, and we’re operating with a small amount of data.
Ways into the platforms are slowly being closed off and that allows us to pressure them much, much less than we were before. It’s an indirect effect, right? But the more that we get information about how those decisions are being made online, the broader picture we can have and the more accurate picture that we can have about how that stuff affects people offline.
I absolutely think we need to pressure lawmakers. I try to have hard conversations whenever I’m in the room with lawmakers or when I’m writing things now, holding their feet to the fire, really holding them to account, because they hold the power. And you can have a bunch of working groups; it’s not just lawmakers, but policymakers at other levels as well.
The long and short of it is that we just need to keep speaking truth to power and I wish it didn’t have to be something that I’m doing, but it is something that I think can have a tangible impact for women in the future. And so I’m going to keep doing it.
So help us bring this episode home. What takeaway do you want us to leave with?
I think the other thing that is important to know about online abuse and the realities of being a woman online is that we tend to discount our experiences, and it’s not until someone else gives them voice or recognizes them that we say, yeah, you know what, that is kind of screwed up and I’m not going to stand for it. I think it’s important to talk about this stuff when it’s happening. It’s important to report it if you have the resources to report; in particular, you know, it’s important to report sustained abuse, doxing, stalking, etcetera to law enforcement as well. And let me tell you, it is exhausting. It is not, again, something I would wish on my worst enemy. I have been going back and forth with my stalker on a protective order for the majority of this year.
I view that as also, in its own way, a form of change, because every officer that I encounter in filing that, every judge that I encounter, is going to learn a little bit about what it’s like to be a woman on the Internet today. And that’s going to change the system more broadly for other women in the future, so I encourage folks to talk about stuff, to report it up through the channels as much as they can.
Our foremothers, if I may, started talking about the barriers they had in the workplace and, in particular, sexual harassment that was just endemic. This is not part of the job of being a woman online. This is not the cost of raising our voices, because the men in our lives don’t have to deal with the same thing. And so we shouldn’t stand for it.
Alright, now what? Check out How to Be a Woman Online: Surviving Abuse and Harassment and How to Fight Back by Nina Jankowicz, available in book and audio format.
Get the facts on gendered digital hate, harassment and abuse by visiting our fact page on canadianwomen.org.
While you’re there, read about our new Feminist Creator Prize to uplift feminist digital creators advocating for gender justice, safety, and freedom from harm.
Did this episode help you? Did you learn anything new? We want to know! Please visit this episode’s show notes to fill out our brief listener survey. You’ll be entered to win a special prize pack.
This series of podcast episodes has been made possible in part by the Government of Canada.
Please listen, subscribe, rate, and review this podcast. If you appreciate this content, please consider becoming a monthly donor to the Canadian Women’s Foundation. People like you will make the goal of gender justice a reality. Visit canadianwomen.org to give today and thank you for your tireless support.