With Sarah Sobieraj, author of Credible Threat: Attacks Against Women Online and the Future of Democracy. The Commissioner for Human Rights in the Council of Europe says, “just speaking out … about issues online, often when related to feminism, gender equality, sexual abuse or specific aspects of women’s rights, such as sexual and reproductive health and rights, may be a trigger for violence and abuse.”

The goal of this violence, says the Organization of American States, is to “create a hostile online environment for women in order to shame, intimidate, denigrate, belittle, or silence them by means of surveillance, theft or manipulation of information, or control of their communication channels.”

No wonder that almost one-third of people in Canada are hesitant about using social media or taking part in online discussions due to harassment concerns. We all lose out when women and gender-diverse people are silenced.

Over the coming months, we’re delving into this with leading experts and content creators, releasing in-depth episodes every single week. We talk about the problem and what we can do to change it. We offer practical tips to help you in your digital life, and we talk about what it means to “take back the tech” for all of us.

Our guest Sarah Sobieraj is Professor and Chairperson in the Department of Sociology at Tufts University and is a Faculty Associate with the Berkman Klein Center for Internet & Society at Harvard University. She’s an expert in US political culture, extreme incivility, digital abuse and harassment, and the mediated information environment. Amongst other books, she is the author of Credible Threat: Attacks Against Women Online and the Future of Democracy (Oxford University Press, 2020), The Outrage Industry: Political Opinion Media and the New Incivility (Oxford University Press, 2014) with J. Berry, and Soundbitten: The Perils of Media-Centered Political Activism (NYU Press, 2011). Her scholarship can also be found in journals and venues such as the New York Times, the Washington Post, the Boston Globe, Politico, Vox, CNN, PBS, NPR, National Review, and The Atlantic, among others.

Content note: this episode addresses gender-based violence.

Transcript

00:00:02 Sarah

The venom that’s coming is really emerging from, I think, an unsettling and often subconscious sense that these long taken-for-granted social hierarchies might be shifting, right? That they might be upsetting the footing of those who’ve been traditionally able to speak and be listened to without being challenged or without more competition for even just airtime.

00:00:24 Andrea

When you break out of the role you’re expected to perform, you can get attacked in digital spaces. The problem is that the role is so narrow for women and gender-diverse people. Many are attacked for even existing.

I’m Andrea Gunraj from the Canadian Women’s Foundation.

Welcome to Alright, Now What? a podcast from the Canadian Women’s Foundation. We put an intersectional feminist lens on stories that make you wonder, “why is this still happening?” We explore systemic roots and strategies for change that will move us closer to the goal of gender justice.

The work of the Canadian Women’s Foundation and our partners takes place on traditional First Nations, Métis and Inuit territories. We are grateful for the opportunity to meet and work on this land. However, we recognize that land acknowledgements are not enough. We need to pursue truth, reconciliation, decolonization, and allyship in an ongoing effort to make right with all our relations.

00:01:25 Andrea

Whether you’re on social media, streaming platforms, dating, messaging, and meeting apps, or on game sites, if you’re a woman, girl, or Two Spirit, trans or non-binary person, you’re at greater risk of hate, harassment, and violence.

The Commissioner for Human Rights in the Council of Europe says, “just speaking out … about issues online, often when related to feminism, gender equality, sexual abuse or specific aspects of women’s rights, such as sexual and reproductive health and rights, may be a trigger for violence and abuse.”

The goal of this violence, says the Organization of American States, is to “create a hostile online environment for women in order to shame, intimidate, denigrate, belittle, or silence them by means of surveillance, theft or manipulation of information, or control of their communication channels.”

No wonder, then, that almost one-third of people in Canada are hesitant about using social media or taking part in online discussions due to harassment concerns.

We all lose out when women and gender-diverse people are silenced. Hostility, attacks, shunning, belittlement, and unsafety hurt some people more than others, but that’s the thing about abuse. It has a way of spilling out and chilling everybody, making our world worse for everyone, in the end.

Over the coming months, we’re delving into this with leading experts and content creators, releasing in-depth episodes every single week. We talk about the problem and what we can do to change it. We offer practical tips to help you in your digital life, and we talk about what it means to “take back the tech” for all of us.

Today, we’re with Sarah Sobieraj, Professor and Chairperson in the Department of Sociology at Tufts University and Faculty Associate with the Berkman Klein Center for Internet & Society at Harvard University. She’s an expert in US political culture, extreme incivility, digital abuse and harassment, and the mediated information environment. She’s author of several books, most recently, Credible Threat: Attacks Against Women Online and the Future of Democracy.

A note about content: this episode addresses gender-based violence.

00:03:43 Sarah

My academic interests have to do with voice and visibility, specifically political voice and visibility, and how that’s shaped by interventions with the media. So how the media might make some voices more or less visible or audible, publicly. Personally, I love everything from baking to art to cards – I play an awful lot of cards – and even improv. I enjoy a number of different activities.

00:04:12 Andrea

What have you learned about the way gendered digital hate, harassment, and abuse plays out? What’s the root cause of this?

00:04:18 Sarah

There’s a few things that I noticed about it that I feel like can get lost in the broader story. One is that when women, who were the focus of my study, when women are attacked, they’re attacked on the grounds that their very identities are unacceptable. So it’s their perceived – often it’s perceived wrongly, but it’s their perceived race, gender, ethnicity, religion, sexual orientation, etc. that are the central basis of condemnation. So that is the problem, and you can hear that in the abuse, the way it sounds. So if I think – because I am often looking at people who are politically vocal or talking about social issues – if I think, for example, of the Vice President of the United States, Kamala Harris, her race and gender are frequently invoked. There’ll be gendered and racialized epithets, certainly stereotypes about the identity groups to which she’s perceived as belonging, sexualizing comments, etc.

Another noteworthy feature of the identity-based abuse is that, although it is intimate and ad hominem, the attacks are also simultaneously generic. If someone is calling you a skanky whore, that feels very personal, of course, but what you notice quickly is that the attacks are actually impersonal and that they could essentially be lobbed at anyone from your social location. So you might look at something terrible that’s said about Kamala Harris, again, and find that the same exact insult, if you took her name out, could be used against any other woman of colour in a particularly visible space. So it’s this nonspecific misogyny that’s a red flag.

Again, there may be religious or racial, ethnic cues, these sorts of things, certainly. That sort of variation really reminds us that although that abuse looks and feels like bullying, like interpersonal bullying, the rage is much more structural. It’s actually rooted in hostility toward the voice and visibility of individual speakers as representatives of a group of people.

00:06:28 Andrea

You talk about digital hate and abuse as “patterned resistance” against growing visibility of women and gender-diverse people online. Tell us more about that – why is it so important to understand this patterned resistance?

00:06:41 Sarah

We tend to hear people talk about these phenomena, even when you’re talking with a coworker or a friend about their experiences, but certainly also in the headlines, as though they are individual problems or personal issues rather than sort of patterned resistance to the voice and visibility of specific groups of people. And the reason that this matters, that we understand that it’s actually resistance and anger that’s against entire groups of people, is that it helps us to understand why the attacks are unevenly distributed – why women tend to get so many more, and why they’re unevenly distributed even among women.

The abuse is unusually burdensome for people with three sorts of attributes. So, the first is women who are members of multiple marginalized groups. So, if you’re a woman of color, you’re likely going to be blessed with racialized, gender-based attacks and race-based attacks, in addition to the kinds of harassment that white women might experience. Or if you’re Muslim, Jewish, or a member of another religious minority group, you’re going to find sort of additional layers of hostility.

Another group of women who are particularly targeted tend to be women who challenge or complicate realms that are male-dominated. So, that might be if they’re active in speaking about gaming, technology, science, politics, policy, law enforcement, the military, sports, right – any sphere that we think of as male-dominated or that has typically been male-dominated, there is more anger and hostility directed at those women.

And then the third category of people who tend to receive a particularly heavy dose of abuse are feminists, or people perceived as feminists, or women who I came to think of as otherwise non-compliant with gendered expectations. And the non-compliance can be a pretty low bar. You don’t have to do a lot to be non-compliant. So, it might be someone who’s overweight and body positive or comfortable with their appearance. It might be a woman who’s simply in a position of power. It might be a woman who’s open about enjoying sex. And of course, you can be in multiple of these categories, right.

So, if you think of someone like maybe Anita Sarkeesian from Gamergate, you know, who might be someone whose abuse people are familiar with, even though it’s been almost a decade since that really came to our attention. You know, she was sort of at the intersection of those fears – she’s writing about gaming, perceived as feminist, her ethnicity was often misrecognized, but people would pull their sort of assessments of her into account.

The hatred that’s there is really a visceral response to these women as destabilizers – and they are, by the way, destabilizers, whether or not they intend to be. So, if you are a woman writing about policy or science or tech, your very presence destabilizes, and even more so if you are a woman of color, and certainly if you’re a feminist.

So, if we haven’t had to – and I say that in air quotes – if we haven’t had to listen to the voices of Black, Muslim women or trans women, maybe, or Black trans Muslim women in gaming … The venom that’s coming is really emerging from, I think, an unsettling and often subconscious sense that these long taken-for-granted social hierarchies might be shifting, right, that it might be upsetting the footing of those who’ve been traditionally able to speak and be listened to without being challenged or without more competition for even just airtime.

Because if you’ve been benefiting from inequality by being able to hold the floor more often or hold more sway in conversations, especially if you’re someone who perceives the inequality as meritocratic, right, it’s understandable that a shift toward access or inclusion or equality might paradoxically feel like injustice instead of more openness, right. Because now you’re sharing the floor. But if you perceive that floor as something that belonged to you in the first place, it doesn’t feel right. And I do think, again, that this can also be subconscious. Just sort of the sense that, “Why are people listening to you … I don’t have to listen to you”. That kind of thing.

00:11:05 Andrea

What’s the impact of patterned resistance on those who are targeted most?

00:11:09 Sarah

If I think strictly about the personal – and I don’t want to suggest I can sort of separate it from the social, which isn’t entirely easy for me – but at the personal level, and not talking about me personally, but the women who I worked with in my research, many of them struggle with mental health, for example. There are women who have experienced PTSD as a result of the intensity of the attacks or the frequency of the attacks. A lot of them have faced economic disadvantage or consequences as a result. Maybe they’ve had to hire bodyguards, or maybe they need to take Ubers when they would have normally walked. Or maybe they need to do things like hire a reputation management firm to clean up things that are online.

But there are also the more subtle reputational harms that can happen. People I spoke with also struggled with just social ramifications. People … a lot of what is said about women online is defamatory, it’s hurtful, it can be stigmatizing. It sort of complicates their lives. And I will say, in terms of women’s ability to cope, one of the things that I think is most important and often also missing in the conversation is that your ability to manage an onslaught of sort of vitriol online, especially if it happens over a period of time, is also shaped by your privilege and position.

So, if I even look at the examples I just gave, think about whether you have the luxury of, for example, hiring that reputation management firm. Another thing women will do is outsource their social media management, so they have somebody else literally looking at and working through the comments and the abuse. But also – to use my probably safer “on air” but still unpleasant example of skanky whore – if someone calls, you know, a faculty member, a senior faculty member at Harvard University, a skanky whore, that would not feel good. It would not be good, but it’s probably not going to be overly damaging if she’s established with a reputable, you know, well-known institution and has a body of work behind her. For women who are new in their careers, defamatory things that are said about them online, cruel things that are said about them online, can really, really be very, very damaging. If you’re a freelance journalist, for example, right. Journalists are dealing with an enormous amount of hate online. If you’re a freelance journalist just starting out, or even, you know, a new staffer who has a permanent position, and these kinds of things are said about you, it can cast doubt in complicated ways.

If you are, say, a Latina, and the abuse that’s spread about you – the comments, the hate, the doctored videos … if that maps onto existing stereotypes about your racial and ethnic group, then it is perceived as more believable and more plausible, right? It’s going to have a bigger impact.

00:14:21 Andrea

How about the impact of patterned resistance on us as a group? As a society, even? How is it shaping us, whether we are fully conscious of it or not?

00:14:30 Sarah

Of course, the threat of unwanted sexual attention in particular, and violence, has always constrained women’s use of public space. If you think of things like the fear of sexual assault that looms over women when they’re alone in nature, or when they’re maybe in an elevator or parking garage, public transit, maybe on college campuses, right?

So, the fear of sexual assault, the sense of vulnerability and the embarrassment and humiliation that often come with street harassment, or the landmines that are presented by sexual harassment in the workplace – these are all things that have always shaped the way women use and feel comfortable using public space.

Literally decades ago, the geographer Gill Valentine studied the way women felt about and used physical public spaces, and what she found was so interesting. She noted that women were constrained by what she called these mental maps of the public spaces that they used, maps that were shaped by their fear of male violence and unwanted attention. So, they had pretty elaborate thoughts and codes about where they could go, with who, at what times, you know, under what circumstances. And when I’m listening to victims of digital abuse and harassment, it is clear that just like that threat of sexual intimidation and assault constrained where we go in physical spaces, digital abuse is creating a climate of fear and self-monitoring that mirrors those calculations that women are making when they go out into other public spaces, right.

It hovers over the keyboards of the women that I work with. I mean, as this begins to accumulate, some of them left their preferred social media platforms altogether. Some took lengthy breaks to sort of recover from fear and frustration, some are under pseudonyms or use male avatars. Almost all were engaged in some sort of self-censorship – even if they wouldn’t define it that way, what I would see as self-censorship – where they are choosing very carefully what topics they will and won’t write about and where they’d like to publish them or are comfortable publishing them.

And this is especially true when the issues they want to address are controversial or their point of view is unpopular. Those who are most marginalized may often be bringing new points of view to bear or saying things that are in fact unpopular, if for no other reason than that they’re unfamiliar or difficult to hear. It is difficult and it is scary, and they are weighing their choices carefully.

I think we are all already being impacted by identity-based abuse and harassment, and I’ll keep it to just, you know, three or four ways that it’s shaping the social landscape, particularly around democracy. The first of which is – and it sounds maybe very romantic at this point – but, you know, a robust democracy is built on political discourse where people, including those without power, can discuss even the most sensitive topics. Abortion, guns, immigration, right? Share their ideas, experiences, and opinions freely. And when we lose those voices, there are policy implications, right – it’s difficult to figure out how to approach different issues if we don’t know how they affect people differently, right.

And it also has epistemological implications, about what we’re even able to know. If we don’t know what the experience is like for, say, recent immigrants, right, it’s very difficult to process and think about the things we contend with. One of the key components of the abuse and harassment is disinformation. So, attacks on people – whether you’re talking about female journalists, public intellectuals, activists, people running for office, et cetera – are riddled with made-up information that casts doubt, then, on the reliability of news, often, on scientific findings, on proposed policies, on the integrity of the speakers.

If we cannot trust or we begin to lose all trust in journalism, or all trust in people who are running for office, or trust in information more broadly, that’s a real problem. Democracy is only legitimate to the extent that we have the ability to make informed decisions on our own behalf when we vote. And if we don’t have quality information, or we cannot discern reasonably generated information that’s been produced responsibly from information that is garbage, we’re not able to do that the way we need to.

Totally connected to the point I just made is that journalism itself is suffering. Female journalists’ self-censorship is not limited to the personal opinions that they share. It impacts what they cover and how they cover it. And this came up repeatedly in the interviews that I did where a woman would tell me, “Oh, I will not even touch a story on sexual assault” or “I’m not even gonna touch a story on this topic that we’re talking about, I don’t want to write about this, it’s not worth it”.

And then the last way I’ll mention is what we can think of as the pipeline problem. If you are on the sidelines, and you are a young woman or a person from another marginalized group, and you’re looking on, and what you see is that activists, candidates, women in office, advocates, public intellectuals, right, are treated this way, it becomes pretty clear that political involvement and activism are high-risk endeavors that are probably best avoided. We cannot afford to lose a big chunk of valuable minds from the pool of possible people doing public service work. The senior journalists that I speak to say, “I sometimes don’t even know what to say when young women ask me about going into journalism, I don’t know if I would do it again.”

00:20:36 Andrea

You’ve been doing this research for a long time. Do any findings shock or surprise you?

00:20:42 Sarah

It almost makes me feel choked up as I think about it – I think it is making me a little choked up. I think it’s difficult to convey how ugly and hurtful the harassment and abuse is. And even when I give sort of academic talks on the topic or try to write about the topic, propriety and a number of other things make it very difficult to share what it looks and sounds like. A number of the participants had the experience of someone doctoring images or videos from fairly gruesome, like torture-based, pornography with their likeness in it. And for others, just, you know, more mundane pornography. I remember one woman who said – and this is not a direct quote, but essentially what she was saying was – “I know it’s not me, and the people who know me know that it’s not me, you know, it might not even be a very good editing job, but it feels like it’s me, it is still humiliating”. And so, when someone takes that and puts it out into the universe – and you cannot, as we know, right, you can’t really get that cat back in the bag because of the way digital technologies function – when that happens, it’s really horrible.

If someone’s making rape threats, which are incredibly common, they’re often not only very, very violent and graphic – you know, it’s descriptive – but they will often also invoke their identities. It’s ugly, is my point.

So, I think the other thing, though, that remains shocking is just how damaging it can be to someone’s career, to their life, to their mental health, their physical health, their well-being. That a person’s life can change in a matter of days as a result of a particularly intense or well-coordinated attack. And it may be years in the making to get back to a place where they feel safe and comfortable. I have women in my study who … there was a woman who changed her name and left the country, a woman in the study who’s had to work with bomb-sniffing dogs to check for IEDs when she goes to give talks, right? These attacks are unbelievably damaging when they are severe.

And I want to say also that, in a way, it’s a paradox. When we attend too much to the severe ones, we also run the risk of normalizing the more routine – what has become, disgustingly, routine – abuse and harassment and hostility. So, it is not uncommon for me to have the experience of being at – when I was doing this book, in particular – being at a social event, somebody asking me what I work on, and I say, “Hmm, I’m working on, you know, abuse and harassment and hostility online”, and they’ll say, “Oh, I’ve been really lucky”, and they might say, “I mean, I get called a bitch and a slut, but it’s been … I’ve really not had a problem”. So, to have that sort of normalization, it makes it feel as though it is not okay, or maybe it’s overreacting, to complain or to say something.

For a lot of the women that I work with, and I think a lot of women in general, the abuse and harassment is coming in the context of their work environment. This is a new terrain, I think, for a lot of employers. When you are asking someone – and let me say you might be asking them directly, like, “You need to be on social media promoting your articles or promoting your campaign”, or it can be indirect, where the metric of the quality of someone’s work is how many views it’s gotten, or how many click-throughs, right – if you are in any way incentivizing their public voice and visibility, you have to recognize that that experience of being public is going to impact the people you are employing very differently based on who they are, the work that they’re doing, and where and how widely they are seen.

00:25:05 Andrea

Can you share solutions you think will help those who are targeted, maybe when it comes to platforms and policies and systems?

00:25:13 Sarah

It is a really difficult problem because that sort of language processing tool, automated detection, is notoriously weak at getting meaning – satire, irony, right. It’s very difficult. You can, yes, create a “bag of words detector”, but that’s not going to work. It’s not going to work well, and it’s going to over-select in these types of things. But what we have going on now is interesting in that it’s almost hard for us to remember, but initially, we worked online more from a model that was “I’m going to say something, it’s going to be filtered, and then published” – just the way Clay Shirky would talk about it. You know, you’re going to go to a comment section, it’s going to go into moderation on your news story or whatnot, and then it’s going to be published. Now we are at a publish, then filter model. You know, it has sort of flipped, so everything goes out on most platforms. For something to be attended to, it needs to be flagged or reported or – you know, it depends on the language that they use on a particular site – in order to even get reviewed. And the problem with that, for those who are targeted, is you can’t unsee or unread, right. It’s already sort of out there in the ether.

And the idea of reviewing everything, you know, going back to a filter first, then publish model at the scale that we’re at – I get that that might not be reasonable, viable. I also would not want to be the human being that does that work, right? There’s a ton of great research on how difficult and damaging it is to be a content moderator, but that’s a separate conversation. But I think that there are a couple things that would be reasonable asks of platforms. If someone has been targeted or is a likely target – say they’re running for office, right – I think it is in fact reasonable to ask tech companies, for some stretch of time, whether it is a week, whether it is two weeks, it’s going to depend on the circumstances, that anything that is – again, it depends on the platform – directed at that person does in fact get filtered first and then published.

Yes, it would be expensive. It would be difficult. But I think it is an in-between, at least, that would be something that would help. Another thing that would really help is if there were a probationary period when you made a new account. When someone’s being attacked, if an account gets, you know, sort of banned or blocked or suspended, it is very easy to just create a new account, even to do that through automated procedures. If you instead had to earn your way out of, you know, sort of a probationary period – so that for some period of time or number of posts, everything you wrote went through the “filter, then publish” model, and then you earned the right to speak without first being filtered again. To me that seems an in-between that is asking a lot of tech companies, but that would not be the same as saying, “Okay, you’ve got to sort of monitor everything all the time”, which I don’t think is reasonable.

Again, it’s not going to address the crime thing, but if we had an advocate assigned to someone when they’ve been the victim of online abuse and harassment, in the way we often have someone assigned if they’re the victim of, say, sexual assault, that person could at least support the target or victim by helping to answer the questions. You know, the questions – because I get them – are predictable, right? Can I get this taken down? Where do I report this? What am I allowed to do? You know? Who can help me? So, people who understand and could respond to the problem can give them the tools and direct them to the places that can be the most supportive, and at least recognize that although we don’t have a law, perhaps, to go after the people who are doing this to you in many cases, we do have some ways to support you as you navigate this very difficult space that you’re in.

Because the police will often say, “Well, let us know if anything happens”. And so, they privilege physical harm above all else. There is real harm that happens when your reputation is damaged, when your mental health is damaged, right – there are harms there that, in the eyes of law enforcement, are not sufficient to really sort of respond to.

I feel, in a sense, like the female journalists who are more senior who I talk to – I don’t really know. Like, people ask me, “What advice do you have for academics who are writing about sensitive or controversial, polarizing topics?” I’m loath to give advice. I don’t want to tell people, “Don’t do it”. And I don’t want to tell people, “You have to just sort of soldier through”, because it’s pretty unpredictable. There’s a sort of randomness about it that’s striking. You know, you can have one person writing about very controversial content in a highly visible way who deals with some, you know, a little bit of negative blowback. You can have another person, with perhaps even a slightly smaller audience, writing in a similar way about a similar topic, who becomes the center of attention for the wrong group of people, and the effects can be devastating.

00:30:49 Andrea

Alright, now what? Check out Sarah Sobieraj’s book Credible Threat: Attacks Against Women Online and the Future of Democracy, published by Oxford University Press. Get the facts on gendered digital hate, harassment, and abuse by visiting our fact page on canadianwomen.org.

While you’re there, read about our new Feminist Creator Prize to uplift feminist digital creators advocating for gender justice, safety, and freedom from harm.

Did this episode help you? Did you learn anything new? We want to know. Please visit this episode’s show notes to fill out our brief listener survey. You’ll be entered to win a special prize pack. This series of podcast episodes has been made possible in part by the Government of Canada.

Please listen, subscribe, rate, and review this podcast. If you appreciate this content, please consider becoming a monthly donor to the Canadian Women’s Foundation. People like you will make the goal of gender justice a reality. Visit canadianwomen.org to give today and thank you for your tireless support.