Disinformation always underpins hate. Always, always, always, always. It’s a lie at the heart of it and they found the places where they can spread those lies and they have colonized them incredibly effectively.
Countering gendered digital hate, harassment, and violence, and the misinformation that underpins it, means changing the way digital spaces have been set up in the first place. How do we do that?
I’m Andrea Gunraj from the Canadian Women’s Foundation.
Welcome to Alright, Now What? a podcast from the Canadian Women’s Foundation. We put an intersectional feminist lens on stories that make you wonder, “why is this still happening?” We explore systemic roots and strategies for change that will move us closer to the goal of gender justice.
The work of the Canadian Women’s Foundation and our partners takes place on traditional First Nations, Métis and Inuit territories. We are grateful for the opportunity to meet and work on this land. However, we recognize that land acknowledgements are not enough. We need to pursue truth, reconciliation, decolonization, and allyship in an ongoing effort to make right with all our relations.
Whether you’re on social media, streaming platforms, dating, messaging and meeting apps, or on game sites, if you’re a woman, girl, or Two Spirit, trans, or non-binary person, you’re at greater risk of hate, harassment, and violence.
In our connected lives, it’s easy to forget the basic facts of digital media and the internet itself as a new technology. Take social media. Social media spaces are run by corporations, many of which are based in the United States. The most influential ones were launched out of the state of California. The global footprint of these companies is huge. There are over 36 million internet users and 33.1 million social media users in Canada. That’s the vast majority of our population. But users in Canada represent only a small slice of the world’s social media users. For all the time we spend on social media and the internet, it’s rather under-regulated. What a user wants – safety, connection, belonging, community – is often at odds with how the spaces have been set up. In Canada, like the rest of the world, we’ve got a lot of challenges with regulation.
It means that gendered digital hate, harassment, and abuse keeps happening every day. It’s pervasive, urgent, and growing.
In coming months, we’re delving into this with leading experts and content creators, releasing in-depth episodes every single week. We talk about the problem and what we can do to change it. We offer practical tips to help you in your digital life, and we talk about what it means to “take back the tech” for all of us.
We’re joined by Imran Ahmed founder and CEO of the Center for Countering Digital Hate. He’s an authority on social and psychological malignancies on social media, such as identity-based hate, extremism, disinformation, and conspiracy theories. He regularly appears in the media and documentaries as an expert on how bad actors use digital spaces to harm others and benefit themselves, as well as how and why bad platforms allow them to do so.
He starts by sharing what the Center for Countering Digital Hate is all about.
A note about content: this episode addresses gender-based violence.
It’s a US-headquartered nonprofit that looks at the way that digital hate and disinformation are produced and distributed on social media in particular. And we’ve realized over time that those have serious costs for our society. I realized it seven years ago when my colleague Jo Cox, who was a 35-year-old mother of two, a Member of Parliament in the UK, a colleague of mine, was assassinated by a far-right terrorist who’d been radicalized online. He shot, stabbed, and beat her to death, and as he did it, he was shouting slogans from the Internet.
That was my moment of clarity when I realized that social media was resocializing the way that many people thought about politics, about reality, driving people to the edge of – to insanity in his case.
Ever since, I have tried to work to study the way that bad actors use platforms, the way that platforms make it easy for them to do so. In fact, make it worse and then try to create costs for the production and distribution of hate and disinformation. How can we make it less attractive? How can we change the economics, the calculus so that we have safer online platforms?
Jo was killed in the midst of a referendum in the UK – the Brexit referendum. You know, two things had happened in our politics really suddenly. One was the rapid influx of antisemitism into the political left. The second was this referendum that was full of conspiracy theories and hate. A variant of the great replacement theory was very prominent – that the EU was trying to bring in Muslims to rape 14-year-old white girls, to destroy the white race. And also other, theoretically less malignant, conspiracies – like the claim that election officers had been told that if you voted in pencil, they would rub it out and mark a Remain vote in pen. Which of course is a precursor to a lot of the disinformation undermining elections that we see in the United States and elsewhere.
The first act that I did after my friend was killed and after I quit working in politics in Parliament was to study it, and I spent three years studying and I went to speak to the platforms. I knew executives there. And I’d show them what I was finding. But the problem is that they weren’t really listening, and I could tell that they weren’t really listening. Well, they were listening, but only because they knew that that meant I wouldn’t go and complain to the public or to the press or to anyone else. So, they made me feel that change was possible, but it wasn’t really.
And at the end of three years, in September 2019, we launched the public organization. We’ve gone from two guys sitting in back rooms trying to work out what the hell was going on to a medium-sized organization with 20 staff around the world, probably at the forefront of studying this particular aspect – the digital resocialization of our society.
That journey was really important in shaping our theory of change because we realized that it wasn’t just about the platforms. Of course, they’re full of normal people. Of course, those people don’t want to see more hate. I meet very, very few people, apart from hardcore hate actors, who want to increase hate. We’ve worked with administrations on the left and right, and what we realized is the platforms actually have an incentive to keep hate on there, because if they remove hate content, they remove some of the most engaging and emotional content on their sites, which reduces the amount of attention and therefore reduces their revenues. And they have to take an action – they have to have staffing in place to go and do that – so it’s a double cost: you reduce your revenues and you increase your costs. And if you do nothing, well then, frankly, nothing happens. There’s no other mechanism, no other lever that is imposing costs on them. So the calculus – actually almost a fiduciary duty of executives there – is to do nothing.
So, we’ve spent the last four years studying, well, how can we create new costs for companies? Whether that’s through regulation, through working with advertisers who will say we won’t fund, we won’t give you advertising if you continue to tolerate the spread of hate, but that’s how our theory of change works and it’s been incredibly successful in doing so.
Can you share more about how pervasive digital hate, disinformation, and misinformation affect women and gender-diverse people and all kinds of marginalized communities?
You know, I’m 44 and I grew up in the North of England in Manchester, which is a fairly socially conservative place. My best friend at school, Tom, is gay. I got married a couple of years ago; Tom was my best man, and he told this story, to my utter horror, to an entire room of people about how we grew up under Section 28 of the Education Act, which was what Thatcher introduced. It was basically the same as Florida’s “Don’t say gay” law. You couldn’t talk about homosexuality. Tom told the story of how, when we were like 12 or 13, we didn’t know what it was that he was feeling because there was no one to tell us what gayness was. And so, we convinced ourselves it was a phase. I was very tall – 6’1” when I hit 13, and I never grew another inch – and I went to the train station, and we bought a magazine that had pictures of girls in bikinis, and I gave it to him and said, “Read this, and you’ll like girls”. In one respect it’s a really sweet story of two really dumb kids, but in another respect, it sounds as though I was trying to deprogram him. I wasn’t. He’s gay and very happy, and I’m very happy about that. But that was the world that we lived in, and actually the world has gotten better since then for gay people, and I’m really, really happy about that.
I’m really happy that we’ve become more tolerant. I’m really happy that the way that women are treated in our society is better than when I was growing up. I’m really happy that the way that we treat trans people, that we treat people of diverse genders and identities … A lot of my staff are Gen Z and their openness, their vulnerability, their inclusivity, is extraordinary to behold and sets such a high standard for me.
But actually, I think that there has been a pushback and there are people that want us to desocialize us, to revert us to an atavistic state where women and gay people and trans people are considered lesser humans. They look for any space in which they can spread the hate and the disinformation – and disinformation always underpins hate. Always, always, always, always. It’s a lie at the heart of it. And they found the places where they can spread those lies and they have colonized them incredibly effectively.
And of course, because of the way those platforms work, controversy amplifies them – ironically, our annoyance, our fury, our rage at what they write amplifies them. It doesn’t actually dissuade them. It rewards them with further amplification, which leads to greater visibility and therefore, by definition, to normalization – the resocialization of our society. The result is an increasingly big backlash against rights that we have fought for and earned over decades, which are being stripped down and slowly dismantled in a matter of years.
In your writing and media work, you drill down on all kinds of relevant issues. I’m particularly wondering how you see artificial intelligence impacting women and gender-diverse people and the push for human rights in general.
It’s such an interesting question. In one respect, the AI that most people have been talking about for the last year or two is really a very specific subset of AI – generative AI, which is about the production of content by algorithms that approximate intelligence by essentially copying the intelligence of other people. They take in vast amounts of other people’s data, almost like a sophisticated autocomplete. You ask it, “what do you think about X?” and it says what the most frequent answer to that question is. And that’s really what it is.
There are lots of different ways in which AI – what I think of as sophisticated algorithms, sophisticated mathematics – is affecting our world. The most currently present and exigent harms created by AI are actually in things like employment software: HR departments will use AI to sift through resumes. There are so many different ways in which AI is being used right now. But again, without any guardrails.
So we haven’t come to a conclusion as to how that AI should work and how it should serve society rather than reshape society. And a very good way of thinking about it is by using the example of generative AI. Generative AI necessarily reflects the biases of the content that’s put into it. Elon Musk is currently saying that people can’t scrape Twitter to get lots and lots of data on what people say. That’s probably a good thing, because from what I’ve seen, Twitter is full of some pretty stupid takes on things, and occasionally a couple of good ones. But if you just take in the Internet unfiltered – take all content, scrape it, essentially, and put it into your large language model – why are you surprised if the large language model outputs a load of nonsense?
And it’s the same with things like resumes. What you do is you bake in the prejudices that exist in the material, in the coding, in the values being input into it, and you almost assign them some sort of legitimacy by calling them intelligence. And that’s really problematic. Ironically, I think one of the problems with AI is that we call it intelligence and not what it is, which is just a shortcut for copying and pasting what someone else has done before, baking in some of the prejudices and problems. So, I think across a whole array of areas, AI hasn’t been thought through hard enough at a level at which meaningful action could be put in place to ensure that safety, transparency, accountability, and responsibility are in place.
Your work makes me zoom out and think about these issues from a high level – the purpose and function of digital media companies that run spaces where we interact. It was built a certain way and that’s a big part of the looming hidden problem.
I do see a lot of virtue in it – I’m not a technophobe. I’m very much a technophile. Growing up, I had a computer with 32K of RAM when I was four or five years old, the BBC Basic Computer, and I’d code in BASIC. I’ve got six siblings, so the only way my parents could control us was by giving us a Super Nintendo and making us play against each other. It’s still a really important part of my life.
I moved to the United States in June 2020, during the pandemic, and of course, social media was the means by which I received love, succour, connection. I could express myself and be vulnerable and receive it back. And it was really important to me. But that’s not why social media companies created their companies. They created their companies because they knew that there was a market in creating new billboard space. And that’s all they are – they’re billboard companies, they sell ad space. 98% of Meta’s revenue comes from advertising, a business that has made Mark Zuckerberg 100 billion dollars as an individual. That’s what he’s worth. He got rich faster than almost any human being on Earth in the history of mankind. He did that by selling ad space.
The American view on growth and the economy is that you have to keep growing. So once they’ve built the basic system, which is pretty cool, then they have to start thinking, well, how do we keep people on there for longer and longer and longer? And that’s when you get changes like the news feed. No longer are you allowed to simply see what your friends and family are saying. You have to see what the computer thinks is most likely to keep you addicted, and what the computer realized very quickly was that it was actually that which gets you pissed off. That which makes you emotional, that which is sticky, makes you argue, and then you sit there seeing how many people like your response to the bad person.
What their business is now, and what that business has always really been, is attention and addiction and advertising.
We’ve been interviewing such savvy people who know their stuff and have so much to say. I wonder – does anything surprise you anymore in your work?
My wife says that I’m one of those people who can watch a mystery movie and, while five minutes in she’s worked it out, even when they reveal the murderer I’m like, “Oh my God, I can’t believe it was them”. So I’m not actually a sleuth, a Holmesian character, but I take things in my stride.
Right now, for example, we’re being sued by Elon Musk, after he started calling me a rat on his platform, Twitter, and calling out people on my board. A couple of days later, Jim Jordan asked us for documents. Who’s Jim Jordan? He’s a member of the House of Representatives and chair of the Select Subcommittee on the Weaponization of the Federal Government. He’s investigating CCDH.
All these things happen simultaneously, and I have to admit I just see them as being the natural response to our success – and in that respect, of course, you expect that, just as we were a natural response to the harm. You can’t get away with creating that much harm for that long to that many people without there being someone who will stand up and go: “Sorry, that’s not acceptable”. So, I don’t take it personally, but it has been incredibly difficult. Believe me, we’ve had to raise money, and we’re still trying to raise money to fight that lawsuit and make sure that we’ve got the best defense possible. However, I don’t get surprised by these things, I don’t get upset by them – we just accept that as part of the cost of doing business.
But that’s business, and the rest of my life is completely different. I remember in my very early 20s, someone I was very close to – my best friend’s boyfriend, actually, who was Corsican – said to me, “Imi, every day you have to find something in your life which makes you go, wow”. And I do. I find something every day in my life that makes me go “wow”, so, you know, I am a blessed person in that respect.
What can we do to push for change and deal with problem platforms and a lack of accountability and recourse in digital spaces?
The hate, and their attempts to resocialize our society, come about because they’re losing – because we’ve succeeded over decades and centuries in making the world a friendlier place and in increasing the degree to which gay people, women, and trans people can participate fully in society and enjoy success.
The means by which they fight back are dirty. Things like the spreading of disinformation and lies deliberately to undermine tolerance, things like trolling and abuse.
I always say to people, “The reason you’re trolled is not because you’re a bad person – they may not even think you’re a bad person – but because they know it’s a means by which they can silence you”. And so, the most important thing is that you continue to have your voice.
Some of the wins that we’ve had in our society haven’t yet been banked – they exist at the level of social norms, of attitudes and behaviours and values. Actually banking those wins through legislation, and ensuring that our legislative and regulatory frameworks reflect the morals of our society, is really important. And that’s where our politicians have really let us down. I say this knowing a lot about the UK and US and the EU, and less so about Canada. But I really feel that our politicians lag badly behind society’s evolving tolerance and values. What they have also failed to understand is the degree to which digital spaces are being used to push things even further backwards. In fact, they’re using the distorted lens that social media provides on opinions in society as a reason for not passing the legislation that would bank those wins – legislation that would ensure those rights are protected in the law and that our values are reflected in the law as well.
I think that this battle – CCDH fights a tiny, tiny sliver of the battle. We look very specifically at how we can change the cost calculus within those companies themselves through the social media platforms themselves, through regulation and talking to the advertisers, making it costly for them to do nothing. But there are so many other aspects of our fight.
Of course, it’s all of our fight. But it’s not the fight that I’ve chosen to be an active combatant in – I’m a combatant in one aspect. Banking those other wins is important too. And look at the way that we frame things, the way that we explain to politicians. I’ve worked with politicians for most of my career. If you can explain to them what’s really going on – the ways in which digital environments provide a distorted lens on reality, the scale of demand out there, the political salience of encoding the tolerance we have in our society into our law – we will enjoy more success in the future. This is a multi-pronged battle from their side, and it has to be won in response. And beyond that, I promise you our comradeship and our friendship and our assistance in any way that we can.
Legislation and regulation – there seems to be a lot of pushback from digital media companies, and it seems overwhelming to try to get any guardrails in place. What’s the biggest gap in regulation we need to home in on?
Actually, what you need is a comprehensive framework. And that’s why we developed one, with Canadian lawmakers and with lawmakers from the US, UK, New Zealand, Australia, and the EU. In May 2022, we held a conference in Washington DC at CCDH where we brought together these lawmakers and asked, “What would the de minimis requirements be for an effective regulatory system for social media?” What came out of that is the STAR framework; it’s on our website, counterhate.com. You can download it and read it. We’ve just released new research showing that a majority of even the American public wants every single component of that STAR framework: safety by design; transparency of the algorithms, of the economics, of the advertising, and of content enforcement decisions; accountability, so there’s a governmental body to whom they have to provide answers and they can’t just hide behind spin and lies as they do at the moment; and responsibility, so that when harms are created, there are actually costs for them, not just for us in society. When there are problems on their platform, and they know about it and fail to do anything about it, at that point they should become partly liable as well out of negligence.
Trying to do it piecemeal ends up being an absolute disaster – all you’ve got is one lever, and it’s insufficient. You need a comprehensive set of powers available. The EU, with the Digital Services Act, and the UK, with the Online Safety Bill, actually provide a template that Canadian politicians could look at and could very, very rapidly adapt to protect Canadian citizens.
As human beings, we forget what happened six minutes ago, let alone six years ago. Social media is only around 15 to 16 years old. It hasn’t been there forever. It’s not God-given. It’s not the immutable laws of physics. It is something that’s been around for a while, and it has done some resocializing of our society. We can, if we wish to, push back. What’s sometimes said is, well, “What if we destroy the Internet, or destroy social media?” You’re not going to destroy the Internet by legislating or regulating. Other jurisdictions have already shown that it’s possible.
But more importantly, the status quo is not good enough, because what we are seeing is regression in our societies. They’re more polarized. They’re more vociferous. They’re more brittle. They’re more fragile. They’re more angry. They’re more hateful. They’re more full of disinformation. And that’s undermining our ability to have safe, tolerant societies in which democracy is viable and in which we’re able to progress human rights and civil rights. This is a really, really important fight. So get involved in any way that you can – there are so many organizations out there doing great work, and of course you can always go and see what we’re saying on counterhate.com, find our latest research, get it to your lawmakers, and ask them: what the hell are you doing about this?
Alright, now what? Check out the wealth of research and reports from the Center for Countering Digital Hate at counterhate.com.
Get the facts on gendered digital hate, harassment, and abuse by visiting our fact page on canadianwomen.org.
While you’re there, read about our new Feminist Creator Prize to uplift feminist digital creators advocating for gender justice, safety, and freedom from harm.
Did this episode help you? Did you learn anything new? We want to know! Please visit this episode’s show notes to fill out our brief listener survey. You’ll be entered to win a special prize pack.
This series of podcast episodes has been made possible in part by the Government of Canada.
Please listen, subscribe, rate, and review this podcast. If you appreciate this content, please consider becoming a monthly donor to the Canadian Women’s Foundation. People like you will make the goal of gender justice a reality. Visit canadianwomen.org to give today and thank you for your tireless support.