Joan Donovan on The Open Mind: Social Media Platforms and Their Irresponsibility

HEFFNER: I’m Alexander Heffner, your host
on The Open Mind. We’ve intently covered technologies that degrade
democracy and civil society here on the program, and today we resume that examination. Joan Donovan is the project lead for media
manipulation at Data and Society. She’s conducted action research to map and
improve the communication infrastructure built by protestors. She’s also researched white supremacists’ use of DNA ancestry tests and social movements in technology, specifically identifying information bottlenecks, algorithmic behavior, and organizations with like-minded networks. In the Guardian, she and danah boyd make the case for quarantining extremist ideas. When confronted with white supremacists, newspaper editors should consider what she and danah call strategic silence. Likewise, I would argue, broadcasters must exert discretion in not covering Donald Trump’s disinformational rallies, bigoted rallies
that have provided an open megaphone to terrorize minority groups. In this spirit, I wanted to invite Joan to
share her insights into this media downturn and how we can incentivize integrity in our
modern day communications. Thank you for being here, Joan. DONOVAN: Thanks for having me. It’s really a pleasure. HEFFNER: That really seems to me to be the
trillion-dollar question, how we can incentivize integrity or good behavior. Isn’t that the most important question? DONOVAN: Well, it’s part of it. One of the things I like to do is keep the long view of the Internet in my mind, and think about how we first began to build what could have been called a wide area information system. The first iterations of the Internet were military installations, but then also universities were thinking about
how do we connect libraries, how do we share books, how do we share legal documents, how
do we know that the document’s in the same place every time. And so as the Internet developed from this
informational knowledge base space, we then saw the influx of commercialization. We saw more open participatory culture online. People were able to administer their own email
networks. They were able to have their own chat boards. BBS was very popular, and as people moved online, the different subcultures that people were sometimes ashamed to participate in in public space, or couldn’t under threat of violence, came online too, LGBT people being one group that were early adopters of the Internet. We saw that the Internet really allowed for free play of identity and free play of information in a way that did build some really unique and important communities. And as we go through and think about it, the first iteration before we get to social media was social networking, right? Hence the first movie about Facebook, “The Social
Network.” And it was really about connecting people
to people. It had very little to do with distribution of news, distribution of information. It was really about that people-to-people connection. And as we see the older version of the Internet, the one about knowledge creation and knowledge construction and about giving people information, collide with everybody being a potential author, we have a different system of communication entirely, one that hasn’t built in the important lessons journalists have learned from, you know, hundreds of years of working with print. I think platform companies really need to take a hard look at the ethics of journalism, see why journalists developed standards, ethics, and protocols, and adopt those into their algorithms as well as into their content moderation. HEFFNER: One of the most pernicious influences
in terms of what must be quarantined are the algorithms on YouTube that are feeding people disinformation, misinformation, and in a lot of cases now outright bigotry. How are we to deal with that? How do we quarantine it? DONOVAN: Yeah, so we’ve been thinking a lot about this and doing multiple case studies on YouTube to understand under what conditions
are you served certain content. So if you’re a big fan of a certain music
genre, you continuously watch bands that are part of this music genre. People who also watch those bands click and like and subscribe to other bands in that genre. It builds a subculture, and the algorithm learns, through machine learning, how to then serve you more from that genre so that you stay on the platform
longer. It works the same for mom-versation groups. If you’re into stuff about babies, it’ll continuously
serve you stuff about babies, but in the accounts that we have where we’ve been watching lots
of far right and white nationalists content, we find that it’s very difficult to break
out of those echo chambers, that even if you go onto your own account with the intention
of watching something as stupid as cats eating pizza, that even in the recommendations and
the auto play for the next set of videos, it’s going to continuously try to serve you
things that it knows you like to watch. So when you even try to pivot out of these
echo chambers, it’ll continuously serve you more and more extremist or white nationalist content. HEFFNER: I’ve analogized it to carcinogens
in the discourse. It’s feeding you hate and it’s not different
from what should be quarantined on Fox News when you had an analyst say to a black guest,
you’re out of your cotton-picking mind. These are, on YouTube, very explicit, and on Fox News, sometimes coded, ways of calling people the N word, and that’s what’s going on. So it’s mainstream.
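The explicit-versus-coded distinction Heffner draws here is also the core mechanical weakness of keyword-based content moderation. As a minimal sketch (the word lists and function below are hypothetical illustrations, not any platform’s actual system), a blocklist filter catches explicit terms but is blind to a code word a community has agreed on off-platform:

```python
# Minimal sketch of a naive keyword blocklist, and why coded substitutions
# evade it. All word lists here are hypothetical placeholders.

BLOCKLIST = {"slur_a", "slur_b"}  # explicit terms the filter flags

def is_flagged(title: str) -> bool:
    """Flag a video title if any token exactly matches the blocklist."""
    tokens = title.lower().split()
    return any(tok in BLOCKLIST for tok in tokens)

# An explicit title is caught...
assert is_flagged("video about slur_a")
# ...but a coordinated substitute ("wakandans" standing in for a blocked
# term) passes unflagged, because the filter only sees the surface token.
assert not is_flagged("video about wakandans")
```

This is why off-platform monitoring matters: the substitution is only visible where the code word is coordinated, not in the content itself.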
DONOVAN: And so we’ve looked at some of that in terms of the subcultures of white nationalists, and when they do figure out that the algorithm is flagging for certain things, especially the N word, they pivot. So we saw a big pivot during Black Panther. There was a lot of talk about the movie, and white nationalists wanted to participate in the discourse about it because there were nationalist themes in Black Panther. But what we saw, on message boards off YouTube, because they knew the N word was being flagged, was them saying, call them Wakandans. And so we watched as a lot of the videos from accounts that we follow that are known white nationalists would put Wakandans in the title so that they could then search for this turn of phrase. And we noticed very clearly that the subcultures
are often experimenting with the algorithms. And one of the difficulties, if you’re coming
at it from a content moderation point of view of the platform companies is they employ these
trust and safety teams but unless the trust and safety teams are looking off-platform
for coordination, they’re not going to see the effects of that in their platform very
clearly, because they’re just going to see potentially an uptick in something like Wakandans, and it’s going to make sense because of course everybody’s talking about this movie, but they don’t know that it’s a code word for the N word for these white nationalist communities. As well, we’ve tracked an uptick in the word
pit bull, and they’re trying to replace the N word with the word pit bull within a subset
of discourse around black criminality. HEFFNER: And this deals with an emotional
and psychological and societal issue of people having hate in their hearts, or not having hearts, not having souls, in their desperate attempt to keep alive a bigoted sensibility. DONOVAN: Well, there’s that. But then there’s also the context in which
we are trying to understand this. You know, we’ve now had about 30 to 40 years of identity-based movements getting very successful redistribution, not necessarily of wealth, but of resources and of rights. The Black Lives Matter movement has been, you know, really making inroads in dealing with police brutality. We see the Dreamers movement with DACA. And so when we look at these young white men,
what we see is a group of people that are trying to form an identity-based movement, but they are not thinking about it in the same way that we see other identity-based
movements thinking about equality, justice, inclusion. We see them wanting to retain white privilege. They want to retain or return to an imagined
America where they are at the top and through this, what we’re trying to understand is how
those different movement logics are manifesting in media and manifesting in different coverage
of these white supremacist movements who really believe and know that media is the most important
lever that you can pull in order to get attention to your movement, all movements know this,
and so we see them coming out into public spaces. We see them showing up in public parks. Of course, Charlottesville was one such rally where they got a lot of media attention, and they rode that wave for months and months
and months thereafter. HEFFNER: YouTube, whose parent is Google,
Facebook, Twitter. They don’t want to operate in a society that
has transcended racism, at least in terms of the contemporary norms. They want to continue to give life to the
hateful rhetoric in defense of this puritanical absolutist First Amendment notion, so when
we have you, Tarleton Gillespie, Zeynep Tufekci, my question always comes back to that: Google and YouTube in this case in particular, and their unwillingness to assert that there are norms to which their members ought to subscribe. DONOVAN: Yeah, and it’s really difficult, because
within Silicon Valley culture you have a distorted lens of diversity. Sareeta Amrute writes about this in her work on understanding the difference between Asian and white in Silicon Valley. There’s a lot of Asian people in Silicon Valley, and so Silicon Valley believes that it’s authentically participating in diversity in the workplace as well. There’s also evidence from an organization called Coworker that’s been mapping and tracking alt-right influences in Silicon Valley, and how they align with this notion that you should be able to represent yourself and your people and your movement. We see in Silicon Valley a lot of people getting
very defensive about the notion that we should be opening up design in a way that has both
responsibility at the front and accountability at the back. Responsibility means that you think about your product, and you invest a lot of time and energy into understanding the ethics of the design and the deployment, just like you
would a market study for a drug. Right? What are the potential side effects that I
don’t see? We see Facebook starting to develop a team that can do some of that work, but because the algorithms and the systems and the management philosophy are locked inside these corporations that are difficult to unbox, we don’t really
know what happens in the midstream part of the machine. And then what we lack is any system of accountability
where someone somewhere takes responsibility for saying yes, I provided the platform and
the space for these groups to organize and show up in a place and do this kind of violent
destruction. And so, that kind of system where we start
with responsibility, where we take the time to develop the products in an ethical way and then evaluate them: there’s many civil society groups right now working with Facebook on what’s called the civil rights audit. Color of Change is one group that’s working on that, as is the National Hispanic Media Coalition. And so they’re trying to understand that
portion that very few researchers have insight into. And then on the backend we need regulation
or some form of regulators to step in because when we get to that step, what we notice is
there’s a lot of finger pointing between regulators and technology companies and congressmen,
but nobody is taking account of what’s really happening, right? HEFFNER: Right. We’ve talked about that on this program, the Honest Ads Act in particular, sponsored by Amy Klobuchar, John McCain, and Mark Warner. But with respect to this question of quarantining racist content, from my perspective as a broadcaster it was heartening to see at least one or two
examples in recent weeks of most networks and cable outfits ignoring the Trump rallies
because of a collective acknowledgement that they were spewing hate and lies and that they
were not news events. In fact, the only way you could interpret
them as news events is that you’re misinforming or disinforming a huge constituency of Americans. So at least when it came to CNN and MSNBC,
for the first time since the campaign, there was an acknowledgement, we’re not going to
go live to this Trump rally. That’s the way to quarantine it on television. We’ll talk about what George Lakoff describes
as a truth sandwich as opposed to a lie sandwich. So you start with the truth instead of starting with the lie, then correct the record, rather than letting the lie be the lead, the headline. So we need truth sandwiches in the discourse. Quarantining Trump rallies is one way to do
that. But how do you quarantine online? DONOVAN: Yeah, so this is a part of our next
step in our research at Data and Society. danah and I have been thinking about what is
the best way to get platform companies to adopt a journalistic ethic and practice around
choices in what to serve people as news. And it’s a difficult thing because we have
so many people who have learned to write, who are authors in and of themselves, who are not necessarily mainstream media, which to me really means large-scale, million-plus audiences. But we have a lot of people online that maybe have half a million viewers that are what
we might even call citizen journalists that have been incredibly important in serving
the investigative function of journalism now that we’ve seen quite a big decline in local
journalism as well as investment in investigative journalism. And so one of the tensions is that, of course,
you can serve more of the big market news and you continue to perpetuate those problems. You can focus on strengthening that core of
journalists that are using Internet technologies in order to do that work. You know, Jack Smith is a good example of someone that has a very broad audience but isn’t breaking through to that large level, so Mic, for instance, is a good example of that, or Vox. You know, how do you get investment in those kinds of media companies that are doing interesting journalism? And then you have a bunch of people that are
pretending to be journalists, right, so they’re incentivized by advertising and the money
that they can make, the clicks they can get; they’re incentivized through pandemonium and mischief and the LOLs; or they’re incentivized to be cloaked. They’re essentially cloaked political campaigns. And this is what we saw happen a lot during
the 2016 election, as we see things that are self-described as news, masquerading, you know, masquerading as news, but are definitely politically motivated campaigners. And maybe they’re not endorsed by any election candidate, but they’re invoking First Amendment rights
and First Amendment protections of the press. And that’s an interesting split too because
First Amendment protection of the press is different from First Amendment protection
of just any old citizen, right. But the courts have been very clear that protection
of the press is something that really matters to them. And so when a group or an organization has
established themselves as press, they do get a little bit more latitude to say and do things
that might be considered libelous or slanderous because of the importance of the work. HEFFNER: We talk about quarantine, but we
have to talk about deletion too, deletion of racist content, deletion of fictitious
fallacious content, like you said, masquerading as real news. And on Facebook and Twitter, they’ve had a
real struggle with not just the monetization of the misinformation, but with having gone through an entire campaign verifying with that blue checkmark folks who were actually not legitimate news sources, and also being unwilling to remove content that was posing as real
journalism. And one example is if you Google Tom Payne
or Thomas Paine, you see an account that was a political operation and not a news gathering
operation. If you’re searching one of the most important
figures in American history on Google, and the first or second indexed result is a disinformation campaign that poses as a Pulitzer Prize-winning or Loeb Award-winning reporting outfit, that’s a problem. DONOVAN: When we see disinformation proliferate
online and we see strategic coordinated amplification using these tools, a lot of those tools are
right out of the box of marketers, right? So for a long time, marketers online knew
that one account is never effective, so they knew they had to be 100 or a thousand people,
right? And you could buy that technology and it was
fairly cheap, and you could run your own small-world network online to promote your products. And so that kind of technology just took a
while for states to adopt. But it’s a known quantity online. And, you know, anti-spam teams know exactly how to deal with it. Groups that deal with lobbies online know
how to spot these things. What I think was unexpected about the use
of bots is that we also see what we might call cyborgs participating. These are people that will operate the account
sometimes as a bot, but then other times as a human being, so they might actually respond to you. And this is sometimes confusing for the way in which content moderation is done. So there’s one level of that, and then we see
other nefarious people using those kinds of technologies, but we’re still stuck in the
mindset of one account for one person, but that’s not true online. Online is very much a magnification of society. It is not a reflection of it, but we have
a lot of researchers also invested in studying Twitter in such a way that they believe it
is a reflection of society. And so our research really hasn’t even caught
up with the technology being used to manipulate what we think about as information flows online. Another interesting case study is Wikipedia
itself because Wikipedia is used for a lot of natural language processing algorithms. But when we see manipulation on Wikipedia,
we know that there are downstream effects for other AI and machine learning technologies
and so when we’re trying to map all of these things out, what we’re really trying to understand
is also what platforms can do. And unfortunately the incentive for a lot
of platform companies, because they’re so big, is to stay big and to stay global. And so when they want to stay global and stay
big, they have to be very general about all of their rules. Right? And so if you do start targeting things related
to white supremacist content, then they start parsing at these levels that are absurd to a researcher like myself. Recently there was a disclosure that in the Facebook content moderation slide deck there was a difference between white supremacy and white nationalism. And my first question is, but why? Right? And it’s because someone somewhere down the line said to them that white nationalism is an identity and white supremacy is a sort of an ideology. And I don’t agree with that. They both reach the same ends. But when they get down to this level
of content moderation and they start parsing these things, they start making some very
irrational decisions, because the labor involved in doing
that correctly is enormous and they can’t just blanket it and ban all these words because
as we were talking about earlier, the subcultures evolve and they move and so that’s what’s
difficult about it: you can’t have this radically open platform and make money and at the same time invest in the people power necessary to moderate all that content. HEFFNER: We only have a couple minutes, Joan,
but what you’re saying is depressing because it shows just how far the companies are from
where they need to be in a collective ownership of their airwaves. DONOVAN: One of the main things that we’ve
spent the last year really focusing on is understanding from the platform perspective,
the challenges they really face in building out products that essentially haven’t changed
that much in the last five years. There’s just been a lot of attention to them
in the post 2016 election. But if we look to the last five years of activism
online, we know a lot of these problems existed prior to the campaign, but what we’re also
trying to understand is that when these companies do provide access to fascists and dictators and authoritarians, and promote them and allow them to use their ad targeting networks, things of that nature, we do get governments that lean towards the right. And so what we need is a much stronger network
of people that are reacting to and are networked in such a way that they are trusting of one
another and then are mindful of when they’re being manipulated. And I wouldn’t say that all persuasive media
campaigns online are manipulation. What I really understand as manipulation is when
there’s forgery, hoaxing, trickery involved, lies, sock puppet accounts. These are the things that we track. HEFFNER: So the other is persuasion. DONOVAN: Yeah, persuasion. You know, good old school propaganda might
be a way that people talk about it, but in persuasion campaigns you have to stand for something. And social media has given us incredible voice. You know, the discourses around clicktivism and slacktivism in 2011, 2012 were maybe the wrong things to talk about when we talked about what it meant to do action, because online, what we do know is that access to information is paramount. Right now we’re trying to understand the relationship
between government and society, and how media organizes our beliefs about society, right? Media is our most important lens with
which to view the world. And so media tells us how to understand foreign
policy. Media tells us how to understand government,
media tells us how to understand technology. And I do believe that there are many great journalists
doing an excellent job of pushing these issues and keeping them at the forefront of our public
discourse. But at the same time, and this might be a
product of cable television, we do see a difference in the kinds of things that
people share online versus the kinds of things that they are asked to watch day in and day
out on television. And I think that you’re right to point out
that what we need is more coverage, better coverage also on TV as well as on the radio
so that people are seeing a wide swath of information and that they are equipped with
better and more journalism, rather than the echo chambers that the platforms are currently
keeping us stuck in. HEFFNER: Thank you, Joan. DONOVAN: Thank you. HEFFNER: And thanks to you in the audience. I hope you join us again next time for a thoughtful
excursion into the world of ideas. Until then, keep an open mind. Please visit The Open Mind website at Thirteen.org/OpenMind
to view this program online or to access over 1,500 other interviews and do check us out
on Twitter and Facebook @OpenMindTV for updates on future programming.
