Why every social media site is a dumpster fire


Time to unwind with a little Facebook. Facebook continues to be under fire for failing to crack down on fake news. It's been called a haven for fake news. Russian trolls used Facebook to exploit racial tension. Did you fall for propaganda from a Russian troll? Jesus. Facebook is dead.

YouTube! YouTube is the latest social media company under the microscope. Ads were running on YouTube channels that promote white nationalism. The Trending tab featured a conspiracy video that attacked survivors of the Parkland shooting. God. Never mind.

Time for thirst traps on Instagram. Russian bot ads appeared on Instagram. Thousands of ads posted on Instagram were meant to divide Americans. Okay, that's it. I'm going back to my true safe space. The one place where nobody can hurt me. Pinterest. OH. MY. GOD. WHAT HAPPENED TO PINTEREST?

There's nowhere to go anymore. Every social media site has become a dumpster fire. Why is this happening? What's turning all these sites into fever swamps? Aw crap. It's me, isn't it?

Tech's biggest companies are once again the platform for conspiracy theories, hate speech, and fake news, forcing YouTube and Facebook to apologize. That was a big mistake, and I'm sorry. Executives admitted their limitations but promised they will do a better job.

Let's start at the OG misinformation platform. No. No. What? No. The human brain. Humans are social animals at their root, and they're constantly looking for reinforcement signals, or signals that we belong.
Jay Van Bavel researches what kinds of information humans respond to on social media. Last year, Van Bavel did a study tracking what kinds of tweets were more likely to go viral when it came to divisive issues like gun control. He found that tweets that used moral-emotional words, words like "blame," "hate," and "shame," were way more likely to be retweeted than tweets with neutral language.
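To make that finding concrete, here is a minimal sketch of the kind of word-count scoring the study relies on. The word list and example tweets below are made-up assumptions for illustration, not the dictionary or data the researchers actually used.

```python
# Toy illustration of scoring tweets for moral-emotional language.
# The word list below is a made-up sample, not the dictionary from the study.
MORAL_EMOTIONAL_WORDS = {"blame", "hate", "shame", "evil", "disgrace", "corrupt"}

def moral_emotional_score(tweet: str) -> int:
    """Count how many distinct moral-emotional words appear in a tweet."""
    tokens = {word.strip('.,!?"\'').lower() for word in tweet.split()}
    return len(tokens & MORAL_EMOTIONAL_WORDS)

tweets = [
    "I personally think this gun control proposal deserves a closer look.",
    "Shame on every corrupt politician who ignores this. They are to blame!",
]

for tweet in tweets:
    print(moral_emotional_score(tweet), "-", tweet)
# Prints 0 for the neutral tweet and 3 for the tribal one. Per the study,
# tweets using more of this kind of language were retweeted more often.
```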
One of the reasons we think this is happening is that when you're using that type of tribal language, it sends a signal about who you are, what you care about, and what group you belong to.

This makes sense if you think about it. If you're trying to signal to others that you're a real Ariana Grande fan, you don't say, "I personally enjoy Ariana's music." You say, "Ariana Grande is the best singer of all time. Anyone who disagrees is an idiot." You know, like, hypothetically. Are you playing with imaginary hair?

That tribal desire to organize into "us" versus "them" is a basic part of human nature. It's why we have hardcore nerd fandoms (Cuz I am a Gryffindor!) and lose our minds at sports games. Yankees suck! Yankees suck! But when it comes to politics, that desire can push us toward some extreme views.

One hypothesis is that when people are sharing the most extreme forms of political content, that sends the strongest, clearest signal about what their identity is, and it signals very clearly who the outgroup is. You can see that tendency when you look at which US senators have the most Facebook followers: the further left or right, the more followers. When we rally around these politicians, it leaves no doubt about which tribe we belong to. If they're sending information that's moderate, it doesn't clearly signify who their ingroup and outgroup are. And so there is this incentive structure, potentially, to share more and more extreme information to signal more and more clearly who you are and who you affiliate with.

But these loud signals of our group status are way easier to send online than in person, because in the real world there are social costs to being a jerk. Normally, when we're talking about things like politics with our friends at the bar or with our family at Thanksgiving, we have social checks in place that send us signals that maybe this isn't landing well with everybody. Maybe it's resonating with your sister or your brother, but your mom and dad are giving you the stink eye. We get these signals all the time from people, that we're excluding them or that we're rubbing them the wrong way, and if we value those relationships we tend to tone down our language.
And this is where the problem with social media starts. Platforms like Facebook, YouTube, and Twitter are designed to do one thing: keep you on the site for as long as possible. The more time you spend on the site, the more commercials, sidebar ads, and promoted tweets they can show you, and the more money they make.

If your goal is to get the greatest amount of engagement with an audience, you need content that's going to be addictive. In terms of politics and news, stories that are laden with emotion, that connect to our identities, and that are morally arousing are the types of stories that are going to get people engaged the most.

The problem is that getting the stink eye is a really unpleasant experience. We don't like being told that we've crossed a line or gone too far, and we are less likely to stay on a website that makes us feel that way. So social media sites have been designed to protect us from stink eye, to cater to our tribal nature by figuring out what we like and showing us more of it. By identifying what products and people and politicians you like, they can identify with some degree of certainty what your politics are and then feed you back more and more information that confirms those beliefs.
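As a rough illustration of that incentive structure, here is a minimal sketch of an engagement-ranked feed. It is not any platform's actual ranking code; the post fields, the reaction weights, and the topic-affinity boost are all assumptions made up for this example.

```python
# Toy sketch of an engagement-ranked feed: posts that provoke reactions,
# and that match what a user already likes, float to the top.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    topic: str      # e.g. "politics-tribal", "cooking" (made-up labels)
    likes: int
    shares: int
    comments: int

def predicted_engagement(post: Post, topic_affinity: dict[str, float]) -> float:
    """Score a post by its past reactions, boosted when it matches the user's tastes."""
    reactions = post.likes + 2 * post.shares + 3 * post.comments  # made-up weights
    affinity = topic_affinity.get(post.topic, 0.1)                # how much this user engages with the topic
    return reactions * affinity

def rank_feed(posts: list[Post], topic_affinity: dict[str, float]) -> list[Post]:
    """Order the feed so the most 'addictive' content for this user comes first."""
    return sorted(posts, key=lambda p: predicted_engagement(p, topic_affinity), reverse=True)

# A user who mostly reacts to one political tribe keeps getting served more of it.
feed = rank_feed(
    [
        Post("Calm policy explainer", "politics-moderate", likes=40, shares=2, comments=5),
        Post("THEY are destroying the country. Share if you're angry!", "politics-tribal", likes=90, shares=60, comments=120),
        Post("Ten-minute weeknight pasta", "cooking", likes=200, shares=10, comments=15),
    ],
    topic_affinity={"politics-tribal": 0.9, "politics-moderate": 0.2, "cooking": 0.3},
)
for post in feed:
    print(post.text)
```

Nothing in a ranker like this checks whether a post is true or fair; it only predicts whether you will react to it, which is exactly the incentive problem being described here.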
It's not just algorithms doing this. These websites invite us to sort ourselves into tribes. We get to follow and subscribe to people we agree with, block sources of information we don't like, and literally join groups of people who think the same way we do. You get content that confirms your beliefs and doesn't challenge you, and then it's a dissonance-free environment. You don't have to face up to individuals and people who disagree with you. "Dissonance-free environment" is a fancy way of saying a place where you don't get stink eye.

That study about how polarizing tweets got more engagement also found that those tweets rarely left our echo chambers. We're getting tons of positive feedback from people who already agree with us, and all that positive feedback can push us further to the extreme. If I share some extreme political content and it gets a lot of likes, I realize, because I've been reinforced, that that's what people in my social network like, or that's what's more likely to go viral. And then I might try to match it or make my next post even more extreme to get more reinforcement.
And if you're looking for something more extreme to share, these platforms will help you find it. Watch a few anti-immigrant videos (This is totally out of control) and YouTube's algorithm will start recommending videos about white genocide. Join a few pro-Trump groups, and your Facebook feed will fill up with smear campaigns and conspiracy theories. The 9/11 attacks themselves were orchestrated by the Bush administration. Even just search for information about vaccines on Pinterest, and your homepage will be full of anti-vaxxer bullshit. And so you can see people potentially being led down this pathway of more and more extreme posts. There is a social reinforcement system that would otherwise take a long time, through lots of interactions with people in our community, that can now be done very rapidly and at an enormous scale.

All of this makes social media sites goldmines for con artists, conspiracy theorists, and trolls who exploit our tribal mentality to get clicks and views. Anybody can write a blog, however incendiary, and if it has a catchy title or catchy content, people are going to share it. Kaepernick is an attention-seeking crybaby who takes out his perceived oppression on the flag and national anthem. I stand for our service members, our veterans, our LEOs, and our first responders. Not for the indulgent a-hole who disrespects them. Follow me on Twitter and Instagram. It's a terrifyingly effective strategy. A Democratic strategist said that in 2016, when it came to which kinds of ads performed best on Facebook, "ugly and incendiary won every time."
It's a real tough life if you say you are a liberal. Trump train moving ahead full steam. It ain't too late if-

The same is true for conspiracy theories and fake news stories. One study looked at 10 years of true and false stories on Twitter. The authors measured what they called "retweet cascades": chain reactions where the original story is shared and retweeted to a much larger audience. And when they compared the cascades of real stories and fake stories, the fake ones reached thousands more people. And it didn't matter that these stories were coming from small accounts. Anybody could go viral if the story triggered enough of a tribal response.
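To make "retweet cascades" concrete before getting to the examples, here is a minimal sketch that follows retweet links outward from the original post and counts how many accounts the story reached. The data layout and field names are assumptions for illustration; this is not the study's actual methodology or code.

```python
# Toy sketch of measuring a retweet cascade: start from the original tweet,
# follow who retweeted whom, and count everyone the story reached.
from collections import deque

def cascade_size(origin: str, retweeted_from: dict[str, str]) -> int:
    """Breadth-first walk over retweet edges; returns how many accounts shared the story."""
    # Invert the "account -> who they retweeted" map into "account -> who retweeted them".
    children: dict[str, list[str]] = {}
    for account, source in retweeted_from.items():
        children.setdefault(source, []).append(account)

    reached = 0
    queue = deque([origin])
    while queue:
        account = queue.popleft()
        reached += 1
        queue.extend(children.get(account, []))
    return reached

# Made-up example: a small account posts a story and it chains outward.
retweets = {
    "user_b": "small_account",  # user_b retweeted small_account
    "user_c": "small_account",
    "user_d": "user_b",
    "user_e": "user_d",
}
print(cascade_size("small_account", retweets))  # 5 accounts reached in this toy chain
```

Measured this way, the fake stories in the study produced far larger cascades than the true ones, even when they started from small accounts.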
In 2016, it was teenagers in Macedonia making thousands of dollars publishing fake election news on Facebook. After the Parkland shooting, it was random YouTubers going viral by accusing students of being crisis actors. The Russian trolls messing with our elections? They're not superhackers. They're people posting low-quality, highly emotional content that they know will go viral. The Russian playbook exposed the architectural flaws in products like Facebook, Instagram, and YouTube. Anybody can run that playbook.

So far, social media companies have responded to this by trying to punish bad actors. Facebook has suspended hundreds of pages tied to a Russian group. Social media companies are banning Alex Jones. Twitter has banned Milo Yiannopoulos. But punishing individual bad actors doesn't change the incentives that brought them to the platform in the first place. One fake news writer told the Washington Post that if Facebook cracked down on his content, "I would try different things. I have at least 10 sites right now. If they cracked down on a couple, I'll just use others." And this is why getting rid of Alex Jones doesn't fix much: if you ban him, someone else will realize they can get rich pushing the same type of agenda for a period of time, with followers, clicks, advertisers, speaking fees, and other opportunities that are incredibly lucrative.

The problem with social media isn't that a few bad apples are ruining the fun. It's that these sites are designed to reward bad apples. And until these companies decide that there's something more important than getting people to watch ads, we're going to keep seeing the worst of human nature reflected back at us.

100 thoughts on "Why every social media site is a dumpster fire"

  1. The real question is what to do about it. Vox is part of the problem, too. Notice the side reference implying that no one sane can ever question the efficacy/safety of any vaccine–indicating membership in the tribe of Leftism.

  2. Vox is a shamefully anti-American fake news outlet…. like and follow me on insta, youtube, myspace etc #doitliketomi

    I like vox, they're always informative and funny

  3. How about the video on old media outlets attacking new media in generalizing waves while never addressing their own faults?

  4. If you're a Republican male that follows Tomi Lahren you are a weak loser. She doesn't believe anything she says, she just says it because loser dudes think she's hot and she says what they want to hear so they follow her and support her and she makes money.

  5. I know there is a way to educate social media users to be more aware of their tribal nature and how to be woke from their misinformed knowledge. I refuse to educate users by using Facebook.

  6. "Tribal nature?" I'm basically at the left edge of American politics and I keep getting Ben Shapiro recommendations on YouTube. Please tell YouTube to be a little more tribal for me.

  7. Carlos, you really do need to do an interview with Tim Pool. Also, both David Parkman and Thio Joe have videos on the lack of "cognitive ability" (for abstraction) with most Internet users (they both use the word "stupid").

  8. Also, it looks like Alex Jones made David Hogg a millionaire merely by trying to attack David (as did Laura Ingraham).

  9. The social media con mine. I made a video that can trigger Republicans and make them mad (unless they are Trump haters, a minority, in which case they won't get triggered).

  10. Me: So are you going to go out of your way to promote more moderate content in order to slowly break away this big problem?
    Media sites: CENSORSHIP!!
    (Facepalm)

  11. That's why I can't stay on any social media platform for longer than a few weeks without losing my mind. I feel like I get dumber because there is no one who critically assesses what I or someone else is saying. The loudest response you get is the number of likes on your post. You cannot see the stink eye of people who disagree.

  12. Yup, the flat-earth movement is a perfect example. A few shady guys spreading outrageous claims, getting gullible people to fill their pockets…

  13. Pinterest sucks, use Tumblr instead. Or just use Reddit and only subscribe to subs you are interested in.

  14. Anti-vaxxer bullshit? What exactly is the bullshit? You obviously haven't seriously looked into the issue. If you really believe conspiracies don't exist, you might want to revisit history.

  15. So this social media is like John the Baptist, or was it Paul, who strayed during the teachings of Jesus?… Connect this to the Bible for me?
    Maybe I'd care?
    – valley girl accent

  16. "The worst of human nature" is not people saying and thinking mean things about politicians on the socials- its acting on it. Yet, what is acting?

    Wars, genocides, apathy towards famine, rape culture, immigration policy motivated by racism, polluting the air and water, and exploitative labor all are things we simultaneously are bystanders-to and are participating in– trying to untie this simulteneaity is probably difficult.

    One part of me says, that ordinary men are the murderers of genocides. But they are pulling triggers, we a just typing, we have less culpability. Our apathy and inability to experiece vershtehen- an understanding of another- is somehow less bad than the murderers'. But why? The other part of me agrees with you in saying "the worst of humanity". Certainly the "act of killing", but is viewing "the act of killing" and not doing anything about it, just as bad?

  17. I really wanna know the song in this video that plays up to about 1:30. Can you please let me know?

  18. Interesting that only the echo chamber on one side is shown to be the "bad actors" here. No bad actors on the other side of the echo chamber. Oh, is that a tribal message? Shoot. I'm doing it too.

  19. 2009: kids, don't believe everything you read online
    2019: mom, don't believe everything you read online

  20. In retrospect, it's actually surprising that Vox would publish a video about this topic. It's not surprising, though, that they're too blind to see they're just as bad as the companies they mention in this video.

  21. I think it also depends on who you follow and what you're searching. Cuz I never have this problem on my social media, but it's definitely not a surprise that ppl do.

  22. It's all the algorithm. Once you enter, you can't escape. Unless you can wash your search history, like searching something new or different.
