Don’t Let Terrorists Ruin the Internet

 

As of late Friday, it was still incredibly easy to access video of the New Zealand terror attack. A bit of searching turned it up on Facebook, where the massacre was first live-streamed before going viral on other social media platforms such as Twitter and YouTube. The gunman wanted amplification, and he got it. It was even easier to find the shooter’s rant, infused with white supremacy and a deep familiarity with the online world and its associated subcultures.

Not that tech companies aren’t trying to counter it. Indeed, they have every incentive to — both in the name of human decency and as companies already under tremendous pressure for inadequate content moderation. But as fast as the videos are pulled down, they are reuploaded. The platforms, despite cutting-edge AI and thousands of human moderators, are again proving “no match for the speed of their users; new artificial-intelligence tools created to scrub such platforms of terrorist content could not defeat human cunning and impulse to gawk,” writes Charlie Warzel in The New York Times.

But is this solid evidence of “massive incompetence” by Big Tech, as media columnist Margaret Sullivan charges in The Washington Post? I wish I knew for sure, but it’s doubtful. Live content appears particularly tricky to moderate. And throwing a legion of moderators armed with the best technology at the problem has proven insufficient even when just dealing with video.

Now there almost assuredly will be activists calling for new rules to make platforms more liable for the content on them. (This is probably already happening.) Yet that hardly seems like a solution — even putting aside the risk such a move presents to the fundamental openness of the internet — if the level of moderation effectiveness that the public and politicians want simply isn’t yet possible. Indeed, the recent announcement by Facebook of a strategy shift — toward encrypted person-to-person messaging, rather than one-to-many sharing — suggests more people and better AI won’t be a solution anytime soon. Easy solutions have yet to be found to this “impossible job.” And what measures are taken will assuredly have trade-offs in terms of the Cowen Trilemma: scalability, effectiveness, and consistency.

None of which should take Big Tech off the hook in terms of devoting more resources to the problem, particularly when it comes to amplification. But it is the culture that built up the internet — one the shooter is intimately familiar with — that is at least as much the problem here as the internet’s basic infrastructure or how tech firms are responding. Users should probably have better moderation tools at their disposal, but what about our fellow humans who actively desire this sort of content in their feeds or timelines? Or the politicians who egg them on, whether explicitly or with subtlety, or ignore what appears to be a global supremacist movement that hates the West?

Some panicky pols would shut down various platforms (I hope only temporarily) during events like those in New Zealand. While terrorists take advantage of our open society, they also hate it. Let’s not do their job for them.


There are 6 comments.

  1. Aaron Miller Inactive
    @AaronMiller

    Are you suggesting that today’s generations would have been better off if videos of Hitler’s hysterical speeches and Nazi concentration camps had been destroyed?  

    Nobody likes that murderers can make themselves famous via live and far-reaching media. But if we preserve footage of other atrocities to educate people about historical evils, what is the standard by which this recent evil would be erased from video records? 

    I expect that violence promoted through social media is more influentially communicated by rants and false histories than by demonstrations. 

  2. EJHill Podcaster
    @EJHill

    At some point we are going to have to make a determination whether “platforms” are publishers or common carriers and define what they are and are not responsible for. But no law is going to totally stop the distribution of this stuff. 

    Child pornography is almost universally decried, yet it’s out there. And the effort to eradicate it is constant and never-ending. The same with videos that show death. It’s depressing to stop and think about it.

  3. MarciN Member
    @MarciN

    The attempt YouTube made to stop the reproduction and dissemination of the terrorist’s manifesto was tremendous:

    YouTube now has 10,000 workers devoted to addressing content that violates its rules.

    Facebook also tried to keep up with the uploads:

    Facebook says it has more than 15,000 contractors and employees reviewing content, part of a 30,000-person department working on safety and security issues at the company. The department includes engineers building technical tools to block graphic content, as well as employees dubbed “graphic violence specialists” who make decisions about whether violent images posted on the site have social or news value or whether, as in the case of beheadings, they are meant to terrorize and have no place on the site, Monika Bickert, head of global policy management at Facebook, said in an interview in February.

    According to the first article I cited, it was, and apparently remains, all but impossible to catch these videos:

    The video was given a supercharged boost because of the way Mr. Tarrant promoted it. Before the shooting spree, Mr. Tarrant apparently posted his alleged intention to attack the mosque, and provided links to the live stream and an accompanying manifesto filled with white supremacist conspiracy theories on 8chan, an anonymous messaging forum favored by extremist groups.

    Part of the calculation, say internet researchers, was to take advantage of 8chan’s culture of archiving sensitive videos. By giving a heads up to the 8chan community about the attack and then posting a link to the live stream, Mr. Tarrant ensured that the video couldn’t be permanently deleted.

    “You have these groups of people who consider themselves quasi movements online and they believe they own the internet and as a result these calls to action are almost rote memory,” said Joan Donovan, director of the Technology and Social Change Research Project at Harvard University’s Shorenstein Center. “They’re just part of the culture, so if someone says that they’re going to commit some kind of atrocity then you will see this downloading and reuploading practice happen.”

    Facebook and YouTube are working on AI programs to stop the use of their platforms by terrorists, but it seems impossible to get rid of these videos completely. I am glad they keep trying and delete as many as they can. Their effort makes a statement in and of itself. 

     

  4. unsk2 Member

    This “live streaming” of the New Zealand terrorist act probably was rightly pulled immediately. It was purposely meant to stir up hate and violence.

    The Media almost immediately and unilaterally painted this attack as some sort of “right wing” tribal score-settling intent on killing Muslims. There was a raft of articles on the rise of “right wing” websites and their link to violence. YouTube allegedly pulled a large number of “right wing” videos, and there were many, many calls for “something to be done” about right-wing violence – the scourge of the present day, at least according to our righteous betters.

    However, Brenton Tarrant, by his own manifesto, was an “eco-fascist” who admired Communist China and, at least by some accounts, targeted Muslims not because of their religion but because they were “invaders” of pristine New Zealand who were despoiling the environment with their nasty habit of having so many children.

    If there is to be any censorship at all, it should be unbiased, which clearly has not been the case. Radical Islamic and Left Wing sites have been allowed to preach hate and foment violence unfettered for years with nary a peep from our new self-anointed potential censors, but all we hear about from the Media and these potential censors, over and over again, is the rise of “right wing” hate.

    The Left has been looking, it seems, for many years without much success for the “smoking gun” of horrific “white supremacist,” “right wing” violence that they have been warning us about, to give them the justification to “control” this awful scourge and to come down very hard on any opinion they deem to be “right wing” or “white nationalist” hate. The Left has been just foaming at the mouth to get the chance to censor and squelch free speech, particularly “right wing” free speech.

    There may be some good reasons to limit extremist sites on the internet, but unfortunately, in this highly partisan and biased time, it seems highly unlikely that any agency or organization put forward to regulate and censor such sites could do so without giving in to its innate bias, which, given the people usually nominated for such tasks, would be very pronounced indeed. The road to ruin of the internet could be paved with such “good” intent.

  5. lowtech redneck Coolidge
    @lowtech redneck

    The problem with the internet is that there is too much censorship, not too little.

    Whatever justifications are offered, a push for further restrictions will only be used as an excuse to de-platform and shut down conservatives on the internet.  Period.

  6. Hank Rhody, Meddling Cowpoke Contributor
    @HankRhody

    James Pethokoukis: Or the politicians who egg them on, whether explicitly or with subtlety, or ignore what appears to be a global supremacist movement that hates the West?

    “global supremacist movement”? You mind explaining what you mean by that? Are you implying that this guy was backed by a giant network of neo-nazis? Or are you talking about the Muslims he’s shooting?
