Law enforcement officials, technology companies and lawmakers have long tried to limit what they call the “radicalization” of young people over the internet.
The term has often been used to describe a specific kind of radicalization — that of young Muslim men who are inspired to take violent action by the online messages of Islamist groups like the Islamic State. But as it turns out, it isn’t just violent jihadists who benefit from the internet’s power to radicalize young people from afar.
White supremacists are just as adept at it. Where the pre-internet Ku Klux Klan grew primarily from personal connections and word of mouth, today’s white supremacist groups have figured out how to expertly use the internet to recruit and coordinate among a huge pool of potential racists. That became clear two weeks ago with the riots in Charlottesville, Va., a kind of watershed event for internet-addled racists.
“It was very important for them to coordinate and become visible in public space,” said Joan Donovan, a scholar of media manipulation and right-wing extremism at Data & Society, an online research institute. “This was an attempt to say, ‘Let’s come out; let’s meet each other. Let’s build camaraderie, and let’s show people who we are.’”
Ms. Donovan and others who study how the internet shapes extremism said that even though Islamists and white nationalists have different views and motivations, there are broad similarities in how the two operate online — including how they spread their message, recruit and organize offline actions. The similarities suggest a kind of blueprint for a response — efforts that may work for limiting the reach of jihadists may also work for white supremacists, and vice versa.
In fact, that’s the battle plan. Several research groups in the United States and Europe now see the white supremacist and jihadi threats as two sides of the same coin. They’re working on methods to fight both together — and slowly, they have come up with ideas for limiting how these groups recruit new members to their cause.
Their ideas are grounded in a few truths about how extremist groups operate online, and how potential recruits respond. After speaking to many researchers, I compiled this rough guide for combating online radicalization.
Recognize the internet as an extremist breeding ground.
The first step in combating online extremism is kind of obvious: recognize the extremists as a threat.
For the Islamic State, that began to happen in the last few years. After a string of attacks in Europe and the United States by people who had been indoctrinated in the swamp of online extremism, politicians demanded action. In response, Google, Facebook, Microsoft and other online giants began identifying extremist content and systematically removing it from their services, and have since escalated their efforts.
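One publicly documented approach behind that kind of systematic removal is hash matching: companies share a database of digital fingerprints of media already identified as extremist and check new uploads against it. Below is a minimal sketch of the general idea in Python. The blocklist entries and the function name are invented for illustration, and real deployments rely on perceptual hashes such as PhotoDNA so that resized or re-encoded copies still match.

```python
import hashlib

# Hypothetical shared blocklist of digests for media already flagged as
# extremist content. Real industry databases use perceptual hashes so
# that altered copies still match; exact SHA-256 keeps this sketch simple.
KNOWN_EXTREMIST_HASHES: set[str] = {
    "placeholder-digest-1",  # invented entries for illustration
    "placeholder-digest-2",
}

def should_remove(upload: bytes) -> bool:
    """Flag an uploaded file if its digest matches the shared blocklist."""
    digest = hashlib.sha256(upload).hexdigest()
    return digest in KNOWN_EXTREMIST_HASHES
```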
When it comes to fighting white supremacists, though, much of the tech industry has long been on the sidelines. This laxity has helped create a monster. In many ways, researchers said, white supremacists are even more sophisticated than jihadists in their use of the internet.
The earliest white nationalist sites date back to the founding era of the web. For instance, Stormfront.org, a pioneering hate site, was started as a bulletin board in 1990. White supremacist groups have also been proficient at spreading their messages using the memes, language and style that pervade internet subcultures. Beyond setting up sites of their own, they have more recently managed to spread their ideology to online communities that were once largely apolitical, like gaming and sci-fi groups.
And they’ve grown huge. “The white nationalist scene online in America is phenomenally larger than the jihadists’ audience, which tends to operate under the radar,” said Vidhya Ramalingam, the co-founder of Moonshot CVE, a London-based start-up that works with internet companies to combat violent extremism. “It’s just a stunning difference between the audience size.”
After the horror of Charlottesville, internet companies began banning and blocking content posted by right-wing extremist groups. So far their efforts have been hasty and reactive, but Ms. Ramalingam sees them as the start of a wider effort.
“It’s really an unprecedented moment where social media and tech companies are recognizing that their platforms have become spaces where these groups can grow, and have been often unpoliced,” she said. “They’re really kind of waking up to this and taking some action.”
Engage directly with potential recruits.
If tech companies are finally taking action to prevent radicalization, is it the right kind of action? Extremism researchers said that blocking certain content may work to temporarily disrupt groups, but may eventually drive them further underground, far from the reach of potential saviors.
A more lasting plan involves directly intervening in the process of radicalization. Consider The Redirect Method, an anti-extremism project created by Jigsaw, a think tank founded by Google. The plan began with intensive field research. After interviews with many former jihadists, white supremacists and other violent extremists, Jigsaw discovered several important personality traits that may abet radicalization.
One factor is skepticism of mainstream media. Whether drawn to the far right or to ISIS, people who are susceptible to extremist ideologies tend to dismiss outlets like The New York Times or the BBC, and they often go in search of alternative theories online.
Another key issue is timing. There’s a brief window between initial interest in an extremist ideology and a decision to join the cause — and after recruits make that decision, they are often beyond the reach of outsiders. For instance, Jigsaw found that by the time jihadists began planning their trips to Syria to join ISIS, they had fallen too far down the rabbit hole and dismissed any new information presented to them.
Jigsaw put these findings to use in an innovative way. It curated a series of videos showing what life is truly like under the Islamic State in Syria and Iraq. The videos, which weren’t filmed by news outlets, offered a credible counterpoint to the fantasies peddled by the group — they showed people queuing up for bread, fighters brutally punishing civilians, and women and children being mistreated.
Then, to make sure potential recruits saw the videos at the right time in their recruitment process, Jigsaw used one of Google’s most effective technologies: ad targeting. In the same way that a pair of shoes you looked up last week follows you around the internet, Jigsaw’s counterterrorism videos were pushed to likely recruits.
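To make that mechanism concrete, here is a minimal sketch of the targeting logic in Python. This is not Jigsaw’s actual system: the keywords, URL and function are invented for illustration. The real Redirect Method worked through Google’s search-advertising tools, serving ads against queries that field research had linked to extremist interest and pointing searchers to the curated video playlists.

```python
# Invented keywords and URL, for illustration only. The real project
# built its keyword lists from field research with former extremists.
REDIRECT_KEYWORDS = {
    "join the caliphate",
    "travel to syria fighters",
}
COUNTER_NARRATIVE_PLAYLIST = "https://example.com/redirect-playlist"

def ad_for_query(search_query: str) -> str | None:
    """Return a counter-narrative ad link if a search query suggests
    early-stage extremist interest; otherwise serve no ad."""
    normalized = search_query.lower()
    if any(keyword in normalized for keyword in REDIRECT_KEYWORDS):
        return COUNTER_NARRATIVE_PLAYLIST
    return None
```

The design depends on the timing finding above: the ads reach people while they are still searching, before they have fallen too far down the rabbit hole.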
Jigsaw can’t say for sure if the project worked, but it found that people spent lots of time watching the videos, which suggested they were of great interest, and perhaps dissuaded some from extremism.
Moonshot CVE, which worked with Jigsaw on the Redirect project, has put together several similar efforts to engage with both jihadist and white supremacist groups. It has embedded undercover social workers in extremist forums who discreetly message potential recruits to dissuade them. And lately it’s been using targeted ads to offer mental health counseling to those who might be radicalized.
“We’ve seen that it’s really effective to go beyond ideology,” Ms. Ramalingam said. “When you offer them some information about their lives, they’re disproportionately likely to interact with it.”
What happens online isn’t all that matters in the process of radicalization. The offline world obviously matters too. Dylann Roof — the white supremacist who murdered nine people at a historically African-American church in Charleston, S.C., in 2015 — was radicalized online. But as a new profile in GQ Magazine makes clear, there was much more to his crime than the internet, including his mental state and a racist upbringing.
Still, these days just about every hate crime and terrorist attack is planned or in some way coordinated online. Ridding the world of all of the factors that drive young men to commit heinous acts isn’t possible. But disrupting the online radicalization machine? With enough work, that may just be possible.