
FTR #1074 FakeBook: Walkin’ the Snake on the Earth Island with Facebook (FascisBook, Part 2; In Your Facebook, Part 4)

Dave Emory’s entire lifetime of work is available on a flash drive that can be obtained HERE [1]. The new drive is a 32-gigabyte drive that is current as of the programs and articles posted by the fall of 2017. The new drive is available for a tax-deductible contribution of $65.00 or more.

WFMU-FM is pod­cast­ing For The Record–You can sub­scribe to the pod­cast HERE [2].

You can sub­scribe to e‑mail alerts from Spitfirelist.com HERE [3].

You can subscribe to the RSS feed from Spitfirelist.com HERE [3].

You can sub­scribe to the com­ments made on pro­grams and posts–an excel­lent source of infor­ma­tion in, and of, itself, HERE [4].

Please con­sid­er sup­port­ing THE WORK DAVE EMORY DOES [5].

This broad­cast was record­ed in one, 60-minute seg­ment [6].


An Indi­an book plac­ing Adolf Hitler along­side Mahat­ma Gand­hi, Barack Oba­ma and (ahem) Naren­dra Modi.

Intro­duc­tion: We have spo­ken repeat­ed­ly about the Nazi tract Ser­pen­t’s Walk [8], in which the Third Reich goes under­ground, buys into the opin­ion-form­ing media and, even­tu­al­ly, takes over.

Hitler, the Third Reich and their actions are glo­ri­fied and memo­ri­al­ized. The essence of the book is syn­op­sized on the back cov­er:

“It assumes that Hitler’s warrior elite — the SS — didn’t give up their struggle for a White world when they lost the Second World War. Instead their survivors went underground and adopted some of the tactics of their enemies: they began building their economic muscle and buying into the opinion-forming media. A century after the war they are ready to challenge the democrats and Jews for the hearts and minds of White Americans, who have begun to have their fill of government-enforced multi-culturalism and ‘equality.’ ”

Some­thing anal­o­gous is hap­pen­ing in Ukraine and India.

In FTR #889 [9], we not­ed that Pierre Omid­yar, a dar­ling of the so-called “pro­gres­sive” sec­tor for his found­ing of The Inter­cept, was deeply involved with the financ­ing of the ascent of both Naren­dra Mod­i’s Hin­dut­va fas­cist BJP and the OUN/B suc­ces­sor orga­ni­za­tions in Ukraine.

Omidyar’s anointment [10] as an icon of investigative reporting could not be more ironic, in that journalists and critics of his fascist allies in Ukraine and India are being repressed and murdered, thereby furthering the suppression of truth in those societies. This suppression of truth feeds into the Serpent’s Walk scenario.

This program supplements past coverage of Facebook in FTR #‘s 718 [11], 946 [12], 1021 [13] and 1039 [14], noting how Facebook has networked with the very Hindutva fascist Indian elements and OUN/B successor organizations in Ukraine discussed above. This networking has been undertaken ostensibly to combat fake news. The reality may well be that the Facebook/BJP-RSS/OUN/B links generate fake news, rather than interdicting it. The fake news so generated, however, will be to the liking of the fascists in power in both countries, manifesting as a “Serpent’s Walk” revisionist scenario.

Key ele­ments of dis­cus­sion and analy­sis include:

  1. Indian politics has been largely dominated [15] by fake news, spread by social media: ” . . . . In the continuing Indian elections, as 900 million people are voting to elect representatives to the lower house of the Parliament, disinformation and hate speech are drowning out truth on social media networks in the country and creating a public health crisis like the pandemics of the past century. . . . This contagion of a staggering amount of morphed images, doctored videos and text messages is spreading largely through messaging services and influencing what India’s voters watch and read on their smartphones. A recent study by Microsoft [16] found that over 64 percent Indians encountered fake news online, the highest reported among the 22 countries surveyed. . . . These platforms are filled with fake news and disinformation aimed at influencing political choices during the Indian elections. . . .”
  2. Naren­dra Mod­i’s Hin­dut­va fas­cist BJP has been the pri­ma­ry ben­e­fi­cia­ry [15] of fake news, and his regime has part­nered with Face­book: ” . . . . The hear­ing was an exer­cise in absur­dist the­ater [17] because the gov­ern­ing B.J.P. has been the chief ben­e­fi­cia­ry of divi­sive con­tent that reach­es mil­lions because of the way social media algo­rithms, espe­cial­ly Face­book, ampli­fy ‘engag­ing’ arti­cles. . . .”
  3. Rajesh Jain is among those BJP func­tionar­ies who serve Face­book [18], as well as the Hin­dut­va fas­cists: ” . . . . By the time Rajesh Jain was scal­ing up his oper­a­tions in 2013, the BJP’s infor­ma­tion tech­nol­o­gy (IT) strate­gists had begun inter­act­ing with social media plat­forms like Face­book and its part­ner What­sApp. If sup­port­ers of the BJP are to be believed, the par­ty was bet­ter than oth­ers in util­is­ing the micro-tar­get­ing poten­tial of the plat­forms. How­ev­er, it is also true that Facebook’s employ­ees in India con­duct­ed train­ing work­shops to help the mem­bers of the BJP’s IT cell. . . .”
  4. Dr. Hiren Joshi is anoth­er [18] of the BJP oper­a­tives who is heav­i­ly involved with Face­book. ” . . . . Also assist­ing the social media and online teams to build a larg­er-than-life image for Modi before the 2014 elec­tions was a team led by his right-hand man Dr Hiren Joshi, who (as already stat­ed) is a very impor­tant advis­er to Modi whose writ extends way beyond infor­ma­tion tech­nol­o­gy and social media. . . .  Joshi has had, and con­tin­ues to have, a close and long-stand­ing asso­ci­a­tion with Facebook’s senior employ­ees in India. . . .”
  5. Shiv­nath Thukral, who was hired by Face­book in 2017 [19] to be its Pub­lic Pol­i­cy Direc­tor for India & South Asia, worked with Joshi’s team in 2014.  ” . . . . The third team, that was intense­ly focused on build­ing Modi’s per­son­al image, was head­ed by Hiren Joshi him­self who worked out of the then Gujarat Chief Minister’s Office in Gand­hi­na­gar. The mem­bers of this team worked close­ly with staffers of Face­book in India, more than one of our sources told us. As will be detailed lat­er, Shiv­nath Thukral, who is cur­rent­ly an impor­tant exec­u­tive in Face­book, worked with this team. . . .”
  6. An ostensibly remorseful BJP politician–Prodyut Bora–highlighted [18] the dramatic effect that Facebook and its WhatsApp subsidiary have had on India’s politics: ” . . . . In 2009, social media platforms like Facebook and WhatsApp had a marginal impact in India’s 20 big cities. By 2014, however, it had virtually replaced the traditional mass media. In 2019, it will be the most pervasive media in the country. . . .”
  7. A concise statement about the relationship between the BJP and Facebook was issued by BJP IT cell official Vinit Goenka [18]: ” . . . . At one stage in our interview with [Vinit] Goenka that lasted over two hours, we asked him a pointed question: ‘Who helped whom more, Facebook or the BJP?’ He smiled and said: ‘That’s a difficult question. I wonder whether the BJP helped Facebook more than Facebook helped the BJP. You could say, we helped each other.’ . . .”

Cel­e­bra­tion of the 75th Anniver­sary of the 14th Waf­fen SS Divi­sion in Lviv, Ukraine

In Ukraine, as well, Face­book and the OUN/B suc­ces­sor orga­ni­za­tions func­tion sym­bi­ot­i­cal­ly:

(Note that the Atlantic Coun­cil is dom­i­nant in the array of indi­vid­u­als and insti­tu­tions con­sti­tut­ing the Ukrain­ian fascist/Facebook coop­er­a­tive effort. We have spo­ken about the Atlantic Coun­cil in numer­ous pro­grams, includ­ing FTR #943 [21]. The orga­ni­za­tion has deep oper­a­tional links to ele­ments of U.S. intel­li­gence, as well as the OUN/B milieu that dom­i­nates the Ukrain­ian dias­po­ra.)

CrowdStrike–at the epicenter [22] of the supposed Russian hacking controversy–is noteworthy. Its co-founder and chief technology officer, Dmitry Alperovitch, is a senior fellow at the Atlantic Council, which is financed by elements at the forefront of fanning the flames of the New Cold War: “In this respect, it is worth noting that one of the commercial cybersecurity companies the government has relied on is Crowdstrike, which was one of the companies initially brought in by the DNC to investigate the alleged hacks. . . . Dmitri Alperovitch [23] is also a senior fellow at the Atlantic Council. . . . The connection between [Crowdstrike co-founder and chief technology officer Dmitri] Alperovitch and the Atlantic Council has gone largely unremarked upon, but it is relevant given that the Atlantic Council—which is funded in part [24] by the US State Department, NATO, the governments of Latvia and Lithuania, the Ukrainian World Congress, and the Ukrainian oligarch Victor Pinchuk—has been among the loudest voices calling for a new Cold War with Russia. As I pointed out in the pages of The Nation in November, the Atlantic Council has spent the past several years producing some of the most virulent specimens of the new Cold War propaganda. . . .”

In May of 2018, Face­book decid­ed to effec­tive­ly out­source the work of iden­ti­fy­ing pro­pa­gan­da and mis­in­for­ma­tion dur­ing elec­tions to the Atlantic Coun­cil. [25]

” . . . . Face­book [26] is part­ner­ing with the Atlantic Coun­cil in anoth­er effort to com­bat elec­tion-relat­ed pro­pa­gan­da and mis­in­for­ma­tion from pro­lif­er­at­ing on its ser­vice. The social net­work­ing giant said Thurs­day that a part­ner­ship with the Wash­ing­ton D.C.-based think tank would help it bet­ter spot dis­in­for­ma­tion dur­ing upcom­ing world elec­tions. The part­ner­ship is one of a num­ber of steps Face­book is tak­ing to pre­vent the spread of pro­pa­gan­da and fake news after fail­ing to stop it [27] from spread­ing on its ser­vice in the run up to the 2016 U.S. pres­i­den­tial elec­tion. . . .”

Since autumn 2018, Facebook has looked to hire a public policy manager for Ukraine. The job came after years of Ukrainians criticizing the platform for takedowns of its activists’ pages and the spread of [alleged] Russian disinfo targeting Kyiv. Now, it appears to have one: @Kateryna_Kruk [28]. — Christopher Miller (@ChristopherJM), June 3, 2019 [29]


Oleh Tyahnybok, leader of the OUN/B successor organization Svoboda, for which Kateryna Kruk worked.

Katery­na Kruk:

  1. Is Facebook’s Pub­lic Pol­i­cy Man­ag­er for Ukraine as of May of this year, accord­ing to her LinkedIn page [31].
  2. Worked as an ana­lyst and TV host for the Ukrain­ian ‘anti-Russ­ian pro­pa­gan­da’ out­fit Stop­Fake. Stop­Fake is the cre­ation of Ire­na Chalu­pa, who works for the Atlantic Coun­cil and the Ukrain­ian gov­ern­ment [32] and appears to be the sis­ter of Andrea and Alexan­dra Chalu­pa [21].
  3. Joined the “Krem­lin Watch” team at the Euro­pean Val­ues think-tank [33], in Octo­ber of 2017.
  4. Received the Atlantic Coun­cil’s Free­dom award for her com­mu­ni­ca­tions work dur­ing the Euro­maid­an protests [34] in June of 2014.
  5. Worked for OUN/B suc­ces­sor orga­ni­za­tion Svo­bo­da dur­ing the Euro­maid­an [35] protests. “ . . . ‘There are peo­ple who don’t sup­port Svo­bo­da because of some of their slo­gans, but they know it’s the most active polit­i­cal par­ty and go to them for help, said Svo­bo­da vol­un­teer Katery­na Kruk. . . . ” . . . .
  6. Also has a num­ber of arti­cles on the Atlantic Council’s Blog. Here’s a blog post from August of 2018 where she advo­cates for the cre­ation of an inde­pen­dent Ukrain­ian Ortho­dox Church to dimin­ish the influ­ence of the Russ­ian Ortho­dox Church [36].
  7. Accord­ing to her LinkedIn page [31] has also done exten­sive work for the Ukrain­ian gov­ern­ment. From March 2016 to Jan­u­ary 2017 she was the Strate­gic Com­mu­ni­ca­tions Man­ag­er for the Ukrain­ian par­lia­ment where she was respon­si­ble for social media and inter­na­tion­al com­mu­ni­ca­tions. From Jan­u­ary-April 2017 she was the Head of Com­mu­ni­ca­tions at the Min­istry of Health.
  8. Was not only a volunteer for Svoboda during the 2014 Euromaidan protests, but openly celebrated on Twitter the May 2014 massacre in Odessa, when the far right burned dozens of protestors alive. Kruk’s Twitter feed is set to private now, so there isn’t public access to her old tweet, but people have screen captures of it. Here’s a tweet [37] from Yasha Levine with a screenshot of Kruk’s May 2, 2014 tweet, in which she writes: “#Odessa cleaned itself from terrorists, proud for city fighting for its identity.glory to fallen heroes..” She even threw in a “glory to fallen heroes” at the end of her tweet celebrating this massacre. Keep in mind that it was a month after this tweet that the Atlantic Council gave her that Freedom Award for her communications work during the protests.
  9. In 2014 [38], tweeted that a man had asked her to convince his grandson not to join the Azov Battalion, a neo-Nazi militia. “I couldn’t do it,” she said. “I thanked that boy and blessed him [39].” And he then traveled to Luhansk to fight pro-Russian rebels.
  10. Lionized [38] a Nazi sniper killed in Ukraine’s civil war. In March 2018, a 19-year-old neo-Nazi named Andriy “Dilly” Krivich [40] was shot and killed by a sniper. Krivich had been fighting with the fascist Ukrainian group Right Sector, and had posted photos on social media wearing Nazi German symbols. After he was killed, Kruk tweeted an homage to the teenage Nazi [41]. (The Nazi was also lionized on Euromaidan Press’ Facebook page [42].)
  11. Has [38] staunchly defended the use of the slogan “Slava Ukraini,” [43] which was first coined and popularized [44] by Nazi-collaborating fascists [45], and is now the official salute of Ukraine’s army [46].
  12. Has [38] also said that the Ukrain­ian fas­cist politi­cian Andriy Paru­biy [47], who co-found­ed a neo-Nazi par­ty before lat­er becom­ing the chair­man of Ukraine’s par­lia­ment the Rada, is “act­ing smart [48],” writ­ing, “Paru­biy touche [49].” . . . .

In the context of Facebook’s institutional-level networking with fascists, it is worth noting that social media themselves have been cited [50] as a contributing factor to right-wing domestic terrorism: “. . . . The first is stochastic terrorism [51]: ‘The use of mass, public communication, usually against a particular individual or group, which incites or inspires acts of terrorism which are statistically probable but happen seemingly at random.’ I encountered the idea in a Friday thread from data scientist Emily Gorcenski [52], who used it to tie together four recent attacks. . . . .”

The program concludes with a review [53] (from FTR #1039 [14]) of the psychological warfare strategy adapted by Cambridge Analytica to the political arena. Christopher Wylie–the former head of research at Cambridge Analytica who became one of the key insider whistle-blowers about how Cambridge Analytica operated and the extent of Facebook’s knowledge about it–gave an interview to Campaign Magazine. (We dealt with Cambridge Analytica in FTR #‘s 946 [12], 1021 [13].) Wylie recounts how, as director of research at Cambridge Analytica, his original role was to determine how the company could use the information warfare techniques used by SCL Group – Cambridge Analytica’s parent company and a defense contractor providing psy op services for the British military. Wylie’s job was to adapt the psychological warfare strategies that SCL had been using on the battlefield to the online space. As Wylie put it:

“ . . . . When you are work­ing in infor­ma­tion oper­a­tions projects, where your tar­get is a com­bat­ant, the auton­o­my or agency of your tar­gets is not your pri­ma­ry con­sid­er­a­tion. It is fair game to deny and manip­u­late infor­ma­tion, coerce and exploit any men­tal vul­ner­a­bil­i­ties a per­son has, and to bring out the very worst char­ac­ter­is­tics in that per­son because they are an ene­my…But if you port that over to a demo­c­ra­t­ic sys­tem, if you run cam­paigns designed to under­mine people’s abil­i­ty to make free choic­es and to under­stand what is real and not real, you are under­min­ing democ­ra­cy and treat­ing vot­ers in the same way as you are treat­ing ter­ror­ists. . . . .”

Wylie also draws parallels between the psychological operations used on democratic audiences and the battlefield techniques used to build an insurgency.

1a. Following the BJP’s sweeping victory in India’s elections, a victory that exceeded expectations [54], there’s no shortage of questions about how the BJP managed such a resounding win despite what appeared to be growing popular frustration with the party just six months ago [55]. And while the embrace of nationalism and sectarianism no doubt played a major role, along with the tensions with Pakistan, it’s also important to give credit to the profound role social media played in this year’s elections. Specifically, organized social media disinformation campaigns run by the BJP [15]:

“India Has a Pub­lic Health Cri­sis. It’s Called Fake News.” by Samir Patil; The New York Times; 04/29/2019 [15].

In the con­tin­u­ing Indi­an elec­tions, as 900 mil­lion peo­ple are vot­ing to elect rep­re­sen­ta­tives to the low­er house of the Par­lia­ment, dis­in­for­ma­tion and hate speech are drown­ing out truth on social media net­works in the coun­try and cre­at­ing a pub­lic health cri­sis like the pan­demics of the past cen­tu­ry.

This con­ta­gion of a stag­ger­ing amount of mor­phed images, doc­tored videos and text mes­sages is spread­ing large­ly through mes­sag­ing ser­vices and influ­enc­ing what India’s vot­ers watch and read on their smart­phones. A recent study by Microsoft [16] found that over 64 per­cent Indi­ans encoun­tered fake news online, the high­est report­ed among the 22 coun­tries sur­veyed.

India has the most social media users, with 300 mil­lion users on Face­book, 200 mil­lion on What­sApp and 250 mil­lion using YouTube. Tik­Tok, the video mes­sag­ing ser­vice owned by a Chi­nese com­pa­ny, has more than 88 mil­lion users [56] in India. And there are Indi­an mes­sag­ing appli­ca­tions such as ShareChat, which claims to have 40 mil­lion users and allows them to com­mu­ni­cate in 14 Indi­an lan­guages.

These plat­forms are filled with fake news and dis­in­for­ma­tion aimed at influ­enc­ing polit­i­cal choic­es dur­ing the Indi­an elec­tions. Some of the egre­gious instances are a made-up BBC sur­vey [57] pre­dict­ing vic­to­ry for the gov­ern­ing Bharatiya Jana­ta Par­ty and a fake video of the oppo­si­tion Con­gress Par­ty pres­i­dent, Rahul Gand­hi, say­ing a machine can con­vert pota­toes into gold [58].

Fake sto­ries are spread by legions of online trolls and unsus­pect­ing users, with dan­ger­ous impact. A rumor spread through social media about child kid­nap­pers arriv­ing in var­i­ous parts of India has led to 33 deaths in 69 inci­dents of mob vio­lence [59] since 2017, accord­ing to Indi­aSpend, a data jour­nal­ism web­site.

Six months before the 2014 gen­er­al elec­tions in India, 62 peo­ple were killed in sec­tar­i­an vio­lence and 50,000 were dis­placed from their homes in the north­ern state of Uttar Pradesh. Inves­ti­ga­tions by the police found that a fake video was shared [60] on What­sApp to whip up sec­tar­i­an pas­sions.

In the lead-up to the elec­tions, the Indi­an gov­ern­ment sum­moned the top exec­u­tives of Face­book and Twit­ter to dis­cuss the cri­sis of coor­di­nat­ed mis­in­for­ma­tion, fake news and polit­i­cal bias on their plat­forms. In March, Joel Kaplan, Facebook’s glob­al vice pres­i­dent for pub­lic pol­i­cy, was called to appear before a com­mit­tee of 31 mem­bers of the Indi­an Par­lia­ment — who were most­ly from the rul­ing Bharatiya Jana­ta Par­ty — to dis­cuss “safe­guard­ing [61] cit­i­zens’ rights on social/online news media plat­forms.”

The hear­ing was an exer­cise in absur­dist the­ater [17] because the gov­ern­ing B.J.P. has been the chief ben­e­fi­cia­ry of divi­sive con­tent that reach­es mil­lions because of the way social media algo­rithms, espe­cial­ly Face­book, ampli­fy “engag­ing” arti­cles.

As else­where in the world, Face­book, Twit­ter and YouTube are ambiva­lent about tack­ling the prob­lem head-on for the fear of mak­ing deci­sions that invoke the wrath of nation­al polit­i­cal forces [17]. The tightrope walk was evi­dent when in April, Face­book announced a ban on about 1,000 fake news pages tar­get­ing India. They includ­ed pages direct­ly asso­ci­at­ed with polit­i­cal par­ties.

Face­book announced that a major­i­ty of the pages were asso­ci­at­ed with the oppo­si­tion Indi­an Nation­al Con­gress par­ty, but it mere­ly named the tech­nol­o­gy com­pa­ny asso­ci­at­ed with the gov­ern­ing B.J.P. pages. Many news reports lat­er point­ed [62] out that the pages relat­ed to the B.J.P. that were removed were far more con­se­quen­tial and reached mil­lions.

Ask­ing the social media plat­forms to fix the cri­sis is a deeply flawed approach because most of the dis­in­for­ma­tion is shared in a decen­tral­ized man­ner through mes­sag­ing. Seek­ing to mon­i­tor those mes­sages is a step toward accept­ing mass sur­veil­lance. The Indi­an gov­ern­ment loves the idea and has pro­posed laws [63] that, among oth­er things, would break end-to-end encryp­tion and obtain user data with­out a court order. 

The idea of more effec­tive fact-check­ing has come up often in the debates around India’s dis­in­for­ma­tion con­ta­gion. But it comes with many con­cep­tu­al dif­fi­cul­ties: A large pro­por­tion of mes­sages shared on social net­works in India have lit­tle to do with ver­i­fi­able facts and ped­dle prej­u­diced opin­ions. Face­book India has a small 11- to 22 [64]-mem­ber fact-check­ing team for con­tent relat­ed to Indi­an elec­tions.

Fake news is not a tech­no­log­i­cal or sci­en­tif­ic prob­lem with a quick fix. It should be treat­ed as a new kind of pub­lic health cri­sis in all its social and human com­plex­i­ty. The answer might lie in look­ing back at how we respond­ed to the epi­demics, the infec­tious dis­eases in the 19th and ear­ly 20th cen­turies, which have sim­i­lar char­ac­ter­is­tics. . . .

1b. As the following article notes, the BJP government asking Facebook to help with the disinformation crisis is made even more farcical by the fact that Facebook had previously conducted training workshops to help the BJP use Facebook more effectively. The article describes the teams set up by the BJP’s IT operation for the 2014 election to build a larger-than-life image for Modi. There were four such teams.

One of those teams was run by Modi’s right-hand man, Dr Hiren Joshi. According to the article, Joshi has had, and continues to have, a close and long-standing association with Facebook’s senior employees in India. Joshi’s team worked closely with Facebook’s staff. Shivnath Thukral, who was hired by Facebook in 2017 to be its Public Policy Director for India & South Asia [19], worked with this team in 2014. And that’s just an overview of how tightly Facebook was working with the BJP in 2014 [18]:

“Meet the advi­sors who helped make the BJP a social media pow­er­house of data and pro­pa­gan­da” by Cyril Sam & Paran­joy Guha Thakur­ta; Scroll.in; 05/06/2019 [18].

By the time Rajesh Jain was scal­ing up his oper­a­tions in 2013, the BJP’s infor­ma­tion tech­nol­o­gy (IT) strate­gists had begun inter­act­ing with social media plat­forms like Face­book and its part­ner What­sApp. If sup­port­ers of the BJP are to be believed, the par­ty was bet­ter than oth­ers in util­is­ing the micro-tar­get­ing poten­tial of the plat­forms. How­ev­er, it is also true that Facebook’s employ­ees in India con­duct­ed train­ing work­shops to help the mem­bers of the BJP’s IT cell.

Help­ing par­ty func­tionar­ies were adver­tis­ing hon­chos like Sajan Raj Kurup, founder of Cre­ative­land Asia and Prahlad Kakkar, the well-known adver­tis­ing pro­fes­sion­al. Actor Anu­pam Kher became the pub­lic face of some of the adver­tis­ing cam­paigns. Also assist­ing the social media and online teams to build a larg­er-than-life image for Modi before the 2014 elec­tions was a team led by his right-hand man Dr Hiren Joshi, who (as already stat­ed) is a very impor­tant advis­er to Modi whose writ extends way beyond infor­ma­tion tech­nol­o­gy and social media.

Cur­rent­ly, Offi­cer On Spe­cial Duty in the Prime Minister’s Office, he is assist­ed by two young pro­fes­sion­al “techies,” Nirav Shah and Yash Rajiv Gand­hi. Joshi has had, and con­tin­ues to have, a close and long-stand­ing asso­ci­a­tion with Facebook’s senior employ­ees in India. In 2013, one of his impor­tant col­lab­o­ra­tors was Akhilesh Mishra who lat­er went on to serve as a direc­tor of the Indi­an government’s web­site, MyGov India – which is at present led by Arvind Gup­ta who was ear­li­er head of the BJP’s IT cell.

Mishra is CEO of Bluekraft Dig­i­tal Foun­da­tion. The Foun­da­tion has been linked to a dis­in­for­ma­tion web­site titled “The True Pic­ture,” has pub­lished books authored by Prime Min­is­ter Naren­dra Modi and pro­duces cam­paign videos for NaMo Tele­vi­sion, a 24 hour cable tele­vi­sion chan­nel ded­i­cat­ed to pro­mot­ing Modi.

The 2014 Modi pre-election campaign was inspired by the 2012 campaign to elect Barack Obama as the “world’s first Facebook President.” Some of the managers of the Modi campaign like Jain were apparently inspired by Sasha Issenberg’s book on the topic, The Victory Lab: The Secret Science of Winning Campaigns. In the first data-led election in India in 2014, information was collected from every possible source to not just micro-target users but also fine-tune messages praising and “mythologising” Modi as the Great Leader who would usher in acche din for the country.

Four teams spear­head­ed the cam­paign. The first team was led by Mum­bai-based Jain who fund­ed part of the com­mu­ni­ca­tion cam­paign and also over­saw vot­er data analy­sis. He was helped by Shashi Shekhar Vem­pati in run­ning NITI and “Mis­sion 272+.” As already men­tioned, Shekhar had worked in Infos­ys and is at present the head of Prasar Bharati Cor­po­ra­tion which runs Door­dar­shan and All India Radio.

The sec­ond team was led by polit­i­cal strate­gist Prashant Kishor and his I‑PAC or Indi­an Polit­i­cal Action Com­mit­tee who super­vised the three-dimen­sion­al pro­jec­tion pro­gramme for Modi besides pro­grammes like Run for Uni­ty, Chai Pe Char­cha (or Dis­cus­sions Over Tea), Man­than (or Churn­ing) and Cit­i­zens for Account­able Gov­er­nance (CAG) that roped in man­age­ment grad­u­ates to gar­ner sup­port for Modi at large gath­er­ings. Hav­ing worked across the polit­i­cal spec­trum and oppor­tunis­ti­cal­ly switched affil­i­a­tion to those who backed (and paid) him, 41-year-old Kishor is cur­rent­ly the sec­ond-in-com­mand in Jana­ta Dal (Unit­ed) head­ed by Bihar Chief Min­is­ter Nitish Kumar.

The third team, that was intense­ly focused on build­ing Modi’s per­son­al image, was head­ed by Hiren Joshi him­self who worked out of the then Gujarat Chief Minister’s Office in Gand­hi­na­gar. The mem­bers of this team worked close­ly with staffers of Face­book in India, more than one of our sources told us. As will be detailed lat­er, Shiv­nath Thukral, who is cur­rent­ly an impor­tant exec­u­tive in Face­book, worked with this team. (We made a num­ber of tele­phone calls to Joshi’s office in New Delhi’s South Block seek­ing a meet­ing with him and also sent him an e‑mail mes­sage request­ing an inter­view but he did not respond.)

The fourth team was led by Arvind Gup­ta, the cur­rent CEO of MyGov.in, a social media plat­form run by the gov­ern­ment of India. He ran the BJP’s cam­paign based out of New Del­hi. When con­tact­ed, he too declined to speak on the record say­ing he is now with the gov­ern­ment and not a rep­re­sen­ta­tive of the BJP. He sug­gest­ed we con­tact Amit Malviya who is the present head of the BJP’s IT cell. He came on the line but declined to speak specif­i­cal­ly on the BJP’s rela­tion­ship with Face­book and What­sApp.

The four teams worked sep­a­rate­ly. “It was (like) a relay (race),” said Vinit Goen­ka who was then the nation­al co-con­ven­er of the BJP’s IT cell, adding: “The only knowl­edge that was shared (among the teams) was on a ‘need to know’ basis. That’s how any sen­si­ble organ­i­sa­tion works.”

From all accounts, Rajesh Jain worked inde­pen­dent­ly from his Low­er Par­el office and invest­ed his own funds to sup­port Modi and towards exe­cut­ing what he described as “Project 275 for 2014” in a blog post that he wrote in June 2011, near­ly three years before the elec­tions actu­al­ly took place. The BJP, of course, went on to win 282 seats in the 2014 Lok Sab­ha elec­tions, ten above the half-way mark, with a lit­tle over 31 per cent of the vote.

As an aside, it may be men­tioned in pass­ing that – like cer­tain for­mer bhak­ts or fol­low­ers of Modi – Jain today appears less than enthu­si­as­tic about the per­for­mance of the gov­ern­ment over the last four and a half years. He is cur­rent­ly engaged in pro­mot­ing a cam­paign called Dhan Vapasi (or “return our wealth”) which is aimed at mon­etis­ing sur­plus land and oth­er assets held by gov­ern­ment bod­ies, includ­ing defence estab­lish­ments, and pub­lic sec­tor under­tak­ings, for the ben­e­fit of the poor and the under­priv­i­leged. Dhan Vapasi, in his words, is all about mak­ing “every Indi­an rich and free.”

In one of his recent videos that are in the pub­lic domain, Jain remarked: “For the 2014 elec­tions, I had spent three years and my own mon­ey to build a team of 100 peo­ple to help with Modi’s cam­paign. Why? Because I trust­ed that a Modi-led BJP gov­ern­ment could end the Con­gress’ anti-pros­per­i­ty pro­grammes and put India on a path to pros­per­i­ty, a nayi disha (or new direc­tion). But four years have gone by with­out any sig­nif­i­cant change in pol­i­cy. India need­ed that to elim­i­nate the big and hame­sha (peren­ni­al) prob­lems of pover­ty, unem­ploy­ment and cor­rup­tion. The Modi-led BJP gov­ern­ment fol­lowed the same old failed pol­i­cy of increas­ing tax­es and spend­ing. The ruler changed, but the out­comes have not.”

As men­tioned, when we con­tact­ed 51-year-old Jain, who heads the Mum­bai-based Net­core group of com­pa­nies, said to be India’s biggest dig­i­tal media mar­ket­ing cor­po­rate group, he declined to be inter­viewed. Inci­den­tal­ly, he had till Octo­ber 2017 served on the boards of direc­tors of two promi­nent pub­lic sec­tor com­pa­nies. One was Nation­al Ther­mal Pow­er Cor­po­ra­tion (NTPC) – Jain has no expe­ri­ence in the pow­er sec­tor, just as Sam­bit Patra, BJP spokesper­son, who is an “inde­pen­dent” direc­tor on the board of the Oil and Nat­ur­al Gas Cor­po­ra­tion, has zero expe­ri­ence in the petro­le­um indus­try. Jain also served on the board of the Unique Iden­ti­fi­ca­tion Author­i­ty of India (UIDAI), which runs the Aad­har pro­gramme.

Unlike Jain who was not at all forth­com­ing, 44-year-old Prodyut Bora, founder of the BJP’s IT cell in 2007 (bare­ly a year after Face­book and Twit­ter had been launched) was far from ret­i­cent while speak­ing to us. He had resigned from the party’s nation­al exec­u­tive in Feb­ru­ary 2015 after ques­tion­ing Modi and Amit Shah’s “high­ly indi­vid­u­alised and cen­tralised style of deci­sion-mak­ing” that had led to the “sub­ver­sion of demo­c­ra­t­ic tra­di­tions” in the gov­ern­ment and in the par­ty.

Bora recalled how he was one of the first grad­u­ates from the lead­ing busi­ness school, the Indi­an Insti­tute of Man­age­ment, Ahmed­abad, to join the BJP because of his great admi­ra­tion for the then Prime Min­is­ter Atal Behari Vaj­pay­ee. It was at the behest of the then par­ty pres­i­dent Raj­nath Singh (who is now Union Home Min­is­ter) that he set up the party’s IT cell to enable its lead­ers to come clos­er to, and inter­act with, their sup­port­ers.

The cell, he told us, was cre­at­ed not with a man­date to abuse peo­ple on social media plat­forms. He lament­ed that “mad­ness” has now gripped the BJP and the desire to win elec­tions at any cost has “destroyed the very ethos” of the par­ty he was once a part of. Today, the Gur­gaon-based Bora runs a firm mak­ing air purifi­ca­tion equip­ment and is involved with an inde­pen­dent polit­i­cal par­ty in his home state, Assam.

He told us: “The process of being eco­nom­i­cal with the truth (in the BJP) began in 2014. The (elec­tion) cam­paign was send­ing out unver­i­fied facts, infomer­cials, memes, dodgy data and graphs. From there, fake news was one step up the curve. Lead­ers of polit­i­cal par­ties, includ­ing the BJP, like to out­source this work because they don’t want to leave behind dig­i­tal foot­prints. In 2009, social media plat­forms like Face­book and What­sApp had a mar­gin­al impact in India’s 20 big cities. By 2014, how­ev­er, it had vir­tu­al­ly replaced the tra­di­tion­al mass media. In 2019, it will be the most per­va­sive media in the coun­try.” . . . .

. . . . At one stage in our inter­view with [Vinit] Goen­ka that last­ed over two hours, we asked him a point­ed ques­tion: “Who helped whom more, Face­book or the BJP?”

He smiled and said: “That’s a dif­fi­cult ques­tion. I won­der whether the BJP helped Face­book more than Face­book helped the BJP. You could say, we helped each oth­er.”

1c. According to Christopher Miller of RFE/RL, Facebook selected Kateryna Kruk for the position:

Since autumn 2018, Facebook has looked to hire a public policy manager for Ukraine. The job came after years of Ukrainians criticizing the platform for takedowns of its activists’ pages and the spread of Russian disinfo targeting Kyiv. Now, it appears to have one: @Kateryna_Kruk [28]. — Christopher Miller (@ChristopherJM), June 3, 2019 [29]

Kruk’s LinkedIn page [31] also lists her as being Facebook’s Pub­lic Pol­i­cy Man­ag­er for Ukraine as of May of this year.

Kruk  worked as an ana­lyst and TV host for the Ukrain­ian ‘anti-Russ­ian pro­pa­gan­da’ out­fit Stop­Fake. Stop­Fake is the cre­ation of Ire­na Chalu­pa, who works for the Atlantic Coun­cil and the Ukrain­ian gov­ern­ment [32] and appears to be the sis­ter of Andrea and Alexan­dra Chalu­pa.

(As an exam­ple of how StopFake.org approach­es Ukraine’s far right, here’s a tweet [65] from StopFake’s co-founder, Yevhen Fed­chenko, from May of 2018 where he com­plains about an arti­cle in Hro­madske Inter­na­tion­al [66] that char­ac­ter­izes C14 as a neo-Nazi group:

“for Hro­madske C14 is ‘neo- nazi’, in real­i­ty one of them – Olek­san­dr Voitko – is a war vet­er­an and before going to the war – alum and fac­ul­ty at @MohylaJSchool [67], jour­nal­ist at For­eign news desk at Chan­nel 5. Now also active par­tic­i­pant of war vet­er­ans grass-root orga­ni­za­tion. https://t.co/QmaGnu6QGZ [68]— Yevhen Fed­chenko (@yevhenfedchenko) May 5, 2018) [69]

In October of 2017, Kruk joined the “Kremlin Watch” team at the European Values think-tank [33]. In June of 2014, the Atlantic Council gave Kruk its Freedom award for her communications work during the Euromaidan protests [34]. Kruk also has a number of articles on the Atlantic Council’s Blog. Here’s a blog post from August of 2018 where she advocates for the creation of an independent Ukrainian Orthodox Church to diminish the influence of the Russian Orthodox Church [36]. Keep in mind that, in May of 2018, Facebook decided to effectively outsource the work of identifying propaganda and misinformation during elections to the Atlantic Council [25], so choosing someone like Kruk, who already has the Atlantic Council’s stamp of approval, is in keeping with that trend.

Accord­ing to Kruk’s LinkedIn page [31] she’s also done exten­sive work for the Ukrain­ian gov­ern­ment. From March 2016 to Jan­u­ary 2017 she was the Strate­gic Com­mu­ni­ca­tions Man­ag­er for the Ukrain­ian par­lia­ment where she was respon­si­ble for social media and inter­na­tion­al com­mu­ni­ca­tions. From Jan­u­ary-April 2017 she was the Head of Com­mu­ni­ca­tions at the Min­istry of Health.

Kruk was not only a volunteer for Svoboda during the 2014 Euromaidan protests; she also openly celebrated on Twitter the May 2014 massacre in Odessa, when the far right burned dozens of protestors alive. Kruk’s Twitter feed is set to private now, so there isn’t public access to her old tweet, but people have screen captures of it. Here’s a tweet [37] from Yasha Levine with a screenshot of Kruk’s May 2, 2014 tweet, in which she writes:
“#Odessa cleaned itself from ter­ror­ists, proud for city fight­ing for its identity.glory to fall­en heroes..”

She even threw in a “glory to fallen heroes” at the end of her tweet celebrating this massacre. Keep in mind that it was a month after this tweet that the Atlantic Council gave her that Freedom Award for her communications work during the protests.

An article from January of 2014 about the then-ongoing Maidan square protests covers the growing presence of the far right in the protests and their attacks on left-wing protestors. Kruk is interviewed in the article and describes herself as a Svoboda volunteer. (She issued her tweet celebrating the Odessa massacre a few months later.) The interview also stands out from a public relations standpoint: Kruk was explaining why average Ukrainians who don’t necessarily support the far right should nonetheless support it at that moment, which was one of the most useful messages she could have been sending for the far right at that time [35]:

“The Ukrain­ian Nation­al­ism at the Heart of ‘Euro­maid­an’” by Alec Luhn; The Nation; 01/21/2014 [35].

. . . . For now, Svoboda and other far-right movements like Right Sector are focusing on the protest-wide demands for civic freedoms and government accountability rather than overtly nationalist agendas. Svoboda enjoys a reputation as a party of action, responsive to citizens’ problems. Noyevy cut an interview with The Nation short to help local residents who came with a complaint that a developer was tearing down a fence without permission.

“There are peo­ple who don’t sup­port Svo­bo­da because of some of their slo­gans, but they know it’s the most active polit­i­cal par­ty and go to them for help,” said Svo­bo­da vol­un­teer Katery­na Kruk. “Only Svo­bo­da is help­ing against land seizures in Kiev.” . . . .

1d. Kruk has man­i­fest­ed oth­er fas­cist sym­pa­thies and con­nec­tions:

  1. In 2014, she tweet­ed that a man had asked her to con­vince his grand­son not to join the Azov Bat­tal­ion, a neo-Nazi mili­tia. “I couldn’t do it,” she said. “I thanked that boy and blessed him [39].” And he then trav­eled to Luhan­sk to fight pro-Russ­ian rebels.
  2. Nazi sniper Dilly Krivich, posthumously lionized by Kateryna Kruk:

    In March 2018, a 19-year-old neo-Nazi named Andriy “Dilly” Krivich [40] was shot and killed by a sniper. Krivich had been fighting with the fascist Ukrainian group Right Sector, and had posted photos on social media wearing Nazi German symbols. After he was killed, Kruk tweeted an homage to the teenage Nazi [41]. (The Nazi was also lionized on Euromaidan Press’ Facebook page [42].)

  3. Kruk has staunchly defended the use of the slogan “Slava Ukraini,” [43] which was first coined and popularized [44] by Nazi-collaborating fascists [45], and is now the official salute of Ukraine’s army [46].
  4. She has also said that the Ukrain­ian fas­cist politi­cian Andriy Paru­biy [47], who co-found­ed a neo-Nazi par­ty before lat­er becom­ing the chair­man of Ukraine’s par­lia­ment the Rada, is “act­ing smart [48],” writ­ing, “Paru­biy touche [49].” . . . .

“Facebook’s New Pub­lic Pol­i­cy Man­ag­er Is Nation­al­ist Hawk Who Vol­un­teered with Fas­cist Par­ty Dur­ing US-Backed Coup” by Ben Nor­ton; The Gray Zone; 6/4/2019. [38]

. . . . Svo­bo­da is not the only Ukrain­ian fas­cist group Katery­na Kruk has expressed sup­port for. In 2014, she tweet­ed that a man had asked her to con­vince his grand­son not to join the Azov Bat­tal­ion, a neo-Nazi mili­tia. “I couldn’t do it,” she said. “I thanked that boy and blessed him [39].” And he then trav­eled to Luhan­sk to fight pro-Russ­ian rebels.

That’s not all. In March 2018, a 19-year-old neo-Nazi named Andriy “Dilly” Krivich [40] was shot and killed by a sniper. Krivich had been fighting with the fascist Ukrainian group Right Sector, and had posted photos on social media wearing Nazi German symbols. After he was killed, Kruk tweeted an homage to the teenage Nazi [41]. (The Nazi was also lionized on Euromaidan Press’ Facebook page [42].)

Kruk has staunchly defended the use of the slogan “Slava Ukraini,” [43] which was first coined and popularized [44] by Nazi-collaborating fascists [45], and is now the official salute of Ukraine’s army [46].

She has also said that the Ukrain­ian fas­cist politi­cian Andriy Paru­biy [47], who co-found­ed a neo-Nazi par­ty before lat­er becom­ing the chair­man of Ukraine’s par­lia­ment the Rada, is “act­ing smart [48],” writ­ing, “Paru­biy touche [49].” . . . .

2. The essence of the book Ser­pen­t’s Walk  is pre­sent­ed on the back cov­er:

Ser­pen­t’s Walk by “Ran­dolph D. Calver­hall;” Copy­right 1991 [SC]; Nation­al Van­guard Books; 0–937944-05‑X. [8]

It assumes that Hitler’s war­rior elite — the SS — did­n’t give up their strug­gle for a White world when they lost the Sec­ond World War. Instead their sur­vivors went under­ground and adopt­ed some of the tac­tics of their ene­mies: they began build­ing their eco­nom­ic mus­cle and buy­ing into the opin­ion-form­ing media. A cen­tu­ry after the war they are ready to chal­lenge the democ­rats and Jews for the hearts and minds of White Amer­i­cans, who have begun to have their fill of gov­ern­ment-enforced mul­ti-cul­tur­al­ism and ‘equal­i­ty.’

3. This process is described in more detail in a pas­sage of text, con­sist­ing of a dis­cus­sion between Wrench (a mem­ber of this Under­ground Reich) and a mer­ce­nary named Less­ing.

Ser­pen­t’s Walk by “Ran­dolph D. Calver­hall;” Copy­right 1991 [SC]; Nation­al Van­guard Books; 0–937944-05‑X; pp. 42–43. [8]

. . . . The SS . . . what was left of it . . . had business objectives before and during World War II. When the war was lost they just kept on, but from other places: Bogota, Asuncion, Buenos Aires, Rio de Janeiro, Mexico City, Colombo, Damascus, Dacca . . . you name it. They realized that the world is heading towards a ‘corporacracy;’ five or ten international super-companies that will run everything worth running by the year 2100. Those super-corporations exist now, and they’re already dividing up the production and marketing of food, transport, steel and heavy industry, oil, the media, and other commodities. They’re mostly conglomerates, with fingers in more than one pie . . . . We, the SS, have the say in four or five. We’ve been competing for the past sixty years or so, and we’re slowly gaining . . . . About ten years ago, we swung a merger, a takeover, and got voting control of a supercorp that runs a small but significant chunk of the American media. Not openly, not with bands and trumpets or swastikas flying, but quietly: one huge corporation cuddling up to another one and gently munching it up, like a great, gubbing amoeba. Since then we’ve been replacing executives, pushing somebody out here, bringing somebody else in there. We’ve swung program content around, too. Not much, but a little, so it won’t show. We’ve cut down on ‘nasty-Nazi’ movies . . . good guys in white hats and bad guys in black SS hats . . . lovable Jews versus fiendish Germans . . . and we have media psychologists, ad agencies, and behavior modification specialists working on image changes. . . .

4. The broad­cast address­es the grad­ual remak­ing of the image of the Third Reich that is rep­re­sent­ed in Ser­pen­t’s Walk. In the dis­cus­sion excerpt­ed above, this process is fur­ther described.

Ser­pen­t’s Walk by “Ran­dolph D. Calver­hall;” Copy­right 1991 [SC]; Nation­al Van­guard Books; 0–937944-05‑X; pp. 42–44. [8]

. . . . Hell, if you can con granny into buy­ing Sug­ar Turds instead of Bran Farts, then why can’t you swing pub­lic opin­ion over to a cause as vital and impor­tant as ours?’ . . . In any case, we’re slow­ly replac­ing those neg­a­tive images with oth­ers: the ‘Good Bad Guy’ rou­tine’ . . . ‘What do you think of Jesse James? John Dillinger? Julius Cae­sar? Genghis Khan?’ . . . The real­i­ty may have been rough, but there’s a sort of glit­ter about most of those dudes: mean hon­chos but respectable. It’s all how you pack­age it. Opin­ion is a godamned com­mod­i­ty!’ . . . It works with any­body . . . Give it time. Aside from the media, we’ve been buy­ing up pri­vate schools . . . and help­ing some pub­lic ones through phil­an­thropic foun­da­tions . . . and work­ing on the church­es and the Born Agains. . . .

5. Through the years, we have highlighted the Nazi tract Serpent’s Walk [71], excerpted above, which deals, in part, with the rehabilitation of the Third Reich’s reputation and the transformation of Hitler into a hero.

In FTR #1015 [72], we not­ed that a Ser­pen­t’s Walk sce­nario is indeed unfold­ing in India.

Key points of analysis and discussion include:

  1. Narendra Modi’s presence on the same book cover [74] (along with Gandhi, Mandela, Obama and Hitler).
  2. Modi him­self has his own polit­i­cal his­to­ry [75] with children’s books that pro­mote Hitler as a great leader: ” . . . . In 2004, reports sur­faced of high-school text­books in the state of Gujarat, which was then led by Mr. Modi, that spoke glow­ing­ly of Nazism and fas­cism [75]. Accord­ing to ‘The Times of India,’ in a sec­tion called ‘Ide­ol­o­gy of Nazism,’ the text­book said Hitler had ‘lent dig­ni­ty and pres­tige to the Ger­man gov­ern­ment,’ ‘made untir­ing efforts to make Ger­many self-reliant’ and ‘instilled the spir­it of adven­ture in the com­mon peo­ple.’  . . . .”
  3. In India, many have a favor­able view of Hitler [76]: ” . . . . as far back as 2002, the Times of India report­ed a sur­vey [76] that found that 17 per­cent of stu­dents in elite Indi­an col­leges ‘favored Adolf Hitler as the kind of leader India ought to have.’ . . . . Con­sid­er Mein Kampf [77], Hitler’s auto­bi­og­ra­phy. Reviled it might be in the much of the world, but Indi­ans buy thou­sands of copies of it every month. As a recent paper in the jour­nal EPW tells us (PDF [78]), there are over a dozen Indi­an pub­lish­ers who have edi­tions of the book on the mar­ket. Jaico, for exam­ple, print­ed its 55th edi­tion in 2010, claim­ing to have sold 100,000 copies in the pre­vi­ous sev­en years. (Con­trast this to the 3,000 copies my own 2009 book, Road­run­ner, has sold). In a coun­try where 10,000 copies sold makes a book a best­seller, these are sig­nif­i­cant num­bers. . . .”
  4. A class­room of school chil­dren filled with fans of Hitler had a very dif­fer­ent sen­ti­ment about Gand­hi. ” . . . . ‘He’s a cow­ard!’ That’s the obvi­ous flip side of this love of Hitler in India. It’s an implic­it rejec­tion of Gand­hi. . . .”
  5. Apparently, Mein Kampf has achieved gravitas among business students in India [79]: ” . . . . What’s more, there’s a steady trickle of reports that say it has become a must-read for business-school students [80]; a management guide much like Spencer Johnson’s Who Moved My Cheese or Edward de Bono’s Lateral Thinking. If this undistinguished artist could take an entire country with him, I imagine the reasoning goes, surely his book has some lessons for future captains of industry? . . . .”

6. Christo­pher Wylie–the for­mer head of research at Cam­bridge Ana­lyt­i­ca who became one of the key insid­er whis­tle-blow­ers about how Cam­bridge Ana­lyt­i­ca oper­at­ed and the extent of Facebook’s knowl­edge about it–gave an inter­view last month to Cam­paign Mag­a­zine. (We dealt with Cam­bridge Ana­lyt­i­ca in FTR #‘s 946 [12], 1021 [13].)

Wylie recounts how, as direc­tor of research at Cam­bridge Ana­lyt­i­ca, his orig­i­nal role was to deter­mine how the com­pa­ny could use the infor­ma­tion war­fare tech­niques used by SCL Group – Cam­bridge Analytica’s par­ent com­pa­ny and a defense con­trac­tor pro­vid­ing psy op ser­vices for the British mil­i­tary. Wylie’s job was to adapt the psy­cho­log­i­cal war­fare strate­gies that SCL had been using on the bat­tle­field to the online space. As Wylie put it:

“ . . . . When you are work­ing in infor­ma­tion oper­a­tions projects, where your tar­get is a com­bat­ant, the auton­o­my or agency of your tar­gets is not your pri­ma­ry con­sid­er­a­tion. It is fair game to deny and manip­u­late infor­ma­tion, coerce and exploit any men­tal vul­ner­a­bil­i­ties a per­son has, and to bring out the very worst char­ac­ter­is­tics in that per­son because they are an ene­my…But if you port that over to a demo­c­ra­t­ic sys­tem, if you run cam­paigns designed to under­mine people’s abil­i­ty to make free choic­es and to under­stand what is real and not real, you are under­min­ing democ­ra­cy and treat­ing vot­ers in the same way as you are treat­ing ter­ror­ists. . . . .”

Wylie also draws parallels between the psychological operations used on democratic audiences and the battlefield techniques used to build an insurgency. It starts with targeting people more prone to having erratic traits, paranoia or conspiratorial thinking, and getting them to “like” a group on social media. The information you’re feeding this target audience may or may not be real. The important thing is that it’s content they already agree with, so that “it feels good to see that information.” Keep in mind that one of the goals of the ‘psychographic profiling’ that Cambridge Analytica performed was to identify traits like neuroticism.

Wylie goes on to describe the next step in this insurgency-building technique: keep building up the interest in the social media group that you’re directing this target audience towards until it hits around 1,000–2,000 people. Then set up a real-life event dedicated to the chosen disinformation topic in some local area and try to get as many of your target audience as possible to show up. Even if only 5 percent of them show up, that’s still 50–100 people converging on some local coffee shop or whatever. The people meet each other in real life and start talking about “all these things that you’ve been seeing online in the depths of your den and getting angry about”. This target audience starts believing that no one else is talking about this stuff because “they don’t want you to know what the truth is”. As Wylie puts it, “What started out as a fantasy online gets ported into the temporal world and becomes real to you because you see all these people around you.”

“Cam­bridge Ana­lyt­i­ca whistle­blow­er Christo­pher Wylie: It’s time to save cre­ativ­i­ty” by Kate Magee; Cam­paign; 11/05/2018 [53].

In the ear­ly hours of 17 March 2018, the 28-year-old Christo­pher Wylie tweet­ed: “Here we go….”

Lat­er that day, The Observ­er and The New York Times pub­lished the sto­ry of Cam­bridge Analytica’s mis­use of Face­book data, which sent shock­waves around the world, caused mil­lions to #Delete­Face­book, and led the UK Infor­ma­tion Commissioner’s Office to fine the site the max­i­mum penal­ty for fail­ing to pro­tect users’ infor­ma­tion. Six weeks after the sto­ry broke, Cam­bridge Ana­lyt­i­ca closed. . . .

. . . . He believes that poor use of data is killing good ideas. And that, unless effec­tive reg­u­la­tion is enact­ed, society’s wor­ship of algo­rithms, unchecked data cap­ture and use, and the like­ly spread of AI to all parts of our lives is caus­ing us to sleep­walk into a bleak future.

Not only are such cir­cum­stances a threat to adland – why do you need an ad to tell you about a prod­uct if an algo­rithm is choos­ing it for you? – it is a threat to human free will. “Cur­rent­ly, the only moral­i­ty of the algo­rithm is to opti­mise you as a con­sumer and, in many cas­es, you become the prod­uct. There are very few exam­ples in human his­to­ry of indus­tries where peo­ple them­selves become prod­ucts and those are scary indus­tries – slav­ery and the sex trade. And now, we have social media,” Wylie says.

“The prob­lem with that, and what makes it inher­ent­ly dif­fer­ent to sell­ing, say, tooth­paste, is that you’re sell­ing parts of peo­ple or access to peo­ple. Peo­ple have an innate moral worth. If we don’t respect that, we can cre­ate indus­tries that do ter­ri­ble things to peo­ple. We are [head­ing] blind­ly and quick­ly into an envi­ron­ment where this men­tal­i­ty is going to be ampli­fied through AI every­where. We’re humans, we should be think­ing about peo­ple first.”

His words car­ry weight, because he’s been on the dark side. He has seen what can hap­pen when data is used to spread mis­in­for­ma­tion, cre­ate insur­gen­cies and prey on the worst of people’s char­ac­ters.

The polit­i­cal bat­tle­field

A quick refresh­er on the scan­dal, in Wylie’s words: Cam­bridge Ana­lyt­i­ca was a com­pa­ny spun out of SCL Group, a British mil­i­tary con­trac­tor that worked in infor­ma­tion oper­a­tions for armed forces around the world. It was con­duct­ing research on how to scale and digi­tise infor­ma­tion war­fare – the use of infor­ma­tion to con­fuse or degrade the effi­ca­cy of an ene­my. . . .

. . . . As direc­tor of research, Wylie’s orig­i­nal role was to map out how the com­pa­ny would take tra­di­tion­al infor­ma­tion oper­a­tions tac­tics into the online space – in par­tic­u­lar, by pro­fil­ing peo­ple who would be sus­cep­ti­ble to cer­tain mes­sag­ing.

This mor­phed into the polit­i­cal are­na. After Wylie left, the com­pa­ny worked on Don­ald Trump’s US pres­i­den­tial cam­paign and – pos­si­bly – the UK’s Euro­pean Union ref­er­en­dum. In Feb­ru­ary 2016, Cam­bridge Analytica’s for­mer chief exec­u­tive, Alexan­der Nix, wrote in Cam­paign that his com­pa­ny had “already helped super­charge Leave.EU’s social-media cam­paign”. Nix has stren­u­ous­ly denied this since, includ­ing to MPs.

It was this shift from the bat­tle­field to pol­i­tics that made Wylie uncom­fort­able. “When you are work­ing in infor­ma­tion oper­a­tions projects, where your tar­get is a com­bat­ant, the auton­o­my or agency of your tar­gets is not your pri­ma­ry con­sid­er­a­tion. It is fair game to deny and manip­u­late infor­ma­tion, coerce and exploit any men­tal vul­ner­a­bil­i­ties a per­son has, and to bring out the very worst char­ac­ter­is­tics in that per­son because they are an ene­my,” he says.

“But if you port that over to a demo­c­ra­t­ic sys­tem, if you run cam­paigns designed to under­mine people’s abil­i­ty to make free choic­es and to under­stand what is real and not real, you are under­min­ing democ­ra­cy and treat­ing vot­ers in the same way as you are treat­ing ter­ror­ists.”

One of the rea­sons these tech­niques are so insid­i­ous is that being a tar­get of a dis­in­for­ma­tion cam­paign is “usu­al­ly a plea­sur­able expe­ri­ence”, because you are being fed con­tent with which you are like­ly to agree. “You are being guid­ed through some­thing that you want to be true,” Wylie says.

To build an insur­gency, he explains, you first tar­get peo­ple who are more prone to hav­ing errat­ic traits, para­noia or con­spir­a­to­r­i­al think­ing, and get them to “like” a group on social media. They start engag­ing with the con­tent, which may or may not be true; either way “it feels good to see that infor­ma­tion”.

When the group reach­es 1,000 or 2,000 mem­bers, an event is set up in the local area. Even if only 5% show up, “that’s 50 to 100 peo­ple flood­ing a local cof­fee shop”, Wylie says. This, he adds, val­i­dates their opin­ion because oth­er peo­ple there are also talk­ing about “all these things that you’ve been see­ing online in the depths of your den and get­ting angry about”.

Peo­ple then start to believe the rea­son it’s not shown on main­stream news chan­nels is because “they don’t want you to know what the truth is”. As Wylie sums it up: “What start­ed out as a fan­ta­sy online gets port­ed into the tem­po­ral world and becomes real to you because you see all these peo­ple around you.” . . . . 

. . . . Psy­cho­graph­ic poten­tial

One such appli­ca­tion was Cam­bridge Analytica’s use of psy­cho­graph­ic pro­fil­ing, a form of seg­men­ta­tion that will be famil­iar to mar­keters, although not in com­mon use.

The com­pa­ny used the OCEAN mod­el, which judges peo­ple on scales of the Big Five per­son­al­i­ty traits: open­ness to expe­ri­ences, con­sci­en­tious­ness, extra­ver­sion, agree­able­ness and neu­roti­cism.

Wylie believes the method could be useful in the commercial space. For example, a fashion brand that creates bold, colourful, patterned clothes might want to segment wealthy women by extraversion, because they will be more likely to buy bold items, he says.
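To make the idea of trait-based segmentation concrete, here is a minimal sketch in Python. The Profile fields, the scores and the 0.7 cut-off are invented for illustration only; nothing here reflects Cambridge Analytica's actual data or tooling.

```python
# Hypothetical sketch of OCEAN-style segmentation. All scores are made up.
from dataclasses import dataclass

@dataclass
class Profile:
    user_id: str
    openness: float           # each trait scored 0.0-1.0 (assumed scale)
    conscientiousness: float
    extraversion: float
    agreeableness: float
    neuroticism: float

def segment_by_extraversion(profiles, threshold=0.7):
    """Return only the profiles whose extraversion score clears the threshold."""
    return [p for p in profiles if p.extraversion >= threshold]

audience = [
    Profile("a1", 0.6, 0.4, 0.9, 0.5, 0.3),
    Profile("a2", 0.5, 0.7, 0.2, 0.6, 0.8),
]
print(segment_by_extraversion(audience))  # only "a1" clears the 0.7 cut-off
```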

Scep­tics say Cam­bridge Analytica’s approach may not be the dark mag­ic that Wylie claims. Indeed, when speak­ing to Cam­paign in June 2017, Nix unchar­ac­ter­is­ti­cal­ly played down the method, claim­ing the com­pa­ny used “pret­ty bland data in a pret­ty enter­pris­ing way”.

But Wylie argues that peo­ple under­es­ti­mate what algo­rithms allow you to do in pro­fil­ing. “I can take pieces of infor­ma­tion about you that seem innocu­ous, but what I’m able to do with an algo­rithm is find pat­terns that cor­re­late to under­ly­ing psy­cho­log­i­cal pro­files,” he explains.

“I can ask whether you lis­ten to Justin Bieber, and you won’t feel like I’m invad­ing your pri­va­cy. You aren’t nec­es­sar­i­ly aware that when you tell me what music you lis­ten to or what TV shows you watch, you are telling me some of your deep­est and most per­son­al attrib­ut­es.”
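What Wylie describes here is, in effect, a statistical mapping from innocuous behavioural signals to trait estimates. The toy sketch below shows only the general shape of such an inference; the weights are assumed and have no empirical basis.

```python
# Hypothetical sketch: inferring a trait score from "likes". Weights are invented.
from math import exp

LIKE_WEIGHTS = {
    "justin_bieber": 0.8,
    "true_crime_podcasts": -0.3,
    "stand_up_comedy": 0.5,
}

def estimated_extraversion(likes):
    """Squash the summed weights through a logistic function to get a 0-1 score."""
    score = sum(LIKE_WEIGHTS.get(item, 0.0) for item in likes)
    return 1.0 / (1.0 + exp(-score))

print(round(estimated_extraversion({"justin_bieber", "stand_up_comedy"}), 2))  # ~0.79
```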

This is where mat­ters stray into the ques­tion of ethics. Wylie believes that as long as the com­mu­ni­ca­tion you are send­ing out is clear, not coer­cive or manip­u­la­tive, it’s fine, but it all depends on con­text. “If you are a beau­ty com­pa­ny and you use facets of neu­roti­cism – which Cam­bridge Ana­lyt­i­ca did – and you find a seg­ment of young women or men who are more prone to body dys­mor­phia, and one of the proac­tive actions they take is to buy more skin cream, you are exploit­ing some­thing which is unhealthy for that per­son and doing dam­age,” he says. “The ethics of using psy­cho­me­t­ric data real­ly depend on whether it is pro­por­tion­al to the ben­e­fit and util­i­ty that the cus­tomer is get­ting.” . . .

Clash­es with Face­book

Wylie is opposed to self-reg­u­la­tion, because indus­tries won’t become con­sumer cham­pi­ons – they are, he says, too con­flict­ed.

“Face­book has known about what Cam­bridge Ana­lyt­i­ca was up to from the very begin­ning of those projects,” Wylie claims. “They were noti­fied, they autho­rised the appli­ca­tions, they were giv­en the terms and con­di­tions of the app that said explic­it­ly what it was doing. They hired peo­ple who worked on build­ing the app. I had legal cor­re­spon­dence with their lawyers where they acknowl­edged it hap­pened as far back as 2016.”

He wants to cre­ate a set of endur­ing prin­ci­ples that are hand­ed over to a tech­ni­cal­ly com­pe­tent reg­u­la­tor to enforce. “Cur­rent­ly, the indus­try is not respond­ing to some pret­ty fun­da­men­tal things that have hap­pened on their watch. So I think it is the right place for gov­ern­ment to step in,” he adds.

Facebook in particular, he argues, is “the most obstinate and belligerent in recognising the harm that has been done and actually doing something about it”. . . .

7. Social media have been underscored as a contributing factor to right-wing domestic terrorism: ” . . . . The first is stochastic terrorism [51]: ‘The use of mass, public communication, usually against a particular individual or group, which incites or inspires acts of terrorism which are statistically probable but happen seemingly at random.’ I encountered the idea in a Friday thread from data scientist Emily Gorcenski [52], who used it to tie together four recent attacks. . . .”

“Why Social Media is Friend to Far-Right Politi­cians Around the World” by Casey New­ton; The Verge; 10/30/2018. [50]

The Links Between Social Media, Domes­tic Ter­ror­ism and the Retreat from Democ­ra­cy

It was an awful week­end of hate-fueled vio­lence, ugly rhetoric, and wor­ri­some retreats from our demo­c­ra­t­ic ideals. Today I’m focused on two ways of fram­ing what we’re see­ing, from the Unit­ed States to Brazil. While nei­ther offers any com­fort, they do give help­ful names to phe­nom­e­na I expect will be with us for a long while.

The first is sto­chas­tic ter­ror­ism [51]: “The use of mass, pub­lic com­mu­ni­ca­tion, usu­al­ly against a par­tic­u­lar indi­vid­ual or group, which incites or inspires acts of ter­ror­ism which are sta­tis­ti­cal­ly prob­a­ble but hap­pen seem­ing­ly at ran­dom.” I encoun­tered the idea in a Fri­day thread from data sci­en­tist Emi­ly Gorcens­ki [52], who used it to tie togeth­er four recent attacks.

In her thread, Gorcens­ki argues that var­i­ous right-wing con­spir­a­cy the­o­ries and frauds, ampli­fied both through main­stream and social media, have result­ed in a grow­ing num­ber of cas­es where men snap and com­mit vio­lence. “Right-wing media is a gra­di­ent push­ing right­wards, toward vio­lence and oppres­sion,” she wrote. “One of the symp­toms of this is that you are basi­cal­ly guar­an­teed to gen­er­ate ran­dom ter­ror­ists. Like pop­corn ker­nels pop­ping.”

On Sat­ur­day, anoth­er ker­nel popped. Robert A. Bow­ers, the sus­pect in a shoot­ing at a syn­a­gogue that left 11 peo­ple dead, was steeped in online con­spir­a­cy cul­ture. He post­ed fre­quent­ly to Gab, a Twit­ter clone that empha­sizes free speech and has become a favored social net­work among white nation­al­ists. Julie Turke­witz and Kevin Roose described his hate­ful views in the New York Times [81]:

After open­ing an account on it in Jan­u­ary, he had shared a stream of anti-Jew­ish slurs and con­spir­a­cy the­o­ries. It was on Gab where he found a like-mind­ed com­mu­ni­ty, repost­ing mes­sages from Nazi sup­port­ers.

“Jews are the chil­dren of Satan,” read Mr. Bowers’s biog­ra­phy.

Bowers is in custody — his life was saved by Jewish doctors and nurses [82] — and presumably will never go free again. Gab’s life, however, may be imperiled. Two payment processors, PayPal and Stripe, de-platformed the site, as did its cloud host, Joyent. The site went down on Monday [83] after its hosting provider, GoDaddy, told it to find another one. Its founder posted defiant messages on Twitter and elsewhere promising it would survive.

Gab hosts a lot of deeply upsetting content [84], and to its supporters, that’s the point. Free speech is a right, their reasoning goes, and it ought to be exercised. Certainly it seems wrong to suggest that Gab or any other single platform “caused” Bowers to act. Hatred, after all, is an ecosystem. But his action came amid a concerted effort to focus attention on a caravan of migrants coming to the United States in search of refuge.

Right-wing media, most notably Fox News, has advanced the idea that the car­a­van is linked to Jew­ish bil­lion­aire (and Holo­caust sur­vivor) George Soros [85]. An actu­al Con­gress­man, Flori­da Repub­li­can Matt Gaetz, sug­gest­ed the car­a­van was fund­ed by Soros [86]. Bow­ers enthu­si­as­ti­cal­ly pushed these con­spir­a­cy the­o­ries on social media [87].

In his final post on Gab, Bow­ers wrote [88]: “I can’t sit by and watch my peo­ple get slaugh­tered. Screw your optics. I’m going in.”

The indi­vid­ual act was ran­dom. But it had become sta­tis­ti­cal­ly prob­a­ble thanks to the rise of anti-immi­grant rhetoric across all man­ner of media. And I fear we will see far more of it before the cur­rent fever breaks.
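The “statistically probable but individually random” framing can be made concrete with a back-of-the-envelope calculation. The figures below are invented purely to illustrate the arithmetic: with a large exposed audience, even a vanishingly small per-person probability of acting makes at least one incident close to certain.

```python
# Toy illustration of stochastic terrorism arithmetic. All numbers are invented.
def prob_at_least_one_incident(audience_size, per_person_prob):
    """Probability that at least one of `audience_size` people acts,
    assuming each acts independently with probability `per_person_prob`."""
    return 1.0 - (1.0 - per_person_prob) ** audience_size

# e.g. one million people exposed, each with a one-in-a-million chance of acting
print(round(prob_at_least_one_incident(1_000_000, 1e-6), 3))  # ~0.632
```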

The sec­ond con­cept I’m think­ing about today is demo­c­ra­t­ic reces­sion. The idea, which is rough­ly a decade old, is that democ­ra­cy is in retreat around the globe. The Econ­o­mist cov­ered it in Jan­u­ary [89]:

The tenth edi­tion of the Econ­o­mist Intel­li­gence Unit’s Democ­ra­cy Index [90] sug­gests that this unwel­come trend remains firm­ly in place. The index, which com­pris­es 60 indi­ca­tors across five broad categories—electoral process and plu­ral­ism, func­tion­ing of gov­ern­ment, polit­i­cal par­tic­i­pa­tion, demo­c­ra­t­ic polit­i­cal cul­ture and civ­il liberties—concludes that less than 5% of the world’s pop­u­la­tion cur­rent­ly lives in a “full democ­ra­cy”. Near­ly a third live under author­i­tar­i­an rule, with a large share of those in Chi­na. Over­all, 89 of the 167 coun­tries assessed in 2017 received low­er scores than they had the year before.

In Jan­u­ary, The Econ­o­mist con­sid­ered Brazil a “flawed democ­ra­cy.” But after this week­end, the coun­try may under­go a more pre­cip­i­tous decline in demo­c­ra­t­ic free­doms. As expect­ed, far-right can­di­date Jair Bol­sonaro, who speaks approv­ing­ly of the country’s pre­vi­ous mil­i­tary dic­ta­tor­ship, hand­i­ly won elec­tion over his left­ist rival.

In the best piece I read today [91], BuzzFeed’s Ryan Broderick — who was in Brazil for the election — puts Bolsonaro’s election into the context of the internet and social platforms. Broderick focuses on the symbiosis between internet media, which excels at promoting a sense of perpetual crisis and outrage, and far-right leaders who promise a return to normalcy.

Typ­i­cal­ly, large right-wing news chan­nels or con­ser­v­a­tive tabloids will then take these sto­ries going viral on Face­book and repack­age them for old­er, main­stream audi­ences. Depend­ing on your country’s media land­scape, the far-right trolls and influ­encers may try to hijack this social-media-to-news­pa­per-to-tele­vi­sion pipeline. Which then cre­ates more con­tent to screen­shot, meme, and share. It’s a feed­back loop.

Populist leaders and the legions of influencers riding their wave know they can create filter bubbles inside of platforms like Facebook or YouTube that promise a safer time, one that never existed in the first place, before the protests, the violence, the cascading crises, and endless news cycles. Donald Trump wants to Make America Great Again; Bolsonaro wants to bring back Brazil’s military dictatorship; Shinzo Abe wants to recapture Japan’s imperial past; Germany’s AfD performed the best with older East German voters longing for the days of authoritarianism [92]. All of these leaders promise to close borders, to make things safe. Which will, of course, usually exacerbate the problems they’re promising to disappear. Another feedback loop.

A third feed­back loop, of course, is between a social media ecosys­tem pro­mot­ing a sense of per­pet­u­al cri­sis and out­rage, and the ran­dom-but-sta­tis­ti­cal­ly-prob­a­ble pro­duc­tion of domes­tic ter­ror­ists.

Per­haps the glob­al rise of author­i­tar­i­ans and big tech plat­forms are mere­ly cor­re­lat­ed, and no cau­sa­tion can be proved. But I increas­ing­ly won­der whether we would ben­e­fit if tech com­pa­nies assumed that some lev­el of cau­sa­tion was real — and, assum­ing that it is, what they might do about it.

DEMOCRACY

On Social Media, No Answers for Hate [93]

You don’t have to go to Gab to see hate­ful posts. Sheera Frenkel, Mike Isaac, and Kate Con­ger report on how the past week’s domes­tic ter­ror attacks play out on once-hap­pi­er places, most notably Insta­gram:

On Mon­day, a search on Insta­gram, the pho­to-shar­ing site owned by Face­book, pro­duced a tor­rent of anti-Semit­ic images and videos uploaded in the wake of Saturday’s shoot­ing at a Pitts­burgh syn­a­gogue.

A search for the word “Jews” dis­played 11,696 posts with the hash­tag “#jewsdid911,” claim­ing that Jews had orches­trat­ed the Sept. 11 ter­ror attacks. Oth­er hash­tags on Insta­gram ref­er­enced Nazi ide­ol­o­gy, includ­ing the num­ber 88, an abbre­vi­a­tion used for the Nazi salute “Heil Hitler.”

Attacks on Jew­ish peo­ple ris­ing on Insta­gram and Twit­ter, researchers say [94]

Just before the syn­a­gogue attack took place on Sat­ur­day, David Ingram post­ed this sto­ry about an alarm­ing rise in attacks on Jews on social plat­forms:

Samuel Wool­ley, a social media researcher who worked on the study, ana­lyzed more than 7 mil­lion tweets from August and Sep­tem­ber and found an array of attacks, also often linked to Soros. About a third of the attacks on Jews came from auto­mat­ed accounts known as “bots,” he said.

“It’s real­ly spik­ing dur­ing this elec­tion,” Wool­ley, direc­tor of the Dig­i­tal Intel­li­gence Lab­o­ra­to­ry, which stud­ies the inter­sec­tion of tech­nol­o­gy and soci­ety, said in a tele­phone inter­view. “We’re see­ing what we think is an attempt to silence con­ver­sa­tions in the Jew­ish com­mu­ni­ty.”

Russ­ian dis­in­for­ma­tion on Face­book tar­get­ed Ukraine well before the 2016 U.S. elec­tion [95]

Dana Priest, James Jaco­by and Anya Bourg report that Ukraine’s expe­ri­ence with infor­ma­tion war­fare offered an ear­ly — and unheed­ed — warn­ing to Face­book:

To get Zuckerberg’s attention, the [Ukrainian] president posted a question for a town hall meeting at Facebook’s Silicon Valley headquarters. There, a moderator read it aloud.

“Mark, will you estab­lish a Face­book office in Ukraine?” the mod­er­a­tor said, chuck­ling, accord­ing to a video of the assem­bly. The room of young employ­ees rip­pled with laugh­ter. But the government’s sug­ges­tion was seri­ous: It believed that a Kiev office, staffed with peo­ple famil­iar with Ukraine’s polit­i­cal sit­u­a­tion, could help solve Facebook’s high-lev­el igno­rance about Russ­ian infor­ma­tion war­fare. . . . .