Spitfire List Web site and blog of anti-fascist researcher and radio personality Dave Emory.

For The Record  

FTR #1077 Surveillance Valley, Part 3: Cambridge Analytica, Democracy and Counterinsurgency

Dave Emory’s entire lifetime of work is available on a flash drive that can be obtained HERE. The new drive is a 32-gigabyte drive that is current as of the programs and articles posted by the fall of 2017. The new drive is available for a tax-deductible contribution of $65.00 or more.

WFMU-FM is podcasting For The Record. You can subscribe to the podcast HERE.

You can subscribe to e-mail alerts from Spitfirelist.com HERE.

You can subscribe to the RSS feed from Spitfirelist.com HERE.

You can subscribe to the comments made on programs and posts, an excellent source of information in and of itself, HERE.

Please consider supporting THE WORK DAVE EMORY DOES.

This broadcast was recorded in one 60-minute segment.

Carl Schmitt, on the right: arguably Nazi Germany’s top legal theoretician and a dominant influence on the thinking of Facebook and Palantir kingpin Peter Thiel.

Introduction: Continuing the discussion from FTR #1076, the broadcast recaps key aspects of the analysis of the Cambridge Analytica scandal.

In our last program, we noted that both the internet (DARPA projects including Project Agile) and the German Nazi Party had their origins as counterinsurgency gambits. Noting Hitler’s speech before the Industry Club of Dusseldorf, in which he equated communism with democracy, we highlight how the Cambridge Analytica scandal reflects the counterinsurgency origins of the Internet, and how the Cambridge Analytica affair embodies anti-democracy as counterinsurgency.

Key aspects of the Cambridge Analytica affair include:

  1. The use of psychographic personality testing on Facebook for political advantage: ” . . . . For several years, a data firm eventually hired by the Trump campaign, Cambridge Analytica, has been using Facebook as a tool to build psychological profiles that represent some 230 million adult Americans. A spinoff of a British consulting company and sometime-defense contractor known for its counterterrorism ‘psy ops’ work in Afghanistan, the firm does so by seeding the social network with personality quizzes. Respondents — by now hundreds of thousands of us, mostly female and mostly young but enough male and older for the firm to make inferences about others with similar behaviors and demographics — get a free look at their Ocean scores. Cambridge Analytica also gets a look at their scores and, thanks to Facebook, gains access to their profiles and real names. . . .”
  2. The parent company of Cambridge Analytica–SCL–was deeply involved with counterterrorism “psy-ops” in Afghanistan, embodying the essence of the counterinsurgency dynamic at the root of the development of the Internet. The use of online data to subvert democracy recalls Hitler’s speech to the Industry Club of Dusseldorf, in which he equated democracy with communism: ” . . . . Cambridge Analytica was a company spun out of SCL Group, a British military contractor that worked in information operations for armed forces around the world. It was conducting research on how to scale and digitise information warfare – the use of information to confuse or degrade the efficacy of an enemy. . . . As director of research, Wylie’s original role was to map out how the company would take traditional information operations tactics into the online space – in particular, by profiling people who would be susceptible to certain messaging. This morphed into the political arena. After Wylie left, the company worked on Donald Trump’s US presidential campaign . . . .”
  3. Cambridge Analytica whistleblower Christopher Wylie’s observations on the anti-democratic nature of the firm’s work: ” . . . . It was this shift from the battlefield to politics that made Wylie uncomfortable. ‘When you are working in information operations projects, where your target is a combatant, the autonomy or agency of your targets is not your primary consideration. It is fair game to deny and manipulate information, coerce and exploit any mental vulnerabilities a person has, and to bring out the very worst characteristics in that person because they are an enemy,’ he says. ‘But if you port that over to a democratic system, if you run campaigns designed to undermine people’s ability to make free choices and to understand what is real and not real, you are undermining democracy and treating voters in the same way as you are treating terrorists.’ . . . .”
  4. Wylie’s observations on how Cambridge Analytica’s methodology can be used to build a fascist political movement: ” . . . . One of the reasons these techniques are so insidious is that being a target of a disinformation campaign is ‘usually a pleasurable experience’, because you are being fed content with which you are likely to agree. ‘You are being guided through something that you want to be true,’ Wylie says. To build an insurgency, he explains, you first target people who are more prone to having erratic traits, paranoia or conspiratorial thinking, and get them to ‘like’ a group on social media. They start engaging with the content, which may or may not be true; either way ‘it feels good to see that information’. When the group reaches 1,000 or 2,000 members, an event is set up in the local area. Even if only 5% show up, ‘that’s 50 to 100 people flooding a local coffee shop’, Wylie says. This, he adds, validates their opinion because other people there are also talking about ‘all these things that you’ve been seeing online in the depths of your den and getting angry about’. People then start to believe the reason it’s not shown on mainstream news channels is because ‘they don’t want you to know what the truth is’. As Wylie sums it up: ‘What started out as a fantasy online gets ported into the temporal world and becomes real to you because you see all these people around you.’ . . . .”
  5. Wylie’s observation that Facebook was “All In” on the Cambridge Analytica machinations: ” . . . . ‘Facebook has known about what Cambridge Analytica was up to from the very beginning of those projects,’ Wylie claims. ‘They were notified, they authorised the applications, they were given the terms and conditions of the app that said explicitly what it was doing. They hired people who worked on building the app. I had legal correspondence with their lawyers where they acknowledged it happened as far back as 2016.’ . . . .”
  6. The decisive participation of “Spy Tech” firm Palantir in the Cambridge Analytica operation: Peter Thiel’s surveillance firm Palantir was apparently deeply involved with Cambridge Analytica’s gaming of personal data harvested from Facebook in order to engineer an electoral victory for Trump. Thiel was an early investor in Facebook, at one point was its largest shareholder and is still one of its largest shareholders. In addition to his opposition to democracy because it allegedly is inimical to wealth creation, Thiel doesn’t think women should be allowed to vote and holds Nazi legal theoretician Carl Schmitt in high regard. ” . . . . It was a Palantir employee in London, working closely with the data scientists building Cambridge’s psychological profiling technology, who suggested the scientists create their own app — a mobile-phone-based personality quiz — to gain access to Facebook users’ friend networks, according to documents obtained by The New York Times. The revelations pulled Palantir — co-founded by the wealthy libertarian Peter Thiel — into the furor surrounding Cambridge, which improperly obtained Facebook data to build analytical tools it deployed on behalf of Donald J. Trump and other Republican candidates in 2016. Mr. Thiel, a supporter of President Trump, serves on the board at Facebook. ‘There were senior Palantir employees that were also working on the Facebook data,’ said Christopher Wylie, a data expert and Cambridge Analytica co-founder, in testimony before British lawmakers on Tuesday. . . . The connections between Palantir and Cambridge Analytica were thrust into the spotlight by Mr. Wylie’s testimony on Tuesday. Both companies are linked to tech-driven billionaires who backed Mr. Trump’s campaign: Cambridge is chiefly owned by Robert Mercer, the computer scientist and hedge fund magnate, while Palantir was co-founded in 2003 by Mr. Thiel, who was an initial investor in Facebook. . . .”
  7. The use of “dark posts” by the Cambridge Analytica team. (We have noted that Brad Parscale has reassembled the old Cambridge Analytica team for Trump’s 2020 election campaign. It seems probable that AOC’s millions of online followers, as well as the “Bernie Bots,” will be getting “dark posts” crafted by AIs scanning their online efforts.) ” . . . . One recent advertising product on Facebook is the so-called ‘dark post’: A newsfeed message seen by no one aside from the users being targeted. With the help of Cambridge Analytica, Mr. Trump’s digital team used dark posts to serve different ads to different potential voters, aiming to push the exact right buttons for the exact right people at the exact right times. . . .”

Peter Thiel

Supplementing the discussion about Cambridge Analytica, the program reviews information from FTR #718 about Facebook’s apparent involvement with elements and individuals linked to CIA and DARPA: ” . . . . Facebook’s most recent round of funding was led by a company called Greylock Venture Capital, who put in the sum of $27.5m. One of Greylock’s senior partners is called Howard Cox, another former chairman of the NVCA, who is also on the board of In-Q-Tel. What’s In-Q-Tel? Well, believe it or not (and check out their website), this is the venture-capital wing of the CIA. After 9/11, the US intelligence community became so excited by the possibilities of new technology and the innovations being made in the private sector, that in 1999 they set up their own venture capital fund, In-Q-Tel, which ‘identifies and partners with companies developing cutting-edge technologies to help deliver these solutions to the Central Intelligence Agency and the broader US Intelligence Community (IC) to further their missions’. . . .”

More about the CIA/DARPA links to the development of Facebook: ” . . . . The second round of funding into Facebook ($US12.7 million) came from venture capital firm Accel Partners. Its manager James Breyer was formerly chairman of the National Venture Capital Association, and served on the board with Gilman Louie, CEO of In-Q-Tel, a venture capital firm established by the Central Intelligence Agency in 1999. One of the company’s key areas of expertise are in ‘data mining technologies’. Breyer also served on the board of R&D firm BBN Technologies, which was one of those companies responsible for the rise of the internet. Dr Anita Jones joined the firm, which included Gilman Louie. She had also served on the In-Q-Tel’s board, and had been director of Defence Research and Engineering for the US Department of Defence. She was also an adviser to the Secretary of Defence and overseeing the Defence Advanced Research Projects Agency (DARPA), which is responsible for high-tech, high-end development. . . .”

Oleh Tyahnybok, leader of the OUN/B successor organization Svoboda, to which Facebook’s Kateryna Kruk belonged.

Program Highlights Include: Review of Facebook’s plans to use brain-to-computer technology to operate its platform, thereby enabling the recording and databasing of people’s thoughts; Review of Facebook’s employment of former DARPA head Regina Dugan to implement the brain-to-computer technology; Review of Facebook’s Building 8, designed to duplicate DARPA; Review of Facebook’s hiring of the Atlantic Council to police the social medium’s online content; Review of Facebook’s partnering with Narendra Modi’s Hindutva fascist government in India; Review of Facebook’s employment of Ukrainian fascist Kateryna Kruk to manage the social medium’s Ukrainian content.

1a. Facebook personality tests that allegedly let you learn what makes you tick also let whoever set up the test learn what makes you tick. Since the tests are done through Facebook, their creators can match your results with your real identity.

If the Facebook personality test in question happens to report your “Ocean score” (Openness, Conscientiousness, Extraversion, Agreeableness and Neuroticism), that means the test you’re taking was created by Cambridge Analytica, a company with one of Donald Trump’s billionaire sugar-daddies, Robert Mercer, as a major investor. And it’s Cambridge Analytica that gets to learn all those fun facts about your psychological profile too. And Steve Bannon sat on its board:

“The Secret Agenda of a Facebook Quiz” by McKenzie Funk; The New York Times; 1/19/2017.

Do you panic easily? Do you often feel blue? Do you have a sharp tongue? Do you get chores done right away? Do you believe in the importance of art?

If ever you’ve answered questions like these on one of the free personality quizzes floating around Facebook, you’ll have learned what’s known as your Ocean score: How you rate according to the big five psychological traits of Openness, Conscientiousness, Extraversion, Agreeableness and Neuroticism. You may also be responsible the next time America is shocked by an election upset.

For several years, a data firm eventually hired by the Trump campaign, Cambridge Analytica, has been using Facebook as a tool to build psychological profiles that represent some 230 million adult Americans. A spinoff of a British consulting company and sometime-defense contractor known for its counterterrorism “psy ops” work in Afghanistan, the firm does so by seeding the social network with personality quizzes. Respondents — by now hundreds of thousands of us, mostly female and mostly young but enough male and older for the firm to make inferences about others with similar behaviors and demographics — get a free look at their Ocean scores. Cambridge Analytica also gets a look at their scores and, thanks to Facebook, gains access to their profiles and real names.

Cambridge Analytica worked on the “Leave” side of the Brexit campaign. In the United States it takes only Republicans as clients: Senator Ted Cruz in the primaries, Mr. Trump in the general election. Cambridge is reportedly backed by Robert Mercer, a hedge fund billionaire and a major Republican donor; a key board member is Stephen K. Bannon, the head of Breitbart News who became Mr. Trump’s campaign chairman and is set to be his chief strategist in the White House.

In the age of Facebook, it has become far easier for campaigners or marketers to combine our online personas with our offline selves, a process that was once controversial but is now so commonplace that there’s a term for it, “onboarding.” Cambridge Analytica says it has as many as 3,000 to 5,000 data points on each of us, be it voting histories or full-spectrum demographics — age, income, debt, hobbies, criminal histories, purchase histories, religious leanings, health concerns, gun ownership, car ownership, homeownership — from consumer-data giants.

No data point is very informative on its own, but profiling voters, says Cambridge Analytica, is like baking a cake. “It’s the sum of the ingredients,” its chief executive officer, Alexander Nix, told NBC News. Because the United States lacks European-style restrictions on second- or thirdhand use of our data, and because our freedom-of-information laws give data brokers broad access to the intimate records kept by local and state governments, our lives are open books even without social media or personality quizzes.

Ever since the advertising executive Lester Wunderman coined the term “direct marketing” in 1961, the ability to target specific consumers with ads — rather than blanketing the airwaves with mass appeals and hoping the right people will hear them — has been the marketer’s holy grail. What’s new is the efficiency with which individually tailored digital ads can be tested and matched to our personalities. Facebook is the microtargeter’s ultimate weapon.

The explosive growth of Facebook’s ad business has been overshadowed by its increasing role in how we get our news, real or fake. In July, the social network posted record earnings: quarterly sales were up 59 percent from the previous year, and profits almost tripled to $2.06 billion. While active users of Facebook — now 1.71 billion monthly active users — were up 15 percent, the real story was how much each individual user was worth. The company makes $3.82 a year from each global user, up from $2.76 a year ago, and an average of $14.34 per user in the United States, up from $9.30 a year ago. Much of this growth comes from the fact that advertisers not only have an enormous audience in Facebook but an audience they can slice into the tranches they hope to reach.

One recent advertising product on Facebook is the so-called “dark post”: A newsfeed message seen by no one aside from the users being targeted. With the help of Cambridge Analytica, Mr. Trump’s digital team used dark posts to serve different ads to different potential voters, aiming to push the exact right buttons for the exact right people at the exact right times.

Imagine the full capability of this kind of “psychographic” advertising. In future Republican campaigns, a pro-gun voter whose Ocean score ranks him high on neuroticism could see storm clouds and a threat: The Democrat wants to take his guns away. A separate pro-gun voter deemed agreeable and introverted might see an ad emphasizing tradition and community values, a father and son hunting together.

In this election, dark posts were used to try to suppress the African-American vote. According to Bloomberg, the Trump campaign sent ads reminding certain selected black voters of Hillary Clinton’s infamous “super predator” line. It targeted Miami’s Little Haiti neighborhood with messages about the Clinton Foundation’s troubles in Haiti after the 2010 earthquake. Federal Election Commission rules are unclear when it comes to Facebook posts, but even if they do apply and the facts are skewed and the dog whistles loud, the already weakening power of social opprobrium is gone when no one else sees the ad you see — and no one else sees “I’m Donald Trump, and I approved this message.”

While Hillary Clinton spent more than $140 million on television spots, old-media experts scoffed at Trump’s lack of old-media ad buys. Instead, his campaign pumped its money into digital, especially Facebook. One day in August, it flooded the social network with 100,000 ad variations, so-called A/B testing on a biblical scale, surely more ads than could easily be vetted by human eyes for compliance with Facebook’s “community standards.”

1b. Christopher Wylie–the former head of research at Cambridge Analytica who became one of the key insider whistleblowers about how Cambridge Analytica operated and the extent of Facebook’s knowledge of it–gave an interview last month to Campaign Magazine. (We dealt with Cambridge Analytica in FTR #‘s 946 and 1021.)

Wylie recounts how, as director of research at Cambridge Analytica, his original role was to determine how the company could use the information warfare techniques employed by SCL Group – Cambridge Analytica’s parent company and a defense contractor providing psy-op services for the British military. Wylie’s job was to adapt the psychological warfare strategies that SCL had been using on the battlefield to the online space. As Wylie put it:

“ . . . . When you are working in information operations projects, where your target is a combatant, the autonomy or agency of your targets is not your primary consideration. It is fair game to deny and manipulate information, coerce and exploit any mental vulnerabilities a person has, and to bring out the very worst characteristics in that person because they are an enemy…But if you port that over to a democratic system, if you run campaigns designed to undermine people’s ability to make free choices and to understand what is real and not real, you are undermining democracy and treating voters in the same way as you are treating terrorists. . . . .”

Wylie also draws parallels between the psychological operations used on democratic audiences and the battlefield techniques used to build an insurgency. It starts with targeting people more prone to having erratic traits, paranoia or conspiratorial thinking, and getting them to “like” a group on social media. The information you’re feeding this target audience may or may not be real. The important thing is that it’s content they already agree with, so that “it feels good to see that information.” Keep in mind that one of the goals of Cambridge Analytica’s “psychographic profiling” was to identify traits like neuroticism.
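Wylie’s claim that an algorithm can “find patterns that correlate to underlying psychological profiles” from innocuous data can be made concrete with a minimal sketch. Everything here is hypothetical: the feature names, weights, bias and threshold are invented for illustration, and a real system would fit such weights by regression over large labeled quiz datasets rather than set them by hand.

```python
import math

# Hypothetical weights linking innocuous binary "like" signals to a
# neuroticism propensity score (illustrative values, not real data).
WEIGHTS = {
    "likes_conspiracy_page": 1.2,
    "likes_gardening_page": -0.4,
    "likes_news_page": 0.1,
}
BIAS = -0.5

def neuroticism_score(likes):
    """Logistic score in (0, 1) from a dict of 0/1 like-features."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in likes.items())
    return 1 / (1 + math.exp(-z))

def pick_targets(profiles, threshold=0.6):
    """Return the user ids whose inferred score crosses the targeting threshold."""
    return [uid for uid, likes in profiles.items()
            if neuroticism_score(likes) >= threshold]

users = {
    "u1": {"likes_conspiracy_page": 1, "likes_gardening_page": 0, "likes_news_page": 1},
    "u2": {"likes_conspiracy_page": 0, "likes_gardening_page": 1, "likes_news_page": 0},
}
print(pick_targets(users))  # prints ['u1']
```

The point of the sketch is Wylie’s: no single “like” is revealing, but a weighted combination of many of them yields a usable trait estimate, and a threshold on that estimate yields a target list.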

Wylie goes on to describe the next step in this insurgency-building technique: keep building up interest in the social media group that you’re directing this target audience towards until it hits around 1,000–2,000 people. Then set up a real-life event dedicated to the chosen disinformation topic in some local area and try to get as many of your target audience as possible to show up. Even if only 5 percent of them show up, that’s still 50–100 people converging on some local coffee shop or whatever. The people meet each other in real life and start talking about “all these things that you’ve been seeing online in the depths of your den and getting angry about”. This target audience starts believing that no one else is talking about this stuff because “they don’t want you to know what the truth is”. As Wylie puts it, “What started out as a fantasy online gets ported into the temporal world and becomes real to you because you see all these people around you.”
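The arithmetic of that funnel is worth making explicit. A minimal sketch, using only the figures Wylie cites (a 1,000–2,000-member group and a 5% show-up rate; the function name is ours):

```python
# Back-of-the-envelope model of the funnel Wylie describes: the 5% show-up
# rate and the 1,000-2,000 group sizes come from his account; the code is
# just the arithmetic.

def expected_turnout(group_size, show_up_rate=0.05):
    """Expected attendees at a local event drawn from an online group."""
    return int(group_size * show_up_rate)

for size in (1000, 2000):
    print(f"{size} members -> {expected_turnout(size)} at the coffee shop")
# prints:
# 1000 members -> 50 at the coffee shop
# 2000 members -> 100 at the coffee shop
```

Small absolute numbers, but concentrated in one place on one evening, which is exactly why a 5% conversion is enough to make the online fantasy feel locally real.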

“Cambridge Analytica whistleblower Christopher Wylie: It’s time to save creativity” by Kate Magee; Campaign; 11/05/2018.

In the early hours of 17 March 2018, the 28-year-old Christopher Wylie tweeted: “Here we go….”

Later that day, The Observer and The New York Times published the story of Cambridge Analytica’s misuse of Facebook data, which sent shockwaves around the world, caused millions to #DeleteFacebook, and led the UK Information Commissioner’s Office to fine the site the maximum penalty for failing to protect users’ information. Six weeks after the story broke, Cambridge Analytica closed. . . .

. . . . He believes that poor use of data is killing good ideas. And that, unless effective regulation is enacted, society’s worship of algorithms, unchecked data capture and use, and the likely spread of AI to all parts of our lives is causing us to sleepwalk into a bleak future.

Not only are such circumstances a threat to adland – why do you need an ad to tell you about a product if an algorithm is choosing it for you? – it is a threat to human free will. “Currently, the only morality of the algorithm is to optimise you as a consumer and, in many cases, you become the product. There are very few examples in human history of industries where people themselves become products and those are scary industries – slavery and the sex trade. And now, we have social media,” Wylie says.

“The problem with that, and what makes it inherently different to selling, say, toothpaste, is that you’re selling parts of people or access to people. People have an innate moral worth. If we don’t respect that, we can create industries that do terrible things to people. We are [heading] blindly and quickly into an environment where this mentality is going to be amplified through AI everywhere. We’re humans, we should be thinking about people first.”

His words carry weight, because he’s been on the dark side. He has seen what can happen when data is used to spread misinformation, create insurgencies and prey on the worst of people’s characters.

The political battlefield

A quick refresher on the scandal, in Wylie’s words: Cambridge Analytica was a company spun out of SCL Group, a British military contractor that worked in information operations for armed forces around the world. It was conducting research on how to scale and digitise information warfare – the use of information to confuse or degrade the efficacy of an enemy. . . .

. . . . As director of research, Wylie’s original role was to map out how the company would take traditional information operations tactics into the online space – in particular, by profiling people who would be susceptible to certain messaging.

This morphed into the political arena. After Wylie left, the company worked on Donald Trump’s US presidential campaign . . . .

. . . . It was this shift from the battlefield to politics that made Wylie uncomfortable. “When you are working in information operations projects, where your target is a combatant, the autonomy or agency of your targets is not your primary consideration. It is fair game to deny and manipulate information, coerce and exploit any mental vulnerabilities a person has, and to bring out the very worst characteristics in that person because they are an enemy,” he says.

“But if you port that over to a democratic system, if you run campaigns designed to undermine people’s ability to make free choices and to understand what is real and not real, you are undermining democracy and treating voters in the same way as you are treating terrorists.”

One of the reasons these techniques are so insidious is that being a target of a disinformation campaign is “usually a pleasurable experience”, because you are being fed content with which you are likely to agree. “You are being guided through something that you want to be true,” Wylie says.

To build an insurgency, he explains, you first target people who are more prone to having erratic traits, paranoia or conspiratorial thinking, and get them to “like” a group on social media. They start engaging with the content, which may or may not be true; either way “it feels good to see that information”.

When the group reaches 1,000 or 2,000 members, an event is set up in the local area. Even if only 5% show up, “that’s 50 to 100 people flooding a local coffee shop”, Wylie says. This, he adds, validates their opinion because other people there are also talking about “all these things that you’ve been seeing online in the depths of your den and getting angry about”.

People then start to believe the reason it’s not shown on mainstream news channels is because “they don’t want you to know what the truth is”. As Wylie sums it up: “What started out as a fantasy online gets ported into the temporal world and becomes real to you because you see all these people around you.” . . . .

. . . . Psychographic potential

. . . . But Wylie argues that people underestimate what algorithms allow you to do in profiling. “I can take pieces of information about you that seem innocuous, but what I’m able to do with an algorithm is find patterns that correlate to underlying psychological profiles,” he explains.

“I can ask whether you listen to Justin Bieber, and you won’t feel like I’m invading your privacy. You aren’t necessarily aware that when you tell me what music you listen to or what TV shows you watch, you are telling me some of your deepest and most personal attributes.” . . . .

. . . . Clashes with Facebook

Wylie is opposed to self-regulation, because industries won’t become consumer champions – they are, he says, too conflicted.

“Facebook has known about what Cambridge Analytica was up to from the very beginning of those projects,” Wylie claims. “They were notified, they authorised the applications, they were given the terms and conditions of the app that said explicitly what it was doing. They hired people who worked on building the app. I had legal correspondence with their lawyers where they acknowledged it happened as far back as 2016.” . . . .

1c. In FTR #946, we examined Cambridge Analytica, the Trump- and Steve Bannon-linked tech firm that harvested Facebook data on behalf of the Trump campaign.

Peter Thiel’s surveillance firm Palantir was apparently deeply involved with Cambridge Analytica’s gaming of personal data harvested from Facebook in order to engineer an electoral victory for Trump. Thiel was an early investor in Facebook, at one point was its largest shareholder and is still one of its largest shareholders. ” . . . . It was a Palantir employee in London, working closely with the data scientists building Cambridge’s psychological profiling technology, who suggested the scientists create their own app — a mobile-phone-based personality quiz — to gain access to Facebook users’ friend networks, according to documents obtained by The New York Times. The revelations pulled Palantir — co-founded by the wealthy libertarian Peter Thiel — into the furor surrounding Cambridge, which improperly obtained Facebook data to build analytical tools it deployed on behalf of Donald J. Trump and other Republican candidates in 2016. Mr. Thiel, a supporter of President Trump, serves on the board at Facebook. ‘There were senior Palantir employees that were also working on the Facebook data,’ said Christopher Wylie, a data expert and Cambridge Analytica co-founder, in testimony before British lawmakers on Tuesday. . . . The connections between Palantir and Cambridge Analytica were thrust into the spotlight by Mr. Wylie’s testimony on Tuesday. Both companies are linked to tech-driven billionaires who backed Mr. Trump’s campaign: Cambridge is chiefly owned by Robert Mercer, the computer scientist and hedge fund magnate, while Palantir was co-founded in 2003 by Mr. Thiel, who was an initial investor in Facebook. . . .”

“Spy Contractor’s Idea Helped Cambridge Analytica Harvest Facebook Data” by Nicholas Confessore and Matthew Rosenberg; The New York Times; 03/27/2018.

As a start-up called Cambridge Analytica sought to harvest the Facebook data of tens of millions of Americans in summer 2014, the company received help from at least one employee at Palantir Technologies, a top Silicon Valley contractor to American spy agencies and the Pentagon. It was a Palantir employee in London, working closely with the data scientists building Cambridge’s psychological profiling technology, who suggested the scientists create their own app — a mobile-phone-based personality quiz — to gain access to Facebook users’ friend networks, according to documents obtained by The New York Times.

Cambridge ultimately took a similar approach. By early summer, the company found a university researcher to harvest data using a personality questionnaire and Facebook app. The researcher scraped private data from over 50 million Facebook users — and Cambridge Analytica went into business selling so-called psychometric profiles of American voters, setting itself on a collision course with regulators and lawmakers in the United States and Britain.

The revelations pulled Palantir — co-founded by the wealthy libertarian Peter Thiel — into the furor surrounding Cambridge, which improperly obtained Facebook data to build analytical tools it deployed on behalf of Donald J. Trump and other Republican candidates in 2016. Mr. Thiel, a supporter of President Trump, serves on the board at Facebook.

“There were senior Palantir employees that were also working on the Facebook data,” said Christopher Wylie, a data expert and Cambridge Analytica co-founder, in testimony before British lawmakers on Tuesday. . . .

. . . . The connections between Palantir and Cambridge Analytica were thrust into the spotlight by Mr. Wylie’s testimony on Tuesday. Both companies are linked to tech-driven billionaires who backed Mr. Trump’s campaign: Cambridge is chiefly owned by Robert Mercer, the computer scientist and hedge fund magnate, while Palantir was co-founded in 2003 by Mr. Thiel, who was an initial investor in Facebook. . . .

. . . . Documents and interviews indicate that starting in 2013, Mr. Chmieliauskas began corresponding with Mr. Wylie and a colleague from his Gmail account. At the time, Mr. Wylie and the colleague worked for the British defense and intelligence contractor SCL Group, which formed Cambridge Analytica with Mr. Mercer the next year. The three shared Google documents to brainstorm ideas about using big data to create sophisticated behavioral profiles, a product code-named “Big Daddy.”

A for­mer intern at SCL — Sophie Schmidt, the daugh­ter of Eric Schmidt, then Google’s exec­u­tive chair­man — urged the com­pa­ny to link up with Palan­tir, accord­ing to Mr. Wylie’s tes­ti­mo­ny and a June 2013 email viewed by The Times.

“Ever come across Palan­tir. Amus­ing­ly Eric Schmidt’s daugh­ter was an intern with us and is try­ing to push us towards them?” one SCL employ­ee wrote to a col­league in the email.

. . . . But he [Wylie] said some Palan­tir employ­ees helped engi­neer Cambridge’s psy­cho­graph­ic mod­els.

“There were Palan­tir staff who would come into the office and work on the data,” Mr. Wylie told law­mak­ers. “And we would go and meet with Palan­tir staff at Palan­tir.” He did not pro­vide an exact num­ber for the employ­ees or iden­ti­fy them.

Palan­tir employ­ees were impressed with Cambridge’s back­ing from Mr. Mer­cer, one of the world’s rich­est men, accord­ing to mes­sages viewed by The Times. And Cam­bridge Ana­lyt­i­ca viewed Palantir’s Sil­i­con Val­ley ties as a valu­able resource for launch­ing and expand­ing its own busi­ness.

In an inter­view this month with The Times, Mr. Wylie said that Palan­tir employ­ees were eager to learn more about using Face­book data and psy­cho­graph­ics. Those dis­cus­sions con­tin­ued through spring 2014, accord­ing to Mr. Wylie.

Mr. Wylie said that he and Mr. Nix vis­it­ed Palantir’s Lon­don office on Soho Square. One side was set up like a high-secu­ri­ty office, Mr. Wylie said, with sep­a­rate rooms that could be entered only with par­tic­u­lar codes. The oth­er side, he said, was like a tech start-up — “weird inspi­ra­tional quotes and stuff on the wall and free beer, and there’s a Ping-Pong table.”

Mr. Chmieli­auskas con­tin­ued to com­mu­ni­cate with Mr. Wylie’s team in 2014, as the Cam­bridge employ­ees were locked in pro­tract­ed nego­ti­a­tions with a researcher at Cam­bridge Uni­ver­si­ty, Michal Kosin­s­ki, to obtain Face­book data through an app Mr. Kosin­s­ki had built. The data was cru­cial to effi­cient­ly scale up Cambridge’s psy­cho­met­rics prod­ucts so they could be used in elec­tions and for cor­po­rate clients. . . .

2a. There are indi­ca­tions that ele­ments in and/or asso­ci­at­ed with the CIA and Pentagon/DARPA were involved with Face­book almost from the begin­ning: ” . . . . Face­book’s most recent round of fund­ing was led by a com­pa­ny called Grey­lock Ven­ture Cap­i­tal, who put in the sum of $27.5m. One of Grey­lock­’s senior part­ners is called Howard Cox, anoth­er for­mer chair­man of the NVCA, who is also on the board of In-Q-Tel. What’s In-Q-Tel? Well, believe it or not (and check out their web­site), this is the ven­ture-cap­i­tal wing of the CIA. After 9/11, the US intel­li­gence com­mu­ni­ty became so excit­ed by the pos­si­bil­i­ties of new tech­nol­o­gy and the inno­va­tions being made in the pri­vate sec­tor, that in 1999 they set up their own ven­ture cap­i­tal fund, In-Q-Tel, which ‘iden­ti­fies and part­ners with com­pa­nies devel­op­ing cut­ting-edge tech­nolo­gies to help deliv­er these solu­tions to the Cen­tral Intel­li­gence Agency and the broad­er US Intel­li­gence Com­mu­ni­ty (IC) to fur­ther their mis­sions’. . . .”

“With Friends Like These . . .” by Tom Hodgkin­son; guardian.co.uk; 1/14/2008.

. . . . The third board mem­ber of Face­book is Jim Brey­er. He is a part­ner in the ven­ture cap­i­tal firm Accel Part­ners, who put $12.7m into Face­book in April 2005. On the board of such US giants as Wal-Mart and Mar­vel Enter­tain­ment, he is also a for­mer chair­man of the Nation­al Ven­ture Cap­i­tal Asso­ci­a­tion (NVCA). Now these are the peo­ple who are real­ly mak­ing things hap­pen in Amer­i­ca, because they invest in the new young tal­ent, the Zucker­bergs and the like. Face­book’s most recent round of fund­ing was led by a com­pa­ny called Grey­lock Ven­ture Cap­i­tal, who put in the sum of $27.5m. One of Grey­lock­’s senior part­ners is called Howard Cox, anoth­er for­mer chair­man of the NVCA, who is also on the board of In-Q-Tel. What’s In-Q-Tel? Well, believe it or not (and check out their web­site), this is the ven­ture-cap­i­tal wing of the CIA. After 9/11, the US intel­li­gence com­mu­ni­ty became so excit­ed by the pos­si­bil­i­ties of new tech­nol­o­gy and the inno­va­tions being made in the pri­vate sec­tor, that in 1999 they set up their own ven­ture cap­i­tal fund, In-Q-Tel, which “iden­ti­fies and part­ners with com­pa­nies devel­op­ing cut­ting-edge tech­nolo­gies to help deliv­er these solu­tions to the Cen­tral Intel­li­gence Agency and the broad­er US Intel­li­gence Com­mu­ni­ty (IC) to fur­ther their mis­sions”. . . .

2b.  More about the CIA/Pentagon link to the devel­op­ment of Face­book: ” . . . . The sec­ond round of fund­ing into Face­book ($US12.7 mil­lion) came from ven­ture cap­i­tal firm Accel Part­ners. Its man­ag­er James Brey­er was for­mer­ly chair­man of the Nation­al Ven­ture Cap­i­tal Asso­ci­a­tion, and served on the board with Gilman Louie, CEO of In-Q-Tel, a ven­ture cap­i­tal firm estab­lished by the Cen­tral Intel­li­gence Agency in 1999. One of the com­pa­ny’s key areas of exper­tise are in ‘data min­ing tech­nolo­gies’.  Brey­er also served on the board of R&D firm BBN Tech­nolo­gies, which was one of those com­pa­nies respon­si­ble for the rise of the inter­net. Dr Ani­ta Jones joined the firm, which includ­ed Gilman Louie. She had also served on the In-Q-Tel’s board, and had been direc­tor of Defence Research and Engi­neer­ing for the US Depart­ment of Defence. She was also an advis­er to the Sec­re­tary of Defence and over­see­ing the Defence Advanced Research Projects Agency (DARPA), which is respon­si­ble for high-tech, high-end devel­op­ment. . . .”

“Facebook–the CIA Con­spir­a­cy” by Matt Greenop; The New Zealand Her­ald; 8/8/2007.

. . . . Face­book’s first round of ven­ture cap­i­tal fund­ing ($US500,000) came from for­mer Pay­pal CEO Peter Thiel. Author of anti-mul­ti­cul­tur­al tome ‘The Diver­si­ty Myth’, he is also on the board of rad­i­cal con­ser­v­a­tive group Van­guard­PAC.

The sec­ond round of fund­ing into Face­book ($US12.7 mil­lion) came from ven­ture cap­i­tal firm Accel Part­ners. Its man­ag­er James Brey­er was for­mer­ly chair­man of the Nation­al Ven­ture Cap­i­tal Asso­ci­a­tion, and served on the board with Gilman Louie, CEO of In-Q-Tel, a ven­ture cap­i­tal firm estab­lished by the Cen­tral Intel­li­gence Agency in 1999. One of the com­pa­ny’s key areas of exper­tise are in “data min­ing tech­nolo­gies”.

Brey­er also served on the board of R&D firm BBN Tech­nolo­gies, which was one of those com­pa­nies respon­si­ble for the rise of the inter­net.

Dr Ani­ta Jones joined the firm, which includ­ed Gilman Louie. She had also served on the In-Q-Tel’s board, and had been direc­tor of Defence Research and Engi­neer­ing for the US Depart­ment of Defence.

She was also an advis­er to the Sec­re­tary of Defence and over­see­ing the Defence Advanced Research Projects Agency (DARPA), which is respon­si­ble for high-tech, high-end devel­op­ment. . . .

3. Face­book wants to read your thoughts.

  1. ” . . . Face­book wants to build its own “brain-to-com­put­er inter­face” that would allow us to send thoughts straight to a com­put­er. ‘What if you could type direct­ly from your brain?’ Regi­na Dugan, the head of the company’s secre­tive hard­ware R&D divi­sion, Build­ing 8, asked from the stage. Dugan then pro­ceed­ed to show a video demo of a woman typ­ing eight words per minute direct­ly from the stage. In a few years, she said, the team hopes to demon­strate a real-time silent speech sys­tem capa­ble of deliv­er­ing a hun­dred words per minute. ‘That’s five times faster than you can type on your smart­phone, and it’s straight from your brain,’ she said. ‘Your brain activ­i­ty con­tains more infor­ma­tion than what a word sounds like and how it’s spelled; it also con­tains seman­tic infor­ma­tion of what those words mean.’ . . .”
  2. ” . . . . Brain-com­put­er inter­faces are noth­ing new. DARPA, which Dugan used to head, has invest­ed heav­i­ly in brain-com­put­er inter­face tech­nolo­gies to do things like cure men­tal ill­ness and restore mem­o­ries to sol­diers injured in war. But what Face­book is propos­ing is per­haps more radical—a world in which social media doesn’t require pick­ing up a phone or tap­ping a wrist watch in order to com­mu­ni­cate with your friends; a world where we’re con­nect­ed all the time by thought alone. . . .”
  3. ” . . . . Facebook’s Build­ing 8 is mod­eled after DARPA and its projects tend to be equal­ly ambi­tious. . . .”

“Face­book Lit­er­al­ly Wants to Read Your Thoughts” by Kris­ten V. Brown; Giz­modo; 4/19/2017.

At Facebook’s annu­al devel­op­er con­fer­ence, F8, on Wednes­day, the group unveiled what may be Facebook’s most ambitious—and creepiest—proposal yet. Face­book wants to build its own “brain-to-com­put­er inter­face” that would allow us to send thoughts straight to a com­put­er.

“What if you could type direct­ly from your brain?” Regi­na Dugan, the head of the company’s secre­tive hard­ware R&D divi­sion, Build­ing 8, asked from the stage. Dugan then pro­ceed­ed to show a video demo of a woman typ­ing eight words per minute direct­ly from the stage. In a few years, she said, the team hopes to demon­strate a real-time silent speech sys­tem capa­ble of deliv­er­ing a hun­dred words per minute.

“That’s five times faster than you can type on your smart­phone, and it’s straight from your brain,” she said. “Your brain activ­i­ty con­tains more infor­ma­tion than what a word sounds like and how it’s spelled; it also con­tains seman­tic infor­ma­tion of what those words mean.”

Brain-com­put­er inter­faces are noth­ing new. DARPA, which Dugan used to head, has invest­ed heav­i­ly in brain-com­put­er inter­face tech­nolo­gies to do things like cure men­tal ill­ness and restore mem­o­ries to sol­diers injured in war. But what Face­book is propos­ing is per­haps more radical—a world in which social media doesn’t require pick­ing up a phone or tap­ping a wrist watch in order to com­mu­ni­cate with your friends; a world where we’re con­nect­ed all the time by thought alone.

“Our world is both dig­i­tal and phys­i­cal,” she said. “Our goal is to cre­ate and ship new, cat­e­go­ry-defin­ing con­sumer prod­ucts that are social first, at scale.”

She also showed a video that demon­strat­ed a sec­ond tech­nol­o­gy that showed the abil­i­ty to “lis­ten” to human speech through vibra­tions on the skin. This tech has been in devel­op­ment to aid peo­ple with dis­abil­i­ties, work­ing a lit­tle like a Braille that you feel with your body rather than your fin­gers. Using actu­a­tors and sen­sors, a con­nect­ed arm­band was able to con­vey to a woman in the video a tac­tile vocab­u­lary of nine dif­fer­ent words.

Dugan adds that it’s also pos­si­ble to “lis­ten” to human speech by using your skin. It’s like using braille but through a sys­tem of actu­a­tors and sen­sors. Dugan showed a video exam­ple of how a woman could fig­ure out exact­ly what objects were select­ed on a touch­screen based on inputs deliv­ered through a con­nect­ed arm­band.

Facebook’s Build­ing 8 is mod­eled after DARPA and its projects tend to be equal­ly ambi­tious. Brain-com­put­er inter­face tech­nol­o­gy is still in its infan­cy. So far, researchers have been suc­cess­ful in using it to allow peo­ple with dis­abil­i­ties to con­trol par­a­lyzed or pros­thet­ic limbs. But stim­u­lat­ing the brain’s motor cor­tex is a lot sim­pler than read­ing a person’s thoughts and then trans­lat­ing those thoughts into some­thing that might actu­al­ly be read by a com­put­er.

The end goal is to build an online world that feels more immer­sive and real—no doubt so that you spend more time on Face­book.

“Our brains pro­duce enough data to stream 4 HD movies every sec­ond. The prob­lem is that the best way we have to get infor­ma­tion out into the world — speech — can only trans­mit about the same amount of data as a 1980s modem,” CEO Mark Zucker­berg said in a Face­book post. “We’re work­ing on a sys­tem that will let you type straight from your brain about 5x faster than you can type on your phone today. Even­tu­al­ly, we want to turn it into a wear­able tech­nol­o­gy that can be man­u­fac­tured at scale. Even a sim­ple yes/no ‘brain click’ would help make things like aug­ment­ed real­i­ty feel much more nat­ur­al.”

4. The broad­cast then reviews (from FTR #1074) Face­book’s inex­tri­ca­ble link with the Hin­dut­va fas­cist BJP of Naren­dra Modi:

Key ele­ments of dis­cus­sion and analy­sis include:

  1. Indi­an pol­i­tics has been large­ly dom­i­nat­ed by fake news, spread by social media: ” . . . . In the con­tin­u­ing Indi­an elec­tions, as 900 mil­lion peo­ple are vot­ing to elect rep­re­sen­ta­tives to the low­er house of the Par­lia­ment, dis­in­for­ma­tion and hate speech are drown­ing out truth on social media net­works in the coun­try and cre­at­ing a pub­lic health cri­sis like the pan­demics of the past cen­tu­ry. This con­ta­gion of a stag­ger­ing amount of mor­phed images, doc­tored videos and text mes­sages is spread­ing large­ly through mes­sag­ing ser­vices and influ­enc­ing what India’s vot­ers watch and read on their smart­phones. A recent study by Microsoft found that over 64 per­cent Indi­ans encoun­tered fake news online, the high­est report­ed among the 22 coun­tries sur­veyed. . . . These plat­forms are filled with fake news and dis­in­for­ma­tion aimed at influ­enc­ing polit­i­cal choic­es dur­ing the Indi­an elec­tions. . . .”
  2. Naren­dra Mod­i’s Hin­dut­va fas­cist BJP has been the pri­ma­ry ben­e­fi­cia­ry of fake news, and his regime has part­nered with Face­book: ” . . . . The hear­ing was an exer­cise in absur­dist the­ater because the gov­ern­ing B.J.P. has been the chief ben­e­fi­cia­ry of divi­sive con­tent that reach­es mil­lions because of the way social media algo­rithms, espe­cial­ly Face­book, ampli­fy ‘engag­ing’ arti­cles. . . .”
  3. Rajesh Jain is among those BJP func­tionar­ies who serve Face­book, as well as the Hin­dut­va fas­cists: ” . . . . By the time Rajesh Jain was scal­ing up his oper­a­tions in 2013, the BJP’s infor­ma­tion tech­nol­o­gy (IT) strate­gists had begun inter­act­ing with social media plat­forms like Face­book and its part­ner What­sApp. If sup­port­ers of the BJP are to be believed, the par­ty was bet­ter than oth­ers in util­is­ing the micro-tar­get­ing poten­tial of the plat­forms. How­ev­er, it is also true that Facebook’s employ­ees in India con­duct­ed train­ing work­shops to help the mem­bers of the BJP’s IT cell. . . .”
  4. Dr. Hiren Joshi is anoth­er of the BJP oper­a­tives who is heav­i­ly involved with Face­book. ” . . . . Also assist­ing the social media and online teams to build a larg­er-than-life image for Modi before the 2014 elec­tions was a team led by his right-hand man Dr Hiren Joshi, who (as already stat­ed) is a very impor­tant advis­er to Modi whose writ extends way beyond infor­ma­tion tech­nol­o­gy and social media. . . .  Joshi has had, and con­tin­ues to have, a close and long-stand­ing asso­ci­a­tion with Facebook’s senior employ­ees in India. . . .”
  5. Shiv­nath Thukral, who was hired by Face­book in 2017 to be its Pub­lic Pol­i­cy Direc­tor for India & South Asia, worked with Joshi’s team in 2014.  ” . . . . The third team, that was intense­ly focused on build­ing Modi’s per­son­al image, was head­ed by Hiren Joshi him­self who worked out of the then Gujarat Chief Minister’s Office in Gand­hi­na­gar. The mem­bers of this team worked close­ly with staffers of Face­book in India, more than one of our sources told us. As will be detailed lat­er, Shiv­nath Thukral, who is cur­rent­ly an impor­tant exec­u­tive in Face­book, worked with this team. . . .”
  6. An osten­si­bly remorse­ful BJP politician–Prodyut Bora–high­light­ed the dra­mat­ic effect Face­book and its What­sApp sub­sidiary have had on Indi­a’s pol­i­tics: ” . . . . In 2009, social media plat­forms like Face­book and What­sApp had a mar­gin­al impact in India’s 20 big cities. By 2014, how­ev­er, it had vir­tu­al­ly replaced the tra­di­tion­al mass media. In 2019, it will be the most per­va­sive media in the coun­try. . . .”
  7. A con­cise state­ment about the rela­tion­ship between the BJP and Face­book was issued by BJP tech offi­cer Vinit Goen­ka: ” . . . . At one stage in our inter­view with [Vinit] Goen­ka that last­ed over two hours, we asked him a point­ed ques­tion: ‘Who helped whom more, Face­book or the BJP?’ He smiled and said: ‘That’s a dif­fi­cult ques­tion. I won­der whether the BJP helped Face­book more than Face­book helped the BJP. You could say, we helped each oth­er.’ . . .”

5. In Ukraine, as well, Face­book and the OUN/B suc­ces­sor orga­ni­za­tions func­tion sym­bi­ot­i­cal­ly:

CrowdStrike, at the epi­cen­ter of the sup­posed Russ­ian hack­ing con­tro­ver­sy, is note­wor­thy. Its co-founder and chief tech­nol­o­gy offi­cer, Dmitri Alper­ovitch, is a senior fel­low at the Atlantic Coun­cil, which is financed by ele­ments that are at the foun­da­tion of fan­ning the flames of the New Cold War: “In this respect, it is worth not­ing that one of the com­mer­cial cyber­se­cu­ri­ty com­pa­nies the gov­ern­ment has relied on is Crowd­strike, which was one of the com­pa­nies ini­tial­ly brought in by the DNC to inves­ti­gate the alleged hacks. . . . Dmitri Alper­ovitch is also a senior fel­low at the Atlantic Coun­cil. . . . The con­nec­tion between [Crowd­strike co-founder and chief tech­nol­o­gy offi­cer Dmitri] Alper­ovitch and the Atlantic Coun­cil has gone large­ly unre­marked upon, but it is rel­e­vant giv­en that the Atlantic Coun­cil—which is fund­ed in part by the US State Depart­ment, NATO, the gov­ern­ments of Latvia and Lithua­nia, the Ukrain­ian World Con­gress, and the Ukrain­ian oli­garch Vic­tor Pinchuk—has been among the loud­est voic­es call­ing for a new Cold War with Rus­sia. As I point­ed out in the pages of The Nation in Novem­ber, the Atlantic Coun­cil has spent the past sev­er­al years pro­duc­ing some of the most vir­u­lent spec­i­mens of the new Cold War pro­pa­gan­da. . . .”

(Note that the Atlantic Coun­cil is dom­i­nant in the array of indi­vid­u­als and insti­tu­tions con­sti­tut­ing the Ukrain­ian fascist/Facebook coop­er­a­tive effort. We have spo­ken about the Atlantic Coun­cil in numer­ous pro­grams, includ­ing FTR #943. The orga­ni­za­tion has deep oper­a­tional links to ele­ments of U.S. intel­li­gence, as well as the OUN/B milieu that dom­i­nates the Ukrain­ian dias­po­ra.)

In May of 2018, Face­book decid­ed to effec­tive­ly out­source the work of iden­ti­fy­ing pro­pa­gan­da and mis­in­for­ma­tion dur­ing elec­tions to the Atlantic Coun­cil.

” . . . . Face­book is part­ner­ing with the Atlantic Coun­cil in anoth­er effort to com­bat elec­tion-relat­ed pro­pa­gan­da and mis­in­for­ma­tion from pro­lif­er­at­ing on its ser­vice. The social net­work­ing giant said Thurs­day that a part­ner­ship with the Wash­ing­ton D.C.-based think tank would help it bet­ter spot dis­in­for­ma­tion dur­ing upcom­ing world elec­tions. The part­ner­ship is one of a num­ber of steps Face­book is tak­ing to pre­vent the spread of pro­pa­gan­da and fake news after fail­ing to stop it from spread­ing on its ser­vice in the run up to the 2016 U.S. pres­i­den­tial elec­tion. . . .”

Since autumn 2018, Face­book has looked to hire a pub­lic pol­i­cy man­ag­er for Ukraine. The job came after years of Ukraini­ans crit­i­ciz­ing the plat­form for take­downs of its activists’ pages and the spread of [alleged] Russ­ian dis­in­fo tar­get­ing Kyiv. Now, it appears to have one: @Kateryna_Kruk. — Christo­pher Miller (@ChristopherJM) June 3, 2019

Oleh Tyah­ny­bok, leader of the OUN/B suc­ces­sor orga­ni­za­tion Svo­bo­da, for which Katery­na Kruk worked.

Katery­na Kruk:

  1. Is Facebook’s Pub­lic Pol­i­cy Man­ag­er for Ukraine as of May of this year, accord­ing to her LinkedIn page.
  2. Worked as an ana­lyst and TV host for the Ukrain­ian ‘anti-Russ­ian pro­pa­gan­da’ out­fit Stop­Fake. Stop­Fake is the cre­ation of Ire­na Chalu­pa, who works for the Atlantic Coun­cil and the Ukrain­ian gov­ern­ment and appears to be the sis­ter of Andrea and Alexan­dra Chalu­pa.
  3. Joined the “Krem­lin Watch” team at the Euro­pean Val­ues think-tank, in Octo­ber of 2017.
  4. Received the Atlantic Coun­cil’s Free­dom award for her com­mu­ni­ca­tions work dur­ing the Euro­maid­an protests in June of 2014.
  5. Worked for OUN/B suc­ces­sor orga­ni­za­tion Svo­bo­da dur­ing the Euro­maid­an protests. “ . . . ‘There are peo­ple who don’t sup­port Svo­bo­da because of some of their slo­gans, but they know it’s the most active polit­i­cal par­ty and go to them for help,’ said Svo­bo­da vol­un­teer Katery­na Kruk. . . .”
  6. Also has a num­ber of arti­cles on the Atlantic Council’s Blog. Here’s a blog post from August of 2018 where she advo­cates for the cre­ation of an inde­pen­dent Ukrain­ian Ortho­dox Church to dimin­ish the influ­ence of the Russ­ian Ortho­dox Church.
  7. Accord­ing to her LinkedIn page has also done exten­sive work for the Ukrain­ian gov­ern­ment. From March 2016 to Jan­u­ary 2017 she was the Strate­gic Com­mu­ni­ca­tions Man­ag­er for the Ukrain­ian par­lia­ment where she was respon­si­ble for social media and inter­na­tion­al com­mu­ni­ca­tions. From Jan­u­ary-April 2017 she was the Head of Com­mu­ni­ca­tions at the Min­istry of Health.
  8. Was not only a vol­un­teer for Svo­bo­da dur­ing the 2014 Euro­maid­an protests, but open­ly cel­e­brat­ed on Twit­ter the May 2014 mas­sacre in Odessa when the far right burned dozens of pro­tes­tors alive. Kruk’s Twit­ter feed is set to pri­vate now so there isn’t pub­lic access to her old tweets, but peo­ple have screen cap­tures of them. Here’s a tweet from Yasha Levine with a screen­shot of Kruk’s May 2, 2014 tweet where she writes: “#Odessa cleaned itself from ter­ror­ists, proud for city fight­ing for its identity.glory to fall­en heroes..” She even threw in a “glo­ry to fall­en heroes” at the end of her tweet cel­e­brat­ing this mas­sacre. Keep in mind that it was a month after this tweet that the Atlantic Coun­cil gave her that Free­dom Award for her com­mu­ni­ca­tions work dur­ing the protests.
  9. In 2014, . . .  tweet­ed that a man had asked her to con­vince his grand­son not to join the Azov Bat­tal­ion, a neo-Nazi mili­tia. “I couldn’t do it,” she said. “I thanked that boy and blessed him.” And he then trav­eled to Luhan­sk to fight pro-Russ­ian rebels.
  10. Lion­ized a Nazi sniper killed in Ukraine’s civ­il war. In March 2018, a 19-year-old neo-Nazi named Andriy “Dil­ly” Krivich was shot and killed by a sniper. Krivich had been fight­ing with the fas­cist Ukrain­ian group Right Sec­tor, and had post­ed pho­tos on social media wear­ing Nazi Ger­man sym­bols. After he was killed, Kruk tweet­ed an homage to the teenage Nazi. (The Nazi was also lion­ized on Euro­maid­an Press’ Face­book page.)
  11. Has staunch­ly defend­ed the use of the slo­gan “Sla­va Ukrai­ni,” which was first coined and pop­u­lar­ized by Nazi-col­lab­o­rat­ing fas­cists, and is now the offi­cial salute of Ukraine’s army.
  12. Has also said that the Ukrain­ian fas­cist politi­cian Andriy Paru­biy, who co-found­ed a neo-Nazi par­ty before lat­er becom­ing the chair­man of Ukraine’s par­lia­ment, the Rada, is “act­ing smart,” writ­ing, “Paru­biy touche.” . . . .
