Dave Emory’s entire lifetime of work is available on a flash drive that can be obtained HERE [1]. The new drive is a 32-gigabyte drive containing the programs and articles posted through the fall of 2017. The new drive is available for a tax-deductible contribution of $65.00 or more.
WFMU-FM is podcasting For The Record–You can subscribe to the podcast HERE [2].
You can subscribe to e‑mail alerts from Spitfirelist.com HERE [3].
You can subscribe to the RSS feed from Spitfirelist.com HERE [3].
You can subscribe to the comments made on programs and posts–an excellent source of information in, and of, itself, HERE [4].
Please consider supporting THE WORK DAVE EMORY DOES [5].
This broadcast was recorded in one 60-minute segment [6].
Introduction: Continuing the discussion from FTR #1076 [7], the broadcast recaps key aspects of analysis of the Cambridge Analytica scandal.
In our last program, we noted that both the internet (DARPA projects including Project Agile) and the German Nazi Party had their origins as counterinsurgency gambits. Noting Hitler’s speech before the Industry Club of Dusseldorf, in which he equated communism with democracy, we highlight how the Cambridge Analytica scandal reflects the counterinsurgency origins of the internet, and how the Cambridge Analytica affair embodies anti-democracy as counterinsurgency.
Key aspects of the Cambridge Analytica affair include:
- The use of psychographic personality testing [8] on Facebook for political advantage: ” . . . . For several years, a data firm eventually hired by the Trump campaign, Cambridge Analytica, has been using Facebook as a tool to build psychological profiles that represent some 230 million adult Americans. A spinoff of a British consulting company and sometime-defense contractor known for its counterterrorism ‘psy ops’ work in Afghanistan, the firm does so by seeding the social network with personality quizzes. Respondents — by now hundreds of thousands of us, mostly female and mostly young but enough male and older for the firm to make inferences about others with similar behaviors and demographics — get a free look at their Ocean scores. Cambridge Analytica also gets a look at their scores and, thanks to Facebook, gains access to their profiles and real names. . . .”
- The parent company of Cambridge Analytica–SCL–was deeply involved [9] with counterterrorism “psy-ops” in Afghanistan, embodying the essence of the counterinsurgency dynamic at the root of the development of the Internet. The use of online data to subvert democracy recalls Hitler’s speech to the Industry Club of Dusseldorf, in which he equated democracy with communism: ” . . . . Cambridge Analytica was a company spun out of SCL Group, a British military contractor that worked in information operations for armed forces around the world. It was conducting research on how to scale and digitise information warfare – the use of information to confuse or degrade the efficacy of an enemy. . . . As director of research, Wylie’s original role was to map out how the company would take traditional information operations tactics into the online space – in particular, by profiling people who would be susceptible to certain messaging. This morphed into the political arena. After Wylie left, the company worked on Donald Trump’s US presidential campaign . . . .”
- Cambridge Analytica whistleblower Christopher Wylie’s observations [9] on the anti-democratic nature of the firm’s work: ” . . . . It was this shift from the battlefield to politics that made Wylie uncomfortable. ‘When you are working in information operations projects, where your target is a combatant, the autonomy or agency of your targets is not your primary consideration. It is fair game to deny and manipulate information, coerce and exploit any mental vulnerabilities a person has, and to bring out the very worst characteristics in that person because they are an enemy,’ he says. ‘But if you port that over to a democratic system, if you run campaigns designed to undermine people’s ability to make free choices and to understand what is real and not real, you are undermining democracy and treating voters in the same way as you are treating terrorists.’ . . . .”
- Wylie’s observations on how Cambridge Analytica’s methodology [9] can be used to build a fascist political movement: ” . . . . One of the reasons these techniques are so insidious is that being a target of a disinformation campaign is ‘usually a pleasurable experience’, because you are being fed content with which you are likely to agree. ‘You are being guided through something that you want to be true,’ Wylie says. To build an insurgency, he explains, you first target people who are more prone to having erratic traits, paranoia or conspiratorial thinking, and get them to ‘like’ a group on social media. They start engaging with the content, which may or may not be true; either way ‘it feels good to see that information’. When the group reaches 1,000 or 2,000 members, an event is set up in the local area. Even if only 5% show up, ‘that’s 50 to 100 people flooding a local coffee shop’, Wylie says. This, he adds, validates their opinion because other people there are also talking about ‘all these things that you’ve been seeing online in the depths of your den and getting angry about’. People then start to believe the reason it’s not shown on mainstream news channels is because ‘they don’t want you to know what the truth is’. As Wylie sums it up: ‘What started out as a fantasy online gets ported into the temporal world and becomes real to you because you see all these people around you.’ . . . .”
- Wylie’s observation [9] that Facebook was “All In” on the Cambridge Analytica machinations: ” . . . . ‘Facebook has known about what Cambridge Analytica was up to from the very beginning of those projects,’ Wylie claims. ‘They were notified, they authorised the applications, they were given the terms and conditions of the app that said explicitly what it was doing. They hired people who worked on building the app. I had legal correspondence with their lawyers where they acknowledged it happened as far back as 2016.’ . . . .”
- The decisive participation [10] of “Spy Tech” firm Palantir in the Cambridge Analytica operation: Peter Thiel’s surveillance firm Palantir was apparently deeply involved with Cambridge Analytica’s gaming of personal data harvested from Facebook in order to engineer an electoral victory for Trump. Thiel was an early investor in Facebook, at one point was its largest shareholder and is still one of its largest shareholders. In addition to his opposition to democracy [11] because it allegedly is inimical to wealth creation, Thiel doesn’t think women should be allowed to vote and holds Nazi legal theoretician Carl Schmitt in high regard [12]. ” . . . . It was a Palantir employee in London, working closely with the data scientists building Cambridge’s psychological profiling technology, who suggested the scientists create their own app — a mobile-phone-based personality quiz — to gain access to Facebook users’ friend networks, according to documents obtained by The New York Times. The revelations pulled Palantir — co-founded by the wealthy libertarian Peter Thiel [13] — into the furor surrounding Cambridge, which improperly obtained Facebook data to build analytical tools it deployed on behalf of Donald J. Trump and other Republican candidates in 2016. Mr. Thiel, a supporter of President Trump, serves on the board at Facebook. ‘There were senior Palantir employees that were also working on the Facebook data,’ said Christopher Wylie [14], a data expert and Cambridge Analytica co-founder, in testimony before British lawmakers on Tuesday. . . . The connections between Palantir and Cambridge Analytica were thrust into the spotlight by Mr. Wylie’s testimony on Tuesday. Both companies are linked to tech-driven billionaires who backed Mr. Trump’s campaign: Cambridge is chiefly owned by Robert Mercer, the computer scientist and hedge fund magnate, while Palantir was co-founded in 2003 by Mr. Thiel, who was an initial investor in Facebook. . . .”
- The use of “dark posts” by the Cambridge Analytica team. (We have noted that Brad Parscale [15] has reassembled the old Cambridge Analytica team for Trump’s 2020 election campaign. It seems probable that AOC’s millions of online followers, as well as the “Bernie Bots,” will be getting “dark posts” crafted by AIs scanning their online efforts.) ” . . . . One recent advertising product on Facebook is the so-called ‘dark post’: A newsfeed message seen by no one aside from the users being targeted. With the help of Cambridge Analytica, Mr. Trump’s digital team used dark posts to serve different ads to different potential voters, aiming to push the exact right buttons for the exact right people at the exact right times. . . .”
Supplementing the discussion about Cambridge Analytica, the program reviews information from FTR #718 [16] about Facebook’s apparent involvement [17] with elements and individuals linked to CIA and DARPA: ” . . . . Facebook’s most recent round of funding was led by a company called Greylock Venture Capital, who put in the sum of $27.5m. One of Greylock’s senior partners is called Howard Cox, another former chairman of the NVCA, who is also on the board of In-Q-Tel. What’s In-Q-Tel? Well, believe it or not (and check out their website), this is the venture-capital wing of the CIA. After 9/11, the US intelligence community became so excited by the possibilities of new technology and the innovations being made in the private sector, that in 1999 they set up their own venture capital fund, In-Q-Tel, which ‘identifies and partners with companies developing cutting-edge technologies to help deliver these solutions to the Central Intelligence Agency and the broader US Intelligence Community (IC) to further their missions’. . . .”
More about the CIA/DARPA links to the development of Facebook: ” . . . . The second round of funding into Facebook ($US12.7 million) came from venture capital firm Accel Partners. Its manager James Breyer was formerly chairman of the National Venture Capital Association, and served on the board with Gilman Louie, CEO of In-Q-Tel, a venture capital firm established by the Central Intelligence Agency in 1999. One of the company’s key areas of expertise are in ‘data mining technologies’. Breyer also served on the board of R&D firm BBN Technologies, which was one of those companies responsible for the rise of the internet. Dr Anita Jones joined the firm, which included Gilman Louie. She had also served on the In-Q-Tel’s board, and had been director of Defence Research and Engineering for the US Department of Defence. She was also an adviser to the Secretary of Defence and overseeing the Defence Advanced Research Projects Agency (DARPA), which is responsible for high-tech, high-end development. . . .”
Program Highlights Include: Review of Facebook’s plans [18] to use brain-to-computer technology to operate its platform, thereby enabling the recording and databasing of people’s thoughts; Review of Facebook’s employment of former DARPA head Regina Dugan [18] to implement the brain-to-computer technology; Review of Facebook’s Building 8–designed to duplicate DARPA [18]; Review of Facebook’s hiring of the Atlantic Council [19] to police the social medium’s online content; Review of Facebook’s partnering [20] with Narendra Modi’s Hindutva fascist government in India; Review of Facebook’s employment of Ukrainian fascist Kateryna Kruk [20] to manage the social medium’s Ukrainian content.
1a. Facebook personality tests that allegedly let you learn what makes you tick also let whoever set up the test learn what makes you tick. And since the tests are done through Facebook, their creators can link your test results to your real identity.
If the Facebook personality test in question happens to report your “Ocean score” (Openness, Conscientiousness, Extraversion, Agreeableness and Neuroticism), that means the test you’re taking was created by Cambridge Analytica, a company with one of Donald Trump’s billionaire sugar-daddies, Robert Mercer, as a major investor. And it’s Cambridge Analytica that gets to learn all those fun facts about your psychological profile too. And Steve Bannon sat on its board:
“The Secret Agenda of a Facebook Quiz” by McKenzie Funk; The New York Times; 1/19/2017. [8]
Do you panic easily? Do you often feel blue? Do you have a sharp tongue? Do you get chores done right away? Do you believe in the importance of art?
If ever you’ve answered questions like these on one of the free personality quizzes floating around Facebook, you’ll have learned what’s known as your Ocean score: How you rate according to the big five psychological traits of Openness, Conscientiousness, Extraversion, Agreeableness and Neuroticism. You may also be responsible the next time America is shocked by an election upset.
For several years, a data firm eventually hired by the Trump campaign, Cambridge Analytica, has been using Facebook as a tool to build psychological profiles that represent some 230 million adult Americans. A spinoff of a British consulting company and sometime-defense contractor known for its counterterrorism “psy ops” work in Afghanistan, the firm does so by seeding the social network with personality quizzes. Respondents — by now hundreds of thousands of us, mostly female and mostly young but enough male and older for the firm to make inferences about others with similar behaviors and demographics — get a free look at their Ocean scores. Cambridge Analytica also gets a look at their scores and, thanks to Facebook, gains access to their profiles and real names.
Cambridge Analytica worked on the “Leave” side of the Brexit campaign. In the United States it takes only Republicans as clients: Senator Ted Cruz in the primaries, Mr. Trump in the general election. Cambridge is reportedly backed by Robert Mercer [21], a hedge fund billionaire and a major Republican donor; a key board member is Stephen K. Bannon, the head of Breitbart News who became Mr. Trump’s campaign chairman and is set to be his chief strategist in the White House.
In the age of Facebook, it has become far easier for campaigners or marketers to combine our online personas with our offline selves, a process that was once controversial but is now so commonplace that there’s a term for it, “onboarding.” Cambridge Analytica says it has as many as 3,000 to 5,000 data points on each of us, be it voting histories or full-spectrum demographics — age, income, debt, hobbies, criminal histories, purchase histories, religious leanings, health concerns, gun ownership, car ownership, homeownership — from consumer-data giants.
No data point is very informative on its own, but profiling voters, says Cambridge Analytica, is like baking a cake. “It’s the sum of the ingredients,” its chief executive officer, Alexander Nix, told NBC News. Because the United States lacks European-style restrictions on second- or thirdhand use of our data, and because our freedom-of-information laws give data brokers broad access to the intimate records kept by local and state governments, our lives are open books even without social media or personality quizzes.
Ever since the advertising executive Lester Wunderman coined the term “direct marketing” in 1961, the ability to target specific consumers with ads — rather than blanketing the airwaves with mass appeals and hoping the right people will hear them — has been the marketer’s holy grail. What’s new is the efficiency with which individually tailored digital ads can be tested and matched to our personalities. Facebook is the microtargeter’s ultimate weapon.
The explosive growth of Facebook’s ad business has been overshadowed by its increasing role in how we get our news, real or fake. In July, the social network posted record earnings: quarterly sales were up 59 percent from the previous year, and profits almost tripled to $2.06 billion. While active users of Facebook — now 1.71 billion monthly active users — were up 15 percent, the real story was how much each individual user was worth. The company makes $3.82 a year from each global user, up from $2.76 a year ago, and an average of $14.34 per user in the United States, up from $9.30 a year ago. Much of this growth comes from the fact that advertisers not only have an enormous audience in Facebook but an audience they can slice into the tranches they hope to reach.
One recent advertising product on Facebook is the so-called “dark post”: A newsfeed message seen by no one aside from the users being targeted. With the help of Cambridge Analytica, Mr. Trump’s digital team used dark posts to serve different ads to different potential voters, aiming to push the exact right buttons for the exact right people at the exact right times.
Imagine the full capability of this kind of “psychographic” advertising. In future Republican campaigns, a pro-gun voter whose Ocean score ranks him high on neuroticism could see storm clouds and a threat: The Democrat wants to take his guns away. A separate pro-gun voter deemed agreeable and introverted might see an ad emphasizing tradition and community values, a father and son hunting together.
In this election, dark posts were used to try to suppress the African-American vote. According to Bloomberg, the Trump campaign sent ads reminding certain selected black voters of Hillary Clinton’s infamous “super predator” line. It targeted Miami’s Little Haiti neighborhood with messages about the Clinton Foundation’s troubles in Haiti after the 2010 earthquake. Federal Election Commission rules are unclear when it comes to Facebook posts, but even if they do apply and the facts are skewed and the dog whistles loud, the already weakening power of social opprobrium is gone when no one else sees the ad you see — and no one else sees “I’m Donald Trump, and I approved this message.”
While Hillary Clinton spent more than $140 million on television spots, old-media experts scoffed at Trump’s lack of old-media ad buys. Instead, his campaign pumped its money into digital, especially Facebook. One day in August, it flooded the social network with 100,000 ad variations, so-called A/B testing on a biblical scale, surely more ads than could easily be vetted by human eyes for compliance with Facebook’s “community standards.”
1b. Christopher Wylie–the former head of research at Cambridge Analytica who became one of the key insider whistle-blowers about how Cambridge Analytica operated and the extent of Facebook’s knowledge about it–gave an interview last month to Campaign Magazine. (We dealt with Cambridge Analytica in FTR #‘s 946 [22], 1021 [23].)
Wylie recounts how, as director of research at Cambridge Analytica, his original role was to determine how the company could use the information warfare techniques used by SCL Group – Cambridge Analytica’s parent company and a defense contractor providing psy op services for the British military. Wylie’s job was to adapt the psychological warfare strategies that SCL had been using on the battlefield to the online space. As Wylie put it:
“ . . . . When you are working in information operations projects, where your target is a combatant, the autonomy or agency of your targets is not your primary consideration. It is fair game to deny and manipulate information, coerce and exploit any mental vulnerabilities a person has, and to bring out the very worst characteristics in that person because they are an enemy…But if you port that over to a democratic system, if you run campaigns designed to undermine people’s ability to make free choices and to understand what is real and not real, you are undermining democracy and treating voters in the same way as you are treating terrorists. . . . .”
Wylie also draws parallels between the psychological operations used on democratic audiences and the battlefield techniques used to build an insurgency. It starts with targeting people more prone to having erratic traits, paranoia or conspiratorial thinking, and getting them to “like” a group on social media. The information you’re feeding this target audience may or may not be real. The important thing is that it’s content they already agree with, so that “it feels good to see that information.” Keep in mind that one of the goals of Cambridge Analytica’s ‘psychographic profiling’ was to identify traits like neuroticism.
Wylie goes on to describe the next step in this insurgency-building technique: keep building up the interest in the social media group that you’re directing this target audience towards until it hits around 1,000–2,000 people. Then set up a real life event dedicated to the chosen disinformation topic in some local area and try to get as many of your target audience to show up. Even if only 5 percent of them show up, that’s still 50–100 people converging on some local coffee shop or whatever. The people meet each other in real life and start talking about “all these things that you’ve been seeing online in the depths of your den and getting angry about”. This target audience starts believing that no one else is talking about this stuff because “they don’t want you to know what the truth is”. As Wylie puts it, “What started out as a fantasy online gets ported into the temporal world and becomes real to you because you see all these people around you.”
In the early hours of 17 March 2018, the 28-year-old Christopher Wylie tweeted: “Here we go….”
Later that day, The Observer and The New York Times published the story of Cambridge Analytica’s misuse of Facebook data, which sent shockwaves around the world, caused millions to #DeleteFacebook, and led the UK Information Commissioner’s Office to fine the site the maximum penalty for failing to protect users’ information. Six weeks after the story broke, Cambridge Analytica closed. . . .
. . . . He believes that poor use of data is killing good ideas. And that, unless effective regulation is enacted, society’s worship of algorithms, unchecked data capture and use, and the likely spread of AI to all parts of our lives is causing us to sleepwalk into a bleak future.
Not only are such circumstances a threat to adland – why do you need an ad to tell you about a product if an algorithm is choosing it for you? – it is a threat to human free will. “Currently, the only morality of the algorithm is to optimise you as a consumer and, in many cases, you become the product. There are very few examples in human history of industries where people themselves become products and those are scary industries – slavery and the sex trade. And now, we have social media,” Wylie says.
“The problem with that, and what makes it inherently different to selling, say, toothpaste, is that you’re selling parts of people or access to people. People have an innate moral worth. If we don’t respect that, we can create industries that do terrible things to people. We are [heading] blindly and quickly into an environment where this mentality is going to be amplified through AI everywhere. We’re humans, we should be thinking about people first.”
His words carry weight, because he’s been on the dark side. He has seen what can happen when data is used to spread misinformation, create insurgencies and prey on the worst of people’s characters.
The political battlefield
A quick refresher on the scandal, in Wylie’s words: Cambridge Analytica was a company spun out of SCL Group, a British military contractor that worked in information operations for armed forces around the world. It was conducting research on how to scale and digitise information warfare – the use of information to confuse or degrade the efficacy of an enemy. . . .
. . . . As director of research, Wylie’s original role was to map out how the company would take traditional information operations tactics into the online space – in particular, by profiling people who would be susceptible to certain messaging.
This morphed into the political arena. After Wylie left, the company worked on Donald Trump’s US presidential campaign . . . .
. . . . It was this shift from the battlefield to politics that made Wylie uncomfortable. “When you are working in information operations projects, where your target is a combatant, the autonomy or agency of your targets is not your primary consideration. It is fair game to deny and manipulate information, coerce and exploit any mental vulnerabilities a person has, and to bring out the very worst characteristics in that person because they are an enemy,” he says.
“But if you port that over to a democratic system, if you run campaigns designed to undermine people’s ability to make free choices and to understand what is real and not real, you are undermining democracy and treating voters in the same way as you are treating terrorists.”
One of the reasons these techniques are so insidious is that being a target of a disinformation campaign is “usually a pleasurable experience”, because you are being fed content with which you are likely to agree. “You are being guided through something that you want to be true,” Wylie says.
To build an insurgency, he explains, you first target people who are more prone to having erratic traits, paranoia or conspiratorial thinking, and get them to “like” a group on social media. They start engaging with the content, which may or may not be true; either way “it feels good to see that information”.
When the group reaches 1,000 or 2,000 members, an event is set up in the local area. Even if only 5% show up, “that’s 50 to 100 people flooding a local coffee shop”, Wylie says. This, he adds, validates their opinion because other people there are also talking about “all these things that you’ve been seeing online in the depths of your den and getting angry about”.
People then start to believe the reason it’s not shown on mainstream news channels is because “they don’t want you to know what the truth is”. As Wylie sums it up: “What started out as a fantasy online gets ported into the temporal world and becomes real to you because you see all these people around you.” . . . .
. . . . Psychographic potential
. . . . But Wylie argues that people underestimate what algorithms allow you to do in profiling. “I can take pieces of information about you that seem innocuous, but what I’m able to do with an algorithm is find patterns that correlate to underlying psychological profiles,” he explains.
“I can ask whether you listen to Justin Bieber, and you won’t feel like I’m invading your privacy. You aren’t necessarily aware that when you tell me what music you listen to or what TV shows you watch, you are telling me some of your deepest and most personal attributes.” . . . .
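The kind of inference Wylie describes above, innocuous data points correlating to underlying psychological traits, can be illustrated with a deliberately simplified sketch. Everything in it is hypothetical: the page names, the trait scores and the training data are invented, and real psychographic systems use far more sophisticated statistical models than this simple co-occurrence average.

```python
# Toy illustration (not Cambridge Analytica's actual method): estimate a
# personality trait, e.g. neuroticism on a 0-1 scale, from which pages a
# user has "liked". All data below is invented for demonstration.

from collections import defaultdict

# Hypothetical training data: (pages a user liked, known trait score).
training = [
    ({"justin_bieber", "horror_movies"}, 0.8),
    ({"justin_bieber", "gardening"}, 0.6),
    ({"classical_music", "gardening"}, 0.2),
    ({"classical_music", "hiking"}, 0.1),
]

def fit_trait_weights(data):
    """For each page, average the trait score of the users who liked it."""
    totals, counts = defaultdict(float), defaultdict(int)
    for likes, score in data:
        for page in likes:
            totals[page] += score
            counts[page] += 1
    return {page: totals[page] / counts[page] for page in totals}

def predict_trait(weights, likes, default=0.5):
    """Predict a new user's trait as the mean weight of their known likes."""
    known = [weights[p] for p in likes if p in weights]
    return sum(known) / len(known) if known else default

weights = fit_trait_weights(training)
print(round(predict_trait(weights, {"justin_bieber"}), 2))  # prints 0.7
```

Even this crude averaging shows the point Wylie is making: a single answer like “I listen to Justin Bieber” feels harmless, but once it is statistically tied to the profiles of thousands of other respondents, it becomes a signal about attributes the user never chose to disclose.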
. . . . Clashes with Facebook
Wylie is opposed to self-regulation, because industries won’t become consumer champions – they are, he says, too conflicted.
“Facebook has known about what Cambridge Analytica was up to from the very beginning of those projects,” Wylie claims. “They were notified, they authorised the applications, they were given the terms and conditions of the app that said explicitly what it was doing. They hired people who worked on building the app. I had legal correspondence with their lawyers where they acknowledged it happened as far back as 2016.” . . . .
1c. In FTR #946 [22], we examined Cambridge Analytica, the Trump and Steve Bannon-linked tech firm that harvested Facebook data on behalf of the Trump campaign.
Peter Thiel’s surveillance firm Palantir was apparently deeply involved with Cambridge Analytica’s gaming of personal data harvested from Facebook in order to engineer an electoral victory for Trump. Thiel was an early investor in Facebook, at one point was its largest shareholder and is still one of its largest shareholders. ” . . . . It was a Palantir employee in London, working closely with the data scientists building Cambridge’s psychological profiling technology, who suggested the scientists create their own app — a mobile-phone-based personality quiz — to gain access to Facebook users’ friend networks, according to documents obtained by The New York Times. The revelations pulled Palantir — co-founded by the wealthy libertarian Peter Thiel [13] — into the furor surrounding Cambridge, which improperly obtained Facebook data to build analytical tools it deployed on behalf of Donald J. Trump and other Republican candidates in 2016. Mr. Thiel, a supporter of President Trump, serves on the board at Facebook. ‘There were senior Palantir employees that were also working on the Facebook data,’ said Christopher Wylie [14], a data expert and Cambridge Analytica co-founder, in testimony before British lawmakers on Tuesday. . . . The connections between Palantir and Cambridge Analytica were thrust into the spotlight by Mr. Wylie’s testimony on Tuesday. Both companies are linked to tech-driven billionaires who backed Mr. Trump’s campaign: Cambridge is chiefly owned by Robert Mercer, the computer scientist and hedge fund magnate, while Palantir was co-founded in 2003 by Mr. Thiel, who was an initial investor in Facebook. . . .”
As a start-up called Cambridge Analytica [24] sought to harvest the Facebook data of tens of millions of Americans in summer 2014, the company received help from at least one employee at Palantir Technologies, a top Silicon Valley contractor to American spy agencies and the Pentagon. It was a Palantir employee in London, working closely with the data scientists building Cambridge’s psychological profiling technology, who suggested the scientists create their own app — a mobile-phone-based personality quiz — to gain access to Facebook users’ friend networks, according to documents obtained by The New York Times.
Cambridge ultimately took a similar approach. By early summer, the company found a university researcher to harvest data using a personality questionnaire and Facebook app. The researcher scraped private data from over 50 million Facebook users — and Cambridge Analytica [25] went into business selling so-called psychometric profiles of American voters, setting itself on a collision course with regulators and lawmakers in the United States and Britain.
The revelations pulled Palantir — co-founded by the wealthy libertarian Peter Thiel [13] — into the furor surrounding Cambridge, which improperly obtained Facebook data to build analytical tools it deployed on behalf of Donald J. Trump and other Republican candidates in 2016. Mr. Thiel, a supporter of President Trump, serves on the board at Facebook.
“There were senior Palantir employees that were also working on the Facebook data,” said Christopher Wylie [14], a data expert and Cambridge Analytica co-founder, in testimony before British lawmakers on Tuesday. . . .
. . . .The connections between Palantir and Cambridge Analytica were thrust into the spotlight by Mr. Wylie’s testimony on Tuesday. Both companies are linked to tech-driven billionaires who backed Mr. Trump’s campaign: Cambridge is chiefly owned by Robert Mercer, the computer scientist and hedge fund magnate, while Palantir was co-founded in 2003 by Mr. Thiel, who was an initial investor in Facebook. . . .
. . . . Documents and interviews indicate that starting in 2013, Mr. Chmieliauskas began corresponding with Mr. Wylie and a colleague from his Gmail account. At the time, Mr. Wylie and the colleague worked for the British defense and intelligence contractor SCL Group, which formed Cambridge Analytica with Mr. Mercer the next year. The three shared Google documents to brainstorm ideas about using big data to create sophisticated behavioral profiles, a product code-named “Big Daddy.”
A former intern at SCL — Sophie Schmidt, the daughter of Eric Schmidt, then Google’s executive chairman — urged the company to link up with Palantir, according to Mr. Wylie’s testimony and a June 2013 email viewed by The Times.
“Ever come across Palantir. Amusingly Eric Schmidt’s daughter was an intern with us and is trying to push us towards them?” one SCL employee wrote to a colleague in the email.
. . . . But he [Wylie] said some Palantir employees helped engineer Cambridge’s psychographic models.
“There were Palantir staff who would come into the office and work on the data,” Mr. Wylie told lawmakers. “And we would go and meet with Palantir staff at Palantir.” He did not provide an exact number for the employees or identify them.
Palantir employees were impressed with Cambridge’s backing from Mr. Mercer, one of the world’s richest men, according to messages viewed by The Times. And Cambridge Analytica viewed Palantir’s Silicon Valley ties as a valuable resource for launching and expanding its own business.
In an interview this month with The Times, Mr. Wylie said that Palantir employees were eager to learn more about using Facebook data and psychographics. Those discussions continued through spring 2014, according to Mr. Wylie.
Mr. Wylie said that he and Mr. Nix visited Palantir’s London office on Soho Square. One side was set up like a high-security office, Mr. Wylie said, with separate rooms that could be entered only with particular codes. The other side, he said, was like a tech start-up — “weird inspirational quotes and stuff on the wall and free beer, and there’s a Ping-Pong table.”
Mr. Chmieliauskas continued to communicate with Mr. Wylie’s team in 2014, as the Cambridge employees were locked in protracted negotiations with a researcher at Cambridge University, Michal Kosinski, to obtain Facebook data through an app Mr. Kosinski had built. The data was crucial to efficiently scale up Cambridge’s psychometrics products so they could be used in elections and for corporate clients. . . .
2a. There are indications that elements in and/or associated with CIA and Pentagon/DARPA were involved with Facebook almost from the beginning: ” . . . . Facebook’s most recent round of funding was led by a company called Greylock Venture Capital, who put in the sum of $27.5m. One of Greylock’s senior partners is called Howard Cox, another former chairman of the NVCA, who is also on the board of In-Q-Tel. What’s In-Q-Tel? Well, believe it or not (and check out their website), this is the venture-capital wing of the CIA. After 9/11, the US intelligence community became so excited by the possibilities of new technology and the innovations being made in the private sector, that in 1999 they set up their own venture capital fund, In-Q-Tel, which ‘identifies and partners with companies developing cutting-edge technologies to help deliver these solutions to the Central Intelligence Agency and the broader US Intelligence Community (IC) to further their missions’. . . .”
“With Friends Like These . . .” by Tom Hodgkinson; guardian.co.uk; 1/14/2008. [17]
. . . . The third board member of Facebook is Jim Breyer. He is a partner in the venture capital firm Accel Partners, who put $12.7m into Facebook in April 2005. On the board of such US giants as Wal-Mart and Marvel Entertainment, he is also a former chairman of the National Venture Capital Association (NVCA). Now these are the people who are really making things happen in America, because they invest in the new young talent, the Zuckerbergs and the like. Facebook’s most recent round of funding was led by a company called Greylock Venture Capital, who put in the sum of $27.5m. One of Greylock’s senior partners is called Howard Cox, another former chairman of the NVCA, who is also on the board of In-Q-Tel. What’s In-Q-Tel? Well, believe it or not (and check out their website), this is the venture-capital wing of the CIA. After 9/11, the US intelligence community became so excited by the possibilities of new technology and the innovations being made in the private sector, that in 1999 they set up their own venture capital fund, In-Q-Tel, which “identifies and partners with companies developing cutting-edge technologies to help deliver these solutions to the Central Intelligence Agency and the broader US Intelligence Community (IC) to further their missions”. . . .
2b. More about the CIA/Pentagon link to the development of Facebook: ” . . . . The second round of funding into Facebook ($US12.7 million) came from venture capital firm Accel Partners. Its manager James Breyer was formerly chairman of the National Venture Capital Association, and served on the board with Gilman Louie, CEO of In-Q-Tel, a venture capital firm established by the Central Intelligence Agency in 1999. One of the company’s key areas of expertise is ‘data mining technologies’. Breyer also served on the board of R&D firm BBN Technologies, which was one of those companies responsible for the rise of the internet. Dr Anita Jones joined the firm, which included Gilman Louie. She had also served on In-Q-Tel’s board, and had been director of Defence Research and Engineering for the US Department of Defence. She was also an adviser to the Secretary of Defence, overseeing the Defence Advanced Research Projects Agency (DARPA), which is responsible for high-tech, high-end development. . . .”
“Facebook–the CIA Conspiracy” by Matt Greenop; The New Zealand Herald; 8/8/2007. [26]
. . . . Facebook’s first round of venture capital funding ($US500,000) came from former Paypal CEO Peter Thiel. Author of anti-multicultural tome ‘The Diversity Myth’, he is also on the board of radical conservative group VanguardPAC.
The second round of funding into Facebook ($US12.7 million) came from venture capital firm Accel Partners. Its manager James Breyer was formerly chairman of the National Venture Capital Association, and served on the board with Gilman Louie, CEO of In-Q-Tel, a venture capital firm established by the Central Intelligence Agency in 1999. One of the company’s key areas of expertise is “data mining technologies”.
Breyer also served on the board of R&D firm BBN Technologies, which was one of those companies responsible for the rise of the internet.
Dr Anita Jones joined the firm, which included Gilman Louie. She had also served on In-Q-Tel’s board, and had been director of Defence Research and Engineering for the US Department of Defence.
She was also an adviser to the Secretary of Defence, overseeing the Defence Advanced Research Projects Agency (DARPA), which is responsible for high-tech, high-end development. . . .
3. Facebook wants to read your thoughts.
- ” . . . Facebook wants to build its own “brain-to-computer interface” that would allow us to send thoughts straight to a computer. ‘What if you could type directly from your brain?’ Regina Dugan, the head of the company’s secretive hardware R&D division, Building 8, asked from the stage. Dugan then proceeded to show a video demo of a woman typing eight words per minute directly from the stage. In a few years, she said, the team hopes to demonstrate a real-time silent speech system capable of delivering a hundred words per minute. ‘That’s five times faster than you can type on your smartphone, and it’s straight from your brain,’ she said. ‘Your brain activity contains more information than what a word sounds like and how it’s spelled; it also contains semantic information of what those words mean.’ . . .”
- ” . . . . Brain-computer interfaces are nothing new. DARPA, which Dugan used to head, has invested heavily [27] in brain-computer interface technologies to do things like cure mental illness and restore memories to soldiers injured in war. But what Facebook is proposing is perhaps more radical—a world in which social media doesn’t require picking up a phone or tapping a wrist watch in order to communicate with your friends; a world where we’re connected all the time by thought alone. . . .”
- ” . . . . Facebook’s Building 8 is modeled after DARPA and its projects tend to be equally ambitious. . . .”
“Facebook Literally Wants to Read Your Thoughts” by Kristen V. Brown; Gizmodo; 4/19/2017. [18]
At Facebook’s annual developer conference, F8, on Wednesday, the group unveiled what may be Facebook’s most ambitious—and creepiest—proposal yet. Facebook wants to build its own “brain-to-computer interface” that would allow us to send thoughts straight to a computer.
“What if you could type directly from your brain?” Regina Dugan, the head of the company’s secretive hardware R&D division, Building 8, asked from the stage. Dugan then proceeded to show a video demo of a woman typing eight words per minute directly from the stage. In a few years, she said, the team hopes to demonstrate a real-time silent speech system capable of delivering a hundred words per minute.
“That’s five times faster than you can type on your smartphone, and it’s straight from your brain,” she said. “Your brain activity contains more information than what a word sounds like and how it’s spelled; it also contains semantic information of what those words mean.”
Brain-computer interfaces are nothing new. DARPA, which Dugan used to head, has invested heavily [27] in brain-computer interface technologies to do things like cure mental illness and restore memories to soldiers injured in war. But what Facebook is proposing is perhaps more radical—a world in which social media doesn’t require picking up a phone or tapping a wrist watch in order to communicate with your friends; a world where we’re connected all the time by thought alone.
“Our world is both digital and physical,” she said. “Our goal is to create and ship new, category-defining consumer products that are social first, at scale.”
She also showed a video that demonstrated a second technology that showed the ability to “listen” to human speech through vibrations on the skin. This tech has been in development to aid people with disabilities, working a little like a Braille that you feel with your body rather than your fingers. Using actuators and sensors, a connected armband was able to convey to a woman in the video a tactile vocabulary of nine different words.
Dugan added that it’s also possible to “listen” to human speech using only the skin—like Braille, but through a system of actuators and sensors. She showed a video example of how a woman could figure out exactly what objects were selected on a touchscreen based on inputs delivered through a connected armband.
Facebook’s Building 8 is modeled after DARPA and its projects tend to be equally ambitious. Brain-computer interface technology is still in its infancy. So far, researchers have been successful in using it to allow people with disabilities to control paralyzed or prosthetic limbs. But stimulating the brain’s motor cortex is a lot simpler than reading a person’s thoughts and then translating those thoughts into something that might actually be read by a computer.
The end goal is to build an online world that feels more immersive and real—no doubt so that you spend more time on Facebook.
“Our brains produce enough data to stream 4 HD movies every second. The problem is that the best way we have to get information out into the world — speech — can only transmit about the same amount of data as a 1980s modem,” CEO Mark Zuckerberg said in a Facebook post. “We’re working on a system that will let you type straight from your brain about 5x faster than you can type on your phone today. Eventually, we want to turn it into a wearable technology that can be manufactured at scale. Even a simple yes/no ‘brain click’ would help make things like augmented reality feel much more natural.”
…
4. The broadcast then reviews (from FTR #1074 [20]) Facebook’s inextricable link with the Hindutva fascist BJP of Narendra Modi:
Key elements of discussion and analysis include:
- Indian politics has been largely dominated [28] by fake news, spread by social media: ” . . . . In the continuing Indian elections, as 900 million people are voting to elect representatives to the lower house of the Parliament, disinformation and hate speech are drowning out truth on social media networks in the country and creating a public health crisis like the pandemics of the past century. This contagion of a staggering amount of morphed images, doctored videos and text messages is spreading largely through messaging services and influencing what India’s voters watch and read on their smartphones. A recent study by Microsoft [29] found that over 64 percent Indians encountered fake news online, the highest reported among the 22 countries surveyed. . . . These platforms are filled with fake news and disinformation aimed at influencing political choices during the Indian elections. . . . ”
- Narendra Modi’s Hindutva fascist BJP has been the primary beneficiary [28] of fake news, and his regime has partnered with Facebook: ” . . . . The hearing was an exercise in absurdist theater [30] because the governing B.J.P. has been the chief beneficiary of divisive content that reaches millions because of the way social media algorithms, especially Facebook, amplify ‘engaging’ articles. . . .”
- Rajesh Jain is among those BJP functionaries who serve Facebook [31], as well as the Hindutva fascists: ” . . . . By the time Rajesh Jain was scaling up his operations in 2013, the BJP’s information technology (IT) strategists had begun interacting with social media platforms like Facebook and its partner WhatsApp. If supporters of the BJP are to be believed, the party was better than others in utilising the micro-targeting potential of the platforms. However, it is also true that Facebook’s employees in India conducted training workshops to help the members of the BJP’s IT cell. . . .”
- Dr. Hiren Joshi is another [31] of the BJP operatives who is heavily involved with Facebook. ” . . . . Also assisting the social media and online teams to build a larger-than-life image for Modi before the 2014 elections was a team led by his right-hand man Dr Hiren Joshi, who (as already stated) is a very important adviser to Modi whose writ extends way beyond information technology and social media. . . . Joshi has had, and continues to have, a close and long-standing association with Facebook’s senior employees in India. . . .”
- Shivnath Thukral, who was hired by Facebook in 2017 [32] to be its Public Policy Director for India & South Asia, worked with Joshi’s team in 2014. ” . . . . The third team, that was intensely focused on building Modi’s personal image, was headed by Hiren Joshi himself who worked out of the then Gujarat Chief Minister’s Office in Gandhinagar. The members of this team worked closely with staffers of Facebook in India, more than one of our sources told us. As will be detailed later, Shivnath Thukral, who is currently an important executive in Facebook, worked with this team. . . .”
- An ostensibly remorseful BJP politician–Prodyut Bora–highlighted [31] the dramatic effect that Facebook and its WhatsApp subsidiary have had on India’s politics: ” . . . . In 2009, social media platforms like Facebook and WhatsApp had a marginal impact in India’s 20 big cities. By 2014, however, it had virtually replaced the traditional mass media. In 2019, it will be the most pervasive media in the country. . . .”
- A concise statement about the relationship between the BJP and Facebook was issued by BJP tech officer Vinit Goenka [31]: ” . . . . At one stage in our interview with [Vinit] Goenka that lasted over two hours, we asked him a pointed question: ‘Who helped whom more, Facebook or the BJP?’ He smiled and said: ‘That’s a difficult question. I wonder whether the BJP helped Facebook more than Facebook helped the BJP. You could say, we helped each other.’ . . .”
5. In Ukraine, as well, Facebook and the OUN/B successor organizations function symbiotically:
CrowdStrike, at the epicenter [33] of the supposed Russian hacking controversy, is noteworthy. Its co-founder and chief technology officer, Dmitri Alperovitch, is a senior fellow at the Atlantic Council, which is financed by elements that are at the foundation of fanning the flames of the New Cold War: “In this respect, it is worth noting that one of the commercial cybersecurity companies the government has relied on is Crowdstrike, which was one of the companies initially brought in by the DNC to investigate the alleged hacks. . . . Dmitri Alperovitch [34] is also a senior fellow at the Atlantic Council. . . . The connection between [Crowdstrike co-founder and chief technology officer Dmitri] Alperovitch and the Atlantic Council has gone largely unremarked upon, but it is relevant given that the Atlantic Council—which is funded in part [35] by the US State Department, NATO, the governments of Latvia and Lithuania, the Ukrainian World Congress, and the Ukrainian oligarch Victor Pinchuk—has been among the loudest voices calling for a new Cold War with Russia. As I pointed out in the pages of The Nation in November, the Atlantic Council has spent the past several years producing some of the most virulent specimens of the new Cold War propaganda. . . . ”
(Note that the Atlantic Council is dominant in the array of individuals and institutions constituting the Ukrainian fascist/Facebook cooperative effort. We have spoken about the Atlantic Council in numerous programs, including FTR #943 [36]. The organization has deep operational links to elements of U.S. intelligence, as well as the OUN/B milieu that dominates the Ukrainian diaspora.)
In May of 2018, Facebook decided to effectively outsource the work of identifying propaganda and misinformation during elections to the Atlantic Council. [19]
” . . . . Facebook [37] is partnering with the Atlantic Council in another effort to combat election-related propaganda and misinformation from proliferating on its service. The social networking giant said Thursday that a partnership with the Washington D.C.-based think tank would help it better spot disinformation during upcoming world elections. The partnership is one of a number of steps Facebook is taking to prevent the spread of propaganda and fake news after failing to stop it [38] from spreading on its service in the run up to the 2016 U.S. presidential election. . . .”
Since autumn 2018, Facebook has looked to hire a public policy manager for Ukraine. The job came after years of Ukrainians criticizing the platform for takedowns of its activists’ pages and the spread of [alleged] Russian disinfo targeting Kyiv. Now, it appears to have one: @Kateryna_Kruk [39].— Christopher Miller (@ChristopherJM) June 3, 2019 [40]
Kateryna Kruk:
- Is Facebook’s Public Policy Manager for Ukraine as of May 2019, according to her LinkedIn page [42].
- Worked as an analyst and TV host for the Ukrainian ‘anti-Russian propaganda’ outfit StopFake. StopFake is the creation of Irena Chalupa, who works for the Atlantic Council and the Ukrainian government [43] and appears to be the sister of Andrea and Alexandra Chalupa [36].
- Joined the “Kremlin Watch” team at the European Values think-tank [44], in October of 2017.
- Received the Atlantic Council’s Freedom Award in June of 2014 for her communications work during the Euromaidan protests [45].
- Worked for OUN/B successor organization Svoboda during the Euromaidan [46] protests: “ . . . ‘There are people who don’t support Svoboda because of some of their slogans, but they know it’s the most active political party and go to them for help,’ said Svoboda volunteer Kateryna Kruk. . . .”
- Also has a number of articles on the Atlantic Council’s Blog. Here’s a blog post from August of 2018 where she advocates for the creation of an independent Ukrainian Orthodox Church to diminish the influence of the Russian Orthodox Church [47].
- According to her LinkedIn page [42] has also done extensive work for the Ukrainian government. From March 2016 to January 2017 she was the Strategic Communications Manager for the Ukrainian parliament where she was responsible for social media and international communications. From January-April 2017 she was the Head of Communications at the Ministry of Health.
- Not only was she a volunteer for Svoboda during the 2014 Euromaidan protests, but she openly celebrated on Twitter the May 2014 massacre in Odessa, when the far right burned dozens of protestors alive. Kruk’s Twitter feed is set to private now, so there isn’t public access to her old tweet, but people have screen captures of it. Here’s a tweet [48] from Yasha Levine with a screenshot of Kruk’s May 2, 2014 tweet, in which she writes: “#Odessa cleaned itself from terrorists, proud for city fighting for its identity.glory to fallen heroes..” She even threw in a “glory to fallen heroes” at the end of her tweet celebrating this massacre. Keep in mind that it was a month after this tweet that the Atlantic Council gave her that Freedom Award for her communications work during the protests.
- In 2014 [49], Kruk tweeted that a man had asked her to convince his grandson not to join the Azov Battalion, a neo-Nazi militia. “I couldn’t do it,” she said. “I thanked that boy and blessed him [50].” The grandson then traveled to Luhansk to fight pro-Russian rebels.
- Lionized [49] a Nazi sniper killed in Ukraine’s civil war. In March 2018, a 19-year-old neo-Nazi named Andriy “Dilly” Krivich [51] was shot and killed by a sniper. Krivich had been fighting with the fascist Ukrainian group Right Sector, and had posted photos on social media wearing Nazi German symbols. After he was killed, Kruk tweeted an homage to the teenage Nazi [52]. (The Nazi was also lionized on Euromaidan Press’ Facebook page [53].)
- Has [49] staunchly defended the use of the slogan “Slava Ukraini” [54], which was first coined and popularized [55] by Nazi-collaborating fascists [56], and is now the official salute of Ukraine’s army [57].
- Has [49] also said that the Ukrainian fascist politician Andriy Parubiy [58], who co-founded a neo-Nazi party before later becoming chairman of Ukraine’s parliament, the Rada, is “acting smart [59],” writing, “Parubiy touche [60].” . . . .