Dave Emory’s entire lifetime of work is available on a flash drive that can be obtained HERE [1]. The new drive is a 32-gigabyte drive that is current as of the programs and articles posted by the fall of 2017. The new drive is available for a tax-deductible contribution of $65.00 or more.
WFMU-FM is podcasting For The Record–You can subscribe to the podcast HERE [2].
You can subscribe to e‑mail alerts from Spitfirelist.com HERE [3].
You can subscribe to RSS feed from Spitfirelist.com HERE [3].
You can subscribe to the comments made on programs and posts–an excellent source of information in, and of, itself, HERE [4].
Please consider supporting THE WORK DAVE EMORY DOES [5].
This broadcast was recorded in one, 60-minute segment [6].
Introduction: We have spoken repeatedly about the Nazi tract Serpent’s Walk [7], in which the Third Reich goes underground, buys into the opinion-forming media and, eventually, takes over.
Hitler, the Third Reich and their actions are glorified and memorialized. The essence of the book is synopsized on the back cover:
“It assumes that Hitler’s warrior elite — the SS — didn’t give up their struggle for a White world when they lost the Second World War. Instead their survivors went underground and adopted some of the tactics of their enemies: they began building their economic muscle and buying into the opinion-forming media. A century after the war they are ready to challenge the democrats and Jews for the hearts and minds of White Americans, who have begun to have their fill of government-enforced multi-culturalism and ‘equality.’ ”
Something analogous is happening in Ukraine and India.
In FTR #889 [8], we noted that Pierre Omidyar, a darling of the so-called “progressive” sector for his founding of The Intercept, was deeply involved with the financing of the ascent of both Narendra Modi’s Hindutva fascist BJP and the OUN/B successor organizations in Ukraine.
Omidyar’s anointment [9] as an icon of investigative reporting could not be more ironic, in that journalists and critics of his fascist allies in Ukraine and India are being repressed and murdered, thereby furthering the suppression of truth in those societies. This suppression of truth feeds into the Serpent’s Walk scenario.
This program supplements past coverage of Facebook in FTR #‘s 718 [10], 946 [11], 1021 [12] and 1039 [13], noting how Facebook has networked with the very Hindutva fascist Indian elements and OUN/B successor organizations in Ukraine discussed in those programs. This networking has been undertaken ostensibly to combat fake news. The reality may well be that the Facebook/BJP-RSS/OUN/B links generate fake news, rather than interdicting it. The fake news so generated, however, will be to the liking of the fascists in power in both countries, manifesting as a “Serpent’s Walk” revisionist scenario.
Key elements of discussion and analysis include:
- Indian politics has been largely dominated [14] by fake news, spread by social media: ” . . . . In the continuing Indian elections, as 900 million people are voting to elect representatives to the lower house of the Parliament, disinformation and hate speech are drowning out truth on social media networks in the country and creating a public health crisis like the pandemics of the past century. This contagion of a staggering amount of morphed images, doctored videos and text messages is spreading largely through messaging services and influencing what India’s voters watch and read on their smartphones. A recent study by Microsoft [15] found that over 64 percent of Indians encountered fake news online, the highest reported among the 22 countries surveyed. . . . These platforms are filled with fake news and disinformation aimed at influencing political choices during the Indian elections. . . . ”
- Narendra Modi’s Hindutva fascist BJP has been the primary beneficiary [14] of fake news, and his regime has partnered with Facebook: ” . . . . The hearing was an exercise in absurdist theater [16] because the governing B.J.P. has been the chief beneficiary of divisive content that reaches millions because of the way social media algorithms, especially Facebook, amplify ‘engaging’ articles. . . .”
- Rajesh Jain is among those BJP functionaries who serve Facebook [17], as well as the Hindutva fascists: ” . . . . By the time Rajesh Jain was scaling up his operations in 2013, the BJP’s information technology (IT) strategists had begun interacting with social media platforms like Facebook and its partner WhatsApp. If supporters of the BJP are to be believed, the party was better than others in utilising the micro-targeting potential of the platforms. However, it is also true that Facebook’s employees in India conducted training workshops to help the members of the BJP’s IT cell. . . .”
- Dr. Hiren Joshi is another [17] of the BJP operatives who is heavily involved with Facebook. ” . . . . Also assisting the social media and online teams to build a larger-than-life image for Modi before the 2014 elections was a team led by his right-hand man Dr Hiren Joshi, who (as already stated) is a very important adviser to Modi whose writ extends way beyond information technology and social media. . . . Joshi has had, and continues to have, a close and long-standing association with Facebook’s senior employees in India. . . .”
- Shivnath Thukral, who was hired by Facebook in 2017 [18] to be its Public Policy Director for India & South Asia, worked with Joshi’s team in 2014. ” . . . . The third team, that was intensely focused on building Modi’s personal image, was headed by Hiren Joshi himself who worked out of the then Gujarat Chief Minister’s Office in Gandhinagar. The members of this team worked closely with staffers of Facebook in India, more than one of our sources told us. As will be detailed later, Shivnath Thukral, who is currently an important executive in Facebook, worked with this team. . . .”
- An ostensibly remorseful BJP politician–Prodyut Bora–highlighted [17] the dramatic effect Facebook and its WhatsApp subsidiary have had on India’s politics: ” . . . . In 2009, social media platforms like Facebook and WhatsApp had a marginal impact in India’s 20 big cities. By 2014, however, it had virtually replaced the traditional mass media. In 2019, it will be the most pervasive media in the country. . . .”
- A concise statement about the relationship between the BJP and Facebook was made by BJP IT cell official Vinit Goenka [17]: ” . . . . At one stage in our interview with [Vinit] Goenka that lasted over two hours, we asked him a pointed question: ‘Who helped whom more, Facebook or the BJP?’ He smiled and said: ‘That’s a difficult question. I wonder whether the BJP helped Facebook more than Facebook helped the BJP. You could say, we helped each other.’ . . .”
In Ukraine, as well, Facebook and the OUN/B successor organizations function symbiotically:
(Note that the Atlantic Council is dominant in the array of individuals and institutions constituting the Ukrainian fascist/Facebook cooperative effort. We have spoken about the Atlantic Council in numerous programs, including FTR #943 [19]. The organization has deep operational links to elements of U.S. intelligence, as well as the OUN/B milieu that dominates the Ukrainian diaspora.)
CrowdStrike–at the epicenter [20] of the supposed Russian hacking controversy–is noteworthy. Its co-founder and chief technology officer, Dmitri Alperovitch, is a senior fellow at the Atlantic Council, which is financed by elements that are at the foundation of fanning the flames of the New Cold War: “In this respect, it is worth noting that one of the commercial cybersecurity companies the government has relied on is Crowdstrike, which was one of the companies initially brought in by the DNC to investigate the alleged hacks. . . . Dmitri Alperovitch [21] is also a senior fellow at the Atlantic Council. . . . The connection between [Crowdstrike co-founder and chief technology officer Dmitri] Alperovitch and the Atlantic Council has gone largely unremarked upon, but it is relevant given that the Atlantic Council—which is funded in part [22] by the US State Department, NATO, the governments of Latvia and Lithuania, the Ukrainian World Congress, and the Ukrainian oligarch Victor Pinchuk—has been among the loudest voices calling for a new Cold War with Russia. As I pointed out in the pages of The Nation in November, the Atlantic Council has spent the past several years producing some of the most virulent specimens of the new Cold War propaganda. . . . ”
In May of 2018, Facebook decided to effectively outsource the work of identifying propaganda and misinformation during elections to the Atlantic Council. [23]
” . . . . Facebook [24] is partnering with the Atlantic Council in another effort to combat election-related propaganda and misinformation from proliferating on its service. The social networking giant said Thursday that a partnership with the Washington D.C.-based think tank would help it better spot disinformation during upcoming world elections. The partnership is one of a number of steps Facebook is taking to prevent the spread of propaganda and fake news after failing to stop it [25] from spreading on its service in the run up to the 2016 U.S. presidential election. . . .”
Since autumn 2018, Facebook has looked to hire a public policy manager for Ukraine. The job came after years of Ukrainians criticizing the platform for takedowns of its activists’ pages and the spread of [alleged] Russian disinfo targeting Kyiv. Now, it appears to have one: @Kateryna_Kruk [26].— Christopher Miller (@ChristopherJM) June 3, 2019 [27]
Kateryna Kruk:
- Is Facebook’s Public Policy Manager for Ukraine as of May of this year, according to her LinkedIn page [28].
- Worked as an analyst and TV host for the Ukrainian ‘anti-Russian propaganda’ outfit StopFake. StopFake is the creation of Irena Chalupa, who works for the Atlantic Council and the Ukrainian government [29] and appears to be the sister of Andrea and Alexandra Chalupa [19].
- Joined the “Kremlin Watch” team at the European Values think-tank [30], in October of 2017.
- Received the Atlantic Council’s Freedom award for her communications work during the Euromaidan protests [31] in June of 2014.
- Worked for OUN/B successor organization Svoboda during the Euromaidan [32] protests: “ . . . ‘There are people who don’t support Svoboda because of some of their slogans, but they know it’s the most active political party and go to them for help,’ said Svoboda volunteer Kateryna Kruk. . . .”
- Also has a number of articles on the Atlantic Council’s Blog. Here’s a blog post from August of 2018 where she advocates for the creation of an independent Ukrainian Orthodox Church to diminish the influence of the Russian Orthodox Church [33].
- According to her LinkedIn page [28] has also done extensive work for the Ukrainian government. From March 2016 to January 2017 she was the Strategic Communications Manager for the Ukrainian parliament where she was responsible for social media and international communications. From January-April 2017 she was the Head of Communications at the Ministry of Health.
- Was not only a volunteer for Svoboda during the 2014 Euromaidan protests, but also openly celebrated on Twitter the May 2014 massacre in Odessa, when the far right burned dozens of protestors alive. Kruk’s Twitter feed is set to private now, so there isn’t public access to her old tweets, but people have screen captures of them. Here’s a tweet [34] from Yasha Levine with a screenshot of Kruk’s May 2, 2014 tweet, where she writes: “#Odessa cleaned itself from terrorists, proud for city fighting for its identity.glory to fallen heroes..” She even threw in a “glory to fallen heroes” at the end of her tweet celebrating this massacre. Keep in mind that it was a month after this tweet that the Atlantic Council gave her that Freedom Award for her communications work during the protests.
- In 2014 [35], . . . tweeted that a man had asked her to convince his grandson not to join the Azov Battalion, a neo-Nazi militia. “I couldn’t do it,” she said. “I thanked that boy and blessed him [36].” And he then traveled to Luhansk to fight pro-Russian rebels.
- Lionized [35] a Nazi sniper killed in Ukraine’s civil war. In March 2018, a 19-year-old neo-Nazi named Andriy “Dilly” Krivich [37] was shot and killed by a sniper. Krivich had been fighting with the fascist Ukrainian group Right Sector, and had posted photos on social media wearing Nazi German symbols. After he was killed, Kruk tweeted an homage to the teenage Nazi [38]. (The Nazi was also lionized on Euromaidan Press’ Facebook page [39].)
- Has [35] staunchly defended the use of the slogan “Slava Ukraini” [40], which was first coined and popularized [41] by Nazi-collaborating fascists [42], and is now the official salute of Ukraine’s army [43].
- Has [35] also said that the Ukrainian fascist politician Andriy Parubiy [44], who co-founded a neo-Nazi party before later becoming the chairman of Ukraine’s parliament, the Rada, is “acting smart [45],” writing, “Parubiy touche [46].” . . . .
In the context of Facebook’s institutional level networking with fascists, it is worth noting that social media themselves have been cited [47] as a contributing factor to right-wing domestic terrorism. ” . . . The first is stochastic terrorism [48]: ‘The use of mass, public communication, usually against a particular individual or group, which incites or inspires acts of terrorism which are statistically probable but happen seemingly at random.’ I encountered the idea in a Friday thread from data scientist Emily Gorcenski [49], who used it to tie together four recent attacks. . . . .”
The program concludes with review [50] (from FTR #1039 [13]) of the psychological warfare strategy adapted by Cambridge Analytica to the political arena. Christopher Wylie–the former head of research at Cambridge Analytica who became one of the key insider whistle-blowers about how Cambridge Analytica operated and the extent of Facebook’s knowledge about it–gave an interview to Campaign Magazine. (We dealt with Cambridge Analytica in FTR #‘s 946 [11], 1021 [12].) Wylie recounts how, as director of research at Cambridge Analytica, his original role was to determine how the company could use the information warfare techniques used by SCL Group – Cambridge Analytica’s parent company and a defense contractor providing psy op services for the British military. Wylie’s job was to adapt the psychological warfare strategies that SCL had been using on the battlefield to the online space. As Wylie put it:
“ . . . . When you are working in information operations projects, where your target is a combatant, the autonomy or agency of your targets is not your primary consideration. It is fair game to deny and manipulate information, coerce and exploit any mental vulnerabilities a person has, and to bring out the very worst characteristics in that person because they are an enemy…But if you port that over to a democratic system, if you run campaigns designed to undermine people’s ability to make free choices and to understand what is real and not real, you are undermining democracy and treating voters in the same way as you are treating terrorists. . . . .”
Wylie also draws parallels between the psychological operations used on democratic audiences and the battlefield techniques used to build an insurgency.
1a. Following the sweeping victory of the BJP in India’s elections, which exceeded expectations [51], there is no shortage of questions about how the BJP managed such a resounding victory despite what appeared to be growing popular frustration with the party just six months ago [52]. And while the embrace of nationalism and sectarianism no doubt played a major role, along with the tensions with Pakistan, it’s also important to credit the profound role social media played in this year’s elections. Specifically, organized social media disinformation campaigns run by the BJP [14]:
In the continuing Indian elections, as 900 million people are voting to elect representatives to the lower house of the Parliament, disinformation and hate speech are drowning out truth on social media networks in the country and creating a public health crisis like the pandemics of the past century.
This contagion of a staggering amount of morphed images, doctored videos and text messages is spreading largely through messaging services and influencing what India’s voters watch and read on their smartphones. A recent study by Microsoft [15] found that over 64 percent of Indians encountered fake news online, the highest reported among the 22 countries surveyed.
India has the most social media users, with 300 million users on Facebook, 200 million on WhatsApp and 250 million using YouTube. TikTok, the video messaging service owned by a Chinese company, has more than 88 million users [53] in India. And there are Indian messaging applications such as ShareChat, which claims to have 40 million users and allows them to communicate in 14 Indian languages.
These platforms are filled with fake news and disinformation aimed at influencing political choices during the Indian elections. Some of the egregious instances are a made-up BBC survey [54] predicting victory for the governing Bharatiya Janata Party and a fake video of the opposition Congress Party president, Rahul Gandhi, saying a machine can convert potatoes into gold [55].
Fake stories are spread by legions of online trolls and unsuspecting users, with dangerous impact. A rumor spread through social media about child kidnappers arriving in various parts of India has led to 33 deaths in 69 incidents of mob violence [56] since 2017, according to IndiaSpend, a data journalism website.
Six months before the 2014 general elections in India, 62 people were killed in sectarian violence and 50,000 were displaced from their homes in the northern state of Uttar Pradesh. Investigations by the police found that a fake video was shared [57] on WhatsApp to whip up sectarian passions.
In the lead-up to the elections, the Indian government summoned the top executives of Facebook and Twitter to discuss the crisis of coordinated misinformation, fake news and political bias on their platforms. In March, Joel Kaplan, Facebook’s global vice president for public policy, was called to appear before a committee of 31 members of the Indian Parliament — who were mostly from the ruling Bharatiya Janata Party — to discuss “safeguarding [58] citizens’ rights on social/online news media platforms.”
The hearing was an exercise in absurdist theater [16] because the governing B.J.P. has been the chief beneficiary of divisive content that reaches millions because of the way social media algorithms, especially Facebook, amplify “engaging” articles.
As elsewhere in the world, Facebook, Twitter and YouTube are ambivalent about tackling the problem head-on for fear of making decisions that invoke the wrath of national political forces [16]. The tightrope walk was evident when in April, Facebook announced a ban on about 1,000 fake news pages targeting India. They included pages directly associated with political parties.
Facebook announced that a majority of the pages were associated with the opposition Indian National Congress party, but it merely named the technology company associated with the governing B.J.P. pages. Many news reports later pointed [59] out that the pages related to the B.J.P. that were removed were far more consequential and reached millions.
Asking the social media platforms to fix the crisis is a deeply flawed approach because most of the disinformation is shared in a decentralized manner through messaging. Seeking to monitor those messages is a step toward accepting mass surveillance. The Indian government loves the idea and has proposed laws [60] that, among other things, would break end-to-end encryption and obtain user data without a court order.
The idea of more effective fact-checking has come up often in the debates around India’s disinformation contagion. But it comes with many conceptual difficulties: A large proportion of messages shared on social networks in India have little to do with verifiable facts and peddle prejudiced opinions. Facebook India has a small 11- to 22 [61]-member fact-checking team for content related to Indian elections.
Fake news is not a technological or scientific problem with a quick fix. It should be treated as a new kind of public health crisis in all its social and human complexity. The answer might lie in looking back at how we responded to the epidemics, the infectious diseases in the 19th and early 20th centuries, which have similar characteristics. . . .
1b. As the following article notes, the BJP government’s request that Facebook help with the disinformation crisis is made all the more farcical by the fact that Facebook had previously conducted training workshops to help the BJP use the platform more effectively. The article describes the four IT-cell teams that the BJP set up for the 2014 election to build a larger-than-life image for Modi.
One of those teams was run by Modi’s right-hand man Dr Hiren Joshi. Joshi has had, and continues to have, a close and long-standing association with Facebook’s senior employees in India, according to the article. Joshi’s team worked closely with Facebook’s staff. Shivnath Thukral, who was hired by Facebook in 2017 to be its Public Policy Director for India & South Asia [18], worked with this team in 2014. And that’s just an overview of how tightly Facebook was working with the BJP in 2014 [17]:
By the time Rajesh Jain was scaling up his operations in 2013, the BJP’s information technology (IT) strategists had begun interacting with social media platforms like Facebook and its partner WhatsApp. If supporters of the BJP are to be believed, the party was better than others in utilising the micro-targeting potential of the platforms. However, it is also true that Facebook’s employees in India conducted training workshops to help the members of the BJP’s IT cell.
Helping party functionaries were advertising honchos like Sajan Raj Kurup, founder of Creativeland Asia and Prahlad Kakkar, the well-known advertising professional. Actor Anupam Kher became the public face of some of the advertising campaigns. Also assisting the social media and online teams to build a larger-than-life image for Modi before the 2014 elections was a team led by his right-hand man Dr Hiren Joshi, who (as already stated) is a very important adviser to Modi whose writ extends way beyond information technology and social media.
Currently, Officer On Special Duty in the Prime Minister’s Office, he is assisted by two young professional “techies,” Nirav Shah and Yash Rajiv Gandhi. Joshi has had, and continues to have, a close and long-standing association with Facebook’s senior employees in India. In 2013, one of his important collaborators was Akhilesh Mishra who later went on to serve as a director of the Indian government’s website, MyGov India – which is at present led by Arvind Gupta who was earlier head of the BJP’s IT cell.
Mishra is CEO of Bluekraft Digital Foundation. The Foundation has been linked to a disinformation website titled “The True Picture,” has published books authored by Prime Minister Narendra Modi and produces campaign videos for NaMo Television, a 24 hour cable television channel dedicated to promoting Modi.
The 2014 Modi pre-election campaign was inspired by the 2012 campaign to elect Barack Obama as the “world’s first Facebook President.” Some of the managers of the Modi campaign like Jain were apparently inspired by Sasha Issenberg’s book on the topic, The Victory Lab: The Secret Science of Winning Campaigns. In the first data-led election in India in 2014, information was collected from every possible source to not just micro-target users but also fine-tune messages praising and “mythologising” Modi as the Great Leader who would usher in acche din for the country.
Four teams spearheaded the campaign. The first team was led by Mumbai-based Jain who funded part of the communication campaign and also oversaw voter data analysis. He was helped by Shashi Shekhar Vempati in running NITI and “Mission 272+.” As already mentioned, Shekhar had worked in Infosys and is at present the head of Prasar Bharati Corporation which runs Doordarshan and All India Radio.
The second team was led by political strategist Prashant Kishor and his I‑PAC or Indian Political Action Committee who supervised the three-dimensional projection programme for Modi besides programmes like Run for Unity, Chai Pe Charcha (or Discussions Over Tea), Manthan (or Churning) and Citizens for Accountable Governance (CAG) that roped in management graduates to garner support for Modi at large gatherings. Having worked across the political spectrum and opportunistically switched affiliation to those who backed (and paid) him, 41-year-old Kishor is currently the second-in-command in Janata Dal (United) headed by Bihar Chief Minister Nitish Kumar.
The third team, that was intensely focused on building Modi’s personal image, was headed by Hiren Joshi himself who worked out of the then Gujarat Chief Minister’s Office in Gandhinagar. The members of this team worked closely with staffers of Facebook in India, more than one of our sources told us. As will be detailed later, Shivnath Thukral, who is currently an important executive in Facebook, worked with this team. (We made a number of telephone calls to Joshi’s office in New Delhi’s South Block seeking a meeting with him and also sent him an e‑mail message requesting an interview but he did not respond.)
The fourth team was led by Arvind Gupta, the current CEO of MyGov.in, a social media platform run by the government of India. He ran the BJP’s campaign based out of New Delhi. When contacted, he too declined to speak on the record saying he is now with the government and not a representative of the BJP. He suggested we contact Amit Malviya who is the present head of the BJP’s IT cell. He came on the line but declined to speak specifically on the BJP’s relationship with Facebook and WhatsApp.
The four teams worked separately. “It was (like) a relay (race),” said Vinit Goenka who was then the national co-convener of the BJP’s IT cell, adding: “The only knowledge that was shared (among the teams) was on a ‘need to know’ basis. That’s how any sensible organisation works.”
From all accounts, Rajesh Jain worked independently from his Lower Parel office and invested his own funds to support Modi and towards executing what he described as “Project 275 for 2014” in a blog post that he wrote in June 2011, nearly three years before the elections actually took place. The BJP, of course, went on to win 282 seats in the 2014 Lok Sabha elections, ten above the half-way mark, with a little over 31 per cent of the vote.
As an aside, it may be mentioned in passing that – like certain former bhakts or followers of Modi – Jain today appears less than enthusiastic about the performance of the government over the last four and a half years. He is currently engaged in promoting a campaign called Dhan Vapasi (or “return our wealth”) which is aimed at monetising surplus land and other assets held by government bodies, including defence establishments, and public sector undertakings, for the benefit of the poor and the underprivileged. Dhan Vapasi, in his words, is all about making “every Indian rich and free.”
In one of his recent videos that are in the public domain, Jain remarked: “For the 2014 elections, I had spent three years and my own money to build a team of 100 people to help with Modi’s campaign. Why? Because I trusted that a Modi-led BJP government could end the Congress’ anti-prosperity programmes and put India on a path to prosperity, a nayi disha (or new direction). But four years have gone by without any significant change in policy. India needed that to eliminate the big and hamesha (perennial) problems of poverty, unemployment and corruption. The Modi-led BJP government followed the same old failed policy of increasing taxes and spending. The ruler changed, but the outcomes have not.”
As mentioned, when we contacted 51-year-old Jain, who heads the Mumbai-based Netcore group of companies, said to be India’s biggest digital media marketing corporate group, he declined to be interviewed. Incidentally, he had till October 2017 served on the boards of directors of two prominent public sector companies. One was National Thermal Power Corporation (NTPC) – Jain has no experience in the power sector, just as Sambit Patra, BJP spokesperson, who is an “independent” director on the board of the Oil and Natural Gas Corporation, has zero experience in the petroleum industry. Jain also served on the board of the Unique Identification Authority of India (UIDAI), which runs the Aadhar programme.
Unlike Jain who was not at all forthcoming, 44-year-old Prodyut Bora, founder of the BJP’s IT cell in 2007 (barely a year after Facebook and Twitter had been launched) was far from reticent while speaking to us. He had resigned from the party’s national executive in February 2015 after questioning Modi and Amit Shah’s “highly individualised and centralised style of decision-making” that had led to the “subversion of democratic traditions” in the government and in the party.
Bora recalled how he was one of the first graduates from the leading business school, the Indian Institute of Management, Ahmedabad, to join the BJP because of his great admiration for the then Prime Minister Atal Behari Vajpayee. It was at the behest of the then party president Rajnath Singh (who is now Union Home Minister) that he set up the party’s IT cell to enable its leaders to come closer to, and interact with, their supporters.
The cell, he told us, was created not with a mandate to abuse people on social media platforms. He lamented that “madness” has now gripped the BJP and the desire to win elections at any cost has “destroyed the very ethos” of the party he was once a part of. Today, the Gurgaon-based Bora runs a firm making air purification equipment and is involved with an independent political party in his home state, Assam.
He told us: “The process of being economical with the truth (in the BJP) began in 2014. The (election) campaign was sending out unverified facts, infomercials, memes, dodgy data and graphs. From there, fake news was one step up the curve. Leaders of political parties, including the BJP, like to outsource this work because they don’t want to leave behind digital footprints. In 2009, social media platforms like Facebook and WhatsApp had a marginal impact in India’s 20 big cities. By 2014, however, it had virtually replaced the traditional mass media. In 2019, it will be the most pervasive media in the country.” . . . .
. . . . At one stage in our interview with [Vinit] Goenka that lasted over two hours, we asked him a pointed question: “Who helped whom more, Facebook or the BJP?”
He smiled and said: “That’s a difficult question. I wonder whether the BJP helped Facebook more than Facebook helped the BJP. You could say, we helped each other.”
1c. According to Christopher Miller of RFE/RL, Facebook selected Kateryna Kruk for the position:
Since autumn 2018, Facebook has looked to hire a public policy manager for Ukraine. The job came after years of Ukrainians criticizing the platform for takedowns of its activists’ pages and the spread of Russian disinfo targeting Kyiv. Now, it appears to have one: @Kateryna_Kruk [26].— Christopher Miller (@ChristopherJM) June 3, 2019 [27]
Kruk’s LinkedIn page [28] also lists her as being Facebook’s Public Policy Manager for Ukraine as of May of this year.
Kruk worked as an analyst and TV host for the Ukrainian ‘anti-Russian propaganda’ outfit StopFake. StopFake is the creation of Irena Chalupa, who works for the Atlantic Council and the Ukrainian government [29] and appears to be the sister of Andrea and Alexandra Chalupa.
(As an example of how StopFake.org approaches Ukraine’s far right, here’s a tweet [62] from StopFake’s co-founder, Yevhen Fedchenko, from May of 2018 where he complains about an article in Hromadske International [63] that characterizes C14 as a neo-Nazi group:
“for Hromadske C14 is ‘neo- nazi’, in reality one of them – Oleksandr Voitko – is a war veteran and before going to the war – alum and faculty at @MohylaJSchool [64], journalist at Foreign news desk at Channel 5. Now also active participant of war veterans grass-root organization. https://t.co/QmaGnu6QGZ [65]— Yevhen Fedchenko (@yevhenfedchenko) May 5, 2018) [66]”
In October of 2017, Kruk joined the “Kremlin Watch” team at the European Values think-tank [30]. In June of 2014, The Atlantic Council gave Kruk its Freedom award for her communications work during the Euromaidan protests [31]. Kruk also has a number of articles on the Atlantic Council’s Blog. Here’s a blog post from August of 2018 where she advocates for the creation of an independent Ukrainian Orthodox Church to diminish the influence of the Russian Orthodox Church [33]. Keep in mind that, in May of 2018, Facebook decided to effectively outsource the work of identifying propaganda and misinformation during elections to the Atlantic Council [23], so choosing someone like Kruk who already has the Atlantic Council’s stamp of approval is in keeping with that trend.
According to Kruk’s LinkedIn page [28] she’s also done extensive work for the Ukrainian government. From March 2016 to January 2017 she was the Strategic Communications Manager for the Ukrainian parliament where she was responsible for social media and international communications. From January-April 2017 she was the Head of Communications at the Ministry of Health.
Not only was Kruk a volunteer for Svoboda during the 2014 Euromaidan protests, she also openly celebrated on Twitter the May 2014 massacre in Odessa, in which the far right burned dozens of protestors alive. Kruk’s Twitter feed is set to private now, so there isn’t public access to her old tweet, but people have screen captures of it. Here’s a tweet [34] from Yasha Levine with a screenshot of Kruk’s May 2, 2014 tweet, in which she writes:
“#Odessa cleaned itself from terrorists, proud for city fighting for its identity.glory to fallen heroes..”
She even threw in a “glory to fallen heroes” at the end of her tweet celebrating this massacre. Keep in mind that it was a month after this tweet that the Atlantic Council gave her that Freedom Award for her communications work during the protests.
An article from January of 2014 about the then-ongoing Maidan square protests covers the growing presence of the far right in the protests and their attacks on left-wing protestors. Kruk is interviewed in the article and describes herself as a Svoboda volunteer. Kruk, who issued her tweet celebrating the Odessa massacre a few months later, also stands out from a public relations standpoint: she was explaining why average Ukrainians who didn’t necessarily support the far right should nonetheless support it at that moment, which was one of the most useful messages she could have been sending for the far right at the time [32]:
“The Ukrainian Nationalism at the Heart of ‘Euromaidan’” by Alec Luhn; The Nation; 01/21/2014 [32].
. . . . For now, Svoboda and other far-right movements like Right Sector are focusing on the protest-wide demands for civic freedoms and government accountability rather than overtly nationalist agendas. Svoboda enjoys a reputation as a party of action, responsive to citizens’ problems. Noyevy cut an interview with The Nation short to help local residents who came with a complaint that a developer was tearing down a fence without permission.
“There are people who don’t support Svoboda because of some of their slogans, but they know it’s the most active political party and go to them for help,” said Svoboda volunteer Kateryna Kruk. “Only Svoboda is helping against land seizures in Kiev.” . . . .
1d. Kruk has manifested other fascist sympathies and connections:
- In 2014, she tweeted that a man had asked her to convince his grandson not to join the Azov Battalion, a neo-Nazi militia. “I couldn’t do it,” she said. “I thanked that boy and blessed him [36].” And he then traveled to Luhansk to fight pro-Russian rebels.
- In March 2018, a 19-year-old neo-Nazi named Andriy “Dilly” Krivich [37] was shot and killed by a sniper. Krivich had been fighting with the fascist Ukrainian group Right Sector, and had posted photos on social media wearing Nazi German symbols. After he was killed, Kruk tweeted an homage to the teenage Nazi [38]. (The Nazi was also lionized on Euromaidan Press’ Facebook page [39].)
- Kruk has staunchly defended the use of the slogan “Slava Ukraini” [40], which was first coined and popularized [41] by Nazi-collaborating fascists [42], and is now the official salute of Ukraine’s army [43].
- She has also said that the Ukrainian fascist politician Andriy Parubiy [44], who co-founded a neo-Nazi party before later becoming the chairman of Ukraine’s parliament, the Rada, is “acting smart [45],” writing, “Parubiy touche [46].” . . . .
2. The essence of the book Serpent’s Walk is presented on the back cover:
It assumes that Hitler’s warrior elite — the SS — didn’t give up their struggle for a White world when they lost the Second World War. Instead their survivors went underground and adopted some of the tactics of their enemies: they began building their economic muscle and buying into the opinion-forming media. A century after the war they are ready to challenge the democrats and Jews for the hearts and minds of White Americans, who have begun to have their fill of government-enforced multi-culturalism and ‘equality.’
3. This process is described in more detail in a passage of text, consisting of a discussion between Wrench (a member of this Underground Reich) and a mercenary named Lessing.
. . . . The SS . . . what was left of it . . . had business objectives before and during World War II. When the war was lost they just kept on, but from other places: Bogota, Asuncion, Buenos Aires, Rio de Janeiro, Mexico City, Colombo, Damascus, Dacca . . . you name it. They realized that the world is heading towards a ‘corporacracy;’ five or ten international super-companies that will run everything worth running by the year 2100. Those super-corporations exist now, and they’re already dividing up the production and marketing of food, transport, steel and heavy industry, oil, the media, and other commodities. They’re mostly conglomerates, with fingers in more than one pie . . . . We, the SS, have the say in four or five. We’ve been competing for the past sixty years or so, and we’re slowly gaining . . . . About ten years ago, we swung a merger, a takeover, and got voting control of a supercorp that runs a small but significant chunk of the American media. Not openly, not with bands and trumpets or swastikas flying, but quietly: one huge corporation cuddling up to another one and gently munching it up, like a great, gubbing amoeba. Since then we’ve been replacing executives, pushing somebody out here, bringing somebody else in there. We’ve swung program content around, too. Not much, but a little, so it won’t show. We’ve cut down on ‘nasty-Nazi’ movies . . . good guys in white hats and bad guys in black SS hats . . . lovable Jews versus fiendish Germans . . . and we have media psychologists, ad agencies, and behavior modification specialists working on image changes. . . .
4. The broadcast addresses the gradual remaking of the image of the Third Reich that is represented in Serpent’s Walk. In the discussion excerpted above, this process is further described.
. . . . Hell, if you can con granny into buying Sugar Turds instead of Bran Farts, then why can’t you swing public opinion over to a cause as vital and important as ours?’ . . . In any case, we’re slowly replacing those negative images with others: the ‘Good Bad Guy’ routine . . . ‘What do you think of Jesse James? John Dillinger? Julius Caesar? Genghis Khan?’ . . . The reality may have been rough, but there’s a sort of glitter about most of those dudes: mean honchos but respectable. It’s all how you package it. Opinion is a godamned commodity!’ . . . It works with anybody . . . Give it time. Aside from the media, we’ve been buying up private schools . . . and helping some public ones through philanthropic foundations . . . and working on the churches and the Born Agains. . . .
5. Through the years, we have highlighted the Nazi tract Serpent’s Walk [68], excerpted above, which deals, in part, with the rehabilitation of the Third Reich’s reputation and the transformation of Hitler into a hero.
In FTR #1015 [69], we noted that a Serpent’s Walk scenario is indeed unfolding in India.
[70] Key points of analysis and discussion include:
- Narendra Modi’s presence on the same book cover [71] (along with Gandhi, Mandela, Obama and Hitler).
- Modi himself has his own political history [72] with children’s books that promote Hitler as a great leader: ” . . . . In 2004, reports surfaced of high-school textbooks in the state of Gujarat, which was then led by Mr. Modi, that spoke glowingly of Nazism and fascism [72]. According to ‘The Times of India,’ in a section called ‘Ideology of Nazism,’ the textbook said Hitler had ‘lent dignity and prestige to the German government,’ ‘made untiring efforts to make Germany self-reliant’ and ‘instilled the spirit of adventure in the common people.’ . . . .”
- In India, many have a favorable view of Hitler [73]: ” . . . . as far back as 2002, the Times of India reported a survey [73] that found that 17 percent of students in elite Indian colleges ‘favored Adolf Hitler as the kind of leader India ought to have.’ . . . . Consider Mein Kampf [74], Hitler’s autobiography. Reviled it might be in much of the world, but Indians buy thousands of copies of it every month. As a recent paper in the journal EPW tells us (PDF [75]), there are over a dozen Indian publishers who have editions of the book on the market. Jaico, for example, printed its 55th edition in 2010, claiming to have sold 100,000 copies in the previous seven years. (Contrast this to the 3,000 copies my own 2009 book, Roadrunner, has sold). In a country where 10,000 copies sold makes a book a bestseller, these are significant numbers. . . .”
- A classroom of school children filled with fans of Hitler had a very different sentiment about Gandhi. ” . . . . ‘He’s a coward!’ That’s the obvious flip side of this love of Hitler in India. It’s an implicit rejection of Gandhi. . . .”
- Apparently, Mein Kampf has achieved gravitas among business students in India [76]: ” . . . . What’s more, there’s a steady trickle of reports that say it has become a must-read for business-school students [77]; a management guide much like Spencer Johnson’s Who Moved My Cheese or Edward de Bono’s Lateral Thinking. If this undistinguished artist could take an entire country with him, I imagine the reasoning goes, surely his book has some lessons for future captains of industry? . . . .”
6. Christopher Wylie–the former head of research at Cambridge Analytica who became one of the key insider whistle-blowers about how Cambridge Analytica operated and the extent of Facebook’s knowledge about it–gave an interview last month to Campaign Magazine. (We dealt with Cambridge Analytica in FTR #‘s 946 [11], 1021 [12].)
Wylie recounts how, as director of research at Cambridge Analytica, his original role was to determine how the company could use the information warfare techniques used by SCL Group – Cambridge Analytica’s parent company and a defense contractor providing psy op services for the British military. Wylie’s job was to adapt the psychological warfare strategies that SCL had been using on the battlefield to the online space. As Wylie put it:
“ . . . . When you are working in information operations projects, where your target is a combatant, the autonomy or agency of your targets is not your primary consideration. It is fair game to deny and manipulate information, coerce and exploit any mental vulnerabilities a person has, and to bring out the very worst characteristics in that person because they are an enemy…But if you port that over to a democratic system, if you run campaigns designed to undermine people’s ability to make free choices and to understand what is real and not real, you are undermining democracy and treating voters in the same way as you are treating terrorists. . . . .”
Wylie also draws parallels between the psychological operations used on democratic audiences and the battlefield techniques used to build an insurgency. It starts with targeting people more prone to having erratic traits, paranoia or conspiratorial thinking, and getting them to “like” a group on social media. The information you’re feeding this target audience may or may not be real. The important thing is that it’s content that they already agree with, so that “it feels good to see that information.” Keep in mind that one of the goals of the ‘psychographic profiling’ that Cambridge Analytica employed was to identify traits like neuroticism.
Wylie goes on to describe the next step in this insurgency-building technique: keep building up the interest in the social media group that you’re directing this target audience towards until it hits around 1,000–2,000 people. Then set up a real-life event dedicated to the chosen disinformation topic in some local area and try to get as many of your target audience to show up. Even if only 5 percent of them show up, that’s still 50–100 people converging on some local coffee shop or whatever. The people meet each other in real life and start talking about “all these things that you’ve been seeing online in the depths of your den and getting angry about.” This target audience starts believing that no one else is talking about this stuff because “they don’t want you to know what the truth is.” As Wylie puts it, “What started out as a fantasy online gets ported into the temporal world and becomes real to you because you see all these people around you.”
In the early hours of 17 March 2018, the 28-year-old Christopher Wylie tweeted: “Here we go….”
Later that day, The Observer and The New York Times published the story of Cambridge Analytica’s misuse of Facebook data, which sent shockwaves around the world, caused millions to #DeleteFacebook, and led the UK Information Commissioner’s Office to fine the site the maximum penalty for failing to protect users’ information. Six weeks after the story broke, Cambridge Analytica closed. . . .
. . . . He believes that poor use of data is killing good ideas. And that, unless effective regulation is enacted, society’s worship of algorithms, unchecked data capture and use, and the likely spread of AI to all parts of our lives is causing us to sleepwalk into a bleak future.
Not only are such circumstances a threat to adland – why do you need an ad to tell you about a product if an algorithm is choosing it for you? – they are a threat to human free will. “Currently, the only morality of the algorithm is to optimise you as a consumer and, in many cases, you become the product. There are very few examples in human history of industries where people themselves become products and those are scary industries – slavery and the sex trade. And now, we have social media,” Wylie says.
“The problem with that, and what makes it inherently different to selling, say, toothpaste, is that you’re selling parts of people or access to people. People have an innate moral worth. If we don’t respect that, we can create industries that do terrible things to people. We are [heading] blindly and quickly into an environment where this mentality is going to be amplified through AI everywhere. We’re humans, we should be thinking about people first.”
His words carry weight, because he’s been on the dark side. He has seen what can happen when data is used to spread misinformation, create insurgencies and prey on the worst of people’s characters.
The political battlefield
A quick refresher on the scandal, in Wylie’s words: Cambridge Analytica was a company spun out of SCL Group, a British military contractor that worked in information operations for armed forces around the world. It was conducting research on how to scale and digitise information warfare – the use of information to confuse or degrade the efficacy of an enemy. . . .
. . . . As director of research, Wylie’s original role was to map out how the company would take traditional information operations tactics into the online space – in particular, by profiling people who would be susceptible to certain messaging.
This morphed into the political arena. After Wylie left, the company worked on Donald Trump’s US presidential campaign and – possibly – the UK’s European Union referendum. In February 2016, Cambridge Analytica’s former chief executive, Alexander Nix, wrote in Campaign that his company had “already helped supercharge Leave.EU’s social-media campaign”. Nix has strenuously denied this since, including to MPs.
It was this shift from the battlefield to politics that made Wylie uncomfortable. “When you are working in information operations projects, where your target is a combatant, the autonomy or agency of your targets is not your primary consideration. It is fair game to deny and manipulate information, coerce and exploit any mental vulnerabilities a person has, and to bring out the very worst characteristics in that person because they are an enemy,” he says.
“But if you port that over to a democratic system, if you run campaigns designed to undermine people’s ability to make free choices and to understand what is real and not real, you are undermining democracy and treating voters in the same way as you are treating terrorists.”
One of the reasons these techniques are so insidious is that being a target of a disinformation campaign is “usually a pleasurable experience”, because you are being fed content with which you are likely to agree. “You are being guided through something that you want to be true,” Wylie says.
To build an insurgency, he explains, you first target people who are more prone to having erratic traits, paranoia or conspiratorial thinking, and get them to “like” a group on social media. They start engaging with the content, which may or may not be true; either way “it feels good to see that information”.
When the group reaches 1,000 or 2,000 members, an event is set up in the local area. Even if only 5% show up, “that’s 50 to 100 people flooding a local coffee shop”, Wylie says. This, he adds, validates their opinion because other people there are also talking about “all these things that you’ve been seeing online in the depths of your den and getting angry about”.
People then start to believe the reason it’s not shown on mainstream news channels is because “they don’t want you to know what the truth is”. As Wylie sums it up: “What started out as a fantasy online gets ported into the temporal world and becomes real to you because you see all these people around you.” . . . .
. . . . Psychographic potential
One such application was Cambridge Analytica’s use of psychographic profiling, a form of segmentation that will be familiar to marketers, although not in common use.
The company used the OCEAN model, which judges people on scales of the Big Five personality traits: openness to experiences, conscientiousness, extraversion, agreeableness and neuroticism.
Wylie believes the method could be useful in the commercial space. For example, a fashion brand that creates bold, colourful, patterned clothes might want to segment wealthy women by extroversion because they will be more likely to buy bold items, he says.
Sceptics say Cambridge Analytica’s approach may not be the dark magic that Wylie claims. Indeed, when speaking to Campaign in June 2017, Nix uncharacteristically played down the method, claiming the company used “pretty bland data in a pretty enterprising way”.
But Wylie argues that people underestimate what algorithms allow you to do in profiling. “I can take pieces of information about you that seem innocuous, but what I’m able to do with an algorithm is find patterns that correlate to underlying psychological profiles,” he explains.
“I can ask whether you listen to Justin Bieber, and you won’t feel like I’m invading your privacy. You aren’t necessarily aware that when you tell me what music you listen to or what TV shows you watch, you are telling me some of your deepest and most personal attributes.”
This is where matters stray into the question of ethics. Wylie believes that as long as the communication you are sending out is clear, not coercive or manipulative, it’s fine, but it all depends on context. “If you are a beauty company and you use facets of neuroticism – which Cambridge Analytica did – and you find a segment of young women or men who are more prone to body dysmorphia, and one of the proactive actions they take is to buy more skin cream, you are exploiting something which is unhealthy for that person and doing damage,” he says. “The ethics of using psychometric data really depend on whether it is proportional to the benefit and utility that the customer is getting.” . . .
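Wylie’s claim above — that an algorithm can turn innocuous declared tastes into estimates of “deepest and most personal attributes” — can be illustrated with a deliberately simplified sketch. Everything here is invented for illustration (the dataset, the item names, and the scoring rule); Cambridge Analytica’s actual models are not public, and real systems used far larger datasets and more sophisticated statistical learning:

```python
# Toy illustration of trait inference from innocuous "likes".
# Training data pairs each user's liked items with a known trait
# score in [0, 1] (e.g., a survey-measured OCEAN trait).
train = [
    ({"pop_star", "reality_tv"}, 0.9),
    ({"pop_star", "fashion"},    0.8),
    ({"chess", "documentaries"}, 0.2),
    ({"chess", "jazz"},          0.1),
]

def item_weights(data):
    """For each item, average the trait score of users who liked it."""
    totals, counts = {}, {}
    for likes, score in data:
        for item in likes:
            totals[item] = totals.get(item, 0.0) + score
            counts[item] = counts.get(item, 0) + 1
    return {item: totals[item] / counts[item] for item in totals}

def predict(likes, weights, default=0.5):
    """Estimate an unseen user's trait as the mean weight of their likes."""
    known = [weights[i] for i in likes if i in weights]
    return sum(known) / len(known) if known else default

weights = item_weights(train)
print(predict({"pop_star", "reality_tv"}, weights))  # high estimate: 0.875
print(predict({"chess", "jazz"}, weights))           # low estimate: 0.125
```

The point of the sketch is the asymmetry Wylie describes: no single input feels private, yet once per-item correlations are learned from labelled users, any user who merely lists their likes has effectively disclosed an estimate of a psychological trait.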
Clashes with Facebook
Wylie is opposed to self-regulation, because industries won’t become consumer champions – they are, he says, too conflicted.
“Facebook has known about what Cambridge Analytica was up to from the very beginning of those projects,” Wylie claims. “They were notified, they authorised the applications, they were given the terms and conditions of the app that said explicitly what it was doing. They hired people who worked on building the app. I had legal correspondence with their lawyers where they acknowledged it happened as far back as 2016.”
He wants to create a set of enduring principles that are handed over to a technically competent regulator to enforce. “Currently, the industry is not responding to some pretty fundamental things that have happened on their watch. So I think it is the right place for government to step in,” he adds.
Facebook in particular, he argues, is “the most obstinate and belligerent in recognising the harm that has been done and actually doing something about it”. . . .
7. Social media have been underscored as a contributing factor to right-wing, domestic terrorism. ” . . . The first is stochastic terrorism [48]: ‘The use of mass, public communication, usually against a particular individual or group, which incites or inspires acts of terrorism which are statistically probable but happen seemingly at random.’ I encountered the idea in a Friday thread from data scientist Emily Gorcenski [49], who used it to tie together four recent attacks. . . . .”
The Links Between Social Media, Domestic Terrorism and the Retreat from Democracy
It was an awful weekend of hate-fueled violence, ugly rhetoric, and worrisome retreats from our democratic ideals. Today I’m focused on two ways of framing what we’re seeing, from the United States to Brazil. While neither offers any comfort, they do give helpful names to phenomena I expect will be with us for a long while.
The first is stochastic terrorism [48]: “The use of mass, public communication, usually against a particular individual or group, which incites or inspires acts of terrorism which are statistically probable but happen seemingly at random.” I encountered the idea in a Friday thread from data scientist Emily Gorcenski [49], who used it to tie together four recent attacks.
In her thread, Gorcenski argues that various right-wing conspiracy theories and frauds, amplified both through mainstream and social media, have resulted in a growing number of cases where men snap and commit violence. “Right-wing media is a gradient pushing rightwards, toward violence and oppression,” she wrote. “One of the symptoms of this is that you are basically guaranteed to generate random terrorists. Like popcorn kernels popping.”
On Saturday, another kernel popped. Robert A. Bowers, the suspect in a shooting at a synagogue that left 11 people dead, was steeped in online conspiracy culture. He posted frequently to Gab, a Twitter clone that emphasizes free speech and has become a favored social network among white nationalists. Julie Turkewitz and Kevin Roose described his hateful views in the New York Times [78]:
After opening an account on it in January, he had shared a stream of anti-Jewish slurs and conspiracy theories. It was on Gab where he found a like-minded community, reposting messages from Nazi supporters.
“Jews are the children of Satan,” read Mr. Bowers’s biography.
Bowers is in custody — his life was saved by Jewish doctors and nurses [79] — and presumably will never go free again. Gab’s life, however, may be imperiled. Two payment processors, PayPal and Stripe, de-platformed the site, as did its cloud host, Joyent. The site went down on Monday [80] after its hosting provider, GoDaddy, told it to find another one. Its founder posted defiant messages on Twitter and elsewhere promising it would survive.
Gab hosts a lot of deeply upsetting content [81], and to its supporters, that’s the point. Free speech is a right, their reasoning goes, and it ought to be exercised. Certainly it seems wrong to suggest that Gab or any other single platform “caused” Bowers to act. Hatred, after all, is an ecosystem. But his action came amid a concerted effort to focus attention on a caravan of migrants coming to the United States in search of refuge.
Right-wing media, most notably Fox News, has advanced the idea that the caravan is linked to Jewish billionaire (and Holocaust survivor) George Soros [82]. An actual Congressman, Florida Republican Matt Gaetz, suggested the caravan was funded by Soros [83]. Bowers enthusiastically pushed these conspiracy theories on social media [84].
In his final post on Gab, Bowers wrote [85]: “I can’t sit by and watch my people get slaughtered. Screw your optics. I’m going in.”
The individual act was random. But it had become statistically probable thanks to the rise of anti-immigrant rhetoric across all manner of media. And I fear we will see far more of it before the current fever breaks.
The second concept I’m thinking about today is democratic recession. The idea, which is roughly a decade old, is that democracy is in retreat around the globe. The Economist covered it in January [86]:
The tenth edition of the Economist Intelligence Unit’s Democracy Index [87] suggests that this unwelcome trend remains firmly in place. The index, which comprises 60 indicators across five broad categories—electoral process and pluralism, functioning of government, political participation, democratic political culture and civil liberties—concludes that less than 5% of the world’s population currently lives in a “full democracy”. Nearly a third live under authoritarian rule, with a large share of those in China. Overall, 89 of the 167 countries assessed in 2017 received lower scores than they had the year before.
In January, The Economist considered Brazil a “flawed democracy.” But after this weekend, the country may undergo a more precipitous decline in democratic freedoms. As expected, far-right candidate Jair Bolsonaro, who speaks approvingly of the country’s previous military dictatorship, handily won election over his leftist rival.
In the best piece I read today [88], BuzzFeed’s Ryan Broderick — who was in Brazil for the election — puts Bolsonaro’s election into the context of the internet and social platforms. Broderick focuses on the symbiosis between internet media, which excels at promoting a sense of perpetual crisis and outrage, and far-right leaders who promise a return to normalcy.
Typically, large right-wing news channels or conservative tabloids will then take these stories going viral on Facebook and repackage them for older, mainstream audiences. Depending on your country’s media landscape, the far-right trolls and influencers may try to hijack this social-media-to-newspaper-to-television pipeline. Which then creates more content to screenshot, meme, and share. It’s a feedback loop.
Populist leaders and the legions of influencers riding their wave know they can create filter bubbles inside of platforms like Facebook or YouTube that promise a safer time, one that never existed in the first place, before the protests, the violence, the cascading crises, and endless news cycles. Donald Trump wants to Make America Great Again; Bolsonaro wants to bring back Brazil’s military dictatorship; Shinzo Abe wants to recapture Japan’s imperial past; Germany’s AFD performed the best with older East German voters longing for the days of authoritarianism [89]. All of these leaders promise to close borders, to make things safe. Which will, of course, usually exacerbate the problems they’re promising to disappear. Another feedback loop.
A third feedback loop, of course, is between a social media ecosystem promoting a sense of perpetual crisis and outrage, and the random-but-statistically-probable production of domestic terrorists.
Perhaps the global rise of authoritarians and big tech platforms are merely correlated, and no causation can be proved. But I increasingly wonder whether we would benefit if tech companies assumed that some level of causation was real — and, assuming that it is, what they might do about it.
DEMOCRACY
On Social Media, No Answers for Hate [90]
You don’t have to go to Gab to see hateful posts. Sheera Frenkel, Mike Isaac, and Kate Conger report on how the past week’s domestic terror attacks play out on once-happier places, most notably Instagram:
On Monday, a search on Instagram, the photo-sharing site owned by Facebook, produced a torrent of anti-Semitic images and videos uploaded in the wake of Saturday’s shooting at a Pittsburgh synagogue.
A search for the word “Jews” displayed 11,696 posts with the hashtag “#jewsdid911,” claiming that Jews had orchestrated the Sept. 11 terror attacks. Other hashtags on Instagram referenced Nazi ideology, including the number 88, an abbreviation used for the Nazi salute “Heil Hitler.”
Attacks on Jewish people rising on Instagram and Twitter, researchers say [91]
Just before the synagogue attack took place on Saturday, David Ingram posted this story about an alarming rise in attacks on Jews on social platforms:
Samuel Woolley, a social media researcher who worked on the study, analyzed more than 7 million tweets from August and September and found an array of attacks, also often linked to Soros. About a third of the attacks on Jews came from automated accounts known as “bots,” he said.
“It’s really spiking during this election,” Woolley, director of the Digital Intelligence Laboratory, which studies the intersection of technology and society, said in a telephone interview. “We’re seeing what we think is an attempt to silence conversations in the Jewish community.”
Russian disinformation on Facebook targeted Ukraine well before the 2016 U.S. election [92]
Dana Priest, James Jacoby and Anya Bourg report that Ukraine’s experience with information warfare offered an early — and unheeded — warning to Facebook:
To get Zuckerberg’s attention, the president posted a question for a town hall meeting at Facebook’s Silicon Valley headquarters. There, a moderator read it aloud.
“Mark, will you establish a Facebook office in Ukraine?” the moderator said, chuckling, according to a video of the assembly. The room of young employees rippled with laughter. But the government’s suggestion was serious: It believed that a Kiev office, staffed with people familiar with Ukraine’s political situation, could help solve Facebook’s high-level ignorance about Russian information warfare. . . . .