Spitfire List Web site and blog of anti-fascist researcher and radio personality Dave Emory.

For The Record  

FTR #1074 FakeBook: Walkin’ the Snake on the Earth Island with Facebook (FascisBook, Part 2; In Your Facebook, Part 4)

Dave Emory’s entire lifetime of work is available on a flash drive that can be obtained HERE. The new drive is a 32-gigabyte drive that is current as of the programs and articles posted by the fall of 2017. The new drive is available for a tax-deductible contribution of $65.00 or more.

WFMU-FM is podcasting For The Record–You can subscribe to the podcast HERE.

You can subscribe to e-mail alerts from Spitfirelist.com HERE.

You can subscribe to RSS feed from Spitfirelist.com HERE.

You can subscribe to the comments made on programs and posts–an excellent source of information in, and of, itself, HERE.

Please consider supporting THE WORK DAVE EMORY DOES.

This broadcast was recorded in one, 60-minute segment.

An Indian book placing Adolf Hitler alongside Mahatma Gandhi, Barack Obama and (ahem) Narendra Modi.

Introduction: We have spoken repeatedly about the Nazi tract Serpent’s Walk, in which the Third Reich goes underground, buys into the opinion-forming media and, eventually, takes over.

Hitler, the Third Reich and their actions are glorified and memorialized. The essence of the book is synopsized on the back cover:

“It assumes that Hitler’s warrior elite – the SS – didn’t give up their struggle for a White world when they lost the Second World War. Instead their survivors went underground and adopted some of the tactics of their enemies: they began building their economic muscle and buying into the opinion-forming media. A century after the war they are ready to challenge the democrats and Jews for the hearts and minds of White Americans, who have begun to have their fill of government-enforced multi-culturalism and ‘equality.'”

Something analogous is happening in Ukraine and India.

In FTR #889, we noted that Pierre Omidyar, a darling of the so-called “progressive” sector for his founding of The Intercept, was deeply involved with the financing of the ascent of both Narendra Modi’s Hindutva fascist BJP and the OUN/B successor organizations in Ukraine.

Omidyar’s anointment as an icon of investigative reporting could not be more ironic, in that journalists and critics of his fascist allies in Ukraine and India are being repressed and murdered, thereby furthering the suppression of truth in those societies. This suppression of truth feeds into the Serpent’s Walk scenario.

This program supplements past coverage of Facebook in FTR #’s 718, 946, 1021 and 1039, noting how Facebook has networked with the very Hindutva fascist Indian elements and OUN/B successor organizations in Ukraine. This networking has been undertaken ostensibly to combat fake news. The reality may well be that the Facebook/BJP-RSS/OUN/B links generate fake news, rather than interdict it. The fake news so generated, however, will be to the liking of the fascists in power in both countries, manifesting as a “Serpent’s Walk” revisionist scenario.

Key elements of discussion and analysis include:

  1. Indian politics has been largely dominated by fake news, spread by social media: ” . . . . In the continuing Indian elections, as 900 million people are voting to elect representatives to the lower house of the Parliament, disinformation and hate speech are drowning out truth on social media networks in the country and creating a public health crisis like the pandemics of the past century. This contagion of a staggering amount of morphed images, doctored videos and text messages is spreading largely through messaging services and influencing what India’s voters watch and read on their smartphones. A recent study by Microsoft found that over 64 percent of Indians encountered fake news online, the highest reported among the 22 countries surveyed. . . . These platforms are filled with fake news and disinformation aimed at influencing political choices during the Indian elections. . . .”
  2. Narendra Modi’s Hindutva fascist BJP has been the primary beneficiary of fake news, and his regime has partnered with Facebook: ” . . . . The hearing was an exercise in absurdist theater because the governing B.J.P. has been the chief beneficiary of divisive content that reaches millions because of the way social media algorithms, especially Facebook, amplify ‘engaging’ articles. . . .”
  3. Rajesh Jain is among those BJP functionaries who serve Facebook, as well as the Hindutva fascists: ” . . . . By the time Rajesh Jain was scaling up his operations in 2013, the BJP’s information technology (IT) strategists had begun interacting with social media platforms like Facebook and its partner WhatsApp. If supporters of the BJP are to be believed, the party was better than others in utilising the micro-targeting potential of the platforms. However, it is also true that Facebook’s employees in India conducted training workshops to help the members of the BJP’s IT cell. . . .”
  4. Dr. Hiren Joshi is another of the BJP operatives who is heavily involved with Facebook. ” . . . . Also assisting the social media and online teams to build a larger-than-life image for Modi before the 2014 elections was a team led by his right-hand man Dr Hiren Joshi, who (as already stated) is a very important adviser to Modi whose writ extends way beyond information technology and social media. . . .  Joshi has had, and continues to have, a close and long-standing association with Facebook’s senior employees in India. . . .”
  5. Shivnath Thukral, who was hired by Facebook in 2017 to be its Public Policy Director for India & South Asia, worked with Joshi’s team in 2014.  ” . . . . The third team, that was intensely focused on building Modi’s personal image, was headed by Hiren Joshi himself who worked out of the then Gujarat Chief Minister’s Office in Gandhinagar. The members of this team worked closely with staffers of Facebook in India, more than one of our sources told us. As will be detailed later, Shivnath Thukral, who is currently an important executive in Facebook, worked with this team. . . .”
  6. An ostensibly remorseful BJP politician–Prodyut Bora–highlighted the dramatic effect Facebook and its WhatsApp subsidiary have had on India’s politics: ” . . . . In 2009, social media platforms like Facebook and WhatsApp had a marginal impact in India’s 20 big cities. By 2014, however, it had virtually replaced the traditional mass media. In 2019, it will be the most pervasive media in the country. . . .”
  7. A concise statement about the relationship between the BJP and Facebook was issued by BJP tech officer Vinit Goenka: ” . . . . At one stage in our interview with [Vinit] Goenka that lasted over two hours, we asked him a pointed question: ‘Who helped whom more, Facebook or the BJP?’ He smiled and said: ‘That’s a difficult question. I wonder whether the BJP helped Facebook more than Facebook helped the BJP. You could say, we helped each other.’ . . .”

Celebration of the 75th Anniversary of the 14th Waffen SS Division in Lviv, Ukraine

In Ukraine, as well, Facebook and the OUN/B successor organizations function symbiotically:

(Note that the Atlantic Council is dominant in the array of individuals and institutions constituting the Ukrainian fascist/Facebook cooperative effort. We have spoken about the Atlantic Council in numerous programs, including FTR #943. The organization has deep operational links to elements of U.S. intelligence, as well as the OUN/B milieu that dominates the Ukrainian diaspora.)

CrowdStrike–at the epicenter of the supposed Russian hacking controversy–is noteworthy in this regard. Its co-founder and chief technology officer, Dmitri Alperovitch, is a senior fellow at the Atlantic Council, which is financed by elements that are at the foundation of fanning the flames of the New Cold War: “In this respect, it is worth noting that one of the commercial cybersecurity companies the government has relied on is Crowdstrike, which was one of the companies initially brought in by the DNC to investigate the alleged hacks. . . . Dmitri Alperovitch is also a senior fellow at the Atlantic Council. . . . The connection between [Crowdstrike co-founder and chief technology officer Dmitri] Alperovitch and the Atlantic Council has gone largely unremarked upon, but it is relevant given that the Atlantic Council—which is funded in part by the US State Department, NATO, the governments of Latvia and Lithuania, the Ukrainian World Congress, and the Ukrainian oligarch Victor Pinchuk—has been among the loudest voices calling for a new Cold War with Russia. As I pointed out in the pages of The Nation in November, the Atlantic Council has spent the past several years producing some of the most virulent specimens of the new Cold War propaganda. . . .”

In May of 2018, Facebook decided to effectively outsource the work of identifying propaganda and misinformation during elections to the Atlantic Council.

” . . . . Facebook is partnering with the Atlantic Council in another effort to combat election-related propaganda and misinformation from proliferating on its service. The social networking giant said Thursday that a partnership with the Washington D.C.-based think tank would help it better spot disinformation during upcoming world elections. The partnership is one of a number of steps Facebook is taking to prevent the spread of propaganda and fake news after failing to stop it from spreading on its service in the run up to the 2016 U.S. presidential election. . . .”

Since autumn 2018, Facebook has looked to hire a public policy manager for Ukraine. The job came after years of Ukrainians criticizing the platform for takedowns of its activists’ pages and the spread of [alleged] Russian disinfo targeting Kyiv. Now, it appears to have one: @Kateryna_Kruk.— Christopher Miller (@ChristopherJM) June 3, 2019

Oleh Tyahnybok, leader of the OUN/B successor organization Svoboda, for which Kateryna Kruk worked.

Kateryna Kruk:

  1. Is Facebook’s Public Policy Manager for Ukraine as of May of this year, according to her LinkedIn page.
  2. Worked as an analyst and TV host for the Ukrainian ‘anti-Russian propaganda’ outfit StopFake. StopFake is the creation of Irena Chalupa, who works for the Atlantic Council and the Ukrainian government and appears to be the sister of Andrea and Alexandra Chalupa.
  3. Joined the “Kremlin Watch” team at the European Values think-tank, in October of 2017.
  4. Received the Atlantic Council’s Freedom award for her communications work during the Euromaidan protests in June of 2014.
  5. Worked for OUN/B successor organization Svoboda during the Euromaidan protests. “ . . . ‘There are people who don’t support Svoboda because of some of their slogans, but they know it’s the most active political party and go to them for help,’ said Svoboda volunteer Kateryna Kruk. . . .”
  6. Also has a number of articles on the Atlantic Council’s Blog. Here’s a blog post from August of 2018 where she advocates for the creation of an independent Ukrainian Orthodox Church to diminish the influence of the Russian Orthodox Church.
  7. According to her LinkedIn page has also done extensive work for the Ukrainian government. From March 2016 to January 2017 she was the Strategic Communications Manager for the Ukrainian parliament where she was responsible for social media and international communications. From January-April 2017 she was the Head of Communications at the Ministry of Health.
  8. Was not only a volunteer for Svoboda during the 2014 Euromaidan protests, but openly celebrated on Twitter the May 2014 massacre in Odessa, when the far right burned dozens of protestors alive. Kruk’s Twitter feed is set to private now, so there isn’t public access to her old tweet, but people have screen captures of it. Here’s a tweet from Yasha Levine with a screenshot of Kruk’s May 2, 2014 tweet where she writes: “#Odessa cleaned itself from terrorists, proud for city fighting for its identity.glory to fallen heroes..” She even threw in a “glory to fallen heroes” at the end of her tweet celebrating this massacre. Keep in mind that it was a month after this tweet that the Atlantic Council gave her that Freedom Award for her communications work during the protests.
  9. In 2014, . . .  tweeted that a man had asked her to convince his grandson not to join the Azov Battalion, a neo-Nazi militia. “I couldn’t do it,” she said. “I thanked that boy and blessed him.” And he then traveled to Luhansk to fight pro-Russian rebels.
  10. Lionized a Nazi sniper killed in Ukraine’s civil war. In March 2018, a 19-year-old neo-Nazi named Andriy “Dilly” Krivich was shot and killed by a sniper. Krivich had been fighting with the fascist Ukrainian group Right Sector, and had posted photos on social media wearing Nazi German symbols. After he was killed, Kruk tweeted an homage to the teenage Nazi. (The Nazi was also lionized on Euromaidan Press’ Facebook page.)
  11. Has staunchly defended the use of the slogan “Slava Ukraini,” which was first coined and popularized by Nazi-collaborating fascists, and is now the official salute of Ukraine’s army.
  12. Has also said that the Ukrainian fascist politician Andriy Parubiy, who co-founded a neo-Nazi party before later becoming the chairman of Ukraine’s parliament, the Rada, is “acting smart,” writing, “Parubiy touche.” . . . .

In the context of Facebook’s institutional-level networking with fascists, it is worth noting that social media themselves have been cited as a contributing factor to right-wing domestic terrorism: “ . . . The first is stochastic terrorism: ‘The use of mass, public communication, usually against a particular individual or group, which incites or inspires acts of terrorism which are statistically probable but happen seemingly at random.’ I encountered the idea in a Friday thread from data scientist Emily Gorcenski, who used it to tie together four recent attacks. . . . .”

The program concludes with review (from FTR #1039) of the psychological warfare strategy adapted by Cambridge Analytica to the political arena. Christopher Wylie–the former head of research at Cambridge Analytica who became one of the key insider whistle-blowers about how Cambridge Analytica operated and the extent of Facebook’s knowledge about it–gave an interview to Campaign Magazine. (We dealt with Cambridge Analytica in FTR #’s 946, 1021.) Wylie recounts how, as director of research at Cambridge Analytica, his original role was to determine how the company could use the information warfare techniques used by SCL Group – Cambridge Analytica’s parent company and a defense contractor providing psy op services for the British military. Wylie’s job was to adapt the psychological warfare strategies that SCL had been using on the battlefield to the online space. As Wylie put it:

“ . . . . When you are working in information operations projects, where your target is a combatant, the autonomy or agency of your targets is not your primary consideration. It is fair game to deny and manipulate information, coerce and exploit any mental vulnerabilities a person has, and to bring out the very worst characteristics in that person because they are an enemy…But if you port that over to a democratic system, if you run campaigns designed to undermine people’s ability to make free choices and to understand what is real and not real, you are undermining democracy and treating voters in the same way as you are treating terrorists. . . . .”

Wylie also draws parallels between the psychological operations used on democratic audiences and the battlefield techniques used to build an insurgency.

1a. Following the sweeping victory of the BJP in India’s elections, a victory that exceeded expectations, there is no shortage of questions about how the BJP managed such a resounding win despite what appeared to be growing popular frustration with the party just six months ago. While the embrace of nationalism and sectarianism, along with the tensions with Pakistan, no doubt played a major role, it is also important to credit the profound role social media played in this year’s elections. Specifically, organized social media disinformation campaigns run by the BJP:

“India Has a Public Health Crisis. It’s Called Fake News.” by Samir Patil; The New York Times; 04/29/2019.

In the continuing Indian elections, as 900 million people are voting to elect representatives to the lower house of the Parliament, disinformation and hate speech are drowning out truth on social media networks in the country and creating a public health crisis like the pandemics of the past century.

This contagion of a staggering amount of morphed images, doctored videos and text messages is spreading largely through messaging services and influencing what India’s voters watch and read on their smartphones. A recent study by Microsoft found that over 64 percent of Indians encountered fake news online, the highest reported among the 22 countries surveyed.

India has the most social media users, with 300 million users on Facebook, 200 million on WhatsApp and 250 million using YouTube. TikTok, the video messaging service owned by a Chinese company, has more than 88 million users in India. And there are Indian messaging applications such as ShareChat, which claims to have 40 million users and allows them to communicate in 14 Indian languages.

These platforms are filled with fake news and disinformation aimed at influencing political choices during the Indian elections. Some of the egregious instances are a made-up BBC survey predicting victory for the governing Bharatiya Janata Party and a fake video of the opposition Congress Party president, Rahul Gandhi, saying a machine can convert potatoes into gold.

Fake stories are spread by legions of online trolls and unsuspecting users, with dangerous impact. A rumor spread through social media about child kidnappers arriving in various parts of India has led to 33 deaths in 69 incidents of mob violence since 2017, according to IndiaSpend, a data journalism website.

Six months before the 2014 general elections in India, 62 people were killed in sectarian violence and 50,000 were displaced from their homes in the northern state of Uttar Pradesh. Investigations by the police found that a fake video was shared on WhatsApp to whip up sectarian passions.

In the lead-up to the elections, the Indian government summoned the top executives of Facebook and Twitter to discuss the crisis of coordinated misinformation, fake news and political bias on their platforms. In March, Joel Kaplan, Facebook’s global vice president for public policy, was called to appear before a committee of 31 members of the Indian Parliament — who were mostly from the ruling Bharatiya Janata Party — to discuss “safeguarding citizens’ rights on social/online news media platforms.”

The hearing was an exercise in absurdist theater because the governing B.J.P. has been the chief beneficiary of divisive content that reaches millions because of the way social media algorithms, especially Facebook, amplify “engaging” articles.

As elsewhere in the world, Facebook, Twitter and YouTube are ambivalent about tackling the problem head-on for the fear of making decisions that invoke the wrath of national political forces. The tightrope walk was evident when in April, Facebook announced a ban on about 1,000 fake news pages targeting India. They included pages directly associated with political parties.

Facebook announced that a majority of the pages were associated with the opposition Indian National Congress party, but it merely named the technology company associated with the governing B.J.P. pages. Many news reports later pointed out that the pages related to the B.J.P. that were removed were far more consequential and reached millions.

Asking the social media platforms to fix the crisis is a deeply flawed approach because most of the disinformation is shared in a decentralized manner through messaging. Seeking to monitor those messages is a step toward accepting mass surveillance. The Indian government loves the idea and has proposed laws that, among other things, would break end-to-end encryption and obtain user data without a court order. 

The idea of more effective fact-checking has come up often in the debates around India’s disinformation contagion. But it comes with many conceptual difficulties: A large proportion of messages shared on social networks in India have little to do with verifiable facts and peddle prejudiced opinions. Facebook India has a small 11- to 22-member fact-checking team for content related to Indian elections.

Fake news is not a technological or scientific problem with a quick fix. It should be treated as a new kind of public health crisis in all its social and human complexity. The answer might lie in looking back at how we responded to the epidemics, the infectious diseases in the 19th and early 20th centuries, which have similar characteristics. . . .

1b. As the following article notes, the farce of the BJP government asking Facebook to help with the disinformation crisis is made even more pronounced by the fact that Facebook had previously conducted training workshops to help the BJP use Facebook more effectively. The article describes the teams that were set up by the BJP for the 2014 election to build a larger-than-life image for Modi. There were four such teams.

One of those teams was run by Modi’s right-hand man, Dr Hiren Joshi. Joshi has had, and continues to have, a close and long-standing association with Facebook’s senior employees in India, according to the article, and his team worked closely with Facebook’s staff. Shivnath Thukral, who was hired by Facebook in 2017 to be its Public Policy Director for India & South Asia, worked with this team in 2014. And that’s just an overview of how tightly Facebook was working with the BJP in 2014:

“Meet the advisors who helped make the BJP a social media powerhouse of data and propaganda” by Cyril Sam & Paranjoy Guha Thakurta; Scroll.in; 05/06/2019.

By the time Rajesh Jain was scaling up his operations in 2013, the BJP’s information technology (IT) strategists had begun interacting with social media platforms like Facebook and its partner WhatsApp. If supporters of the BJP are to be believed, the party was better than others in utilising the micro-targeting potential of the platforms. However, it is also true that Facebook’s employees in India conducted training workshops to help the members of the BJP’s IT cell.

Helping party functionaries were advertising honchos like Sajan Raj Kurup, founder of Creativeland Asia and Prahlad Kakkar, the well-known advertising professional. Actor Anupam Kher became the public face of some of the advertising campaigns. Also assisting the social media and online teams to build a larger-than-life image for Modi before the 2014 elections was a team led by his right-hand man Dr Hiren Joshi, who (as already stated) is a very important adviser to Modi whose writ extends way beyond information technology and social media.

Currently, Officer On Special Duty in the Prime Minister’s Office, he is assisted by two young professional “techies,” Nirav Shah and Yash Rajiv Gandhi. Joshi has had, and continues to have, a close and long-standing association with Facebook’s senior employees in India. In 2013, one of his important collaborators was Akhilesh Mishra who later went on to serve as a director of the Indian government’s website, MyGov India – which is at present led by Arvind Gupta who was earlier head of the BJP’s IT cell.

Mishra is CEO of Bluekraft Digital Foundation. The Foundation has been linked to a disinformation website titled “The True Picture,” has published books authored by Prime Minister Narendra Modi and produces campaign videos for NaMo Television, a 24 hour cable television channel dedicated to promoting Modi.

The 2014 Modi pre-election campaign was inspired by the 2012 campaign to elect Barack Obama as the “world’s first Facebook President.” Some of the managers of the Modi campaign like Jain were apparently inspired by Sasha Issenberg’s book on the topic, The Victory Lab: The Secret Science of Winning Campaigns. In the first data-led election in India in 2014, information was collected from every possible source to not just micro-target users but also fine-tune messages praising and “mythologising” Modi as the Great Leader who would usher in acche din for the country.

Four teams spearheaded the campaign. The first team was led by Mumbai-based Jain who funded part of the communication campaign and also oversaw voter data analysis. He was helped by Shashi Shekhar Vempati in running NITI and “Mission 272+.” As already mentioned, Shekhar had worked in Infosys and is at present the head of Prasar Bharati Corporation which runs Doordarshan and All India Radio.

The second team was led by political strategist Prashant Kishor and his I-PAC or Indian Political Action Committee who supervised the three-dimensional projection programme for Modi besides programmes like Run for Unity, Chai Pe Charcha (or Discussions Over Tea), Manthan (or Churning) and Citizens for Accountable Governance (CAG) that roped in management graduates to garner support for Modi at large gatherings. Having worked across the political spectrum and opportunistically switched affiliation to those who backed (and paid) him, 41-year-old Kishor is currently the second-in-command in Janata Dal (United) headed by Bihar Chief Minister Nitish Kumar.

The third team, that was intensely focused on building Modi’s personal image, was headed by Hiren Joshi himself who worked out of the then Gujarat Chief Minister’s Office in Gandhinagar. The members of this team worked closely with staffers of Facebook in India, more than one of our sources told us. As will be detailed later, Shivnath Thukral, who is currently an important executive in Facebook, worked with this team. (We made a number of telephone calls to Joshi’s office in New Delhi’s South Block seeking a meeting with him and also sent him an e-mail message requesting an interview but he did not respond.)

The fourth team was led by Arvind Gupta, the current CEO of MyGov.in, a social media platform run by the government of India. He ran the BJP’s campaign based out of New Delhi. When contacted, he too declined to speak on the record saying he is now with the government and not a representative of the BJP. He suggested we contact Amit Malviya who is the present head of the BJP’s IT cell. He came on the line but declined to speak specifically on the BJP’s relationship with Facebook and WhatsApp.

The four teams worked separately. “It was (like) a relay (race),” said Vinit Goenka who was then the national co-convener of the BJP’s IT cell, adding: “The only knowledge that was shared (among the teams) was on a ‘need to know’ basis. That’s how any sensible organisation works.”

From all accounts, Rajesh Jain worked independently from his Lower Parel office and invested his own funds to support Modi and towards executing what he described as “Project 275 for 2014” in a blog post that he wrote in June 2011, nearly three years before the elections actually took place. The BJP, of course, went on to win 282 seats in the 2014 Lok Sabha elections, ten above the half-way mark, with a little over 31 per cent of the vote.

As an aside, it may be mentioned in passing that – like certain former bhakts or followers of Modi – Jain today appears less than enthusiastic about the performance of the government over the last four and a half years. He is currently engaged in promoting a campaign called Dhan Vapasi (or “return our wealth”) which is aimed at monetising surplus land and other assets held by government bodies, including defence establishments, and public sector undertakings, for the benefit of the poor and the underprivileged. Dhan Vapasi, in his words, is all about making “every Indian rich and free.”

In one of his recent videos that are in the public domain, Jain remarked: “For the 2014 elections, I had spent three years and my own money to build a team of 100 people to help with Modi’s campaign. Why? Because I trusted that a Modi-led BJP government could end the Congress’ anti-prosperity programmes and put India on a path to prosperity, a nayi disha (or new direction). But four years have gone by without any significant change in policy. India needed that to eliminate the big and hamesha (perennial) problems of poverty, unemployment and corruption. The Modi-led BJP government followed the same old failed policy of increasing taxes and spending. The ruler changed, but the outcomes have not.”

As mentioned, when we contacted 51-year-old Jain, who heads the Mumbai-based Netcore group of companies, said to be India’s biggest digital media marketing corporate group, he declined to be interviewed. Incidentally, he had till October 2017 served on the boards of directors of two prominent public sector companies. One was National Thermal Power Corporation (NTPC) – Jain has no experience in the power sector, just as Sambit Patra, BJP spokesperson, who is an “independent” director on the board of the Oil and Natural Gas Corporation, has zero experience in the petroleum industry. Jain also served on the board of the Unique Identification Authority of India (UIDAI), which runs the Aadhar programme.

Unlike Jain who was not at all forthcoming, 44-year-old Prodyut Bora, founder of the BJP’s IT cell in 2007 (barely a year after Facebook and Twitter had been launched) was far from reticent while speaking to us. He had resigned from the party’s national executive in February 2015 after questioning Modi and Amit Shah’s “highly individualised and centralised style of decision-making” that had led to the “subversion of democratic traditions” in the government and in the party.

Bora recalled how he was one of the first graduates from the leading business school, the Indian Institute of Management, Ahmedabad, to join the BJP because of his great admiration for the then Prime Minister Atal Behari Vajpayee. It was at the behest of the then party president Rajnath Singh (who is now Union Home Minister) that he set up the party’s IT cell to enable its leaders to come closer to, and interact with, their supporters.

The cell, he told us, was created not with a mandate to abuse people on social media platforms. He lamented that “madness” has now gripped the BJP and the desire to win elections at any cost has “destroyed the very ethos” of the party he was once a part of. Today, the Gurgaon-based Bora runs a firm making air purification equipment and is involved with an independent political party in his home state, Assam.

He told us: “The process of being economical with the truth (in the BJP) began in 2014. The (election) campaign was sending out unverified facts, infomercials, memes, dodgy data and graphs. From there, fake news was one step up the curve. Leaders of political parties, including the BJP, like to outsource this work because they don’t want to leave behind digital footprints. In 2009, social media platforms like Facebook and WhatsApp had a marginal impact in India’s 20 big cities. By 2014, however, it had virtually replaced the traditional mass media. In 2019, it will be the most pervasive media in the country.” . . . .

. . . . At one stage in our interview with [Vinit] Goenka that lasted over two hours, we asked him a pointed question: “Who helped whom more, Facebook or the BJP?”

He smiled and said: “That’s a difficult question. I wonder whether the BJP helped Facebook more than Facebook helped the BJP. You could say, we helped each other.”

1c. According to Christopher Miller of RFERL, Facebook selected Kateryna Kruk for the position:

Since autumn 2018, Facebook has looked to hire a public policy manager for Ukraine. The job came after years of Ukrainians criticizing the platform for takedowns of its activists’ pages and the spread of Russian disinfo targeting Kyiv. Now, it appears to have one: @Kateryna_Kruk.— Christopher Miller (@ChristopherJM) June 3, 2019

Kruk’s LinkedIn page also lists her as being Facebook’s Public Policy Manager for Ukraine as of May of this year.

Kruk worked as an analyst and TV host for the Ukrainian ‘anti-Russian propaganda’ outfit StopFake. StopFake is the creation of Irena Chalupa, who works for the Atlantic Council and the Ukrainian government and appears to be the sister of Andrea and Alexandra Chalupa.

(As an example of how StopFake.org approaches Ukraine’s far right, here’s a tweet from StopFake’s co-founder, Yevhen Fedchenko, from May of 2018 where he complains about an article in Hromadske International that characterizes C14 as a neo-Nazi group:

“for Hromadske C14 is ‘neo- nazi’, in reality one of them – Oleksandr Voitko – is a war veteran and before going to the war – alum and faculty at @MohylaJSchool, journalist at Foreign news desk at Channel 5. Now also active participant of war veterans grass-root organization. https://t.co/QmaGnu6QGZ— Yevhen Fedchenko (@yevhenfedchenko) May 5, 2018)

In October of 2017, Kruk joined the “Kremlin Watch” team at the European Values think-tank. In June of 2014, the Atlantic Council gave Kruk its Freedom award for her communications work during the Euromaidan protests. Kruk also has a number of articles on the Atlantic Council’s blog. Here’s a blog post from August of 2018 where she advocates for the creation of an independent Ukrainian Orthodox Church to diminish the influence of the Russian Orthodox Church. Keep in mind that, in May of 2018, Facebook decided to effectively outsource the work of identifying propaganda and misinformation during elections to the Atlantic Council, so choosing someone like Kruk who already has the Atlantic Council’s stamp of approval is in keeping with that trend.

According to Kruk’s LinkedIn page she’s also done extensive work for the Ukrainian government. From March 2016 to January 2017 she was the Strategic Communications Manager for the Ukrainian parliament where she was responsible for social media and international communications. From January-April 2017 she was the Head of Communications at the Ministry of Health.

Kruk was not only a volunteer for Svoboda during the 2014 Euromaidan protests; she also openly celebrated on Twitter the May 2014 massacre in Odessa, when the far right burned dozens of protestors alive. Kruk’s Twitter feed is set to private now, so there isn’t public access to her old tweet, but people have screen captures of it. Here’s a tweet from Yasha Levine with a screenshot of Kruk’s May 2, 2014 tweet, where she writes:
“#Odessa cleaned itself from terrorists, proud for city fighting for its identity.glory to fallen heroes..”

She even threw in a “glory to fallen heroes” at the end of her tweet celebrating this massacre. Keep in mind that it was a month after this tweet that the Atlantic Council gave her that Freedom Award for her communications work during the protests.

An article from January of 2014 about the then-ongoing Maidan square protests covers the growing presence of the far right in the protests and their attacks on left-wing protestors. Kruk is interviewed in the article and describes herself as a Svoboda volunteer. Recall that Kruk issued the tweet celebrating the Odessa massacre just a few months later. Her role also stands out from a public relations standpoint: Kruk was delivering a message about why average Ukrainians who don’t necessarily support the far right should nonetheless turn to it for help, which was one of the most useful messages she could have been sending for the far right at that moment:

“The Ukrainian Nationalism at the Heart of ‘Euromaidan’” by Alec Luhn; The Nation; 01/21/2014.

. . . . For now, Svoboda and other far-right movements like Right Sector are focusing on the protest-wide demands for civic freedoms and government accountability rather than overtly nationalist agendas. Svoboda enjoys a reputation as a party of action, responsive to citizens’ problems. Noyevy cut an interview with The Nation short to help local residents who came with a complaint that a developer was tearing down a fence without permission.

“There are people who don’t support Svoboda because of some of their slogans, but they know it’s the most active political party and go to them for help,” said Svoboda volunteer Kateryna Kruk. “Only Svoboda is helping against land seizures in Kiev.” . . . .

1d. Kruk has manifested other fascist sympathies and connections:

  1. In 2014, she tweeted that a man had asked her to convince his grandson not to join the Azov Battalion, a neo-Nazi militia. “I couldn’t do it,” she said. “I thanked that boy and blessed him.” And he then traveled to Luhansk to fight pro-Russian rebels.
  2. Nazi sniper Dilly Krivich, posthumously lionized by Kateryna Kruk

    In March 2018, a 19-year-old neo-Nazi named Andriy “Dilly” Krivich was shot and killed by a sniper. Krivich had been fighting with the fascist Ukrainian group Right Sector, and had posted photos on social media wearing Nazi German symbols. After he was killed, Kruk tweeted an homage to the teenage Nazi. (The Nazi was also lionized on Euromaidan Press’ Facebook page.)

  3. Kruk has staunchly defended the use of the slogan “Slava Ukraini,” which was first coined and popularized by Nazi-collaborating fascists, and is now the official salute of Ukraine’s army.
  4. She has also said that the Ukrainian fascist politician Andriy Parubiy, who co-founded a neo-Nazi party before later becoming the chairman of Ukraine’s parliament, the Rada, is “acting smart,” writing, “Parubiy touche.” . . . .

“Facebook’s New Public Policy Manager Is Nationalist Hawk Who Volunteered with Fascist Party During US-Backed Coup” by Ben Norton; The Gray Zone; 6/4/2019.

. . . . Svoboda is not the only Ukrainian fascist group Kateryna Kruk has expressed support for. In 2014, she tweeted that a man had asked her to convince his grandson not to join the Azov Battalion, a neo-Nazi militia. “I couldn’t do it,” she said. “I thanked that boy and blessed him.” And he then traveled to Luhansk to fight pro-Russian rebels.

That’s not all. In March 2018, a 19-year-old neo-Nazi named Andriy “Dilly” Krivich was shot and killed by a sniper. Krivich had been fighting with the fascist Ukrainian group Right Sector, and had posted photos on social media wearing Nazi German symbols. After he was killed, Kruk tweeted an homage to the teenage Nazi. (The Nazi was also lionized on Euromaidan Press’ Facebook page.)

Kruk has staunchly defended the use of the slogan “Slava Ukraini,” which was first coined and popularized by Nazi-collaborating fascists, and is now the official salute of Ukraine’s army.

She has also said that the Ukrainian fascist politician Andriy Parubiy, who co-founded a neo-Nazi party before later becoming the chairman of Ukraine’s parliament, the Rada, is “acting smart,” writing, “Parubiy touche.” . . . .

2. The essence of the book Serpent’s Walk is presented on the back cover:

Serpent’s Walk by “Randolph D. Calverhall;” Copyright 1991 [SC]; National Vanguard Books; 0-937944-05-X.

It assumes that Hitler’s warrior elite – the SS – didn’t give up their struggle for a White world when they lost the Second World War. Instead their survivors went underground and adopted some of the tactics of their enemies: they began building their economic muscle and buying into the opinion-forming media. A century after the war they are ready to challenge the democrats and Jews for the hearts and minds of White Americans, who have begun to have their fill of government-enforced multi-culturalism and ‘equality.’

3. This process is described in more detail in a passage of text, consisting of a discussion between Wrench (a member of this Underground Reich) and a mercenary named Lessing.

Serpent’s Walk by “Randolph D. Calverhall;” Copyright 1991 [SC]; National Vanguard Books; 0-937944-05-X; pp. 42-43.

. . . . The SS . . . what was left of it . . . had business objectives before and during World War II. When the war was lost they just kept on, but from other places: Bogota, Asuncion, Buenos Aires, Rio de Janeiro, Mexico City, Colombo, Damascus, Dacca . . . you name it. They realized that the world is heading towards a ‘corporacracy;’ five or ten international super-companies that will run everything worth running by the year 2100. Those super-corporations exist now, and they’re already dividing up the production and marketing of food, transport, steel and heavy industry, oil, the media, and other commodities. They’re mostly conglomerates, with fingers in more than one pie . . . . We, the SS, have the say in four or five. We’ve been competing for the past sixty years or so, and we’re slowly gaining . . . . About ten years ago, we swung a merger, a takeover, and got voting control of a supercorp that runs a small but significant chunk of the American media. Not openly, not with bands and trumpets or swastikas flying, but quietly: one huge corporation cuddling up to another one and gently munching it up, like a great, gubbing amoeba. Since then we’ve been replacing executives, pushing somebody out here, bringing somebody else in there. We’ve swung program content around, too. Not much, but a little, so it won’t show. We’ve cut down on ‘nasty-Nazi’ movies . . . good guys in white hats and bad guys in black SS hats . . . lovable Jews versus fiendish Germans . . . and we have media psychologists, ad agencies, and behavior modification specialists working on image changes. . . .

4. The broadcast addresses the gradual remaking of the image of the Third Reich that is represented in Serpent’s Walk. In the discussion excerpted above, this process is further described.

Serpent’s Walk by “Randolph D. Calverhall;” Copyright 1991 [SC]; National Vanguard Books; 0-937944-05-X; pp. 42-44.

. . . . Hell, if you can con granny into buying Sugar Turds instead of Bran Farts, then why can’t you swing public opinion over to a cause as vital and important as ours?’ . . . In any case, we’re slowly replacing those negative images with others: the ‘Good Bad Guy’ routine’ . . . ‘What do you think of Jesse James? John Dillinger? Julius Caesar? Genghis Khan?’ . . . The reality may have been rough, but there’s a sort of glitter about most of those dudes: mean honchos but respectable. It’s all how you package it. Opinion is a godamned commodity!’ . . . It works with anybody . . . Give it time. Aside from the media, we’ve been buying up private schools . . . and helping some public ones through philanthropic foundations . . . and working on the churches and the Born Agains. . . .

5. Through the years, we have highlighted the Nazi tract Serpent’s Walk, excerpted above, which deals, in part, with the rehabilitation of the Third Reich’s reputation and the transformation of Hitler into a hero.

In FTR #1015, we noted that a Serpent’s Walk scenario is indeed unfolding in India.

Key points of analysis and discussion include:

  1. Narendra Modi’s presence on the same book cover (along with Gandhi, Mandela, Obama and Hitler).
  2. Modi himself has his own political history with children’s books that promote Hitler as a great leader: ” . . . . In 2004, reports surfaced of high-school textbooks in the state of Gujarat, which was then led by Mr. Modi, that spoke glowingly of Nazism and fascism. According to ‘The Times of India,’ in a section called ‘Ideology of Nazism,’ the textbook said Hitler had ‘lent dignity and prestige to the German government,’ ‘made untiring efforts to make Germany self-reliant’ and ‘instilled the spirit of adventure in the common people.’  . . . .”
  3. In India, many have a favorable view of Hitler: ” . . . . as far back as 2002, the Times of India reported a survey that found that 17 percent of students in elite Indian colleges ‘favored Adolf Hitler as the kind of leader India ought to have.’ . . . . Consider Mein Kampf, Hitler’s autobiography. Reviled it might be in much of the world, but Indians buy thousands of copies of it every month. As a recent paper in the journal EPW tells us (PDF), there are over a dozen Indian publishers who have editions of the book on the market. Jaico, for example, printed its 55th edition in 2010, claiming to have sold 100,000 copies in the previous seven years. (Contrast this to the 3,000 copies my own 2009 book, Roadrunner, has sold). In a country where 10,000 copies sold makes a book a bestseller, these are significant numbers. . . .”
  4. A classroom of school children filled with fans of Hitler had a very different sentiment about Gandhi. ” . . . . ‘He’s a coward!’ That’s the obvious flip side of this love of Hitler in India. It’s an implicit rejection of Gandhi. . . .”
  5. Apparently, Mein Kampf has achieved gravitas among business students in India: ” . . . . What’s more, there’s a steady trickle of reports that say it has become a must-read for business-school students; a management guide much like Spencer Johnson’s Who Moved My Cheese or Edward de Bono’s Lateral Thinking. If this undistinguished artist could take an entire country with him, I imagine the reasoning goes, surely his book has some lessons for future captains of industry? . . . .”

6. Christopher Wylie–the former head of research at Cambridge Analytica who became one of the key insider whistle-blowers about how Cambridge Analytica operated and the extent of Facebook’s knowledge about it–gave an interview last month to Campaign Magazine. (We dealt with Cambridge Analytica in FTR #’s 946, 1021.)

Wylie recounts how, as director of research at Cambridge Analytica, his original role was to determine how the company could use the information warfare techniques used by SCL Group – Cambridge Analytica’s parent company and a defense contractor providing psy op services for the British military. Wylie’s job was to adapt the psychological warfare strategies that SCL had been using on the battlefield to the online space. As Wylie put it:

“ . . . . When you are working in information operations projects, where your target is a combatant, the autonomy or agency of your targets is not your primary consideration. It is fair game to deny and manipulate information, coerce and exploit any mental vulnerabilities a person has, and to bring out the very worst characteristics in that person because they are an enemy…But if you port that over to a democratic system, if you run campaigns designed to undermine people’s ability to make free choices and to understand what is real and not real, you are undermining democracy and treating voters in the same way as you are treating terrorists. . . . .”

Wylie also draws parallels between the psychological operations used on democratic audiences and the battlefield techniques used to build an insurgency. It starts with targeting people more prone to having erratic traits, paranoia or conspiratorial thinking, and getting them to “like” a group on social media. The information you’re feeding this target audience may or may not be real. The important thing is that it’s content that they already agree with so that “it feels good to see that information.” Keep in mind that one of the goals of the ‘psychographic profiling’ that Cambridge Analytica conducted was to identify traits like neuroticism.

Wylie goes on to describe the next step in this insurgency-building technique: keep building up the interest in the social media group that you’re directing this target audience towards until it hits around 1,000-2,000 people. Then set up a real-life event dedicated to the chosen disinformation topic in some local area and try to get as many of your target audience to show up as possible. Even if only 5 percent of them show up, that’s still 50-100 people converging on some local coffee shop or whatever. The people meet each other in real life and start talking about “all these things that you’ve been seeing online in the depths of your den and getting angry about”. This target audience starts believing that no one else is talking about this stuff because “they don’t want you to know what the truth is”. As Wylie puts it, “What started out as a fantasy online gets ported into the temporal world and becomes real to you because you see all these people around you.”

“Cambridge Analytica whistleblower Christopher Wylie: It’s time to save creativity” by Kate Magee; Campaign; 11/05/2018.

In the early hours of 17 March 2018, the 28-year-old Christopher Wylie tweeted: “Here we go….”

Later that day, The Observer and The New York Times published the story of Cambridge Analytica’s misuse of Facebook data, which sent shockwaves around the world, caused millions to #DeleteFacebook, and led the UK Information Commissioner’s Office to fine the site the maximum penalty for failing to protect users’ information. Six weeks after the story broke, Cambridge Analytica closed. . . .

. . . . He believes that poor use of data is killing good ideas. And that, unless effective regulation is enacted, society’s worship of algorithms, unchecked data capture and use, and the likely spread of AI to all parts of our lives is causing us to sleepwalk into a bleak future.

Not only are such circumstances a threat to adland – why do you need an ad to tell you about a product if an algorithm is choosing it for you? – it is a threat to human free will. “Currently, the only morality of the algorithm is to optimise you as a consumer and, in many cases, you become the product. There are very few examples in human history of industries where people themselves become products and those are scary industries – slavery and the sex trade. And now, we have social media,” Wylie says.

“The problem with that, and what makes it inherently different to selling, say, toothpaste, is that you’re selling parts of people or access to people. People have an innate moral worth. If we don’t respect that, we can create industries that do terrible things to people. We are [heading] blindly and quickly into an environment where this mentality is going to be amplified through AI everywhere. We’re humans, we should be thinking about people first.”

His words carry weight, because he’s been on the dark side. He has seen what can happen when data is used to spread misinformation, create insurgencies and prey on the worst of people’s characters.

The political battlefield

A quick refresher on the scandal, in Wylie’s words: Cambridge Analytica was a company spun out of SCL Group, a British military contractor that worked in information operations for armed forces around the world. It was conducting research on how to scale and digitise information warfare – the use of information to confuse or degrade the efficacy of an enemy. . . .

. . . . As director of research, Wylie’s original role was to map out how the company would take traditional information operations tactics into the online space – in particular, by profiling people who would be susceptible to certain messaging.

This morphed into the political arena. After Wylie left, the company worked on Donald Trump’s US presidential campaign and – possibly – the UK’s European Union referendum. In February 2016, Cambridge Analytica’s former chief executive, Alexander Nix, wrote in Campaign that his company had “already helped supercharge Leave.EU’s social-media campaign”. Nix has strenuously denied this since, including to MPs.

It was this shift from the battlefield to politics that made Wylie uncomfortable. “When you are working in information operations projects, where your target is a combatant, the autonomy or agency of your targets is not your primary consideration. It is fair game to deny and manipulate information, coerce and exploit any mental vulnerabilities a person has, and to bring out the very worst characteristics in that person because they are an enemy,” he says.

“But if you port that over to a democratic system, if you run campaigns designed to undermine people’s ability to make free choices and to understand what is real and not real, you are undermining democracy and treating voters in the same way as you are treating terrorists.”

One of the reasons these techniques are so insidious is that being a target of a disinformation campaign is “usually a pleasurable experience”, because you are being fed content with which you are likely to agree. “You are being guided through something that you want to be true,” Wylie says.

To build an insurgency, he explains, you first target people who are more prone to having erratic traits, paranoia or conspiratorial thinking, and get them to “like” a group on social media. They start engaging with the content, which may or may not be true; either way “it feels good to see that information”.

When the group reaches 1,000 or 2,000 members, an event is set up in the local area. Even if only 5% show up, “that’s 50 to 100 people flooding a local coffee shop”, Wylie says. This, he adds, validates their opinion because other people there are also talking about “all these things that you’ve been seeing online in the depths of your den and getting angry about”.

People then start to believe the reason it’s not shown on mainstream news channels is because “they don’t want you to know what the truth is”. As Wylie sums it up: “What started out as a fantasy online gets ported into the temporal world and becomes real to you because you see all these people around you.” . . . . 

. . . . Psychographic potential

One such application was Cambridge Analytica’s use of psychographic profiling, a form of segmentation that will be familiar to marketers, although not in common use.

The company used the OCEAN model, which judges people on scales of the Big Five personality traits: openness to experiences, conscientiousness, extraversion, agreeableness and neuroticism.

Wylie believes the method could be useful in the commercial space. For example, a fashion brand that creates bold, colourful, patterned clothes might want to segment wealthy women by extroversion because they will be more likely to buy bold items, he says.

Sceptics say Cambridge Analytica’s approach may not be the dark magic that Wylie claims. Indeed, when speaking to Campaign in June 2017, Nix uncharacteristically played down the method, claiming the company used “pretty bland data in a pretty enterprising way”.

But Wylie argues that people underestimate what algorithms allow you to do in profiling. “I can take pieces of information about you that seem innocuous, but what I’m able to do with an algorithm is find patterns that correlate to underlying psychological profiles,” he explains.

“I can ask whether you listen to Justin Bieber, and you won’t feel like I’m invading your privacy. You aren’t necessarily aware that when you tell me what music you listen to or what TV shows you watch, you are telling me some of your deepest and most personal attributes.”

This is where matters stray into the question of ethics. Wylie believes that as long as the communication you are sending out is clear, not coercive or manipulative, it’s fine, but it all depends on context. “If you are a beauty company and you use facets of neuroticism – which Cambridge Analytica did – and you find a segment of young women or men who are more prone to body dysmorphia, and one of the proactive actions they take is to buy more skin cream, you are exploiting something which is unhealthy for that person and doing damage,” he says. “The ethics of using psychometric data really depend on whether it is proportional to the benefit and utility that the customer is getting.” . . .

Clashes with Facebook

Wylie is opposed to self-regulation, because industries won’t become consumer champions – they are, he says, too conflicted.

“Facebook has known about what Cambridge Analytica was up to from the very beginning of those projects,” Wylie claims. “They were notified, they authorised the applications, they were given the terms and conditions of the app that said explicitly what it was doing. They hired people who worked on building the app. I had legal correspondence with their lawyers where they acknowledged it happened as far back as 2016.”

He wants to create a set of enduring principles that are handed over to a technically competent regulator to enforce. “Currently, the industry is not responding to some pretty fundamental things that have happened on their watch. So I think it is the right place for government to step in,” he adds.

Facebook in particular, he argues, is “the most obstinate and belligerent in recognising the harm that has been done and actually doing something about it”. . . .

7. Social media have been underscored as a contributing factor to right-wing domestic terrorism: “ . . . The first is stochastic terrorism: ‘The use of mass, public communication, usually against a particular individual or group, which incites or inspires acts of terrorism which are statistically probable but happen seemingly at random.’ I encountered the idea in a Friday thread from data scientist Emily Gorcenski, who used it to tie together four recent attacks. . . . .”

“Why Social Media is Friend to Far-Right Politicians Around the World” by Casey Newton; The Verge; 10/30/2018.

The Links Between Social Media, Domestic Terrorism and the Retreat from Democracy

It was an awful weekend of hate-fueled violence, ugly rhetoric, and worrisome retreats from our democratic ideals. Today I’m focused on two ways of framing what we’re seeing, from the United States to Brazil. While neither offers any comfort, they do give helpful names to phenomena I expect will be with us for a long while.

The first is stochastic terrorism: “The use of mass, public communication, usually against a particular individual or group, which incites or inspires acts of terrorism which are statistically probable but happen seemingly at random.” I encountered the idea in a Friday thread from data scientist Emily Gorcenski, who used it to tie together four recent attacks.

In her thread, Gorcenski argues that various right-wing conspiracy theories and frauds, amplified both through mainstream and social media, have resulted in a growing number of cases where men snap and commit violence. “Right-wing media is a gradient pushing rightwards, toward violence and oppression,” she wrote. “One of the symptoms of this is that you are basically guaranteed to generate random terrorists. Like popcorn kernels popping.”

On Saturday, another kernel popped. Robert A. Bowers, the suspect in a shooting at a synagogue that left 11 people dead, was steeped in online conspiracy culture. He posted frequently to Gab, a Twitter clone that emphasizes free speech and has become a favored social network among white nationalists. Julie Turkewitz and Kevin Roose described his hateful views in the New York Times:

After opening an account on it in January, he had shared a stream of anti-Jewish slurs and conspiracy theories. It was on Gab where he found a like-minded community, reposting messages from Nazi supporters.

“Jews are the children of Satan,” read Mr. Bowers’s biography.

Bowers is in custody — his life was saved by Jewish doctors and nurses — and presumably will never go free again. Gab’s life, however, may be imperiled. Two payment processors, PayPal and Stripe, de-platformed the site, as did its cloud host, Joyent. The site went down on Monday after its hosting provider, GoDaddy, told it to find another one. Its founder posted defiant messages on Twitter and elsewhere promising it would survive.

Gab hosts a lot of deeply upsetting content, and to its supporters, that’s the point. Free speech is a right, their reasoning goes, and it ought to be exercised. Certainly it seems wrong to suggest that Gab or any other single platform “caused” Bowers to act. Hatred, after all, is an ecosystem. But his action came amid a concerted effort to focus attention on a caravan of migrants coming to the United States in search of refuge.

Right-wing media, most notably Fox News, has advanced the idea that the caravan is linked to Jewish billionaire (and Holocaust survivor) George Soros. An actual Congressman, Florida Republican Matt Gaetz, suggested the caravan was funded by Soros. Bowers enthusiastically pushed these conspiracy theories on social media.

In his final post on Gab, Bowers wrote: “I can’t sit by and watch my people get slaughtered. Screw your optics. I’m going in.”

The individual act was random. But it had become statistically probable thanks to the rise of anti-immigrant rhetoric across all manner of media. And I fear we will see far more of it before the current fever breaks.
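The "random but statistically probable" framing has a simple arithmetic core: if each of a very large number of people exposed to the rhetoric has only a tiny chance of acting, the probability that at least one of them acts approaches certainty. The figures below are invented purely to show the shape of the calculation; they are not estimates of anything real.

```python
# Illustrative arithmetic behind "statistically probable but seemingly random".
# Both figures are invented for the example; they are not real estimates.

p = 1e-5        # assumed chance that any one exposed individual acts in a given year
N = 500_000     # assumed number of people steeped in the rhetoric

p_at_least_one = 1 - (1 - p) ** N
print(f"P(at least one attack) = {p_at_least_one:.3f}")  # ~0.993

# Which individual acts is unpredictable; that *someone* acts is near-certain.
```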

The second concept I’m thinking about today is democratic recession. The idea, which is roughly a decade old, is that democracy is in retreat around the globe. The Economist covered it in January:

The tenth edition of the Economist Intelligence Unit’s Democracy Index suggests that this unwelcome trend remains firmly in place. The index, which comprises 60 indicators across five broad categories—electoral process and pluralism, functioning of government, political participation, democratic political culture and civil liberties—concludes that less than 5% of the world’s population currently lives in a “full democracy”. Nearly a third live under authoritarian rule, with a large share of those in China. Overall, 89 of the 167 countries assessed in 2017 received lower scores than they had the year before.
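For readers unfamiliar with composite indices of this kind, the sketch below shows the general mechanics: category scores are averaged into a 0-to-10 overall score, which is then mapped to a regime type. The category values and classification cut-offs are placeholders chosen to illustrate the method, not the EIU's exact figures for any country.

```python
# Sketch of a composite democracy index: average the category scores, then classify.
# Category values and cut-offs are illustrative placeholders, not EIU figures.

categories = {
    "electoral_process_and_pluralism": 9.2,
    "functioning_of_government": 7.5,
    "political_participation": 7.8,
    "democratic_political_culture": 8.1,
    "civil_liberties": 9.0,
}

overall = sum(categories.values()) / len(categories)

def classify(score: float) -> str:
    if score > 8.0:
        return "full democracy"
    if score > 6.0:
        return "flawed democracy"
    if score > 4.0:
        return "hybrid regime"
    return "authoritarian regime"

print(f"overall score: {overall:.2f} -> {classify(overall)}")
```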

In January, The Economist considered Brazil a “flawed democracy.” But after this weekend, the country may undergo a more precipitous decline in democratic freedoms. As expected, far-right candidate Jair Bolsonaro, who speaks approvingly of the country’s previous military dictatorship, handily won election over his leftist rival.

In the best piece I read today, BuzzFeed’s Ryan Broderick — who was in Brazil for the election — puts Bolsonaro’s election into the context of the internet and social platforms. Broderick focuses on the symbiosis between internet media, which excels at promoting a sense of perpetual crisis and outrage, and far-right leaders who promise a return to normalcy.

Typically, large right-wing news channels or conservative tabloids will then take these stories going viral on Facebook and repackage them for older, mainstream audiences. Depending on your country’s media landscape, the far-right trolls and influencers may try to hijack this social-media-to-newspaper-to-television pipeline. Which then creates more content to screenshot, meme, and share. It’s a feedback loop.

Populist leaders and the legions of influencers riding their wave know they can create filter bubbles inside of platforms like Facebook or YouTube that promise a safer time, one that never existed in the first place, before the protests, the violence, the cascading crises, and endless news cycles. Donald Trump wants to Make America Great Again; Bolsonaro wants to bring back Brazil’s military dictatorship; Shinzo Abe wants to recapture Japan’s imperial past; Germany’s AfD performed the best with older East German voters longing for the days of authoritarianism. All of these leaders promise to close borders, to make things safe. Which will, of course, usually exacerbate the problems they’re promising to disappear. Another feedback loop.

A third feedback loop, of course, is between a social media ecosystem promoting a sense of perpetual crisis and outrage, and the random-but-statistically-probable production of domestic terrorists.

Perhaps the global rise of authoritarians and big tech platforms are merely correlated, and no causation can be proved. But I increasingly wonder whether we would benefit if tech companies assumed that some level of causation was real — and, assuming that it is, what they might do about it.

DEMOCRACY

On Social Media, No Answers for Hate

You don’t have to go to Gab to see hateful posts. Sheera Frenkel, Mike Isaac, and Kate Conger report on how the past week’s domestic terror attacks played out in once-happier places, most notably Instagram:

On Monday, a search on Instagram, the photo-sharing site owned by Facebook, produced a torrent of anti-Semitic images and videos uploaded in the wake of Saturday’s shooting at a Pittsburgh synagogue.

A search for the word “Jews” displayed 11,696 posts with the hashtag “#jewsdid911,” claiming that Jews had orchestrated the Sept. 11 terror attacks. Other hashtags on Instagram referenced Nazi ideology, including the number 88, an abbreviation used for the Nazi salute “Heil Hitler.”

Attacks on Jewish people rising on Instagram and Twitter, researchers say

Just before the synagogue attack took place on Saturday, David Ingram posted this story about an alarming rise in attacks on Jews on social platforms:

Samuel Woolley, a social media researcher who worked on the study, analyzed more than 7 million tweets from August and September and found an array of attacks, also often linked to Soros. About a third of the attacks on Jews came from automated accounts known as “bots,” he said.

“It’s really spiking during this election,” Woolley, director of the Digital Intelligence Laboratory, which studies the intersection of technology and society, said in a telephone interview. “We’re seeing what we think is an attempt to silence conversations in the Jewish community.”

Russian disinformation on Facebook targeted Ukraine well before the 2016 U.S. election

Dana Priest, James Jacoby and Anya Bourg report that Ukraine’s experience with information warfare offered an early — and unheeded — warning to Facebook:

To get Zuckerberg’s attention, the president posted a question for a town hall meeting at Facebook’s Silicon Valley headquarters. There, a moderator read it aloud.

“Mark, will you establish a Facebook office in Ukraine?” the moderator said, chuckling, according to a video of the assembly. The room of young employees rippled with laughter. But the government’s suggestion was serious: It believed that a Kiev office, staffed with people familiar with Ukraine’s political situation, could help solve Facebook’s high-level ignorance about Russian information warfare. . . . .

 

Discussion

2 comments for “FTR #1074 FakeBook: Walkin’ the Snake on the Earth Island with Facebook (FascisBook, Part 2; In Your Facebook, Part 4)”

  1. The Ukrainian publication 112.ua has a piece on the appointment of Kateryna Kruk as Facebook’s new head of Public Policy for Ukraine that provides some of the backstory for how this position got created in the first place. And, yes, it’s rather disturbing. Surprise!

    So back in 2015, Facebook was engaged in widespread blocking of users from Ukraine. It got to the point where then-President Petro Poroshenko asked Mark Zuckerberg to open a Facebook office in Ukraine to handle the issue of when someone should be blocked. At that point, it was Facebook’s office in Ireland that made those decisions for Ukraine’s users. Zuckerberg responded that the blocking of the Ukrainian accounts was done right because “language of hostility” was used in them. Given the civil war at the time, and the fact that neo-Nazi movements were playing a major role in fighting on the pro-Kiev side of that war, we can get a pretty good idea of what that “language of hostility” would have sounded like.

    Flash forward to October of 2018, and Facebook announces a competition for the position of public policy manager for Ukraine. As Facebook’s post put it, “We are looking for a good communicator that can combine the passion for the Internet services Facebook provides and has deep knowledge of the political and regulatory dynamics in Ukraine and, preferably, in all the Eastern European region,” and that someone with experience working on political issues with the participation of the Ukrainian government would be preferred.

    Interestingly, one source in the article indicates that the new manager position won’t be handling the banning of users. Of course, the article also references the Public Policy team. In other words, Kruk is going to have a bunch of people working under her, so it seems likely that people working under Kruk would be the ones actually handling the bannings. Plus, one of the responsibilities Kruk will have includes helping to “create rules in the Internet sector”, and it’s very possible tweaking those rules will be how Kruk prevents the need for future bannings. And the article explicitly says that it is expected that after this new appointment the blocking of posts of Ukrainian users would stop.

    So in 2015, Ukraine’s government complains about Facebook banning people for what sounds like hate speech and requests a special Ukrainian-specific office for handling who gets banned, and four years later Facebook basically does exactly that:

    112 UA

    Facebook appoints manager for Ukraine: What does it mean?

    Kateryna Kruk became a Facebook public policy manager for Ukraine

    Author : News Agency 112.ua
    14:08, 7 June 2019

    In early June, Facebook for the first time in its history appointed a public policy manager for Ukraine – she is Ukrainian Kateryna Kruk. It is expected that after this appointment the blocking of posts of Ukrainian users would stop, as well as “gross and unprofessional attitude of Facebook towards Ukraine and Ukrainians.”

    What is the idea?

    In spring of 2015, due to the mass blocking of Ukrainian users, the Ukrainian Facebook group addressed the founder of Facebook Mark Zuckerberg with a request to create a Ukrainian administration. Former Ukrainian President Petro Poroshenko also asked Zuckerberg to open a Facebook office in Ukraine.

    Zuckerberg said that the controversial publications for which Ukrainian users were banned, were deleted rightly, because “language of hostility” was used in them. At the same time, Zuckerberg said that the Ukrainian social networking segment is moderated by an office in Ireland, and the issue of opening a representative office of a social network in Ukraine can be considered over time.

    And in October 2018, Facebook announced a competition for the position of public policy manager for Ukraine.

    “We are looking for a good communicator that can combine the passion for the Internet services Facebook provides and has deep knowledge of the political and regulatory dynamics in Ukraine and, preferably, in all the Eastern European region,” said the comment to the position.

    In addition, it was noted that a candidate who is acquainted with politicians and government officials, and has experience in working on political issues with the participation of the Ukrainian government, will be given preference to. It was reported that the new manager would work at the Facebook office in Warsaw.

    In early June, it became known that Kateryna Kruk became Facebook’s Public Policy Manager for Ukraine.

    Who is Kateryna Kruk?

    The Deputy Minister of Information Policy Dmytro Zolotukhin believes that Katerina Kruk is the best choice that could have been made by Facebook. The ministry promised to support Kruk in all matters that will be in the interests of Ukraine.

    What will the new manager do?

    It is noted, that the Public Policy team is engaged in communication between politicians and Facebook: responds to inquiries from politicians and regulatory bodies, helps to create rules in the Internet sector, shares information about products and activities of the company.

    In addition, the manager will monitor the legislation and regulatory issues related to Facebook in Ukraine, form coalitions with other organizations to promote the political goals of the social network, communicate with the media and represent the interests of Facebook before state agencies.

    At the same time, some sources report that the work with blocked groups, user bans and problems with the advertising cabinets are not included in the manager’s responsibilities.

    What’s next?

    Earlier Dmytro Zolotukhin noted that, first of all the new manager would act in the interests of the company, which pays him/her.

    “However, on the other hand, this will relieve us of suspicion of who really solves conflict situations with Ukrainian users,” Zolotukhin wrote in the fall last year.

    And after announcing the results of the competition for the vacancy, he expressed the hope that after this appointment the “gross and unprofessional attitude of Facebook towards Ukraine and Ukrainians” [would end].

    ———–

    “Facebook appoints manager for Ukraine: What does it mean?” by News Agency 112.ua; 112.ua, 06/07/2019

    “In early June, Facebook for the first time in its history appointed a public policy manager for Ukraine – she is Ukrainian Kateryna Kruk. It is expected that after this appointment the blocking of posts of Ukrainian users would stop, as well as “gross and unprofessional attitude of Facebook towards Ukraine and Ukrainians.””

    No more blockings of Ukrainian posts. That’s the expectation now that Kruk has this new position. It’s quite a change from 2015 when Mark Zuckerberg himself defended the blocking of such posts because they violated Facebook’s terms of use by including “language of hostility”, which is almost certainly a euphemism for Nazi hate speech. But Zuckerberg said the company would consider Petro Poroshenko’s request for a special Ukrainian office to handle these issues and in 2018 the company decided to go ahead with the idea:


    What is the idea?

    In spring of 2015, due to the mass blocking of Ukrainian users, the Ukrainian Facebook group addressed the founder of Facebook Mark Zuckerberg with a request to create a Ukrainian administration. Former Ukrainian President Petro Poroshenko also asked Zuckerberg to open a Facebook office in Ukraine.

    Zuckerberg said that the controversial publications for which Ukrainian users were banned, were deleted rightly, because “language of hostility” was used in them. At the same time, Zuckerberg said that the Ukrainian social networking segment is moderated by an office in Ireland, and the issue of opening a representative office of a social network in Ukraine can be considered over time.

    And in October 2018, Facebook announced a competition for the position of public policy manager for Ukraine.

    “We are looking for a good communicator that can combine the passion for the Internet services Facebook provides and has deep knowledge of the political and regulatory dynamics in Ukraine and, preferably, in all the Eastern European region,” said the comment to the position.

    In addition, it was noted that a candidate who is acquainted with politicians and government officials, and has experience in working on political issues with the participation of the Ukrainian government, will be given preference to. It was reported that the new manager would work at the Facebook office in Warsaw.

    And while it doesn’t sound like the manager (Kruk) will be directly responsible for handling bannings, it also sounds like she’s going to be managing a team of people, so we would expect that team to be the ones actually handling the bannings. Plus, Kruk’s responsibilities for things like helping to “create rules in the Internet sector” are a far more effective way to lift the rules that were resulting in these bans:


    What will the new manager do?

    It is noted, that the Public Policy team is engaged in communication between politicians and Facebook: responds to inquiries from politicians and regulatory bodies, helps to create rules in the Internet sector, shares information about products and activities of the company.

    In addition, the manager will monitor the legislation and regulatory issues related to Facebook in Ukraine, form coalitions with other organizations to promote the political goals of the social network, communicate with the media and represent the interests of Facebook before state agencies.

    At the same time, some sources report that the work with blocked groups, user bans and problems with the advertising cabinets are not included in the manager’s responsibilities.

    And note how Ukraine’s Deputy Minister of Information Policy has already pledged to support Kruk and has expressed his hope that this appointment will end the “gross and unprofessional attitude of Facebook towards Ukraine and Ukrainians”:


    The Deputy Minister of Information Policy Dmytro Zolotukhin believes that Katerina Kruk is the best choice that could have been made by Facebook. The ministry promised to support Kruk in all matters that will be in the interests of Ukraine.

    What’s next?

    Earlier Dmytro Zolotukhin noted that, first of all the new manager would act in the interests of the company, which pays him/her.

    “However, on the other hand, this will relieve us of suspicion of who really solves conflict situations with Ukrainian users,” Zolotukhin wrote in the fall last year.

    And after announcing the results of the competition for the vacancy, he expressed the hope that after this appointment the “gross and unprofessional attitude of Facebook towards Ukraine and Ukrainians” [would end].

    Now, it’s important to acknowledge that there have undoubtedly been some bans of Ukrainians that were the result of pro-Kremlin trolls (and vice versa). That was, in fact, one of the big complaints of Ukrainians in 2015: that pro-Kremlin trolls were effectively gaming Facebook’s systems to get Ukrainians banned. But there’s also no denying that Ukraine is awash in government-backed fascist propaganda that undoubtedly violates Facebook’s various rules against hate speech. And now we have a far right sympathizer, Kruk, in this new position.

    So it’s going to be really interesting to see what happens with government-backed neo-Nazi groups like Azov. As the following article from April of this year describes, Azov members first started experiencing bannings in 2015, and the group was quietly banned entirely at some point this year. Except, despite that ban, Azov remains on Facebook, just under new pages. Olena Semenyaka, the international spokesperson for the movement, has had multiple pages banned but has multiple pages still up. And that’s going to be an important thing to keep in mind as this plays out: even if Facebook bans these far right groups, getting around those bans appears to be trivial:

    Radio Free Europe/Radio Liberty

    Facebook ‘Bans’ Ukrainian Far-Right Group Over ‘Hate Speech’ — But Getting Rid Of It Isn’t Easy

    By Christopher Miller
    April 16, 2019 18:50 GMT

    KYIV — Ukraine’s militaristic, far-right Azov movement and its various branches have used Facebook to promote its antidemocratic, ultranationalist messages and recruit new members since its inception at the start of the country’s war against Russia-backed separatists five years ago.

    The American social-networking giant has also been an important platform for Azov’s global expansion and attempts to legitimize itself among likeminded American and European white nationalists.

    Facebook has occasionally taken down pages and groups associated with Azov when they have been found to be in violation of its policies on hate speech and the depiction of violence.

    The first Facebook removals occurred in 2015, Azov members told RFE/RL.

    But after continuous, repeat violations Azov — which includes many war veterans and militant members with openly neo-Nazi views who have been involved in attacks on LGBT activists, Romany encampments, and women’s groups — is now officially banned from having any presence on Facebook, the social network has confirmed to RFE/RL.

    Despite the ban, however, which quietly came into force months ago, a defiant Azov and its members remain active on the social network under pseudonyms and name variations, underscoring the difficulty Facebook faces in combating extremism on a platform with some 2.32 billion monthly active users.

    ‘Organized Hate’ Not Allowed

    For years, Facebook has struggled with how to deal with extremist content and it has been criticized for moving too slowly on it or behaving reactively.

    The issue was put front-and-center in August 2017, when the platform was used to organize a white supremacist rally in Charlottesville, Virginia, that turned deadly.

    The issue was raised most recently in the aftermath of the Christchurch massacre that left 50 people dead. The shooter livestreamed the killing on his Facebook page. The company said it had “quickly removed both the shooter’s Facebook and Instagram accounts and the video,” and was taking down posts of praise or support for the shooting.

    Joe Mulhall, a senior researcher at the U.K.-based antifascist organization Hope Not Hate, told RFE/RL by phone that Charlottesville brought a “sea change” when it came to social media companies and Facebook, in particular paying attention to extremists.

    For instance, he praised the company for its “robust” action against the far-right founder of the English Defence League, Tommy Robinson, who had repeatedly violated Facebook’s policies on hate speech.

    But Mulhall said Facebook more often acts only after “they’re publicly shamed.”

    “When there is massive public pressure, they act; or when they think they can get away with things, they don’t,” he added.

    This may explain why it took Facebook years to ban the Azov movement, which received significant media attention following a series of violent attacks against minorities in 2018.

    Facebook did not specify what exactly tipped the scale. But responding to an RFE/RL e-mail request on April 15, a Facebook spokesperson wrote that the company has been taking down accounts associated with the Azov Regiment, National Corps, and National Militia – the group’s military, political, and vigilante wings, respectively — on Facebook for months, citing its policies against hate groups. The spokesperson did not say when exactly the ban came into force.

    In its policy on dangerous individuals and organizations, Facebook defines a hate organization as “any association of three or more people that is organized under a name, sign, or symbol and that has an ideology, statements, or physical actions that attack individuals based on characteristics, including race, religious affiliation, nationality, ethnicity, gender, sex, sexual orientation, serious disease or disability.”

    Defending ‘Ukrainian Order’

    Azov and its leadership consider themselves defenders of what they call “Ukrainian order,” or an illiberal and antidemocratic society. They are anti-Russian and also against Ukraine’s potential accession to the European Union and NATO.

    Their ideal Ukraine is a “Ukraine for Ukrainians,” as Olena Semenyaka, the international secretary for Azov’s political wing, the National Corps, told RFE/RL last year. Azov’s symbol is similar to the Nazi Wolfsangel but the group claims it is comprised of the letters N and I, meaning “national idea.”

    Earlier in March, the U.S. State Department referred to the National Corps as a “nationalist hate group” in its annual human rights report.

    Azov has inducted thousands of militant members in recent years in torchlight ceremonies with chants of “Glory to Ukraine! Death to enemies.” The movement claims to have roughly 10,000 members in its broader movement and the ability to mobilize some 2,000 to the streets within hours. A large part of its recruiting has been done using slickly produced videos and advertisements of its fight clubs, hardcore concerts, and fashion lines promoted on Facebook and other social networks.

    Still On Facebook, But Moving Elsewhere

    Many of those may no longer be found on Facebook after the ban. But some are likely to stick around, since many Azov factions and leaders remain on the platform or else have opened fresh accounts after original ones were removed, RFE/RL research shows.

    For instance, the Azov Regiment, whose official page under the Polk Azov name was removed months ago, has also opened a fresh page with a new name: Tviy Polk (Your Regiment).

    Its leaders have reacted similarly, as have the National Corps and National Militia, opening dozens of new accounts under slightly altered names to make it more difficult for Facebook to track them. A simple search on April 16 brought up more than a dozen active accounts.

    Semenyaka has had at least two personal accounts removed by Facebook. But two other accounts belonging to her and opened with different spellings of her name — Lena Semenyaka and Helena Semenyaka — are still open, as is a group page she manages.

    In a post to the Lena account on April 11, Semenyaka wrote after the takedown of her original account that Facebook “is getting increasingly anti-intellectual.”

    “If you wish to keep in touch, please subscribe to some other permanent and temporary platforms,” she continued, adding a link to her Facebook-owned Instagram account.

    Then, highlighting what’s become a popular new destination for far-right and other extremist groups, she also announced the opening of a new National Corps International account — on the messenger app Telegram.

    ———-

    “Facebook ‘Bans’ Ukrainian Far-Right Group Over ‘Hate Speech’ — But Getting Rid Of It Isn’t Easy” by Christopher Miller; Radio Free Europe/Radio Liberty; 04/16/2019

    “Despite the ban, however, which quietly came into force months ago, a defiant Azov and its members remain active on the social network under pseudonyms and name variations, underscoring the difficulty Facebook faces in combating extremism on a platform with some 2.32 billion monthly active users.”

    The total ban on Azov took place months ago and yet Azov members still have an active presence, including the movement’s spokesperson, Olena Semenyaka, who has two personal pages and a group page still up as of April, along with an account on Facebook-owned Instagram:


    Defending ‘Ukrainian Order’

    Azov and its leadership consider themselves defenders of what they call “Ukrainian order,” or an illiberal and antidemocratic society. They are anti-Russian and also against Ukraine’s potential accession to the European Union and NATO.

    Their ideal Ukraine is a “Ukraine for Ukrainians,” as Olena Semenyaka, the international secretary for Azov’s political wing, the National Corps, told RFE/RL last year. Azov’s symbol is similar to the Nazi Wolfsangel but the group claims it is comprised of the letters N and I, meaning “national idea.”

    Still On Facebook, But Moving Elsewhere

    Many of those may no longer be found on Facebook after the ban. But some are likely to stick around, since many Azov factions and leaders remain on the platform or else have opened fresh accounts after original ones were removed, RFE/RL research shows.

    For instance, the Azov Regiment, whose official page under the Polk Azov name was removed months ago, has also opened a fresh page with a new name: Tviy Polk (Your Regiment).

    Its leaders have reacted similarly, as have the National Corps and National Militia, opening dozens of new accounts under slightly altered names to make it more difficult for Facebook to track them. A simple search on April 16 brought up more than a dozen active accounts.

    Semenyaka has had at least two personal accounts removed by Facebook. But two other accounts belonging to her and opened with different spellings of her name — Lena Semenyaka and Helena Semenyaka — are still open, as is a group page she manages.

    In a post to the Lena account on April 11, Semenyaka wrote after the takedown of her original account that Facebook “is getting increasingly anti-intellectual.”

    “If you wish to keep in touch, please subscribe to some other permanent and temporary platforms,” she continued, adding a link to her Facebook-owned Instagram account.

    Then, highlighting what’s become a popular new destination for far-right and other extremist groups, she also announced the opening of a new National Corps International account — on the messenger app Telegram.

    And note how Facebook wouldn’t actually say what exactly triggered the company to fully ban the group after years of individual bannings. That’s part of what’s going to be interesting to watch with the creation of a new Public Policy office for Ukraine: those rules are going to become a lot clearer after figures like Kruk learn what they are and can shape them:


    Joe Mulhall, a senior researcher at the U.K.-based antifascist organization Hope Not Hate, told RFE/RL by phone that Charlottesville brought a “sea change” when it came to social media companies and Facebook, in particular paying attention to extremists.

    For instance, he praised the company for its “robust” action against the far-right founder of the English Defence League, Tommy Robinson, who had repeatedly violated Facebook’s policies on hate speech.

    But Mulhall said Facebook more often acts only after “they’re publicly shamed.”

    “When there is massive public pressure, they act; or when they think they can get away with things, they don’t,” he added.

    This may explain why it took Facebook years to ban the Azov movement, which received significant media attention following a series of violent attacks against minorities in 2018.

    Facebook did not specify what exactly tipped the scale. But responding to an RFE/RL e-mail request on April 15, a Facebook spokesperson wrote that the company has been taking down accounts associated with the Azov Regiment, National Corps, and National Militia – the group’s military, political, and vigilante wings, respectively — on Facebook for months, citing its policies against hate groups. The spokesperson did not say when exactly the ban came into force.

    So getting around those rules is also presumably going to get a lot easier once figures like Kruk can inform their fellow far right activists what exactly those rules are…assuming the rules against organized hate aren’t done away with entirely for Ukraine.
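    One reason slightly altered names defeat a ban, as the RFE/RL piece describes, is that naive enforcement amounts to exact string matching. The sketch below (standard library only, with an arbitrary similarity threshold; this is an illustration, not Facebook's actual method) shows how a fuzzy comparison catches variants like "Lena"/"Helena" that an exact match misses:

```python
# Why exact-match bans are easy to evade, and how fuzzy matching narrows the gap.
# Standard library only; the 0.75 threshold is an arbitrary illustration.
from difflib import SequenceMatcher

banned = "Olena Semenyaka"
candidates = ["Lena Semenyaka", "Helena Semenyaka", "Olena S.", "Tviy Polk"]

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

for name in candidates:
    exact = (name == banned)
    score = similarity(banned, name)
    print(f"{name:20s} exact_match={exact!s:5s} similarity={score:.2f} fuzzy_flag={score >= 0.75}")
```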

    Posted by Pterrafractyl | June 7, 2019, 3:06 pm
  2. Here’s an article discussing a book that just came out, The Real Face of Facebook in India, about the relationship between Facebook and the BJP and the role this relationship played in the BJP’s stunning 2014 successes. Most of what’s in the article covers what we already knew about this relationship: Shivnath Thukral, a former NDTV journalist, worked closely with close Modi aide Hiren Joshi on the Modi digital team in the 2014 election before going on to become Facebook’s director of policy for India and South Asia.

    Some of the new fun facts include Facebook apparently refusing to run the Congress Party’s ads highlighting the Modi government’s Rafale fighter jet scandal. It also delayed for 11 days an ad for an exposé in Caravan Magazine about BJP official Amit Shah. Disturbingly, it also sounds like Indian propaganda companies are offering their services in other countries like South Africa, which makes the company’s cozy ties to the BJP propagandists even more troubling.

    One of the more ironic fun facts in the book is that Katie Harbath, Facebook’s Director for Global Politics and Government Outreach, was apparently “unhappy and uneasy about the proximity” of top officials of Facebook to the Narendra Modi government after Thukral got his position at Facebook. This is according to an anonymous source. So that would appear to indicate that even Facebook’s high-level employees recognize these are politicized positions, and yet the company goes ahead anyway. Surprise! As the article also notes, it’s somewhat ironic for Harbath to be expressing unease about the company hiring a politically connected individual close to the government for such a position, since Harbath herself was once a digital strategist for the Republican Party and Rudy Giuliani:

    The Wire

    The Past and Future of Facebook and BJP’s Mutually Beneficial Relationship

    A new book by Paranjoy Guha Thakurta and Cyril Sam finds revolving doors and quid pro quos between India’s richest political party and the world’s largest social platform.

    Partha P. Chakrabartty
    03/Jun/2019

    Five years from now, we may well be reading a book about the BJP’s WhatsApp operations in the 2019 elections – featuring two lakh groups of 256 members each, or over 50 million readers of the party line. A recent book, however, tells the story of the 2014 elections, and the role of WhatsApp’s parent company Facebook in the rise of Narendra Modi.

    In 2019, if we forget Facebook’s billions of dollars in revenue, we might almost feel sorry for it. Facebook has had a rough year, where it has been attacked both by the Left (for permitting the rise of right-wing troll armies), and the Right (for censorship of conservatives: Donald Trump has launched a new tool to report instances.)

    But we can’t forget their billions of dollars of revenue, especially when, even in this tough year, Facebook’s income grew by 26% quarter-on-quarter. To add to the voices raised against it, a new book alleges that Facebook was both directly complicit in, and benefited from, the rise of Modi’s BJP in India.

    The Real Face of Facebook in India, co-authored by the journalists Paranjoy Guha Thakurta and Cyril Sam, is a short, terse book that reads like a whodunnit. In the introduction, it is teased that the book will reveal ‘a wealth of details about the kind of support that Facebook provided Narendra Modi and the apparatus of the BJP apparatus (sic) even (sic) before the 2014 elections’. The many copy errors reveal that the book is an attempt to get the news out as widely and as quickly as possible. This is reporting, not deep analysis.

    In line with the aim of reaching as wide an audience as possible, the book has been simultaneously published in Hindi under the title Facebook ka Asli Chehra. There is also a companion website, theaslifacebook.com, which also has a Hindi section.

    Teasers to the big reveal come in the first few chapters, which do a slightly haphazard job of narrating the history of Facebook in India. The smoking gun is finally disclosed in Chapter 8 in the form of a person, Shivnath Thukral, a former NDTV journalist and ex-managing director of Carnegie India. Going by the evidence in the book, Thukral had a close working relationship with intimate Modi aide, Hiren Joshi. Together, they created the Mera Bharosa website and other web pages for the BJP in late 2013, ahead of the national election. In 2017, after his stint at Carnegie, Thukral joined Facebook as its director of policy for India and South Asia.

    For a person so close to a ruling party to become a top official of a ‘neutral’ platform is worrying. Worrying enough, it seems, to trouble the company itself: Real Face claims that Katie Harbath, Facebook’s director for global politics and government outreach, said she was “unhappy and uneasy about the proximity” of top officials of Facebook to the Narendra Modi government. The quote is attributed to an anonymous source. Whether it is true or not, citizens should be concerned about this particular revolving door between the most powerful media organisation in the world and the Modi administration. (It’s a different matter that Harbath herself was once a digital strategist for the Republican Party and Rudy Giuliani).

    The overarching story is this: The BJP was the first in our country to see the potential of Facebook as a way to reach voters. Facebook, a private corporation with an eye on building relevance in India and earning profits through advertising, saw in politics a great way to drive engagement. Both the BJP and Facebook had much to gain from a partnership.

    As a result, in the run-up to the 2014 election, Facebook offered training to BJP personnel in running social media campaigns. (Facebook has stated that they conduct these workshops for various political parties, but the implication remains that the BJP, in being a first mover, benefited disproportionately).

    The strategy worked beautifully for Facebook. As reported by Ankhi Das, Facebook India’s lead on policy and government relations, the 2014 elections reaped the platform 227 million interactions. Read today, Das’ article – which speaks of how ‘likes’ won Narendra Modi votes – comes off as more sinister than it might have at the time.

    We also know that the strategy worked for Modi. So potent was BJP’s targeting that it won 90% of its votes in only 299 constituencies, 282 of which it won. Former and current members of the BJP’s digital media strategy team were happy to confirm the mutual benefit. The current member is Vinit Goenka, once the national co-convener of the BJP’s IT cell, and currently working with Nitin Gadkari. This is how the book tells it:

    At one stage in our interview with Goenka that lasted over two hours, we asked him a pointed question: ‘Who helped whom more, Facebook or the BJP?’

    He smiled and said: ‘That’s a difficult question. I wonder whether the BJP helped Facebook more than Facebook helped the BJP. You could say, we helped each other.’

    Equally alarming are reports, in the book, of Facebook denying Congress paid ads to publicise the Rafale controversy. Facebook also delayed a boost on a Caravan expose on Amit Shah by more than 11 days, an eternity in our ridiculously fast news cycle. Finally, there are reports of Indian propaganda companies replicating these lessons in elections in South Africa and other countries. Taken together, we see how private platforms are happy to be used to manipulate democratic processes, whether in service of the Right or Left or Centre.

    This book is concerned with critiquing Facebook’s links with the right-wing. It has a foreword by the popular journalist Ravish Kumar and a preface by professor Apoorvanand of the Department of Hindi at Delhi University (and a contributor at The Wire). Both of these belong to what the right-wing terms the ‘secular’ brigade.

    However, we know now that Facebook is also acting against some of the assets of the BJP itself. This may be an eyewash, or just the logical next step in Facebook’s project: having created its importance in elections with the help of the BJP, it is now selling its influence to other parties. It doesn’t matter which party comes out on top in the social media game: the house always wins. Even supporters of the BJP should be wary of the monster they have fed. There is no reason for Facebook to be loyal to the party.

    Cambridge Analytica, by using limited data from Facebook, was able to influence the Brexit and 2016 US presidential elections. The book asks: what kind of influence can the platform itself exert on our democratic processes?

    The question citizens have to ask is: how much power do we allow one corporation, and its 35-year-old CEO, to have? What can be done about its near-monopolistic grip on data, and its ability to unilaterally impede or encourage the flow of information? These are early days, and one can hope that checks and balances will kick in. Until that happens, our work – of simply keeping up with how platforms propel or impede political interests – will be cut out for us.

    ———-

    “The Past and Future of Facebook and BJP’s Mutually Beneficial Relationship” by Partha P. Chakrabartty; The Wire; 06/03/2019

    “Teasers to the big reveal come in the first few chapters, which do a slightly haphazard job of narrating the history of Facebook in India. The smoking gun is finally disclosed in Chapter 8 in the form of a person, Shivnath Thukral, a former NDTV journalist and ex-managing director of Carnegie India. Going by the evidence in the book, Thukral had a close working relationship with intimate Modi aide, Hiren Joshi. Together, they created the Mera Bharosa website and other web pages for the BJP in late 2013, ahead of the national election. In 2017, after his stint at Carnegie, Thukral joined Facebook as its director of policy for India and South Asia.”

    It’s the kind of smoking gun of Facebook’s relationship with the BJP that just keeps smoking more and more the longer Shivnath Thukral holds that position. But it’s not the only smoking gun. Reports of Facebook refusing to publicize ads for the rival Congress Party and delaying stories that would be damaging to the BJP produce quite a bit of smoke too. And note that, while the article raises the risk that Facebook might work against the BJP’s interests in the future, citing some of Facebook’s efforts that have acted against the BJP’s digital assets, keep in mind that the particular effort the piece is referring to was a ‘fake news’ crackdown in which Facebook removed more than 700 pages, almost all of them (687) Congress Party pages, although the handful of BJP pages removed did have far more viewers than the Congress pages. So, thus far, the only time Facebook appears to work against the BJP’s interests is when there’s a generic ‘fake news’ purge, and even in that case it appeared to target the BJP’s rivals much more heavily:


    Equally alarming are reports, in the book, of Facebook denying Congress paid ads to publicise the Rafale controversy. Facebook also delayed a boost on a Caravan expose on Amit Shah by more than 11 days, an eternity in our ridiculously fast news cycle. Finally, there are reports of Indian propaganda companies replicating these lessons in elections in South Africa and other countries. Taken together, we see how private platforms are happy to be used to manipulate democratic processes, whether in service of the Right or Left or Centre.

    However, we know now that Facebook is also acting against some of the assets of the BJP itself. This may be an eyewash, or just the logical next step in Facebook’s project: having created its importance in elections with the help of the BJP, it is now selling its influence to other parties. It doesn’t matter which party comes out on top in the social media game: the house always wins. Even supporters of the BJP should be wary of the monster they have fed. There is no reason for Facebook to be loyal to the party.

    The fact that this arrangement with the BJP is problematic isn’t lost on Facebook’s executives, according to the book. Facebook’s own director for global politics and government outreach, Katie Harbath, reportedly said she was “unhappy and uneasy about the proximity” of top officials of Facebook to the Modi government after Thukral was hired. But those concerns were clearly ignored. The concerns were also clearly ironic, since Harbath herself was once a digital strategist for the Republican Party and Rudy Giuliani:


    For a person so close to a ruling party to become a top official of a ‘neutral’ platform is worrying. Worrying enough, it seems, to trouble the company itself: Real Face claims that Katie Harbath, Facebook’s director for global politics and government outreach, said she was “unhappy and uneasy about the proximity” of top officials of Facebook to the Narendra Modi government. The quote is attributed to an anonymous source. Whether it is true or not, citizens should be concerned about this particular revolving door between the most powerful media organisation in the world and the Modi administration. (It’s a different matter that Harbath herself was once a digital strategist for the Republican Party and Rudy Giuliani).

    Recall how, right when the Cambridge Analytica scandal was emerging in late March of 2018, Facebook replaced its head of policy in the United States with another right-wing hack, Kevin Martin. Martin would be the new person in charge of lobbying the US government. Martin was Facebook’s vice president of mobile and global access policy and a former Republican chairman of the Federal Communications Commission. When Martin took this new position he would be reporting to Facebook’s vice president of global public policy, Joel Kaplan. Both Martin and Kaplan worked together on George W. Bush’s 2000 presidential campaign. Yep, that’s how Facebook responded to the Cambridge Analytica scandal: by putting a Republican in charge of lobbying the US government.

    It’s that context that makes the concerns of Katie Harbath so ironic, along with the fact that Facebook was so integral to the success of the 2016 Trump campaign that the company embedded employees with the campaign. Yes, Harbath’s concerns over an overly close relationship with the BJP were indeed valid concerns, but ironic ones coming from a Republican operative like Harbath.

    And when you look at Harbath’s LinkedIn page, we learn that she was hired by Facebook to become the Public Policy Director for Global Elections in February of 2011. Harbath was the National Republican Senatorial Committee’s chief digital strategist from August 2009 to March 2011. So Harbath would have been in charge of the GOP Senate’s digital strategy for the 2010 mid-terms, when the Republicans gained six Senate seats and retook control of the US House, and a few months later Facebook hired her to become the Public Policy Director for Global Elections.

    Beyond that, Harbath’s LinkedIn page lists her work for DCI Group. She was a Senior Account Manager at DCI Group from 2006-2007. Then she left to work as the Deputy eCampaign Director for Rudy Giuliani’s presidential campaign from February 2007-January 2008. And in February of 2008 she returned to DCI Group as Director of Online Services, the position she held until going to work for the National Republican Senatorial Committee in 2009. Recall how DCI Group has close ties to Karl Rove and is known for being one of the sleaziest and most amoral of the ‘dark money’ lobbying/propaganda firms operating in DC. In addition to lobbying and public relations work for the Republican Party, DCI has a history of taking on clients like RJ Reynolds Tobacco and the Burmese junta. It’s also known for peddling misinformation and engaging in dirty politics. In 2008, the CEO of DCI Group was selected to manage the Republican National Convention. And DCI Group also worked with the Koch brothers’ front groups Americans for Prosperity and FreedomWorks in creating the Tea Party movement, which would have taken place during Harbath’s time as the National Republican Senatorial Committee’s chief digital strategist. DCI Group was also the publisher of Tech Central Station, a website funded by Exxon dedicated to climate change denial, and has worked on major right-wing disinformation campaigns in the US ranging from health care to oil pipelines.

    So Facebook’s Public Policy Director for Global Elections, Katie Harbath, wasn’t just a Republican Party operative. She also worked for one of the most disreputable lobbying and propaganda firms in DC and a key entity in the American ‘dark money’ propaganda industry. That’s the person who was allegedly uncomfortable with Facebook’s hiring of a BJP-connected individual. And despite those alleged concerns Thukral’s hiring happened anyway, of course.

    In related news, the Trump White House set up a webpage where conservatives could go to report instances of Facebook and other social media companies being biased against them. Yep.

    Posted by Pterrafractyl | June 11, 2019, 1:59 pm
