Spitfire List Web site and blog of anti-fascist researcher and radio personality Dave Emory.

For The Record  

FTR#1221 War Games, Part 3 (Rittenhouse Nation)

You can sub­scribe to e‑mail alerts from Spitfirelist.com HERE.

You can sub­scribe to RSS feed from Spitfirelist.com HERE.

You can sub­scribe to the com­ments made on pro­grams and posts–an excel­lent source of infor­ma­tion in, and of, itself, HERE.

WFMU-FM is pod­cast­ing For The Record–You can sub­scribe to the pod­cast HERE.

Mr. Emory’s entire life’s work is avail­able on a 32GB flash dri­ve, avail­able for a con­tri­bu­tion of $65.00 or more (to KFJC). Click Here to obtain Dav­e’s 40+ years’ work, com­plete through Late Fall of 2021 (through FTR #1215).

FTR#1221 This program was recorded in one 60-minute segment.

Intro­duc­tion: In these pro­grams, we con­tin­ue our dis­cus­sion of Nick Turse’s 2008 tome The Com­plex: How the Mil­i­tary Invades Our Every­day Lives.

Writing in Tropic of Cancer, Henry Miller observed: ” . . . . America is the very incarnation of Doom. And she will lead the rest of the world into the Bottomless Pit. . . .” (The quote was included in the Forgive My Grief books by pioneering JFK assassination researcher Penn Jones.)

Epit­o­miz­ing Miller’s obser­va­tion is what Mr. Emory terms the res­o­nant syn­the­sis of video games and mil­i­tary train­ing and train­ing tech­nol­o­gy:

“. . . . Certainly, the day is not far off when most potential U.S. troops will have grown up playing commercial video games that were created by the military as training simulators; will be recruited, at least in part, through video games; will be tested, post-enlistment, on advanced video game systems; will be trained using simulators, which will later be turned into video games, or on reconfigured versions of the very same games used to recruit them or that they played as kids; will be taught to pilot vehicles using devices resembling commercial video game controllers; and then, after a long day of real-life war-gaming, head back to their quarters to kick back and play the latest PlayStation or Xbox games created with or sponsored by their own, or another, branch of the armed forces.

More and more toys are now poised to become clandestine combat teaching tools, and more and more simulators are destined to be tomorrow’s toys. And what of America’s children and young adults in all this? How will they be affected by the dazzling set of military training devices now landing in their living rooms and on their PCs, produced by video game giants under the watchful eyes of the Pentagon? After all, what these games offer is less a matter of simple military indoctrination and more like a near immersion in a virtual world of war, where armed conflict is not the last, but the first—and indeed the only—resort. . . .”

A con­crete exam­ple of that “res­o­nant syn­the­sis” is the bat­tle of 73 East­ing:

“. . . . Just days into the ground combat portion of the Gulf War, the Battle of 73 Easting pitted American armored vehicles against a much larger Iraqi tank force. The U.S. troops, who had trained using the SIMNET system, routed the Iraqis. Within days, the military began turning the actual battle into a digital simulation for use with SIMNET. Intensive debriefing sessions with 150 veterans of the battle were undertaken. Then DARPA personnel went out onto the battlefield with the veterans, surveying tank tracks and burned-out Iraqi vehicles, as the veterans walked them through each individual segment of the clash. Additionally, radio communications, satellite photos, and ‘black boxes’ from U.S. tanks were used to gather even more details. Nine months after the actual combat took place, a digital recreation of the Battle of 73 Easting was premiered for high-ranking military personnel. Here was the culmination of Thorpe’s efforts to create a networked system that would allow troops to train for future wars using the new technology combined with accurate historical data. . . .”

Placing Henry Miller’s quote in an ironically relevant context, the popular video game Doom was quickly adapted to Marine Corps training purposes:

“. . . . In late 1993, with the green glow of Gulf War victory already fading, id Software introduced the video game Doom. Gamers soon began modifying shareware copies of this ultraviolent, ultrapopular first-person shooter, prompting id to release editing software the next year. The ability to customize Doom caught the attention of members of the Marine Corps Modeling and Simulation Management Office, who had been tasked by the corps’ Commandant Charles Krulak with utilizing ‘personal computer (PC)-based war games’ to help the marines ‘develop decision making skills, particularly when live training time and opportunities are limited.’

“Act­ing on Krulak’s direc­tive, the marines’ mod­el­ing crew nixed Doom’s fan­ta­sy weapons and labyrinthine locale and, in three months’ time, devel­oped Marine Doom, a game that includ­ed only actu­al Marine Corps weapon­ry and real­is­tic envi­ron­ments. Kru­lak liked what he saw and, in 1997, approved the game. . . .”

Next, Turse dis­cuss­es Pen­ta­gon plans to oper­ate in urban slums in the Third World. Mr. Emory notes that many com­bat vet­er­ans of this coun­try’s long counter-insur­gency wars in Afghanistan and Iraq are join­ing the increas­ing­ly mil­i­ta­rized police forces in this coun­try.

Pen­ta­gon strat­e­gy as dis­cussed here by Turse may, even­tu­al­ly be real­ized, to an extent, in the U.S., par­tic­u­lar­ly in the event of an eco­nom­ic col­lapse.

More about Pen­ta­gon plans for urban war­fare in slums, osten­si­bly in the devel­op­ing world:

” . . . . As both the high-tech programs and the proliferating training facilities suggest, the foreign slum city is slated to become the bloody battlespace of the future. . . . For example, the U.S. Navy/Marine Corps launched a program seeking to develop algorithms to predict the criminality of a given building or neighborhood. The project, titled Finding Repetitive Crime Supporting Structures, defines cities as nothing more than a collection of ‘urban clutter [that] affords considerable concealment for the actors that we must capture.’ The ‘hostile behavior bad actors,’ as the program terms them, are defined not just as ‘terrorists,’ today’s favorite catch-all bogeymen, but as a panoply of nightmare archetypes: ‘insurgents, serial killers, drug dealers, etc.’ . . .”

Pro­gram High­lights Include: Dis­cus­sion of Colonel Dave Gross­man­’s book On Killing against the back­ground of the res­o­nant syn­the­sis of video games and mil­i­tary train­ing; analy­sis of the use of gam­ing apps by Nazi ele­ments to cel­e­brate school shoot­ings and encour­age them; dis­cus­sion of school shoot­er Niko­las Cruz of Park­land high and his Nazi, white suprema­cist and Trumpian influ­ence; dis­cus­sion of alt-right use of web­sites cater­ing to peo­ple suf­fer­ing from depres­sion for recruit­ing pur­pos­es.

1. A pas­sage ana­lyzes the essence of the video game/military rela­tion­ship for our soci­ety, today and in the future.

The Com­plex: How The Mil­i­tary Invades Our Every­day Lives by Nick Turse; Pic­a­dor [SC] Met­ro­pol­i­tan Books [Hen­ry Holt & Com­pa­ny]; Copy­right 2008 by Nick Turse; ISBN 978–0‑8050–8919‑6; pp. 139–140. 

. . . . Today, the mil­i­tary, toy, and gam­ing worlds are com­plete­ly entan­gled, and the future promis­es only more inter­pen­e­tra­tions and com­plex col­lab­o­ra­tions that would have made Dwight Eisenhower’s head spin. . . .

. . . . Certainly, the day is not far off when most potential U.S. troops will have grown up playing commercial video games that were created by the military as training simulators; will be recruited, at least in part, through video games; will be tested, post-enlistment, on advanced video game systems; will be trained using simulators, which will later be turned into video games, or on reconfigured versions of the very same games used to recruit them or that they played as kids; will be taught to pilot vehicles using devices resembling commercial video game controllers; and then, after a long day of real-life war-gaming, head back to their quarters to kick back and play the latest PlayStation or Xbox games created with or sponsored by their own, or another, branch of the armed forces.

More and more toys are now poised to become clandestine combat teaching tools, and more and more simulators are destined to be tomorrow’s toys. And what of America’s children and young adults in all this? How will they be affected by the dazzling set of military training devices now landing in their living rooms and on their PCs, produced by video game giants under the watchful eyes of the Pentagon? After all, what these games offer is less a matter of simple military indoctrination and more like a near immersion in a virtual world of war, where armed conflict is not the last, but the first—and indeed the only—resort. . . .

2. Turse notes the “res­o­nant syn­the­sis” between video games, actu­al com­bat and mil­i­tary train­ing tech­nol­o­gy.

The Com­plex: How The Mil­i­tary Invades Our Every­day Lives by Nick Turse; Pic­a­dor [SC] Met­ro­pol­i­tan Books [Hen­ry Holt & Com­pa­ny]; Copy­right 2008 by Nick Turse; ISBN 978–0‑8050–8919‑6; pp. 130–132. 

. . . . DARPA approved [Captain Jack] Thorpe’s long-term plan to create the SIMulator NETworking, or SIMNET, project using video game and entertainment industry technology. . . .

. . . . Just days into the ground com­bat por­tion of the Gulf War, the Bat­tle of 73 East­ing pit­ted Amer­i­can armored vehi­cles against a much larg­er Iraqi tank force. The U.S. troops, who had trained using the SIMNET sys­tem, rout­ed the Iraqis. With­in days, the mil­i­tary began turn­ing the actu­al bat­tle into a dig­i­tal sim­u­la­tion for use with SIMNET. Inten­sive debrief­ing ses­sions with 150 vet­er­ans of the bat­tle were under­tak­en. Then DARPA per­son­nel went out onto the bat­tle­field with the vet­er­ans, sur­vey­ing tank tracks and burned-out Iraqi vehi­cles, as the vet­er­ans walked them through each indi­vid­ual seg­ment of the clash. Addi­tion­al­ly, radio com­mu­ni­ca­tions, satel­lite pho­tos, and “black box­es” from U.S. tanks were used to gath­er even more details. Nine months after the actu­al com­bat took place, a dig­i­tal recre­ation of the Bat­tle of 73 East­ing was pre­miered for high-rank­ing mil­i­tary per­son­nel. Here was the cul­mi­na­tion of Thorpe’s efforts to cre­ate a net­worked sys­tem that would allow troops to train for future wars using the new tech­nol­o­gy com­bined with accu­rate his­tor­i­cal data. . . .

3. Placing Henry Miller’s quote in an ironically relevant context, the popular video game Doom was quickly adapted to Marine Corps training purposes:

The Com­plex: How The Mil­i­tary Invades Our Every­day Lives by Nick Turse; Pic­a­dor [SC] Met­ro­pol­i­tan Books [Hen­ry Holt & Com­pa­ny]; Copy­right 2008 by Nick Turse; ISBN 978–0‑8050–8919‑6; pp. 132–133. 

. . . . In late 1993, with the green glow of Gulf War vic­to­ry already fad­ing, id Soft­ware intro­duced the video game Doom. Gamers soon began mod­i­fy­ing share­ware copies of this ultra­vi­o­lent, ultra­pop­u­lar first per­son shoot­er, prompt­ing id to release edit­ing soft­ware the next year. The abil­i­ty to cus­tomize Doom caught the atten­tion of mem­bers of the Marine Corps Mod­el­ing and Sim­u­la­tion Man­age­ment Office who had been tasked by the corps’ Com­man­dant Charles Kru­lak with uti­liz­ing “per­son­al com­put­er (PC)-based war games” to help the marines “devel­op deci­sion mak­ing skills, par­tic­u­lar­ly when live train­ing time and oppor­tu­ni­ties are lim­it­ed.”

Act­ing on Krulak’s direc­tive, the marines’ mod­el­ing crew nixed Doom’s fan­ta­sy weapons and labyrinthine locale and, in three months’ time, devel­oped Marine Doom, a game that includ­ed only actu­al Marine Corps weapon­ry and real­is­tic envi­ron­ments. Kru­lak liked what he saw and, in 1997, approved the game. . . .

4. Turse dis­cuss­es Pen­ta­gon plans to oper­ate in urban slums in the Third World. Mr. Emory notes that many com­bat vet­er­ans of this coun­try’s long counter-insur­gency wars in Afghanistan and Iraq are join­ing the increas­ing­ly mil­i­ta­rized police forces in this coun­try.

Pen­ta­gon strat­e­gy as dis­cussed here by Turse may, even­tu­al­ly be real­ized, to an extent, in the U.S., par­tic­u­lar­ly in the event of an eco­nom­ic col­lapse.

The Com­plex: How The Mil­i­tary Invades Our Every­day Lives by Nick Turse; Pic­a­dor [SC] Met­ro­pol­i­tan Books [Hen­ry Holt & Com­pa­ny]; Copy­right 2008 by Nick Turse; ISBN 978–0‑8050–8919‑6; p. 239. 

. . . . In Planet of Slums, Mike Davis observes: “The Pentagon’s best minds have dared to venture where most United Nations, World Bank or State Department types fear to go . . . . They now assert that the ‘feral, failed cities’ of the Third World—especially their slum outskirts—will be the distinctive battlespace of the twenty-first century.” Pentagon war-fighting doctrine, he notes, “is being reshaped accordingly to support a low-intensity world war of unlimited duration against criminalized segments of the urban poor.”

In Octo­ber 2006, the army issued an updat­ed “urban oper­a­tions” man­u­al. “Giv­en the glob­al pop­u­la­tion trends and the like­ly strate­gies and tac­tics of future threats,” it declared, “Army forces will like­ly con­duct oper­a­tions in, around, and over urban areas—not as a mat­ter of fate, but as a delib­er­ate choice linked to nation­al secu­ri­ty objec­tives and strat­e­gy, and at a time, place, and method of the commander’s choos­ing.” Glob­al eco­nom­ic depri­va­tion and poor hous­ing, the hall­marks of the urban slum, are, the man­u­al assert­ed, what makes “urban areas poten­tial sources of unrest” and thus increas­es “the like­li­hood of the Army’s involve­ment in sta­bil­i­ty oper­a­tions.” The manual’s authors were par­tic­u­lar­ly con­cerned about “idle” urban youth loosed, in the future slum city, from the “tra­di­tion­al social con­trols” of “vil­lage elders and clan lead­ers,” and thus, prey to manip­u­la­tion by “non­state actors.”. . .

5. More about Pen­ta­gon plans for urban war­fare in slums, osten­si­bly in the devel­op­ing world.

The Com­plex: How The Mil­i­tary Invades Our Every­day Lives by Nick Turse; Pic­a­dor [SC] Met­ro­pol­i­tan Books [Hen­ry Holt & Com­pa­ny]; Copy­right 2008 by Nick Turse; ISBN 978–0‑8050–8919‑6; pp. 245–246. 

. . . . As both the high-tech programs and the proliferating training facilities suggest, the foreign slum city is slated to become the bloody battlespace of the future. Curiously, the Pentagon’s conceptualization of urban space mimics Hollywood’s Escape from New York-meets-Blade Runner-meets-Zulu-meets-RoboCop-style vision of the third-world city to come. [Mr. Emory suggests that this reality will come home to the U.S.]

For example, the U.S. Navy/Marine Corps launched a program seeking to develop algorithms to predict the criminality of a given building or neighborhood. The project, titled Finding Repetitive Crime Supporting Structures, defines cities as nothing more than a collection of “urban clutter [that] affords considerable concealment for the actors that we must capture.” The “hostile behavior bad actors,” as the program terms them, are defined not just as “terrorists,” today’s favorite catch-all bogeymen, but as a panoply of nightmare archetypes: “insurgents, serial killers, drug dealers, etc.” For its part, the army’s recently revised Urban Operations manual offers an even more extensive list of “persistent and evolving urban threats,” including regional conventional military forces, paramilitary forces, guerrillas, and insurgents as well as terrorists, criminal groups, and angry crowds. Even the possible threat posed by computer “hackers” is mentioned. . . .
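The text does not describe how the Finding Repetitive Crime Supporting Structures algorithms actually worked. Purely as an illustrative sketch of what "predicting the criminality of a neighborhood" might mean computationally, the toy model below scores invented districts from an invented incident log; every name, weight, and data point here is hypothetical and does not reflect the actual Navy/Marine Corps project:

```python
# Toy illustration only: score "neighborhoods" by weighted incident counts.
# All districts, incident types, and weights are invented for this sketch.
from collections import Counter

# Hypothetical incident log: (neighborhood, incident_type) pairs.
incidents = [
    ("district_a", "theft"), ("district_a", "assault"),
    ("district_b", "theft"), ("district_a", "theft"),
]

# Hypothetical weights an analyst might assign to incident types.
weights = {"theft": 1.0, "assault": 3.0}

def risk_scores(log, w):
    """Sum weighted incident counts per neighborhood."""
    scores = Counter()
    for place, kind in log:
        scores[place] += w.get(kind, 0.0)
    return dict(scores)

print(risk_scores(incidents, weights))
# district_a: 1.0 + 3.0 + 1.0 = 5.0; district_b: 1.0
```

Even this trivial sketch makes the civil-liberties concern concrete: the output is a numeric "risk" label attached to a whole neighborhood, derived entirely from whatever an analyst chose to log and weight.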

6. Parkland High School shooter Nikolas Cruz didn’t just suddenly adopt a neo-Nazi worldview. He had been stewing in these juices for years, and clearly had additional mental health issues; “Alt-Right” Nazi groups specifically target depressed people to take advantage of their disorders.

“Niko­las Cruz Was a Racist. Does That Make His Attack Ter­ror­ism?” by Dean Obei­dal­lah; The Dai­ly Beast; 03/01/2018.

On Tues­day, we learned a new, bone-chill­ing fact about the Park­land, Flori­da high school gun­man Niko­las Cruz that should’ve made nation­al head­lines but didn’t. That new devel­op­ment was that Cruz had etched swastikas on the ammu­ni­tion mag­a­zines he car­ried on the day he com­mit­ted his bru­tal mas­sacre that took 17 lives.

When I first heard of this devel­op­ment, my jaw dropped for two rea­sons. First, does any­one actu­al­ly believe if Cruz had etched the words “Allah Akbar” on his gun mag­a­zines we wouldn’t have heard about that for near­ly two weeks after the attack? No way. I can assure you that infor­ma­tion would’ve been made pub­lic, inten­tion­al­ly or by way of a leak. And then Don­ald Trump would almost cer­tain­ly have pounced–without wait­ing for addi­tion­al evidence–to label this an Islam­ic ter­ror attack and try to use it to fur­ther his own polit­i­cal agen­da.

But what also was shock­ing is that despite this new piece of evi­dence, togeth­er with Cruz’s known his­to­ry of hate direct­ed at peo­ple of col­or and Jews, we aren’t see­ing a fuller dis­cus­sion in the media about whether this shoot­ing was inspired by Cruz’s appar­ent white suprema­cist ide­ol­o­gy.

As CNN had report­ed with­in days of the Feb­ru­ary 14 attack, Cruz had in the past spewed vile com­ments in a pri­vate Insta­gram cha­t­room where he shared his hatred of “jews, ni**ers, immi­grants.” Cruz also wrote about killing Mex­i­cans and hat­ing black peo­ple sim­ply because of their skin col­or and he slammed Jews because in his twist­ed view they want­ed to destroy the world.

And Cruz’s white supremacist views also made their way from the online world to the real world. One of Cruz’s classmates reportedly told a social worker that Cruz had drawn a swastika on his book bag next to the words “I hate ni***rs.” He also shared with other students his “hating on” Islam, slamming all Muslims as “terrorists and bombers.” And Cruz was also seen wearing a Trump MAGA hat when he was enrolled in school well before the attack.

While ini­tial reports that Cruz was actu­al­ly a mem­ber of a white suprema­cist group proved to be unfound­ed, there’s no dis­put­ing Cruz’s doc­u­ment­ed his­to­ry of spew­ing despi­ca­ble views that line up with the white nation­al­ist ide­ol­o­gy. But still, giv­en all that we’ve now learned, the ques­tion I have is: How much more evi­dence do we need before we dis­cuss in earnest whether Cruz’s white suprema­cist views played a role in this attack?!

True, there’s no evi­dence that Cruz tar­get­ed any spe­cif­ic group of peo­ple dur­ing his ram­page. But then again, ISIS-inspired ter­ror­ists who have com­mit­ted acts of ter­ror on U.S. soil, such as the man who inten­tion­al­ly drove a truck on a New York City pedes­tri­an walk­way in 2017 that killed eight, didn’t tar­get any spe­cif­ic race or reli­gion. He and oth­ers like him com­mit­ted acts of ter­ror in fur­ther­ance of their sick, per­vert­ed ideology—to spread ter­ror.

And the swastikas on Cruz’s gun mag­a­zines take on a greater sig­nif­i­cance when you exam­ine the shoot­ing itself. Of the 17 peo­ple Cruz killed, at least five were Jew­ish. (Some reports note it could be six.) Even more dis­turb­ing is that Cruz had report­ed­ly shot bul­lets into a Holo­caust his­to­ry class that killed two of those stu­dents. Did Cruz inten­tion­al­ly tar­get that class since he had for­mer­ly been a stu­dent at the school? We don’t know but giv­en Cruz’s his­to­ry this is cer­tain­ly a fair ques­tion. And since he’s that rare mass-shoot­er who’s still alive, I pre­sume he’ll be asked.

In fact, the question of whether Cruz’s gun massacre was an anti-Semitic attack inspired by a white supremacist ideology was raised in an op-ed in the liberal Israeli newspaper Ha’aretz even before we learned about the swastikas on Cruz’s gun magazines. There, the writer noted that Cruz had expressed views “that Jews were part of a conspiracy to unseat white people from power and destroy the world.” In response to that article, the writer was subjected to an avalanche of vile anti-Semitic barbs.

Giv­en these new­ly revealed swastikas, it’s long over­due that we have that con­ver­sa­tion about whether Cruz was more than a trou­bled youth. And to be clear, Cruz was trou­bled. He had been repeat­ed­ly dis­ci­plined at school for dis­turb­ing behav­ior and for a peri­od of time was placed in a spe­cial school for kids with emo­tion­al and behav­ior issues. On social media, he even wrote about his dream of becom­ing a “pro­fes­sion­al school shoot­er.” But when he was eval­u­at­ed in 2016 by a men­tal health pro­fes­sion­al, he was deter­mined to be sta­ble and not in need of being invol­un­tar­i­ly com­mit­ted to a men­tal health insti­tu­tion. . . .

7.  The Steam gam­ing app, a major dis­trib­u­tor for very pop­u­lar video games, has a neo-Nazi problem–neo-Nazis are using its chat room and voice-over-IP options to pro­mote their ide­ol­o­gy. Both the Dai­ly Stormer and Andrew Auern­heimer have Steam chat rooms, as does Atom­Waf­fen.

There’s also an overlapping problem with Steam chat forums that glorify school shooters; by one count, 173 such groups glorify school shootings.

Steam isn’t the only popular gaming app with this neo-Nazi problem. Discord, another very popular app for gamers, also appears to have a number of chat rooms run by neo-Nazis. The Germanic Reconquista group of German neo-Nazis who were training people to game YouTube’s algorithms did that training using Discord. And, again, Steam and Discord are both quite popular.

The 173+ pop­u­lar video game chat forums on Steam that glo­ri­fy school shoot­ers are def­i­nite­ly part of the school shoot­ing prob­lem.

“Neo-Nazis, ‘Future School Shoot­ers’ Using Lead­ing Gam­ing App to Post Hate­ful Con­tent in Hun­dreds of Groups: Report” by Michael Edi­son Hay­den; Newsweek; 03/17/2018

A lead­ing gam­ing app that is pop­u­lar with adher­ents of the neo-Nazi wing of the alt-right move­ment has at least 173 groups ded­i­cat­ed to the glo­ri­fi­ca­tion of school shoot­ings, accord­ing to a report pub­lished last week by Reveal News. Sep­a­rate­ly, dozens of neo-Nazi groups have cul­ti­vat­ed active com­mu­ni­ties on the app.

The report notes that these Steam groups—which typically have between 30 and 200 active members—glorify men like 22-year-old Elliot Rodger, who killed six people and injured over a dozen others near the campus of the University of California, Santa Barbara, before committing suicide in 2014.

Rodger was a vir­u­lent misog­y­nist and want­ed to pun­ish women for reject­ing him. Oth­er shoot­ers, like Seung-Hui Cho, the Vir­ginia Tech senior who killed 32 peo­ple in 2007, are also hailed in these Steam groups. The groups have names like “School Shoot­ers Are Heroes” and “Shoot Up a School.” Some of them allude to “future” school shoot­ings yet to take place and are filled with racist lan­guage.

The link between vio­lence and the scat­tered cul­ture of inter­net Nazism has received greater scruti­ny in recent weeks, fol­low­ing a CBS News report that sus­pect­ed Park­land, Flori­da, mass shoot­er Niko­las Cruz alleged­ly pos­sessed gun mag­a­zines engraved with swastikas. Gam­ing apps like Steam have become increas­ing­ly pop­u­lar with­in that com­mu­ni­ty.

One exam­ple of neo-Nazis using Steam is Andrew “Weev” Auern­heimer, who han­dles the tech­ni­cal side of the white suprema­cist troll web­site Dai­ly Stormer, and sev­er­al months ago appeared to threat­en to “slaugh­ter” Jew­ish chil­dren in retal­i­a­tion for his web­site being tak­en offline. Auern­heimer appears to have a group on the app, which dis­cuss­es games in the con­text of whether they por­tray Adolf Hitler in a favor­able light. The broad­er com­mu­ni­ty of Dai­ly Stormer also appears to have an active com­mu­ni­ty on Steam called “Storm Sect” with rough­ly 200 mem­bers.

Oth­er neo-Nazi groups on Steam have more overt­ly hate­ful and vio­lent names like “Fag Lynch Squad,” which depicts shad­owy fig­ures hang­ing limply from noos­es in its pro­file pic­ture. Atom­Waf­fen Divi­sion, a neo-Nazi group linked to a num­ber of mur­ders, had its com­mu­ni­ty on Steam removed ear­li­er this month, Reveal News report­ed.

Angela Nagle, a left­ist writer, demon­strat­ed links between the ori­gins of the alt-right and gam­ing cul­ture in her book Kill All Normies: Online Cul­ture Wars From 4Chan And Tum­blr To Trump And The Alt-Right. The ven­er­a­tion of school shoot­ers and oth­er killers is sim­i­lar­ly linked. . . .

8. Overlapping this is the alt-right’s use of online communities catering to depressed people for recruiting purposes.

“The Alt-right is recruit­ing depressed peo­ple” by Paris Mar­tineau; The Out­line; 02/26/2018

A video on YouTube entitled “Advice For People With Depression” has over half a million views. The title is generic enough, and to the unsuspecting viewer, lecturer Jordan Peterson could even look legitimate or knowledgeable — a quick Google search will reveal that he even spoke at Harvard once. But as the video wears on, Peterson argues that men are depressed and frustrated because they don’t have a higher calling like women (who, according to Peterson, are biologically required to have and take care of infants). This leaves weak men seeking “impulsive, low-class pleasure,” he argues. Upon first glance he certainly doesn’t seem like a darling of the alt-right, but he is.

Type “depres­sion” or “depressed” into YouTube and it won’t be long until you stum­ble upon a suit-clad white suprema­cist giv­ing a lec­ture on self-empow­er­ment. They’re every­where. For years, mem­bers of the alt-right have tak­en advan­tage of the internet’s most vul­ner­a­ble, turn­ing their fear and self-loathing into vit­ri­olic extrem­ism, and thanks to the movement’s recent gal­va­niza­tion, they’re only grow­ing stronger.

“I still won­der, how could I have been so stu­pid?” writes Red­dit user u/pdesperaux, in a post detail­ing how he was acci­den­tal­ly seduced by the alt-right. “I was part of a cult. I know cults and I know brain­wash­ing, I have researched them exten­sive­ly, you’d think I would have noticed, right? Wrong. These are the same tac­tics that Sci­en­tol­ogy and ISIS use and I fell for them like a chump.”

“NOBODY is talk­ing about how the online depres­sion com­mu­ni­ty has been infil­trat­ed by alt-right recruiters delib­er­ate­ly prey­ing on the vul­ner­a­ble,” writes Twit­ter user @MrHappyDieHappy in a thread on the issue. “There NEED to be pub­lic warn­ings about this. ‘Online pals’ have attempt­ed to groom me mul­ti­ple times when at my absolute low­est.”

“You know your life is use­less and mean­ing­less,” Peter­son says in his “Advice” video, turn­ing towards the view­er, “you’re full of self-con­tempt and nihilism.” He doesn’t fol­low all of this rous­ing self-hatred with an answer, but rather mere­ly teas­es at one. “[You] have had enough of that,” he says to a class­room full of men. “Rights, rights, rights, rights…”

Peterson’s alt-light mes­sag­ing quick­ly takes a dark­er turn. Fin­ish that video and YouTube will queue up “Jor­dan Peter­son – Don’t Be The Nice Guy” (1.3 mil­lion views), and “Jor­dan Peter­son – The Trag­ic Sto­ry of the Man-Child” (over 853,000 views), both of which are prac­ti­cal­ly right out of the redpill/incel hand­book.

“The common railroad stages of ‘helpful’ linking to ‘motivational speakers’ goes ‘Jordan Peterson —> Stefan Molyneux —> Millennial Woes,’” writes @MrHappyDieHappy. “The first is charismatic and not as harmful, but his persuasiveness leaves people open for the next two, who are frankly evil and dumb.” Molyneux, an anarcho-capitalist who promotes scientific racism and eugenics, has grown wildly popular amongst the alt-right as of late. His videos — which argue, among other things, that rape is a “moral right” — are often used to help transition vulnerable young men into the vitriolic and racist core of the alt-right.

Though it may seem like a huge ide­o­log­i­cal leap, it makes sense, in a way. For some dis­il­lu­sioned and hope­less­ly con­fused young men, the alt-right offers two things they feel a seri­ous lack of in the throes of depres­sion: accep­tance and com­mu­ni­ty. These primer videos and their asso­ci­at­ed “sup­port” groups do a shock­ing­ly good job of acknowl­edg­ing the valid­i­ty of the depressed man’s exis­tence — some­thing men don’t often feel they expe­ri­ence — and cap­i­tal­ize on that good will by gal­va­niz­ing their mem­bers into a plan of action (which gen­er­al­ly involves fight­ing against some group or class of peo­ple des­ig­nat­ed as “the ene­my”). These sort of move­ments allot the depressed per­son a form of agency which they may nev­er have expe­ri­enced before. And whether it’s ground­ed in real­i­ty or not, that’s an addict­ing feel­ing.

According to Christian Picciolini, a former neo-Nazi who co-founded the peace advocacy organization Life After Hate, these sorts of recruiting tactics aren’t just common, but systematically enforced. “[The recruiters] are actively looking for these kind of broken individuals who they can promise acceptance, who they can promise identity to,” Picciolini said in an interview with Sam Seder. “Because in real life, perhaps these people are socially awkward — they’re not fitting in; they may be bullied — and they’re desperately looking for something. And the ideology and the dogma are not what drive people to this extremism, it’s in fact, I think, a broken search for that acceptance and that purpose and community.” . . . .

9. We con­clude with some obser­va­tions, post­ed in the writ­ten descrip­tion for FTR#1003.

” . . . . The role of the media in con­di­tion­ing young peo­ple to kill is a major focal point of the book On Killing by Lieu­tenant Colonel Dave Gross­man, who taught psy­chol­o­gy at West Point. From Ama­zon’s pro­mo­tion­al text for Gross­man­’s book: “The good news is that most sol­diers are loath to kill. But armies have devel­oped sophis­ti­cat­ed ways of over­com­ing this instinc­tive aver­sion. And con­tem­po­rary civil­ian soci­ety, par­tic­u­lar­ly the media, repli­cates the army’s con­di­tion­ing tech­niques, and, accord­ing to Lt. Col. Dave Gross­man­’s the­sis, is respon­si­ble for our ris­ing rate of mur­der among the young. Upon its ini­tial pub­li­ca­tion, ON KILLING was hailed as a land­mark study of the tech­niques the mil­i­tary uses to over­come the pow­er­ful reluc­tance to kill, of how killing affects sol­diers, and of the soci­etal impli­ca­tions of esca­lat­ing vio­lence. Now, Gross­man has updat­ed this clas­sic work to include infor­ma­tion on 21st-cen­tu­ry mil­i­tary con­flicts, recent trends in crime, sui­cide bomb­ings, school shoot­ings, and more. The result is a work cer­tain to be rel­e­vant and impor­tant for decades to come.”

Our high body-count movies and TV programs, as well as point-and-shoot video games, according to Grossman, replicate to a considerable degree the audio-visual desensitization techniques used by contemporary armies to help recruits overcome their inhibitions about killing. We suggest Grossman’s thesis as a factor in the school massacres. . . .”

 

 

Discussion

One comment for “FTR#1221 War Games, Part 3 (Rittenhouse Nation)”

  1. There’s a new piece in Wired on the commercialization of the data collected by the video game industry. On one level, it’s the same old story we’ve been hearing over and over: the generation of large volumes of new data has spawned a whole new industry focused on harvesting that data for Big Data exploitation. And yet, as the article points out, this really is an exceptional new arena for that exploitation, because, as Silicon Valley learned long ago, video games provide an exceptionally rich environment for recording human interactions, with all of the potential inferences that can be derived from those interactions: everything from a player’s sexual orientation to their personality characteristics. In many ways it sounds like the kind of psychographic profiling Cambridge Analytica was engaged in using Facebook data. But in this case it’s just data gathered while playing a game, making it potentially far more invasive. After all, you expect to be profiled by Facebook. Not so much by your game developer. And yet it’s happening anyway:

    Wired

    The Unnerv­ing Rise of Video Games that Spy on You
    Play­ers gen­er­ate a wealth of reveal­ing psy­cho­log­i­cal data—and some com­pa­nies are soak­ing it up.

    Ben Eglis­ton
    Feb 1, 2022 8:00 AM

    Tech con­glom­er­ate Ten­cent caused a stir last year with the announce­ment that it would com­ply with China’s direc­tive to incor­po­rate facial recog­ni­tion tech­nol­o­gy into its games in the coun­try. The move was in line with China’s strict gam­ing reg­u­la­tion poli­cies, which impose lim­its on how much time minors can spend play­ing video games—an effort to curb addic­tive behav­ior, since gam­ing is labeled by the state as “spir­i­tu­al opi­um.”

    The state’s use of bio­met­ric data to police its pop­u­la­tion is, of course, inva­sive, and espe­cial­ly under­mines the pri­va­cy of under­age users—but Ten­cent is not the only video game com­pa­ny to track its play­ers, nor is this recent case an alto­geth­er new phe­nom­e­non. All over the world, video games, one of the most wide­ly adopt­ed dig­i­tal media forms, are installing net­works of sur­veil­lance and con­trol.

    In basic terms, video games are sys­tems that trans­late phys­i­cal inputs—such as hand move­ment or gesture—into var­i­ous elec­tric or elec­tron­ic machine-read­able out­puts. The user, by act­ing in ways that com­ply with the rules of the game and the spec­i­fi­ca­tions of the hard­ware, is parsed as data by the video game. Writ­ing almost a decade ago, the soci­ol­o­gists Jen­nifer R. Whit­son and Bart Simon argued that games are increas­ing­ly under­stood as sys­tems that eas­i­ly allow the reduc­tion of human action into know­able and pre­dictable for­mats.

    Video games, then, are a nat­ur­al medi­um for track­ing, and researchers have long argued that large data sets about play­ers’ in-game activ­i­ties are a rich resource in under­stand­ing play­er psy­chol­o­gy and cog­ni­tion. In one study from 2012, Nick Yee, Nico­las Duch­e­neaut, and Les Nel­son scraped play­er activ­i­ty data logged on the World of War­craft Armory website—essentially a data­base that records all the things a player’s char­ac­ter has done in the game (how many of a cer­tain mon­ster I’ve killed, how many times I’ve died, how many fish I’ve caught, and so on).

    The researchers used this data to infer per­son­al­i­ty char­ac­ter­is­tics (in com­bi­na­tion with data yield­ed through a sur­vey). The paper sug­gests, for exam­ple, that there is a cor­re­la­tion between the sur­vey respon­dents clas­si­fied as more con­sci­en­tious in their game-play­ing approach and the ten­den­cy to spend more time doing repet­i­tive and dull in-game tasks, such as fish­ing. Con­verse­ly, those whose char­ac­ters more often fell to death from high places were less con­sci­en­tious, accord­ing to their sur­vey respons­es.

    Cor­re­la­tion between per­son­al­i­ty and quan­ti­ta­tive game­play data is cer­tain­ly not unprob­lem­at­ic. The rela­tion­ship between per­son­al­i­ty and iden­ti­ty and video game activ­i­ty is com­plex and idio­syn­crat­ic; for instance, research sug­gests that gamer iden­ti­ty inter­sects with gen­der, racial, and sex­u­al iden­ti­ty. Addi­tion­al­ly, there has been gen­er­al push­back against claims of Big Data’s pro­duc­tion of new knowl­edge root­ed in cor­re­la­tion. Despite this, games com­pa­nies increas­ing­ly real­ize the val­ue of big data sets to gain insight into what a play­er likes, how they play, what they play, what they’ll like­ly spend mon­ey on (in freemi­um games), how and when to offer the right con­tent, and how to solic­it the right kinds of play­er feel­ings.

    While there are no num­bers on how many video game com­pa­nies are sur­veilling their play­ers in-game (although, as a recent arti­cle sug­gests, large pub­lish­ers and devel­op­ers like Epic, EA, and Activi­sion explic­it­ly state they cap­ture user data in their license agree­ments), a new indus­try of firms sell­ing mid­dle­ware “data ana­lyt­ics” tools, often used by game devel­op­ers, has sprung up. These data ana­lyt­ics tools promise to make users more amenable to con­tin­ued con­sump­tion through the use of data analy­sis at scale. Such ana­lyt­ics, once avail­able only to the largest video game studios—which could hire data sci­en­tists to cap­ture, clean, and ana­lyze the data, and soft­ware engi­neers to devel­op in-house ana­lyt­ics tools—are now com­mon­place across the entire indus­try, pitched as “acces­si­ble” tools that pro­vide a com­pet­i­tive edge in a crowd­ed mar­ket­place by com­pa­nies like Uni­ty, Game­An­a­lyt­ics, or Ama­zon Web Ser­vices. (Although, as a recent study shows, the extent to which these tools are tru­ly “acces­si­ble” is ques­tion­able, requir­ing tech­ni­cal exper­tise and time to imple­ment.) As demand for data-dri­ven insight has grown, so have the range of dif­fer­ent services—dozens of tools in the past sev­er­al years alone, pro­vid­ing game devel­op­ers with dif­fer­ent forms of insight. One tool—essen­tial­ly Uber for playtesting—allows com­pa­nies to out­source qual­i­ty assur­ance test­ing, and pro­vides data-dri­ven insight into the results. Anoth­er sup­pos­ed­ly uses AI to under­stand play­er val­ue and max­i­mize reten­tion (and spend­ing, with a focus on high-spenders).

    Devel­op­ers might use data from these mid­dle­ware com­pa­nies to fur­ther refine their game (play­ers might be get­ting over­ly frus­trat­ed and dying at a par­tic­u­lar point, indi­cat­ing the game might be too dif­fi­cult) or their mon­e­ti­za­tion strate­gies (prompt­ing in-app purchases—such as extra lives—at such a point of dif­fi­cul­ty). But our data is not just valu­able to video game com­pa­nies in fine-tun­ing design. Increas­ing­ly, video game com­pa­nies exploit this data to cap­i­tal­ize user atten­tion through tar­get­ed adver­tise­ments. As a 2019 eMar­keter report sug­gests, the val­ue of video games as a medi­um for adver­tis­ing is not just in access to large-scale audi­ence data (such as the Uni­ty ad network’s claim to bil­lions of users), but through ad for­mats such as playable and reward­ed advertisements—that is, access to audi­ences more like­ly to pay atten­tion to an ad.

    These adver­tise­ments serve numer­ous ends, such as facil­i­tat­ing user acqui­si­tion (ads for oth­er games or apps), and increas­ing­ly, brand adver­tis­ing. Sim­i­lar to the approach of dig­i­tal adver­tis­ing giants Google and Face­book, where the data gen­er­at­ed by plat­form users (clicks, swipes, likes, dis­likes, pur­chas­es, move­ments, behav­iors, inter­ests, and so on) sup­pos­ed­ly facil­i­tates the place­ment of adver­tise­ments in front of the “right” audi­ences (as Unity’s exec­u­tives note in a tran­script of a recent quar­ter­ly earn­ings call), video game com­pa­nies are attempt­ing to har­ness the bil­lions of inter­ac­tions that take place with­in their games to cre­ate new rev­enue streams. These com­pa­nies sell the eye­balls (and per­haps fin­gers, with playable ads) of their users to adver­tis­ers and mobi­lize data to best match users with adver­tis­ers based on the spec­i­fi­ca­tions of the adver­tis­er or the soft­ware work­ing on the advertiser’s behalf.

    The data-rich­ness of video games has also had an impact beyond the video game industry’s attempts to shape play­er atten­tion. The log­ic of games is used to gam­i­fy func­tions and derive infor­ma­tion that might not have been oth­er­wise vol­un­teered. Indeed, Yee and col­leagues’ study of World of War­craft play­er moti­va­tion frames the val­ue of cor­re­lat­ing sen­ti­ment or per­son­al­i­ty with user activ­i­ty around the growth of gam­i­fi­ca­tion in soci­ety. To bet­ter under­stand how and why peo­ple play games in cer­tain ways is, as the authors sug­gest, to bet­ter under­stand how to make game­like inter­faces beyond the con­text of gam­ing more com­pelling.

    For instance, the Go365 health insur­ance app solicit­ed infor­ma­tion from users—such as blood glu­cose lev­els, sleep cycle, diet, whether they drink or smoke, or wider fam­i­ly med­ical his­to­ries— using gam­i­fi­ca­tion log­ics of points and rewards to devel­op (more prof­itable) per­son­al­ized insur­ance pro­files. These iden­ti­fied cat­e­gories of risk that pre­clude some from cer­tain kinds of insur­ance or dri­ve up their pre­mi­ums.

    A 2017 arti­cle in The New York Times revealed that Uber’s dri­ver inter­face used gam­i­fi­ca­tion tech­niques like rewards and points to cre­ate a “per­fect­ly effi­cient sys­tem” where dri­ver sup­ply can meet rid­er demand. Cru­cial­ly, the arti­cle revealed that Uber—through employ­ing both social and data scientists—optimized these sys­tems for com­pelling con­tin­ued labor sus­tain­ing the plat­form.

    Beyond gamification techniques optimized using data, we are beginning to see the use of gamification techniques to generate data about worker performance. Amazon’s warehouses are reportedly beginning to gamify labor to further make workers keep to “Amazon pace” (somewhere between walking and jogging)—a move that quite literally resembles the plot of an episode of the dystopian television show Black Mirror. As The Washington Post has reported, high performance in these (currently optional) games—with titles like MissionRacer, PicksInSpace, Dragon Duel, and CastleCrafter—can be exchanged for “Swag Bucks, a proprietary currency that can be used to buy Amazon logo stickers, apparel or other goods.” Under the guise of gamification, it is not a stretch to imagine how workers may be further disciplined through more invasive data-veillance in order to intensify their productivity at the expense of their welfare.

    Because video games are sys­tems that trans­late human inputs into machine-read­able data, they have been afford­ed impor­tant sta­tus in dri­ving so-called Sil­i­con Val­ley inno­va­tion. One area has been the appli­ca­tion of games in the devel­op­ment of AI. In a kind of one-upman­ship of chess-play­ing algo­rithms, Alphabet’s AlphaS­tar AI and OpenAI’s Ope­nAI Five were trained to play the strat­e­gy games Star­craft 2 and Dota 2, respectively—famously, best­ing some of the world’s top play­ers. To do so, these AI were trained using tech­niques like rein­force­ment learn­ing, where essen­tial­ly the AI played match­es against itself—churning through thou­sands of years’ worth of game­play (and learn­ing from this data) with­in months.

    For these com­pa­nies, learn­ing to play video games at a high lev­el isn’t the end goal. For a com­pa­ny like Ope­nAI, train­ing on Dota 2 has appli­ca­tions to phys­i­cal robot­ics. Darpa—the Depart­ment of Defense’s research and devel­op­ment arm—has spon­sored efforts to use games to devel­op AI for mil­i­tary appli­ca­tion. Gamebreaker—a project that engages both acad­e­mia and indus­try (includ­ing defense and arms con­trac­tors like Lock­heed Mar­tin and Northrop Grum­man)— aims to use video-game-play­ing “AI to exploit engage­ment mod­els … to enable intel­li­gent sys­tems that could in turn enhance mil­i­tary strat­e­gy.”

    Train­ing AI on com­plex games—games that take humans thou­sands of hours to master—also serves to drum up sup­port for AI, sell­ing it to investors, pol­i­cy­mak­ers, and publics as some­thing cred­i­ble amid grow­ing crit­i­cism about exag­ger­at­ed (or out­right fraud­u­lent) claims of its effi­ca­cy and verac­i­ty. If we are told that AI can mas­ter Star­craft, then it might make us feel a bit bet­ter about the prospect of AI dri­ving a car, assess­ing debt, and so on.

    More speculatively, video games are aligning with the development of new forms of embodied computing interfaces, a training ground for technologies such as brain-computer interfaces (BCI)—augmenting brain capabilities with computation. Valve Corporation founder Gabe Newell, whose company was an early adopter of BCI in the video game industry, suggests that BCI-enabled games, built into things like future VR headsets, could well track data points telling us whether people are happy, sad, surprised, or bored. Recently, The Financial Times reported on a series of patents granted to Meta that suggest that future augmented and virtual reality headsets (where the company sees gaming as one major application) may use biometric data (such as gaze, in one patent) leveraged for purposes such as advertising. In this sense, not only can games be used to make inferences about us from our choices, the value proposition of forms of embodied computing interfaces from VR to AR to BCIs is to provide access to the physiological processes underpinning those choices.

    ...

    —————-

    “The Unnerv­ing Rise of Video Games that Spy on You” by Ben Eglis­ton; Wired; 02/01/2022

    While there are no num­bers on how many video game com­pa­nies are sur­veilling their play­ers in-game (although, as a recent arti­cle sug­gests, large pub­lish­ers and devel­op­ers like Epic, EA, and Activi­sion explic­it­ly state they cap­ture user data in their license agree­ments), a new indus­try of firms sell­ing mid­dle­ware “data ana­lyt­ics” tools, often used by game devel­op­ers, has sprung up. These data ana­lyt­ics tools promise to make users more amenable to con­tin­ued con­sump­tion through the use of data analy­sis at scale. Such ana­lyt­ics, once avail­able only to the largest video game studios—which could hire data sci­en­tists to cap­ture, clean, and ana­lyze the data, and soft­ware engi­neers to devel­op in-house ana­lyt­ics tools—are now com­mon­place across the entire indus­try, pitched as “acces­si­ble” tools that pro­vide a com­pet­i­tive edge in a crowd­ed mar­ket­place by com­pa­nies like Uni­ty, Game­An­a­lyt­ics, or Ama­zon Web Ser­vices. (Although, as a recent study shows, the extent to which these tools are tru­ly “acces­si­ble” is ques­tion­able, requir­ing tech­ni­cal exper­tise and time to imple­ment.) As demand for data-dri­ven insight has grown, so have the range of dif­fer­ent services—dozens of tools in the past sev­er­al years alone, pro­vid­ing game devel­op­ers with dif­fer­ent forms of insight. One tool—essen­tial­ly Uber for playtesting—allows com­pa­nies to out­source qual­i­ty assur­ance test­ing, and pro­vides data-dri­ven insight into the results. Anoth­er sup­pos­ed­ly uses AI to under­stand play­er val­ue and max­i­mize reten­tion (and spend­ing, with a focus on high-spenders).

    A whole new industry of data extraction has blossomed inside the video game industry, with the analytics drawn from the gameplay itself being used for everything from targeted ads inside the games to refining the games themselves (and potentially making them even more engaging and/or addictive). But it’s the potential ability to draw inferences about the personality characteristics of the players themselves that makes Big Data harvesting of video games an incredibly powerful new source of data:

    ...
    In basic terms, video games are sys­tems that trans­late phys­i­cal inputs—such as hand move­ment or gesture—into var­i­ous elec­tric or elec­tron­ic machine-read­able out­puts. The user, by act­ing in ways that com­ply with the rules of the game and the spec­i­fi­ca­tions of the hard­ware, is parsed as data by the video game. Writ­ing almost a decade ago, the soci­ol­o­gists Jen­nifer R. Whit­son and Bart Simon argued that games are increas­ing­ly under­stood as sys­tems that eas­i­ly allow the reduc­tion of human action into know­able and pre­dictable for­mats.

    Video games, then, are a nat­ur­al medi­um for track­ing, and researchers have long argued that large data sets about play­ers’ in-game activ­i­ties are a rich resource in under­stand­ing play­er psy­chol­o­gy and cog­ni­tion. In one study from 2012, Nick Yee, Nico­las Duch­e­neaut, and Les Nel­son scraped play­er activ­i­ty data logged on the World of War­craft Armory website—essentially a data­base that records all the things a player’s char­ac­ter has done in the game (how many of a cer­tain mon­ster I’ve killed, how many times I’ve died, how many fish I’ve caught, and so on).

    The researchers used this data to infer per­son­al­i­ty char­ac­ter­is­tics (in com­bi­na­tion with data yield­ed through a sur­vey). The paper sug­gests, for exam­ple, that there is a cor­re­la­tion between the sur­vey respon­dents clas­si­fied as more con­sci­en­tious in their game-play­ing approach and the ten­den­cy to spend more time doing repet­i­tive and dull in-game tasks, such as fish­ing. Con­verse­ly, those whose char­ac­ters more often fell to death from high places were less con­sci­en­tious, accord­ing to their sur­vey respons­es.

    Cor­re­la­tion between per­son­al­i­ty and quan­ti­ta­tive game­play data is cer­tain­ly not unprob­lem­at­ic. The rela­tion­ship between per­son­al­i­ty and iden­ti­ty and video game activ­i­ty is com­plex and idio­syn­crat­ic; for instance, research sug­gests that gamer iden­ti­ty inter­sects with gen­der, racial, and sex­u­al iden­ti­ty. Addi­tion­al­ly, there has been gen­er­al push­back against claims of Big Data’s pro­duc­tion of new knowl­edge root­ed in cor­re­la­tion. Despite this, games com­pa­nies increas­ing­ly real­ize the val­ue of big data sets to gain insight into what a play­er likes, how they play, what they play, what they’ll like­ly spend mon­ey on (in freemi­um games), how and when to offer the right con­tent, and how to solic­it the right kinds of play­er feel­ings.
    ...
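    The kind of inference described in that study can be illustrated with a toy sketch: compute the correlation between an in-game behavioral metric and a survey-derived personality score. Everything below is invented for illustration (the player numbers are fabricated, and the actual study used far larger data sets and more sophisticated modeling):

```python
# Hypothetical sketch of the Yee et al. style of analysis: correlate an
# in-game metric (hours spent fishing, per the Armory logs) with a
# survey-derived conscientiousness score. All values are invented.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One row per hypothetical player.
hours_fishing = [1, 4, 2, 8, 6, 3, 9, 5]
conscientiousness = [2.1, 3.0, 2.4, 4.2, 3.8, 2.9, 4.5, 3.3]

r = pearson_r(hours_fishing, conscientiousness)  # strongly positive here
```

    A strong correlation in data like this is exactly the kind of signal that, at scale, lets a company guess at a player's disposition without ever asking a survey question.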

    And this is all, of course, still just the dawning of this era of Big Data harvesting from video game players. Between DARPA’s funding of work in this space to enhance military training and the commercial interest in technologies like chips embedded in the brain, it’s just a matter of time before the technological infrastructure needed to harvest this data is not just available but effectively ubiquitous:

    ...
    For these com­pa­nies, learn­ing to play video games at a high lev­el isn’t the end goal. For a com­pa­ny like Ope­nAI, train­ing on Dota 2 has appli­ca­tions to phys­i­cal robot­ics. Darpa—the Depart­ment of Defense’s research and devel­op­ment arm—has spon­sored efforts to use games to devel­op AI for mil­i­tary appli­ca­tion. Gamebreaker—a project that engages both acad­e­mia and indus­try (includ­ing defense and arms con­trac­tors like Lock­heed Mar­tin and Northrop Grum­man)— aims to use video-game-play­ing “AI to exploit engage­ment mod­els … to enable intel­li­gent sys­tems that could in turn enhance mil­i­tary strat­e­gy.”

    Train­ing AI on com­plex games—games that take humans thou­sands of hours to master—also serves to drum up sup­port for AI, sell­ing it to investors, pol­i­cy­mak­ers, and publics as some­thing cred­i­ble amid grow­ing crit­i­cism about exag­ger­at­ed (or out­right fraud­u­lent) claims of its effi­ca­cy and verac­i­ty. If we are told that AI can mas­ter Star­craft, then it might make us feel a bit bet­ter about the prospect of AI dri­ving a car, assess­ing debt, and so on.

    More speculatively, video games are aligning with the development of new forms of embodied computing interfaces, a training ground for technologies such as brain-computer interfaces (BCI)—augmenting brain capabilities with computation. Valve Corporation founder Gabe Newell, whose company was an early adopter of BCI in the video game industry, suggests that BCI-enabled games, built into things like future VR headsets, could well track data points telling us whether people are happy, sad, surprised, or bored. Recently, The Financial Times reported on a series of patents granted to Meta that suggest that future augmented and virtual reality headsets (where the company sees gaming as one major application) may use biometric data (such as gaze, in one patent) leveraged for purposes such as advertising. In this sense, not only can games be used to make inferences about us from our choices, the value proposition of forms of embodied computing interfaces from VR to AR to BCIs is to provide access to the physiological processes underpinning those choices.
    ...
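    The self-play training mentioned above, where an AI improves by playing matches against itself, can be sketched in miniature. This is a hypothetical toy, nothing like the scale of AlphaStar or OpenAI Five: a single shared value table learns one-pile Nim (take 1 to 3 stones; whoever takes the last stone wins) purely from the outcomes of games it plays against itself:

```python
import random

def train_selfplay(pile=10, episodes=30000, alpha=0.1, eps=0.2, seed=1):
    """Toy self-play: both 'players' share one value table, so every
    game generates training data for winner and loser alike."""
    random.seed(seed)
    Q = {}  # (stones_remaining, stones_taken) -> estimated return
    for _ in range(episodes):
        stones, history = pile, []
        while stones > 0:
            actions = [a for a in (1, 2, 3) if a <= stones]
            if random.random() < eps:   # explore occasionally
                a = random.choice(actions)
            else:                       # otherwise play greedily vs. itself
                a = max(actions, key=lambda x: Q.get((stones, x), 0.0))
            history.append((stones, a))
            stones -= a
        # The side that took the last stone won; credit the moves with
        # alternating +1/-1 Monte Carlo returns, most recent move first.
        ret = 1.0
        for state, action in reversed(history):
            old = Q.get((state, action), 0.0)
            Q[(state, action)] = old + alpha * (ret - old)
            ret = -ret
    return Q

Q = train_selfplay()
# Leaving the opponent a multiple of 4 stones is the known winning
# strategy, so with 6 stones on the table the learned policy takes 2.
best_move = max((1, 2, 3), key=lambda a: Q.get((6, a), 0.0))
```

    The serious systems replace the lookup table with deep neural networks and churn through thousands of years' worth of simulated play, but the loop is the same: play yourself, score the outcome, update, repeat.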

    It’s almost inevitable: the richer and more advanced gaming becomes, the more data these games will end up gathering on their users. The more data they gather, the better these games will get at things like targeted in-game ads and other revenue streams that further drive the development of these gaming Big Data technologies. The self-reinforcing cycle of profit-driven technological advancement is already spinning, and all signs point toward the emergence of new fully-immersive gaming technology that could drown this industry in an ocean of personalized data. The kind of ocean of personalized data that’s only going to fuel the hunger for even more data and more immersive gaming experiences. It points toward what is perhaps the most disturbing part of the story of Big Data gaming: it’s going to be a highly seductive and very entertaining Panopticon. And not just for the players.

    Posted by Pterrafractyl | February 9, 2022, 2:06 pm
