Wednesday, 3 February 2021

Top Gunned: Behind the Persistent Pushing of the Envelope in 90s and 2000s Pop Culture

Where Did That Proverbial "Envelope" Go Anyway? And What Was Its Ultimate Purpose?


by James Albert Barr

"The world is a business, Mr. Beale!" - Network (1976)

Where did the phrase "pushing the envelope" originate? Apparently, it emerged in 1944, during the late stages of World War II, among American military aviators, particularly in test flying: an aircraft's "flight envelope" is the set of its designated altitude and speed limits, and "pushing" that envelope means flying right at, or beyond, those limits, where things become dangerous, and possibly fatal, for the pilot. The term "envelope" was actually first used in mathematics, in a technical and engineering context, before being applied directly to aeronautics.

It wasn't until famed author Tom Wolfe incorporated "pushing the envelope" into his 1979 book, The Right Stuff, that the phrase started to gain traction in popular culture. One film featuring naval aviators and fighter planes as its main subject, the 1986 blockbuster Top Gun, which launched Tom Cruise's career as a bona fide movie star into the proverbial stratosphere, references "the envelope" in an early scene where Tom Skerritt's character, flight instructor Mike "Viper" Metcalf, addresses "the top 1% of naval aviators - the elite", and says: "We're gonna teach you to fly the F-14 right to the edge of the envelope - faster than you've ever flown before... but more dangerous." - "Highway toooo the danger zone!/ I'll take you riiiight intooo the danger zone!"

I find it very interesting that during this scene, when Viper is talking about "the envelope" and the danger involved in riding its edge or beyond, the camera is off of Tom Skerritt and is instead focused on Tom "Maverick" Cruise, who is distracted by Val "Iceman" Kilmer - who himself is not paying attention to his instructor - attempting to psych Maverick out by clumsily flipping a pen between his fingers. We, the audience, likewise have our attention directed towards Maverick and Iceman, and thus likely don't register what Viper is saying about "the envelope". It was clearly a deliberate editing choice to pull the audience away from the spoken dialogue and direct our collective attention onto the two rivals goofing off. Why was that?


It's no secret that Top Gun was essentially a glorified, nearly two-hour ad for naval-aviation recruitment, and was likely funded, at least partially, by the American military-industrial complex. Well, the blockbuster "commercial" worked perfectly, because recruitment reportedly "skyrocketed" by some 400% or so shortly after the film's screen-run. This could very well be a textbook example of "predictive programming" aimed at the unsuspecting, and relatively "predictable", masses/sheep/specific demographic. It'll be very interesting to read the new semiotics and subtext infused into the upcoming sequel, Top Gun: Maverick - a post-9/11, Trump-era film (or at least one filmed during the Trump era) - which is supposedly getting released in July.

I remember, while living through the 1990s and much of the 2000s, that a certain phrase was often uttered regarding what was generally described as "risque and provocative pop culture". Of course, that phrase was this: "It's pushing the envelope" or "They're pushing the envelope". The phrase had transcended its original definition. It wasn't yet as bandied about in 1980s popular vernacular as it would be the following decade, if memory serves, being a teenager myself in the 80s. A certain movie (The Silence of the Lambs, Pulp Fiction, Se7en, etc.), TV show (Seinfeld, Twin Peaks, The Larry Sanders Show, etc.), song ("Smells Like Teen Spirit", "Cop Killer", "Smack My Bitch Up", etc.), album (Ritual de lo Habitual, Ready to Die, Mechanical Animals, etc.), book (American Psycho, Daddy's Roommate, Infinite Jest, etc.) or piece of art, what-have-you, predominantly aimed at the 18-35 demographic, would sometimes court enough controversy and negative reaction that it would enter the public discourse and even get national exposure in newspaper columns, editorials and mainstream news programs, as well as on the sensationalist afternoon talk shows (Ricki Lake, Jerry Springer, Montel Williams, etc.), which, by the 90s, were all the rage.

In fact, the notion of "pushing the envelope" had become so pervasive by the mid-2000s, as entertainment got more and more risque, edgy and controversial, that I felt compelled to write down these three questions in one of my several miscellaneous notebooks, where I collected quotes, citations, lyrics, poems and whatever general thoughts and ideas suddenly came to me:

1. What does it really mean to push the figurative "envelope"?
2. Is it not the case that, right now, a permissive, illusory pushing of "the envelope" is actually in effect?
3. Can "the envelope", conversely, be pushed too far unto irreparable excess and damage?

What did I mean exactly by "a permissive, illusory pushing of 'the envelope'"? I really wasn't quite sure when I originally wrote down that question over a dozen years ago. And what was I driving at in asking whether the so-called "envelope" could be pushed too far, potentially creating "excess" and causing subsequent "damage"? Well, given what has transpired over the last decade in particular - in our culture, our society, our very world - I'm now left with another loaded question:

Could it possibly be the case that the act(s) of "pushing the envelope" helped provoke, unwittingly (or maybe even wittingly, engineered surreptitiously by some outside/inside source), and gave rise to what we now know as Woke and Cancel Culture? Not necessarily as a singular agent that gradually engendered (or rather ungendered, to be punnily more precise) political division among the masses - particularly among coming-of-age millennials (especially those with wide-spreading gender dysphoria and hyper-sensitive dispositions/social anxiety issues, "conveniently enough") and aging, leftist Gen-Xers with a longstanding "axe to grind" - but as a no-less significant symptom/aspect of the over-arching "long march through the institutions", undertaken to gain cultural, political, state, municipal, technological, economic, familial, ethical and philosophical power, ultimately for its own sake, and not for the reasons that most SJWs and identity intersectionalists are being led to believe.

Of course, one of the X-factors - if not the X-factor - behind the utterly implacable, easily "triggered" and rigidly self-righteous "cult of woke" clique, who mostly wield their collective power in the "digital world" online, on social media platforms like Twitter, Instagram and Facebook, is none other than civilization's age-old nemesis of cultural and systemic dysfunction: narcissism!

[from Google Trends]


We've seen article after article, book after book, TV segment after TV segment, YouTube video after YouTube video lamenting the astronomical rise of narcissism in contemporary culture. This is not exactly a societal secret. Whatever side of the cultural, political, philosophical, racial or spiritual divide one falls on, you can rest assured that that societal faction has to deal with narcissism, be its members extroverted or even introverted (I personally know of several introverts who could tenably be diagnosed as full-blown narcissists). It's a universal constant, spanning centuries of human history, especially in situations that involve a "power dynamic", whether public or not. The "public examples" of narcissistic power-mongering, though, can be far more damaging for everyone within their megalomaniacal vicinity, physically or technologically, or, dare I say, even "magically" - if what I've been gathering, relatively of late, from certain "peripheral sources" harbors any merit at all, that is. Who knows for sure, right? Has Jean Gebser's notion of the magical consciousness structure somehow resurfaced in the cultural (un)consciousness of the early 21st-century human psyche? Perhaps it's no coincidence that magic, sorcery and spell-casting have become all the rage (casually or seriously, via fandoms and/or proliferating (oc)cult activity) in pop culture since shortly after the millennium, a la the post-Harry Potter/Lord of the Rings phenomenon. It gives the discerning (and open-minded) pause, does it not?

This all sounds crazy and tin-foil-y, doesn't it? Of course it does. And therein lies its ironic, counter-tactical power. Just remember, the touchy nomenclature "conspiracy theory" was purportedly manufactured and disseminated by the C.I.A., initially in the 60s - shortly after Jim Garrison's JFK assassination investigation - for the very purpose of pre-emptively grooming conspiracy theories in the collective consciousness/imagination via entertainment and the culture industry, a la the 70s fascination with them in film (All the President's Men, The Conversation, The Parallax View, The Odessa File, Marathon Man, The Boys from Brazil, etc.).

Propaganda and predictive programming (individually and on a mass scale), despite popular opinion, are not just the so-called diabolical games of the typical enemies of North America and its allies, particularly over the course of the past hundred years or so. In actual fact (and I know "facts" are remarkably unpopular with a certain cross-section of the population nowadays, insanely enough), the concept, and utilization, of propaganda was greatly improved upon in the 1920s by an ally, the Austrian-born American Edward Bernays. He even wrote a book pointedly titled Propaganda, published in 1928! Funnily enough, given its, um, tarnished reputation since the Second World War, neither the word nor the very concept carried a negative connotation when Bernays wrote his book.

However, after World War II, thanks predominantly to the Nazis and to Josef Stalin's propaganda machine in his Soviet dictatorship, the word and concept took on, from that point forward, a very negative connotation. So Bernays had to come up with another term to describe his line of business, which would go on to ever greater heights of success and effectiveness: "public relations". That's right, he was "the father of public relations", and its practice would be disseminated all through North American and European culture to the point that it is now the very air we all breathe - unconsciously, like fish in water - within the all-pervasive ether that the late Mark Fisher referred to as Capitalist Realism.

I'll leave you to mentally and emotionally digest the opening two paragraphs of Edward Bernays' monumentally influential 1928 book, Propaganda, which has effectively served as an instruction manual for the globalists, the corporatists, the market handlers, the tech moguls - the real slave-drivers who own and run Earth Incorporated:

"The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.

We are governed, our minds are molded, our tastes formed, our ideas suggested, largely by men we have never heard of. This is a logical result of the way in which our democratic society is organized. Vast numbers of human beings must cooperate in this manner if they are to live together as a smoothly functioning society."

How's that for "pushing the envelope", huh?

Saturday, 14 November 2020

The Hauntology of the 16th Century "Bifurcation of Human Consciousness" and the Rise of a New Vanishing Mediator

The Ghosts of the Past Continue to Haunt our Futures while Transitioning into the Digital Age

by James Albert Barr


"It is the task of the philosopher to restore, by representation, the primacy of the symbolic character of the word, in which the idea is given self-consciousness, and that is the opposite of all outwardly-directed communication. Since philosophy may not presume to speak in the tones of revelation, this can only be achieved by recalling in the memory the primordial form of perception." - Walter Benjamin: The Origin of German Tragic Drama 

"What is to be insisted upon is that the poet must develop or procure the consciousness of the past and that he should continue to develop this consciousness throughout his career." - T.S. Eliot: Tradition and the Individual Talent

"Just when I think I'm winning
 When I've broken every door
 The ghosts of my life blow wilder than before
 Just when I thought I could not be stopped
 When my chance came to be king
 The ghosts of my life blew wilder than the wind." - "Ghosts" by David Sylvian of the new wave band Japan

"Cannot the time be rejointed?... O cursed, aeon,
 That ever we were born to its anon and on." - variation on Shakespeare's Hamlet upon confronting his father's ghost



According to Fredric Jameson (following Max Weber's initial conclusions) in his 1988 essay, The Vanishing Mediator; or, Max Weber as Storyteller, Protestantism enabled the Western historical conditions that unwittingly created capitalism - as opposed to the orthodox Marxist account of "class struggle" working through historical materialism. Protestantism was, allegedly, the "vanishing mediator" between medieval feudalism and modern capitalism. Consequently, via Hegel's concept of the "negation of the negation", Protestantism, which had negated feudalism, was itself negated by capitalism, becoming a privatised activity of personal worship after having taken on - through the initial Lutheran advent and then the teachings and example of Calvinism - an ascetic-acquisitive stance prior to capitalism's emergence. Jameson came to this conclusion via his study of Max Weber's monumental sociological study, The Protestant Ethic and the Spirit of Capitalism, published shortly after the turn of the 20th century, in 1905.

Slavoj Zizek borrowed Jameson's "vanishing mediator" concept and has used it in his own work, such as For They Know Not What They Do; indeed, it was actually Zizek who first noticed a "negation of the negation" in the vanishing mediator. So, what actually provoked the "bifurcation of human consciousness" and the "dissociation of sensibility" that Frank Kermode and T.S. Eliot, respectively, put forward? Keith Alldritt, in his 1968 study of George Orwell, The Making of George Orwell: an Essay in Literary History, summed up Kermode's claims thusly:

"As Frank Kermode has reminded us in his The Romantic Image (a study of William Butler Yeats), a fundamental tenet of symbolism is that at some point in the history of western civilisation there occurred a 'bifurcation of human consciousness' which resulted in a degeneration in the quality of art, culture and life itself. This is the assumption that lies behind Yeats' question: 'had not Europe shared one mind and heart until both mind and heart began to break in to fragments a little before Shakespeare's birth?' And it is the same assumption which is present in T.S. Eliot's concept of 'the dissociation of sensibility'. Yeats and Eliot [both identify]... a moment in history at which both the unity of the individual mind and the unity of society become broken, and which heralds a subsequent impoverishment of life, that very impoverishment which stands as the prime donnee of symbolist thought and art."

And in Eliot's 1921 essay, The Metaphysical Poets, he states, by implication, that this dissociation set in after the initial "bifurcation of human consciousness" of the 16th century - shortly after the Reformation began, and Henry VIII converted the Church of England from Catholicism to Protestantism:

"In the seventeenth century a dissociation of sensibility set in, from which we have never recovered; and this dissociation, as is natural, was aggravated by the influence of the two most powerful poets of the century, Milton and Dryden. Each of these men performed certain poetic functions so magnificently well that the magnitude of the effect concealed the absence of others. The language went on and in some respects improved; the best verse of Collins, Gray, Johnson, and even Goldsmith satisfies some of our fastidious demands better than that of Donne or Marvell or King. But while the language became more refined, the feeling became more crude. The feeling, the sensibility, expressed in the Country Churchyard (to say nothing of Tennyson and Browning) is cruder than that in the Coy Mistress

The second effect of the influence of Milton and Dryden followed from the first, and was therefore slow in manifestation. The sentimental age began early in the eighteenth century, and continued. The poets revolted against the ratiocinative, the descriptive; they thought and felt in fits, unbalanced; they reflected. In one or two passages of Shelley's Triumph of Life, in the second Hyperion there are traces of a struggle toward unification of sensibility. But Keats and Shelley died, and Tennyson and Browning ruminated."

Eliot never mentioned anything regarding capitalism, or the "bifurcation of human consciousness", at least not directly, in his brief description of the "dissociation of sensibility" concept in his essay, and, to my knowledge, Kermode didn't mention capitalism in his Romantic Image either. But if I conjoin the ramifications of Eliot's and Kermode's respective theories with Walter Benjamin's Origin of German Tragic Drama, and all it entails pertaining to the late-16th and early-17th century conditions that engendered its particular idiom, tone and attitude - not just in Germany, of course, but in all of Europe during that critical time-span - then a connection with the advent of capitalism is certainly tenable, because that's precisely what Benjamin concludes in his indispensable 1928 book. And not only did capitalism emerge gradually (like a "rough beast, its hour come round at last/ Slouching towards [modern man] to be born!") from the ascetic-acquisitiveness of Protestantism, as Weber concluded, but it also, by consequence, inaugurated modernity itself through the crucial alteration of time - hence Hamlet's famous, implication-filled declaration: "The time is out of joint!"

Interestingly, Benjamin felt that Weber had not been radical enough in his critical study of capitalism's emergence. Benjamin saw Protestantism not just as the condition that stirred the early iterations of capitalism, but as the unwitting host to capitalism's parasitic nature - a parasite which itself, again according to Benjamin, became a "religion" of sorts: "Capitalism is an unprecedented religion which offers not the reform of existence but its complete destruction."

Benjamin differentiated between Greek tragedy and the German Trauerspiel, or "mourning play" (of which genre Benjamin declares Hamlet the greatest example, despite its English derivation), by describing the former as a form which used "symbol", and a linear, resolvable sense of time, to depict the action, whereas the latter used the Baroque sense of "allegory", via much ostentation and dramatic excess, and revealed a non-linear, fractured, unresolvable sense of time in the action - one chock-full of ambiguity, disunity, delay and catastrophe, creating an ontologically new experience which was, ultimately, modern. In his Origin of German Tragic Drama, Benjamin says:

"Origin [Ursprung], although an entirely historical category, has, nevertheless, nothing to do with genesis [Entstehung]. The term origin is not intended to describe the process by which the existent came into being, but rather to describe that which emerges from the process of becoming and disappearance. Origin is an eddy in the stream of becoming, and in its current it swallows the material involved in the process of genesis. That which is original is never revealed in the naked and manifest existence of the factual; its rhythm is apparent only to a dual insight. On the one hand it needs to be recognized as a process of restoration and reestablishment, but, on the other hand, and precisely because of this, as something imperfect and incomplete."...Hence, Jameson's notion of the "vanishing mediator" and Derrida's, and later Mark Fisher's, idea of hauntology, half a century or more after Benjamin's seminal book was published.

And so, what exactly was being "mourned", and ultimately "haunting", in these Trauerspiele/mourning plays during the first half of the 17th century? Simply put, man's sense of agency, which was "denied any spiritual effect to human action" because, as Martin Luther proclaimed, man's only path to salvation depended on grace through faith alone - thus closing the book, so to speak, on any competing theologies, and bringing about Divine Right monarchs, absolutist states and, of course, ever-expanding commercial markets. But what was really being mourned off-stage, out of a collective unknowingness on the surface, was the coming, inexorable transition into "capitalist modernity". The plays of German playwrights such as Daniel Casper von Lohenstein (1635-83) and Andreas Gryphius (1616-64), as well as the Spaniard Pedro Calderon de la Barca (particularly in his 1635 play, Life is a Dream), exemplified the characteristics of these mournful times, arguably initiated in Shakespeare's tragedies, namely Hamlet. The world had become drained of all substantial meaning ("What a piece of work is man!"), leaving only Luther's stark notion of faith in scripture alone to fortify oneself against the pervasive meaninglessness and absurdity of the zeitgeist - that, and the Calvinist propensity to accumulate wealth through the Protestant ethic, if only to better one's chances of getting into Heaven once one's hard-working existence was finally "spent".

The Wikipedia entry on Hauntology says, "The term refers to a situation of temporal, historical, and ontological disjunction in which the apparent presence of being is replaced by a deferred non-origin, represented by 'the figure of the ghost as that which is neither present nor absent, neither dead nor alive.' The concept is derived from Derrida's deconstructive method, in which any attempt to locate the origin of identity or history must inevitably find itself dependent on an always-already existing set of linguistic conditions - thus making 'haunting the state proper to being as such'. In the 2000s, the term was taken up by critics in reference to paradoxes found in postmodernity, particularly contemporary culture's persistent recycling of retro aesthetics and incapacity to escape old social forms. Critics like Mark Fisher and Simon Reynolds have used the term to describe art preoccupied with this temporal disjunction and defined by a 'nostalgia for lost futures.'"

Jacques Derrida's initial use of hauntology as a concept came in his 1993 book, Spectres of Marx: The State of the Debt, the Work of Mourning and the New International. Some musical examples of hauntology can be heard on albums by such acts as Broadcast, Pram, Boards of Canada, Burial and The Caretaker. In the case of The Caretaker, and his 2011 album, An Empty Bliss Beyond This World, for example, he very aptly references Stanley Kubrick's classic horror film, The Shining, which is rife with hauntological aspects.

As it likely was around four hundred years ago, we, in the early part of the 21st century, are also experiencing and wallowing through a very palpable "mourning period" (hence all the confusion, depression, anxiety and mental illness currently abounding in society): the collective loss of a future, and a sense of meaning, that we were foolishly promised in the 20th century but have now been utterly denied, even as it continues to haunt us. In its place, we have had mercilessly imposed upon us a seemingly soulless, post-culture, transgendered, transhumanist, coldly antiseptic (thus compromising immune systems, and strategically so), accelerationist, digitised, robotic, automated and possibly altogether post-human "future". The vanishing mediator that will signal the transition of our world and our human existence and experience, collectively, into the so-called "Great Reset" and the Fourth Industrial Revolution (according to globalist and technocrat Klaus Schwab, who's been planning this since the 70s!), unto quite possibly the Singularity itself, seems likely to be our very nature, our democratic freedom, our human identity, our mortal and immortal mind/s (as a collective spirit and as individuals), and our very heart and soul.

Friday, 2 October 2020

Available Now - Such Late Fugitives (A Collection of 21st Century Poetry) by James Albert Barr

My poetry collection, Such Late Fugitives (A Collection of 21st Century Poetry), was recently made available for purchase on Amazon.com in both paperback (for $8.99 US) and Kindle Ebook (for $3.70 US) versions. Here's the book's description as posted on Amazon:

 

From poet, cultural analyst and blogger James Albert Barr comes this provocative collection of contemporary poetry, which examines and expresses many of the profound and relevant themes and issues concerning a post-millennium world: mired in the semiotic chaos and excess of a digitalised culture, and transitioning, with discernible difficulty, from postmodernism to Hypermodernity. The collection also probes - via intertextuality and literary, as well as philosophical, allusions and citations - our connections with the past and its social, cultural and literary history, placing them in a contemporary context as we find ourselves collectively hurtling, head-long, into an uncertain future.


Please order your copy today, and feel free to rate the book and leave a comment or even a review at Amazon.com and/or right here at my blog, The Culture Fix. Thanks for your acknowledgement and support!


Saturday, 2 May 2020

A Crucial Report from MintPress News Journalist, Whitney Webb: Is a "Soft Singularity" Imminent?


Last year a government commission called for the US to adopt an AI-driven mass surveillance system far beyond that used in any other country. Now, many of the “obstacles” cited as preventing its implementation are being removed under the guise of combating coronavirus.


April 21st, 2020

By Whitney Webb, MPN.news.

WASHINGTON DC (The Last American Vagabond) — Last year, a U.S. government body dedicated to examining how artificial intelligence can “address the national security and defense needs of the United States” discussed in detail the “structural” changes that the American economy and society must undergo in order to ensure a technological advantage over China, according to a recent document acquired through an FOIA request. This document suggests that the U.S. follow China’s lead and even surpass them in many aspects related to AI-driven technologies, particularly their use of mass surveillance. This perspective clearly clashes with the public rhetoric of prominent U.S. government officials and politicians on China, who have labeled the Chinese government’s technology investments and export of its surveillance systems and other technologies as a major “threat” to Americans’ “way of life.”

In addition, many of the steps for the implementation of such a program in the U.S., as laid out in this newly available document, are currently being promoted and implemented as part of the government’s response to the current coronavirus (COVID-19) crisis. This is likely due to the fact that many members of this same body have considerable overlap with the taskforces and advisors currently guiding the government’s plans to “re-open the economy” and efforts to use technology to respond to the current crisis.

The FOIA document, obtained by the Electronic Privacy Information Center (EPIC), was produced by a little-known U.S. government organization called the National Security Commission on Artificial Intelligence (NSCAI). It was created by the 2018 National Defense Authorization Act (NDAA) and its official purpose is “to consider the methods and means necessary to advance the development of artificial intelligence (AI), machine learning, and associated technologies to comprehensively address the national security and defense needs of the United States.”

The NSCAI is a key part of the government’s response to what is often referred to as the coming “fourth industrial revolution,” which has been described as “a revolution characterized by discontinuous technological development in areas like artificial intelligence (AI), big data, fifth-generation telecommunications networking (5G), nanotechnology and biotechnology, robotics, the Internet of Things (IoT), and quantum computing.”

However, their main focus is ensuring that “the United States … maintain a technological advantage in artificial intelligence, machine learning, and other associated technologies related to national security and defense.” The vice-chair of the NSCAI, Robert Work – former Deputy Secretary of Defense and senior fellow at the hawkish Center for a New American Security (CNAS) – described the commission’s purpose as determining “how the U.S. national security apparatus should approach artificial intelligence, including a focus on how the government can work with industry to compete with China’s ‘civil-military fusion’ concept.”

The recently released NSCAI document is a May 2019 presentation entitled “Chinese Tech Landscape Overview.” Throughout the presentation, the NSCAI promotes the overhaul of the U.S. economy and way of life as necessary for allowing the U.S. to ensure it holds a considerable technological advantage over China, as losing this advantage is currently deemed a major “national security” issue by the U.S. national security apparatus. This concern about maintaining a technological advantage can be seen in several other U.S. military documents and think tank reports, several of which have warned that the U.S.’ technological advantage is quickly eroding.

The U.S. government and establishment media outlets often blame alleged Chinese espionage or the Chinese government’s more explicit partnerships with private technology companies in support of their claim that the U.S. is losing this advantage over China. For instance, Chris Darby, the current CEO of the CIA’s In-Q-Tel, who is also on the NSCAI, told CBS News last year that China is the U.S.’ main competitor in terms of technology and that U.S. privacy laws were hampering the U.S.’ capacity to counter China in this regard, stating that:

"[D]ata is the new oil. And China is just awash with data. And they don’t have the same restraints that we do around collecting it and using it, because of the privacy difference between our countries. This notion that they have the largest labeled data set in the world is going to be a huge strength for them."

In another example, Michael Dempsey – former acting Director of National Intelligence and currently a government-funded fellow at the Council on Foreign Relations – argued in The Hill that:

"It’s quite clear, though, that China is determined to erase our technological advantage, and is committing hundreds of billions of dollars to this effort. In particular, China is determined to be a world leader in such areas as artificial intelligence, high performance computing, and synthetic biology. These are the industries that will shape life on the planet and the military balance of power for the next several decades."

In fact, the national security apparatus of the United States is so concerned about losing a technological edge over China that the Pentagon recently decided to join forces directly with the U.S. intelligence community in order “to get in front of Chinese advances in artificial intelligence.” This union resulted in the creation of the Joint Artificial Intelligence Center (JAIC), which ties together “the military’s efforts with those of the Intelligence Community, allowing them to combine efforts in a breakneck push to move government’s AI initiatives forward.” It also coordinates with other government agencies, industry, academics, and U.S. allies. Robert Work, who subsequently became the NSCAI vice-chair, said at the time that JAIC’s creation was a “welcome first step in response to Chinese, and to a lesser extent, Russian, plans to dominate these technologies.”

Similar concerns about “losing” technological advantage to China have also been voiced by the NSCAI chairman, Eric Schmidt, the former head of Alphabet (Google’s parent company), who argued in February in the New York Times that Silicon Valley could soon lose “the technology wars” to China if the U.S. government doesn’t take action. Thus, the three main groups represented within the NSCAI – the intelligence community, the Pentagon and Silicon Valley – all view China’s advancements in AI as a major national security threat (and, in Silicon Valley’s case, a threat to their bottom lines and market shares) that must be tackled quickly.




Targeting China’s “adoption advantage”

In the May 2019 “Chinese Tech Landscape Overview” presentation, the NSCAI notes that, while the U.S. still leads in the “creation” stage of AI and related technologies, it lags behind China in the “adoption” stage due to “structural factors.” It says that “creation”, followed by “adoption” and “iteration”, are the three phases of the “life cycle of new tech” and asserts that failing to dominate in the “adoption” stage will allow China to “leapfrog” the U.S. and dominate AI for the foreseeable future.

The presentation also argues that, in order to “leapfrog” competitors in emerging markets, what is needed is not “individual brilliance” but instead specific “structural conditions that exist within certain markets.” It cites several case studies where China is considered to be “leapfrogging” the U.S. due to major differences in these “structural factors.” Thus, the insinuation of the document (though not directly stated) is that the U.S. must alter the “structural factors” that are currently responsible for its lagging behind China in the “adoption” phase of AI-driven technologies.


Chief among the troublesome “structural factors” highlighted in this presentation are so-called “legacy systems” that are common in the U.S. but much less so in China. The NSCAI document states that examples of “legacy systems” include a financial system that still utilizes cash and card payments, individual car ownership and even receiving medical attention from a human doctor. It states that, while these “legacy systems” in the US are “good enough,” too many “good enough” systems “hinder the adoption of new things,” specifically AI-driven systems.

Another structural factor deemed by the NSCAI to be an obstacle to the U.S.’ ability to maintain a technological advantage over China is the “scale of the consumer market,” arguing that “extreme urban density = on-demand service adoption.” In other words, extreme urbanization results in more people using online or mobile-based “on-demand” services, ranging from ride-sharing to online shopping. It also cites the use of mass surveillance on China’s “huge population base” as an example of how China’s “scale of consumer market” advantage is allowing “China to leap ahead” in the fields of related technologies, like facial recognition.

In addition to the alleged shortcomings of the U.S.’ “legacy systems” and lack of “extreme urban density,” the NSCAI also calls for more “explicit government support and involvement” as a means to speed up the adoption of these systems in the U.S. This includes the government lending its stores of data on civilians to train AI, specifically citing facial recognition databases, and mandating that cities be “re-architected around AVs [autonomous vehicles],” among other measures. Other examples given include the government investing large amounts of money in AI start-ups and adding tech behemoths to a national, public-private AI taskforce focused on smart-city implementation (among other things).

With regards to the latter, the document says “this level of public-private cooperation” in China is “outwardly embraced” by the parties involved, with this “serving as a stark contrast to the controversy around Silicon Valley selling to the U.S. government.” Examples of such controversy, from the NSCAI’s perspective, likely include Google employees petitioning to end the Google-Pentagon “Project Maven,” which uses Google’s AI software to analyze footage captured by drones. Google eventually chose not to renew its Maven contract as a result of the controversy, even though top Google executives viewed the project as a “golden opportunity” to collaborate more closely with the military and intelligence communities.

The document also defines another aspect of government support as the “clearing of regulatory barriers.” This term is used in the document specifically with respect to U.S. privacy laws, despite the fact that the U.S. national security state has long violated these laws with near complete impunity. However, the document seems to suggest that privacy laws in the U.S. should be altered so that what the U.S. government has done “in secret” with private citizen data can be done more openly and more extensively. The NSCAI document also discusses the removal of “regulatory barriers” in order to speed up the adoption of self-driving cars, even though autonomous driving technology has resulted in several deadly and horrific car accidents and presents other safety concerns.

Also discussed is how China’s “adoption advantage” will “allow it to leapfrog the U.S.” in several new fields, including “AI medical diagnosis” and “smart cities.” It then asserts that “the future will be decided at the intersection of private enterprise and policy leaders between China and the U.S.” If this coordination over the global AI market does not occur, the document warns that “we [the U.S.] risk being left out of the discussions where norms around AI are set for the rest of our lifetimes.”

The presentation also dwells considerably on how “the main battleground [in technology] are not the domestic Chinese and US markets,” but what it refers to as the NBU (next billion users) markets, where it states that “Chinese players will aggressively challenge Silicon Valley.” In order to challenge them more successfully, the presentation argues that, “just like we [view] the market of teenagers as a harbinger for new trends, we should look at China.”

The document also expresses concerns about China exporting AI more extensively and intensively than the U.S., saying that China is “already crossing borders” by helping to build facial databases in Zimbabwe and selling image recognition and smart city systems to Malaysia. If allowed to become “the unambiguous leader in AI,” it says that “China could end up writing much of the rulebook of international norms around the deployment of AI” and that it would “broaden China’s sphere of influence amongst an international community that increasingly looks to the pragmatic authoritarianism of China and Singapore as an alternative to Western liberal democracy.”



What will replace America’s “legacy systems”?

Given that the document makes it quite clear that “legacy systems” in the U.S. are impeding its ability to prevent China from “leapfrogging” ahead in AI and then dominating it for the foreseeable future, it is also important to examine what the document suggests should replace these “legacy systems” in the U.S.

As previously mentioned, one “legacy system” cited early on in the presentation is the main means of payment for most Americans: cash and credit/debit cards. The presentation asserts, in contrast to these “legacy systems”, that the best and most advanced system is one that moves entirely to smartphone-based digital wallets.

It notes specifically that the main mobile wallet provider in India, PayTM, is majority-owned by Chinese companies. It quotes an article, which states that “a big break came [in 2016] when India canceled 86% of currency in circulation in an effort to cut corruption and bring more people into the tax net by forcing them to use less cash.” At the time, claims that India’s 2016 “currency reform” would be used as a stepping stone towards a cashless society were dismissed by some as “conspiracy theory.” However, last year, the work of a committee convened by India’s central bank (and led by an Indian tech oligarch who also created India’s massive civilian biometric database) resulted in the Indian government’s “Cashless India” program.

Regarding India’s 2016 “currency reform,” the NSCAI document then asserts that “this would be unfathomable in the West. And unsurprisingly, when 86% of the cash got cancelled and nobody had a credit card, mobile wallets in India exploded, laying the groundwork for a far more advanced payments ecosystem in India than the US.” However, it has become increasingly less unfathomable in light of the current coronavirus crisis, which has seen efforts to reduce the amount of cash used because paper bills may carry the virus as well as efforts to introduce a Federal Reserve-backed “digital dollar.”

In addition, the NSCAI document from last May calls for the end of in-person shopping and promotes moving towards all shopping being performed online. It argues that “American companies have a lot to gain by adopting ideas from Chinese companies” by shifting towards exclusive e-commerce purchasing options. It states that only shopping online provides a “great experience” and also adds that “when buying online is literally the only way to get what you want, consumers go online.”

Another “legacy system” that the NSCAI seeks to overhaul is car ownership, as it promotes autonomous, or self-driving vehicles and further asserts that “fleet ownership > individual ownership.” It specifically points to a need for “a centralized ride-sharing network,” which it says “is needed to coordinate cars to achieve near 100% utilization rates.” However, it warns against ride-sharing networks that “need a human operator paired with each vehicle” and also asserts that “fleet ownership makes more sense” than individual car ownership. It also specifically calls for these fleets to not only be composed of self-driving cars, but electric cars and cites reports that China “has the world’s most aggressive electric vehicle goals….and seek[s] the lead in an emerging industry.”

The document states that China leads in ride-sharing today even though ride-sharing was pioneered first in the U.S. It asserts once again that the U.S. “legacy system” of individual car ownership and lack of “extreme urban density” are responsible for China’s dominance in this area. It also predicts that China will “achieve mass autonomous [vehicle] adoption before the U.S.,” largely because “the lack of mass car ownership [in China] leads to far more consumer receptiveness to AVs [autonomous vehicles].” It then notes that “earlier mass adoption leads to a virtuous cycle that allows Chinese core self-driving tech to accelerate beyond [its] Western counterparts.”

In addition to their vision for a future financial system and future self-driving transport system, the NSCAI has a similarly dystopian vision for surveillance. The document calls mass surveillance “one of the ‘first-and-best customers’ for AI” and “a killer application for deep learning.” It also states that “having streets carpeted with cameras is good infrastructure.”

It then discusses how “an entire generation of AI unicorn” companies are “collecting the bulk of their early revenue from government security contracts” and praises the use of AI in facilitating policing activities. For instance, it lauds reports that “police are making convictions based on phone calls monitored with iFlyTek’s voice-recognition technology” and that “police departments are using [AI] facial recognition tech to assist in everything from catching traffic law violators to resolving murder cases.”

On the point of facial recognition technology specifically, the NSCAI document asserts that China has “leapt ahead” of the US on facial recognition, even though “breakthroughs in using machine learning for image recognition initially occurred in the US.” It claims that China’s advantage in this instance is because they have government-implemented mass surveillance (“clearing of regulatory barriers”), enormous government-provided stores of data (“explicit government support”) combined with private sector databases on a huge population base (“scale of consumer market”). As a consequence of this, the NSCAI argues, China is also set to leap ahead of the U.S. in both image/facial recognition and biometrics.

The document also points to another glaring difference between the U.S. and its rival, stating that: “In the press and politics of America and Europe, AI is painted as something to be feared that is eroding privacy and stealing jobs. Conversely, China views it as both a tool for solving major macroeconomic challenges in order to sustain their economic miracle, and an opportunity to take technological leadership on the global stage.”

The NSCAI document also touches on the area of healthcare, calling for the implementation of a system that seems to be becoming reality thanks to the current coronavirus crisis. In discussing the use of AI in healthcare (almost a year before the current crisis began), it states that “China could lead the world in this sector” and “this could lead to them exporting their tech and setting international norms.” One reason for this is also that China has “far too few doctors for the population” and calls having enough doctors for in-person visits a “legacy system.” It also cited U.S. regulatory measures such as “HIPPA compliance and FDA approval” as obstacles that don’t constrain Chinese authorities.

More troubling, it argues that “the potential impact of government supplied data is even more significant in biology and healthcare,” and says it is likely that “the Chinese government [will] require every single citizen to have their DNA sequenced and stored in government databases, something nearly impossible to imagine in places as privacy conscious as the U.S. and Europe.” It continues by saying that “the Chinese apparatus is well-equipped to take advantage” and calls these civilian DNA databases a “logical next step.”




Who are the NSCAI?

Given the sweeping changes to the U.S. that the NSCAI promoted in this presentation last May, it becomes important to examine who makes up the commission and to consider their influence over U.S. policy on these matters, particularly during the current crisis. As previously mentioned, the chairman of the NSCAI is Eric Schmidt, the former head of Alphabet (Google’s parent company), who has also invested heavily in Israeli intelligence-linked tech companies, including the controversial start-up “incubator” Team8. In addition, the committee’s vice-chair, Robert Work, is not only a former top Pentagon official but is also currently working with the think tank CNAS, which is run by John McCain’s long-time foreign policy adviser and Joe Biden’s former national security adviser.

Other members of the NSCAI are as follows:

Safra Catz, CEO of Oracle, with close ties to Trump’s top donor Sheldon Adelson
Steve Chien, supervisor of the Artificial Intelligence Group at Caltech’s Jet Propulsion Lab
Mignon Clyburn, Open Society Foundation fellow and former FCC commissioner
Chris Darby, CEO of In-Q-Tel (CIA’s venture capital arm)
Ken Ford, CEO of the Florida Institute for Human and Machine Cognition
Jose-Marie Griffiths, president of Dakota State University and former National Science Board member
Eric Horvitz, director of Microsoft Research Labs
Andy Jassy, CEO of Amazon Web Services (CIA contractor)
Gilman Louie, partner at Alsop Louie Partners and former CEO of In-Q-Tel
William Mark, director of SRI International and former Lockheed Martin director
Jason Matheny, director of the Center for Security and Emerging Technology, former Assistant director of National Intelligence and former director of IARPA (Intelligence Advanced Research Project Agency)
Katharina McFarland, consultant at Cypress International and former Assistant Secretary of Defense for Acquisition
Andrew Moore, head of Google Cloud AI

As can be seen in the list above, there is a considerable amount of overlap between the NSCAI and the companies currently advising the White House on “re-opening” the economy (Microsoft, Amazon, Google, Lockheed Martin, Oracle), and one NSCAI member, Oracle’s Safra Catz, is on the White House’s “economic revival” taskforce. There is also overlap between the NSCAI and the companies that are intimately involved in the implementation of the “contact tracing” “coronavirus surveillance system,” a mass surveillance system promoted by the Jared Kushner-led, private-sector coronavirus task force. That surveillance system is set to be constructed by companies with deep ties to Google and the U.S. national security state, and both Google and Apple, who create the operating systems for the vast majority of smartphones used in the U.S., have said they will now build that surveillance system directly into their smartphone operating systems.

Also notable is the fact that In-Q-Tel and the U.S. intelligence community have considerable representation on the NSCAI, and that they also boast close ties with Google, Palantir and other Silicon Valley giants, having been early investors in those companies. Both Google and Palantir, as well as Amazon (also on the NSCAI), are also major contractors for U.S. intelligence agencies. In-Q-Tel’s involvement on the NSCAI is also significant because it has been heavily promoting mass surveillance of consumer electronic devices for use in pandemics for the past several years. Much of that push has come from In-Q-Tel’s current Executive Vice President Tara O’Toole, who was previously the director of the Johns Hopkins Center for Health Security and also co-authored several controversial biowarfare/pandemic simulations, such as Dark Winter.

In addition, since at least January, the U.S. intelligence community and the Pentagon have been at the forefront of developing the U.S. government’s still-classified “9/11-style” response plans for the coronavirus crisis, alongside the National Security Council. Few news organizations have noted that these classified response plans, which are set to be triggered if and when the U.S. reaches a certain number of coronavirus cases, have been created largely by elements of the national security state (i.e. the NSC, Pentagon, and intelligence agencies), as opposed to civilian agencies or those focused on public health issues.

Furthermore, it has been reported that the U.S. intelligence community as well as U.S. military intelligence knew by at least January (though recent reports have said as early as last November) that the coronavirus crisis would reach “pandemic proportions” by March. The American public were not warned, but elite members of the business and political classes were apparently informed, given the record numbers of CEO resignations in January and several high-profile insider trading allegations that preceded the current crisis by a matter of weeks.

Perhaps even more disconcerting is the added fact that the U.S. government not only participated in the eerily prescient pandemic simulation last October known as Event 201, it also led a series of pandemic response simulations last year. Crimson Contagion was a series of four simulations that involved 19 U.S. federal agencies, including intelligence and the military, as well as 12 different states and a host of private sector companies that simulated a devastating pandemic influenza outbreak that had originated in China. It was led by the current HHS Assistant Secretary for Preparedness and Response, Robert Kadlec, who is a former lobbyist for military and intelligence contractors and a Bush-era homeland security “bioterrorism” advisor.

In addition, both Kadlec and the Johns Hopkins Center for Health Security, which was intimately involved in Event 201, have direct ties to the controversial June 2001 biowarfare exercise “Dark Winter,” which predicted the 2001 anthrax attacks that transpired just months later in disturbing ways. Though efforts by media and government were made to blame the anthrax attacks on a foreign source, the anthrax was later found to have originated at a U.S. bioweapons lab and the FBI investigation into the case has been widely regarded as a cover-up, including by the FBI’s once-lead investigator on that case.

Given the above, it is worth asking if those who share the NSCAI’s vision saw the coronavirus pandemic early on as an opportunity to make the “structural changes” it had deemed essential to countering China’s lead in the mass adoption of AI-driven technologies, especially considering that many of the changes in the May 2019 document are now quickly taking place under the guise of combatting the coronavirus crisis.



The NSCAI vision takes shape

Though the May 2019 NSCAI document was authored nearly a year ago, the coronavirus crisis has resulted in the implementation of many of the changes and the removal of many of the “structural” obstacles that the commission argued needed to be drastically altered in order to ensure a technological advantage over China in the field of AI. The aforementioned move away from cash, which is taking place not just in the U.S. but internationally, is just one example of many.

For instance, earlier this week CNN reported that grocery stores are now considering banning in-person shopping and that the U.S. Department of Labor has recommended that retailers nationwide start “‘using a drive-through window or offering curbside pick-up’ to protect workers from exposure to coronavirus.” In addition, last week, the state of Florida approved an online-purchase plan for low-income families using the Supplemental Nutrition Assistance Program (SNAP). Other reports have argued that social distancing inside grocery stores is ineffective and endangering people’s lives. As previously mentioned, the May 2019 NSCAI document argues that moving away from in-person shopping is necessary to mitigate China’s “adoption advantage” and also argued that “when buying online is literally the only way to get what you want, consumers go online.”

Reports have also argued that these changes in shopping will last far beyond coronavirus, such as an article by Business Insider entitled “The coronavirus pandemic is pushing more people online and will forever change how Americans shop for groceries, experts say.” Those cited in the piece argue that this shift away from in-person shopping will be “permanent” and also states that “More people are trying these services than otherwise would have without this catalyst and gives online players a greater chance to acquire and keep a new customer base.” A similar article in Yahoo! News argues that, thanks to the current crisis, “our dependence on online shopping will only rise because no one wants to catch a virus at a shop.”

In addition, the push towards the mass use of self-driving cars has also gotten a boost thanks to coronavirus, with driverless cars now making on-demand deliveries in California. Two companies, one Chinese-owned and the other backed by Japan’s SoftBank, have since been approved to have their self-driving cars used on California roads and that approval was expedited due to the coronavirus crisis. The CPO of Nuro Inc., the SoftBank-backed company, was quoted in Bloomberg as saying that “The Covid-19 pandemic has expedited the public need for contactless delivery services. Our R2 fleet is custom-designed to change the very nature of driving and the movement of goods by allowing people to remain safely at home while their groceries, medicines, and packages are brought to them.” Notably, the May 2019 NSCAI document references the inter-connected web of SoftBank-backed companies, particularly those backed by its largely Saudi-funded “Vision Fund,” as forming “the connective tissue for a global federation of tech companies” set to dominate AI.

California isn’t the only state to start using self-driving cars, as the Mayo Clinic of Florida is now also using them. “Using artificial intelligence enables us to protect staff from exposure to this contagious virus by using cutting-edge autonomous vehicle technology and frees up staff time that can be dedicated to direct treatment and care for patients,” Kent Thielen, M.D., CEO of Mayo Clinic in Florida stated in a recent press release cited by Mic.

Like the changes to in-person shopping in the age of coronavirus, other reports assert that self-driving vehicles are here to stay. One report published by Mashable is entitled “It took a coronavirus outbreak for self-driving cars to become more appealing,” and opens by stating “Suddenly, a future full of self-driving cars isn’t just a sci-fi pipe dream. What used to be considered a scary, uncertain technology for many Americans looks more like an effective tool to protect ourselves from a fast-spreading, infectious disease.” It further argues that this is hardly a “fleeting shift” in driving habits and one tech CEO cited in the piece, Anuja Sonalker of Steer Tech, claims that “There has been a distinct warming up to human-less, contactless technology. Humans are biohazards, machines are not.”

Another focus of the NSCAI presentation, AI medicine, has also seen its star rise in recent weeks. For instance, several reports have touted how AI-driven drug discovery platforms have been able to identify potential treatments for coronavirus. Microsoft, whose research lab director is on the NSCAI, recently put $20 million into its “AI for health” program to speed up the use of AI in analyzing coronavirus data. In addition, “telemedicine” – a form of remote medical care – has also become widely adopted due to the coronavirus crisis.

Several other AI-driven technologies have similarly become more widely adopted thanks to coronavirus, including the use of mass surveillance for “contact tracing” as well as facial recognition technology and biometrics. A recent Wall Street Journal report stated that the government is seriously considering both contact tracing via phone geolocation data and facial recognition technology in order to track those who might have coronavirus. In addition, private businesses – like grocery stores and restaurants – are using sensors and facial recognition to see how many people and which people are entering their stores.
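
For the technically inclined, here is a minimal sketch of what geolocation-based contact tracing involves in principle: comparing two phones’ location pings for proximity in both space and time. This is purely illustrative Python; the two-metre and fifteen-minute thresholds, the data layout and the function names are assumptions made for illustration, not any agency’s actual system.

# Minimal sketch of geolocation-based "contact tracing" as described above:
# two phones count as a "contact" if their pings fall within ~2 m and 15 min.
# Thresholds, data layout and function names are illustrative assumptions only.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))

def find_contacts(pings_a, pings_b, max_m=2.0, max_s=900):
    """pings: lists of (unix_ts, lat, lon). Returns timestamp pairs judged 'in contact'."""
    contacts = []
    for ta, la, loa in pings_a:
        for tb, lb, lob in pings_b:
            if abs(ta - tb) <= max_s and haversine_m(la, loa, lb, lob) <= max_m:
                contacts.append((ta, tb))
    return contacts

if __name__ == "__main__":
    a = [(1000, 40.7128, -74.0060)]
    b = [(1300, 40.712801, -74.006001)]  # roughly 0.14 m away, 5 minutes later
    print(find_contacts(a, b))  # -> [(1000, 1300)]

Even this toy version makes the privacy stakes obvious: the input is nothing less than a complete, timestamped record of where each phone has been.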

As far as biometrics go, university researchers are now working to determine if “smartphones and biometric wearables already contain the data we need to know if we have become infected with the novel coronavirus.” Those efforts seek to detect coronavirus infections early by analyzing “sleep schedules, oxygen levels, activity levels and heart rate” based on apps and devices like Fitbit and smartwatches. In countries outside the U.S., biometric IDs are being touted as a way to track those who have immunity to coronavirus and those who lack it.
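
To give a concrete sense of the kind of analysis those researchers describe, here is a minimal illustrative sketch in the same spirit: flagging days when resting heart rate runs well above a personal rolling baseline. The thirty-day window, the two-sigma threshold and the data format are assumptions made for illustration, not the universities’ actual methodology.

# Minimal sketch of the kind of anomaly check such studies describe:
# flag days where resting heart rate (RHR) runs well above a personal baseline.
# Window size, threshold and data format are illustrative assumptions only.
from statistics import mean, stdev

def flag_elevated_days(daily_rhr, baseline_days=30, z_threshold=2.0):
    """daily_rhr: list of (date_str, resting_heart_rate) tuples, oldest first.
    Returns dates whose RHR exceeds the rolling baseline by z_threshold sigmas."""
    flagged = []
    for i in range(baseline_days, len(daily_rhr)):
        window = [bpm for _, bpm in daily_rhr[i - baseline_days:i]]
        mu, sigma = mean(window), stdev(window)
        date, bpm = daily_rhr[i]
        if sigma > 0 and (bpm - mu) / sigma > z_threshold:
            flagged.append(date)
    return flagged

if __name__ == "__main__":
    # 30 ordinary days around 60 bpm, then a suspicious jump to 72 bpm
    history = [(f"day{i}", 60 + (i % 3)) for i in range(30)] + [("day30", 72)]
    print(flag_elevated_days(history))  # -> ['day30']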

In addition, one report in The Edge argued that the current crisis is changing what types of biometrics should be used, asserting that a shift towards thermal scanning and facial recognition is necessary:

"At this critical juncture of the crisis, any integrated facial recognition and thermal scanning solution must be implemented easily, rapidly and in a cost-effective manner. Workers returning to offices or factories must not have to scramble to learn a new process or fumble with declaration forms. They must feel safe and healthy for them to work productively. They just have to look at the camera and smile. Cameras and thermal scanners, supported by a cloud-based solution and the appropriate software protocols, will do the rest."

Also benefiting from the coronavirus crisis is the concept of “smart cities,” with Forbes recently writing that “Smart cities can help us combat the coronavirus pandemic.” That article states that “Governments and local authorities are using smart city technology, sensors and data to trace the contacts of people infected with the coronavirus. At the same time, smart cities are also helping in efforts to determine whether social distancing rules are being followed.”

That article in Forbes also contains the following passage:

"…[T]he use of masses of connected sensors makes it clear that the coronavirus pandemic is–intentionally or not–being used as a testbed for new surveillance technologies that may threaten privacy and civil liberties. So aside from being a global health crisis, the coronavirus has effectively become an experiment in how to monitor and control people at scale."

Another report in The Guardian states that “If one of the government takeaways from coronavirus is that ‘smart cities’ including Songdo or Shenzhen are safer cities from a public health perspective, then we can expect greater efforts to digitally capture and record our behaviour in urban areas – and fiercer debates over the power such surveillance hands to corporations and states.” There have also been reports asserting that typical cities are “woefully unprepared” to face pandemics compared to “smart cities.”

Yet, beyond many of the NSCAI’s specific concerns regarding mass AI adoption being conveniently resolved by the current crisis, there has also been a concerted effort to change the public’s perception of AI in general. As previously mentioned, the NSCAI had pointed out last year that:

"In the press and politics of America and Europe, Al is painted as something to be feared that is eroding privacy and stealing jobs. Conversely, China views it as both a tool for solving major macroeconomic challenges in order to sustain their economic miracle, and an opportunity to take technological leadership on the global stage."

Now, less than a year later, the coronavirus crisis has helped spawn a slew of headlines in just the last few weeks that paint AI very differently, including “How Artificial Intelligence Can Help Fight Coronavirus,” “How AI May Prevent the Next Coronavirus Outbreak,” “AI Becomes an Ally in the Fight Against COVID-19,” “Coronavirus: AI steps up in battle against COVID-19,” and “Here’s How AI Can Help Africa Fight the Coronavirus,” among numerous others.

It is indeed striking how the coronavirus crisis has seemingly fulfilled the NSCAI’s entire wishlist and removed many of the obstacles to the mass adoption of AI technologies in the United States. Like major crises of the past, the national security state appears to be using the chaos and fear to promote and implement initiatives that would normally be rejected by Americans and, if history is any indicator, these new changes will remain long after the coronavirus crisis fades from the news cycle. It is essential that these so-called “solutions” be recognized for what they are and that we consider what type of world they will end up creating – an authoritarian technocracy. We ignore the rapid advance of these NSCAI-promoted initiatives and the phasing out of so-called “legacy systems” (and with them, many long-cherished freedoms) at our own peril.





Whitney Webb is a MintPress News contributing journalist based in Chile. She has contributed to several independent media outlets including Global Research, EcoWatch, the Ron Paul Institute and 21st Century Wire, among others. She has made several radio and television appearances and is the 2019 winner of the Serena Shim Award for Uncompromised Integrity in Journalism.




Sunday, 1 September 2019

The Hypermodernity of Individuals and Identity Reduced to Dividuals and Data

How PC Outrage and Cancel Culture actually work for Accelerationist Capital and Technology

by James Albert Barr






"Capital follows you when you dream. Time ceases to be linear, becomes chaotic, broken down to punctiform divisions. As production and distribution are restructured, so are nervous systems. To function effectively as a component of just-in-time production you must develop a capacity to respond to unforeseen events, you must learn to live in conditions of total instability, or 'precarity', as the ugly neologism has it. Periods of work alternate with periods of unemployment. Typically, you find yourself employed in a series of short-term jobs, unable to plan for the future." - Mark Fisher: "Capitalist Realism: Is There No Alternative?" (2009)

"With the internet as its nervous system, the world's connected cell-phones and sensors as its sense organs, and data centers as its brain, the 'whatever' will hear everything, see everything, and be everywhere at all times. the only rational word to describe that 'whatever', is 'god' - and the only way to influence a deity is through prayer and worship." - Anthony Levandoski on Wired magazine

"The mobile phone industry is the back-bone of the global brain that is being put together." - Rick Wiles

"You know what they say the modern version of Pascal's Wager is? Sucking up to as many Transhumanists as possible, just in case one of them turns into God." - Greg Egan: "Crystal Nights" (2009)




In a key early scene from 1999's zeitgeist-defining cyberpunk film, The Matrix, we see Neo encountering Morpheus for the first time. Morpheus begins by asking Neo a very direct question: "Do you believe in fate?" Neo answers in the negative, stating that he "didn't like the idea that he wasn't in control of his life." Agreeing that he knows exactly what Neo means, Morpheus goes on to tell him this: "Let me tell you why you are here. You're here because you know something. What you know you can't explain, but you feel it. You've felt it your entire life, that there's something wrong with the world. You don't know what it is, but it's there, like a splinter in your mind, driving you mad. It is this feeling that has brought you to me. Do you know what I'm talking about?" And of course Neo answers immediately with, "The Matrix".

Morpheus then proceeds to tell Neo what The Matrix is: "The Matrix is everywhere. It is all around us. Even now, in this very room. You can see it when you look out your window or when you turn on your television. You can feel it when you go to work. When you go to church. When you pay your taxes. It is the world that has been pulled over your eyes to blind you from the truth." Neo then asks the obvious, "What truth?", to which Morpheus ominously answers, "That you are a slave, Neo. Like everyone else you were born into bondage. Born into a prison that you cannot smell or taste or touch. A prison for your mind!"

The Matrix became a huge box-office hit in 1999, and shortly after, a pop-cultural sensation, blowing the minds of many in its fast-growing, ardent fandom, and provoking much philosophical discussion and analysis - both astute and hare-brained - regarding the film and its profound implications. Others, however, simply saw it as a cool, slick piece of Hollywood entertainment and that's all: consumable, disposable and ultimately innocuous commercial fodder. The notion of being a "red pill person" or a "blue pill person" entered the social consciousness and vernacular not long after 9/11 happened, interestingly enough.

Moreover, not long after social media exploded onto the cultural scene, as the Digital Age became more and more pervasive in the lives of all those spending more and more time on the internet after 2004 - "connecting" with people on a new social networking service called Facebook, downloading their music as MP3s onto iPods, and partaking in proliferating on-line gaming sites - the idea of what was once generally accepted as "consensus reality" began to break down and fragment, rendering it far more "subjective" than "objective". This was facilitated, unwittingly or not, by new stringent measures placed on many of the previous "freedoms" that Americans, especially, but also Canadians and Europeans, had inalienably enjoyed since the Second World War. This, thanks solely to the fallout of September 11, 2001, with the introduction of the Patriot Act in October 2001 and the Homeland Security Act of November 2002, for instance, and an ever-widening division in the populace, politically, socially and culturally, especially between active Democrats/liberals and Republicans/conservatives. Commercial flying became an inculcated and standardized nightmare, because, at this point, anyone could be a terrorist threat to national security. A Kafkaesque presumption of guilt absorbed itself, inexorably, into the collective consciousness and became "normalized".

This so-called sense of "guilt" was psychologically sublimated onto, and into, the minds of the American people through George W. Bush's coordinated mantra, "You're either with us or against us", regarding the "necessary measures" implemented by the American government to crack down on terrorism by flushing out all "evil doers" and the "axis of evil" who represented the very antithesis of America's "values and identity". For a little while, this fear-mongering rhetoric unified most Americans, regardless of political party affiliation. In other words, most everyone behaved and did their patriotically-charged due diligence, i.e. continued to "shop" and be consumers of freedom, like George W. encouraged them to be.

However, when it looked as though Saddam Hussein did not, in fact, have "weapons of mass destruction", like W. and his right-hand men, Dick Cheney (who was really in charge) and Donald Rumsfeld, adamantly claimed he did, things began to sour in the House of Bush Jr. during his second term, thus clearing the way for an inevitable Democratic victory for Barack Obama in the 2008 presidential election. And what parasitically latched onto Obama's promise of "change" was that same pesky sense of "collective guilt" that W. exploited, from the right, during his presidency; only this time it morphed into "politically-correct guilt", from the left, with the sudden spread of identity politics, intersectionality, gender dysphoria, otherkin, systemic racism, easily triggered emotionalism, toxic masculinity and 4th-wave feminism. Initially, these highly politically-charged concepts were predominantly relegated to university and college campus curricula and syllabi, where they had been gaining more and more momentum since the early 90s, when postmodernism was entering its zenith, its final phase. And, all the while, Gen Xers were beginning to have kids, and those kids were being subjected to all kinds of "helicopter parenting", self-esteem counselling, Ritalin prescriptions, and the benefits of government-funded programs like No Child Left Behind, resulting in many a "participation trophy" being won, and bicycle helmet sales going through the roof.

Unbeknownst to said parents, and society at large, the vast majority of those kids, whom we know as Millennials, were developing a skewed sense of self, both in their own world and in the world in general; many of them were growing up with an inflated, unrealistic, humorless and narcissistic sense of personal entitlement. A marked cross-section of them, coming from economically privileged backgrounds and liberal environments (ironically, despite having "progressive parents" who smothered, or at least greatly limited, their early development and subjective experiences), began going to university or college by the end of the 2000s, highly susceptible to the power of political and philosophical suggestion, unlike the generations that came before them, which didn't have the convenient benefit of in-real-time social media dissemination of opinion and perception. And it was here that the Millennial generation, like a perfect, sociologically-determined storm, "intersected" with the aforementioned identity politics et al, which had become a pervasive fixture on university campuses over the previous few decades, coincidentally enough. Like a kind of "politicorticulture", this Millennial generation seems as if it was, in supremely calculated fashion, grown from a crop of seeds into the current politically-charged harvest: an easily offended electorate class that has apparently taken most of contemporary culture hostage, with many MSM outlets, liberal Hollywood and corporate institutions like Disney, Google, Twitter and Facebook happily toeing the "woke" line as "gatekeepers", under the seeming guise of "social justice", cultural diversity and sexual/gender equality. But to what end, ultimately? It's not for a social justice utopia of safe-space equality for all, I can tell you that! Capital, I suspect, has another "Utopian vision" in its panoptic, surveilling cross-hairs, or rather, algorithms.





Consider the consequential ramifications of the secret Jekyll Island meeting of 1910, featuring the participation of several of the richest men in the world at that time, which resulted in the Federal Reserve Act of 1913 and initiated what would become the Haves and the Have-Nots bridged by the newly developing "middle class". Consider John Maynard Keynes's economics; a highly profitable, but devastating, Great War; the notions of "the bewildered herd", "the phantom public", "cold war" and the cultural "stereotype", all of which were coined by "the Father of Modern Journalism", Walter Lippmann, in his exceedingly influential books, "Public Opinion" (1922) and "The Phantom Public" (1925); and Edward Bernays' propaganda-cum-public relations innovations. What's crucial is that what once was the banking system, "the machine", eventually became the system itself, the cybernetic program, "the matrix", when economic computation and digital data networks of information coincided with 70s and 80s computer software development. Crony capitalism and Reaganomics were all the rage in the economic boom of the 80s, thanks in great part to the events of October 6, 1979, as we entered the post-Fordist era and forever changed working environments and conditions. As Mark Fisher elucidated in his important 2009 book, "Capitalist Realism: Is There No Alternative?":

"According to Marxist economist Christian Marazzi, the switch from Fordism to post-Fordism can be given a very specific date: October 6, 1979. It was on that date that the Federal Reserve increased interest rates by 20 points, preparing the way for 'supply-side economics' that would constitute the 'economic reality' in which we are now enmeshed. The rise in interest rates not only contained inflation, it made possible a new organization of the means of production and distribution.. The 'rigidity' of the Fordist production line gave way to a new 'flexibility', a word that will send chills of recognition down the spine of every worker today. This flexibility was defined by a deregulation of Capital and labor, with the workforce being casualized (with an increasing number of workers employed on a temporary basis), and outsourced." 

Perception is everything in our industrial-cum-digital world, especially for those in positions of power, economically and culturally speaking. And it has been necessary to manufacture and maneuver perception(s) through an increasingly complex society, culture and civilization. This immense process takes a lot of planning, calculating and execution, with many, many variables, contingencies and nuances to consider. It takes a great mind, or set of minds, to successfully maintain a specific perception of the world that is not only accepted, but welcomed, and even defended, by those who, under less manipulated conditions, would perhaps be aghast at the thought that they were being deliberately duped on a daily basis. But then again, some dispositions seem predisposed to welcome an illusory life of servitude to an economic system, national identity and symbolic order in general. Therein perhaps lies the rub, in terms of ever achieving a wholly unified populace collectively rejecting an eventually transparent world of manipulation and exploitation perpetrated by a few elite factions, who have traditionally inherited their powerful positions and riches.

The time-honored human frailty known as fear remains the greatest weapon wielded against us. We are socially tribal by nature (despite SJWs attempting to redefine what "human nature" actually is now) and we all have within us the fear of being rejected by others, even of being ostracized and excommunicated. This very fear is now being exploited by social justice agendas, victimhood mentality and identity politics in the corrosive form of "outrage and cancel culture", where "deplorables" (to use Hillary Clinton's cringe-inducing, sanctimonious nomenclature) are now being deplatformed, censored and banned outright on social media, doxxed, demonetized on YouTube and essentially publicly shamed wherever they attempt to initiate an indiscriminately open discussion and/or debate. Far-left identitarians appear to be willfully immune to any logic, reason and fact-based arguments, evidently taking a strategic page right out of radical community organizer Saul Alinsky's playbook for effective radical activism: "Conservatives have a tendency to try to win every debate with logic and recitations of facts which all too often fail to get the job done because emotions and mockery are often just as effective as logic." That remarkable sentence could very well be ground zero for the birth of the contemporary "social justice warrior".

But the mere necessity for such immature and childish tactics belies any notion of tenable justification under what we understand, generally, to be a civil society of coexistence and cooperation. And this is why a generation of unqualified entitlement, socially stunted, knee-jerk emotionalism, and utter lack of self-awareness was so crucial to develop and deploy on the political/world stage. Indeed, we have verily entered the "clown world" and "upside-down" of confusion and lunacy, where a law can be passed that grants a transgender male "the right" to have an abortion, regardless of not actually being biologically equipped with a uterus. This should be called out for what it is: mental illness, and "enabling mental illness" at that. But what was previously adjudicated a mental illness by the APA, such as gender dysphoria, has since been overturned and, furthermore, pushed onto the general public to be accepted and seamlessly incorporated into the culture and society at large. In Canada, Bill C-16 was passed by Parliament in 2017 to amend the Canadian Human Rights Act and Criminal Code, adding gender expression and gender identity as "protected grounds" within the Act. The traditional gender roles of men and women are being categorically reversed, where young men now appear effeminate and easily emasculated (some are even wearing unironic t-shirts that say "Beta Cuck 4 Life"!), while, conversely, women are becoming more masculine and socially aggressive. Surely this is sheer madness on an increasingly mass scale!

The Digital Age, neoliberal late capitalism and the rampant acceleration of technology have evidently played a critical role in our present identity politics and culture war crisis. Yet another cultural phenomenon, happening on a lesser scrutinized scale, is what is called "otherkin". More and more people, typically under the age of forty, are declaring themselves "other" than their birth species. For instance, some now identify as "wolfkin" or "bearkin" or "deerkin" or "serpentkin", or even take on alternative identities emanating from fantasy or myth, like "elfkin" or "dragonkin" or "wizardkin". There's even a subculture called "Furries", who dress up, akin to cosplayers, as their favorite, personally identified furry creature. And much like people with gender dysphoria, otherkin are clamoring for social acceptance and legal acknowledgement within society, safe from ridicule and discrimination. There is real evidence here to suggest that humanity, certainly in the West, is regressing dramatically from generally well-adjusted individuals and mature adults to increasingly depressive, mentally ill, infantilized, data-processed dividuals, as Gilles Deleuze proclaimed in his 1990 essay, "Postscript on the Societies of Control":

"The numerical language of control is made of codes that mark access to information, or reject it. We no longer find ourselves dealing with the mass/individual pair. Individuals have become 'dividuals' and masses, samples, data, markets, or 'banks'. Perhaps it is money that expresses the distinction between the two societies best, since discipline (societies) always referred back to minted money that locks gold as numerical standard, while control (societies) relates to floating rates of exchange, modulated according to a rate established by a set of standard currencies. The old monetary mole is the animal of the space of enclosure, but the serpent is that of the societies of control. We have passed from one animal to the other, from the mole to the serpent, in the system under which we live, but also in our manner of living and in our relations with others. The disciplinary man was a continuous producer of energy, but the man of control is undulatory, in orbit, in a continuous network. Everywhere surfing has already replaced the older sports."

We are systematically being reduced and divided into algorithmic bits and bytes, mere information devoid of any real knowledge and "authentic being" in the Heideggerian sense. As the father of cybernetics, Norbert Wiener, said, "Information is information, not matter or energy." Our ontological experience is now an on-line avatar drained of nearly any identifiable, visceral humanity; wherever you happen to be located, you are ubiquitously connected to the matrix/hypermodern electronic exosphere enclosed around the entire globe, a smart phone on your so-called "person" at all times. This is why so many "people" feel so disconnected from others, despite how many social media sites they frequent daily, and the moment-to-moment texts sent and received. As Peter Sloterdijk concluded, we are "foam", separate little bubbled worlds rubbing up against other bubbled worlds, only connected by the electronic membrane of the matrix that constitutes our "world interior". Wherever we are, it is. Mark Fisher called it Capitalist Realism. It is all around us, the very air we breathe. This is now our Hypermodern "reality".




According to John David Ebert and Brian Francis Culkin, in their collaborative new book, "Hypermodernity and the End of the World", postmodernism officially ended on September 11, 2001, when hypermodernity began, at least politically; culturally and technologically, it began in 1995, with the commercial advent of the internet upon the release of Windows 95. I'm inclined to agree with the 9/11 commencement, as postmodernism was still very much a thing right up to the millennium, and at least residually for the first few years of the 21st century, ultimately dissipating completely by the time of the 2008 economic crash. We've been wholly in a hypermodern state ever since. One of the main differences between postmodernity and hypermodernity, again according to Ebert and Culkin, is that the media of postmodernity had all been analogue - records, cassette tapes, photographs, magazines, celluloid - while the media of hypermodernity are exclusively digital:

"With the satellization of the Exosphere, the analogue telephone became transformed into the cell phone and later the smart phone, which jacked the individual into the World Interior from wherever on the planet he or she happened to be located. One didn't have to go anywhere to be included in the new Hypermodern World Interior. All analogue photographs could be dissolved from their nitrate surfaces and melted into cyberspace directly from the brain as the camera became an appendage of the World Interior. Vinyl records were dissolved and liquefied, and celluloid films transformed into bytes that obsolesced the movie projector. All analogue media were liquefied, dissolved and fed into the new matrix."

The Hypermodern Digital Age has utterly liquefied most everything within its all-encompassing mainframe, rendering it all as free-flowing information/data and pure Capital. It's no wonder so many "dividuals" are confusingly identifying with just about anything their unhinged narcissism becomes attracted to. In a recent study conducted by researchers from Idaho State University, College of the Canyons, and the Center for Positive Sexuality in Los Angeles - a paper entitled "Do We Practice What We Preach? Real Vampires' Fear of Coming Out of the Coffin to Social Workers and Helping Professionals", which focused primarily on the growing "otherkin" and alternative-identity community of "real vampires" (I kid you not) - the researchers opined, "...it seems that rapid advances in technology provide a social environment conducive to the development of unique and unconventional identities. We should not be surprised to see a proliferation of nontraditional identities in the future."

They weren't kidding, in a manner of speaking, ironically. This definite proliferation of innumerable identities, regardless of the traditional parameters of reality, healthy maturity and political conviction, has been symptomatic of what has happened to language itself, that which ultimately constitutes the world, the Symbolic Order, in the Lacanian sense. Language, and its users, have unwittingly deteriorated into semiotic chaos and excess, where, in Hypermodernity, "too much is never enough". And this "language in chaos" appears to be affecting everyone, regardless of race, gender, politics, cultural identity, beliefs, ethics and values. Neoliberal late capitalism may very well be dragging us all towards the Singularity, which Elon Musk has been incessantly warning us about, where all meaningful differences will be rendered obsolete, as well as any discernible humanity, thus ushering in what Vernor Vinge predicted back in 1993: the post-human age. Is this what the far-left identitarians and SJWs are unrelentingly, and ultimately, fighting for? Because this is where we're headed, folks.