Since the industry’s pivot to peacetime in the late 1940s, computers have come to constitute our modern material environment literally, metaphorically, and aesthetically. Like a store-front window marketing display by L. Frank Baum, the ground-floor, street-facing showroom of IBM in the 1950s offered New Yorkers (regardless of outdoor conditions) a brightly lit, unchanging view into the timeless, abstract world of computing outside our own (Harwood, 46). Engineers in white lab coats moved spindles of tape and decks of cards, literally working inside the computer as they bused data between large, shiny cabinets for curious onlookers twenty-four hours a day.

And there went everybody. The nature of the data being processed—demographic, financial, the results of opinion polling and sales data and audience testing and tracking—placed newly-minted “consumers” even deeper inside of computers. Not as components within the computer’s functioning, but as the content and subject of its very processing within model worlds. Through careful measurement and calculation, consumers’ free and self-determined choices are ever-more predicted, guided, and otherwise watched over by these machines of loving grace. For the more conformist of these subjects, the Hegelian Absolute was long ago captured and nudged into convergence upon the statistician’s single, normative wavy line (McLuhan, Book I, 11). This abstract, mean consensus defines a collectively chosen, synthetic, cradle-to-grave universe and circumscribes all the water-cooler talk and sensation-on-tap occurring in between.

And then the computers moved from the labs to the living rooms, and now to the front pocket. More immediately reactive interfaces, working in “real time,” placed the human subject even deeper into their very own “personal” computers. Gradually, the bulk of a “user’s” creative and logistical information moved from paper to circuitry and ferrous film through their own labour of data entry. The offline, locally-processing machines of the 80s and 90s offered only a short historical respite from the control of massive corporate, government, and research-institute systems. Through games, avatars, and online accounts, humanity once again moved deeper into the machine. Now more and more facets of (dis-)embodied being itself were moved into virtuality. The chief benefit of being “always online” became being always tracked, data-mined, and analyzed for possession by our ever-fuller and more complex “digital twins” (de Kerckhove).

At this point, the central question in Rodney Ascher’s 2021 film A Glitch In The Matrix—at least in its most abstract sense—becomes all-too-trivial: do we live in a computer simulation? In the above manner of speaking, the answer is, of course, unequivocally yes.

However, you’ll find that Glitch—which premiered at the Sundance Film Festival last month—is the furthest thing possible from a dry etiology of today’s hottest variant on the Truman Show syndrome. It is instead a visceral, personal, empathetic rendition of what happens when this question of living in a computer simulation inevitably escapes the scope of its technical premise.

Glitch answers the question, “When secular, material man (at least, they’re all men in this film) seeks his maker, to what cosmological and metaphysical reaches does the urgency of his quest—and his dearth of cultural knowledge there-pertaining—push his speculations beyond the mark?” My own explorations of the subject suggest the corollary, “At what psychological and spiritual cost have black-boxed, opaque, ‘easy-to-use’, artificially obsolesced, disposable consumer technologies, sold to us under the false premises of corporate propaganda, extracted their rents?”

If only Maury Povich would pull those truly responsible—let’s say some embarrassed marketer in a gaudy tweed jacket, a structuralist anthropologist with always-observant eyes incessantly darting across the audience, and a bespectacled and befuddled cyberneticist dragging behind him a trail of fan-folded, tractor-fed printer paper—out from behind the veil of the temple and declare them the real fathers!

No. The film’s answer lies in the origins of the fantasies which have filled religion-sized holes in self-awareness: science fiction movies, new-age techno-shamans, half-remembered high-school philosophy class, and curiously—if only by allusion—the 18th-century visionary artist William Blake. Ascher is a master of his craft, and with this film accepts a challenge no slighter than that of a modern creation story. There are only homeopathic levels of reference which might place it within any larger streams of culture, philosophy, and religion predating the lifetimes of its young interview subjects. It is this calculated omission which provides the negative space for the film’s subjectivity to unfurl, beginning with its self-contained, contemporary zeitgeist. Paintings by Blake and a twice-seen image of Vishnu aside, the film’s success in arguing its premise comes more through what is left out than what is brought in as evidence.

At the very least, the audacity of the film’s ambition—to give a fair hearing to the speculation that our reality is not “base reality”, and that we live inside a giant computer—will surely entertain while failing to persuade most viewers. This would have been true of a far lesser film given the very novelty and fascinating nature of the question being probed.

Far more interesting, however, is Ascher’s ambition in going beyond the already out-there premise, and the risks taken in going there. No small part of this film’s sensational effects are the ethical concerns it dramatically raises—and persuasively argues—but to which it couldn’t possibly provide closure.

These darker lessons of the film are unfinished once the credits roll, and so must be addressed here and elsewhere as the effects of ubiquitous computing on the canaries in PLATO’s coal-mine inevitably spread. This film will certainly convince you of this much: many people out there will take simulation theory seriously enough to join the true believers Ascher interviews. And—as the film takes great dramatic effort to demonstrate—the belief that most of your flesh-and-blood human fellows are in fact NPCs (the non-player characters of tabletop and video role-playing games) may entail a wanton disregard for the value of their lives.

Ironically, the take-away of this lesson may not be the one the film goes through the motions to performatively address: that simulation-believers ought to still logically respect and care for the value of life even if they suspect that life doesn’t really exist. The real lesson, far more pernicious, could be one not for the believers, but the audience who doesn’t believe in the film’s metaphysical premise.

This film could, inadvertently, just as well be teaching the skeptical, “base-reality”-dwelling audience member something worrisome regarding the motivations and psychology of those who commit atrocities: that they’re all just hopeless simulation believers; that each and every mass shooter must be psychologically broken, trapped in their own unreachable universe, pathologically “othering” everyone except themselves, turned into monsters wholly unrelatable except on those terms.

Let me use an example from the film. One expert discussing the Christchurch shooter (who live-streamed his murders with a body camera) asserts that, since his actions are reminiscent of a violent video game, his devaluing of his victims’ lives was likely related to his playing of violent video games. Like a synecdoche for the film’s overall strategy, this assertion is only plausible as a final analysis to someone who had no other points of reference for contemplating the potential motivations of this particular shooter as a real human being, along with all the complexity that entails.

Let’s be clear: what I mean is that the argument that solipsism—the state of skepticism regarding the reality of anyone else—may cause people to devalue human life and hurt other people should not be run in reverse. People who hurt and kill other people cannot be routinely dismissed, as a matter of course, as dehumanizing solipsists. Murderers—even mass murderers—can and no doubt often do believe in the humanity of those they kill while killing them. They need only have higher priorities than respecting the sanctity of life—we don’t, after all, assume that everyone who sacrifices their own life for a higher cause hated themselves!

Our mature capability to understand and relate to the full breadth of human nature requires our rejection of the ironically dehumanizing psychological tactic of assuming all mass murderers see their victims as non-human. This tactic merely serves to preempt a discomforting self-appraisal of our own darkest capabilities. As unpleasant as it may be, it is much healthier—when we are strong enough to contemplate the all-too-human evils which we see on the news or may even witness in our own lives—to appreciate every possible interpretation of the 16th-century martyr John Bradford’s utterance “There, but for the Grace of God, go I.” (If the word “martyr” didn’t tip you off, God’s grace did eventually run out for Bradford.)

Perhaps the above few paragraphs strike you as absurdly serious for a silly film review about eccentric geeks who play too many video games, or as insufficiently rigorous for the gravity of their subject matter. I’m sorry to have gone all heavy on you, and more so for falling short. Let my inadequacy illustrate, though, precisely the same inadequacy in the film. This is an inevitable limitation in a movie as audacious in its ambitions as Glitch. Your reviewer didn’t want to go there, but the movie went there first! And ultimately I’m grateful that it did, for, as is slowly becoming clear in the age of cybernetics and virtuality, there go we all—in spite of our inadequacies.

Let’s be real. Rodney Ascher has attempted to retroactively synthesize, in film, an earnest, rigorous, relatable metaphysical treatise and moral framework to gird a quasi-religious state of being arising from our technological, mythically-enchanted world. In under two hours. He’s trying to capture the full human condition while spanning from the light of infinite potential to the darkness of nihilism and insanity. Of course the film will fall short of the bar set by—how shall we say it—the older belief systems which it conspicuously mentions as little as possible. But in witnessing and scrutinizing the attempt there is ample reward, and Ascher’s many successes in the film are laudable. Hell, he made a documentary about The Matrix that, at least for younger viewers, could compete with The Matrix!

The clean IBM showroom of the 1950s became the imaginary, empty white spaces of THX 1138, and the weightless, sterile space inside HAL 9000 (Harwood, 147). Industrial design and architecture informed the aesthetics of our interior landscape when we first contemplated the verisimilar “inside” of our computers: the “other side” of our screens. This is cyberspace: a virtually real, sensorily tangible place, evolving its aesthetics in our collective imagination through the neon wire-frames of TRON and the static skies of cyberpunk novels, to today’s touch-screen UX design. Perhaps our proclivity to use our latest technology as a heuristic model for the mind has finally failed completely. Since so few people really know how these darn contraptions even work any more, how can we even make good metaphors out of them? Isn’t it only natural that simulation theory, with its appeal to fantasy fiction and senses-on-tap, fill the vacuum in our need for self-awareness?

You should see this film. You will find A Glitch in the Matrix to be utterly enthralling, shocking, and viscerally upsetting. Had I attended a live screening I’m certain that social pressures would have forced me to make a show of angrily storming out of the theater during the film’s final third—and that’s a feature, not a bug. Most unnerving, your several simultaneous states of disbelief will be in exquisitely taut tension with a new unshakable conviction: someone I know would probably believe every single minute of it.

References

de Kerckhove, Derrick. “Three Looming Figures of the Digital Transformation.” New Explorations Journal 1, no. 1 (2020). https://jps.library.utoronto.ca/index.php/nexj/article/view/34218

Harwood, John. The Interface: IBM and the Transformation of Corporate Design, 1945–1976. Minneapolis: University of Minnesota Press, 2011.

McLuhan, Marshall. Typhon in America or Guide to Chaos. Unpublished typescript, 1947. Library and Archives Canada, collection MG31-D156, box 64, files 5–7.