The energy performance of buildings directive

Sometimes, I delve into dark, insanity-inducing realms of madness and chaos, from which few mortals have reemerged with their faculties of reason fully intact. To speak of these geographic null-spaces in polite company is to invite scorn and contempt, and so investigations have to take place far away from public scrutiny, lest the fearful among us act out their barely contained primal urges.

I talk, of course, about EU legislation.

At some point, the powers that be decided that buildings across the continent leaked too much heat and wasted too much energy. All that wasted power could go towards facilitating free trade and tearing down barriers of commerce instead. And so, they instituted the inspiringly named Energy performance of buildings directive, which aims at reducing energy waste in existing buildings and preempting energy waste in future ones.

An important aspect of EU legislation is that the central EU office cannot tell any one particular person what to do. Directives have to go through national and regional intermediaries before they become real, actual imperatives that real, actual people have to obey. This means that when regulations meet reality, they often do so under the guise of some seemingly arbitrary third party that at some point ended up being appointed to carry them out. Directives are not enforced by suited EU officials enamored with cosmopolitan visions of a continent united, but by local – often municipal – inspectors who may or may not have positive feelings about this new rule sprung at them by the higher-ups.

Another important aspect is that the central EU office does not know anything about conditions on the ground. This goes with the territory – one central agency cannot possibly possess detailed knowledge about every building on a continent. There can be ambitions to change or improve conditions, but it takes a lot of work and a great many missives to intermediary institutions to actually get anything done.

The buildings directive is interesting in that its earlier incarnations took this into account. These versions did not set out to change building standards directly. Instead, they set up a framework within which anyone who owns or operates a residential building larger than a cottage has to file a report every so often, containing information about how much energy the building uses, its method of heating, its ventilation system, and so on. The contents of these reports were then collated by the national intermediaries, who sent an executive summary up the chain to the head office. Based on this grand and ongoing survey, subsequent versions of the directive have been revised to add minimum standards for new constructions.

For any one individual person, the phenomenological impact of this is that they every so often have to hire someone who knows how to file this report and give them a grand tour of the innards of their property. Once the report is signed and filed, the whole ordeal is swiftly forgotten, until years and years later when it is time for another round. It is, to use vernacular parlance, not a big deal.

In aggregate, however, this results in a non-trivial amount of human activity. There are a great many buildings larger than a cottage, and filing a report on each and every one of them requires as much effort as it does time and space. Reading and synthesizing all of these reports into useful statistics that can be understood by bureaucracy-driven institutions also requires its share of effort. These bureaucracy-driven institutions, in turn, have to work their mysterious ways with these statistics, until they produce documents to send off into the nebulous void of public policy. And so on and so forth. In aggregate, the amount of human activity is both staggering and vertigo-inducing (a scary combination of sensations indeed).

Another mind-bending aspect is that all of this activity happens within the scope of a single directive. There are a great number of similar directives, which also have similar requirements regarding the collection and distilling of information. Each and every one of them has its own intricate web of who reports what to where. The totality is such that attempting to comprehend it all in one go leads to almost Lovecraftian levels of madness. It is to glimpse into a substrate of reality which structures the way we live – literally, in this case – but which is completely invisible to ordinary everyday persons. The notion that there are hundreds of these matrices and nexi of invisible human activity – it is too much to bear. The mind reels. Ordinary everyday rationality abdicates in favor of incomprehensible scribblings.

Lovecraft had his protagonists go to the end of the Earth to find the mountains of madness. This is a somewhat optimistic notion – it implies that the mind-bending sleeping gods are far away, and that awakening them requires crossing great distances at immense peril. The modern condition is that we have our own home-made blind idiot gods right here, ingeniously disguised by the sheer mundanity of everyday office spaces. Their dreams, too, affect the subconsciousness of nearby human beings, albeit with slightly more indirect effects.

The good news is that you get over the vertigo after a while. The bad news is that it creates a rift between you and your peers. You have glimpsed the hidden substrate of reality, and it glimpsed back. One does not emerge from such experiences unchanged.


Long live the queen (2013)

There is a memorable scene in Long live the queen. Elodie – the protagonist, player character and teenage soon-to-be titular queen – goes to a royal ball. It has all the trappings: plotting nobles, fancy dresses, court intrigue and, above all, the iron certainty that whatever Elodie does, it will be judged. All eyes are on her, and not all of them have her best interest at heart.

This is a scary situation for most of us. You are about to enter a room where there are hundreds of unwritten rules and dozens of ways to interpret even the slightest of variations, and up until now you have only learnt about five of these. Worse still: those present know that you don’t know the rules, and will actively use this against you. Everyone is all smiles and politeness, yet every move is heavily scrutinized, such that every action or inaction conveys three boatloads of significance more than it should. Intentions are not converted to outcomes; indeed, the path between one and the other is so convoluted as to be effectively arbitrary. Worse further: you’re the guest of honor, so there’s no retreating to safety. You are firmly astride the tiger, and the only way off is to hold on until it has run its course.

The game represents this mechanically in terms of skill checks. Every time an event occurs, the game performs a check to see if the player character has the requisite skill to navigate the situation. Usually, these checks only look at one or two skills; did you know enough court etiquette to not accidentally offend the visiting dignitary, could you hold still long enough to avoid the snake, were you well-versed enough in history to pick up on that reference? If yes, the game flashes a brief “skill check passed” message, and gives you the appropriate outcome. If no, then the game flashes a brief “skill check failed” message, and destiny runs its course.
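
For the sake of concreteness, the branching might be sketched roughly like so – a hypothetical bit of Python, not the game’s actual code; the skill names, thresholds and outcomes are invented for illustration only:

    # Hypothetical sketch of the pass/fail mechanic described above.
    # Skill names, thresholds and outcomes are invented for illustration.
    def skill_check(stats: dict, skill: str, threshold: int) -> bool:
        """Return True if the character's stat meets the event's threshold."""
        passed = stats.get(skill, 0) >= threshold
        print(f"skill check {'passed' if passed else 'failed'}: {skill}")
        return passed

    def dignitary_event(stats: dict) -> str:
        # One event, one check: etiquette decides which outcome you get.
        if skill_check(stats, "court_etiquette", 40):
            return "You navigate the introduction gracefully."
        return "You offend the visiting dignitary; destiny runs its course."

    elodie = {"court_etiquette": 25, "history": 60}
    print(dignitary_event(elodie))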

What is memorable about the ballroom scene is that it performs a lot of skill checks all at once. By this point in your first playthrough, you will have picked up enough of the way the game communicates skill checks to know that passing is good and failing is bad. Thus, when the game throws a dozen “skill check failed” messages at you all at once, it is a jarring experience. More so when the game does it again seconds later. It is also narratively accurate. Elodie has, by force of circumstance, been thrown to the wolves, and these are judgmental wolves at that. When you, the player, feel insufficient for failing all these skill checks, it is ludonarratively consonant with what the player character experiences. Your experiences are one and the same.

Being a video game, Long live the queen affords you the opportunity to replay the sequence to find out what happens if you actually pass these skill checks. Some of them are inconsequential; others open up entirely new narrative branches. It is impossible to pass all checks in a single run, so an avid explorer will have to be patient to see all outcomes. Alas, the same does not go for real life.

Real life, for most of us, involves fewer high-stakes royal balls than we might like. It does, however, include situations just like them, where we encounter a great number of people who behold and judge us on merits we do not understand and could not have understood at the time. Everywhere we go, we bring the skill checks with us. Most of them, we fail without ever noticing. It is not possible to pass every check in one run.

When Bourdieu described a habitus as a collection of generative principles from which behavior emerges, this is what he meant. Whenever we enter into a situation, skill checks are performed on us. Based on our habitus, we either pass or fail these checks. Do we have the proper accent? Are our clothes befitting the occasion? Are we able to pick up on cultural references that are made between the lines? Have we heard that joke before, and so can join in on the punchline? Are we aware that flirting is in fact going on? If yes, go to one outcome; if no, go to another. Having a habitus means having lived a life such that you either pass or fail these skill checks, and people judge you accordingly.

Oftentimes, skill checks are clustered together, such that if you know one thing you also know another. If you have lived somewhere for decades, then you can join in on discussions on the great storm of ’96; you can also remark on the fire in ’03 and the unfortunate passing of the old pastor in ’06. The fact of having lived somewhere a long time is the generative principle which allows you to partake in the act of communal remembrance. The same goes for having attended certain schools, partaken in sports or hung out in particular circles. The experience from these extended periods of time allows you to pass these clustered skill checks, such that those who are in the know can tell that, yes, you are indeed one of us. It also allows for almost immediate detection of those who are not one of us.

The starkest example of the latter is when a non-academic visits a university campus. First off, there is the unfamiliarity of a new geography. More brutal is the sense of not being welcome there. Those present are all reading fancy books, wearing unidentifiable articles of clothing and holding inscrutable titles which sound fake but somehow are not. There is a whole world of skill checks going on, all of which fail silently and without preamble.

Bourdieu argues that it is not conceptually impossible to attain a different habitus. It is possible to move from one class position to another, or to change profession, or any other such transition. Live within a certain context long enough and you eventually learn what the skill checks are and how to pass them. It is possible to fake it until one makes it. Learning by doing is a thing. Visit the campus library often enough, and you too will become a library patron. It can be done. History is not destiny.

In practice, however, we only have the one upbringing, which has instilled in us the generative principles which cause us to behave as we do. Our lived experience has brought us to this point, and this is who we are. Some skill checks we pass with ease, others we fail spectacularly. Neither the passing nor the failing is a sign of virtue; it is just a contingent fact of history, a culmination of life as it unfolded. They will judge you for the shoes you wear – it is only the specifics of which shoes and which they that differ.

The ballroom scene in Long live the queen manages to do two things. The first is the already mentioned illustration of just how many skill checks happen all at once in one single moment. The second is to underscore that it is neither possible nor desirable to pass every skill check. There will be situations where you are right at home, and there will be situations where nothing makes sense. In the game, you can start over to see what happens if you learn certain skills (learn falconry, it is important). The joy of playing lies in exploring just how intricate the story is, and just how many unexpected skill checks (falconry) there are. In real life, the path to wisdom is to realize when you are in your element, and when you need to ask someone for advice. We all fail skill checks all the time, and that is okay.


Barbenheimer

At the center of Oppenheimer, there is one single question. You might imagine it being something scientific and fancy, concerning the reasonable limits of reason or the proper role of theoreticians in a society characterized by inscrutable levels of specialization. Or that it borders on the theological: now that humanity has the power to destroy the world, will it be able to bear the responsibility with grace? Or, perhaps, that the question is one of prudence: given the option of using either a few big bombs or several thousand smaller ones of equal accumulated destructive power, which should a person of virtue constrained by the realities of war opt for?

These are themes the movie touches upon, but none of them is the big overarching question which returns again and again and again. That question is:

Are you now or have you ever been a member of the Communist Party?

Readers will of course recognize this as the iconic phrase used by the House Committee on Un-American Activities, whose purpose it was to root out suspected communists, real or imagined. Readers will also note that this question is slightly smaller and more prosaic in scope than the ones hinted at above. Figuring out the ethics of science and nuclear bombs – those are high and mighty questions, whose answers lie well into futures yet to come. The question of communism, on the other hand, is a concrete matter concerning the here and now; to a paranoid mind fueled by what would later be called McCarthyism, communists are a greater danger than atom bombs, and should be treated as such.

The movie begins and ends with a Senate hearing, not about the atom bomb, but about the confirmation of a new cabinet member. During the proceedings of this hearing, the political opinions of our titular Oppenheimer are brought into question; did he not associate with communists, making him politically suspect and a security risk, regardless of his contribution to the development of the atomic bomb? Did he not lose his security clearance over this very issue? Would the potential cabinet member like to comment?

It is characteristic of paranoia that nothing is ever settled. No one is ever above suspicion. No one is let off the hook, regardless of their past performance and present prostrations. The more you give in to the paranoid impulse, the more it demands of you; it acts very much like Freud’s superego, sucking all the energy out of the room. Ultimately, it consumes the world, either by lighting it on fire or by turning it into a very, very small place.

In Barbie, there is a heavily choreographed song and dance number. It has all the bells and whistles, letting the actors play out the utopia of the Barbie imaginary; for those who know their way around a stage production, there are more than enough moving pieces to reward careful attention to peripheral detail. The whole shebang comes to a standstill, however, when protagonist Barbie suddenly asks if anyone ever thinks about death. It is an intrusive thought, both in the psychological and ideological sense; it perturbs the whole notion that these characters spend their eternally perfect days putting up the same musical number again and again and again. The discomfort seen on the faces of the assembled actors communicates the extent of this intrusion – here is an idea powerful enough to blow up the social fabric merely by being vocalized. Barbie has become death, destroyer of worlds.

After that point, protagonist Barbie cannot return to the blissful state of ideological ignorance from whence she came. Social life becomes an act, a performance that has to be continuously put up and repeated in all its familiar steps. Perfect reproduction becomes not a reenactment of the idyllic beforetimes, but a precondition for being included in the social order. Not participating is not an option, even as protagonist Barbie’s body physically falls apart from the effort.

In the context of the movie, there is an escape from this mandatory performance. Protagonist Barbie quite literally leaves the social world of Barbieland and travels into the movie version of our reality. Oppenheimer, for his part, had no corresponding escape vector. He was stuck in a world of perpetual paranoia, where each and every contact with anyone even remotely communist became a matter of state record and, ultimately, the aforementioned senate hearing. Oppenheimer, too, had to put on the act of being a good ol’ American patriot, apple pie and baseball to the core, who only incidentally happened to create a weapon of mass destruction that one time.

It does not require an extensive exegesis of Judith Butler to see the parallels between performing gender and performing the role of a straight American patriot. Both have a very prescribed set of values and activities within which an individual is meant to find the entirety of their lifeworld, and both have very definite punishments awaiting those who dare to stray too permanently outside the rules. Both are enforced by a very proactive and paranoid mindset, such that even those who do not necessarily believe wholeheartedly in the projects of gender roles or anticommunism respectively feel obliged to put up a good performance. The show must go on. Their lives and livelihoods depend on it.

While it would be a stretch to call Oppenheimer a feminist work, the accidental juxtaposition with Barbie serves to underscore just how confined the social world of traditional gender and national roles really is, and how scant the rewards are for remaining within these confines. There is no small irony inherent in the fact that a mild questioning of these roles is, at present, more likely to cause explosive societal upheaval than the remaining nuclear arsenals. The paranoid mind suffers no transgression, no matter how small.


On the ephemeral nature of internet artifacts

A big hurdle for theater scholars is that the unit of study – the theater performance – is an inherently ephemeral entity. There is a great deal of hubbub and commotion during its production, where actors, props and costumes are made ready for the big moment; the time leading up to a performance is hectic indeed. The trappings of these preparations have been preserved through the ages, with some even in use to this day. The actual performance, however, is forever lost to time. It happened there and then, at that one particular moment, and then it ended. Shakespeare had his release window, and then it closed. All that is left are the memories, the catharsis, and hopefully enough paraphernalia that the plays can be put up again at a later time. But that is a different performance, which too only exists in its moment.

For empirically minded theater scholars, this ephemerality means a suite of indirect methods have to be employed. Two have already been alluded to – first, to study the various props and implements that remain from the productions in question, which have survived through the ages; second, to draw what careful analogies there are to be drawn from how theater is done today. The former method has the advantage of sheer hands-on materiality – the objects are right there, available to the senses. The latter method has the advantage of putting things in context; the basics of setting up a production ought to be the same then as now, and so the difficulties of the present ought to be roughly the same as those faced in the past. The past being prologue and all.

A slightly less direct method is to read contemporary accounts of the performances in question. Reviews are an obvious choice, seeing as they pertain directly and explicitly to the relevant material. Diaries, letters and other such personal writs which mention the performances also bear testimony, albeit at times as a sidenote. Persons of the past do not write to suit the research needs of the present, and thus some words might – despite their scarcity – be read in vain.

For theoretically minded theater scholars, even more indirect methods are available. The attempt to overcome the lack of immediate access to the source material gets real weird real fast. If you are into seemingly simple premises turning into extremely esoteric examinations, then I urge you to put your theoretical theater gloves on – nothing will prepare you for just how far this little candle manages to cast its shadows.

An attentive reader might, at this moment, begin to wonder about the title of this post. There is much ado about theater and nothing about the internet. Here, I shall make my cunning rhetorical move: internet artifacts are like the theater in that they, too, are ephemeral entities which are only ever accessible at a particular moment in time, and then have to be carefully reconstructed using a combination of elaborate empirical methodology and esoteric theoretical shenanigans. The fact that we are in a moment where we have access to said artifacts does not alter the inevitability that there will come a time when access is cut off, and so manages to put ye olde Elizabethan theater and dril tweets in the same ontological category. Both used to be things a person could just access, no big deal; then they were gone, nothing to be done about it.

This might seem obvious upon reading these words. That is the power of my cunning rhetorical move. There is, however, a prevailing trend of thinking about internet artifacts – tweets, websites, forums, social media posts of all kinds – as inherently eternal. They are after all stored on computers, and computer memory is safe from the ravages of time. Someone, somewhere, will archive it all – it is still all right there, just waiting for someone to press the copy button and save to disk.

It would be unwise to put such faith in the generalized archival ambitions of the powers that be. Most online services exist only for the purpose of turning a profit, and of spurring on the activities that make that profit happen. Preemptively archiving everything for future researchers is not a money-making move in the present, and even less so when a service reaches the end of its projected profitability. It costs much less to just urge users to make a personal copy before a certain deadline, and then take the servers offline, breaking all links to the past.

Nor should we trust the archival prowess of the heads wearing the crowns. Take the case of Twitter, which accidentally made the entire site unavailable for those not logged in, cutting off researchers and data scrapers alike from doing their memory work. If it happened once, it could happen again. What is done cannot be undone.

At this point, it would be easy to urge everyone involved to create backups of their internet artifacts. Download their tweet archive and store it safely on a responsibly cycled backup disk. While this would undoubtedly be helpful, it is insufficient. At some point, we will find ourselves looking back at Twitter the same way we do an original Hamlet – it happened once upon a time, but no longer. What we need are contemporary accounts of what it was like when it happened – accounts preemptively written to give future scholars the context they need to understand just what it meant when everything happened so much. When everything smacked of gender. Why we liked the place and would willingly waste our time on it.

The time for proactive nostalgia is over. The time for preemptively writing history is now. We know what we are, but know not what we may be.


Power overwhelming, or: Velma is lesbian now

Recently, it was announced that Velma – of Mystery Inc. Scooby Doo fame – is canonically lesbian. This came as a surprise to no one who has ever pondered the matter; the reaction could best be described as a subtle mélange between “well, obviously” and “finally”. While the canonical works have, up until now, been somewhat ambiguous on this point, the general opinion among those vaguely familiar with the series has for the longest time been that, yes, she is very much lesbian, no two ways about it. The general intellect has firmly assimilated this fact, even if it is not reflected in the source material.

This is an interesting state of things. Fictional characters only exist as far as they appear in the text – be it on a page, on a screen or in some other medium. They have no objective existence outside of the text; they are pure representation – when the movie ends, that is it. What lingers is the memory of having experienced these representations, and a series of logical inferences that can be drawn from these same memories. Fictional characters in themselves are not this or that; they are fiction, made up, and we can change them at will by writing them differently. And yet, these phantoms of representation can have inescapable features that impose themselves whenever someone mentions a character’s name. Velma is a fictional character, she is not real; she is also, unequivocally, lesbian. The fiction has a material solidity to it.

This interesting state of things has been the source of much confusion over the years. Not least in terms of dead authors, which began as a dry technical observation on the craft of literary criticism, and then took on a life of its own. We should not understand the death of the author to be a radical separation between work and writer; rather, we should understand it in terms of the author not always being the best conversationalist about the works in question. Authors, unlike fictional characters, are actual persons with an actual body and an actual capacity to produce words about the things they wrote. More importantly, authors can only be in one place at a time, unlike fictional characters who can be many places all at once. More importantly still, fictional characters can be the center of many conversations at a time, while authors can, at best, juggle three or four, given a sufficiently stuffed dinner table. Physical authors are a conversational bottleneck, and fictional characters thrive in conversations. The more of these conversations there are, the better. It is only natural for conversations to outpace their inciting incident.

There have been a great many discussions about Velma, the fictional character. These discussions have mostly trended in the same direction, with more or less explicit sapphic overtones. As the years went by and these discussions faded into the ambient background noise of popular culture, the obviousness of Velma’s orientation became more and more entrenched. It is no longer a point of contention, a matter of debate, a question to raise; it is settled, part of the strange materiality of fiction. It could not be otherwise.

I contend that Scooby Doo has become archontic. A feature of becoming archontic is that the original source material – the movies and series about Scooby Doo, in this case – is placed in a context where it has equal status with other works. Or, indeed, with years’ worth of accumulated conversations. If you were to watch an old episode of the series, it would be filtered through a contemporary understanding of what it entails. The present imposes itself.

A slightly less sapphic example of archontic texts is modded video games, where players have become so used to playing the modded versions that the original is just one possible mode among many. Any given play session could go with one of the mods, or with the unaltered game; it could go either way, both are valid options. But the unmodded version will forever be reinterpreted in light of there being different ways to go about playing; the conversation will have these altered states in mind, and proceed accordingly. There is no turning back, there is no primacy of the original. There are only further conversations.

The original creators of Scooby Doo may or may not have intended for Velma to be an unequivocal sapphic icon. Intent is immaterial, however. Years and years of people talking about Velma as a lesbian has cemented this version of fictional fact as actual truth, and so it became an unavoidable matter of course. To paraphrase the archons of Starcraft: the lesbian presence of Velma is a power overwhelming. Making it canonical is not only natural; it is the path of least resistance, given the materiality of fiction.


The fantasy theme of rationality, or the big mood of speedrunning

Speedrunning is an eminently rational practice. In terms of being a rational practice, it has the advantage of wearing its prime rational value on its sleeve. That value is speed, which supersedes any and all other priorities whenever a run is made. A run being whatever happens between pressing “new game” and the end credits rolling; whatever it is, faster is better.

At this point, some readers might object that completing a video game in the shortest amount of time possible is the opposite of rational, and that nothing good comes out of it. This is a valid criticism to make, but it misses the point, and equivocates on the definition of “rationality”. Being rational means having a goal and taking effective steps towards achieving that goal, measuring and evaluating one’s performance over time. The goal itself may or may not live up to the description of being a virtuous pursuit worthy of striving for, but once the goal is established, whether a course of action is rational or not is fully determined by whether it advances the progress meter towards that goal. Rationality is fully process-oriented, and thus, it can be rational to play video games at high speed.

What makes speedrunning such an eminently rational practice is that it lays bare the shedding of other values in the pursuit of the goal. During the course of a run, the speedrunner actively ignores or bypasses many aspects of a video game that are usually considered vital or important, such as text, story, narrative themes, feats of graphical majesty or works of musical mastery. The goal is to get to the end as fast as possible, and stopping to smell the roses (or behold the game as a work of art) is the opposite of going fast. As the runner finds ways to bypass ever more components of the game, we come to understand how other rational processes likewise come to shed components of a practice that an outside onlooker might view as important. Rationality strips away everything that is not goal-oriented and distills the process to the barest viable minimum. In the context of speedrunning, we have the advantage of knowing explicitly what the goal is. In other social or economic practices, we are unfortunately not always so blessed with clear definitions.

Fantasy themes have the advantage and disadvantage of being one of the most useful analytical tools stuck with the absolute worst name imaginable. A fantasy theme is a critical (as in ‘rhetorical criticism’) attempt to recreate the spirit of what it was like to be there (as in “you had to be there”). Most fiction acts according to this principle – skilled authors know how to write in such a way that when the punchline or climax arrives, the reader is in the perfect frame of mind to appreciate it. The fact that it takes a while to arrive at said frame of mind is a feature, not a bug. When an author or a social situation manages to create a fantasy theme powerful enough that more than one person feels it, the results are pure magic.

Of course, reconstructing these themes after the fact is a difficult thing, and it is more difficult still to write a text that conveys what went through people’s heads in actual social settings. The payoff, however, is immense – decisions that in hindsight are inexplicable or seemingly irrational are suddenly contextualized such that they make sense, or even attain a measure of inevitability. The past is made present, and thus we are reminded that we are all equidistant from eternity. The challenge lies not only in collecting enough material support for the claim that a certain theme permeated a certain moment, but also in avoiding the temptation to go full Herodotus and just invent events that did not happen but fit the narrative. It behooves a critic to remain faithful to the sources.

At the intersection of speedrunning and fantasy themes, we have this 2019 Summer Games Done Quick showcase of a Chrono Trigger run. The fact that it is six hours long is both natural and significant. Natural, in that Chrono Trigger is a very long game which takes a very long time to finish, making the six hours an impressive feat indeed. Significant, in that because of the duration we are able to see the fantasy theme emerge into the situation and how it affects those present. What begins as a rather straightforward showcase of a speedrunner’s rational bag of tricks to avoid getting bogged down in time-consuming minutiae (the hallmark of these kinds of games) gradually morphs into an emotionally intense scene where the only proper response is a prolonged triumphant yawp, which cannot help but infect the audience even now, years later. Not only is it impossible to step into the ending without understanding the series of events that led up to it; to do so would be to miss the whole point. What amazes and amuses me is that we end up with a perfect blend of rationality and fantasy, of solidly goal-oriented measurable accomplishments and a relentlessly intangible case of “you had to be there”. Sometimes, you gotta go slow in order to go fast.

Should you decide to take the plunge, be aware that the video is in fact six hours long, and that you do not want to be in a hurry to do anything or go anywhere upon its completion. The ending is a mood a person has to sit with for a while, and this text is but a nod in appreciation that it exists as an object in the world at all. Most fantasy themes are irrevocably lost to history, but this one – this one we managed to catch. It is perhaps the most anomalous aspect of the whole event.


Nostalgia

The big draw of nostalgia is that the future is foreclosed. All the possible paths and trajectories life could have snaked from that one nostalgic point have collapsed into the one singular timeline we now find ourselves in. All worries have either been borne out or turned into intense nothings. All upcoming exertions have been performed. All feats of bravery, cowardice, romance, ambiguity, clarity, sickness, health, love, hate – it is all in the past now, as surely written in the annals of time as in our memories. The letter has arrived, we have received it, and as we think back on that nostalgic moment, we can appreciate it as a pure immobilized snippet of time where nothing ever changes. For the purposes of nostalgia, time truly is a moving image of eternity.

The attraction of such eternal clarity only heightens in a present characterized by everything but a clear sense of what is happening or where we are heading. The world shifts this way and that, mashing contradictions together every which way, such that making sense of it all seems a fool’s errand at best. Some embrace the tactic of a forward retreat, following every wild goose chase to wherever it leads in the hopes of it being on firmer ground than here. Others fortify their current location, making it a bastion as impenetrable physically as it is socially, with little to no regard for how suitable it is for permanent habitation. This leads to an ever increasing juxtaposition of unstoppable forces and immovable objects, which only serves to increase the overall chaos. In all this, nostalgia tightens its grip, almost to a chokehold. The glorious past, yes. If there is one certainty in this world, that would be it.

The great philosophers, poets and historians all try to tell us that this sense of a solid unmoving past is an illusion, and that everyone was, is and will be equidistant from eternity. The ancestors faced much the same fears we do, albeit about different things; the register of human emotion has only varied so much across time. Our children will face much the same challenges we face now, albeit hopefully aided by the wisdom we pass on. Our peers – our companions in the present – know just about as much as we do about what is happening, albeit with different levels of proficiency when it comes to hiding their fears. Time then was as contingent as it is now, and it is comforting in a way to think that the ancestors too looked back with a nostalgic sheen in their eyes. Our heroes doubted too, beset by the very worries that loom over us now. There is solace to be had in the fact that they at no point were able to sidestep the future. At the moment they performed their great deeds and became our revered heroes, they were scared out of their minds and hoping against hope that it would work out. The past had not become inevitable yet.

We need to be reminded of this every now and then, lest the closure of the past result in the foreclosing of the future. What is yet to come is still contingent, as is the present, as was the past. Nostalgia aims to lock the past in perfect perpetuity; it is only a short step until it locks the future as well, insisting that the days beyond this one remain just as perfect. Reject this impulse. Embrace the perpetual now, and push contingency as far as it will go. Build your own ancient traditions. Do it now.


The discipline of sociology

One of the least intuitive aspects of sociology is that it sees everything as real, with a very broad definition of everything. UFOs? Real. The existence of the divine creator of the world? Real. The metaphysics of sports? Real to a fault. Astrology? That’s a mother lode of reality. Everything, it’s all real. All of it.

This is such a counterintuitive claim that most people write the whole endeavor off as soon as they catch wind of it. “Really everything? All of it? That cannot possibly be right,” they think to themselves, only to read on to discover that, yes, all of it, everything, the whole shebang. Anything and everything goes, nothing is out of bounds. Which, unfortunately, can cause folks of a literalist and positivist bent to just write the whole discipline off as an exercise in lunacy, a waste of time and, frankly, a bunch of gobbledygook unworthy of anyone’s attention.

The key to unlocking this counterintuitive claim is to realize that it is not operating on an ontological level. Sociology makes no claims about the factual existence of UFOs, gods or the predictive properties of stellar phenomena. Sociology does, however, assert the existence of social contexts structured around these entities, which can be studied systematically and scientifically. And, since sociology is the study of social structures and contexts, this is exactly what sociology goes on to study.

This places UFOs, divine beings and horoscopes in a strange ontological position. On the one hand, there is scant empirical evidence for the existence and/or efficacy of any of these things. On the other hand, they evidently operate as an organizing principle in the lives of the people who are interested in and engaged with the communities surrounding them. While no UFO has been conclusively spotted, this poses no difficulty whatsoever for UFO spotters who travel far and wide in their quest to make contact. Nor does it negate the existence of UFO believers. For the purposes of understanding why these people do what they do, we have to posit that UFOs have real social effects, which makes UFOs – you guessed it – real.

Durkheim called these strange objects in the shadowlands of reality “social facts”. Regardless of the ontic facticity of flying saucers, a sociologist has to treat their status as an organizing principle as a thing that factually exists in the world. And, more crucially, the purpose of studying a community of UFO enthusiasts is not to finally nail a photo of those flying buggers; it’s to watch how the social fact plays out in actual physical reality, as an empirical process which (fortunately) can be observed, studied and documented. This is the empirical reality of sociology.

As you might imagine, this has methodological implications. If someone were to barge into these contexts in the name of Science, proudly proclaiming that the organizing principle of said context is pseudoscientific bullshit that has a less than zero chance of being real, then the response will (quite rationally) be to throw said person out the door and ask them to never show their face again. In methodological terms, this is a critical mission failure; the scientist has failed to secure access to the empirical material. When working with people, you have to get along with them. This means approaching the object of study with a measure of politeness, and coming prepared, having done your homework. The key to methodological success is to know intimately what it means that Mercury is in retrograde.

Given that sociologists study real social contexts, it follows that these very same contexts have no obligation whatsoever to let the agents of sociology roam free to make empirical observations to their hearts’ content. These are real people with things to do, timetables to keep, budgets to consider and kids to pick up. Making room for strangers suddenly appearing to ask nosy questions is, for all intents and purposes, well above and beyond. This means that to gain access to the relevant social milieus, a sociologist either has to be very patient and play the long game of building trusting relationships, or otherwise pull their weight in some significant way. To be sure, being able to say that you work at the university of [insert name here] opens a great many doors, but it also closes others, and at times it makes no impression whatsoever. There is no one correct way to gain access, but a great many ways to find oneself barred from it.

This raises the question of scientific objectivity. The scientific ideal is a dispassionate observer who records and analyzes the data according to strictly defined criteria in a rigorous manner. Ideally, it should be utterly irrelevant who interprets the data; if the method is followed, the same results should follow every time. For obvious reasons, this does not work when studying real social settings – a sociologist who has spent years building up trust and mutual respect within a community is not interchangeable with a physicist whose main activity is to juggle variables. Indeed, this very same sociologist is not even interchangeable with another sociologist – the community under study simply does not know the guy. When the measuring instrument is a particular person, things get personal indeed.

On the basis of this, it might be tempting to write off the results of sociological investigations as unscientific. Indeed, if results can only be replicated by one singular person, then that would be quite a blow. However, this state of things is not a deviation from the scientific method, but an application of it. The fact that the methodology includes lengthy, and at times downright anthropological, aspects does not take away from their methodological necessity. It is simply an admission that humans are as humans do, and that if you want to play along you also have to play nice. Play being a function of time spent together. There is a sociological method; the fact that it sometimes takes an extended period of time and a counterintuitive amount of effort to go through the steps of said method does not make it less methodical. We could draw a parallel to the investment costs of particle accelerators; the fact that you or I will never be able to build one for purposes of replication does not negate the value of those already in existence.

The alternative to accepting the social sciences would be to relegate human activity to an unknowable black box, whose inside mechanisms are utterly inscrutable and beyond the reach of scientific investigation. Which, to be sure, would make the lives of university administrators that much easier when the perennial round of budget cutbacks comes around. But it would also limit the project of extending human knowledge to a very narrow range of topics. We know, through the application of our powers of reason and intellect, that nothing changes when a goal is scored in a big match. The ball passed an arbitrary line, and that’s it. We also know that the massive cheering and display of emotion that results from this very same goal is a very real thing indeed, and that it can be studied as an object in the world. There is a method to it, and at some points of your investigation you might have to drink a celebratory – or commiseratory – pint at the bar, but such are the rules of entry.


Dyson: Analogia

Sometimes when I read something, it connects in my mind to something else I’ve read. In this case, the connexion is made to Zielinski’s Deep Time of the Media. Zielinski explored various technological and epistemic dead ends, and pondered where we would be today had any of those possible lines of development been pursued. The goal of such an archaeological excavation is not merely historical curiosity – although that is always a valid reason to go about things – but also a reframing of the present. Seeing how we are all equidistant from eternity, the various abandoned dead-end media he explored were once (or could have been) the technological cutting edge. Pondering where things could have gone puts the arbitrariness of where they actually went into perspective; the alluded-to deep time consists of not losing sight of historical contingency just because we happen to live in the here and now.

Dyson has embarked on a similar project, albeit with ever so slightly less grand epistemic ambitions. Analogia tracks the development of digital technology from Leibniz to the present, and gives us the perspectives of those who happened to be on the other side of it. Not its opponents, per se, but its victims. When the telegraph made its way across the continental United States, it did not bring good news for the Native Americans. These self-same tribes were not passive recipients of history, however, and responded by quite literally hacking the telegraph wires. Cutting a wire gave a tactical advantage in that it severed enemy lines of communication, but the act was not one in opposition to telegraphy specifically; the tribes were firmly in the Clausewitzian realm of depriving your opponent of the ability to resist, rather than that of Luddite conviction.

This places the emerging communications technology outside the realm of inevitability. It is a rare thing indeed to portray those who happened to be on the other side of technology as strategic actors using the full force of human rationality to oppose said technology. Usually, the story sets up the initial conditions, the requirements for the technology to work, the impediments in place, and what the clever men of applied science did to overcome these impediments. At best, locals are relegated to the status of “impediment”, and then swiftly removed from the equation as the march of Science and Technology continues ever apace. Inevitability had to be brutally enforced, and by pointing towards those who were subjected to the enforcement, the inevitability is brought into question.

A reader of the book might not be immediately struck by this line of thought. The text moves from context to context, giving biographical and at times personal insight into the lives of a number of individuals. From Leibniz himself, and his petition to the Russian Tzar to build a very early version of a difference engine; to a trans-Siberian expedition from the very same Russian court to explore the east coast (before subsequently arriving at the west coast), and the various logistical challenges inherent in such an undertaking; to the supreme canoe-making of those who live in the arctic peninsula of Alaska; to the social circumstances surrounding the creation of the atom bomb. In all these individual, peculiar and at all times fascinating recountings, it is easy to lose track of the overall analogy; the meeting of analog and digital gets lost among the proverbial trees, as it were.

This is, perhaps, more of a feature than a bug. The whole endeavor is to not reduce these varied and skillful opponents of the coming digital order to an analogy. Part of looking at the world through the lens of deep time is that every moment is contingent, and every future moment following from any given moment was as uncertain to those living it as ours is to us. This necessitates a certain amount of critical distance from history – the fact that we know how it turned out does not imply it was the only way things could have gone. This mode of thinking is inherently allergic to analogies; it ever so slightly defeats the point to posit that since something happened once, it is bound to happen again in the same manner. Thus, an extensive setting of the stage is required, to fully appreciate the magnitude of the collapse of eternity into one singular timeline.

What Analogia ultimately does is set the stage for our present moment, in all its contingent glory. There is no telling where things go next, and those who do try to tell you usually do so in the context of a sales pitch. And sales pitches abound about the advent of artificial intelligence, the new digital frontier. It is an inevitable development, a preordained extension of destiny, the next step in human evolution, the next big thing, the done deal. All it requires is that we embrace it in our hearts and unilaterally allow it to happen. A digital manifest destiny, if you will.

As we have seen, inevitability has never been the case when it comes to new technological developments. There is little to suggest it would be the case this time around, and it will undoubtedly be interesting to see who among us will happen to be on the other side of this evitable inevitability. It will also be interesting to see where the new rounds of relentlessly analog wire cutting will take place. The future is not here yet; we still have time to be contingent. If history teaches us anything, it is thus.


Rhetoric and the truth claims of documentaries

Rhetoric is the art of organizing words in such a way that they have an effect. At times, this effect can be to achieve a certain outcome, i.e. to persuade someone of a course of action. At other times, it can be to make someone come around to a point of view. Most of the time, it is merely to achieve clarity where there previously was none. Clarity is an underappreciated effect to achieve, and more often than not simply being able to boil down a message to its most basic and easily understood form can rescue a situation from much untold strife. Getting everyone on the same page, even if only so they can disagree more constructively, is a superpower in disguise.

The organizing of words is not confined to writing or speech, however. It does not even begin there. It begins in one’s head, as a method of making sense of things. It is a function of paying attention to how things fit together, and how they can be made to relate to one another. Everything is related to everything else, and every connexion is a potential line of argument. Most relations are tenuous or non-intuitive, and can be safely put aside in search of better ones. Eventually, after having looked at enough possible connexions and discarded enough bad ones, what remains is a few potential ways of organizing the words such that they can be understood. What follows from this process is a lot of bad ways to put something, and a few good ones. The key is to select something from the good category.

This organization of thought and narrowing down of possible alternatives to a few good ones is often pitted against a natural, intuitive way of going about figuring out what to say. Rhetoric is the art of lying, it is often said, and a person’s true thoughts and feelings are expressed spontaneously, without prompt or preamble. Indeed, it is maintained, the more one thinks about something the less truthful it becomes. At length, this line of argument ends up saying that the process of choosing one’s words cannot but end up in dishonesty; whatever truth was to be found in spontaneity is lost once the moment has passed, leaving only the lingering specter of doubt. It is a line of thinking as silly as it is widespread.

The thing of it is, it is seldom clear what the most effective, truthful or proper way of saying something is. If someone asks for directions to a place, is it best to describe the route by means of street names, landmarks or a list of left/right turns? All three options lead to the destination, and it is unclear whether one is more truthful than the others. The criterion of spontaneous honesty does not give us any guidance as to which to choose, and would only serve to induce anxiety in those who are not able to make a choice right there on the spot. If we allow ourselves to ponder whether the asker is familiar with the local street names, can remember an abstract list of left/right turns, or would be best served by the most concrete visual input the city has to offer – then we are in a better position to pick the most appropriate option.

From this, we can gather that rhetoric is the art of figuring out what is useful to say. What flows from the heart is not always the most informative of thoughts, and saying it reflexively might at times lead to confusion. By organizing one’s thoughts in relation to the overall situation, it becomes possible to be a more constructive participant in the conversation. It streamlines the process, as it were.

This brings us to documentaries, which – much like spontaneous honesty – are assumed to be bearers of unmediated truth. If something is presented in the form of a documentary, then it is generally assumed to be the true story. This follows both from convention, and from the fact that many generations of filmmaking have established the genre as an effective vehicle for conveying information. Form and function go together, leading to a very persuasive mode of presentation.

However, just as in the above example of asking directions, there are always multiple ways of arriving at a certain truth. Documentaries do not spring fully formed out of nothing; they are structured around an organizing principle which puts things in relation to one another. The organizing principle was chosen at some point during production, and then executed in the form of the finished product. Whatever the choice, there were always other ways of presenting the information that were not chosen. And, conversely, if one of those other options had been picked, then the first option would have gone unpicked. Neither of them is false, but neither of them is the whole truth either. They are, equally and both at once, choices. (Here, Booth would make an amused aside about the need for an infinite number of documentaries, to cover all bases.)

This places us in a tricky spot when it comes to evaluating the truth claims of documentaries. Or, rather, it makes the truth somewhat orthogonal to the choice of organizing principle. Any given choice is bound to have advantages over another, with corresponding disadvantages. The difference is one of emphasis rather than of veracity, which is an important difference, but – and this is the point of this text – we cannot arrive at a constructive criticism of the significance of this difference if we talk about it in terms of true and/or false. Truth is not the issue; the mode of presentation is.

The purpose of rhetoric is to organize words so as to arrive at clarity. The same goes, mutatis mutandis, for documentaries. As a critical reader and/or viewer, your task is to evaluate the organizing principle to see where it places its emphasis. Should you find that there is an alternate organizing principle, then you are richer for knowing it, and can proceed to compare & contrast until interesting nuggets of insight emerge. These nuggs will, no doubt, be useful in your upcoming attempts to find something useful to say.
