Continued from Part Two.
In Grade 11 I took my first programming class. Although I had played around with Logo in elementary school, and had created many complicated DOS Batch files, this was my introduction to all the formal elements of modern computer programming. We learned Turing, an educational object-oriented language developed at the University of Toronto. I wrote a multiplayer Tron/Snake-style game which would send each player's key-presses back and forth across the network, but it often got out of sync, leaving the two players' screens showing different games, with perhaps both thinking they had won or lost. My final class project was a graphical implementation of Battleship. In Grade 12 we learned Java, a language with plenty of actual real-world usage.
What does it mean for a programming language to be object-oriented? Well, a “low-level” language involves telling the parts of the computer hardware what to do with each other. More specifically, a low-level language is designed so that a programmer using it thinks about how the CPU is going to move, step-by-step, through the 1s and 0s in the RAM, deciding where to look (in RAM or hardware) and what to do next. In contrast, a high-level language is designed so that the programmer doesn’t need to worry about what the computer hardware is actually doing. Object-oriented languages are high-level: the programmer writes code by imagining a magical sandbox full of Lego-like objects which can be magically declared into existence and have their properties, functions, and interrelations precisely defined. To me, it felt like imagining a giant Rube Goldberg machine in my head, and then typing as quickly as possible to get it all out; then clicking run and watching it break in 17 different places at once. Eventually, after all the bugs are ironed out, this imaginary simulated concept-space is animated into life when the code is executed.
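To make that concrete, here is a minimal sketch in Java (the language I would learn the following year) of that Lego-like declaring of objects into existence. The Player class, its properties, and its behaviour are invented purely for illustration, not taken from my actual schoolwork:

```java
// A hypothetical "Player" object, declared into existence with properties
// (a name and a position) and behaviour (how it moves), then animated by
// running the program.
public class Player {
    private final String name; // a property of this object
    private int x, y;          // its position in the imaginary sandbox

    public Player(String name, int x, int y) {
        this.name = name;
        this.x = x;
        this.y = y;
    }

    // Behaviour: the object knows how to move itself around.
    public void move(int dx, int dy) {
        x += dx;
        y += dy;
    }

    @Override
    public String toString() {
        return name + " at (" + x + ", " + y + ")";
    }

    public static void main(String[] args) {
        Player player = new Player("P1", 0, 0); // declared into existence
        player.move(1, 0);                      // the simulation comes alive
        System.out.println(player);             // prints: P1 at (1, 0)
    }
}
```

The point isn't the syntax; it's that nothing here refers to the CPU or the RAM at all, only to imaginary objects and their interrelations.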
The development of object-oriented programming at the Xerox Palo Alto Research Center was a creative milestone in the history of computing: a step toward shaping computers to fit human needs, and away from demanding that humans shape their thinking to fit the needs of computer hardware.
It wasn’t until university, in Systems Programming, that I learned the fundamentals of how a computer processor addresses all the other parts of the system and works through the logic of the contents of the RAM. It’s rather like how a reader works through a Choose Your Own Adventure novel: if this, go to page 43; if that, go to page 19. From only a few logical operations at the low level, computer software has been built up over the decades into higher and higher levels of abstraction. It’s possible to make loops, simple structures, lists, and iterators, and from those ever higher-level constructs, eventually culminating in the simulation of virtual spatial environments and digital analogies of analogue human technologies.
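As a rough sketch of that Choose Your Own Adventure machinery (with page numbers and a coin-counting story invented purely for illustration), a single "program counter" variable decides which page to read next, and one conditional jump is all it takes to build a loop:

```java
// A toy machine that works through "pages" the way a CPU works through
// instructions: read the current page, do something, then jump.
public class AdventureMachine {
    public static void main(String[] args) {
        int page = 1;      // the "program counter": where we are reading now
        int treasure = 0;  // a little scrap of state, like a CPU register

        while (page != 0) {
            switch (page) {
                case 1:  // "pick up a coin, then go to page 43"
                    treasure += 1;
                    page = 43;
                    break;
                case 43: // "if fewer than 3 coins, go back to page 1; else page 19"
                    page = (treasure < 3) ? 1 : 19;
                    break;
                case 19: // "The End"
                    System.out.println("The End: " + treasure + " coins");
                    page = 0; // halt
                    break;
            }
        }
    }
}
```

That one conditional jump on page 43 is enough to turn straight-line reading into a loop: the seed from which all the higher abstractions grow.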
The notion of bending computers to fit humans more easily is extremely powerful for explaining what we describe as “progress” in computing. Starting from CPUs and RAM and hardware, the development from the low-level rudiments up toward the high-level simulations and fictions of cyberspace is the story of the development of the “software stack”: the abstract Tower of Babel of accessible, human-comprehensible logic which gets flattened together into an impenetrable binary blob which the computer can actually run. Early microcomputers often came with cassette decks to allow the RAM to be saved, beginning to end, onto an audio tape so it could be replayed back into the machine later. This was the simplest way to save your data and programs before turning the machine off, but it was all about how the machine worked, not about how people work.
The DOS environment, which I grew up on as a kid, represented great progress in building up the nature of magnetic media into something more articulate and useful for users. Magnetic disks (and, today, CDs, DVDs, Blu-rays, USB keys, and SSDs) purport to exist as virtual spaces consisting of “files”, analogous to pieces of paper within folders in a file cabinet. A computer disk is like magnetic audio tape laid out in the form of the grooves of a vinyl record (not spiralled but, rather, in concentric rings), broken into addressable sectors containing bytes of data. Upon this lower-level hardware layout is constructed an abstract file system: a way to map out and lump those sectors into larger entities presented to the user as files. By using disks with file systems, computers progressed to allowing users to be far more selective about which bits and bytes were loaded from the disk into the RAM of the computer, and which were saved from RAM back onto magnetic storage, instead of the all-or-nothing approach of saving the RAM onto linear cassette tape. The basic solution of storing volatile RAM contents onto magnetic storage for future use hadn’t changed; what changed was the granularity of that process, necessitating the creation of a metaphoric mental model, implemented in software, that had nothing to do with magnets or RAM.
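As a sketch of that metaphoric model (the sector numbers and file names below are made up), a file system is at bottom little more than a table mapping human-friendly names onto lumps of sectors:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// A toy file system: the "disk" is nothing but numbered sectors of bytes,
// and a "file" is a software fiction lumping some of those sectors together
// under a name. All names and sector numbers here are invented.
public class ToyFileSystem {
    static final int SECTOR_SIZE = 512;
    static final byte[][] disk = new byte[64][SECTOR_SIZE]; // the raw hardware rings

    // The abstraction layered on top: file name -> which sectors hold its bytes.
    static final Map<String, List<Integer>> fileTable = new LinkedHashMap<>();

    public static void main(String[] args) {
        // "Saving a file" means writing some sectors and recording which ones.
        fileTable.put("Resume.docx", List.of(3, 4, 9)); // need not be contiguous
        fileTable.put("snake.bas", List.of(5));

        // "Loading a file" reads back exactly those sectors and nothing more,
        // instead of replaying the whole disk like a cassette tape.
        for (var entry : fileTable.entrySet()) {
            System.out.println(entry.getKey() + " occupies sectors " + entry.getValue());
        }
    }
}
```

Nothing in the fiction of "Resume.docx" tells you it lives scattered across three rings of magnetized sectors; that is exactly the point.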
This is the essence of cyberspace. While video games might give us dream worlds that clearly take place in non-existent, imaginary lands, the virtual actually begins with the very first steps of computer interfaces into metaphors which obscure, or render unnecessary, any awareness of the actual physical functioning of the device. Forget Myst, Azeroth or Pallet Town — even computer disks are imaginary worlds full of fictional things! To know that one is working with a fictional metaphor is to ground oneself within the reality of the illusion. Absent that grounding, however, the computer interface becomes a figure without ground — a non-reality; a virtual space suspended within the mind of the user. To what degree are that virtual world and the real, physical world prone to merging in the mind at an inappropriate level? Consider, again, how naturally the mind considers the file called Resume.docx on a computer and the piece of printed paper it produces to be one and the same thing.
When I was on dial-up internet at home, the only way to move files around the house was to save them to a floppy disk or burn them onto a CD. Both internet-enabled computers, the family computer and my own, had a modem, and only one could be online at a time. Moving to high-speed DSL meant getting a router and setting up a home network. This is how I learned all the ins and outs of modern computer networking, which is today premised upon TCP/IP: the Transmission Control Protocol and Internet Protocol. These are the fundamental mechanisms behind the world-changing networking technique once popularly known as internetwork packet-switching.
Once upon a time, from the sixties through to the early nineties, everyone had their own proprietary way of wiring all their computers together. Sending data or messages from one computer to another involved myriad methods of manually setting up their interrelation in fragile configurations susceptible to interruptions. The computers of the planet were linked together in a map of physically-overlapping, yet unconnected, private networks which couldn’t talk to one another. Exchanging data between them still required either someone to copy things to a disk and walk it over to another computer, or a computer system connected simultaneously to two different, incompatible networks to function as a gateway between them (often for a hefty fee). The idea of internetworking all of the various private computer networks into one big, self-shaping network was a massive undertaking, involving much funding by the American military, which saw national security as greatly improved by a communications system that would be nearly impossible to take down through targeted strikes. The project of internetworking all the disparate, unconnected computer networks entailed the creation of a method for all the computers to automatically be assigned their own address on the network (the Internet Protocol), and for messages between them (packets) to magically find their way around the planet, moving from system to system (switching) until they reach their destination (via the Transmission Control Protocol). As the internetworked packet-switching behemoth flourished in the 80s and 90s — catalysed by telecommunications infrastructure spending in the form of fibre-optic cables replacing copper telephone lines — more and more computers came to be directly connected to “The Internet” instead of remaining sequestered within their own local area networks. Nowadays, nearly every LAN is just a subset of the single larger internet.
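A toy sketch of the switching idea, with a four-node network invented for illustration: no system knows the whole route; each one knows only its next hop, and the packet finds its way by moving from system to system:

```java
import java.util.Map;

// A made-up network of four systems, A through D. Each system holds only
// one tiny routing decision: "toward D, hand the packet to so-and-so."
public class PacketSwitch {
    static final Map<String, String> nextHopTowardD = Map.of(
            "A", "B",
            "B", "C",
            "C", "D");

    public static void main(String[] args) {
        String at = "A"; // the packet starts at system A
        while (!at.equals("D")) {
            String next = nextHopTowardD.get(at);
            System.out.println(at + " forwards the packet to " + next);
            at = next;
        }
        System.out.println("Packet delivered at D");
    }
}
```

If system B were knocked out, a real routing table would simply be updated to send the packet around it, which is precisely the resilience the military was paying for.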
Home computer users in the 80s were not likely to be on the internet, at least not directly — that would have been something you’d only get access to at a university, or through some expensive gateway service. Instead, they would “address” other computers through the phone numbers of the modem-equipped computers they were dialing up. With the movement onto the Internet, other computers would instead be addressed by domain names: the human-memorable references indexed to the IP addresses of remote destinations. For instance, the domain name google.com refers to the IP (version six) address 2607:f8b0:400b:808::200e. If internet users were forced to use IP addresses directly to refer to remote destinations, it would be impossible to change the physical location of a resource without making everyone learn the new address. The domain-name system offers a layer of flexibility, such that a single domain name can be redirected to different IP addresses without breaking anyone’s email address books or bookmarks.
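This lookup layer is so fundamental that Java's standard library exposes it in a single call; a minimal sketch (the addresses printed will differ depending on when and where you run it, which is exactly the flexibility described above):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

// Ask the domain-name system which IP addresses the human-memorable
// name google.com currently points at.
public class Resolve {
    public static void main(String[] args) throws UnknownHostException {
        for (InetAddress address : InetAddress.getAllByName("google.com")) {
            System.out.println("google.com -> " + address.getHostAddress());
        }
    }
}
```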
Domain names became well known in the 80s as the second half of the email address: everything after the @ symbol. It was rather clear in that usage that a domain name was a way to refer to a particular computer network which had been internetworked into the global packet-switching system; clearly example@harvard.edu referred to the university computer network account of your amply-endowed ex. But a new development soon entirely hijacked the meaning of internet domain names in the popular mind.
As soon as Sir Timothy Berners-Lee invented the World Wide Web in the early 90s, it exploded as a way for just about anyone to become a global media publisher. In order to get your content to the masses, you needed a place to store your files and a way for people to know where to find you. In this way, the internet domain name became synonymous with “website”, inextricable from the notion of web browsing in the popular imagination. The functional purpose of domain names as a way to specify a single local domain, such as a university LAN which had been internetworked into the packet-switching system, was rendered obscure and arcane. Furthermore, regulations against commercial activity on the internet were lifted and the .com domain suffix rose to prominence, indelibly marrying the concept of a domain name with a particular brand instead of a particular computer network.
Today many people use the terms “net”, “internet”, “web” and “social media” interchangeably, as though they were all one and the same thing. Of course, if you’ve actually read this far into this essay you can sense my feelings about such sloppy speaking of something so well-defined at a technical level. The flattening of what was once a clearly distinct model of the functioning state of affairs (a series of physical computers connected to one another) into a single window of groundless cyberspace (web addresses) is the total obscuring of the real functioning and existence of networked computers in our lives, for the sake of facilitating as much human ignorance as possible in its day-to-day usage. How can you know what you’re doing, what you’re looking at, what you are responsible for on a computer when it all takes place in a fictional, simulated world with all the real parts hidden away? But wait, it gets much worse!
To establish yourself on the web, all you had to do was spend some money on a domain name and a file-hosting service and begin uploading images, music, and any other media onto the remote server, alongside the HyperText documents which linked to them. Just as it had always been, we’re talking about files on a magnetic (now solid-state) drive being sent from one computer to another. But the HyperText files being copied contained the interface for accessing the files, and the links to other domain names which might send the web browser along to transparently connect to other IP addresses. If what you wanted to publish was writing, then, of course, the HyperText web pages themselves would be the content on offer.
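Stripped of the browser, fetching a web page is just asking a remote computer to copy a file of HyperText over the network; a minimal sketch using Java's built-in HTTP client and example.com, a hostname reserved for documentation:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Request a single HyperText file from a remote server and print the start
// of it. The "web page" arrives as ordinary text, copied drive to drive.
public class Fetch {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("http://example.com/index.html")).build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        String body = response.body();
        System.out.println(body.substring(0, Math.min(200, body.length())));
    }
}
```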
The world wide web was an unexpected, explosive disruption of the carefully planned structure of cyberspace created by Silicon Valley entrepreneurs. America Online had set up their version of cyberspace to be addressable by AOL Keywords; many of the television commercials I grew up with would tell audiences to use an AOL Keyword instead of providing a domain name (or a “dotcom”). Once the early online service providers surrendered to the web, it took more than a decade for control of online space to be wrested back from the chaotic, sprawling madness. Everyone and their mother was running out and creating their own web page, independent of any sort of centralized authority, outside of any single legal jurisdiction or set of laws. One might pay for a web host in any country whose laws allowed for computers to serve the content you wanted to offer. Many such countries lacked legal mechanisms for enforcing such terrestrial concerns as copyright, libel, decency, or civility.
I decided to get a website as soon as I got my first real job and credit card. Originally it was clintonignatov.com, but given all the internet safety advice about obscuring one’s identity, I compromised with the purchase of clintonthegeek.com. I don’t remember using it for much except as a domain name for my own email address — something which bit me in the ass when I forgot to renew the domain, stopped receiving mail, and lost control over many of my online accounts. The moment the domain expired it was bought up by a domain harvester: a company whose entire business model is to extort money out of poor schmucks like me by selling us back our own domain names for hundreds of dollars. I chose the cheaper, more difficult route of changing my account registration at countless online services and emailing everyone I knew with an address change rather than reward such scummy behaviour (not that I could exactly afford it, either).
Web surfers may have all of the mechanics of their cyberspace hidden beneath the hypertextual interface of links and embedded media. But at least the people who created websites, like myself, knew far more about the mechanics of what was really going on inside the simulation. Uploading all the media onto the remote web hosts, placing it within properly-designed HTML documents, and setting up the database connections for the dynamically-generated hypertext documents of Content Management Systems all revealed to website administrators the familiar desktop-computer and networking fundamentals which made everything work. The internet is just a whole bunch of computers wired together with an uninterruptible, military-grade way to always reliably find one another and exchange data. Atop this framework, the world wide web blossomed as a totally decentralized “platform” for unencumbered publication by anyone; a free global press.
But now, an entire human generation later, the internet is largely back in the hands of a few corporations. This was achieved not by technical means, but by social ones — namely, the ignorance of users as to the nature of their cyberspace. Nowadays the easiest pathway for creating digital information is certainly not to create and maintain one’s own website. The re-branding, writ large, of cyberspace as “social media” has been an overwhelmingly successful effort by Silicon Valley to recapture what the Web took from them: control as the centralized administrators and publishers of the online presence of internet users. Enticed by easy-to-use, free services for hosting video, pictures, weblogs, email, and all the other commonly created user content formerly self-managed by website owners, users of social media relinquish ownership and control over their personal “sites” on the web to corporations who derive value from everything online being kept neat and tidy, easily indexable, analysable, and governable. Why bother setting up your own website when a half-dozen social media “platforms”/“apps” will give you a page to publish your material on? Personal web pages might still exist, but they are pushed down and out of the popular, heavily-marketed cyberspace of “Social Media” into the retroactively-coined “deep web”.
This long con has relied largely on the semantic shift from the world wide web as the singular, whole “platform” to the siloing of individual websites as “platforms”, each of which can be conflated with a discrete computer interface which runs outside of the web browser. On an iPhone, you needn’t open Safari to check your Facebook “page” — so is it still a web page at all?
Of course, before social media inserted itself as the default entry-point into cyberspace, replete with all the consumer protections expected of modern service industries, the “deep web” — a name ripe with mysterious and nefarious insinuations — was simply the web. And I, as an internet user in the early years of this millennium, was surfing it without the protections and promises of purity, security, and safety of a corporation like Twitter or Facebook pledging to shield me from anything harmful or subversive. Instead I had a sprawling catalogue of bookmarks and many diffuse communities where word-of-mouth would offer new rabbit holes for discovery.
Much of the web consisted of outlets for ideas which would be unpublishable in the realm of traditional, analogue media. This might mean ideas which are reprehensible and pushed underground onto web servers beyond regulation; or, more often, topics which are so niche or fringe that they have target audiences of merely dozens or hundreds of people. But it also includes the rambling, incoherent political or spiritual manifestos of wannabe cult leaders who prognosticate their all-encompassing theories on the true nature of reality and salvation. Pity the fool embarking on that Quixotic crusade.
Perhaps the most well-known example was Gene Ray’s Time Cube at timecube.com. The site was popular in Fark.com comments; a common punch line. It remains the archetypal display of madness online; genius gone awry in spectacularly public fashion. Screaming into the wind, Ray’s endless page of center-justified, variously coloured and accented blocks of text is the epitome of crazed self-deification. The site was ever-changing, laying out his theories and decrying the blindness of deceptive academics who deprive children of his perfect model of four simultaneous 24-hour days of cubic time. To read the site leaves one marvelling at the energy and creativity of someone who is either completely serious and totally insane, or a hoaxer of boundless dedication to his craft. The indeterminacy of this enigma from the site alone led Ray to internet fame, resulting in an MIT lecture and an interview on TechTV where it was revealed that he was, in fact, completely sincere and writing in earnest. He had interpreted the uproarious applause and hoots of laughter from the incredulous MIT students as unconditional support, telling the interviewer they had “treated him like Einstein.” While he remained a laughingstock of the internet, Gene Ray’s tragic life was extensively catalogued in documentary footage shot by his one true believer and disciple, Richard Janczarski, and their story is respectfully told in an episode of Fredrik Knudsen’s Down the Rabbit Hole.
The information age has come upon us quickly, and all that we see is the high-level interface, flattened, of links and buttons to be clicked and poked, maze-like, leading from one realm to another. The logic, the structure, the reality of what undergirds Cyberspace remains elusive and nearly unimaginable for most. In this drifting, shifting mental space of dynamic symbolic logic lie the words and thoughts and even lives of countless people bouncing off of one another, fighting and loving, and passing like ships in the night. While our eyes and ears exist to make sense of the real world, they struggle to make sense of what lies beneath all appearances of the digital. Know that it is the collaborative effort of millions of minds, accrued over decades of engineering — made of knowledge not lost to either time or discovery. And when it’s hard to tell what is real or fake, remember to take the time to step back and make sure you know where you are, and who you are with in the world your senses were designed to perceive.