IF Capital, part 2: NOT attending Virtual Futures

Click here to read Part 1.

Part 2: NOT attendance

My generation will be the last to remember life before The Event.

We review a nostalgic montage, played on a VCR with misaligned heads, colour distorted and flickering: we see a parent and child run through the rain to catch the bus home, hoping to get back in time for a favourite TV programme or the football results on the radio; a boy agreeing, weeks before, to meet at the landmark in town at 3pm on Saturday, and then, weeks later, turning up, waiting in the cold, stamping his feet and gazing about, wondering if the other will show, and then — accepting but despondent — finally drifting away, wondering if the date or time was wrong, looking back forlorn for one final check, but no; a young man, over different days, rushing eagerly to check the post that flops onto the doormat only to halt, disappointed, at the absent letter, and then the hoped-for day arrives and he reaches to the floor elated, picks up the envelope and excitedly rips it open, and sits to devour the letter inside, which, once consumed, spurs immediate action: he pens a response, taking care to craft just the right words to evoke just the right intellectual and emotional response, desperately striving to compress time and space with the force of feeling; a young woman, sitting on a stair, plastic receiver in hand, smiling and talking, the odd laugh, straining to hear the crackling long-distance voice fraught with static and expense; and then small groups, of variable composition, conversing across tables in public and private houses, alternately sober, drunk or high, upon all subjects under the sun, drawing upon their fallible and finite memory to contest matters of fact, and then the wrinkling of brows, the pointing of fingers, and suddenly, breaking through this stream of ignorant repetition, a well-prepared and embarrassingly earnest interlocutor stands and turns to a shelf to pick out and then leaf through a copy of the Encyclopædia Britannica mid-conversation, whereupon the group falls silent in anticipation, if not awe, at the impending dissemination of knowledge; and then scenes upon scenes of newspaper reading in the street, at work, in the home, and then scenes upon scenes of people simultaneously sitting in lounges watching broadcast TV, home in time from the rain and cold, with whirling headlines flickering across technicolour faces that turn to reflect and comment to others sitting adjacent, mere feet away, ignorant of the same domestic scene playing out with fractal repetitiveness; the ground now disappears below us as the camera zooms out and up into the sky, the effect bad, perhaps even hand-drawn, and the VCR playback wobbles vigorously and the sound distorts as the nations of the world shrink, to reveal an image of the globe connected but only sparsely and shabbily so, and there, like hotspots on a medical scan, invisible to our normal senses but present nonetheless, are tiny pockets of computational intelligence beginning to multiply amongst us, and between us, growing slowly but pervasively, crawling silently and inexorably into our social relations.

Everything before was more intimate, more parochial, more disconnected and less automated. Looking back, after The Event, it’s hard to imagine how we got anything done at all.

No internet. And then the beginning of the internet. 

If you’re reading this in the future, its name has probably changed. What do you call this cybernetic merging of human and machine now?

But on the cusp, in the mid 90s, when The Event had both happened and not happened, even Computer Science departments, the proximate source of this new emanation of Turing’s revolution, were almost as clueless as the general population about the impending transition.

The Computer Science department at Birmingham University, where I was pursuing a PhD in Artificial Intelligence, was probably typical. The “information superhighway” was just one amongst many things of academic interest to ponder. And as the intellectual labour indeed pondered, staring ahead into monitors, it hardly noticed the manual labour taking place beneath its desk. A brief mumbled thanks, hardly any eye contact, and suddenly Ethernet cables snaked and tangled, sensing with their blind heads, towards the contact points between hitherto isolated computing boxes. LEDs lit up, signalling liveness, as each isolated computational device was jacked in.

Upon this new topology new creatures thrived. A novel arrangement of bits, called the browser, began to spread, yes, just like a virus, from one computer to the next, with an infection rate that guaranteed exponential growth. (Everyone, back then, was immensely pleased with the new virus analogy.) The (“rhizomic”, in order to foreshadow) growth began to fundamentally change everybody’s daily habits, the very way we lived our lives.

I remember watching, puzzled, as other postgrads used Lynx to explore hypertext-linked documents. Admittedly, it was of passing interest that the documents were hosted on other computers, but so what? Bulletin board systems accessed via telephone modems were nothing new. They had been around since the 70s. But then I found myself acquiring a daily habit of using the friendlier Mosaic browser running on a Unix workstation. Soon I had hacked up my first homepage in HTML, which could, in principle, be seen by anyone in the world. This mere possibility was sufficient narcissistic reward. I landed on the earliest version of Amazon’s homepage, which boasted 1 million books for sale. I didn’t purchase anything, lacking either a credit card, money, or trust in the new.
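To give a flavour of what “hacking up a homepage” amounted to in the mid 90s, here is a minimal sketch of the kind of HTML such a page might have contained. The title, text and link are purely illustrative, not a reconstruction of my actual page:

<!-- Illustrative mid-90s personal homepage: no CSS, presentational attributes only -->
<html>
<head>
  <title>Homepage of a PhD student</title>
</head>
<body bgcolor="#ffffff" text="#000000">
  <h1>Welcome to my homepage</h1>
  <p>I am a PhD student in Artificial Intelligence.</p>
  <p>Some things I am thinking about:</p>
  <ul>
    <li>Machines and emotions</li>
    <li>Agent architectures</li>
  </ul>
  <hr>
  <a href="research.html">My research</a>
  <address>Last updated: sometime in the mid 90s</address>
</body>
</html>

Pages like this were hand-edited in a text editor and typically dropped into a public_html directory on a departmental Unix machine, making them visible, in principle, to anyone in the world.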

The internet was so small, and innocent, that one could explore its new geography with the spiritual balance and healthy detachment of the archetypal Californian who, with foamy delight, plays amongst the natural splendours freely gifted by the sun, sea and surf. In the mid 90s we therefore “surfed” the internet. By and large, everyone, in the beginning, was just happy to be there.

We wanted to explore. We wanted to see. We wanted to connect. We loved it. And so, step by inexorable step, it grew. The new forces of production escaped from university departments and commercial R&D centres. At which point the ideological superstructure woke up.

The BBC, not known for championing the avant-garde either in art or technology, had been broadcasting the TV series The Net, which earnestly explored the first cultural judders of the shifting technological plates. Obviously something important was happening but no-one knew what it really meant or where it was all heading, and so it made perfect sense, by the messy power of associative reasoning, to throw onto the screen anything connected with computing. 

The cultural confusion, excitement and bewilderment were sufficiently high that my supervisor, a Professor of Artificial Intelligence, was invited onto the programme, in early 1997, to talk about the future of AI and robotics. Sadly, I cannot find an extant online version. On the cusp many things had to be left behind, undigitised. Future historians will regard the 90s cusp as something like an event horizon. Behind it information is trapped, and rarely escapes.

The Professor, perfectly filling the role of the intrepid thinker who has escaped the territories of received opinion, bravely and dispassionately asserted the inevitability of future AIs being more intelligent than us and morally superior to us, and that they would therefore naturally take over and put us all in cages, no doubt for our own good. At the same time, and as an illustration of the coming AI apocalypse, an experimental AI demo, which I had coded, was prominently displayed. Many seconds of national terrestrial TV time, and therefore an astonishingly high quantity of human computational power, were suddenly and simultaneously devoted to visually processing the meaning of some of my code.

Perhaps I should have been pleased. A career narcissist could have plastered “as seen on TV” over their PhD work and built their academic thought-leader brand. But that would have been desperately uncool. And anyhow there was a problem, which caused me discomfort: even by the graphical standards of 1997 computer science departments, my demo was fucking ugly and, from the point of view of communicating any substantive scientific content, a complete failure — and therefore best forgotten.

But for reasons that will become clearer later, this demo cannot yet be forgotten. What it means will have to wait. To give you some idea, though, I must re-experience a modicum of shame, because a small digital footprint did make it through the event horizon, and I have resurrected it. This is what the demo looked like:

Early evidence that AI will replace us.

For now, just watch and note your reaction. Give full rein to your judgement. You have been told that advances in AI imply the end of humanity as we know it. And then you are shown this. Presumably the millions of UK viewers watching TV in February 1997 came away with a higher opinion of the unique powers of humanity compared to AI. But, it must be admitted, with a lower opinion of the powers of PhD students.

You may understand, therefore, why I decided to repress, rather than trumpet, the whole literal episode.

On the cusp, cultural aberrations were rife.

A more significant aberration, within academia, drifted into my sensory cone. I was wandering around stupidly, much like the little AI-letter-agents above, in the murky corridors of the Computer Science department when I spotted a flyer for a “Virtual Futures” cyberphilosophy conference. This was either 1994 or 1995. The conference flyer promised discussion on cybernetics, techno music, feminism, cyberpunk, the internet, hacking, bio-computation, cognition, cryptography and capitalism. This was not the typical highly specialised academic conference. This was a bag of jewels mixed with rocks and (let’s be honest) dried shit. I didn’t know it at the time, but the CCRU, a central protagonist in our story, was heavily involved in Virtual Futures.

The conference poster screamed loudly that it was about The Event, about what was happening right now. To a young Marxist, and a postgrad studying how machines would and could have emotions, this seemed right up my street. Very cyber. Very philosophy. So I had to be there.

However, I was also skint, socially diffident and indifferent, and in most practical senses really quite inept, and therefore the logistics of traversing the thirty miles from Birmingham to Warwick, and of organising somewhere to stay, seemed, to me at least, both expensive and insurmountable. Obviously I had no car and no savings. Unnoticed by me at the time, for I was typically semi-conscious due to ignorance and recreational drugs, but incredibly fortunate in retrospect, Birmingham University was paying all my tuition fees and all my living expenses. Perhaps today it would be called a scholarship. Back then, as someone who had also received a state grant as an undergraduate, it seemed obvious and natural that the nation would pay for people to study science. On the cusp some remnants of political reformism persisted. Anyhow, due to personal character flaws, and the relative generosity but also absolute stinginess of my PhD grant, I didn’t bother going.

So I didn’t attend. I wasn’t there.

And this is the first of multiple examples of how CCRU’s story, and my own story, hardly intersect at all. The next time nothing happened was when I did not meet Sadie Plant.


Click here to read Part 3.
