With their promise of highly efficient data management, algorithms are increasingly being used to observe and analyse our behaviour. Yet these same algorithms threaten to undermine democratic decision-making and individual autonomy. In the political world of tomorrow, algorithms would predict our fates and thereby render them inevitable. To resist this trend, the group Laokoon proposes theatrical self-observation and has developed a digital platform on which we can observe ourselves being observed by the observers.
Shortly after midnight on 30 December 2019, the Canadian start-up company BlueDot issued a travel warning. The reason: an increase in “unusual pneumonia-like cases” detected near a marketplace in Wuhan, China. Ten days later, the World Health Organisation finally released an official statement confirming the outbreak. How was it possible that a start-up company could predict the threat so much earlier? The answer is Big Data. Self-learning algorithms can analyse data from countless sources, including statements by official health organisations, digital media, global flight data, reports on livestock health and demographic population data. Based on patterns detected by these algorithms, it is possible to make predictions. Using so-called “predictive analytics”, technology companies hope to conquer the future. Couldn’t we avoid future crises if we placed our fate in the hands of artificial intelligence instead of putting our faith in error-prone and imprecise human guesswork?
There are many areas in which algorithmic predictions are used today: market research, healthcare, financial services, the security sector and, not least of all, climate research. If one believes the stories coming from Silicon Valley or Shenzhen, there are practically no limits to the clairvoyant power of algorithms. Google’s partner company DeepMind aims to predict individual health risks, the progression of disease and times of death, while another subsidiary in Mountain View claims to predict who is likely to commit suicide in the future. Around the world, police departments are using software to predict where the next murder in their city will take place. Using GPS data, algorithms can predict where people will congregate before they actually do and can even calculate the outcome of wars.
Algorithmic predictions normally draw on present and past patterns to generate likely future scenarios – as if past events could stretch seamlessly into the future. But the future is not a linear extension of the past; its course is volatile and unpredictable. The current and ongoing pandemic has clearly demonstrated this fact. Even if it were possible to view the future with a telescope, and the viral threat prognosticated by BlueDot was no mere “flash in the pan”, we have seen that prior knowledge about the disastrous consequences of our behaviour (e.g. climate catastrophe) by no means guarantees a change in behaviour.
Laocoön, the Trojan prophet and namesake of our artists’ group, could tell us a thing or two about predictions. His warnings about the Greeks’ treacherous gift fell on deaf ears, and the Trojan horse became a symbol of hidden danger forevermore. According to the myth, the Trojan priest was a tragic hero whose warnings were wrongly ignored. The moral of the story could also serve as the blueprint for the narratives propagated by tech companies’ marketing departments: the complexity of the globalised, capitalist world is not to blame for the unexpected impact of financial crises, famines or pandemics, but rather our lack of faith in the predictive capabilities of algorithms.
Yet the myth of Laocoön can be interpreted very differently, as in the telling by the Roman poet Virgil. In his version, Laocoön’s prophetic visions are described merely as a symptom of an underlying power struggle among the gods: Laocoön was granted the gift of spectacular foresight in return for his service to them. Only after learning this backstory do we discover that the magical revelation is part of an obscured, power-political conflict. Just as Laocoön’s vision of the danger within the horse’s wooden hull was futile in preventing the fall of Troy, so too is it futile to poke holes in the algorithmic black box today. Those who demand transparency when it comes to algorithms are mistaken if they think that revealing the code would solve the problem. Open data sets alone do not change the fact that scouring and further processing them requires highly refined algorithms which remain in the possession of billion-dollar corporations. To expose the supposedly immaterial character of the digital, it is necessary – as with the wooden Trojan horse – to tell the story in which its manifestation is embedded, driven by outside interests. Today, that is a story of private corporate influence on the political process, disguised by the (digital) spectacle.
Myths have the ability to empower or disempower – in any case, they create realities. It is therefore critical that we denizens of the digital age do not relinquish prophecies and their mythical power to tech companies. As artists, we can do our part by revealing how reality is fabricated – we, too, can look a gift horse in the mouth and recognise that this generous gift stinks to high heaven. The same applies to the free services offered by Google, Facebook and Co. Naturally, each of our artistic narratives can convey only one of many possible realities. We assume the position of the observer without denying the contingency of that position. In the terms of systems theory, contemporary theatre is about observing one’s own observation: when people watching a performance experience their own excitability, they become reacquainted with themselves. But self-observation is, and has always been, an observation which creates itself tautologically. In theatre this may be relatively harmless, but the smartphones, laptops and wearables that monitor our behaviour also permit us to observe ourselves, which in turn lastingly reinforces our self-perception, identity and understanding of reality. This aspect is often ignored in the presentation of supposedly incorruptible, objective data. Although our personal data are derived from the observation of our behaviour (i.e. from one possible observation), they also prompt future action (attributed to us as uniquely authentic). What begins as a description becomes an impetus. In this way, algorithmic predictions can become self-fulfilling prophecies: when we follow recommendations supposedly customised to us, we create the impression of having acted in accord with our own personality.
As artists in the digital age, it is no longer enough to enable viewers to observe their own observation; we must also point to their observation by digital entities – the data companies and the products and services which lead us to produce data, and the cookies and tracking methods that allow companies to observe practically every aspect of our behaviour. From a theatrical perspective, it would be interesting to let our audience observe the digital entities observing us.
In our current project, we are collaborating with programmers to develop a digital narrative format which allows visitors to our storytelling website to witness the instruments and processes of surveillance in action. The purpose of our digital reportage is not to inform users of the dangers of surveillance technology by way of anecdotal storytelling. Rather, the website aims to present the disparate and unremarkable moments of digital observation in meaningful relation to one another so that they become observable by the observed. Ideally, the feedback loop will twist even further, allowing the audience to watch themselves observing how they are being observed. This raises the question: how does it feel to be degraded to a granular data producer and product? What happens to me when my personal weaknesses, anxieties and desires become the focus of commercial microtargeting or political wrangling – for example, when tailor-made messaging encourages destructive behaviour? How can I tell whether my own data is fuelling an industry which reaps profit from those with psychological problems, addictions and illnesses? By watching ourselves observe others who are observing us, we have the chance to become aware of the blind spots and patterns of perception which data companies specifically exploit. Indeed, on our digital stage we are addressing one of the most crucial aspects of human existence – our autonomy.
We stand at the threshold of the “clairvoyant society”, in which personal behaviour can supposedly be individually predicted and monitored. The phantasm of a foreseeable future has crept into our belief in the malleability of the future. Sociologists have repeatedly noted that between the present future and the future present there is scope for action. Despite predictions to the contrary, it is in our power to shape the future. The mantra-like invocation of the predictive capacities of artificial intelligence can create the impression that technological progress is predestined and that we humans must resign ourselves to it. But like every mythos, its power dissipates when we no longer believe in it. How advanced is a world in which we no longer are, but rather will always have been? We can defend the idea of human autonomy only by calling into question the myth of algorithmic prediction itself. And theatre is the best place to do so. Instead of observing ourselves through the eyes of the digital industry, allowing it to define us while we obediently play our role as data carriers, we can choose – again and again – to define ourselves with every surprising self-observation. We can watch others observe us, share our observations, compare them with one another and debate them. To create sensual occasions to this end – that is our task as artists. The digital stage is open to us.