By Jo Lindsay Walton
“[…] had the it forow sene […]”
— John Barbour, The Brus (c.1375)
If this is an awful mess… then would something less messy make a mess of
describing it?
— John Law, After Method (2004)
❧
I am in the Attenborough Centre for the Creative Arts. It is my first time in the Attenborough Centre for the Creative Arts. It’s a pretty good Centre.
It’s Messy Edge 2018, part of Brighton Digital Festival. The name “Messy Edge,” I guess, is a play on “cutting edge.” It must be a rebuke to a certain kind of techno‑optimism – or more specifically, to the aesthetics which structure and enable that optimism. That is, the image of technology as something slick, even, and precise, which glides resistlessly onward through infinite possibility. If that slick aesthetic has any messiness at all, it’s something insubstantial, dispersed as shimmer and iridescence and lens flare.
My mind flicks to a chapter in Ruth Levitas’s Utopia as Method, where she explores the utopian presence that pervades the colour blue. Blue sky thinking, the blues. Levitas never mentions “blueprint,” and now I’m wondering if that’s deliberate? – an essay haunted, textured, structured, enabled, by its unuttered pun. Like how no one ever asks BoJack Horseman, “Why the long face?”
Utopia as Method – my copy anyway – is blue.
❧
Many artists are really awful at talking about their art. Some artists, I suspect, do this deliberately. Or at least, their incompetence comes from stubborn adherence to something disordered and convoluted, to something in their work that would vanish from any punchy soundbite. I like them, these artists who are really awful at talking about their art. “Awful” – filled with awe?
By contrast, the digital artists at Messy Edge are, by and large, very good at talking about their art, and about the political context of their art.
OK, I like them too.
In fact, as the day progresses, just once or twice, I forget that I’m at a digital arts festival. Speakers are so sharp when they talk about the social and ethical context of their work that I half-expect the conversation to turn to wistful or wonkish proposals of legislative reform. Or to a rallying cry for collective actions, or an insider critique of ongoing collective actions.
Something keeps building in the excitable cells of the discourse, but when it spikes, it spikes toward art.
❧
Laurence Hill, director of Brighton Digital Festival, gives us a deadpan and dubious origin story. A dove-laden soft-focus dream sequence, à la Eighties pop ballads. “I woke up and wrote that shit down and Messy Edge was born.”
❧
Nye Thompson talks about Backdoored. The searchbots of Backdoored gather images from unsecured surveillance cameras and add them to an online archive.
Rough plaster wall dominates this image, with a column of brownish shadow. There is a child, head hung, face half in shadow.
Nye’s bullet-points accrue, like a veil being drawn down.
These images, Nye explains, are solely down to the programmatic imperatives of the searchbots. There is no image framing, no formal composition. The relationships between the “photographer” – that is, the algorithm and its hardware – and the subject, between the viewer and the viewed, are transformed.
So it’s a project exploring (self-)surveillance and the natureculture of the machine gaze. But as our cities get smarter and our Internet gets Thingier, Backdoored also looks to me like a straightforward cautionary tale about the consequences of putting network connections in everything. Cars, fridges, sex toys.
❧
There is a rough theme (a messy theme) to today’s event: the right to be forgotten. Some speakers touch on the theme, others ignore it, others flip it over and talk about the right to be remembered.
The right to be forgotten and the right to be remembered: they could be part of the same thing, the right to some kind of representational autonomy. That incorporates issues of privacy and data security, of course. But it also opens up a broader set of questions.
We tend to slip into property metaphors to understand the power we feel we should have over representations of ourselves. Representations are mine, or at least of me. Those property metaphors never fully work. Anything that represents me – whether I’m the creator, or someone else is – is nourished by a facticity outside myself. It can’t belong to me, not absolutely. Not even, I think, my face in my mirror. Not even my face in my mirror mirrored in my eye.
❧
Nye says most of the people in the Backdoored images were looking at another screen, or two or three screens.
In this case, the image is of a horse in a dapper coat. Staring at a tail-light. Which, I suppose, is screen-like.
Another.
Another.
A security camera grabs a blurry image of itself reflected in glass. Chunky, rounded edges, smeared across lawn and trees, scraps of white sky. A self-portrait?
❧
Property metaphors, to be honest, don’t work for belongings either.
❧
As Nye talks, for some reason, my mind flashes to #PunkMoney, and then to The Womansplainer. And then to The Sputnik Awards. I’m intrigued by activity which turns out retrospectively to have been art, or activity which is provisionally art … until it is not.
Not experimental art, exactly. An experiment of which “art” is one possible outcome. Possibly not the best outcome.
I think of Sarah Stewart’s poem about her posh ex-boyfriend’s family: “I marvelled at their gift // for turning near-miss into legend: / He almost rowed for Oxford, you know!”
I think of a mode of making that pours scintillating excess energy weirdly into a savage world, with the privilege of knowing that whatever “this” is, “this” can always be salvaged as an aesthetic intervention.
None of that describes Nye’s project exactly, which was art from the start. But Backdoored does share that slightly volatile, eloquently irresponsible quality – a quality I always respond to positively, maybe more than I should. Putting the “troll” in “semi‑controlled.”
The next iteration of Backdoored ran these images through image recognition AI. A gunmetal grey doggo, staring down a smooth grey pathway. Close to the lens, the barbed wire outline of fairy lights. The identification: Airfield. Airplane. Airport. Aircraft. Warplane. Battleship.
❧
Wesley Goatley picks up on a similar point, when he talks about ‘Augury,’ a collaboration with Tobias Revell. “We built a system with loads of breakage points, none of which we control. These structures aren’t seamless, and they don’t know better than us.” And breakage points are potentially also input ports – opportunities for the artwork to start behaving in unexpected ways.
Mostly, Wesley talks about the Amazon Echo. Amazon are selling these things at a loss. They want them in homes. Break the Echo open, you won’t find Alexa “in” there. The Echo listens for wake words, then it gathers sound data and beams it elsewhere. What happens next is a mystery. We don’t have access to their data centres and their analytics.
Wizardly GOAT – sorry, autocorrect – is being more subtle and suggestive than this, but put bluntly: is the Echo from Hell?
Is the Echo a diabolic familiar? It comes into your home on false pretenses, refers your wishes to these infernal data centres – often subterranean, because of the enormous heat of the calculations – where the precise surfaces of your words and actions will be pored over, and perhaps turned against you …
❧
Does that food you’re chopping sound healthy? Might that data end up with your health insurer? Should I be playing my Alexa Soothing Ambient Kitchen Soundscapes Vol. 2: Chopping Carrots?
❧
I am getting my messy edgycation.
I listen to Sharon Webb on Queer in Brighton, endangered community archives, and the dangers of replicating oppressive structures even while trying to preserve their alternative perspectives; Emma Frankland on VoiceOver Brighton, creating art from stories gathered from Brighton’s trans community, in collaboration with Umbrellium, and fostering a particular structure of various forms of publicness, communality, privacy, and intimacy; Rhiannon Armstrong on the accessibility of jerky, flashy gif culture and her own Slow GIF Movement …
❧
There is another, slightly less overt theme to the day: algorithmic governance.
I guess it’s inevitable. Inevitable because, (a) how can you talk about art and tech in 2018, or how can you talk about anything and tech in 2018, or how can you talk about anything in 2018, or in 2017, without talking about algorithmic governance?
The black boxes have blossomed in the joints of our governance structures. Algorithms curate news, social media feeds and search results; shape border surveillance and investigation; assess credit ratings; determine insurance premiums; predict recidivism; inform job probation and promotion decisions; decide who gets to see ads for jobs and other opportunities; prioritise housing and social services; plan travel routes; feed into political messaging and policy design.
We know the black boxes are among us now, and we want to break their proprietary seals, and gaze at what lies within. It could be anything! Will it be the Valkyrie Brunhild drawing back her bow, from Fritz Lang’s 1924 epic Die Nibelungen?
No! It will probably be racism.
Algorithmic governance is also an inevitable theme because: (b) algorithmic governance is closely tied to the theme of representational autonomy. Algorithmic governance is involved with the right to be forgotten and the right to be remembered. With the right, perhaps, to have certain kinds of say in how you are interpreted. In what everything that is not you makes of you.
❧
I think of Cory Doctorow, who has been trying to develop an optimistic vision of ambient computation, an Internet of Things where people aren’t just more Things. “Imagine a location service that sold itself on the fact that your personal information was securely contained in its environs, used by you and you alone. […] [A]s you passed through space, stores, toilets, restaurants, grocers and other entities would emit information about their offerings. These would be seen and analyzed by your personal network, and it would decide, on your behalf, whether to tell you about them, based on the preferences you’ve set, or that it’s learned from you. The fact that you’ve been shown some piece of information would be between you and the computers you own, and – again – shared with no one.”
❧
If the data we generate is not really our property in any pure, deep, unassailable sense, then what is it? Perhaps the logic of the gift could be useful here … Mauss, Lévi-Strauss, Derrida, that lot?
When someone gives you a gift, a fiction springs up that the person has severed all ties with the object, that you can do whatever you want with it.
But – quite apart from the obligation to reciprocate, escalate, or otherwise ritually acknowledge the transfer – you know you can’t really do whatever you want with it.
That’s why, sometimes, people put on sweaters that they don’t want to wear.
And because there are all kinds of unstated parameters, gift-giving can go wrong. There are good gifts and bad gifts, sure. But it’s more complicated than that: it’s not just about fulfilling preferences. Gift-giving instantiates a whole giddying richness of felicities and infelicities.
❧
Ominous hums and rumbles, grating.
[suspenseful sounds]
❧
Misgivings: “feelings of apprehension or doubt about the future.” Also, by way of a pun, “giving badly.” It’s not really what misgivings means, but if it were a new word, then it would mean that. Misgivings: givings-wrong, glitches in the circuitry of gift-giving, gaps that yawn open in the dense, creative progress of our ecology of obligation and counter‑obligation, of gratitude, graciousness, forgiveness, forgetfulness, revenge, remorse, recombinacy, restoration, reparation.
❧
Irene Fubara-Manuel on the colonial histories of biometric technologies, the surveillance of migration, and embodied biometric interaction at the border, and their installation work ‘Dreams of Disguise.’
There is a stereotypically machinic quality to the border official’s interrogation. Only listening for actionable language, for slip-ups, for performatives that transform you. For wake words.
Judith Ricketts talks about the aesthetics of invisibility in relation to the transatlantic slave trade and the Afro-Caribbean Diaspora (“7 people of Brighton owned 2,516 slaves in Jamaica […] At the Slavery Abolition Act, they were compensated, in today’s money, about eighty five million pounds”).
Some of Judith’s work, I think, draws on minimalist aesthetics to try to visualize the transatlantic slave trade, to create the space and the occasion for the data to speak. The film “12 million waves” slowly reveals statistics against a sparse seascape, acknowledging, in the gaps between the numbers taken and the numbers landed, the hundreds of thousands lost at sea.
The next phase of her practice will involve connecting these gaps with the bias built into training data, the algorithms that grow from it, and the agency they inform.
❧
How do we get beyond visualizing data, when visibility is not enough? Should we think in terms of empowering data, agentifying data? The datasphere witness summonses?
❧
“If Hamlet from himself be ta’en away […]”
Maybe representations of ourselves aren’t exactly our property. Maybe they are more like gifts that we give. Gifts that were never really ours to keep. Gifts that somehow must be accepted graciously.
Whaaaat?
“Accepted graciously”? What am I even talking about?
Like, haranguing young algorithms to teach them some manners on the bus?
Nagging my teenage algorithm son to write some thank-you notes?
❧
Not sure.
An algorithm is just a recipe, a set of instructions. But within critical data studies, and adjacent conversations, “algorithm” is often shorthand for algorithmic decision-making involving machine learning and big data. In other words, the term has come to stand for a particular subfield of Artificial Intelligence, focused on prophetic pattern recognition using computational statistics and oodles and oodles of information. Many of these algorithms are opaque not just because they’re protected by Intellectual Property law, or because of the special skills required to analyse them, but because their form of computation just isn’t amenable to human interpretation.
The term “black box” is a useful metaphor […] given its own dual meaning. It can refer to a recording device, like the data-monitoring systems in planes, trains, and cars. Or it can mean a system whose workings are mysterious; we can observe its inputs and outputs, but we cannot tell how one becomes the other. We face these two meanings daily: tracked ever more closely by firms and government, we have no clear idea of just how far much of this information can travel, how it is used, or its consequences.
— Frank Pasquale, The Black Box Society: The Secret Algorithms That Control Money and Information
IN ALGO WE TRUST.
Earlier this year, the New York City Council introduced a local law intended to promote algorithmic transparency. Sitting today in the Attenborough Centre for the Creative Arts, my instinct is … and I could just be projecting, but … my instinct is that the critique of algorithmic governance has attained a kind of strength and maturity. We are mostly on the same page, and a lot of people are looking over enviously at our page. The strong misgivings about the social use of algorithms have won key parts of the argument. The days of uncritical zeal for the coming algocratic utopia-dystopia are fading fast.
❧
When you look at somebody, that’s not necessarily your big “in” to be like, “Excuse me, I believe this impression I have received in my visual cortex belongs to you. What would you like me to do with it? How would you like to be remembered and/or forgotten?”
If the personal dimension of data is about more than property and privacy, what else might it be about? Perhaps the logic of the signal or the flow could be useful here. Bergson, Shannon, Deleuze and Guattari, Alaimo? A cheeky Csíkszentmihályi? Maybe sometimes the onus of stewardship is not to guard and inspect, but to modulate and pass on. Digital sociality asks us to think through the ethics of signal-boosting. Signal-boosting certainly can’t be fully explained with reference to what we ‘agree’ with or what we ‘like.’ Many Twitter bios carry a proviso: “RTs are not endorsements.” If they’re not endorsements, what are they?
And, as someone – was it Laurence in his intro? – suggested, “Greater visibility brings greater vulnerability.”
Do greater social connectivity and increased surveillance impose the need for a richer vocabulary of signal modulation?
Where even to start?
Even volume isn’t that simple.
In audio engineering lingo, gain stages are points along an audio signal flow where the engineer can adjust the level. A gain control appears to function a little like an additional volume slider, although there is a difference: gain adjusts sensitivity to the input (boosting the signal to a level that can be adequately analysed and manipulated at the mixing desk), whereas volume usually manages the output level. Set the gain too low and there won’t be enough signal: the background hiss comes into the foreground. Set the gain too high and the sound becomes distorted, fizzy, or compressed.
Don’t quote me on that. (Don’t boost my signal).
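For fellow non-engineers, here is a toy sketch in Python of the distinction as I (shakily) understand it. Everything in it is invented for illustration – the function, the noise floor, the numbers – and no real mixing desk works this crudely.

```python
import numpy as np

# Illustrative toy only: the stage adds its own fixed noise floor and
# clips anything beyond full scale. Gain scales what comes in; volume
# scales what goes out.
def gain_stage(signal, gain, volume, noise_floor=0.01):
    boosted = signal * gain
    noisy = boosted + np.random.normal(0.0, noise_floor, signal.shape)
    clipped = np.clip(noisy, -1.0, 1.0)  # too hot: distortion
    return clipped * volume

t = np.linspace(0.0, 1.0, 44100)
source = 0.01 * np.sin(2 * np.pi * 440 * t)  # a very quiet 440 Hz tone

too_low = gain_stage(source, gain=1.0, volume=10.0)   # amplified hiss
too_hot = gain_stage(source, gain=500.0, volume=1.0)  # clipped, fizzy
alright = gain_stage(source, gain=50.0, volume=1.0)   # signal above hiss
```

The point of the toy: the stage’s hiss is fixed, so the only place to lift the signal above it is at the input. By the time the volume knob gets involved, signal and hiss are already mixed.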
❧
Romy Gad el Rab, from Hyphen-Labs, alludes to a viral video of a racist soap dispenser, as a way into thinking through the intersections of race, technology, and design.
There’s a kind of speculative fiction worldbuilding aspect to Hyphen-Labs’s NeuroSpeculative AfroFeminism project: the basic unit of creation is the alternative possible reality, which then gets articulated across several platforms. A VR experience stars you as a black female neurocosmetology professional, pioneering cognitive enhancement techniques in your sumptuous NeuroCosmetology lab, a reimagined black hair salon. Then there are the product designs: Ruby Cam door-knocker recorder earrings, which discreetly launch a recording or livestream of a police stop or other dangerous encounter; the HyperFace anti-surveillance scarf, which uses adversarial training to generate a print that can fool face recognition technology (modelled by Whoopi Goldberg nbd); and an affordable transparent sunscreen designed for black skin.
❧
David Garcia finishes off the day. Algorithmic governance is squarely in the centre of the frame now, as he explores how smart infrastructures are squeezing the space between knowledge and intervention, and the implications for civic and democratic life.
At one point he screens an ad for the short-lived Samaritans Radar app, which was supposed to analyse online activity and flag up language associated with suicidal ideation. “Who needs your help most?” asks the ad, and for some reason I find it incredibly stressful, probably not least because it reminds me of those old British Army recruitment ads which gave you little puzzles like “It’s snowing and we’ve only brought one blanket, who gets it, the medic or the radio operator?” and the answer is like “PSYCH! Give it to the tank. Also now you’re in the army.”
❧
By the end of the Samaritans Radar ad I am very much a mess and on edge.
And.
All those misgivings about algorithmic governance are back.
❧
It is a bit surprising that the algorithm accrued such cachet in the first place. Think back, if you can, to the time before the algorithm gained its current cultural prominence. What would have been the closest correlates back then? What precedents and models would we have used to understand the algorithm as it began to emerge?
One point of comparison would be the science fictional Artificial Intelligence … always prone to go evil, or ‘mad,’ or to exercise catastrophically literalistic observation and reasoning. This gets the algorithm off to a great start:
ALGO-SKEPTIC: I don’t fully understand the system you’re proposing.
ALGO-ADVOCATE: Oh me neither luckily it will soon take on a life of its own so all that will soon be irrelevant, awesome just handing over control of the airlocks for a laugh … and … we’re cushty.
FORMER ALGO-SKEPTIC: Yay
Ask the science fictional AI the square root of a negative number, or the nature of love, or to look at a painting, and it explodes – everyone knows that.
Another point of comparison would be the bureaucracy, with its reputation for obfuscation, inefficiency, impenetrability, irrational inflexibility, and corruptibility. Again …
ALGO-SKEPTIC: I lowkey question the wisdom of some of these decisions.
ALGO-ADVOCATE: Don’t worry they were generated through a bureaucratic process so they’re fine.
ALGO-SKEPTIC: Idk, true that is a massive comfort and yet …
ALGO-ADVOCATE: Wait because also it’s an incredibly complicated and opaque bureaucratic process with millions of tiny steps and an insatiable appetite for information and outcomes which sometimes alter according to as far as anyone can tell basically trivial and irrelevant information.
FORMER ALGO-SKEPTIC: Oh cool I am so on board come here black boxes I want you to eat me
So it’s surprising.
I suspect the answer is that it wasn’t from the reputation of AI, nor from the reputation of bureaucracy, that the rudiments of the algorithm’s reputation were drawn. It was from the prestige of the market.
We were prepared to embrace algorithmic governance by the Hayekian project of identifying markets with information processors – specifically with information processors uniquely capable of absorbing and compositing our tacit knowledge, of minutely counting and valuing that which we cannot consciously count and value – and by globalization and financialisation, and the concomitant co-evolution of new powerful markets with information technology systems.
❧
Zoom out even further. Perhaps we were prepared to embrace algorithmic governance by the liberal political economy’s secularisation of God’s grace as the immanent order of dynamic manifolds. As Alexander Pope put it, “Extremes in Nature equal good produce; / Extremes in Man concur to gen’ral use.” Or as the KJV Bible put it:
29 Are not two sparrows sold for a farthing? and one of them shall not fall on the ground without your Father.
30 But the very hairs of your head are all numbered.
31 Fear ye not therefore, ye are of more value than many sparrows.
Early in the day, when Wes was talking through the project ‘Augury,’ which includes an incanted fragment from Hamlet, “we defy augury,” someone from the audience – I think Caroline Bassett – pointed out that Hamlet “defies augury by dying.”
In the Messy Edge aftermath I hit up Hamlet:
HORATIO: If your mind dislike anything, obey it. I will forestall their repair hither and say you are not fit.
HAMLET: Not a whit, we defy augury; there’s a special providence in the fall of a sparrow.
Hamlet’s so smart!
Augury is bird-based forecasting. Hamlet doesn’t say “we defy augury, there’s special providence in the splat of a sheep.” Nor does he say, “we defy haruspicy, God is a details guy even when it comes to birds.”
No, Hamlet’s critique of predictive analytics is that the would-be omens – in this case, our feathered trends – are already part of the interconnected multitude that the would-be augur expects them to model. Mr-Time-Is-Out-Of-Joint gets it. You can’t instrumentalise the birds to tell your story, because the birds have stories of their own, and their story is already part of your story.
That’s why we defy augury. And that’s part of why we defy algorithmic governance. Partly, perhaps, this is a matter of Barnesian performativity or counterperformativity: the way in which the use of a model can influence what it models. Cassandra of Troy had the gift of prophecy, but the curse that no one would believe her. There is a nice science fictional rigour to that premise. By ensuring Cassandra won’t be believed, you do away with the problem that her foresight influences the future, and the necessity to knit all the history in her vicinity out of closed causal loops.
Admiringly, I zoom out even more:
HAMLET: It is but foolery; but it is such a kind of gain-giving as would perhaps trouble a woman.
HORATIO: If your mind dislike anything, obey it. I will forestall their repair hither, and say you are not fit.
HAMLET: Not a whit, we defy augury. There’s a special providence in the fall of a sparrow. If it be now, ’tis not to come; if it be not to come, it will be now; if it be not now, yet it will come. The readiness is all. Since no man has aught of what he leaves, what is’t to leave betimes?
Hamlet is such an emo fuckboy.
❧
And gain-giving? I’m going to have to look that up.
Samuel Johnson quite reasonably defines gain-giving as “gainsaying,” i.e. contradicting, and he quotes Hamlet.
The OED has gain-giving as “misgiving,” offering Hamlet as the earliest example of this sense.
In modern usage, the word misgivings describes a kind of attitude to the future, or a mode of futurity, and it carries some stimulating nuance. For a start, it’s almost always plural. So it suggests several data points, albeit maybe as-yet unweighted. The word misgivings is also equivocal between two senses. First, “a funny superstitious feeling I have about something.” Second, “a bit of evidence about someone or something, although it may or may not be relevant, and I probably don’t want to express it out loud right now, potentially because I still want to give the person or thing a chance without pre-judging and perhaps slipping into a self-fulfilling prophecy.”
That is, misgivings are seldom a reason to call something off entirely: they are a reason to go ahead in some qualified way. Misgivings imply that a skilful dialectic of heart and head may lie ahead. They are a reason to go ahead, but with a particular attitude and attentiveness, a particular tact and care, probing for a particular sweet-spot of sensitivities that can lift the signal from the background hiss. That particularity is what constitutes the misgivings. This kind of incipient intelligence, “as would perhaps trouble a woman,” is something Hamlet sweeps aside, in a gesticulation of run-down stoic machismo tinged with Machiavelli’s Fortuna-chirpsing. He sweeps it aside as though it were an alternative to “readiness,” whereas really it is an indispensable component of readiness.
❧
“Maybe sometimes the onus of stewardship is not to guard and inspect, but to modulate and pass on.”
Remembrance is often like that: you don’t seal up experience in the big warehouse of Remembered Things; you just give your experience a certain salience and colouring and let it flow past. Will you encounter it again? Maybe.
❧
Hamlet doesn’t say misgivings. He says gain-giving. The word giving in early modern English can mean “intimating” or “telling.” Gain-giving could plausibly suggest something you’ve been told about the future – perhaps by heart, heaven or hell – from which you might be able to gain some advantage. A more credible option, though, would be to understand gain-giving as a kind of truncation of against-giving. In this period against certainly often has its familiar associations of conflict, so gain could be heard like the mis of misgiving, and gain-giving could be a kind of “malign telling.” But against could also more neutrally express “motion toward” something, whether in space or in time. That sense seems still to be preserved in modern phrases like “gaining on [someone you’re chasing]” or “saving against the day [of buying something expensive].” This means that gain-giving could connote a kind of “toward telling,” a premonition that is not necessarily negative. And there might even have been faint positive associations to the word gain: the gainest course would have meant the shortest and most straightforward way; to be gainly would have been to be un-ungainly: that is, elegant, well-disposed, proper, becoming, suitable, fitting, gracious … full of grace.
❧
Anyway.
I’m a bit suspicious of everything that is easily googleable about the word gain-giving.
Before Hamlet, gain-giving appears to have nothing to do with premonitions. It carries the more intuitive sense of “again-giving, giving in return,” which also parallels words like gaincall (to call back) and gaincome (to return).
Gain recognise gain.
But the pre-Hamlet example in the OED is also the only one I can find. It’s from Barbour’s The Brus, where it’s part of a lament that the people of Scotland should trust Edward I – a grasping monarch who “gryppyt ay, but gayne-gevyng” – to arbitrate their disputed succession. I think it means “gripped ever, never giving back” … but it might not mean that. It’s a little murky, and the context is all about foreseeing or failing to foresee the future.
❧
The customary way of formulating what can go wrong with algorithmic governance is, “If the data is already biased, then the outcomes are biased.”
OK, that’s workable enough. It’s a sufficient basis to call out and resist the replication, deepening and multiplication of inequalities through technological redlining.
But it has limitations. For one thing, it doesn’t capture the challenge of integrating algorithmic agency within a more classic ontology of governance. You know: humans interacting through roles, responsibilities, rules and authorisations, and an array of audit and accountability forms, stuff like that. To cultivate, within algorithmic agency, some kind of semantic or representational granularity, one which can sprout ligatures into “stuff like that” – into the kinds of structures and practices that humans inhabit and feel at least a little bit at home in – could be pretty seriously morally fruitful. That is, the older governance (analog, symbolic, molar) has often been plagued, just as the new governance is (digital, connectionist, molecular), by coordination breakdowns, by the space collapsing between knowledge and intervention, by accountability strewn into airy nothingness. Algorithmic agency could reveal to us our senseless systemic violence and bungling in fresh ways. It could force us to come up with solutions to things we’re not even accustomed to thinking of as problems. So maybe critics of the social use of algorithms shouldn’t win our arguments too quickly and too decisively.
The other problem with “If the data is already biased, then the outcomes are biased” is that it doesn’t leave enough room for reality to be “biased.” One high-profile case of prejudiced algorithmic agency is COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) which purports to assess the risk that someone will commit future crimes, based on 137 data points – including factors such as education, employment, and a questionnaire that asks stuff like, “Do hungry people have a right to steal?” – and secret proprietary weighting and calculation. A study by ProPublica revealed COMPAS’s racist bias according to one standard (when it makes mistakes, it favours white defendants over black). COMPAS’s makers, Northpointe, rejoined that their tool was impartial according to a different standard (for any given risk score, the accuracy is the same whether you’re black or white). These two standards are both indispensable, and they are mathematically incompatible unless the rearrest rate happens to be the same for both black and white defendants – which right now, it isn’t. The algorithm can try to cram a racist social reality into a front-end of impartial formalism, but the inequity in the data will always bubble up elsewhere in the model.
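If you want to see the incompatibility rather than take it on trust, here is a back-of-the-envelope sketch in Python. The numbers are invented for illustration, not COMPAS’s actual figures: hold the tool’s true positive rate and its positive predictive value (PPV) fixed across two groups, and different rearrest base rates force different false positive rates.

```python
def false_positive_rate(n, base_rate, tpr, ppv):
    """Derive the false positive rate for a group of size n, given its
    rearrest base rate, the tool's true positive rate, and its positive
    predictive value (Northpointe's standard of fairness)."""
    positives = n * base_rate   # people who will be rearrested
    negatives = n - positives   # people who won't
    tp = tpr * positives        # correctly flagged high-risk
    flagged = tp / ppv          # a fixed PPV sets the total flagged
    fp = flagged - tp           # wrongly flagged high-risk
    return fp / negatives       # ProPublica's standard of fairness

# The same hypothetical tool (TPR = 0.7, PPV = 0.6) applied to two
# groups with different base rates:
for name, base_rate in [("group A", 0.5), ("group B", 0.3)]:
    print(name, round(false_positive_rate(1000, base_rate, 0.7, 0.6), 3))
# group A 0.467
# group B 0.2
```

Same tool, same calibration, same hit rate – and the group with the higher base rate absorbs more than twice the false positives. This is, in miniature, the impossibility result formalised by Chouldechova and by Kleinberg, Mullainathan and Raghavan.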
❧
Throughout the day, those moments where art appears to be presented as the answer, however provisionally, in whatever modest pattern of fragments, and with however heavy a hint that it is not really an answer, but a 42-ish incitement to the insight that I don’t actually have a question – those moments leave me … not crestfallen exactly. Cresterect, in conditions of crestprecarity.
❧
There is being-toward-others, but there is no being-toward-evil-supercomputer.
In the digital era, just like the pre-digital, I think that as we give away our symbols and data – casting off atom-thin phantasma, likenesses, disjecta membra – we have certain expectations about what should become of them. There are upper and lower bounds here: our wishes do go with them, but we don’t want those wishes fulfilled with diabolic meticulousness. We want to relinquish our appearances. We don’t want everything that we seem to be to be treated strictly according to our intentions, our motives, our desires. That might be philosophically problematic anyway: our intentions, our motives, our desires are some of the things we’re giving away. How do you discover a person’s desire for their desire for their desire? Part of what we expect is that our expectations not be sought out too aggressively, nor met too exactly. We don’t expect purity for our signals, but modulation. We give ourselves as gifts, and like all gifts, it is unacceptable if we get back the same object in return. We expect ourselves to return to ourselves, mutated and mixed with others, and this is the material from which we regenerate ourselves.
Then again, I wonder about this “we,” and who it excludes, and who is at its centre (hi!). It’s all very well to gesture toward an ethic of representation that rejects the property metaphor. But what if you’ve always felt that your representations are treated as property, and always as somebody else’s property? You might just want back what’s yours.
❧
“Cutting edge,” I guess, also implies violence. The cutting edge of a machete, disrupting the intricate vegetation to clear a path, to make progress. As if whatever is already there doesn’t matter.
Or … more like the cutting edge of a scalpel. Associated with dissection, forensics, experimentation, but also with healing.
Imagine trying to conduct surgery, only you’re not allowed to sew, suture, staple, glue, tape shut your patient’s flesh. You can only slice, debride, wash, and wait in hope their body will close of its own accord. Perhaps that’s how the cutting edge of techno-capitalism works.
Doesn’t the cutting edge need the stitching point?
Jo Lindsay Walton is a postdoctoral researcher at the Sussex Humanities Lab, working on political economy and speculative fiction.