New Short Story: Uncanny Sensitive for Hire, Rates on Request.

“I don’t like her.”

“You don’t like anyone.”

“Not true, I’m very sociable!”

“You’re a socialite; there’s a difference. She’s not touching anything, is she?”

The two voices were languid. Bored, even. These barbs had been used before. The to-and-fro of their conversation was old and played out. Only the arrival of the third gave them anything new to speak about, a new target for their day-to-day disgust.

“No, she just stands there. Staring. It’s creepy.”

“… she can’t hear us, can she, Della?”

“No, I have us on a different channel. I made sure.” One of the faces, professionally lit and made up, was dominating a flat-screen carefully held in plastic hands by a simple, almost retro, anthropoid unit. She looked smug.

The other, just as smothered in make-up and surrounded by artfully chosen books and little trinkets in her flat, flicked her shimmering hair with a thin hand and ignored the slight. Dora had nearly lost a client once by gossiping: forgetting to mute herself and broadcasting her every wicked thought to the floor of the gallery in surround sound. Her punishment had been to travel to the gallery in person and get down on her hands and knees to disinfect its bone-white floors, all while being watched over by Deren’s cheap plastic maintenance robots with their blank faces and skeletal hands.

“She’s expensive. Did you see her hourly rate?!” Dora changed the subject quickly.

“If our Lord and Master wishes to pay…”

“Deren is a pernickety fool!”

“Dora!” the other face squeaked in delight, relishing the bitchiness. Her unit rocked as its operator expressed her amusement through what few movements it could make. A basic teleconferencing unit, no-frills. No bells and whistles. The two gallery assistants had better models they could have used. Models with synthetic skin and styled hair that were meant to make even the tele-visiting guests comfortable. Using the old units was meant to be a slight to their visitor.

She didn’t seem to care.

The robotic units carrying Dora and Della’s faces had welcomed the young woman into the gallery at 9 am precisely. She had seemed to listen as they blurted out instructions, rules, petty gossip, and details about Deren’s purchases for the past five months. But she had barely spoken other than to acknowledge that no, she would not touch anything, and yes, she would go through decontamination. Even as the decontamination drone had sprayed her over with a fine mist, she hadn’t moved, twitched, or complained. She’d just accepted the measures that were necessary these days, and gone to stand in front of the grand canvas Deren had commissioned her to assess.

“Do you think she’s… you know?”

“No, I don’t know.”

“Neurodivergent,” Della whispered the word.

“I thought they all were. Isn’t that their thing?”

“I don’t know. Never met one before.”

“I did. At one of Philip’s soirées. He kept moving away from Pip’s butler-bot. Said he did some contract work for the military, but it was all very hush-hush.”

“Trying to get in your knickers?”

“Probably,” Dora smiled widely, “but he didn’t chat long, though. Seemed very uncomfortable with the whole face-to-face thing-”

“Pip’s still holding in-person parties??”

“Oh, you know Pip, anything to seem daring.” A mock yawn followed. “Do you think we should speak to her, though? I mean, if she’s being paid by the hour, do we just let her stand there all day?”

“But do we care?”

“Logically, the more Deren spends on consultants, the less he can spend on us. Right?”

Della sent an emoji of approval, and the two of them steered their clunky robotic carriers closer to the woman standing in front of the latest acquisition. They opened a channel and plastered broad, fake smiles on their faces as they crowded around her.

“How is it going?”

The woman, seemingly caught by surprise, turned and flinched a little. “Good. Good.”

“I hope our telepresence bots don’t bother you,” shouted Dora, raising her voice as though the woman was partially deaf instead of uncanny sensitive.

“They’re very old. I’m okay.” The woman smiled hesitantly. “But I thought Deren warned me that he had some more advanced bots you would be controlling-”

“Of course, we have some state-of-the-art models. Very expensive ones, of course. The clients prefer to interact with them, even if they’re riding bots themselves. The personal touch is so important,” sniffed Della.

“But we decided we’d use these old things. Make you feel more comfortable,” said Dora pointedly, glaring at Della.

The woman looked a bit confused at the interaction.

“Poor thing, she doesn’t understand how compassionate we’re being. Neurodivergent. I told you.” Della sent to Dora on a private channel.

“I thought she’d look a bit more interesting. Special eyes or something. At least some sort of piercing soul-searching blue, not that boring mud brown colour.”

“And with all that money she’s being paid to look at one painting, you’d think she’d have some better clothes.” Della preened in her flat, smoothing the lines of her expensive shift dress. Dora sent her a quick emoji warning, a siren blaring red light and screaming sounds: their conversation might be private, but she was visible on the small rectangular screen by the sensitive’s face.

“So, what do you think?” Dora said out loud. “About the painting?”

“I’m not sure it matters, really.” Della interrupted as the woman went to speak. “Deren puts too much emphasis on man-made… sorry, human-made. Both still sell. Some clients even prefer machine-made.”

Dora began to recite some marketing spiel, boredom creeping into her words as repetition had dulled her interest in what she was selling: “Tapping into the great unknown space of artificial creativity, this work by the synthetic being called AI-472 is number 17,423 in their series on the colour blue.”

“AI-472 only produced sixteen thousand or so pieces in its blue period. Then it was adjusted.” The woman corrected her gently.

“I was making a point. Some people like that what they’re looking at came from inside a little black box and not a real person.”

“And some people don’t.” The woman said. “Some people see the joins. The unfinished stories in the work. The broken thoughts.”

“And some people turn that ability into a lucrative business,” Dora said pointedly. “Do you need much more time to check this one over? You’re paid by the hour, aren’t you?”

“Oh, I’m done. The painting’s human-made. I’ll certify it if one of your bots can bring me a pen that works.” The woman brought out an official-looking document and a stamp set from the shabby canvas bag strapped across her chest. “How many bots are there here, by the way?”

“A few telepresence models. Some cleaning models too.”

“You don’t clean the gallery too?”

“In-person?! Can you even imagine!” Della squealed, and then added in a self-satisfied voice, “I haven’t left my flat in six years. Have you, Dora?”

Dora’s face on the screen reddened slightly. “Just the odd little party, you know. But with precautions, of course.”

“Didn’t you have to come in one day, in person, and clean the floors-”

“Okay, Della. I’m sure Miss – whatever her name is – will want to get on with the rest of her day now.”

“Except you cheated, didn’t you! You just used a telepresence bot, one of the good ones! Just to clean the floor!! I saw its knees afterwards; I knew what you did!” Della cackled, taking delight in catching her colleague out.

Dora’s bot stamped its foot, making the woman take a step back in fear.

“I should go,” she said, her voice trembling.

“Sorry. Sorry!” Dora calmed herself, and her fake smile returned as she settled her hair. “Della and I were just playing.”

“You shouldn’t be scared of these old bots anyway!” Della laughed. “Silly thing. They’re not even autonomous. We’re totally in control.”

“It’s not that. It’s…” The woman swallowed hard. “Sorry, I just find it hard to be near the nearly human but not quite. And… I have to go.”

They walked their telepresence bots with her to the door and made some banal and benign goodbyes. It was Dora who asked the second to last question of the Uncanny Valley Sensitive. Paid by the hour to look at things.

“Why did you have to come to the gallery? Going anywhere in person is so risky.”

“I have all my vaccinations.”

“But everyone knows they can’t keep up with the mutations. Remember, ‘Stay home, stay safe’. Always.”

“I can’t… I can’t tell if a painting’s real or not through a simulation of that thing. A digital picture of the painting would have been another layer of the unreal. I had to stand in front of it. Feel what I felt.”

“Is that the same for simulated humans? For AI?”

Afterwards, Dora and Della couldn’t remember who had asked the last question of the Uncanny Valley Sensitive. Couldn’t remember her looking from one screen to the next. Looking at where the women seemed to be sitting at slender glass desks in aesthetically perfect individual flats full of the right coffee table art books, the most perfectly delightful objets d’art, and sweet little curios from holidays they’d taken before the virus.

Before answering, the woman took a look at the rain outside the gallery and pulled out a battered old black umbrella from her heavy bag as she went to leave. “If someone is simulated, then the digital version is the only version, whether it’s in a bot or on a tablet’s screen. There’s nothing in the way. Anyway, have a good evening, ladies.”

“Well,” said Dora, holding up the certificate of authenticity the Uncanny Sensitive had handed her telepresence bot, bringing it closer to her screen to examine the elegant calligraphy. “I’m not sure she was worth all that money. Deren’s an idiot.”

“Hmmm,” mused Della. “It all sounds like a bit of a scam to me. Closing-up time?”

“Oh, please. I want a glass of something cheeky and a bath.”

“Sounds perfect.”

Their wide red-lipped smiles froze on their faces. Their hair, elegantly cut in fashionable styles so ‘now’ that they almost transcended this moment, stopped moving as their heads stilled.

In sync, two identical clocks began their countdown to the next working day.

Atlas of Anomalous AI

Very excited to have an essay of mine, ‘When AI Prophecy Fails’, included in this fabulous new collection edited by Ben Vickers and K. Allado-McDowell:

Contributions from writers, philosophers and curators including: Blaise Agüera y Arcas, Ramon Amaro, Noelani Arista, Jorge Luis Borges, Benjamin H. Bratton, Federico Campagna, Arthur C. Clarke, Rana Dasgupta, Eknath Easwaran, GPT-2, GPT-3, Yuk Hui, Nora N. Khan, Suzanne Kite, Jason Edward Lewis, Catherine Malabou, Hans Ulrich Obrist, Matteo Pasquinelli, Archer Pechawis, Noah Raford, Nisha Ramayya, Beth Singler, and Hito Steyerl.

Artworks by: Anni Albers, Pablo Amaringo, Refik Anadol, William Blake, Ian Cheng, Ithell Colquhoun, DeepDream, Federico Díaz, Susan Hiller, Hildegard of Bingen, Pierre Huyghe, C. G. Jung, Hilma af Klint, Emma Kunz, Paul Laffoley, Lucy Siyao Liu, Branko Petrović and Nikola Bojić, Santiago Ramón y Cajal, Casey Reas, Jenna Sutela, and Suzanne Treister.

The Cake is Still a Lie


This morning the other parental unit and I downloaded Portal 2 on the Xbox One for the child unit, who has now completed the minimum eight rotations around the sun necessary to understand the tricky quantum tunnelling mechanics at work in the Aperture Science Handheld Portal Device, or ‘portal gun’. Possibly. Maybe.

There are many good reasons to download a Portal game (Portal 2 has a genuinely brilliant local co-op game), but of course, as soon as GLaDOS got her first mention I started thinking about AI. And cake.

If you aren’t familiar with either Portal 1 or Portal 2, here’s my TL;DR version of their game mechanics and plots (spoilers ahead, of course). In Portal 1, you play as Chell, a subject in the numerous testing rooms of the Aperture Science Laboratories Computer-Aided Enrichment Center. The portal gun you find enables you to create two distinct portal ends, blue and orange, which you can walk through to come out somewhere else. Using continuous momentum, you can fall into one hole to get launched to previously unobtainable areas and doors, and move on to the next room and eventually, maybe, your freedom.


All the while, your efforts are being assessed and snarkily commented on by GLaDOS (‘Genetic Lifeform and Disk Operating System’, a homage to MS-DOS), an AI built by the creators of the Aperture Science labs. We find out during the course of the tests that GLaDOS turned on the scientists of Aperture, releasing a deadly neurotoxin that wiped them all out. So far, so psychotic femme-voiced killer AI.

GLaDOS’ passive-aggressive comments on your progress, and on your pathetic humanity, also come with promises of a reward at the end of the testing process: a cake. Although, we also see graffiti from a lab worker, Doug Rattmann (the ‘Lab Rat’) warning that the cake is a lie.


We defeat GLaDOS at the end of Portal 1, ending up lying on the ground outside the facility gates, surrounded by the remains of the AI. But with the release of Portal 2, the original ending was retconned to show Chell being dragged away, back into the facility, with a robotic voice thanking her for lying on the floor and assuming the “party escort submission position”. GLaDOS then sings over the credits, letting us know that she’s Still Alive. She does indeed return in Portal 2, with even more test-rooms. For science.

I don’t think it’s a complete coincidence that we downloaded Portal 2 now, just as a whole generation in the UK has been tested and is now receiving its results: Scottish Highers and A-Levels already, with Scottish National 5 qualifications and GCSEs soon to follow. Just as we’ve been watching an AI guiding, and lying to, a test subject, around 280,000 A-level students have had their grades adjusted downwards by an algorithm. Others have written detailed explanations of where this decision-making system has gone wrong, not only in its mathematical standardizations but also in how it makes assumptions about this school year on the basis of previous years. Go to an already high-attaining school (likely a private or public one) and the algorithm marks you upwards. Go to a school with a history of lower grades and there’s no way you could really deserve that A you were predicted. They’ve played the game, been promised cake at the end of it if they worked hard enough. But the cake was a lie.
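Caricatured as code, the logic looks something like this. To be clear, this is a deliberately simplified, hypothetical sketch, not Ofqual’s actual model; the grade scale, the capping rule, and the example data are all invented for illustration:

```python
# A toy caricature of history-based grade standardisation.
# The grade scale, capping rule, and data are invented for illustration.

GRADES = ["U", "E", "D", "C", "B", "A", "A*"]  # worst to best

def standardise(predicted: str, school_history: list[str]) -> str:
    """Cap a teacher-predicted grade at the best grade the school
    achieved in previous years; the student's own work is ignored."""
    predicted_rank = GRADES.index(predicted)
    best_historical = max(GRADES.index(g) for g in school_history)
    return GRADES[min(predicted_rank, best_historical)]

# A student predicted an A at a school that never produced one is marked down:
print(standardise("A", ["B", "C", "C", "D"]))  # -> B
# The same prediction at a historically high-attaining school survives:
print(standardise("A", ["A*", "A", "B"]))      # -> A
```

The point the toy makes is the one above: the student’s own predicted grade is only one input, and the school’s history can override it entirely.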

And they are angry.

I’ve written elsewhere about what happens when we trust ‘the algorithm’ too much; when we hand over so much of our agency that it becomes something we see as holding sway over our fate. So much so that we can feel that we have been ‘Blessed by the Algorithm’. Theistic metaphors hold true here too: I imagine many of these students currently feel ‘Cursed by the Algorithm’ as opportunities they have worked so hard for vanish from view. While our governments may make more of those u-turns that they are becoming infamous for (Scotland has already said universities should accept non-adjusted grades), at the moment we have a clear example of too much trust in a bad system, as well as the suggestion that any system we placed so much trust in might be bad for us.

There are of course differences between GLaDOS and the code put together by Ofqual, the exams regulator, that has created this current mess. Like so many other science-fiction AIs, GLaDOS is credited with personality, intention, and even cruelty. However, our tendency to anthropomorphize even the most abstract of mathematically driven decision-making systems means students end up shouting ‘fuck the algorithm’, as though personally hurt by this string of code. They’re also shouting at Boris and his cronies, of course, but there’s a danger in slipping into anthropomorphic thinking when it comes to the systems that are being used by actual humans in power to make decisions about our futures.

The creators of Portal leant into this anthropomorphism to make GLaDOS a villain. Early on in development, they also realised that one particular test required anthropomorphism to make it work for the gamer. In a ‘box marathon’ test, the player has to get from one side of a difficult test room to the other while carrying a cube. At first, players kept forgetting the box and leaving it behind. Then one of the game’s creators, having read some “declassified government interrogation thing” whereby “isolation leads subjects to begin to attach to inanimate objects”, suggested making the cube ‘cute’. They added a pink heart to its design, called it the Companion Cube, and gave GLaDOS some lines about how it was unlikely that the cube was sentient – at least, it probably wasn’t. Then, at the end of the box marathon test, the player has to learn how to use an incineration device – in order to defeat GLaDOS later on – and has to burn up the Companion Cube:

“Although the euthanizing process is remarkably painful, eight out of ten Aperture Science engineers believe that the Companion Cube is most likely incapable of feeling much pain.” – GLaDOS

The player’s connection makes this moment genuinely affective. But, as the Michael Rosen poem above suggests, the dangers of handing over agency to algorithms lie not just in reifying, or deifying, them but also in the counter-reaction: reducing the people we enter into the system to being ‘just’ data. This is Goodhart’s Law and then some: ‘When a measure becomes a target, it ceases to be a good measure’. And when a student becomes something to be measured, becomes data – your school has produced U-grade A-level students, therefore you must be a U-grade student – the individual is lost in the algorithm.

Reifying and anthropomorphising the algorithm obviously obscures the humans – the students in this case – fed into the system. It also obscures the humans behind the system. At the moment, students are trying to hold people in power to account, protesting outside the Department for Education in London. But it’s increasingly obvious that a certain flavour of technological accelerationism is gaining ground in the policy thinking of the UK government, signified by the adoption of the ‘move fast and break things’ thinking of Silicon Valley. The Barnard Castle fan, and Chief Advisor to the PM, Dominic Cummings has become the poster boy for this kind of approach, but he wouldn’t be so successful if there weren’t others who agreed with him.

When this kind of thinking clashes with the realities of people’s lived lives we’ll get these moments of rebellion against the algorithm, of Chell facing off against GLaDOS. But unless light is shed on the more insidious moments when algorithms are trusted too much by the government we might not know all the times when the cake is a lie.

Some tech ethicists calling for more transparency are trying to find a middle way between ‘fuck the algorithm’ and ‘assume the party escort submission position’. I’m always a little wary of including myself in this endeavour, as the banner of ‘ethics’ is increasingly being used – by corporations in particular – to obscure these very problems. They can’t be unethical, they have an ethics board! But we can all get involved in bringing light to issues. For instance, some, like journalist Carole Cadwalladr below, have pointed out the irony of Roger Taylor being the chair of both Ofqual and the Centre for Data Ethics and Innovation in the UK (tasked with ensuring algorithmic fairness).


(As I write this (at 2.56pm) there are reports that he will make a statement at 4pm, likely announcing a Scottish-like u-turn. We shall see).

(Hi, it’s 4.03pm, Beth here, and yes, we have a u-turn! Thoughts with all the admissions offices who now have to deal with this revision when there’s a cap on places!!)

The Portal games are challenging, mind-bending puzzles with amusing representations of AI and robots (and the co-op version lets you play as a pair of robots who are always being blown up, which is more fun than it sounds), but for all that GLaDOS guides Chell in them, they are not a guide for life. It’s a few years until my small human needs to worry about exams, but already he’s in a system that will regularly test and grade him like Chell. So I’ll need to teach him about the algorithms that promise you cake in the real world and the humans who wrote the lie in the first place. Portal seems a good place to start…

You got to the end so here is the cake: a doodle I drew while writing my notes for this post 🙂

‘How to Talk to Robots: A Girl’s Guide to a Future Dominated by AI’ by Tabitha Goldstaub

I’m very excited about the forthcoming book by Tabitha Goldstaub, How to Talk to Robots: A Girl’s Guide to a Future Dominated by AI.


The book involves interviews with women working in, and thinking about, AI (including me!). Fantastic portraits of all the interviewees were drawn by the wonderful Jess Wade (also interviewed in the book):

It’s me!

Pre-order the book now for its October 2020 release 😀

NEW PAPER: “Blessed by the algorithm”: Theistic conceptions of artificial intelligence in online discourse

Blessed by the Algorithm Paper B Singler 2020

“My first long haul flight that didn’t fill up and an empty row for me. I have been blessed by the algorithm”

The phrase ‘blessed by the algorithm’ expresses the feeling of having been fortunate in what appears on your feed on various social media platforms, or in the success or virality of your content as a creator, or in what gig economy jobs you are offered. However, we can also place it within wider public discourse employing theistic conceptions of AI. Building on anthropological fieldwork into the ‘entanglements of AI and Religion’ (Singler 2017a), this article will explore how ‘blessed by the algorithm’ tweets are indicative of the impact of theistic AI narratives: modes of thinking about AI in an implicitly religious way. This thinking also represents continuities that push back against the secularisation thesis and other grand narratives of disenchantment that claim secularity occurs because of technological and intellectual progress. This article will also explore new religious movements, where theistic conceptions of AI entangle technological aspirations with religious ones.

AI & Society, April 2020