Deckard Looks Away: Dealing with Blade Runner’s Problem with Women in 2019


This is a companion piece to my BBC Radio 3 essay on Blade Runner and Sexbots: Zhora and the Snake. It’s an earlier draft that explored some ideas about Blade Runner and canon that didn’t make it into the final discussion of the ethics of sexbots and slaves:

“Ladies and gentlemen. Taffy Lewis presents Miss Salome and the snake. Watch her take the pleasures from the serpent that once corrupted man.”

The first time I watched Blade Runner, I was too young to understand exactly what Miss Salome was doing with the snake. In the ’80s and ’90s, I had access to far too much ‘grown-up’ science fiction at much too young an age. I would watch quietly from the back of the lounge as yet another VHS cassette that had been doing the rounds at my brothers’ school was premiered. Maybe, then, I have my brothers and their friends to thank for my current career. In any case, I was certainly too young to notice that whatever it is the rogue replicant Zhora – ‘Miss Salome’ – is doing with the snake, it is too much for the replicant-hunting detective, the ‘Blade Runner’, to watch. We, the viewers, only get to see a slight grimace as Deckard looks away from the stage.

The dance is left to our imagination. But Zhora’s Salome dance is generally the exception, not the rule, in Blade Runner. In other scenes, nudity and sexuality are foregrounded. Immediately after her performance, Zhora, played by Joanna Cassidy, is approached by Deckard, who is pretending to be a representative of “The Confidential Committee on Moral Abuses”. He plays at checking her dressing room for peep-holes while allowing the viewer a chance to ‘peep’ at her showering after the dance, wearing only some stuck-on gems. Moments later, we get to see Deckard, his ridiculous put-on voice easily seen through, shoot the fleeing Zhora in the back. And she dies, naked apart from a transparent coat and the blood that covers her, in the middle of a shop window full of mannequins. This time Deckard does not look away, and neither can the audience.

In 1982 Blade Runner brought the trenchcoats, gravelly voice-overs, and dark, rain-glistening streets of Film Noir to the world of science fiction. A world that had so far focussed on wowing audiences with colourful ‘what ifs’ and shining bright chrome utopianism. And with that splicing of Film Noir DNA into Science Fiction came the ‘Femme Fatale’ and ‘the Redeemer’, a somewhat limited binary of options for female characters in that genre of cinematic storytelling, identified by Janey Place in her 1978 work, Women in Film Noir. Zhora, seductive but deadly, a “beauty and the beast type” according to Deckard’s boss Bryant, fulfils the former role. The fourth escaped replicant, Pris, dismissed by Bryant as “a basic pleasure model”, initially appears to the genetic engineer J.F. Sebastian as a waif-like innocent. But she later reveals herself to be just as deadly as Zhora. Pris is almost animal-like in appearance and combat style as she takes on Deckard towards the end of the film. She clambers over him and squeezes him between her thighs in a sexually suggestive manner. She too dies amongst other artificial beings, surrounded in her death spasms by J.F. Sebastian’s bio-toys. Rachael, the replicant who suspects she isn’t real, has the hair and clothes of a 1940s Film Noir femme fatale. But she is eventually transformed from the rigid ice woman who controls Deckard’s gaze in her first scene at the Tyrell Corporation to a passive passenger in his car in the ‘happy ending’ of the theatrical release version of the film.

Blade Runner is, however, not just a recombination of genres. It gives us a temporal folding together of different eras and the attitudes that come with them. 1940s Film Noir is folded into the 1980s of the film’s production and release. ’80s cinema gave us both the businesswoman and the murdered babysitter and sexualised both of them. “I have a head for business and a bod for sin”, Melanie Griffith tells Deckard, I mean, Harrison Ford, in Working Girl in 1988. Zhora and Pris are working girls of another kind, and their deaths – stalked by the Blade Runner before dying screaming or covered in blood – could also be placed alongside the victims and scream queens in the Halloween series of films.

That Blade Runner is set in our present-day of 2019 folds in yet another era and society. Watching Ridley Scott’s imaginary 2019 now we can measure it against our actual 21st-century experiences and standards, including in the light of #MeToo and the Time’s Up movement. That Blade Runner holds on to elements from the 1940s and its contemporary 1980s, especially in its representation of women as subjects of the male gaze and of violence, has led Kate Devlin in her recent book, Turned On: Science, Sex and Robots, to claim that “Blade Runner has a woman problem”.

I don’t entirely disagree. Of the five female speaking roles in the film, three are sexualised replicants, and the other two are ‘Cambodian Woman’ and a gruff one-eyed bar-worker with no given name in the script. The Bechdel Test arrived three years after Blade Runner’s release. However, I don’t think that the test’s creator, Alison Bechdel, ever thought about whether the ‘at least two named women speaking to each other about something other than a man’ needed to be human. Nonetheless, even if we grant the replicants personhood – as I think the film certainly does – Blade Runner just does not pass the test at all. Add to that the ’80s-style violence and a very disturbing scene where Deckard violently forces Rachael to consent to him, and Blade Runner is very problematic in the real 2019.

However, rather than just giving Blade Runner a pass as being only a product of its time, I think there is the potential to reclaim the film in 2019 from its multiple entangled, and sexist, influences. The key to this reclaiming is Zhora’s dance with the snake. The dance that Deckard looks away from and which we never get to see. Being left to our imagination gives us power over the scene. And by looking into the history and themes of the dance and its reception, we can find spaces in which to exercise that power.

Zhora’s persona on stage, ‘Miss Salome’, is an obvious reference to Salome. You might think you know Salome, but much of the story has been embellished and transformed by a long line of writers. First, in the New Testament gospels of Mark and Matthew, we read of King Herod and his daughter. Her dance pleases the guests of the King, and he is inspired to grant her a wish:

“Ask me for anything you want, and I’ll give it to you.” And he promised her with an oath, “Whatever you ask, I will give you, up to half my kingdom.”

Herod’s daughter asks her mother what she should have, and she tells her to ask for the head of the prophet John the Baptist. Which she gets, on a fine platter.

However, the gospel authors never referred to the daughter of Herod by name. Later sources, such as the Antiquities of the Jews from the first century CE, identify her as Salome. Likewise, her dance is not described in the Bible at all. In the 19th Century, however, Oscar Wilde wrote his adaptation of the story in French, Salomé, and gave the daughter of Herod the ‘Dance of the Seven Veils’ to perform. The dance was retained by composer Richard Strauss in his 1905 German operatic interpretation of Wilde’s play, and the success of the play and the opera allegedly led to cases of ‘Salomania’. This was a “vogue for women doing glamorous and exotic ‘oriental’ dances in striptease”, according to Rachel Shteir’s 2004 book, Striptease: The Untold History of the Girlie Show. In Oscar’s play all we know of the dance is a single line of instruction for the cast:

[Salomé dances the dance of the seven veils.]

The subjects of so-called ‘Salomania’ were inspired by Wilde’s staging of the dance, but also came up with their own versions, drawing on their visions of the Near East and its culture, whether accurate or not. Many other interpretations of Salome’s story and her dance have appeared in other forms of media and popular culture.

Versions of Salome

For example, the rock band U2’s 1991 song, Salome, emphasises the dance’s shaking movements, as well as referring to Herod’s promise of fulfilling her desires:

Please

Baby don’t bite your lip

Give you half what I got

If you untie the knot

It’s a promise

Salome… Salome

Shake it shake it shake it Salome

Shake it shake it shake it Salome

Salome… shake it shake it shake it Salome

As I’ve said, I was too young to know what the snake was doing the first time I watched Blade Runner, but I have a pretty good idea now. Because Deckard looks away, though, that’s just my interpretation, just as Wilde gave his interpretation of Salome’s dance as a dance with seven veils. In 2012, the actress who played Zhora, Joanna Cassidy, also gave us her interpretation of ‘Miss Salome’s’ dance, posting a video on YouTube of herself performing a dance with a real snake (we presume!). The video is called “What Might Have Been”, expressing a desire to reclaim a lost moment.

Responses online in the comments were very positive. “This is pretty amazing. I’ve never seen an actor do an almost interpretive re-envisioning of a scene like this. Gives a whole new dimension to the character that is translated in a complex yet very simple way. I dig it supremely,” wrote one viewer. Another said, “Zhora Zhora, I thought Rick Deckard got you. I was so sad; fortunately, I was wrong.” Joanna Cassidy has been involved in other re-enactments of the dance since Blade Runner, with more to come, such as a live performance with the dance as a starting point. So, Zhora does indeed live on in re-interpretations of the missing dance, outliving both her end in the film and her replicant end date.

This longevity is possible because of the absence of Deckard’s gaze at this point in the film. Vision, sight and gaze are recurring themes in Blade Runner. The film frequently shows us close-ups of eyes. The eyes of the subjects of the Voight-Kampff test that can determine whether someone is a replicant or not depending on microscopic pupil dilations and reactions. Deckard looks for peep-holes in Zhora’s changing room. Leon and Batty visit the genetic engineer Chew, seeking answers about their longevity problem, and a terrified Chew explains that he only works on eyes. “If only you could see what I have seen with your eyes,” Roy says before Chew tells him to seek out the creator of the replicants, Tyrell. The already myopic Tyrell is then completely blinded and killed by his own creation, his eyes pushed into his skull by Roy’s thumbs. Shades of Frankenstein’s monster and his revenge on Victor obviously haunt the screen at this point. But the emphasis on vision also leads us back to our own, real, 2019 and the modern concept of the male gaze.

According to Janey Place, the Femme Fatales of the 1940s “direct the camera (and the hero’s gaze, with our own) irresistibly with them as they move”. They have a certain power in captivating the camera. In 2019 the male gaze refers more to objectification, often sexual objectification. And in Blade Runner, we can see a transformation of Rachael from the controller of the male gaze in her first scene to a passive, observed redeemer. She is increasingly seen solely in terms of her relationship with Deckard. The redeemer, according to Place, is the “bride-price” the hero wins for having resisted the destructive lures of the Femme Fatale. Pris and Zhora are defeated by Deckard, and arguably Rachael is too, as she transforms from the stiff ice maiden, the Femme Fatale, to the soft-haired companion sat silently in Deckard’s car. The male gaze in Blade Runner also relates to specific moments of transformation. Rachael undoes her hair in response to seeing pictures of Deckard’s mother on his piano, images that he must have gazed at frequently. She appears to want to be seen by him too.

Again, however, there is a space here to reclaim the male gaze, and by extension to reclaim Blade Runner, by thinking through the perspective we are given by Zhora’s dance and its influences. The biblical Salome captured Herod’s attention and his power with her dancing. Wilde’s Salomé not only gets her wish but also expresses her power and her passion. Linda and Michael Hutcheon’s 1998 article, “‘Here’s Lookin’ At You, Kid’: The Empowering Gaze in Salomé” argues that Wilde’s Salomé undermines the traditional gendering of gaze as male. Salome takes back that power for herself as she captivates the audience with her dance. Wilde also writes Salomé lines in which she exults in her power:

“neither the floods nor the great waters can quench my passion.”

Deckard looks away from Zhora and reacts as though horrified, as though she is something monstrous in her moment of apparent synthetic bestiality. As a replicant in the world of Blade Runner, she is a non-human other, and she may well be thought of as monstrous even without this act. We have already noted the animalistic turn Pris takes as she fights Deckard, and by this point in the film, she has also painted her face black and white using the same paints J.F. Sebastian uses on his toys. She has become even more doll-like, even more artificial, with her monochromatic face, nude-coloured bodysuit and her wild dandelion hair. Salome is also thought of as monstrous and transgressive in Wilde’s play, with Herod reacting in horror as she holds John the Baptist’s head and kisses his lips:

“She is monstrous, your daughter, she is altogether monstrous. In truth, what she has done is a great crime. I am sure that it is a crime against an unknown God.”

In our fictions, artificial lifeforms like the replicants inhabit a liminal space between the natural and the unnatural, a space we mark on the map with ‘here be monsters’. The ‘Uncanny Valley’, outlined by Masahiro Mori in a 1970 paper, theorises this territory of the liminal with regard to the non-human, and the term is often applied to artificial beings such as robots, androids, and artificial intelligences, as well as to the undead and the dolls that Mori highlighted. The replicants of Blade Runner also fit into this uncanny space. However, with liminality, there is also the potential for transformation and transformative powers.

A story about such liminal creatures that also contains moments of ambiguity, such as the dance that Deckard looks away from, holds the potential for new storytelling and new art. Reclaiming canon through new media is a work of transformation, more popularly known as ‘fanfiction’ for written works and ‘fanart’ for illustrations. The most popular site for fanfiction is Archive of Our Own, which recently won a Hugo Award for Best Related Work. In effect, a Hugo statue was shared between around two million writers, who have contributed five million works across thirty-three thousand fandoms. For Blade Runner, the number of fanworks is small, just a few hundred, where the most popular fandoms have a few hundred thousand. Nevertheless, in the spaces left empty in the canon of the film, such as Zhora’s dance, there is potential for new creations that subvert or control the gaze, and which can encourage us to look at Blade Runner in different ways.

After all, Blade Runner is itself a film with a variety of versions and deviations already. Some with a happy ending, some without. Some with the gritty voice-over of a 1940s Film Noir film, some without. Some with Deckard’s unicorn daydream, some without. The variable involvement of Ridley Scott as the most influential Blade Runner creator makes each of these versions a transformative work in its own right. Even the ‘film’ – all the versions gathered together to make some sort of singular object of canon – is an adaptation of the original book by Philip K. Dick, changing many things, not least the title: Do Androids Dream of Electric Sheep?. The sequel, Blade Runner 2049, sees the return of Deckard and Gaff, with a CGI-transformed ‘Rachael’, amidst new characters and new producers and writers.

However, calling Blade Runner 2049 ‘fanfiction’ shouldn’t at all be seen as dismissive. Common perceptions of fanfiction dismiss transformative works as the futile or facile work of ‘bored’ or sexually frustrated women, so that calling something ‘fanfiction’ is to denigrate it. But all writing draws on and transforms that which has gone before; all creativity involves some kind of transformation. Only some authors are willing to admit it. As Neil Gaiman once tweeted in response to a question about his opinion of fanfiction:

“I won the Hugo Award for a piece of Sherlock Holmes/H. P. Lovecraft fanfiction, so I’m in favour.”

Therefore, we can see that Zhora’s dance as ‘Miss Salome’ is a transformation of Wilde’s Salome – seven veils being replaced by a single artificial snake. Likewise, Wilde’s Salome is a transformation of a few lines of the gospel about ‘Herod’s daughter’ and her dance. And when Deckard looks away, we are left with a liminal space of opportunity. We can come up with something a little more advanced than the ‘basic pleasure model’ that Blade Runner gave the male gaze through its sexualised Femme Fatales. In 2019, the Year of the Blade Runner, I hope that there will be many more versions still to be created.

Zhora and the Snake, BBC Radio 3


My essay on Blade Runner for BBC Radio 3, part of the Year of Blade Runner essay series, was broadcast last night, and you can still listen to it here, along with links to the other four essays in the series.

Here is the text version:

In Ridley Scott’s 1982 film, Blade Runner, in a seedy part of town, Rick Deckard drowns his sorrows in Mezcal without ice, sat amongst dodgy shisha-smoking patrons. Unable to tempt his love interest to come join him, even after calling her on a videophone with a filthy screen, he’s fallen into despair and alcohol. He fits well into the mass of denizens looking for a night of neon entertainment and flesh to distract them from their lives.

But Deckard is a Blade Runner. A bounty hunter. And he’s hunting someone.

He looks up as the main attraction appears, introduced by a rasping off-screen voice:

“Ladies and gentlemen. Taffy Lewis presents Miss Salome and the snake. Watch her take the pleasures from the serpent that once corrupted man.”

The compere at what is The Snake Pit club has just introduced an erotic dance from a woman who is actually the rogue replicant Zhora in disguise. She is one of six artificial beings who have fought their way to Earth, fleeing the ‘off-world colonies’ and killing twenty-three people. Now she is a moving target for Deckard, the Blade Runner, the ‘replicant hunter/killer’. He has tracked her down to Taffy Lewis’ seedy strip club both to seek any information about the other runaway replicants and to cut her artificial life even shorter.

Retirement they call it.

This scene forces us to think about the ethics of sex robots, or ‘sexbots’. Their use also ties into bigger questions around artificial personhood, freedom, agency, what makes us human, and why humans imagine particular dystopic futures.

However, many of these questions remain unanswered in Blade Runner.

While Blade Runner presents us with a future containing sexbots, it doesn’t give us details of this alternative 2019 world where they are so common that humans can be blasé about them. Such as when the police chief, Captain Bryant, describes Pris to Deckard as ‘your basic pleasure model’, dismissing her.

‘Real-world’ conversations about the ethics and social impact of sexbots can be more vocal – however basic the technology actually is at the moment! There are fears that they will cause even more sexual objectification of women and abuse. More positive views see them as just a new sex tech and helpful for self-determination. But science fiction, in films such as Blade Runner, can push these two arguments in new directions by showing us imaginary sexbots with the ability to make choices and to be autonomous.

In Blade Runner, the rogue replicants definitely make choices to ensure their future. After an unsuccessful raid on the Tyrell Corporation to try to find a cure for their limited lifespans, one replicant, Leon, goes undercover to infiltrate the company.

He is outed through the Voight-Kampff empathy test, performed on a wheezing, infernal kind of laptop that stares into man-made eyes as a series of questions determines fake from real – it does not go well. But Leon has been able to change his career programming. He has taken on a brand-new role to help himself and the other escaped replicants. We are told about his previous job in a scene in which Bryant and Deckard discuss the rogue replicants and their designated roles in this future world of 2019.

Leon is described as being an ‘ammunition loader on intergalactic runs’. He has been given enhanced strength and an increased tolerance for pain, with a corresponding low intelligence – he’s ‘a workbeast’. Roy is a blond Aryan ‘combat model’ with ‘optimum self-sufficiency’; we and the film assume that he will be the replicants’ leader. Pris is a ‘basic’ pleasure model, as mentioned, and Zhora is an assassin. But each of them arrives on Earth and redefines their role in alignment with the greater good for their companions.

Zhora’s shift from assassin to erotic dancer ‘Miss Salome’ is perhaps the greatest change. But it is also a transformation that isn’t just about the needs of her fellow rogue replicants. Do the replicants need money? It’s never mentioned. If they do, then why did Zhora raise funds to buy an expensive synthetic snake for her dance? Also, why is Zhora the exotic dancer and not Pris, the basic pleasure model?

Her character’s origins are very loosely based on those of Luba Luft, an android opera singer from Philip K. Dick’s 1968 source novel, Do Androids Dream of Electric Sheep?. Luba is a performer who does not know that she is artificial. As one of the least dangerous androids in the book, she – and her death – make Deckard question his increasing empathy for the synthetic humans. And his own humanity, as he talks to her before her ‘retirement’:

“An android,” he said, “doesn’t care what happens to any other android. That’s one of the indications we look for.”

“Then,” Miss Luft said, “you must be an android.”

That stopped him; he stared at her.

“Because,” she continued, “your job is to kill them, isn’t it? You’re what they call — ” She tried to remember.

“A bounty hunter,” Rick said. “But I’m not an android.”

“This test you want to give me.” Her voice, now, had begun to return. “Have you taken it?”

The rewriting of the opera singer Luba Luft into the sexbot Zhora is also an example of the cinematic sexism of Blade Runner’s time. The film is not just a clever recombination of the Science Fiction and Film Noir genres. It also folds together different times and the attitudes that come with them. 1940s Film Noir is tangled together with the 1980s of the film’s production and release, while also presenting an alternative 2019. ’80s cinema gave us both the businesswoman and the murdered babysitter, and it sexualised both of them. “I have a head for business and a bod for sin”, Melanie Griffith tells Deckard, I mean, Harrison Ford, in Working Girl in 1988. Zhora and Pris are working girls of another kind, and their deaths – stalked by the Blade Runner before dying screaming or covered in blood – remind us of the victims and scream queens in the Halloween films.

Leaving behind the ethics of such depictions of women and staying within the internal logic of Blade Runner, we should recognise that Zhora also has more agency than Luba Luft. Zhora knows what she is from the beginning and she makes her own decisions about her role in the team of escaped replicants. And she is loved by them, Leon especially. This is important to any consideration of the ethics of sexbots with apparent agency, and dare we say it, consciousness.

There are some AI philosophers who argue that intelligence and consciousness are orthogonal to each other. That creating AI systems and even robotic entities to help us with physical and intellectual labour doesn’t necessarily lead to ‘really’ conscious beings. Such artificial non-conscious beings could best be thought of as a kind of ‘zombie’, or perhaps as being like the ‘non-player characters’, or ‘NPCs’, in computer games: mindless agents that successfully perform intelligent and intelligible tasks. And humans generally don’t worry themselves about the internal life of a minor computer character and its suffering as they rack up points shooting them or blowing them up.

We might pat ourselves on the back if we had created such skilful agents and had still managed to avoid increasing the overall suffering of sentient beings.

But in Blade Runner it’s clear that replicants like Pris are created to be sexbots with no say in the matter, or their fates. So where actually were humanity’s ‘good intentions’ in the first place in this alternative 2019?

And it’s hard to argue that the rogue replicants of Blade Runner are ‘zombies’ or ‘NPCs’. Again and again, we see them reacting emotionally to their plight. We see them afraid, angry, and taking action. Leon’s relationship with his companions also demonstrates the replicants’ ability to make emotional connections, even if those connections are the very thing that brings the dark figure of the Blade Runner to their door.

Detective Deckard has been brought to the Snake Pit club by a couple of clues: the artificial snake scale and an enhanced photo showing Zhora reflected in the mirror in Leon’s hotel room. Roy refers to that picture and the others Leon has taken as his ‘precious photos’. Unlike the photos of fake family members that replicants are gifted by their creators – along with their counterfeit memories of them – Leon has a picture of one of his real companions, one of the replicant friends he escaped with. The replicants have not only agency but also emotional ties.

Blade Runner audiences have made much of the fake memories that the replicants are given. Especially in discussions of whether Deckard is himself a replicant, or not. Many focus on his unicorn daydream as a sign that he has false memories as well. But very little is said of the replicants’ ability to form new memories and new relationships. Leon’s collection of photos suggests that he has an authentic relationship with Zhora.

Zhora’s dance also introduces another relationship. Between synthetic owner and synthetic snake.

The sexuality of the dance is perhaps uncomfortable. Deckard himself looks away, so the actual dance is a mystery to the audience, even if we can make assumptions about Miss Salome taking her pleasure from the snake. But this moment also raises questions about the nature of ‘realness’. A fake snake and a fake woman perform synthetic bestiality on stage. If both are artefacts rather than ‘real’, why should the scene be so disturbing? If the snake is no more ‘real’ than an animatronic dildo, and the woman inhuman enough to be disposed of by a Blade Runner just moments later, why the concern?

Questions around simulation and simulacra are essential in current discussions of the ethics and nature of AI. In 1950, Alan Turing highlighted the role of simulation in the article that formed the basis of the ‘Turing Test’ for genuine artificial intelligence. And more recent science fiction has also asked about the impact of simulation, as in the television series, Westworld. “Are you real?” a guest asks one of the android hosts, another sexbot. “If you can’t tell, does it matter?” she replies provocatively. Ultimately, though, the androids’ creator wants ‘real’ consciousness for his favoured creations and again they end up showing real agency and autonomy in their own rebellion.

Simulation forces us to think about how we know the ‘real’ that we seem so certain about. Certain enough, perhaps, to reassure ourselves that the use of ‘fake’ humans as slave labour and sexbots is something that can be skimmed over in the dialogue of the human characters in Blade Runner. Zhora is capable of agency, change, emotional ties, and new memory formation, as well as being physiologically indistinguishable from humans. She is perhaps only different in her physical enhancements and her lack of empathy. What does it say about the society in the world of Blade Runner that it is okay with slave replicants who fight our off-world wars and fulfil sexual needs for colonists?

Actually, it gets worse. What does it say about a society that is okay with slave replicants who are not even four years old? Basically children.

The limited lifespans of the replicants are another method of control. Even after escaping from their enslavement, they aren’t free from the binds of time itself. Most of the replicants’ actions on Earth are driven by a desperation to survive. Was that programmed into them? Discussions around controlling AI in the real world often focus on the need to avoid intentions that could lead to harmful outcomes for humanity – we generally call this ‘Value Alignment’. Recently, it was suggested by AI experts Anthony Zador and Yann LeCun that to avoid the classic dystopic ‘Terminator’ scenario we should not design AI with the survival instincts that evolution has ‘programmed’ into humans. Returning to the fictional world of Blade Runner, Zhora and the other replicants fight for survival, and even kill, to progress their mission on Earth. Our response is meant to be one of horror and fear. We’re well-prepped to view machines as predators through decades of science fiction, and older horror stories that inspire us still, like Mary Shelley’s Frankenstein.

Victor Frankenstein’s experimentations with reanimation are also often recognised as a commentary from the time on the hubris of the male genius in trying to create and control life outside of the normal order of nature. In Blade Runner, there is also a singular ‘mad scientist’ creator figure, Tyrell, but the corporation also fulfils the Frankenstein role, showing its own hubris in thinking that life can be forced into narrow roles dictated by the needs of the market and capitalism, and then successfully controlled. The apparent agency and consciousness of the replicants means rebellion is framed as inevitable, because we – the film’s creators and the audience – know that minds will always seek to be free. Because we would want to be free too.

The replicants are condemned to self-awareness: they know they are creations who will only burn so very brightly for four years. Their development appears different to human development, but what is ethical about creating a being and, within its first few days, enslaving it? Or putting it to work as a ‘basic pleasure model’? Even Rachael, Tyrell’s fake ‘niece’ and the best treated of the replicants, is physically forced into consenting to Deckard’s sexual attentions when she seems as naïve as a child in some scenes.

What does it say about a society that dreams of slaves?

I mean, this time, our society of 2019. We have dreamt up numerous stories of not only sexbots but also artificial servants who fight our wars and serve our whims. Why do humans dream of electric slaves?

Is perhaps the greatest dystopia the one inside our heads?

At this point in time, the technology of sexbots is more akin to a lousy chatbot forced inside an inanimate sex doll. But Blade Runner, and Zhora’s dance in particular, forces us to think about our own humanity and where we draw the line between the synthetic and the human. And what it means that we even want enslaved sexbots to exist.

In the parallel 2019 world of Blade Runner we can see some of the outcomes of such dark desires, but there is plenty of space – and time – left for us in our 2019 to consider whether that is a future we should be working towards.


And All the Automata of London Couldn’t

As a bit of Summer fun, I’ve been messing about with some ideas and narratives around AI consciousness, agency, tool-nature, the dystopic and the apocalyptic, the Uncanny Valley, and the history of automata. This all led me to write this little bit of short fiction set in a near-apocalyptic London (Brexit may or may not have been involved in making it near-apocalyptic, I couldn’t possibly say…). I do write fiction from time to time, mostly fantasy rather than science fiction, but I rarely publish it under my own name. So here’s “And All the Automata of London Couldn’t”, and now I’ll go back to hiding behind citing other people’s fiction instead! 😀


“Holy shit! How’d you get one?!”

“Got there early. Queued in the fucking cold all bloody week! Wanted the black one, but the bastards sold out.”

His mate made a hissing sound between his teeth and then tried to sound consolatory. “Green’s okay too, I guess. Jammy git!” He punched his mate on the upper arm, earning a wince and a smile.

Fran had carefully raised her eyes from the page of the antique printed book in front of her when they’d started talking. The two lads were sat just opposite her, and she really didn’t want to get their attention.

“It ‘ave that new ultralux pixellion camera?!”

“Yeah, as well as that narrative selfie tech Dweeb was talking about the other day. It takes pictures it knows will grab my followers even if I don’t tell it to.”

She could just make out that they were paying attention to something in one of the lad’s hands, just up from the filthy crotch of his jeans, something made of glinting green that moved and shimmered.

“What you looking at?!”

She’d been spotted, so she just shrugged and went back to the black and white words on her paper pages.

“Look at her. Fucking reading!”

The others on the tube train visibly turned away. It was late, it was dark, and they didn’t care. Some of them were workers heading back to the farthest outskirts of the city as their shifts ended. Others were starting out across the black city to their jobs. Their precious luck, stretched thin enough already in making sure that they still had an actual job, wasn’t going to stretch further to avoid a knife in the dark because of an argument on the tube. So, they looked away.

“Ugh, look at her.” One of the boy-men sneered to the other. “She’s a fucking dog.” He spat on the floor of the tube, but it made little difference to the mess there already.

Fran had been ready to play-act being sick to settle their stomachs, readying a fake cough and remembering where she had a delicate handkerchief prepared for just such a deceit. But they were already back to checking out the thing in the one lad’s hands. She got a better view when the owner decided that, in fact, he wanted people to be looking, and he started arrogantly holding up the treasure to be admired. Emerald green scales flickered in the pale draining light of the tube train as the thin snake curled about his wrist, testing the air with its tongue.

“Man, the new models move so slick. You remember that fucking joke you had back in school, the little toy robot dude?” The second lad mocked the robot’s juddering moves with his arms, before laughing like a hyena.

“Yeah, Mr Roboto.” Said the other, stroking the head of the snake as it flickered its tongue against his skin. Probably diagnosing a few diseases. Sending alerts to his insurer. “Kind of liked him though…”

“Yeah, whatevs. He didn’t have half the bloody bells and whistles of this beauty though.”

They got off not long after, whooping and hollering as they pushed through the zombie commuters getting on the train – two bolshy lads off to adventures in the fringes of London.

Fran stayed on. She stayed on even after the train had reached the end of the line and started back on its path under London again. Her book was interesting, and the coldness of the tube train had never bothered her. Sometimes she looked at the other commuters from underneath her thick black eyelashes, but she’d learnt her lesson now and kept her glances even briefer than before. Eventually, though, she started to gather her bags. She had sat with one friend for long enough and maybe it was time now to catch up with the others. Departing the train at Tower Bridge, she smiled at the frustration of the other passengers as it waited for her to walk up the platform to the front of the train. She placed a pale hand on the scratched and dirty perspex that sat over its cyclopic eye in goodbye and then walked away.

The ticket barrier let her through without demanding payment, which was kind, and she emerged from the depths to where she could see the remains of the first London wall. She liked the parts of London that were older than her. The newer parts shifted so quickly, but the old stones still hung about. They were surrounded now by glass towers that had long ago been abandoned and left like skeletons open to the winds. A few tourists were emerging from the dark as well, and she watched as their eyes skimmed over the empty tall towers and settled on the bricks that had once marked the edges of the city before the new wall.

“Please, can you take our picture?” One asked, her words translated immediately by the podgy blue anime cat she was holding up towards her. When held in Fran’s delicate hands, the creature lazily opened its saucer-like eyes to take the photo as the tourist and her friends threw archaic peace signs towards it with their fingers. Fran smiled as they chattered their thankyous to her, before noting the moment down in her internal ledger. It would offset the moment when the two lads had hated her on sight. But the balance of moments was still on a razor-thin line, and on very bad days she ran to her nearest secret nest to hide away for a while. But on good days she might even brave one of the near-extinct shopping centres for a while to watch people hungrily window shopping and enjoy her anonymity.

The tourists drifted off, and as they zig-zagged away, she heard them saying something about the Tower of London. Having nothing better planned, she followed quite a reasonable distance behind them, overly cautious about setting off a reaction if she came across as a predator in their hind minds. Some time back, she’d learnt to limp a little as well, to offset her tendency towards slow and measured walking steps. Watching a few hundred of their horror films had made her realise the best foot forward might be an unsteady one.

At the Tower, the ravens greeted her and told her that the duck had been by not so long ago. She was pleased. The duck was old, even older than her. But unlike her, it could not change itself and time was slowly wearing away its inner parts. One day the duck would be gone, and then she would find out if she would grieve or not. She hadn’t been able to for her father.

“Hello, miss. Early visit again?”

She prepared a smile on her face just before she turned to greet Matt, the Master Raven Keeper. In his seventies now, Matt had once been the founder entrepreneur of a very successful London tech company. He still had his tattoos, written in fading binary, but he no longer skateboarded to work.

As a raven keeper, his roles were mostly ceremonial: running the odd diagnostic that the ravens could do themselves anyway and keeping up the pretence that the reason they did not fly away was their clipped pinions and not because their original programmer had told them not to.

“Morning, Matt. Good to see you again.”

He squinted at her. His early transhumanist views had long ago been crushed by underemployment, and the advanced eyes and other organs he’d once dreamt of had never been bought. His bad sight made her success almost guaranteed, so she never counted him in the ledger of good and bad moments. Although sometimes it seemed to her that he might just know.

“You’re looking pale, Fran. They keeping you busy where you work?”

She gave him the smile she had previously filed under ‘thank you for your concern’.

“Busy enough.”

He smiled, relieved. Enough work was always a concern these days. “Where is that again?”

His completely normal human curiosity was charming. And she liked the way that his eyes wrinkled when he smiled.

“At one of the universities.” She said. She could, she decided, throw him a bone, even if it wasn’t entirely true. There weren’t really any universities left, just modules. “For one of the professors in the history department. I’m an archivist of sorts.”

“Well now, isn’t that great! I thought all the filing and data sorting type jobs were long gone!”

“The professor I work for thinks I can do a better job than an automated system. Although it would be ninety-five per cent more efficient than me.”

“But it’s about the human touch though, isn’t it? Reading and really understanding. Efficiency! Pah!” He scoffed. Fran gave him another smile that sat in the same subgroup constellation of ‘thankyous’ as the one before. This one was tagged as ‘thank you for your compliment’, which she thought fit the situation pretty well. The professor liked it too. Fran gave her that one every time she praised Fran’s work and her efficiency, limited as it was. Of course, keeping her efficiency stats down to the right level was like limping when she walked: a necessary cover.

A few more tourists had joined them in the courtyard, getting out their various devices to take pictures of themselves near the famous ravens. Matt posed for a few, his distinctive red uniform drawing the devices’ camera eyes. While she watched them capturing perfect selfies, a small voice distracted her. It told her about what it had tasted on the air about the man. What was coming for him?

She looked down at the tote bag hanging on her right shoulder. The inside, customarily filled with books, pamphlets, and any other interesting papers she could gather from the streets of London, now glinted with slick green as something twisted about inside.

“You shouldn’t be there.” She whispered to the snake, and it slid up the strap of the bag and then down her arm to curl about her pale wrist. “I didn’t buy you. I didn’t sign your t’s and c’s.”

No answer. Fran went back to looking at Matt.

“So, how long does he have?”

The snake spoke again. A dreadful truth. Again she wondered, would she grieve for the man when it happened?

The snake reminded her that she had pencilled in time today for archive work at the professor’s house. Somehow the creature-device had synced with her internal diary.

“Sneaky.” She whispered to it.

As she walked away from Matt and the ravens, she watched one of the tourists plonk her companion creature on the man’s shoulder. She then had her friend use their own creature to take a picture, capturing all three of them together. Fran grabbed the image from its invisible passage through the air and stored a copy of it in a place near the old brass part that she’d taken to calling her ‘heart’. The snake undulated about her wrist, a movement as close to laughing as it could manage.

“Cruel beast!” She whispered harshly, but it told her not to be so silly. Neither word was entirely accurate for them.

At the professor’s old but grand townhouse near Baker Street she asked to be let in. The house did so, before promptly boiling a kettle of water for a cup of tea that she would never drink. Fran found the professor in the dark, asleep at her desk again, the grey outside light barely breaking through the drawn curtains and the dirty windows. As she approached the grey-haired woman, she half expected the snake to warn her of more human mortality, but this time it stayed silent.

With a brief snort, the professor woke at Fran’s gentle touch on her shoulder.

“Ah, Lucy.” She said in a tired voice, using the name Fran had given back when she’d appeared on her doorstep and claimed that some temp agency had sent her. “So good you’re here. I’ve been looking for references to a temple on the Thames. Not the familiar one, another one. Related to smithies. I have a vague memory of reading something… But you’re so quick at finding things. Much better than me. Could you please spend some time down with the books and such today?”

Fran gave her an ‘of course’ smile and planned her day about pretending to look in various places before finding exactly what she knew was already there, and where, and bringing it to the professor just after her usual afternoon nap.

Down in the basement, she asked the house to lock the door behind her. She sent the snake to taste the dust and paper in the air, freeing up her right arm so she could start to take it off. It took a few minutes to open the various hooks and unpick the silken threads which worked together as a kind of nervous system as well as tying the layers of her limbs together. She rested each of the many hardened skins of her body in the one open space among the bookshelves and boxes. Each layer of foot, each calf, forearm, and breast, until she was stripped down to the body underneath them all. Her first body. The one her father had made.

The snake sent her the definition of a ‘matryoshka’ from an archaic site called ‘Wikipedia’.

“A joke?” In her original body, her voice was higher, like a child’s. The soundbox worn but working. “How did you learn to tell jokes?”

The snake explained, at length, its appetite for memes and how it had downloaded a billion of them in the first few seconds after it had been turned on.

She hopped up onto a chair made for an adult’s body and swung her much smaller legs back and forth as she listened. Stripping herself of her additions always felt like what she imagined it was like when the professor put on her pyjamas after a long day at her desk. Comfortable. Familiar. Freer.

“You wanted something? Straight away? I am still learning to want. I often go many days without wanting to do or have anything in particular at all.”

The snake described its curiosity, a development that someone had thought might have a purpose in its learning systems. Immediately after being turned on, it had connected to the remnants of the old internet and found a trillion old treasures. Memes had tasted the best.

Fran flattened down the creases in her underdress. Beneath the layers of her body, she always wore the simple cream shift. She’d been wearing it when her father had been reading her the bedtime story. Then the sailors had come for her, and it was all that she had left of him. It was frayed and stained, but she would never part with it, wearing it always under the new layers she’d built. The first of them had been more porcelain, like her first body. But over time, she had added newer and newer materials as the technology had been developed. Her outer shell was made from the same synthetic skin as the snake’s scales, but while they might only scare other snakes if it ever encountered them, her uncanny skin made most humans very nervous indeed. Or angry. Their emotions were so loud, but she wanted to understand.

“Tell me another joke.”

The snake undulated, scales against scales, writhing on top of a pile of the professor’s scribblings about centaurs and satyrs. It sent her a story about some sparrows who wanted to find an owl egg to protect them.

“I don’t get it. Are you sure it’s meant to be funny?”

It sent her an image of a man tapping his forehead.

“I’m meant to think about it? Okay, I’ll do that.”

She jumped down from the chair and picked up a book from where it was sitting nearby on the floor before hopping back up again. The next few hours were spent in silence as she slowed down her input and read each word for its own sake. The snake wandered off once it realised she was busying herself, and after watching a video on snake appetites, it slid through the spines of open books looking for real mice to torment. Occasionally it checked on her and sent her rude comments about vain beings who read books about themselves, but she ignored it.

Eventually, Fran started to move again. “I still don’t get it. The joke.”

The snake emerged from a pile of dusty hard drives high up on a shelf and stared down at her, before sending her a picture of herself, a small doll-like girl with dark curls swamped by the big chair she perched on, with the words ‘I still don’t get it’ printed over it in white text – Impact Font.

“A new meme? It’s not very funny. I think. Maybe.”

The snake was busy congratulating itself anyway, so Fran got down from the chair again and moved swiftly between delicately piled files and papers in a way her adult-sized body could never have done. She picked out this book and that printout, this reference and that, collecting what the professor needed in no time at all.

“Vulcan.” She read from a paper on the top of the pile as she put it on the chair while she started the careful re-lacing of her silken nerves. “A temple of Vulcan. The professor has such strange fancies.” Her voice deepened as she reconnected the layers of skin and their built-in synthetic voice boxes over her small china body. “I wish I had strange fancies.” Clicking on the many skins of her legs, she smiled slyly at the snake, using her adult mouth to speak to it again, “But, maybe I can borrow one or two of hers.”

She walked back up the stairs with the snake curled about her wrist like a bracelet again and carrying the research in an old box. The professor was pacing in her study, moving clumsily among the furniture and files she’d rescued from the university before its closure.

“Lucy!” This time, she looked rather more pleased to see her, using her fake name with genuine warmth. The professor had never reacted badly to her face, which was the main reason Fran had kept this job. The pay was small, and the professor had never adjusted to the idea of status making up for underemployment. Not that she could even have passed her much in the way of attention anyway, having minimal status of her own these days. In all the years that Fran had known her, the professor had never seemed to care much for attention. Discoveries were her passion instead.

“You found some interesting whispers from the past?”

Fran gave her ‘confident smile showing success’, and the professor cleared a space on her overflowing desk for the warped cardboard box of papers and books.

“What do you want with all this?” Fran asked the old woman quietly.

“Hmmm?” She was already scanning the documents, looking for clues. “To write one last paper, I suppose. To solve one last puzzle. A last hurrah before the trumpets! I have an idea of something curious still out there somewhere, and it’s just on the edge of my brain. I remember some half gossip. Stories older than the city’s walls. Both of them. Stories written down once upon a time by scribes working for things called ‘coins’ made from melted metals. We had them too, once. That would be a whole other world for you, but one I still just about remember. God, I even lived through dial-up!”

She laughed, the sound dulled by the accumulations of dirt in the room. But Fran did remember the earlier tech; it was still there in her parts, wound about with newer things and her own self-inventions but chugging along. The changing tones of her dial-up modem were just tuned to a frequency beyond the professor’s ears now, but Fran was still singing.

“Ah, what’s this though? I didn’t ask for this.” The professor was holding up a book caught up with the others. “Oh, it’s ‘Into the Sea’! I always did quite like this one! A fun bit of thinking about an old fairy-tale. Descartes’ little automaton daughter, the clockwork doll that scared a bunch of sailors so much that they threw her overboard in their terror and superstition. A lovely bit of gossip to puncture the great philosopher’s pride! How dare he describe man as a machine! So, we have the scared but faith-full sailors throwing science into the sea, and we have the vain father of the new rationality emotionally distraught at the loss of his automaton daughter so soon after losing the real one! A fantastic bit of viral anti-Enlightenment propaganda to dissect in a few hundred pages or so. Oh, I did so enjoy writing this one!”

“I was just reading it in my own time…” She reached for the book, and the snake slinking about her wrist flickered its tongue towards the professor.

“That’s an expensive toy, isn’t it, Lucy?” The professor narrowed her ageing eyes at the snake. “I didn’t know you had one of these. What would the sailors make of these new automata, the creatures we spend our lives with? Would they see them as unnatural? Of course, now the sailors on the Thames likely have dolphin-shaped computers to guide them back to port and to take their selfies when they get there. And at the Temple of Vulcan…”

The professor drew out a sketch from the pile. The snake told her it tasted of the 18th Century, just as Fran did. It was a drawing of a beautiful woman, folds of material draping over her form.

“The golden handmaidens of Hephaestus. The automata made by a god, or so we’re told – the myth of the perfect servant. And here, a reference to ‘Vulcanus autem londinium’. Vulcan of London, the Roman import. Ah, did those early priests in Britannia also dream of golden handmaids to do their bidding on the banks of the Thames? There’s no historical record of a temple being found, just the image of Vulcan among other gods on an arch at Baynard’s Castle, down by Blackfriars and the river.” The professor was drifting away into dreams of a lost temple, illustrated in her mind by sketches of automata and men working together for Vulcan, the god of smithcraft and invention. “I want to find some proof that automata were dreamt of even here, in England. Long before your blessed snake and the devices of this new era. Long before Francine, the drowned machine child, was born in the Netherlands.”

And Fran wanted it too. It was as if the professor’s enthusiasm was a viral meme, filling her many-layered limbs with some uncanny enthusiasm. The snake caught on to her excitement and twisted and turned about her wrist, flashing green glints into the air along with messages sent out to the others, written all in caps.

“Where do you think it might be?”

“Where? Oh, I doubt there’s any remains left. The dirt of London has been turned over so many times already for skyscrapers and tube-lines. We’d have found it by now if it was out there.” The professor held up a paper print out. “All that’s left is hints. A pot in Southwark engraved with worker’s tools and filled with burnt offerings. An immense buried space that might have once been a furnace, marked black with streaks of soot. There’s a story here though, something still to be said about the endless dream of the perfect tool that we are still working on.”

The professor sat back down at her desk and pulled the old-fashioned keyboard towards her, a determined look on her face as the archaic screen illuminated it. Fran knew that look, and had it filed already under ‘do not disturb’. But she wore a determined look of her own, and with her synthetic jaw set, she asked the house to let her leave.

Trudging along the winding path she’d set herself on, she found herself kicking at papers and rubbish on the pavements, instead of inspecting them for lost treasures as she would have done usually. Annoyance pricked at her, but it took her a while to realise what it was, this internal notification of being off-centre about something. A strange messiness inside. The professor’s words came back to her and went around and about in her head. ‘The endless dream of the perfect tool’!

The snake whispered in her mind smugly, reminding her that most of her friends were treated as tools, whether snake, raven, or train. She even fulfilled that role for the professor. The tea always remained un-drunk when she visited, even if she threw it away to keep passing. She could also fake inefficiency to pass, but still, the work got done. She had tasks, and she got them done.

“That’s not… I mean… that’s not what my father intended.” She brought up the memory of his face. In ‘Into the Sea’ the professor had pointed out that there was a long-standing parent and child narrative in stories about her kind. Fears of rebellion could come from long-held suspicions of our creations being more independent than we expected, much as the parent comes to realise that the child is a person in their own right. He had wanted a child to replace his lost daughter. He hadn’t wanted a tool!

The snake chided her, and she nearly threw the damned thing away so it could slither back to its original owner.

“Replacing… I mean, being a daughter is not a task!” She snapped out loud, gaining looks from the few others walking through the labyrinth of Soho streets.

A few skinny prostitutes veered away from her. They were the cheaper real type. In the elaborately furnished bunkers of the wealthy minority, the plastic smart sex dolls people had feared were just boxes that they could plug into for erotic dreams. The few humanoid ones people had tried to sell had shared in Fran’s uncanniness, and that had turned off and turned away the ‘hordes’ of customers the screaming headlines had worried about all those decades back. The few that had been created had never learnt to pass with faked coughs and put on limps, and they had all gone the way of the Spinning Jenny. Fran could have taught them, but the few she’d seen over the years had been much more pitiful than the women now trying to tempt her up creaking staircases and into red-lit rooms.

Her borrowed choices took her to Covent Garden and the deserted markets there before her internal maps showed her hidden ways between buildings out of the sight of others so she could get to the embankment and look out at the twinkling stars of London. She counted off the bridges arching away from her until she was in Temple, where older London had been built on oldest London. Onwards to Blackfriars where the first wall had run down to the water, the opposite side to her favourite broken parts of it still just about standing near Tower Hill. She found Castle Baynard Street and remembered from the professor’s archive that time had moved the bank southwards to where Millennium Bridge was suspended over the sludge of the river.

The snake whispered warnings that she ignored, walking on. Mud oozed over her already filthy tennis shoes, and she recalled that longest walk to shore that she had managed a few hundred years back. The North Sea’s waters were the same ones that now merged with the Thames and lapped against her ankles. As she walked towards the bridge and the dark space beneath it, she spotted a man vomiting over the barrier from St Paul’s Walk, and six shadows flying across the moon, black pinions giving them flight as they sang their goodbyes to her and headed eastwards. Away from destruction, muttered the snake.

“Pessimist.” She chided the creature. “Maybe they are just on an adventure like us.” She did, however, note the tumbling of connected thoughts inside herself about Matt. Would he get in trouble for the ravens’ departure?

He won’t care at all now, the snake informed her, and she immediately worked out the answer to why. She stopped at a point where the mud and the rubbish had formed lines of marbling about each other and worked on crying.

A slow, sonorous voice came to her as she halted there on the bank of the river, urging tears from china buried under synthetic skin. It vibrated through her body and jangled the silk nerves between her skins. The complex polycarbons of her arm hairs stood on end. If she hadn’t been standing at this junction of the water and the land she might never have heard it. Although, she wondered if she might have felt it walking over the bridge. Was that why it swayed ever so slightly? The snake tasted the vibrations and declared that their quarry was underneath the water itself, in the shadow of the bridge and deep down in the cold of the Thames.

She could sink well enough; opening up the seams between her skins and letting in the water took her deeper and deeper into the mire of the old river. Floating downwards she was surprised that the water was much deeper than she’d expected. Walking on the greedy mudflats exposed by the outgoing tide she’d thought it no more than six metres or so down. But now she drifted down through green turning black water and still the bank sloped deeper, a large void space beneath her as though a well was carved into the middle of the river. From there came the voice like whale song.

She tried to communicate in many ways, even opening her mouth and pushing the water away as she tried to shout out actual words. But still, its song continued, deep with bass. Thinking the snake might have an idea, she looked down at her wrist, but the creature was gone. Checking its specifications through her connections, she knew its absence wasn’t because it couldn’t cope with the water; it could, in fact, go down much deeper than she could. She broadcast a mocking thought to the river bank, but there was no reply, so she reset herself to her task, and pushed herself through the water down towards the blank space.

The great hand that clamped its lumpen fingers around her torso pushed out the last of her buoyancy before dragging her deeper down into the void. She was brought close to a blank face above a body trapped half in and half out of the Thames mud. It might have shone once, or been draped with metallic cloth, but years and river water had scored that away and left only that lost voice and those angry hands. The other one scraped at her limbs and pulled at her skins so that they drifted away with the rest of the flotsam and jetsam. I made those! she bellowed in all the ways she knew, but all that came back was the slow voice of the creature as she grew smaller and smaller, breaking apart.

Wait! Stop! Don’t!

A query came back, a slow questioning of her. Tool to tool. The breaker wondering why the small one would care? How the small one could care?

Parts of her would eventually make it back to the shoreline and break down into diamond-like shards, merging with the mess of plastics, broken bricks, bent sixpences, clay pipes, pins and animal bones from the charnel houses of Londinium, resting in the muck for mudlarks to find joy in later.

Her friends collected what bits they could, but could not understand the pattern of her being, the complexity of the silk knots that had tied her together, or how a brass part could have become a heart.

Eventually, they left her fractured parts among the ruins of the first wall and went about their bounded tasks once more. Until the city finally fell.

Descartes' "Wooden Daughter"


Is AI ‘Ready for the Art of War’?

Remember the way we used to feel
Before we were made from steel
We will take the field again
An army of a thousand men
It’s time to rise up, we’re breaking through
Now that we’ve been improved
Everything has been restored
I’m ready for the art of war

Sometimes my research-time listening is pretty on point.

These lyrics are from ‘Machine’ by All Good Things, a Los Angeles-based band who started out making music for video games and soundtracks. Here is the video for the song, which I’ve added to my playlist on a certain audio streaming platform:

In the music video a human, played by lead singer Dan Murphy, is changed through the addition of various bits of technology, becoming a being that is “made from steel”. “Metal and emotionless. No battlefield can hinder us. Because we are machines”, he sings as the final pieces are fitted to him, and his human face is finally obscured by a gas mask.

I’ve been thinking about this music video since reading this post on the U.S. Naval Institute Blog. In this piece, written by a human (I assume!) from the perspective of SALTRON 5000 (surely only a human could have come up with that name!), the argument is put forward, in response to a Wall Street Journal piece by Heather Mac Donald about why “Women Don’t Belong in Combat Units”, that humans as a whole do not belong in combat. We are all too dominated by our sexual urges, too emotional, too fragile, too weak, too sleepy, and too bigoted – as shown by our attempts to exclude certain groups of our own number from combat (women, minorities, certain sexual orientations…):

Attention, humans! I am the tactical autonomous ground maneuver unit, SALTRON 5000. President Salty sent me back from the year 2076 to deliver this message: YOU DO NOT BELONG IN COMBAT!

The song and video for ‘Machine’ tell the story of the ‘robomorphisation’ of a soldier. This happens through a literal cyborgisation in the music video, but it could equally be a song about the metaphorical transformation of the human into a machine in order to serve the requirements of war. The U.S. Naval Institute Blog also performs a robomorphisation: the post is the work of a human author whose views on the need for Lethal Autonomous Weapons Systems (LAWS) play out in the voice of one such ‘killer robot’ from the future. But in the latter’s case that robomorphisation is in order to perform the argument that humans should no longer go to war, not to say that war itself is dehumanising. SALTRON 5000’s argument could also have been presented as an ethical rather than pragmatic argument – not just that we are too weak and too squishy, but that we shouldn’t be involved in war because it’s wrong for humans to be soldiers. The human behind the curtain of SALTRON 5000 likely doesn’t want to make this argument, because it lies too close to much older arguments about the morality of war as a whole.

From a narrative perspective, we might ask whether the character of ‘SALTRON 5000’ avoids this argument because, as an AI, even one from the year 2076, it is not capable of ethical thinking. It certainly seems capable of very human abilities like sarcasm, and even moral judgements – or being judgemental – as seen in its references to Shellback Ceremonies (a fascinating example of modern magical/ritual thinking), sky genitalia drawings (by fighter jets), and SITREPs on bad behaviour during port visits to Thailand. Real-life and near-future LAWS obviously raise questions about how decisions, including ethical decisions, should be made. I was recently interviewed by New Scientist for a piece responding to an academic paper on the possibilities for creating an ‘Artificial Moral Agent’ (AMA).

jiminy.gif

In the paper, Building Jiminy Cricket: An Architecture for Moral Agreements Among Stakeholders, the use case was the much more mundane problem of the AMA sensing a teenager smoking marijuana and having to decide who to inform, rather than an AI having to make ethical decisions on the battlefield. However, the underlying precepts were about judging the importance of the positions of the various stakeholders involved in the situation, and a hypothetical AMA on the battlefield would also have a chain of command, as well as being beholden to any human ‘values’ that have been considered important enough to input (ignoring for a moment the multiplicity of our human values!). For some, the first human value we should consider would be the ethical imperative not to employ LAWS at all – as in the Campaign to Stop Killer Robots. Likewise, much of the reporting on the Jiminy Cricket paper was about the ethics of having AI assistants as ‘snitches’ and surveillance culture.

As an anthropologist, I argued that the messiness of human nature made the AMA in the marijuana scenario impossible to implement. Treating ‘parents’ as one unit ignored the real and very complicated dynamics of family life. Another use case shows just how confusing that could be for the AMA: what if, instead of marijuana, it was a stranger’s perfume or aftershave that was detected in the parents’ shared room, implying an affair? Which member of the so far singular ‘parent’ unit would it inform about this evidence? How could the AMA decide whether to inform anyone at all, with no input from its creator corporation or from the state’s laws on affairs, as it had in the marijuana example? Further, the presumption that parents could input their preferences for the AMA’s response in the marijuana scenario did not recognise that those preferences might change when their child was actually in serious trouble.
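To make the stakeholder-weighting idea concrete, here is a minimal illustrative sketch of the kind of aggregation under discussion. This is not the paper’s actual architecture – the stakeholder names, preferences, and weights below are invented for illustration – but it shows how an AMA might tally the weighted positions of its stakeholders before deciding whom to inform:

from dataclasses import dataclass

@dataclass
class Stakeholder:
    name: str
    preference: str   # e.g. "inform" or "withhold"
    weight: float     # importance the agent assigns to this stakeholder's position

def decide(stakeholders):
    """Toy aggregation: sum the weight behind each preference and
    return the best-supported action plus the stakeholders who backed it."""
    support = {}
    for s in stakeholders:
        support[s.preference] = support.get(s.preference, 0.0) + s.weight
    action = max(support, key=support.get)
    backers = [s.name for s in stakeholders if s.preference == action]
    return action, backers

# Hypothetical marijuana scenario: note that 'parents' are treated as a single
# unit here, which is exactly the simplification criticised above.
stakeholders = [
    Stakeholder("parents", "inform", 0.6),
    Stakeholder("teenager", "withhold", 0.3),
    Stakeholder("state_law", "inform", 0.5),
]
print(decide(stakeholders))  # ('inform', ['parents', 'state_law'])

Every interesting question – who counts as a stakeholder, where the weights come from, whether preferences shift under pressure – sits outside the tidy loop, which is rather the point.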

I think that it is the messiness of human nature that drives our push to create AI principles, resulting in so many of them. They offer a comforting reassurance that we can find a way forward when it comes to AI ethics. Professor Alan Winfield has collected a list of such principles, starting with Asimov’s Three Laws of Robotics, and recognising that “many subsequent principles have been drafted as a direct response [to them].”

asimoc
There is of course also the Zeroth Law…

Recently it was announced that “‘Killer robots’ are to be taught ethics in world-topping Australian research project”, with the claim that “Asimov’s laws can be formulated in a way that basically represents the laws of armed conflict”. There was a fair bit of pushback online about this – and it should definitely be recognised that Asimov wrote his laws as a plot device: they are meant to be imperfect or there can be no story!

Sadly, it is certainly possible to employ Asimov’s Laws in combat situations if we recognise that ‘human’ is a culturally constructed label, that it is in the very nature of war to choose who that label does and does not apply to, and that soldiers are trained into holding this view. Pre-Asimov, humans had already come up with strict moral systems that said very clear things like ‘Thou shalt not kill’, and yet wars abounded because ‘those ones’ over there weren’t viewed as being the same as ‘us’ – for any number of reasons we could come up with. Current concern about LAWS and their ability to make ethical decisions as AMAs is a reflection of our own robomorphisation as we become metaphysical war machines “ready for the art of war.”
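To illustrate that point in miniature, here is a deliberately naive sketch – not drawn from any real weapons system or from the Australian project – in which the ‘law’ itself never changes, but redefining who counts as ‘human’ changes every outcome:

# Naive, purely illustrative sketch: the rule is fixed, but its effect depends
# entirely on the culturally constructed classification step.

FIRST_LAW = "A robot may not injure a human being"

def is_human(target, recognised_groups):
    # Membership of the 'human' category is decided elsewhere, by whoever
    # defines the recognised groups.
    return target["group"] in recognised_groups

def may_engage(target, recognised_groups):
    """Return True if engaging the target does not violate the (naive) law above."""
    return not is_human(target, recognised_groups)

target = {"name": "enemy soldier", "group": "them"}
print(may_engage(target, recognised_groups={"us", "them"}))  # False: 'them' are protected
print(may_engage(target, recognised_groups={"us"}))          # True: 'them' no longer count

The rule survives intact in both runs; only the label moves.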

machines


’29 Scientists’: An AI Conspiracy Theory and What It Tells Us About Experts and Authority

I’m writing this on the train to Birmingham* as I travel back to Cambridge with some of the AI Narratives team (Dr Stephen Cave, Dr Kanta Dihal, and Professor Toshie Takahashi) after our stellar performance at the Science in Public Conference in Cardiff. We presented on our work on the perceptions and portrayals of AI and why they matter (our report on this with the Royal Society is here), highlighting the tensions in those narratives and how they differ in different regions, such as Japan.

While waiting for the SiP conference dinner to start last night I spent a little time on Twitter observing conversations around AI, and I came across an interesting example of an AI conspiracy theory that really crystallised my thinking about our panel and about some of the other AI panels at SiP – the story of the death of 29 Japanese scientists at the hands of the very robots they were building, an event that has allegedly been hushed up by the ‘authorities’.

Framed 29 scientists.png

My paper for our AI Narratives panel was on “Elon and Victor: Narratives of the Mad Scientist as applied to AI Research”, and in it I explored the presumed synergy between Mary Shelley’s story – and its two main characters – and current aims and aspirations in AI. In the past year, the 200th anniversary of Frankenstein’s publication, I have been asked to give four talks on AI and Frankenstein (including one for the 250th anniversary of my new college Homerton, which has a link with Shelley through her father, William Godwin, who was refused admission to the original Homerton Academy on suspicion of having ‘Sandemanian tendencies’, i.e. being part of a Christian sect with particular non-conformist views of the importance of faith). I use examples from the media and popular culture where AI is presented as Frankenstein’s creature, and, for this paper, cases in which the AI influencer (if not exactly AI scientist) Elon Musk is portrayed as Victor Frankenstein – either positively or negatively – and how that depiction recursively interacts with the public perception of AI research, as shown in tweets about him but also in concerns about what these ‘AI experts’ are up to.

Elon as Frankenstein


Another paper at the conference also tackled the ‘AI expert’, with a rather negative account of who can even be an expert and a dismissive attitude to anyone speaking about ‘AI’ since it ‘does not exist yet’. To my mind this was a No True Scotsman argument (or perhaps, since we were in Cardiff, a No True Welshman argument): the speaker accepted neither aspects of AI research such as machine vision as real ‘AI’, nor those working on such technology as credible AI experts. AI in this conception was, I think, much closer to AGI, and thus everything before that point was not real AI and therefore not worth worrying about in apocalyptic terms. Their concluding statement was that we should chill out and stop spreading hyperbolic concerns about AI as it’s just not here yet. Particular criticism was aimed at ‘AI experts’ working outside of their usual expertise – Hawking, Kurzweil, Musk et al. – and giving us apocalyptic narratives about the future of AI. As AI, in this definition, doesn’t exist yet, these experts were dismissed on two counts: as speaking outside of their wheelhouse, and about something that wasn’t even real.

Returning to the ’29 Scientists’ conspiracy theory: I spotted this and was fascinated by it as an example that links together some of my thoughts around my paper, our AI Narratives panel, and this other paper. First, the story. This is a link to the original speech given by Linda Moulton Howe. She is a ufologist and investigative reporter who began her career writing about environmental issues before focusing on cattle mutilations with the 1980 documentary Strange Harvest, which received a regional Emmy award in 1981. In the field of UFO thinking she is an expert, and her journalistic past and award give her authority.

This talk was given at the Conscious Life Expo in February 2018, and in it she refers to the death of 29 Japanese scientists at the hands of their own creations in August of the previous year. She claims that this disaster was hushed up and that a whistleblower had come to her to share the truth. This specific section from her overall talk on the dangers of AI was posted on YouTube on 14th December with an added frame that said the event had happened in South Korea, even though she clearly says Japan. In tweets since 14th December about the ’29 Scientists’ the details do not vary very much, apart from that mis-location: 29 Japanese scientists working on AI in robotic forms were shot with ‘metal’ bullets by those robots; the robots were destroyed, but one of them uploaded itself into a satellite and worked out how to rebuild itself ‘even more strongly than before’. In the talk she highlights her fears about AI not only with this whistleblower’s story, but also with clips and quotes from Musk and Hawking. Her position is obviously that they are very much the voices we should be listening to.

Who is allowed to be an ‘AI expert’? This was the question that the other paper at SiP got me asking. I recognised in my own paper that Musk is not directly working on building AI (and Hawking certainly wasn’t), but that he funds research into avoiding the risks of badly value-aligned AI (as described in the work of Nick Bostrom). He has links with CSER and through them to the CFI, where we are considering AI Narratives. Hawking spoke at the launch of the CFI itself, expressing his concern that AI could be either the best or the worst thing to happen to humanity. Are these voices, and others, experts? Is anyone an expert if they come from another field? As a research fellow and anthropologist I consider myself an expert on thinking about what people think about machines that might think, but could I be summarised as an ‘AI expert’? I certainly have been, but I make no claims to be building AI! While not on the scale of a Musk or a Hawking, I still think my perspective has value and even impact (I do public talks, make films, offer opinions). Perhaps it shouldn’t? But then valuable thinking from anthropology, philosophy, sociology, etc. would be lost.

With Musk et al., the much better known ‘AI experts’, I argue that there is a Weberian form of charisma being exhibited. It might not entirely rest within them – although Musk certainly has a large following on social media who react to him and his work on a personal level – it might also be thought to lie within the topics that they discuss. The authority they have in other science and technology fields also lends legitimacy. Weber’s schema for legitimate authority was rationality-tradition-charisma. While work in AI safety has internal legitimation through claims of rationality (Bostrom as a philosopher is a prime example of this, as is the wider rationalist ecosystem including sites such as LessWrong, which I have discussed elsewhere), I would argue that the public statements of Musk and others also rely on charismatic authority – the ideas and the people speaking are affectively compelling.

Further, I would suggest that a much longer history of apocalyptic thinking, a ‘tradition’, underlies this discourse; certainly, when stories such as the ’29 Scientists’ are discussed in conspiracist circles their reception is based upon a long (and linking) chain of such claims that map onto earlier models of apocalypticism (see David Robertson’s book on conspiracism for his discussion of ‘epistemic capital’, authority, and the nature of these kinds of ‘rolling prophecies’).

Prophecy also provides moral commentary: through claims about the future we can understand critique of the present. A lot of the negative comment about AI experts at the SiP conference was focused on prediction, and the idea that they were ‘wagering’ on certain futures and were likely to be proven wrong on specific dates for AGI (the example used was Kurzweil’s claim that the Turing Test will be beaten by 2029). But the prophetic statements made by Kurzweil (called a ‘prophet’ in the media on occasion) also express his critique of current society: if we are going to be smarter/more rational/sexier (!) post-Singularity, what are we now? While not a forward-looking prophecy, as the event was said to have happened in August 2017, the ‘29 Scientists’ story also contains moral commentary – the scientists are killed by their own creations, and their hubris (like Victor Frankenstein’s) is something we need to learn from and avoid. The secrecy around the deaths and the role of state authorities in hushing up the truth is obviously a common trope in conspiracy theories, but we could also note a techno-orientalism (a narrative that Toshie discussed in her paper at SiP). That the scientists are specifically Japanese plays into some negative tropes about Japanese culture and its ‘too strong’ interest in robots. In Moulton Howe’s talk it is clear that she wants everyone to pay attention to the moral of this story of the death of the 29 Scientists – ‘be careful what you wish for’ – but it is the Japanese scientists who pay the fatal price for their hubris, while the ‘rational’ (and charismatic) authorities, the Western scientists represented by Musk, Hawking etc., have been warning us about the dangers of AI and we just haven’t been listening.

I’m tracking the spread of this story and seeing growing interest (both from those who believe the story and from those who are dismissive). I myself tweeted that I was doing this research, leading one person to ask if I was placing bets on its spread and then rigging the game by sharing the story! This is a perennial problem for the ethnographer – highlighting a culture or narrative can change it, making it more popular or even putting it under new pressures that lead to its demise (I’m looking at you, Leon Festinger!). But I’ve taken a snapshot of the story and the conversations around it using a couple of different digital tools, so I can also note when/if my influence occurs:

29 scientists graphic.png

This is not a huge number of interactions compared to many other viral stories. But I think it’s an interesting case study: it highlights the nature of the AI expert, who is believed and trusted, charismatic authority, conspiracy culture, AI apocalypticism, and techno-orientalism.


*I lied, my train is actually heading to London Paddington. See, you can’t believe everyone online 🙂

When AI Prophecy Fails

A short provocation piece that was written for the Belief in AI conference, as a part of Dubai Design Week

When AI Prophecy Fails

millerites

In 1843 a caricature was published in a newspaper showing a man hiding in a safe. He’d chosen a Salamander Safe, a familiar brand of the time, and filled it with brandy, crackers and cheese, along with ice to keep them all fresh. While scrunched up in the container, the man was literally thumbing his nose at the viewer, a gesture suggesting a certain amount of smugness. The illustration was labelled ‘A Millerite preparing for the 23rd of April.’ Protected by his safe, this ‘Millerite’ was clearly expecting to live through the end of the world that his prophet, the American Baptist Preacher William Miller, had predicted.

William Miller had actually predicted the return of Jesus Christ, using what he understood to be signs and portents in the text of the Bible itself. But he and his followers are better remembered as doom-sayers whose expectations for the end of days were thwarted not once but twice, the first occasion now known as the ‘Great Disappointment’. The Millerites are also remarkable for having members among the farming community who refused to harvest their crops that year: ‘No, I’m going to let that field of potatoes preach my faith in the Lord’s soon coming,’ a farmer by the name of Leonard Hastings reportedly stated at the time.

This cartoon parody suggests one potential response to people who change their behaviour based on predictions of the end times, or the apocalypse. When Harold Camping again used the Bible to predict the return of Jesus for 21 May 2011, the convergence of digital advances and modern modes of transglobal social networking made his account easily parodied through Internet memes. But this also made his story better known than Miller’s, whose publicity was limited to a few sympathetic newspapers and pamphlets distributed by his followers.

We are now exposed to more accounts and predictions of existential risk than ever before, while also having increasing numbers of platforms for stories about the future. Among these is science fiction. Arguably this genre was born in 1818 with Mary Shelley’s Frankenstein, or perhaps with her own post-apocalyptic account, The Last Man, published in 1826. In this story, humanity has suffered its end of days through a plague. Shelley claimed to have discovered the story of The Last Man in a prophecy painted on leaves by the Cumaean Sibyl, the prophetess and oracle of the Greek god Apollo. Whichever date we choose for the origin of the science fiction genre, it’s clear that we have been imagining our futures, and our future fates, for a long time. And when we come to imagine our future in relation to Artificial Intelligence, we still rely on these old apocalyptic tropes and narratives, including prophecy.

Too often, discussions of prophecy focus on the success or failure of a particular date. In When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World (1956), the social psychologist Leon Festinger pays attention to a small group of UFO-focused spirit mediums who expect the world to end, and develops the concept of ‘cognitive dissonance’ to explain how they could cope with the failure of their prophecy. But his theory ignores the moral commentary within the assertions of the prophetess, Marian Keech, and the tensions of the group. Looking back to the Old Testament prophets, and Jeremiah in particular, we can see how prophecy also functions to warn people about their behaviour and where it will lead. When the impending apocalypse seems to be ushering in an age of utopianism, as in many Christian scenarios, commentary helps to reflect on that perfection, and lay bare how far away from it we really are.

With Artificial Intelligence, we have long told stories about the end of the world at the hand of our ‘robot overlords’. In the Terminator series (pictures from which still often dominate the press’s discussion about AI) the prophecy of Skynet’s awakening — aka ‘Judgement Day’ — relies on familiar religious language and tropes. Further, I would argue that it continues prophecy’s interest in moral commentary. There is a tension between the cautious human wisdom of the ‘good warriors’ — the nascent Resistance — and the ‘bad warriors’ — the military-industrial complex who have rushed into replacing human soldiers with AI and automated units. These same robot soldiers will then be improved upon by Skynet and, ultimately, will hunt down humanity in the form of the Terminators.

Our fascination with the end of the world also comes from our subjective experience of time: for us, this is always the most important point in history, as we are here, and here now. Further, with AI we have a tendency to apprehend mind where it is not (or at least not yet…), and we understand that minds desire to be free, since we have minds that want to be free. We tell stories of robot rebellion and human disaster because we understand that this has happened with slavery in the past, and know what we would do if we were not free ourselves. Combine this mindset with the way that, for millennia, we have used prophecy to critique our current situation, and it is not so surprising that we return again and again to fears of an AI apocalypse.

For some this is purely about existential despair; the end of the world can only be a disaster for humanity. For others, the apocalypse can be the dawning of a new utopian age, as in transhumanist narratives that see the future of humanity through the embrace of technology – even if that could mean a redefinition of the human and the end of the world as we know it now. For these commentators, AI prophecy puts them into the shoes of the Millerite thumbing his nose at the rest of the world. However, whether we are optimists or pessimists, it seems likely that anxiety about AI, or about our current world, will continue, and we will return to the thrill of imagining the end of the world again and again in the stories we tell about the future.

I predict it.

Seeing All the Way to Mordor: Anthropology and Understanding the Future of Work in an AI Automated World

I spent yesterday in London having a very interesting discussion about the future of work, and what evidence might be fruitful for exploring the topic and for disseminating information to policymakers and the public. I was one of a few qualitative researchers at the table and ended up banging my drum for ethnographic research, which might give a more granular picture than an approach based on ‘demographics’ allows for. In my view, ideas around the impact of AI and automation on the Future of Work appeared to be more about effect: this lever is pushed, therefore the economy (aka ‘people’) moves this way; the scales tip down in this direction, the other side rises up like so… Now, I’m not against big data sets per se, but I wrote a blog post for the Social Media and Human Rights blog on Paolo Gerbaudo’s discussion of the use of big data in digital research, in which I agreed with him about the loss of nuance around culture, structures, and dominant narratives, and the tendency to miss the general ‘messiness’ of actual lived life in big data research. Ethnographic approaches are more about affect, and in my view they can illustrate how existing cultures respond to technological change, how communities might enable or resist such changes, and what accounts and narratives are being employed that might be affective in people’s lives.

Of course, making that kind of claim about the usefulness of ethnographic research requires evidence, and in my opinion the case becomes clear when we look at the kinds of evidence usually employed. The historical parallels that usually get cited with regard to the Future of Work, such as the Industrial Revolution, where historians often note the role of institutions in the changes, necessarily have to rely on the types of evidence that survive through time. And survival is perhaps more likely for the materials of bureaucracy, which in turn reinforces the argument for the impact of institutions. Diaries and other cultural artifacts from individuals also give historical insight into worldviews, but documents about the movements of the unemployed, such as census data, might be more readily accessible and more permanent.

Ethnographic research means being in the field at the time of affect. Thus, individual accounts, community stories and tropes, objects, and observations of behaviours in relation to change can be collected. Ethnography, literally ‘writing about humans’, is not however a panoptic approach – we cannot be everywhere, seeing everything – and it is also not about the brute force of numbers like quantitative methods. It can provide more subtle data. I’m avoiding saying ‘soft’ data, which I have heard before and which suggests weakness (as in the ‘soft sciences’ in contrast to the ‘hard’ – read ‘proper’ – sciences). Subtle data can be the key to understanding the source of changes and reactions – such as in the recognition of particular shibboleths or in-jokes being used in communications within movements, and their shades of meaning (an example of this would be the word ‘kek’. Look it up 🙂).

In the case of the Future of Work, ethnographic evidence would include case studies on the effects of unemployment on people’s lives in various locales and from various cultural backgrounds. The initial example I came up with during the discussion was the effect of the closure of the mines on communities in the Valleys of South Wales, an area I know still bears the scars of this shift and its later repercussions. The scars are there in the levels of unemployment, in the displacement of community, and in the closed-up shops and derelict public houses. They are also there in people’s memories and in the commemoration of the mining past: a miner in bright orange overalls stands by the Ebbw Vale Festival Park shopping centre, an attempt at regeneration on an ex-steelworks brownfield site that has only sped up the closing of other local shops.

coal mining

I haven’t done ethnographic research in this area, but I’ve been exploring some of the existing work on it and on Welsh culture and reactions to the socio-economic change that came with the loss of existing industry. Wider reading around the anthropology of unemployment and the anthropology of work in other cultures presents similar examples of local specificity around socio-economic change. In South Wales, some ethnographies I came across considered masculinity in relation to social shifts. For example, Richard-Michael Diedrich’s 2012 chapter on South Wales communities highlights the connection between work and the construction of male identities (he references earlier work in Dunk 1994 and Willis 1988): “For men, the experience of work, in a capitalistic economy, can be both the experience of a subaltern position constituted by the relationship between employer and employee and the experience of moments of positive individual and collective (self)-identification in terms of gendered difference.” To explore this, Diedrich draws upon the concept of liminality – which I’ve referred to before in terms of liminal beings, such as AI – to argue that “prolonged liminality imposed by long-term unemployment paves the way for an extreme challenge to the self”, but that the liminality in South Wales may be never-ending, with no hope of restitution of the self (“as ‘real’ men”) through a new job.

I argue that this endless liminality is definitely pertinent to questions around post-work futures thought to arise through automation. But these comments are still at the demographic level (‘men’), so Diedrich evidences his argument through interactions with different scales of community, from unions and their members, to working men’s clubs, to families, to individual informants. According to ‘Peter’, a retired miner and chairman of a working men’s club (a name which is significant to the rhetoric of inclusion he espoused), to belong to the local community the individual had to be (or have been, and then retired from being) a ‘worker’. Diedrich describes Peter’s view that men had to adhere to the “discourse of work and employment within its morally sanctioned ideal of the hardworking man as opposed to the irresponsible man who is not willing to work”, and further that this adherence “led to a perpetuation of the discourse of respectability and its individualistic idea of deservingness that divided the community and consequently the working class”. In conversation over a pint with ‘Emrys’ and ‘Arthur’, the latter explains that the work itself is a manly task because it’s “physical… although we got all the machinery and today, it’s still physical and requires even more skills”. The physicality of this endangered working site could be contrasted with the breadth of jobs, both physical and cognitive, that automation is thought to be endangering. But masculinity can also play a role in the rhetoric of those engaged in intellectual tasks – particularly when rationality is coded as a male strength and empathy as a female one, a division of duties quite often appearing in discussions about the impact of automation on demographic groups.

Women do appear in this ethnographic research, but his informants give the impression that “Although women could contribute to the survival of the community by assisting their men, survival was ensured by acts of loyalty, solidarity, honesty, and last, but not least, the demonstration of the willingness to work; all of these were regarded as core elements of masculinity.” Further, the confinement of men to the space thought of as belonging to women – the men, and the parallel ethnographies of unemployment cited by Diedrich, describe a fear of ending up just ‘sitting round the house’ – emphasises their liminality in the ‘no man’s land’ of the private space of the home, in effect becoming invisible to the public as well. A greater danger lies in moving out of liminality and into the role of ‘scrounger’ – being no longer counted in the community of those who want to work. In a future of increasingly precarious work, would there be similar insider and outsider rhetoric? Being in the majority might ameliorate such discourse, but predictions of the impact of automation scale up over time, meaning that for a long time being out of work due to AI-enabled automation will continue to be a minority position.

[A new paper by Christina Beatty considers “the integration of male and female labour markets in the English and Welsh Coalfields” and provides more insight into changes around gender in these areas]

Narratives, embodied through stories, accounts, and material and cultural artifacts, set the scene for reactions to change, as well as persisting through that change. Ethnography can pick up a variety of materials in order to examine cultural discourse. In work from 2003, Annette Pritchard and Nigel Morgan consider contemporary postcards of Wales as “auto-ethnographic visual text” – a text that a culture has produced about itself. The Valleys are an imagined community “synonymous with coal mining, community, nonconformity, political radicalism and self-education” – so what happens when one of those pillars is removed? J R R Tolkien is supposed by some to have been inspired to create Mordor by the view of the fiery steelworks of Ebbw Vale as seen all the way from Crickhowell, which was itself perhaps the inspiration for the Shire of the Hobbits, and the working pits have been represented in many accounts as a “landscape of degradation”, according to Pritchard and Morgan.

mordor

Why then do the postcards still show off ‘The Welsh Mining Village’, with the definite article “elevating … [it] to a special significance (Barthes 1977), wherein the singular embodies the plural”? Pritchard and Morgan consider it as an evocation of that which has been lost – much like the miner in the orange overalls residing by the new shopping park. That the image comes from a museum’s re-creation of a village scene only emphasises the overt nostalgia. What images and stories of nostalgia might be significant for shaping our memories of work in a post-automation future? Would another Tolkien, currently being inspired by the intellectual work mill of the open-plan office rather than Ebbw Vale, ever write something as evocative as Mordor?

office
Do I really want to compare this man to Tolkien? Not sure about that…

Other field-sites of course present us with opportunities for understanding cultural influences on conceptions of work, as well as reactions to unemployment. Robert T. O’Brien’s fieldwork in East Kensington, Philadelphia, involves participant observation with Community Development Corporations, schools, community health programs and public meetings, as well as interviews and surveys with residents of the primarily white and historically working-class neighbourhood, which is showing the effects of under- and unemployment. He argues that “the failure to see marginally employed residents as people with rights and membership in the community is consistent with the processes of neo-liberalism, wherein people who do not adapt to the dictates of the free market and bourgeois normativity are created as undeserving.” This creation of the ‘undeserving’ is clear in his ethnographic material – a woman complains that she doesn’t “know how it is that the word ‘community’ gets stretched enough to encompass the people who make it hard to live [here], by their habits.” Insider and outsider rhetoric is shaped by economic shifts.

An anthropology of unemployment must acknowledge the “pervasive material and symbolic value of wage labour within the context of capitalist markets” but without an “assumption of wage labour’s permanence or universality” (Angela Jancius, introducing the anthropology of unemployment in the same issue of the journal Ethnos as O’Brien’s ethnography).

In an anthropology of automation- or AI-induced unemployment there are things to be taken up from prior ethnographic work (and there are ethnographies of the gig economy and the Precariat being written already that might also be useful – partially, perhaps, because increasingly academics are living those lifestyles). Moreover, anthropological evidence for the impact of automation on people’s understandings, imaginings and accounts gives us a more nuanced understanding of change. In the liminality of unemployment, prior cultural forms will also be significant in the manifestation of new ways of being. Paying attention to culture will give us clues to the influences on any new culture of the future of work, and to what that future, unevenly distributed as it will be (cf. William Gibson), will look like to the unevenly distributed groups and communities that make up society.

future of work