This morning the other parental unit and I downloaded Portal 2 on the Xbox One for the child unit, who has now completed the minimum eight rotations around the sun necessary to understand the tricky quantum tunnelling mechanics at work in the Aperture Science Handheld Portal Device, or ‘portal gun’. Possibly. Maybe.

There are many good reasons to download a Portal game (Portal 2 has a genuinely brilliant local co-op mode), but of course, as soon as GLaDOS got her first mention I started thinking about AI. And cake.

If you aren’t familiar with either Portal 1 or Portal 2, here’s my TL;DR version of their game mechanics and plots (spoilers ahead, of course). In Portal 1, you play as Chell, a subject in the numerous testing rooms of the Aperture Science Laboratories Computer-Aided Enrichment Center. The portal gun you find enables you to create two linked portals, blue and orange: walk into one and you come out of the other. Because momentum is conserved between them, you can fall into one portal and be flung out of the other into previously unreachable areas and doorways, moving on to the next room and, eventually, maybe, to your freedom.

[Image: flinging between portals]

All the while, your efforts are being assessed and snarkily commented on by GLaDOS (‘Genetic Lifeform and Disk Operating System’, a homage to MS-DOS), an AI built by the creators of the Aperture Science labs. We find out during the course of the tests that GLaDOS turned on the scientists of Aperture, releasing a deadly neurotoxin that wiped them all out. So far, so psychotic femme-voiced killer AI.

GLaDOS’ passive-aggressive comments on your progress, and on your pathetic humanity, also come with promises of a reward at the end of the testing process: a cake. However, we also see graffiti from a lab worker, Doug Rattmann (the ‘Lab Rat’), warning that the cake is a lie.

[Image: the cake]

We defeat GLaDOS at the end of Portal 1, ending up lying on the ground outside the facility gates, surrounded by the remains of the AI. But with the release of Portal 2, the original ending was retconned to show Chell being dragged away, back into the facility, with a robotic voice thanking her for lying on the floor and assuming the “party escort submission position”. GLaDOS then sings over the credits, letting us know that she’s Still Alive. She does indeed return in Portal 2, with even more test rooms. For science.

I don’t think it’s a complete coincidence that we downloaded Portal 2 now, just as a whole generation in the UK have been tested and are now receiving their results: first their Scottish Highers and A-levels, with Scottish National 5 qualifications and GCSEs soon to follow. Just as we’ve been watching an AI guiding, and lying to, a test subject, around 280,000 A-level students have had their grades adjusted downwards by an algorithm. Others have written detailed explanations of where this decision-making system has gone wrong, not only in its mathematical standardizations but also in how it makes assumptions about this school year on the basis of previous years. Go to an already high-attaining school (likely a private or public one) and the algorithm marks you upwards. Go to a school with a history of lower grades and there’s no way you could really deserve that A you were predicted. These students have played the game, been promised cake at the end of it if they worked hard enough. But the cake was a lie.
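To make that dynamic concrete, here is a toy sketch in Python of the capping side of such a rule. To be clear, this is not Ofqual’s actual model (which was far more elaborate, and could also adjust grades upwards); every name and rule below is invented purely for illustration:

```python
# A toy sketch only: NOT Ofqual's actual model, just a hypothetical
# illustration of the dynamic described above, where a school's historical
# grade distribution overrides an individual student's predicted grade.

GRADES = ["U", "E", "D", "C", "B", "A", "A*"]  # lowest to highest

def standardise(predicted: str, school_history: list[str]) -> str:
    """Cap a teacher-predicted grade at the median grade the school's
    previous cohorts achieved (invented logic, for illustration)."""
    history = sorted(GRADES.index(g) for g in school_history)
    expected = history[len(history) // 2]  # the school's median past grade
    predicted_rank = GRADES.index(predicted)
    # A strong prediction from a historically low-attaining school gets
    # pulled down; the same prediction at a high-attaining school survives.
    return GRADES[min(predicted_rank, expected)]

# The same predicted A lands very differently depending on the school:
print(standardise("A", ["A*", "A", "A", "B"]))  # -> A (high-attaining history)
print(standardise("A", ["C", "D", "D", "E"]))   # -> D (lower-attaining history)
```

The same predicted A survives at one school and collapses into a D at the other, through nothing the individual student did.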

And they are angry.

I’ve written elsewhere about what happens when we trust ‘the algorithm’ too much; when we hand over so much of our agency that it becomes something we see as holding sway over our fate – so much so that we can feel we have been ‘Blessed by the Algorithm’. Theistic metaphors hold true here too: I imagine many of these students currently feel ‘Cursed by the Algorithm’ as opportunities they have worked so hard for vanish from view. While our governments may make more of those U-turns they are becoming infamous for (Scotland has already said universities should accept non-adjusted grades), at the moment we have a clear example of too much trust placed in a bad system, as well as the suggestion that any system we place so much trust in might be bad for us.

There are of course differences between GLaDOS and the code put together by Ofqual, the exams regulator, that has created this current mess. Like so many other science-fiction AIs, GLaDOS is credited with personality, intention, and even cruelty. However, our tendency to anthropomorphize even the most abstract of mathematically driven decision-making systems means students end up shouting ‘fuck the algorithm’, as though personally hurt by this string of code. They’re also shouting at Boris and his cronies, of course, but there’s a danger in slipping into anthropomorphic thinking when it comes to the systems that actual humans in power are using to make decisions about our futures.

The creators of Portal leant into this anthropomorphism to make GLaDOS a villain. Early on in development, they also realised that one particular test required anthropomorphism to make it work for the player. In a ‘box marathon’ test, the player has to get from one side of a difficult test room to the other while carrying a cube. At first, players kept forgetting the box and leaving it behind. Then one of the game’s creators, having read some “declassified government interrogation thing” in which “isolation leads subjects to begin to attach to inanimate objects”, suggested making the cube ‘cute’. They added a pink heart to its design, called it the Companion Cube, and gave GLaDOS some lines about how it was unlikely that the cube was sentient – at least, it probably wasn’t. Then, at the end of the box marathon test, the player has to learn how to use an incineration device – needed to defeat GLaDOS later on – and has to burn up the Companion Cube:

“Although the euthanizing process is remarkably painful, eight out of ten Aperture Science engineers believe that the Companion Cube is most likely incapable of feeling much pain.” – GLaDOS

The player’s connection makes this moment genuinely affective. But, as Michael Rosen’s poem suggests, the dangers of handing over agency to algorithms lie not just in reifying, or deifying, them but also in the counter-reaction: reducing the people we enter into the system to ‘just’ data. This is Goodhart’s Law and then some: ‘When a measure becomes a target, it ceases to be a good measure’. And when a student becomes something to be measured, becomes data – your school has produced U-grade A-level students, therefore you must be a U-grade student – the individual is lost in the algorithm.

Reifying and anthropomorphising the algorithm obviously obscures the humans – the students, in this case – fed into the system. It also obscures the humans behind the system. At the moment, students are trying to hold people in power to account, protesting outside the Department for Education in London. But it’s increasingly obvious that a certain flavour of technological accelerationism is gaining ground in the policy thinking of the UK government, signified by the adoption of the ‘move fast and break things’ mindset of Silicon Valley. The Barnard Castle fan and Chief Advisor to the PM, Dominic Cummings, has become the poster boy for this kind of approach, but he wouldn’t be so successful if there weren’t others who agreed with him.

When this kind of thinking clashes with the realities of people’s lives, we’ll get these moments of rebellion against the algorithm, of Chell facing off against GLaDOS. But unless light is shed on the more insidious moments when the government trusts algorithms too much, we might not know all the times when the cake is a lie.

Some tech ethicists calling for more transparency are trying to find a middle way between ‘fuck the algorithm’ and ‘assume the party escort submission position’. I’m always a little wary of including myself in this endeavour, as the banner of ‘ethics’ is increasingly being used – by corporations in particular – to obscure these very problems. They can’t be unethical, they have an ethics board! But we can all get involved in bringing these issues to light. For instance, some, like the journalist Carole Cadwalladr, have pointed out the irony of Roger Taylor chairing both Ofqual and the UK’s Centre for Data Ethics and Innovation (the body tasked with ensuring algorithmic fairness).

(As I write this, at 2.56pm, there are reports that he will make a statement at 4pm, likely announcing a Scotland-style U-turn. We shall see.)

(Hi, it’s 4.03pm, Beth here: yes, we have a U-turn! Thoughts with all the admissions offices that now have to deal with this revision when there’s a cap on places!!)

The Portal games are challenging, mind-bending puzzles with amusing representations of AI and robots (and the co-op version lets you play as a pair of robots who are always being blown up, which is more fun than it sounds), but for all that GLaDOS guides Chell through them, they are not a guide for life. It’s a few years until my small human needs to worry about exams, but already he’s in a system that will regularly test and grade him like Chell. So I’ll need to teach him about the algorithms that promise you cake in the real world, and about the humans who wrote the lie in the first place. Portal seems a good place to start…

[Image: a cake doodle]
You got to the end, so here is the cake: a doodle I drew while writing my notes for this post 🙂
