The Human in the Loop

Hi Reader,

This month we hosted our first New Protagonist meetup. This was a chance for our members to learn more about each other, and share dominant narratives that stuck out to them recently. We're aiming to host these monthly so that we don't lose steam in thinking through these issues together. Remember, if you want to be part of the New Protagonist Network, please register your interest here!

🎙️ New podcast episode: what's it like to work in the AI supply chain?

Super proud of how this one has come together. In this episode, we talk about the kinds of jobs that are being created as AI systems grow, how those jobs are evolving, what the labour conditions of those jobs are like, and who is benefitting from these systems. Guests include James (Mojez) Oyange, Yoel Roth, Catherine Bracy and Cori Crider. Please listen & share far and wide!

When a sweatshop of underpaid workers cosplays as AI
By Georgia Iacovou

This didn't make the final cut of our latest podcast episode, but there was a conversation Alix had with Catherine Bracy, in which Catherine described airline pilots as "the human in the loop": a person who is present to, in some way, oversee and intervene on what is largely machine-led work. Autopilots are now sophisticated enough that human pilots don't need to be as skilled as they used to be; they're just there in case something goes wrong, and to provide implicit warm-blooded comfort to passengers.

There are two key things at play here. Firstly, this is a striking parallel to what's happening to tech workers right now: the once benefit-addled software engineer no longer holds the same prestige in their role, because the perceived value of their work is going down while the perceived value of automation is going up. Secondly, the airline pilot as "the human in the loop" speaks to how different types of labour are being surfaced and obscured within any business that purports to leverage automated systems.
If the human airline pilot is there in part to reassure passengers that there is indeed someone with a pulse in the cockpit, then much of their labour has been relegated to caring labour, which I think is a useful term in conversations about AI and automation. It's the kind of labour that, for now, only humans can do, both in a technical sense and in a socially acceptable sense. For example, not only does a machine lack the necessary skills to raise a child, but it would also be considered morally repugnant if we allowed one to do so. Traditional caring labour, such as parenting, nursing, being an assistant, or even restaurant work, is notoriously undervalued.

Over the last decade and a half, the tech industry has managed to cultivate a new flavour of caring labour, and then value it even less. It's represented in the droves of ghost-workers who must consume and moderate violent and disturbing online content so that we don't have to; or the cleaners and front desk staff whose wages and bargaining power are suppressed by bizarre noncompete clauses (fortunately the FTC have recently cracked down on this practice). This is the kind of labour that is necessary to 'activate' the high-grade intellectual labour provided by software engineers, product managers, and innovation leads. And in many cases it's the kind of labour (e.g. content moderation) that is needed to make tech products usable at all. In spite of the obvious value they provide, these workers are partitioned from their peers and organisational contexts, because tech companies only employ cool creative intellectuals. This approach has brought us to an extremely precarious inflection point, where the tech labour market has quietly made 'AI' and 'automation' synonymous with 'outsourced worker'.
You may have heard that Amazon are rolling back their cashierless 'just walk out' technology at Amazon Fresh stores, and that this system was likely propped up by an unknown number of workers in India, remotely watching what items shoppers put in their baskets. The lack of clarity around how much involvement these remote workers have is of course a huge part of the problem. Amazon are now selling this 'tech' to third-party vendors. Or are they just selling cheap invisible labour?

Blurring the lines further, 404 Media recently reported on a chicken shop in New York that replaced the cashier with a woman on Zoom. This weird, impersonal, hands-off approach to service employment dehumanises this woman to the point where some questioned whether she was even real (she is). While the chicken shop is not a large-scale tech company, this practice is congruous with the 'let's just outsource everything' model.

Finally, I want to leave you with an example that has no human in the loop: the man who successfully sued Air Canada because their chatbot gave him inaccurate information about their bereavement policy. Air Canada's initial response to the complaint was to anthropomorphise the chatbot, saying it was "responsible for its own actions", which is such a flagrantly negligent comment that it's almost comical. It also demonstrates how willing corporations are to humanise machines and dehumanise humans, even though humans have the ability to care and feel pain, and machines very much… do not.

Our friends at the AI Now Institute are hiring!

They have two open positions: Associate Director and Operations Director. Applications close on the 2nd of May.

Thanks for reading, and as usual please respond with any thoughts and feedback you might have on the subjects we've covered here!

Alix

If this was forwarded to you, sign up here.