Hello! You might have followed the sit-in of Google employees protesting Project Nimbus, a cloud infrastructure contract with the Israeli government. I sat down with one of the core organisers, Dr Kate Sim, to talk more about it. Listen to the conversation here!

We discuss how the sit-in came about, how she distinguishes between moral and immoral B2B relationships with countries, and the role that employees should play in shaping the mission and focus of businesses. I’m still left with questions about how technology for defense can be done well, and how we can avoid a situation where the most abominable companies build monopolies at the intersection of technology and war because the more reputable companies won’t touch this work.

Here are a few lingering takeaways from me:

Employees fighting for clear lines about what their employers will or won’t do is critical in tech. There is a world of difference between company directives to ‘not talk about politics at work’ and those expecting staff to participate — knowingly or unknowingly — in the development and sale of violence-enabling technology, particularly to governments perpetrating war crimes. I was struck by the evolution of Kate’s position, and by how, after the employee victory with Maven, the company felt compelled to create a set of logics for when it does and does not take on contracts. The original case Google made for selling to Israel rested on a distinction that fell apart under any scrutiny: leadership distinguished between pursuing projects for ‘security’ purposes and for ‘military’ purposes. The former they would do; the latter they would not. At first blush this makes a little bit of sense. A company should have clear policies for what it will and won’t make. But the kicker is that it also has to stand by those red lines, and sharpen them over time. Tech companies don’t want clear lines. They want to sell more than their competitors, ideally without negative impact on their brand.
You realise through Kate’s story that Google’s leadership had no intention of actually developing that framework with the specificity that would constrain their decision-making. Smart people made up a smart-sounding rationale that would keep other smart colleagues off their backs. And when those colleagues pushed for clarification, or for a decision to be made based on that reasoning, they were shut down.

Tech companies can make tremendous profits selling to militaries around the world. There is a race to develop and monopolise the technology infrastructure of war. The speed with which US tech companies sprinted to build deep tech relationships with Ukraine was unsurprising but still kind of shocking. In a Time piece covering that ‘race’, the author made the perhaps obvious, but really important, point: “The [tech companies] willing to move fast and flout legal, ethical, or regulatory norms could make the biggest breakthroughs.” If the bad guys drop out, the worse guys show up. This means that when companies with a brand to protect refuse to participate in any aspect of ‘defense’, a vacuum can emerge within which abominable companies race to fill the gap. And the defense discourse is shifting. Tech companies no longer feel the need to do their defense deals in secret. The recent “AI Expo for National Competitiveness”, covered here, lays bare the positioning of radical purveyors of weapons of war who give zero fucks about human life.

Palestine is a frontline of military technology development. Palestinians are dying so that tech companies can sharpen up their products. Like all technologies of consequence, weapons of war are developed in contexts where developers and salesmen can experiment without accountability. I recently read, and highly recommend, The Palestine Laboratory.
It both contextualised the current crisis through the lens of Israeli support for global authoritarianism and laid out a thesis very similar to that of Automating Inequality, in the context of military tech. Whether the tools that target Gaza work doesn’t matter. Those behind them likely won’t be held accountable for the civilian deaths they enable through shoddy software or disregard for human rights. Why? Because many with power do not believe that Palestinians deserve human rights.

Most likely, none of this is legal. Is it legal for American companies to facilitate war crimes? Not in theory. US companies are not allowed to sell weapons to a regime that is violating human rights. Unfortunately the Leahy Law — which mandates that companies not export equipment to regimes that violate human rights — isn’t being applied to Israel. (Leahy himself thinks it should be.) And as far as I know it hasn’t been used to restrict the export and sale of technology products and services.

Those are just some thoughts I was left with. I hope you enjoy the conversation. As always, you can respond directly to these emails to share your thoughts — thank you!

We have a LinkedIn group — please join it! If you’re at all active on LinkedIn, we’ll be using the CSM group to post regular updates on what we’re doing: podcast episodes, events, and NPN opportunities. It’s often hard to track what an org like ours is doing, so we want to be as consistent as possible in at least one place.

Thanks!
Alix

If this was forwarded to you, sign up here.