Protesting Project Nimbus


Hello!

You might have followed the sit-in of Google employees protesting Project Nimbus, a cloud infrastructure contract with the Israeli government. I sat down with one of the core organisers, Dr Kate Sim, to talk more about it. Listen to the conversation here!

We discuss how the sit-in came about, how she distinguishes between moral and immoral B2B relationships with countries, and the role that employees should play in shaping the mission and focus of businesses. I’m still left with questions about how technology for defense can be done well, and how we can avoid a situation where the most abominable companies build monopolies at the intersection of technology and war because the more reputable companies won’t touch this work. Here are a few lingering takeaways from me:

Employees fighting for clear lines on what their employers will or won’t do is critical in tech.

There is a world of difference between company directives to ‘not talk about politics at work’, and those expecting staff to participate — knowingly or unknowingly — in the development and sale of violence-enabling technology, particularly to governments perpetrating war crimes. I was struck by the evolution of Kate’s position, and by how, after the employee victory over Project Maven, the company felt compelled to create a set of logics for when it does and does not take on contracts. The original case Google made to sell to Israel rested on a distinction that fell apart under any scrutiny: leadership distinguished between projects for ‘security’ purposes and projects for ‘military’ purposes. The former they would do; the latter they would not.

At first blush it makes a little bit of sense. A company should have clear policies for what it will and won’t make. But the kicker is they also have to stand by those red lines. And sharpen those lines over time.

Tech companies don’t want clear lines. They want to sell more than their competitors, ideally without negative impact on their brand.

You realise through Kate’s story that Google’s leadership had no intention of actually developing that framework with the specificity that would constrain their decision-making. Smart people made up a smart-sounding rationale that would keep other smart colleagues off their backs. And when those colleagues pushed for clarification, or for a decision to be made based on that reasoning, they were shut down.

Tech companies can make tremendous profits selling to militaries around the world.

There is a race to develop and monopolise the technology infrastructure of war. The speed with which US tech companies sprinted to build deep tech relationships with Ukraine was unsurprising but still kind of shocking. In a Time piece covering that ‘race’, the author made the perhaps obvious, but really important, point: “The [tech companies] willing to move fast and flout legal, ethical, or regulatory norms could make the biggest breakthroughs.”

If the bad guys drop out, the worse guys show up.

This means that when companies with a brand to protect refuse to participate in any aspect of ‘defense’, a vacuum emerges that abominable companies race to fill. And the defense discourse is shifting: tech companies no longer feel the need to do their defense deals in secret. The recent “AI Expo for National Competitiveness”, covered here, lays bare the positioning of radical purveyors of weapons of war who give zero fucks about human life.

Palestine is a frontline of military technology development. Palestinians are dying so that tech companies can sharpen up their products.

Like all technologies of consequence, weapons of war are developed in contexts where developers and salesmen can experiment without accountability. I recently read, and highly recommend, The Palestine Laboratory. It both contextualised the current crisis through the lens of Israeli support for global authoritarianism and laid out a thesis very similar to that of Automating Inequality, in the context of military tech. Whether the tools that target Gaza actually work doesn’t matter. The companies behind them likely won’t be held accountable for the civilian deaths they enable through shoddy software or disregard for human rights. Why? Because many with power do not believe that Palestinians deserve human rights.

Most likely, none of this is legal.

Is it legal for American companies to facilitate war crimes? Not in theory. US companies are not allowed to sell weapons to a regime that is violating human rights. Unfortunately the Leahy Law — which bars US security assistance to foreign forces that commit gross human rights violations — isn’t being applied to Israel. (Leahy himself thinks it should be.) And as far as I know, it hasn’t been used to restrict the export and sale of technology products and services.

Those are just some thoughts I was left with. I hope you enjoy the conversation. As always, you can respond directly to these emails to share your thoughts — thank you!


We have a LinkedIn group — please join it

If you’re at all active on LinkedIn, we’ll be using the CSM group to post regular updates on what we’re doing: podcast episodes, events, and NPN opportunities. It’s often hard to keep track of what an org like ours is doing, so we want to be as consistent as possible in at least one place.

Thanks!

Alix


If this was forwarded to you, sign up here.

Computer Says Maybe

A newsletter & podcast about AI and politics
