Hello friends of CSM! This year we’ve had a bottomless brunch of big tech trials, which somehow feels like progress but also sort of like… we’re slowly getting nowhere? We wanted to understand better what it means to take big tech to court: in what ways are they ducking out of being accountable for their harms? What kinds of expert witnesses are litigators calling on to build a case? And what makes an expert witness anyway? Yep, it’s a lot.

A few weeks ago, we wrapped up a podcast miniseries called Exhibit X that sought to answer these questions. It was a heavy lift for us, because we’re new at this, but also because there’s so much to unravel here.

🎙️ You can find all Exhibit X episodes in our feed and also wherever you normally get your podcasts. You can definitely listen to any one of these episodes in isolation and learn something new — but we’ve tried to conceptually weave them all together for you in this email 👇

In the first episode, on Tech & Tobacco, Alix and Prathm looked back to 1964, when a piece of research came out that finally showed cigarettes to be harmful and addictive. Big tobacco companies then managed to evade accountability for decades — but how? Listen to the episode to learn how corporations have historically cosplayed as research institutions, and how we’re seeing that happen again with big tech now.

Part of what broke that cycle of obscuring and misrepresenting research was individuals bravely speaking up. So, in our Whistleblowers episode, Alix interviewed Frances Haugen, who blew the whistle on Meta back in 2021. Meta was sitting on the knowledge that its products were being used by underage kids (the age limit for Meta products is 13) and that child trafficking was happening on Instagram.
One of the most harrowing things here is that even Meta’s backend ad tooling ‘knew’ these kids were being exploited: one investigation revealed that they would receive targeted ads from human rights lawyers who might be able to support them. This raises the question: what should a company like Meta be doing here to intervene? Are we talking about breaking end-to-end encryption in the name of child protection? Slapping social media sites with age verification? Frances had some other ideas, which she covers in the episode.

In a way, platforms don’t really have to do anything about this kind of speech. In The Litigators, Meetali Jain outlined how social media companies have weaponised Section 230 and the First Amendment into a kind of “double insulation of accountability”. Listen to the episode if you want to know how this works — we certainly found it interesting to learn that even product design decisions can be protected as free speech, and litigators have been playing 4D chess to make this not be the case. In fact, until recently, cases brought against big tech firms had never made it to the discovery phase — meaning that when litigators get there, they have no idea what to ask for.

This reminds us that we’re still very much in a nascent space here, and brings us to the episode on The Courts: here Alix interviewed Alexa Koenig, who worked closely with the International Criminal Court to figure out how media pulled from social platforms can legitimately be put forward as evidence in a court of law. When Alexa began setting up meetings with judges and tech execs back in 2011, it turned out that most tech execs had never even heard of the ICC, and had no idea what its purpose was. There is, of course, still a lot of work to be done to connect these disparate groups.
In The Community, Elizabeth Eagen discusses how a sociotechnical expert witness might finally emerge from the community of litigators, researchers, and regulators currently working together at the intersection of tech and policy. Listen to this episode to get Elizabeth’s insights on pacing: in tech, lawmakers and researchers are constantly playing catch-up with the shifting whims of profit-making companies.

🎙️ We also made a wrap-up episode where Alix and Prathm discuss what they learned through the series — enjoy!

Thank you as usual for consuming our content 🤹

Alix & CSM podcast team (Sarah, Prathm, & Georgia)

If this was forwarded to you, sign up here.
A newsletter & podcast about AI and politics