🎙️ What is an expert witness… in tech?


Hello friends of CSM!

This year we’ve had a bottomless brunch of big tech trials, which somehow feels like progress but also sort of like… we’re slowly getting nowhere?

We wanted to understand better what it means to take big tech to court: in what ways are they ducking out of being accountable for their harms? What kinds of expert witnesses are litigators calling on to build a case? And what makes an expert witness anyway?

Yep, it’s a lot. A few weeks ago, we wrapped up a podcast miniseries called Exhibit X that sought to answer these questions. It was a heavy lift for us, because we’re new at this, but also because there’s so much to unravel here.

🎙️ You can find all Exhibit X episodes in our feed and also wherever you normally get your podcasts.

You can definitely listen to any one of these episodes in isolation and learn something new — but we’ve tried to conceptually weave them all together for you in this email 👇

In the first episode on Tech & Tobacco, Alix and Prathm looked back to 1964, when a piece of research came out that finally showed cigarettes to be harmful and addictive. Then, big tobacco companies managed to evade accountability for decades — but how? Listen to the episode to learn about how corporations have historically cosplayed as research institutions, and how we’re seeing that happen again with big tech now.

Part of what broke that cycle of obscuring and misrepresenting research was individuals bravely speaking up. So, in our Whistleblowers episode Alix interviewed Frances Haugen, who blew the whistle on Meta back in 2021. Meta was sitting on the knowledge that its products were being used by underage kids (the minimum age for Meta products is 13) and that child trafficking was happening on Instagram. One of the most harrowing things here is that even Meta’s backend ad tooling ‘knew’ these kids were being exploited: one investigation revealed that they would receive targeted ads from human rights lawyers who might be able to support them.

This raises the question: what should a company like Meta be doing here to intervene? Are we talking about breaking e2ee in the name of child protection? Slapping social media sites with age verification? Frances had some other ideas, which she covers in the episode.

In a way, platforms don’t really have to do anything about this kind of speech; in The Litigators, Meetali Jain outlined how social media companies have weaponised Section 230 and the First Amendment into a kind of “double insulation of accountability”. Listen to the episode if you want to know how this works — we certainly found it interesting to learn that even product design decisions can be protected as free speech, and litigators have been playing 4D chess to change that.

In fact, until recently, cases brought against big tech firms had never made it to the discovery phase — meaning that when litigators get there, they have no idea what to ask for. This reminds us that we’re still very much in a nascent space here, and brings us to the episode on The Courts: here Alix interviewed Alexa Koenig, who worked closely with the International Criminal Court to figure out how media pulled from social platforms can be legitimately put forward as evidence in a court of law. When Alexa began setting up meetings with judges and tech execs back in 2011, it turned out that most tech execs had never even heard of the ICC, and had no idea what its purpose was.

There is of course still a lot of work to be done to connect these disparate groups. In The Community, Elizabeth Eagen discusses how a sociotechnical expert witness might finally emerge from the community of litigators, researchers, and regulators currently working together at the intersection of tech and policy. Listen to this episode to get Elizabeth’s insights on pacing: in tech, lawmakers and researchers are constantly playing catch-up with the shifting whims of profit-making companies.

🎙️ We also made a wrap-up episode where Alix and Prathm discuss what they learned through the series — enjoy!

Thank you as usual for consuming our content 🤹

Alix & CSM podcast team (Sarah, Prathm, & Georgia)


If this was forwarded to you, sign up here.

Computer Says Maybe

A newsletter & podcast about AI and politics
