šŸŽ™ļøĀ What is an expert witnessā€¦ in tech?


Hello friends of CSM!

This year weā€™ve had a bottomless brunch of big tech trials, which somehow feels like progress but also sort of likeā€¦ weā€™re slowly getting nowhere?

We wanted to understand better what it means to take big tech to court: in what ways are they ducking out of being accountable for their harms? What kinds of expert witnesses are litigators calling on to build a case? And what makes an expert witness anyway?

Yep, itā€™s a lot. A few weeks ago, we wrapped up a podcast miniseries called Exhibit X that sought to answer these questions. It was a heavy lift for us, because weā€™re new at this, but also because thereā€™s so much to unravel here.

šŸŽ™ļø You can find all Exhibit X episodes in our feed and also where ever you normally get your podcasts.

You can definitely listen to any one of these episodes in isolation and learn something new ā€” but weā€™ve tried to conceptually weave them all together for you in this email šŸ‘‡

In the first episode on Tech & Tobacco, Alix and Prathm looked back to 1964, when a piece of research came out that finally showed cigarettes to be harmful and addictive. Then, big tobacco companies managed to evade accountability for decades ā€” but how? Listen to the episode to learn about how corporations have historically cosplayed as research institutions, and how weā€™re seeing that happen again with big tech now.

Part of what broke that cycle of obscuring and misrepresenting research was individuals bravely speaking up. So, in our Whistleblowers episode Alix interviewed Frances Haugen, who blew the whistle on Meta back in 2021. Meta was sitting on the knowledge that their products were being used by underage kids (the minimum age for Meta products is 13) and that there was child trafficking happening on Instagram. One of the most harrowing things here is that even Meta’s backend ad tooling ā€˜knew’ these kids were being exploited: one investigation revealed that they would receive targeted ads from human rights lawyers who might be able to support them.

This raises the question: what should a company like Meta be doing here to intervene? Are we talking about breaking e2ee in the name of child protection? Slapping social media sites with age verification? Frances had some other ideas, which she covers in the episode.

In a way, platforms don’t really have to do anything about this kind of speech; in The Litigators, Meetali Jain outlined how social media companies have weaponised Section 230 and the First Amendment into a kind of ā€œdouble insulation of accountabilityā€. Listen to the episode if you want to know how this works — we certainly found it interesting to learn that even product design decisions can be protected as free speech, and litigators have been playing 4D chess to change that.

In fact, until recently, cases brought against big tech firms had never made it to the discovery phase — meaning that when litigators finally get there, they have no idea what to ask for. This reminds us that we’re still very much in a nascent space here, and brings us to the episode on The Courts: here Alix interviewed Alexa Koenig, who worked closely with the International Criminal Court to figure out how media pulled from social platforms can be legitimately put forward as evidence in a court of law. When Alexa began setting up meetings with judges and tech execs back in 2011, it turned out that most tech execs had never even heard of the ICC, and had no idea what its purpose was.

There is of course still a lot of work to be done to connect these disparate groups. In The Community, Elizabeth Eagen discusses how a sociotechnical expert witness might finally emerge from the community of litigators, researchers, and regulators currently working together at the intersection of tech and policy. Listen to this episode to get Elizabeth’s insights on pacing: in tech, lawmakers and researchers are constantly playing catch-up with the shifting whims of profit-making companies.

šŸŽ™ļø We also made a wrap-up episode where Alix and Prathm discuss what they learned through the series ā€” enjoy!

Thank you as usual for consuming our content šŸ¤¹

Alix & CSM podcast team (Sarah, Prathm, & Georgia)


If this was forwarded to you, sign up here.

Computer Says Maybe

A newsletter & podcast about AI and politics
