Friday, September 12, 2025

An Interview with Meredith Whittaker (transcript)

Last week I interviewed Meredith Whittaker about privacy, AI ethics and, of course, Signal. You can watch the video on YouTube, but if you are one of those who still love reading interviews, here is the transcript. For the joy of reading Meredith's words.


Interview with Meredith Whittaker

You know that I've been following Signal since the beginning; I had the opportunity of being in many, many conferences with Moxie. We've been sharing wine around the world, in Buenos Aires, in Norway, etc. So I know the project, and the values behind it, very well, and that's why I created these ten questions. Feel free to answer them your own way, so if you don't like anything, change it, okay?

Figure 2: Interview with Meredith Whittaker

1.- So I'm going to start with two questions, about your background and about AI ethics, that to me are interesting, especially because you studied rhetoric and literacy and later became one of the world's leading voices in artificial intelligence ethics, which is fascinating. How was that process for Meredith Whittaker: working in tech companies and being so worried about civil rights?

Well, I think I was initially interested, and maybe a bit concerned, about AI not because of my literature and rhetoric background, but because I had done a lot of work building large-scale measurement systems. And so I was effectively at the heart of the processes for creating data and then making claims about that data as a reliable representation of dynamic reality. And I knew how difficult that was.

I knew that data was always imperfect. It was always partial. It was never the whole story. And so in the early twenty-tens, I saw first machine learning, and then what we all decided to call AI, become the next big thing following the AlexNet paper, following the recognition that GPUs could effectively supercharge the compute available for neural network training.

I became very concerned initially that the data sources that were being used and the claims that were being made about these models created with this data were not accurate, were not sufficient to reflect the things that were being claimed, were certainly not sufficient to create an intelligent machine, and that the types of stories we tell about intelligence, about computational sophistication, were serving to entice people to trust and lend credibility to ultimately what are large companies, large corporate actors who may or may not have as their core objective broad social benefit and who create these technologies and these tools behind shrouds of corporate secrecy.

They're, of course, not open source. They're not open to scrutiny. And we were walking into a situation where we were ultimately being asked to trust whatever the output of a black box was, as if it were intelligent, as if it were more objective and reliable than we are, without a way to scrutinize that. And I still think those dynamics are at play, and I still think that is a very risky position to be in as a world and as a society, given that our interests don't always overlap with the boardrooms of this handful of companies.

2.- So do you believe that meaningful change can come from within big tech companies, or are external and alternative models the only path forward?

You mean like meaningful positive change?

Right. Yeah.

You know, I have seen a lot of people within those companies do things that were positive. There are many people who hold the line on certain values, on certain practices. But fundamentally, I think, if the logic of an organization is grow at all costs, generate revenue at all costs, and positive social change will cost money or will hamper growth, you're not going to see positive social change come from those companies.

And I think ultimately when we're talking about even understanding what those changes might be that would have a broad positive social impact, you categorically need broad participation. You need to understand what people want and what people need and what the conditions of life are.

And that requires one or another form of democratic participation, of understanding what's happening on the ground and then what measures are needed to remediate it, whether those measures are technological or not.

3.- You are talking about growth, but Signal has grown to around seventy million users while maintaining a completely different business model from the other tech platforms. What makes Signal fundamentally different from other messaging apps, beyond just the technical encryption that everybody knows about?

Well, as you know, it was founded over a decade ago with a very clear vision by Moxie, who I know you know, and it stays true to that vision. It is uncompromising in its pursuit of providing a means to exercise the human right to private communication, even in a world where that right has been foreclosed and whittled away in almost all other realms.

And so we do one thing and we do it extremely well. And we are consistent with that one thing. We came along at a time when the market wasn't saturated. This was before the iPhone was widely distributed, before smartphones were ubiquitous. This was before WhatsApp. This was a very different landscape in the internet and the application ecosystem. Through consistent and, I would say, visionary work that has its source with Moxie and with Trevor and those founders, Signal has developed a reputation, developed a network effect.

It has been open source since the beginning. So you don't have to take my word for it. You don't have to take Moxie's word for it. You don't have to take anyone's word for it. You can look at our repos. You can see that, yes, the things that they say are the things that it does. And if we change that, you can call us out on it. And so there is a commitment to mission, a commitment to vision, and I think a purity to what we do that is... deeply appreciated across the ecosystem, but that over time has built the kind of trust that you can't just manufacture overnight.

It's a trust that is gained by consistent action through the type of commitment that Signal has shown to the mission of private communications. And that is in everything we do, that is the one thing we do that is our lane. And I think that is what makes us special.

We're not trying to please shareholders. We're not trying to add a little AI widget. We're not trying to boil the ocean and become the everything app. We do one thing, we do it very well, and we do it because we believe it is fundamentally important to the future of a livable world that this right remain, this right to communicate privately.

Well, and to do that, the non-profit structure that Signal maintains is not just nice to have, but actually essential, as you've said many, many times.

Exactly.

4.- Now, you said also that most people are not users of artificial intelligence, but rather subjected to its use. How does Signal's mission connect to this broader concern about technological power imbalances? 

Yeah, well, I think, you know, Signal is not an AI company. Again, Signal does one thing very, very well, and in some sense it's in tension with the dominant paradigm of AI right now, which is this big-data, bigger-is-better approach. It requires huge amounts of data, which we could think of as the product of surveillance, in order to train these models, and then requires huge amounts of data to inform the models for inference, whether it's your prompts, or your browser history to create a summary, or your emails to create a summary, or what have you.

These models need your data. And of course, what is Signal's whole deal? We want to collect as close to no data as possible. We don't want your data. We don't want to be able to give your data up. If I turned evil overnight, I still don't want to be able to give your data up.

We don't want to have access to it. And so there is a fundamental tension between, on the one hand, the clear commitment to privacy that Signal embodies, and my belief that a tech ecosystem is possible that is not based on the collection and sale of your data, on the processing of your data and its use to tell you who you are and where you fit in life, and, on the other hand, the current state of AI, which of course demands more and more data and produces more and more data.

5.- This business of privacy is not easy. Signal spends about fifty million dollars a year just to maintain its operation, which shows the structural challenge of building sustainable technology outside of the surveillance economy. How do you envision long-term sustainability for the organization? And what would need to change in the broader tech ecosystem to make privacy-first technology like Signal not just possible, but financially viable at scale?

Yeah. I mean, well, look, being a nonprofit is not fun. It's not particularly virtuous. It's a bit difficult when you're running a large-scale, real-time communications infrastructure that is trusted by hundreds of millions of people around the world. I think for us, being a nonprofit is necessary because, as we discussed, profit and revenue come from collecting and monetizing data, whether it be to sell ads or to train your AI model. And that's the engine of the tech economy that we exist in. And being a nonprofit doesn't mean it's any cheaper for Signal to develop and maintain this infrastructure.

It's very, very expensive. You know, fifty million dollars a year is a very low budget for something like Signal. In most for-profit companies, we would have a much bigger team and we would be spending a lot more; we keep it very narrow and very focused. But nonetheless, this is not a cheap volunteer project. These critical infrastructures that require real-time connectivity globally are very, very expensive. And so we are exploring many ways to finance this.

Currently, we exist on donations. About seventy percent of our revenue comes from large donors, helmed by Brian Acton, the co-founder of WhatsApp, who has provided a generous contribution. We also have Jack Dorsey and others who have provided significant funding. And then about thirty percent of our funding comes from smaller donors, people in the app who donate five or ten dollars a month. That's a significant number of people, and we're very, very grateful for that. We're looking at growing our small-donor pool, and we're launching a freemium feature very soon: secure backups, which will allow you to recover your Signal if, say, your phone gets thrown in a lake or some catastrophe like that happens. And for a small fee, if you want to save all of your media files, or a hundred gigs of your media files, forever, you will be able to pay one dollar and ninety-nine cents a month to do that.

So we're experimenting with a freemium model. We've looked at things like an endowment. We've looked at other vehicles, and ultimately we're exploring what is possible. But again, without undermining that core mission, because what we can't have is a shareholder or a board member in a for-profit structure pushing us to undermine privacy, because they understand that the way to make money in the current tech economy is to collect and monetize data, is to undermine that privacy. And so that's really the balance that we are striking.

Yeah, well, when I read fifty million dollars a year, and I've spent many, many years managing budgets for big tech companies creating technology, I said: oh, this is amazing, fifty million dollars for everything: creating the technology, operating it, innovating, etc. You're doing magic. And you mentioned Jack Dorsey; he knows very well how expensive creating technology at scale is. So it's amazing what you do.

And it's not easy, because we have a future ahead and we need to keep working on privacy. We are talking about quantum computing. Every day there are new threats to privacy, like RSA, which people are constantly trying to crack, and we have AI agents that can directly undermine device security, so you are facing another challenge, which is the endpoint. And we have regulatory changes in every country, shifting with whatever the political message is at the moment.

6.- What does success look like for Signal in the next decade? And what role do you hope it plays in shaping a different kind of digital future?

Yeah, well, I think success for Signal in the next decade looks like surviving and thriving. Our goal is that everyone in the world, everywhere, can pick up their device and easily use Signal to contact anyone else. We see privacy as a fundamental human right, it is a fundamental human right, and this should be integrated as common sense into our applications and services.

We're obviously not there yet, but Signal's success and the growth that we're seeing, as people become more personally sensitized to the stakes of privacy, to how important it is to be able to communicate privately, is giving me some hope there. But of course we will have to push back on misguided and often malicious legislation that aims, as it always has since the liberalization of encryption in the late nineties, to undermine the ability of common people to have privacy, based on the magical thinking that it's somehow possible to allow only governments and law enforcement access to private communications.

And of course, we know that if there's a backdoor anywhere, it is a backdoor for everyone, and you have undermined privacy for everyone. So we continue to push against that legislation and try to do advocacy and education around the importance of strong privacy for everyone.

We are also, as you mentioned, very concerned with the integration of so-called AI agents, these software systems that involve usually large language models that are being integrated at the operating system level across various operating systems.

They're intended to do things on your behalf without your permission, to act autonomously based on access to huge amounts of data, in ways that are not secure and that pose a real threat to our ability to continue providing robust privacy at the application layer. So we have raised the alarm there.

We're going to be working with people across the industry to propose common-sense safeguards and developer-level opt-outs. We think it needs to be easy and clear for apps like Signal to flip a switch and say: no, you cannot access Signal data for your AI agent.

This is off limits. This cannot become part of your context window. You are not allowed to touch Signal messages. This is too important, and all applications should be able to make that call. And we're pushing on those and some other remediations that could maintain the ability for Signal, and apps like Signal, to make their own choices and protect their users at the application level.

7.- Right now all operating systems are embedding AI technology at the software level, and if you don't accept the terms and conditions, many of the features of your smartphone or desktop machine are completely blocked, so for a user it's difficult to stop that process and prevent the AI software from getting access to their data. We need to make people understand what they are saying yes to. So it's not easy. And you were talking about bad actors. In the past, Signal has become crucial infrastructure during conflicts. I don't want to mention any of them, because each has been different. But how do you balance Signal's role as a tool for each of the parties in a conflict? What does it mean for Signal to operate in this contested space of a world in conflict?

In order to work for anyone, in order to provide privacy, Signal needs to be available widely. And I think this is really the truth of encryption. Either it works for everyone or it's broken for everyone. So, you know, Signal doesn't know who uses Signal. We go out of our way to have as little data as possible. We do hear from people that governments, human rights workers, journalists, militaries, boardrooms, basically anyone who has high stakes information uses Signal in some capacity, but we remain in our lane.

Our lane is providing a robust, freely available, private communications platform for everyone, across the board. We believe this is a fundamental human right, and everyone should have access to it. And that means the people you love, and that means the people you don't love. Ultimately, Signal is available to everyone.

And if it's not, then we are undermining our mission and potentially undermining the privacy guarantees on which that mission is based.

Yeah, but it must be difficult, because for sure you often receive pressure from different actors in many countries asking you to comply with their laws, etcetera. So it's not easy.

8.- You were talking about artificial intelligence being eager for data. One of the places most affected is the web, where all the big LLMs have been, and still are, constantly consuming data from websites, and the traffic that content publishers used to have has been completely destroyed since those models appeared. How do you see the future of the website, with the big artificial intelligence models becoming the answering machine for users, with all the content created in real time accessible without sharing the traffic with the content providers?

I think without a significant change in trajectory, it spells the death of the open web and it strikes a very grim note for the future of human produced content and creativity and writing, art, music in general. It's very concerning to me. It's not an area I've studied deeply. It's not an area that I am as familiar with as I am in other areas. But nonetheless, it is something that I am deeply concerned with. And it's something I wish we had better tools than copyright to remediate.

9.- And the last question, I don't want to take more of your time. If you had only one thing to say to people who are simply using other messaging platforms, what would you say to get them to migrate to Signal?

Well, first I want to say: the reason you use a messenger is because you want to talk to your friends. Communication is about human relationships. It's not about purity. It's not about choosing the right platform. So I understand you. I see you. It's fine. You're there because you want to be included in the group chat. You want to go to the party. So do I. That's human. That's the point of life.

However, move your close groups to Signal because you just don't want to go out that way, as they say. You don't want to have that text leaked in a document dump. You don't want a data breach to publish your group chats online. You don't want a regime change to suddenly make what you're doing today, which is totally normal and legal, illegal in three years and begin to look through databases to prosecute people.

I think that Signal, happily, works very, very well. It is lovely. It is classy. It is not full of crappy little features that some product manager forced us to integrate because they're trying to please a company OKR. It is honestly the sleekest and most well-functioning messaging app that I am familiar with.

So I'd say: start moving a couple of your groups to Signal. And then slowly more and more people will migrate, because it just feels nicer to know that whatever you're saying, whatever rant you are typing out to your best friend, whatever random voice note you are sharing with your girls' chat, is never going to show up in discovery.

It is never going to show up in a data breach and it will never be misconstrued by some enemy wanting to paint you in a bad light after they got access to your most intimate data. 

Well, I have really loved listening to you, and I would like to say thank you. It's been an honor, and I feel completely happy after this talk. Thank you very much, Meredith.

Thank you so much.

And thank you for keeping up the good fight for so many years. 
It's been such a delight to talk to you. 

¡Saludos Malignos!

Author: Chema Alonso

