O’Connor thinks we’ll get a federal privacy bill this year. But she’s more concerned about the future of free speech on the internet.
On the latest episode of Recode Decode, Recode’s Kara Swisher spoke with Nuala O’Connor — the president and CEO of the Center for Democracy and Technology — about how the group is currently lobbying the government and tech companies. O’Connor said there’s a “holy war” going on among the tech companies that have unprecedented power in societies around the world.
“Maybe it’s an inappropriate phrase,” she said. “I think you are going to see a race, hopefully to the top, on this is how we treat our customer, and data is part of the equation. It’s old-time customer trust. It’s old-time respect for the customer.”
“Knowing your customer is fine, but are you using the information you have gleaned about them in a way that, yes, furthers your corporate interest, but furthers their needs, and not in a way that is adverse?” O’Connor asked. “This tech sector is not in a good place right now. It’s in an incredibly deregulated place, literally and figuratively. I am, short-term, a little pessimistic, but long-term optimistic that there’ll be enough public pressure … and ultimately government action to constrain some of the worst uses.”
Below, we’ve shared a lightly edited full transcript of Kara’s conversation with Nuala.
Kara Swisher: Hi, I’m Kara Swisher, editor-at-large of Recode. You may know me as a big believer in privacy, except for the people covering up the scandals in Washington, but in my spare time I talk tech, and you’re listening to Recode Decode from the Vox Media Podcast Network.
Today in the red chair is Nuala O’Connor, the president and CEO of the Center for Democracy and Technology. Previously, she worked at companies like Amazon and General Electric and was the chief privacy officer for the US Department of Homeland Security. Nuala, welcome to Recode Decode.
Nuala O’Connor: So delighted to be here.
There’s so much to talk about. We just started. We were just talking, before this taping started, about what your themes are and I think you said a war is going on essentially, right?
I describe it as a “holy war,” which is really saying something given that I was born in Belfast, Northern Ireland and so I know of what I speak.
Yes, okay. All right, we’re going to get to that. But first I want you to explain how you got to where you are — you worked at these companies — and what the Center for Democracy and Technology does.
Thank you so much.
It’s here in DC, right?
That’s right. We are here in DC and in Brussels and around the world. We are a nonprofit 501(c)(3) charitable organization focused on the rights of the individual in the internet age, in the digital world — so human rights, privacy, freedom of expression, freedom of association, security, anti-government surveillance. We are centered on the human. Right now, the rights of the human, I hope, are still predominant in our structure and obviously the values of a democratic society around the world, which are in peril, I like to say. Democracy is in peril at home here in the United States and elsewhere in the world. So how do we …
And how did it start? What was the, who’s funding it? Where’s the money coming from?
The original — and it’s completely transparent, all on our website — the original hardy band of internet advocates and internet human rights advocates came together around the dawn of the commercial internet in 1994, so we’re celebrating our 25th anniversary. They are people like Jerry Berman; Deirdre Mulligan, who’s now at Berkeley; Danny Weitzner, who’s now at MIT; Jonah Seiger, here in DC; Janlori Goldman, who’s now in New York. It grew over time, both in DC and in Brussels, funded in part by foundations, by companies, by individuals, by fundraising events. A diversified portfolio. There is no one major funder.
There’s no Eric Schmidt wandering around, trying to influence things.
No, I will be happy to take his money. I think I say I’ll take anyone’s money if …
Eric Schmidt wanders around trying to influence things with his money.
Really? Well, a little bit, that’s true, sure.
Remember that little controversy?
A very diversified portfolio.
But no one funder determines what we stand for. We stand for the human.
And what we think is good for them.
Right, okay, all right. And you worked … Give me your background a little bit.
Yeah, I am a lawyer by trade. I’m a recovering lawyer, as people like to say. I’ve worked in government, in the private sector, in big companies and small companies. Prior to CDT, I was at Amazon as a data governance person, and at General Electric for a long, long time, the once-great General Electric Company.
Uh-huh, the once great. “It will be great again.”
I, you know, maybe.
No it won’t.
Exactly, but great training as a manager. The federal government, Homeland Security in the startup days doing data privacy and the private practice of law at a little company way back at the beginning of the internet called DoubleClick, which is an online advertising company now owned by Google.
Owned by Google.
Which is the most sneaky of the companies. It’s the most …
DoubleClick or Google, which one?
Well, Google bought a company that really did spy on people quite beautifully. It’s really the beginning. That was the company …
Those were the earliest days. It was 1999. They’d gotten into some trouble.
Yeah, and they bought it because they didn’t have a business. Google did not have a business and it started to really hop up on advertising, which required a lot of data.
Which is the lifeblood of the internet…
Lifeblood, it’s the fuel as Roger McNamee said.
… for free services, right? These were the old, old days when people were afraid of cookies. I think what that company was doing back then looks positively byzantine compared to what we do now.
But you know, Nuala, free ain’t free.
Yes, mm-hmm. Exactly.
That’s my new t-shirt, free ain’t free.
I like that. That’s good. I like that.
What did you do for the Department of Homeland Security?
I was the chief privacy officer.
What does that mean? This was when it started?
When it started. It was the first statutorily required chief privacy officer in any federal agency, which meant I was responsible for looking at the data uses of all the divisions.
Right, because they have all sorts of previous spying.
They have all sorts of data and the question is, “Do they really need that and how are they using it? Are they being honest and transparent with the citizens and visitors to this country?” They have data not only about US citizens…
Not any more! They’re talking about drones flying over the borders now.
Exactly, right, and emerging technologies. At the time I was there, this was, what, right after 2001. Their big issues were airport security and screening people at the airports and body scanners and those sorts of things. I had a great champion in my boss Tom Ridge, who said, “We want to keep the values of this country as well as the people safe.” The values are freedom of movement and freedom of association and freedom from ubiquitous surveillance. I had a great champion. It’s great to have a good boss. We had a lot of freedom to run in that department because it was so new. It was a first great impression.
How do you think that’s evolved since then? Have they continued to have privacy officers?
Every agency that I am aware of now has federal privacy officers. That office that I helped start is still alive and kicking and doing well. I think these are really different times for federal government employees.
Difficult times. We’re going to talk about that in a minute. Tell me what you all do at the … What do you do? You advocate for humans. What does that mean?
Right. We do a lot of long-form research and writing, funded by foundation grants. One of our biggest projects right now, which I’m really excited about, is studying and advocating for better cybersecurity in voting technology. We’ve got this project across the country, funded by the Democracy Fund, a foundation, to help Secretaries of State and state election officials get better, to meet not only the minimum but a baseline level of data hygiene and security hygiene, so that our country can trust that their vote is counted and that it is counted accurately. I like to say, in the last two years, we are going back to the fundamentals of democracy and we’re taking the democracy for a test ride in the United States generally.
When you do the research paper and stuff, what is your goal?
The goal is to help secure voting technology. In this case, it’s to help secure the democracy and help people have confidence that their vote is counted.
Right. This is a big issue and Ron Wyden and I have talked about that.
We literally go door to door, literally to the states and to the Secretaries of State. We have seminars and we have webinars. We do some teaching. We do that sort of thing. We’re doing similar work in education technology and education data, helping state-level officials improve their education data systems, make sure they’re secure, making sure they’re not buying vaporware from external parties.
That’s a big issue, is the transfer of data from the private sector to the public sector and the sale of goods and services that are data-collecting and the use of commercial technologies in the classroom. I’m a parent. I’m very concerned about what my kids are seeing and what they’re using.
Sure. Absolutely. Your goal, when you do things, is to do what? Is it to convince legislators?
To improve policy in both the private sector and in the government. It started really … You’re asking where it started, 1994, direct to government advocacy, Capitol Hill lobbying saying, “Don’t pass this law.” “Do pass this law.” “Improve privacy and security for end users whether it’s government data collection or private sector.”
It’s what I call our direct-to-company advocacy, which is going to companies and saying, “Don’t do that.” Or, “Do that differently,” or, “If you do that, people are going to be really upset about it.” I actually think that’s in some ways much more effective, given how slowly things are moving on Capitol Hill right now.
Mm-hmm, very much so. We’re going to talk about that in a moment. Yeah.
Would you be akin to a GLAAD? I’m trying to think of a comparison. GLAAD goes around, sort of shames companies into …
A little bit. We both cajole. We sweet-talk. We criticize, when necessary. Our partners in advocacy are everyone from the ACLU to any number of privacy groups to … I’m thinking in other areas. The Environmental Defense Fund. They both work with and against companies when necessary to say, “These are the best practices in what you do.” I’m just reading an article I’m going to share with you in a minute on the analogy between environmental compliance and privacy and data compliance.
Oh, totally. I’m excited to call tech companies chemical companies. Oh, is that what you …
Have you got Judy Estrin’s work on digital …
No, she sent it to me.
Pollution. Yes, exactly. I’m a big fan. This is a thing that’s been going around for a long, long time. One of the oldest privacy lawyers in the field (I don’t mean that age-wise, I mean in terms of experience), Lisa Sotto in New York, started her career as an environmental lawyer. She’s the first person to say to me, “You know, the bell curve of criminal act or bad act and clean-up and self-regulation and regulation, it’s exactly the same bell curve we saw in the environmental world. Now we’re seeing it in data.”
Absolutely. How did you get into this? What was your … Why did you come to this place, because you did privacy for the thing.
Came to the CDT or in some data and privacy generally?
Yeah, no, come here. Yeah, why was this your …
It was the summer of Snowden.
Summer of Snowden.
Summer of Snowden.
Aaah, the summer of Snowden. No two summers are the same.
You know, exactly. I think there are actually analogies between the Snowden revelations and the Cambridge Analytica revelations. I want to talk about that in a minute, because the “aha” moment for people was, “Oh, the data’s moving in a place and in a way and to an extent that I did not know.”
It’s summer of Snowden. I’m at a big company and all of these revelations come out. It implicates not only commercial data projects and processes but national security. I’m thinking, “I’ve done both of those. I’ve been a privacy officer in a private company. I’ve been a privacy officer in the government. I believe we can do this better. I believe that we can keep the company safe and be judicious and restrained in our use and our collection of data about human beings, but you have to make hard and deliberative and transparent choices. This is not what is happening here.” And this job came to me…
So tell me. So the summer of Snowden. What was the revelation?
The summer of Snowden was when we learned that all these companies had been both surreptitiously and permanently transferring data to the government.
What did that do? What was your reaction?
My reaction was …
You were surprised by this?
Yes, and I work in this area.
Not me. Not me. I was like, “Of course they’re spying on us.”
I was surprised by the ongoing ubiquitousness of it.
I certainly know, having been on both sides of that. As I joke, I’m guilty of all of these things. I’ve been on both sides of that conversation saying, “Give us the data,” or, “Don’t give us the data,” or, “We’re not going to give you the data.” The kind of blasé, “Oh, it just happens and it’s all okay.” No, that is not what people’s …
I think, yes, it’s true, Americans’ expectations have changed, sharpened, and improved on this issue. And also, I think we are right to demand a level of transparency and accountability both from companies and from governments. This job was available and I thought, “You know what, this is an opportunity to be involved and really to serve the public again.” It’s not unlike being in the government and feeling like you’re in public service and trying to do the right thing.
Right, well, there are a lot of these companies. Besides the companies washing around, there are a lot of these groups washing around, each trying to stake out a place in these areas, and each of them approaches it from a different angle, whether it’s privacy or the ACLU or the EFF or things like that.
Let’s talk a little bit about where it’s going, and in the next session we’re talking about what the big issues you see are and how it’s going to resolve itself. What do you think the goals right now are for these kinds of thoughts? Is it privacy? Is it less data collection? Is it transparency of data collection?
You know, we’ve talked all about privacy and we haven’t even talked about the other seven divisions at CDT, including free expression and anti-surveillance and the open internet and net neutrality and all that kind of stuff.
Well, it’s all related.
You’re right, it is all related. While I think privacy and data are part of the levers I’m the most familiar with from my own career and certainly are front and center in the legislative dialogue right now on Capitol Hill, I think it is a combination of … It’s really power. It’s really about power and voice for the individual and do I understand my world? What am I seeing? And the power of the algorithm and the power of the feed of information that I receive.
I think the calls for antitrust and kind of the break-up of institutions may not be the best legal tool in the arsenal right now. If a goal is to rebalance the power between individual and institution and the oil, the currency, the flow is data, I … Again, it’s maybe my human-centric and also my privacy-centric view of the world, I think giving more control and agency to the individual in their own data and constraining the uses, particularly secondary uses and onward transfer and all that good stuff, are a more relevant way.
The point I made in Congressional testimony is, listen, Cambridge Analytica was a tiny little company. It had a heck of a lot of data and made a big impact. An antitrust analysis would not apply to a 14-person company outside of the United States, and yet the harm was still real.
Sketch out, before we get into the next session, what you think the biggest ongoing issues are right now. What are the ones that are really coming front and center? There’s so much going on. I’ve written about the Internet Bill of Rights, and there’s no net neutrality, obviously. From your perspective, what do you think is front and center? Obviously Facebook has sucked up the oxygen around misuse of the platform, what happens to data, and also addiction, which sort of flies in from the left, and general misuse of the platform or the inability to control it.
I think the biggest issue confronting the internet ecosystem right now is speech. Speech and responsibility.
I know that’s saying something, because we just spent 10 minutes talking about data and privacy, but I actually think privacy is a known-known at this point. I think people get it. I think you can be skeptical if you want to be about the many companies in many different areas of the economy who are now calling for privacy regulation, but I think it’s at the peak of the arc and it is likely to happen. I think that’s great, and I welcome all the new voices and players at the table. I think that privacy is a debate that does not only impact internet companies; it’s any company that has information about customers or clients and partners, and the rules around data should be the same on- or offline. It should be a trust equation for your commercial thing.
I think our national conversation around speech is about how much responsibility the intermediaries should bear, whether they be online platforms, newspapers, television, radio, whatever. What is the corporate social responsibility in the digital age for the kinds of ideas and speech and information you are purveying?
I think that is actually the harder question because it is … And you wrote beautifully about it in one of your op-eds recently. The intersection of our old-line First Amendment free expression values in this country, which CDT has held dear and held firm to for 25 years, and the new reality that the speed, velocity, volume, scope, scale whatever you want to call it …
Amplification is the word I like, too.
Thank you, amplification.
Weaponization is a word I sometimes use.
Well it is. Weaponization of speech and ideas.
I call it the weaponization of the First Amendment. That’s how I put it.
It’s really true. I was going to say … Don’t get me started. Do get me started on the privilege involved …
Yeah, I’m gonna get you started.
… in saying speech, free speech for all. Well, your speech, whether it’s loud or fast or “better” (white male, Ivy League, Northeast United States, which is fundamental to the algorithm of some of the platforms), drowns out the speech of women or immigrants or people of color or non-native speakers or whatever. I think there is an inherent privilege in the very architecture of some of the online experiences we have that is unseen by its creators.
Oh, I see it. They see it. They don’t want to see it. All right.
We’re here with Nuala O’Connor. She’s the president and CEO of the Center for Democracy and Technology, which is having a problem: the very name that you have, they’re fighting with each other, democracy and technology.
Do you think that’s the case? Your name, you’re going to have to change your name.
It doesn’t have to be.
How about the Center for Democracy Hates Technology?
Listen, we are right in the crosshairs.
The Center for Technology Kills Democracy.
I have never gotten so lucky with a job, right? I’m right in the middle of everything all the time. Why aren’t people giving us lots of money to fight this good fight? No, the fight is making …
What’s the fight?
Making technology work for democracy, to be in service of democracy, that we all … I think the good question we’re asking ourselves in the last several years, “Are you for it or against it? Are you for democracy?” I am for democracy. I am willing to stand on principle that it is the best of all the other organizing options, and technology can serve those interests but it has to be intentional and thoughtful.
That’s what we were talking about a minute ago that is the inherent architecture of the platform, the construct, the entity that you’re engaging with, is it in favor of, is it supporting the values of democracy? And are we just sort of bandying about the flag of the First Amendment and saying, “Well, it’s all okay because everybody can talk.”
Right, right. Let’s go through some of the things. When you say it’s a holy war, explain that idea. You’re using a very loaded term.
Loaded term, you’re absolutely right. And I should probably be more careful about that. What I mean is the recent discussion about enterprise app access, and Apple turning off Google’s and Facebook’s enterprise certificates, was perhaps, from the outside view, not a big deal, but it was kind of a big slap in the face, right? There was a lot behind that, I think, and a lot more intentionality.
You’ve written about it, and other people have written about it: As the larger tech companies start to encroach on each other’s space, they’re no longer truly separate and segregated, and we’re seeing some real conflict and friction. I also think we’re seeing, potentially, real alternative visions of how a company uses your data and your information.
And so, I guess that’s the war that I see, or the conflict I see is the narrative about how much of my data is mine, and I don’t really like the US construct of ownership. I don’t think of it as a property right, I think of it as a human right. I am probably more European in that sense, or at least a sense of …
The construct I use is the digital self, that my data is an extension of me. You are hearing my voice right now, the person that’s listening to this podcast is hearing my voice. You now have some interest in it. You can turn around and say, “Well, she said that,” or, “Wasn’t that dumb?” or whatever, but it’s still my voice. I still retain some ongoing interest, from a property law construct, that would be like an ongoing interest, a covenant that runs with the land. I think of it more as like it’s sort of a body part, right? I think it’s attached to me and I have ongoing rights in it and interest in keeping it safe and it being used appropriately and effectively in my interest.
I’m not sure if body part’s a good idea, but let’s …
Right, yeah, yeah, yeah, that’s maybe too much.
… body parts.
Yeah, especially as a female.
Think about it, think about it, think about it.
But you’re familiar with the Jonathan Zittrain and Jack Balkin construct of the information fiduciary. I mean, I think that’s a really intriguing … I think … And it’s a clunky term.
Explain that. Explain that for …
The idea is that a company or an institution that is holding your data on your behalf to provide you a service, whether it’s email or a phone or whatever the service is, not unlike the way a bank holds your money in fiduciary trust for you, has an ongoing duty of care to you.
Meaning they have to watch your money.
Right, to use it in the way … Now, they also get some … Let’s be also really honest, they also get benefits, right?
And they make their interest. Right.
They make the interest and they can invest that money and they can use it to make their own profits. That’s why banks get profits, right? But they also have a fundamental …
That’s a very good comparison, banks.
It is, right? Because they’re a lot … And that’s, I think, also where we get a little tripped up. I don’t think most ordinary internet users are naïve or unknowing about the implications of internet advertising. I think what they don’t want is to be surprised, to be duped, to have their data misused or used against them. There’s a difference between saying, I’m willing to put up with some ads, even some annoying ads — like the ad for the red sweater after I searched for a red sweater, that follows me across the internet for three weeks — because I’m getting this service for free.
I get that deal. I think we are more savvy consumers of internet services than we even were 10 years ago, or 20 years ago.
What I think people don’t like is to be surprised, or to have their information used in an adverse way. So the bank, again, can use your money, they can make profits, they can make investments, they can get some benefit from it. But at the end of the day, you come a-calling, and you have to … that $10 has to still be there.
Right, exactly. Right, and not have been taken elsewhere.
So, when you talk about this relationship, where do you think we are? And I mean that in kind of a broader term, but what do you think the key narratives are of this year?
Now, last year it was all about Cambridge Analytica and Facebook abusing your data, abusing the platform, the Russians, things like that. Where do we go from … It’s been a really disastrous year in terms of technology and democracy.
It’s been an unbelievably …
Without the ability to understand the real damage. I don’t think we’ll ever understand, right? I don’t think we’ll ever know, except that it’s damaged.
I think we’ll know, but we may not be alive to know, I mean, we — the greater we — history will judge us, and I feel very confident, again, CDT is on the right side of history on these questions.
I think the … Yes, I think the narrative around data still persists, and I think you are seeing a lot of people flocking to the table for some certainty, especially given California, GDPR in Europe, and Cambridge Analytica, I think these are the three pillars …
So, go through those.
… three pillars of this swarm, of this storm, the perfect storm on privacy. California passes a law, it’s tough, it’s got meaningful …
Tough-ish. It’s got … tough to comply with, maybe, is really the issue.
Needs teeth. Yeah.
Europe passes a revised version of their data protection directive, the General Data Protection Regulation.
Tough. Real tough.
Tough, and with real fines, exactly, tough in that way. Tough also to comply with, because it’s maybe not clear in certain areas. And then the Cambridge Analytica disclosures of the early part of last year, which I think, again, were the “aha” moment for the internet user, just as Snowden revealed the flow of data from the private sector to the government in ways unknown to most ordinary consumers.
I think the Cambridge Analytica and Facebook disclosures in that era were, “My data’s being taken.” Not my sensitive health, or finance, or kids, or sex, or whatever category is sensitive to you, but my trivial data, you know, what color is my house or what is my favorite dog breed, is being taken, moved out of the country, literally, to a different entity, and fed into an algorithm that is going to determine not only what advertising I’m seeing — and that’s important — but also what content, with meaningful consequences for the democracy, right? Who I might vote for, or what I know about my democracy, what I know about the election.
That, to me, ratcheted up an awareness of the seriousness of trivial data being used for serious consequences in most …
Yeah, that’s a really good way to put it: It’s trivial but it isn’t. It isn’t trivial, it’s …
Yeah, when put enough together and it’s used in certain ways, it’s no longer trivial.
No, of course. And cross-referenced with voting data and everything, yeah.
Right, and even though it’s anonymous and pseudonymous.
So the whole debate about is metadata data, or is your PII PII, this computer did not necessarily need to know it was you, it needed to know it was this computer, or this IP address, or whatever your identifier was.
And I think that has serious consequences. You remember the book The Closing of the American Mind? This is the closing of the aperture of your online experience, in a way that had potentially very seriously damaging consequences for your understanding of the world around you, of the candidates who were in front of you, of the Brexit vote in Europe, of whatever. And I think that was a wake-up call.
And because it had political consequences, I think that awoke some people on Capitol Hill. But I think that people felt … they felt deceived about the use of their data.
Deceived, you’re so kind. I think they lied to us.
And so, I think a …
“People felt deceived, not that we had anything to do with it.”
The trade, yeah. Well, I think the issue is we’re so exhausted by it that we don’t get upset about it.
It’s like breach notice fatigue a few years ago, where you got how many letters from your banks, or credit cards, or whatever, saying somebody has your credit card number. You stop reading them.
Similarly, you’re feeling … I think there is, and the Pew research shows, there is a feeling of hopelessness, or helplessness, among the American consumer about their internet experience. I’m still this much of the internet optimist. I think we can get this right. I’m short-term pessimistic, long-term optimistic.
Right. We’re going to talk about how we can fix it, but right now I want to state where we are, because I think people do feel fucked and don’t have any way to get out of it. They feel like they know, vaguely, that they’re being taken advantage of, but they also like the free stuff, and, like, why not play Fortnite or Red Dead Redemption, it doesn’t matter. It sort of becomes exhausting, and then these companies are dining out on that exhaustion, pretty much, on people’s inability to get angry about it.
It will surprise you that I’m putting my hope in Capitol Hill, but I think that the public outcry for federal legislation boundarying the use of data by companies of all kinds has, actually, some legs. And so, I am more optimistic than I have been in a while that that, as one of the pieces of the internet ecosystem, could be solvable, or at least move the ball forward on that issue.
Okay, I want to talk about the solutions later, but what I want to … And so, we’ve got the issues of our privacy, getting a privacy bill, a national privacy bill passed.
One that has teeth is probably somewhere between Europe and California, or something like that. Secondly, what else?
I think the harder question for many of the companies, and for all of us using online experiences, is: What is our duty of care as individuals? What is the private sector’s duty of care to the democracy? And would we … do we ever want any kind of government intervention here? Because we all say, oh, no, no, no, no government censorship, you know, that’s anathema to free speech.
But how do we solve the riddle of … And again, I point to Judy Estrin and Sam Gill’s article about digital pollution, the glut of bad information that is drowning out good information, the lack of quality information and the fact that the internet is the great democratizer. Everything looks like the New York Times. You can put it all in the same font, and it all looks like it’s quality information.
That has led to a breakdown in trust in journalistic … not that it’s only led to, it’s contributed to a larger breakdown in trust in institutions, institutions of the fourth estate, institutions of government, and what is, at some point, the fundamental frontline duty of online platforms as they are purveyors, or sharers, of journalism and information about the democracy.
That, I think, is the harder question, because we have stood firm on Section 230 of the Communications Decency Act since it was drafted, and we were … CDT was there at the beginning, and we are all now looking at what we have wrought. I think there’s a lot of grief among people who were there at the beginning, and a lot …
Right. Well, let’s talk about that Section 230. What is going to happen with it? I feel like these companies don’t deserve the immunity, or the big ones don’t, or we somehow ratchet it down so only a certain size company is granted broadly …
I’m not even sure if it’s size. I’m less convinced that big or small deserve any kind of protections.
I don’t think they deserve any, if they’re big enough.
I think, you know, it made sense at the time. I think, 25 years on from the dawn of the commercial internet, we’ve got to deal with the reality in front of us: not what we thought we were creating, but what is actually here and how people are using it.
And the end user is profoundly different. It is no longer a bunch of MIT grad students and engineers, right? This is mom-and-pop, and you’re doing your online banking and your kid’s music lessons and your whatever. What are the expectations, and what are the consequences, for the kinds of communities we’ve created?
I was talking to a colleague in civil society. You know, I grew up in Belfast, and was it hard; he grew up in the South of England. He told me about this civic center that was built in his town, and he said, “You know, it was a great idea. They wanted a place for the youth, the underserved youth, to hang out after school, but there was nobody minding it.” And he said within three weeks, and I don’t remember the exact time, three weeks, four weeks, a month, the place was covered in graffiti and it was destroyed.
And he said there are days — and again, this is not my thought process, and I should … I don’t want to out the person who said it, but he said, “You know, it kind of reminds me of some of these online platforms. It’s like nobody is minding the store.”
No, they don’t have to mind the store because it doesn’t cost anything, that’s why. You’ve got to put costs in. If you give them costs, they will behave, like …
It was interesting. I was having a debate with someone on Facebook about the Communications Decency Act and Section 230, and I think they should get rid of it: You don’t deserve it anymore, you’ve misused it. Oh, this and that. They brought up startups, and I said we’re at a 30-year low of startups; it’s not helping startups, it’s helping you, because you don’t have to run your thing.
And it was just … It went back and forth, and back and forth. And I was like, y’all don’t deserve this incredible gift, which is immunity. And if you …
Look, stop signs work really well, and so do red lights. If people didn’t have them, everyone would drive like crazy, because of course I would, I know I would. The idea of getting a photo ticket is preventative, and these companies don’t have any preventions. So they just spew their chemical waste into the thing, and they don’t have to put in filters, and stuff like that.
And why would they? Like, why would they if they didn’t have to? You have to rely on … I think Roger just … Roger McNamee was saying you have to rely on their betterness, like a better Jeff Bezos, a better Mark Zuckerberg. They’re just not going to be better.
The better angels of our nature.
But they’re not better, they’re not better at all, and so …
I think I am here for a conversation about that, and the one thing I would say is, let’s not do it piecemeal. Let’s not say this deserves special protection and this doesn’t, whether it’s category, size, content, whatever. Let’s have a conversation about what your duty of care is to the information you serve and to the people you are serving, and take a holistic approach.
And, you know, even one of our founders said to me, not long ago, “No liability was never intended to mean no responsibility.”
Yes, yes, yes, exactly, exactly. I’m writing down better angels. I’m going to write a column in the New York Times.
Yeah, that’s not good.
We’re here with Nuala O’Connor, the president and CEO of the Center for Democracy and Technology. We are where we are now, where trust in technology is at a low. It’s clear that they haven’t run their platforms correctly. They’re under siege.
Although some of their stocks have never been higher because they’re doing very well in the business of stealing data, you know, or using … I’m sorry, Mark. Taking our data and giving us relevant ads, thank you. In their business of giving us relevant ads, that we really want, because that’s what we really want in life is relevant ads. I’m going to make fun of that, relevant ads. It’s so funny that he’s …
Anyway, so, where are we now and what’s going to happen this year? What do you think the big themes of 2019 are going to be?
I think accountability for truth in data and information. I think that’s …
Okay, and where does that come from?
You know, I’m still hopeful, and I’m holding out a lot of hope in the short term for data privacy, but that’s going to focus on personal information. I think we’re going to have a larger conversation about who really has responsibility for the content.
And, you know, you said before, what’s our individual responsibility? Yes, I think there’s individual responsibility about what you post and what you share and that sort of thing, but I also don’t really kind of want to blame the user here. I don’t think that’s the primary solution either.
I think the one who has the most power and the one who has the most access to information probably has the greatest responsibility to make sure what is being served up is accurate, helpful. Whatever your metrics are, at least be transparent about what your metrics are. And that’s only … that’s necessary, but not a complete step.
So, I think the conversation’s going to be about the impact … like, it’s going to continue to be about the impact of technology on the democracy, and what steps we are taking, collectively and individually.
And who should be taking those, so …
All of those sectors: the companies, the governments, and individuals.
But where do you imagine it coming from? The government this time?
Well, yes, because the companies haven’t.
Right. Have they woken up to that?
I think in varying degrees, in different …
I think they act like victims. That’s what I’m getting from them, victimization. “You’re so mean.”
That’s maybe just to you.
I’m not mean.
Not mean, maybe to you, I’m saying.
I yell, “Clean your room!” and they think I’m mean. I suppose my kids are the same way, but they’re 13 and 16, so they’re allowed to think I’m mean.
It does feel like parenting, though, sometimes, doesn’t it?
It does. I literally yell, “Clean …” That’s what I’m doing. I am the internet’s mom who’s saying …
Mommy. Someone said that to me and I kind of bristled at that, but I’m starting to kind of embrace it. The whole narrative that having kids was bad for the career? No, no, no, it has made me far better and far sharper and far less willing to …
“Clean your room!”
And also things are really basic. You know, is this about greed? It’s about power. It’s about money. You know, these … It’s not really that complicated.
You see, I think it’s beyond that, because they’ve got so much money. It’s not money. And that’s why I wrote last week, they’re so poor all they have is money.
I know, I love that, it was great, was great.
Like, it was great, because the idea was it’s not money that’s …
I see what you’re saying.
… getting them, they just believe they’re doing the right thing, or they don’t believe they’ve done the wrong thing.
You have nailed what I’ve been seeing in tech now for a long, long time, which is the disconnect between the mindset and the impact. And I say, you know, the first step is recognizing the impact you’re having, you know.
And I remember being, again, back at a company in the 1990s, and they were saying, “But, but we’re fueling the internet economy, and we’re doing good things, and we’re serving this new ecosystem.” But the impact you’re having on people, you know, is profoundly different, and it’s not okay to simply say, “Well, we’re just a little startup.” You’re not a little startup. You’ve got how many billions of customers?
Right. Okay. So, it’s the realization that we’ve got to do something. So, what will be done in 2019?
That’s where I have to be honest that I think we all have to be skeptical about Congress’s power to act swiftly. I still think there is a chance for baseline federal privacy legislation in the next year, before everyone starts checking out to deal with the next presidential election.
You know who’s in the way of that in the Senate? It’s Chuck Schumer. I’m gonna fry him on this one. He can’t be in the way of this.
Why? Why is he?
Because he gets a lot of money from them, and he’s ignorant about technology. Those two things are a combination, a delightful cocktail of no action.
I do think there’s great new talent in the House. Right?
Yes, I know that. God, Ocasio’s smacking the heck out of them. I love it. I want to get her just to talk, but I just want to give her an entire hour to smack them. I’d love her to come to Code and just yell at them and look great doing it, just the whole … She’s so clever. You know what I mean? She’s so clever and witty that that’s my favorite part.
You wrote a great column on her use of the internet and how …
Yeah, yeah, yeah. It’s a moral thing. That was interesting, when she veered into that, but I kind of agree in a lot of ways. It’s a tough argument to make, that there’s a moral imperative here, but there is. There’s a moral imperative in terms of chemical companies. There’s a moral imperative in terms of pollution.
Yeah. No, I think there is, and again, are you for or against the democracy?
What are you … Yes, your first-order interest may be to your shareholders, but your second-order interest has gotta be to the community, the democracy, the way of life.
Right. Exactly. So, privacy. What else? Privacy…
I’m still worried about speech here and around the world, and we’ve got …
So, how does that get solved? How does that … Because that’s a tough one for the government to take on.
Yeah. I’m not really sure we want the government to take that on. Right?
No, we don’t want the government to take that on. No, and we don’t want the Supreme Court near it.
I mean, I think that’s pretty clear. Right? Yeah, and we don’t want government intrusion into private space and private speech, but I do think we are in the middle of — and it’s not gonna be solved this year — a national conversation about controlling speech, whether it’s on campus, whether it’s on the internet, whether it’s in our daily lives, and it strikes fear in the hearts of First Amendment advocates.
There is a great phrase that goes around our office all the time: “Freedom of speech does not mean freedom from consequence.”
Yes, I say that a lot.
I do love that. I think that’s the first line of conversation, is, “Yeah, you have the right to be a complete jerk, but we also have the right to say we don’t want you in this house or this party or this whatever.” I don’t have a quick and glib answer for you. I think it’s a much larger societal question about the expectations.
So, where’s the flash point of that this year that people can look to? Is there an area?
I think it’s social media. I think it’s Facebook and online platforms, and both the need to corral state-sponsored propaganda and, as I’ve said, the need to start talking about this: It’s not fake news. If it’s government-sponsored, it’s propaganda. We need to call it by its name.
It may seem cute to government operatives who are used to … I’ve noticed in a building nearby there was Radio Free Europe, and there’s the Public Broadcasting System, and the State Department is good at radio and television propaganda systems, but I think they don’t devote … The BBG does not devote what it needs to to internet cyber warfare or information warfare, and that’s a real concern, not just with Russia but with China, and there are really big fights ahead on content.
Right, on what should be on these platforms. Yeah.
Exactly, and I think some of that does have to involve, not government action, but an awareness that the stakes are a lot higher than people realize.
Yep, absolutely, because I think we’ve never had a major communication system run by a private person, a single private person …
That’s a good point.
… who doesn’t want to do his job, who doesn’t want to do something about it.
And what’s interesting … What do you think about Facebook’s, “I need to have this council”? I want to smack them for that one.
It does sound vaguely governmental, and I’m not sure how …
Yeah. They don’t want to do their job. It’s a private company. They can do …
I think one of the biggest existential questions for a lot of platforms is, you’re doing business in a whole bunch of different countries, in native languages that you may not even be equipped to parse or understand, or really be sensitive to the local norms, customs, communications.
I’ve always said, you want to play in their sandbox, you gotta play by their rules, meaning we have to be mindful, we are exporting — “we,” whoever that we is — entities based in the United States are exporting a US vision of how the world should operate. That’s not necessarily always welcome in other parts of the world.
I mean, I just talked to my friends in Europe and my colleagues at CDT. A dear friend said to me, “You Americans, you love your First Amendment almost the way you love your Second Amendment,” and that kind of blew me away. I mean, it made me sit up.
Oh, my god. That’s brilliant.
Right? I’ve hung onto this phrase because it made me sit up and go, “Ah, that’s …”
Uh-huh. They don’t love it as much over there. They don’t love the First Amendment.
They don’t love either of those things.
They don’t love either of them. They don’t like either of them.
When you lump one and two together, it kind of makes you think, hmm. Right?
Yeah. Oh, you love your … Which one do we think we love more, the First or the Second? We certainly love to use the First to talk about the Second.
I think in this country, it might be the Second.
The Second, yeah. Now, some people the First, other people the Second.
Yeah. I mean, I wonder what the overlapping Venn diagram of those two groups actually looks like.
It’s probably pretty close.
I don’t know.
Yeah, it could be. It’s an interesting question. Yeah.
I don’t know. I think a lot …
I don’t think anyone hates any of the amendments. They just want them to be used responsibly.
Thank you. Now you gun lovers can ride me. By the way, I know how to shoot a gun.
I don’t see the enormous appeal of it, but I do know how, and I’m a very good shot.
Again, constrained, reasonable regulation is appropriate in both contexts.
Yeah, yeah. Absolutely. It’s so funny. My son killed a boar this last summer, and my friends are horrified.
I’m like, “He wanted to learn how to shoot a boar.”
Yeah. It always surprises the gun-crazed nuts. They’re like, “Oh.” I’m like, “Yeah, right? So, back off.”
I’m not sure I’d know where that would even happen. Where does something like that happen?
It’s in California. We need not go into it, but it was delicious, and we had it at New Year’s. I don’t eat meat, but I did try my son’s boar. So, away from the boar, net neutrality, let’s go over it because we don’t have much time. What other big issues …
It’s gonna happen, man.
I mean, we were just at the Supreme Court. CDT was a party to the case. I’m super proud of that.
So, explain where that is. Explain.
I wish I could. After how many rounds of appellate litigation, I think where we will actually … I mean, the standard in the case was did the FCC act in an arbitrary and capricious manner, one of the few phrases I remember from law school, in overturning the previous set of rules that we liked.
Right, which they keep doing.
And which now we don’t like. And in fact, one of the lawyers on the case who does not want to be mentioned by name said to me, “This might have to actually be solved by legislation.”
Legislation. That’s what I said. I’m sick of this back-and-forth.
So, this is actually what I said, too. I said that.
I said that to Ajit Pai. I was like, “I’m sick of you, Ajit Pai.” I’m sick of the other guy, or whoever the …
I’m so glad … Again, talk about war, ideological war.
Yeah, it goes back and forth.
I mean, there are people on both sides who just kind of want to keep fighting. I’m like, “No, no, no. I got other things to do. I got kids to raise. I got things to do.”
No, legislation is what should come.
So, legislation that settles the playing field would be absolutely welcome, assuming it’s appropriate and constrains the things we’re …
There may be no solution, though. The signs are so …
I’m not sure that’s true, actually.
Right. Really? Okay.
I think there’s a lot more. Again, I may be showing my optimism here. I actually think there’s a fair bit of common ground, and I think that reasonable players would want the outliers constrained and regulated out of existence. I actually think there’s enough open ground in the middle where there could be.
All right. If you say so, because I get an earful from Comcast. I get an earful from Google or whoever, blank, blank, blank.
I know, I know, and yet I think that there are places where they can …
Not Google. They’re rich enough, the smaller companies. Google doesn’t care anymore.
That’s the fear with privacy legislation as well: You do not want to create a playing field that locks in the incumbents. I retweeted a great article about this, I think from the Wall Street Journal, and somebody in Europe got very, very annoyed at me, and I said, “I actually think that’s a valid concern, especially if you get …”
Absolutely. They have enough lawyers. They have enough lawyers.
Right. If you get a compliance regime that favors the rich, the big, the whatever, you are actually creating an uneven playing field.
So, I say a simpler, smaller, shorter, elegant privacy bill that says, “These are the principles we know. These are the uses that are okay. Here are a couple of really far-out things that are not okay,” and that allows for a fairly simple compliance regime for the smallest players, and also certainty in the playing field. Again, this reminds me of the banks. I started my career as a banking lawyer a hundred years ago, and people, they …
That must have been riveting.
Yeah. The bank regulated … What was the big Bank Secrecy Act, whatever?
Whoa, you must have been fun at a party.
I was super fun. I still am. Can you tell?
“Let’s talk about interest rates.”
I’m still like, “Let me tell you about privacy and data law around the world.” But the banks fought it. Credit card liability was capped at $50, and yet credit card use shot up after that. Right?
It settled the playing field. It created consumer confidence. It created stability for big and small players. That’s what we want here. We want a level playing field for new entrants to the field, incumbents, big, small, whatever, but ultimately, for the individual to know my data will be treated with respect, with dignity, and I have some sense of control over it.
Right. Okay. Last thing around bullying and addiction. They all kind of fit together in a pile. Where do you see that?
I told some folks working on addiction, I said, “Having had a family experience with real-world drug and alcohol addiction, I am sensitive to misuse” (just as I should not have used the phrase I used at the beginning of the hour), “I’m sensitive to overstating things like addiction. However, I am now the mom and soon-to-be stepmom of many teenagers, and I think, again, it’s a social and moral construct. We need to be transparent about how our devices, and games as well, are configured, and whether they are configured to keep people on and to create stickiness.”
Yes, they are. I am here to tell you.
Yes. Well, exactly. We know they are, absolutely. So, what, again, is both the parenting responsibility there, but also the-
Well, it’s a little hard to fight, I have to say. I’m pretty responsible, but it’s hard to fight. Last night, my son called me. He calls me from his bed, like he can’t get up and come down the stairs. He’s like, “Oh, can I sleep in? I just am really tired.” I’m like, “Why are you tired?” “I woke up at 3:00,” and I’m like, “Did you pick up your phone?”
“Yeah,” and I go, “Too bad. Don’t pick up your phone another night. Get the hell up.”
Let’s start with probably the most important things I’ll say on this. No phones in the bedroom, phones on the kitchen counter before bedtime, limitations on numbers of hours on screens, on games and whatever on the weekends, no phones at the dinner table, and something I’ve tried on and off to do, and I’ve been a little less successful lately, is digital sabbath, literally having a day from sundown to sundown where there are no devices.
Oh, I couldn’t do that. I could not do that.
In our house, we have two different sabbaths, so that’s actually a challenge.
I’m not sure I would want to do that, but how did it work?
Right. Exactly. I did no social media. I mean, yes, I had to do email for work or whatever, whatever, but it’s kind of cabining off uses, no games, no social media or whatever.
Can’t do it.
I’ll tell you, it was lovely.
It was really lovely.
I always like to tell my kids of time before cellphones. I was like, “In the old days, in the old days …”
Yeah, in the old days, when you actually had to call people on the phone with this curly string.
Call people. You had to stand up and go get the phone and pick it up. I’m writing an article for tomorrow for the New York Times about car use, and I found an old story I wrote in the Wall Street Journal in 1998 called “Cutting the Cord,” about how everyone will be mobile, how you will not have a landline. And I remember the reaction: “No, of course we will.” I’m like, “No, you won’t. You will absolutely …” It was interesting.
All right, Nuala, I want to finish up by talking about … Do you feel positive … I mean, you said holy war. Do you feel positive about where we are? Because tech, at least with the media and regulators, has a bad reputation now. How do they get it back, from your perspective?
Oh, they’ve gotta take some serious, serious strides…
… to talk about the constructs, again, the very architecture and infrastructure of the things they have built, and to whom they are fundamentally accountable. Are they serving the first-order interests of the consumer? When I said holy war, maybe it’s an inappropriate phrase, I’m sorry. What I mean is the differentiation the different companies are trying to make among themselves on how they treat consumer data, in a respectful way versus not, or versus the perceived not. I think you are going to see a race, hopefully to the top, on this is how we treat our customer, and data is part of the equation.
It’s old-time customer trust. It’s old-time respect for the customer. Knowing your customer is fine, but are you using the information you have gleaned about them in a way that, yes, furthers your corporate interest, but also furthers their needs, and not in a way that is adverse? And I think, no, this tech sector is not in a good place right now. It’s in an incredibly disregulated place, literally and figuratively. I am, again, short-term a little pessimistic, but long-term optimistic that there’ll be enough public pressure, thanks to good journalism and good writing about it, enough public pressure from civil society, and ultimately government action to constrain some of the worst uses.
All right. Nuala, you need to come back and see what happens next year.
All right, and the No. 1 most important issue, from your perspective?
It’s still speech.
Speech. I’m going with privacy.
Good. I’m glad.
I’m going with privacy. I’m thinking a privacy bill. Who shall I squeeze up at Capitol Hill?
All of them. We’re talking to everybody. It’s gonna happen. Maybe that’s why I’m not making it my No. 1, because I’m not worried about it. It’s gonna happen.
I’m confident of that.
Who? Which is the senator over there that’s gonna do that? There’s a woman who’s on the FT that they …
There’s a handful that all …
There’s a commerce … There’s a lady from Illinois who’s running that committee, the subcommittee.
There are quite a few that we’ve talked to.
There’s quite a few. Yeah, absolutely. Schumer, I’m coming for you. Anyway, Nuala, it was great talking to you. Thanks for coming on the show.