HEFFNER: I'm Alexander Heffner, your host on The Open Mind.
The headline of a recent Forbes report reads:
"As the Trump Presidency Looms,
Digital Activists Brace for a Fight for the Internet".
The story featured prominently the work
of Cindy Cohn, Executive Director of the
Electronic Frontier Foundation.
Based in San Francisco, the EFF is the leading
nonprofit organization defending civil liberties in the digital world.
Founded in 1990, EFF champions user privacy,
free expression, and innovation through
litigation, policy analysis,
grassroots activism and technology development.
So as the new administration considers
its tech policies and advocacy for 2017
and beyond, I asked Cindy to join us today to consider
how she and her colleagues are going to safeguard
our most fundamental digital institutions that appear
under constant threat, and surveillance, we might add.
Welcome, Cindy.
COHN: Thank you.
HEFFNER: How are you and your colleagues
at the Foundation and elsewhere intent on protecting,
whether it's digital resources or
infrastructure, to ensure that a free and open web
never ceases to exist?
COHN: Well, it's always been the case that digital
technologies could help give us more freedom
and liberty, but only if we wanted them to
and only if we made a set of decisions that point
in that direction.
And so we are very concerned about
the language coming from the Trump Administration
and some of the key appointees so far,
and that that freedom is going to be under greater threat.
The tools that we use are
litigation, which may become more important:
if we have a hostile administration,
it may be up to the courts to stand up
for our liberties in a way that they might not
have been called upon so much to do before.
We're also building technologies.
We're working to encrypt the web and we've got a
project to encrypt email as it transits the internet
more thoroughly, and other kinds of projects
where the technology itself serves the goals
of liberty and privacy.
And then of course we'll continue to be engaged
in the policy battles as well.
HEFFNER: What do you concern yourself
with most as the central policy battle?
COHN: The idea that the internet- that new
technologies, left to their own devices,
will magically make us all better people
living in a more just society was never really true,
and now I think we're beginning to see it.
So what does that mean for us?
I think it means first of all we need to build
a secure internet and a privacy protected internet.
We need to make sure that somebody in charge
of policy decisions can't affect our ability
to have a private conversation, to organize,
to build communities of support for each other.
The thing that the technology lets us do
is that it lets us coordinate,
speak to, and work with people who are very far
away, in- you know, in a way that no technology
before has really allowed us to do.
But that doesn't magically fix things.
We need to have a secure set of technologies.
We need to find ways to empower users,
to get rid of people in the middle who can have
so much of a censorship effect
on what we can say and do.
And then we need to use these tools to educate
people to make better decisions,
as opposed to just fomenting and spreading
the worst ideas, which the technologies are happy to do.
We need to start marshalling these- these
forces to be able to help us decide better
what's true, to help us spread good ideas,
to make sure that- you know,
there's an old quote, I think from Mark Twain:
"a lie can make it halfway around the world
before the truth gets its boots on."
The technology isn't going to fix that magically.
But if we start thinking about it that way and
empowering users, we might be able to- to do that.
But you know, at the end of the day,
technology is just reflective of what's going on in society.
And I think often it's very easy to blame
the internet for hate when hate happens because
people aren't feeling heard and because
they're being given, frankly, bad stories about
why their suffering and the suffering they're
seeing around them are actually occurring.
And technology can't magically solve that problem.
We actually have to move beyond technology
to have conversations with people about how
to make the right choices in these situations.
And so I often find that it's easy to blame
the internet because of course the
internet can't really speak back,
as opposed to looking a little deeper
at why some of these conversations
are catching fire in a way that they didn't before.
HEFFNER: Well, whenever I'm on a college campus,
I'm asking the students
what browser they use, what platforms they're on,
and really having them evaluate their digital
footprint as citizens. And I think to this day,
young people are surprised to learn that if you're on
Safari or Chrome, it's going to be a different
outcome than if you're on Mozilla Firefox in terms
of the security you have and the privacy you can retain.
So that literacy point resonates for me very much.
We're at what appears to be a dangerous moment
in the way we configure the internet.
Because at the same time, we want to develop
cybersecurity so that the Russians or any entities
are unable to hack into our political leadership's
emails, or any citizen's email.
We also want to create encrypted
networks to protect against
any attempts at a Muslim registry,
some of the chatter that's come from the
administration about surveillance
targeting people based on their skin color or religion.
So how do you balance that,
the reality that we need more security,
and yet recognize that individuals demand
that same level of protection too?
COHN: Well, I think that they both point
in the same direction. Right?
We need networks that are secure,
that aren't hackable, that aren't tappable,
that are as protective of our privacy as possible,
both for societal reasons and for personal reasons.
Whether your threat model is identity
theft, where somebody's going to come hack your identity,
or whether it's that you might lose
your phone: three million people
had their phones stolen, and another 1.2 million lost them,
in 2013 alone. That's from Consumer Reports.
Your data needs to be safe so that when these
incidents happen it's not an utter catastrophe for you.
Because more and more,
our ability to get a mortgage,
our ability to actually navigate the world, depends
on information that's sometimes in the hands
of third parties.
And if that information is corrupted,
it can really affect our lives.
So we believe strongly that one of the
core values we need to have in a digitized world
is strong encryption, strong protection of that information.
And that requires having conversations about the
government's role in tapping the networks,
and also private companies' roles in gathering
these vast storehouses that can then be used.
You know, Facebook or other social networks
are a really simple way for the government to figure
out, say, who's a Muslim, who's an immigrant,
who's part of a vulnerable community.
We need to make that information less available
to them and harder for them to use, so that
the standards get raised.
So we're a very strong proponent of strong
encryption and strong security, not just because
of the privacy implications but because
of the security implications as well.
HEFFNER: What are you expecting and hoping to
hear from these companies in their pledge to protect
their users' privacy?
COHN: Well, I think the first thing they need to
do is really deploy strong encryption everywhere, right?
Data at rest should be encrypted.
If you break into the Democratic National
Committee's computers,
you shouldn't be able to see
data in the clear.
It should all be encrypted at rest,
and it should be encrypted when it's traveling.
These things are
not that hard to do. They take a little thinking,
but they're all things that these companies could do.
Many of them started doing them
significantly more after
Mr. Snowden's revelations
proved that the NSA was actually
tapping into these networks: where
Google servers in one country were talking to
Google servers in another, the link
between those two servers was one of the key places
that the NSA and GCHQ, the British equivalent,
were tapping in to suck down
information about us.
Google quickly started encrypting those links,
as did lots of other companies. But there's more to be done.
And EFF has been helping.
We actually launched, in conjunction
with a group called ISRG, something called
a certificate authority that is now the
number one way that you
can encrypt your web traffic.
It requires the websites that you
visit to engage in something called a
certificate process.
But since we launched one of these that's free and
dead easy, we've seen the part
of the web that's encrypted rise
dramatically. Still, we're a tiny
nonprofit in San Francisco. We shouldn't be doing this.
The companies should be doing this themselves.
And they're starting to.
So I think that's a first and easy step
for the companies to take.
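The "encrypt the web" effort Cohn describes has a server side (the free certificate authority launched with ISRG, i.e. Let's Encrypt) and a client side: browser extensions such as EFF's HTTPS Everywhere rewrote insecure links to their encrypted equivalents. A toy sketch of that rewriting idea in Python; the function name is illustrative, not EFF's actual code (the real extension used per-site rulesets rather than a blanket rewrite):

```python
from urllib.parse import urlsplit, urlunsplit

def upgrade_to_https(url: str) -> str:
    """Rewrite a plain-http URL to https, leaving other schemes alone."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        # Keep netloc, path, query, and fragment; swap only the scheme.
        return urlunsplit(("https",) + tuple(parts)[1:])
    return url

print(upgrade_to_https("http://example.org/page"))   # https://example.org/page
print(upgrade_to_https("https://example.org/page"))  # already secure, unchanged
```

A blanket rewrite like this breaks on sites that never deployed HTTPS, which is why the ruleset approach (and, later, server-side redirects plus HSTS) won out.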
The second thing is for the companies to push back.
Don't roll over when the government shows up
and seeks information about your users.
Demand all the legal process,
fight gag orders. A lot of the process
that has been
implemented in the last 15 years involves
not only getting information from companies,
but making companies not tell you that
your information has been collected from them.
Companies should begin to push back against
those orders and make noise.
Again, some of this is already starting to occur,
but we need more of it to occur.
And then we need the companies to stand up
for us in the policy debates as well, you know?
President-Elect Trump
called a meeting with tech, and it was all tech
corporate leaders.
There was nobody representing the users
of technology. Of course, the internet
isn't them; the internet is us, all of us who use these technologies.
It's great that they build them and it's great
that they have a good business model,
but they're not tech; they're just a piece of tech.
So they need to stand up for their users
and recognize that if they don't stand with
all of us, the users, then we all go down together.
HEFFNER: What about code of character?
COHN: Code of character?
HEFFNER: That's the problem, right?
That's the problem, because if I said that-
I- I- I'm going to spell it out in a second.
COHN: OK. Good. I could guess, but-
HEFFNER: No, I won't make you guess, but Farhad Manjoo- um-
technology columnist for the New York Times-
wrote a piece not so long ago on Twitter.
Twitter has the right to suspend Donald Trump,
but it shouldn't.
COHN: Mm-hmm.
HEFFNER: I ask you this in the context of,
not just the freedom to safeguard your personal
information, but violations of folks' space
on the web, and these companies' failure to
institute a code of character for their users.
To ensure a degree of public discourse,
a quality of public discourse, that is
going to encourage
basic human decency. If you peruse Twitter,
Facebook, or some of the other social media-
and we had the CEO of the Anti-Defamation League
on recently- you'll see a huge proliferation of hate speech.
And so when I say a code of character,
it occurs to me, Cindy, that- and I wonder
what EFF's response is to this- that we- we ought
to embrace a freedom from- not just a freedom
to or of- information access but a freedom
from disinformation, misinformation,
that seems to be so pervasive online today.
COHN: Well, part of my hesitation with this is:
who are you going to make god?
Right? Who's going to be
the person that decides what you and I see?
Who are we going to trust with that awesome responsibility?
Because I'm not sure I think that
the executives at Twitter, or
the people at Twitter who do their moderation,
many of whom are
contractors with very little training,
some of them based in
San Bruno, California,
many more of them based all over the world,
are really in a good position to be able to tell
what stuff we should be able to see and what stuff we shouldn't.
Our pressure on Twitter and these other companies
is to put tools in the hands of their users,
to let them control what comes at them
and let them control what they do.
There's plenty those companies
can do to empower people: to let them mark
things as inappropriate,
to share block lists,
to do all sorts of things to control what they see.
But the minute you outsource that and demand
that a company start doing it,
they're going to start making decisions
that I don't think you're gonna agree with.
And they're going to get played.
We see this all the time in fights
on Twitter and other social networks around, say, Ukraine.
There are lots and lots of people
in Russia and in Ukraine who are fighting over
who's a Nazi, who's not a Nazi,
who's a real believer in freedom,
who's not a real believer in freedom.
And these companies get played all the time
in terms of who they block and who they let speak.
And you know, Twitter does a lot of blocking.
So do all of these groups.
And they get played- this happens in the-
HEFFNER: What do you mean they get played?
COHN: It means that they get convinced that
somebody's a bad guy when they're not really a bad guy.
You know, the Chinese are masters at this.
They flood the network with people who will push
and attack the people who are critical of the
government, such that those critics get
portrayed as if they are the bad
guys, when I suspect, from a human rights
perspective, we might actually think that
they're the good guys.
This happened in Ukraine and the fight
around Crimea, where many, many
Russians flooded onto Twitter and tried
to convince Twitter that the people who were talking
about Ukrainian independence were actually Nazis.
Some of them actually probably were Nazis-
the Ukrainian Independence movement is complex.
But the idea that a bunch of third-party people
sitting in an office in India are
going to be able to understand the geopolitics
of these conversations, to understand who's a good
guy and who's a bad guy and what's fake
and what's real about that, is, I think, wishful thinking
on the part of a lot of us.
The First Amendment
idea is that we don't have a centralized person
who decides what's true and what's not true,
what's right and what's not right, and which voices
deserve to be heard and which voices don't.
Instead we let the people themselves decide
what they get to hear and the government stays out
of that conversation.
Now, Twitter's not the government,
and Farhad is right to say that Twitter
certainly has the power to decide who gets to speak
on its platform and who doesn't.
HEFFNER: And he's pointing out that there could be
blowback from Trump's removal, just as there
has been, and as you've documented in individual cases:
you remove one set of racist accounts
or one kind of bigotry, and what you sometimes see
in return is a multiplication of it.
COHN: Yes. We call that the Streisand effect
for historical reasons at EFF. I think that's right.
I mean, not only is it dangerous to put somebody
in charge of that, but you'll lose.
Right? Because people find ways to communicate ideas
to each other regardless.
There are no magical keywords that
only bad people use and good people don't.
You know, I heard your ADL episode,
and here's an example.
If you've got a picture of
Auschwitz, that might actually be
something very powerful, something
we think ought to be shared, in one context,
and something that's really awful
and should not be shared in another.
And expecting Twitter to be omniscient enough
about every single situation in the world,
such that they've got people who can make those
decisions, and make them in the way that you
and I would think was right- it's wishful thinking.
HEFFNER: Let me tell you what the problem is though.
It may be wishful thinking,
but the problem is that Twitter is to this
generation what Encyclopedia Britannica was to us.
That's the problem.
The problem is that Twitter is legitimizing
voices. Because so many people get news
and information from Facebook and Twitter,
it becomes valid in a way that it
would not be valid otherwise.
That's what I see as the problem.
COHN: But that's a media literacy problem. Right?
This is because this is a new technology
and people haven't learned the kind of skepticism
and the fact-checking ability that you need
to have to be able to deal with the new technology.
You know, Encyclopedia Britannica had errors in it too.
But I think-
HEFFNER: They weren't Nazi memes, though, I mean it just-
COHN: Right, but, but- Twitter isn't supposed
to be the Encyclopedia Britannica and the fact
that people are treating it like that is something we need to fix.
HEFFNER: Well let's talk about the fixing of that problem.
COHN: So- I mean- because there have been
yellow sheets, right? There has been bad
information shared widely in the world.
HEFFNER: I understand.
COHN: And in this country, for a long time.
But we all learned the difference.
We all need to learn the difference between what
information you can trust and what information you can't.
And no amount of technology can replace
that educational process.
And it can lead to really scary bad places if
we decide to do that.
You know, if we decided that the FCC should decide
what's true and what's not true,
that might seem perfectly OK for people
under one administration, but then somebody
else is going to grab the reins of power,
and are you going to be comfortable with that as well?
Official censors, or pressure on companies to become
the official censors, is something we have
historical examples of,
and it doesn't really end well.
And again, our founding documents
were created to make sure that didn't happen,
so that we didn't end up in a world where a certain
subset of powerful people, be they in government
or in private, got to decide what the rest of us saw.
HEFFNER: Right. I hear you.
One of the reasons I invited
you here was to help our viewers understand, beyond
what browser you use, what EFF implores,
what it encourages, in terms of
responsible stewardship of digital technology.
Because the reality is that Twitter
and Facebook have not owned up to being media companies, or
reconceived themselves as educational companies, in
the way that would enable the kind of experience
you could foresee: one in which there is a mechanism
for learning that is not subject to censorship,
but that at the same time allows a greater,
friendlier collaboration on social media.
So what can we do, besides take ownership
of what browser we use, or download the badge?
What are we to do, in playing our role
in the digital ecosystem, to model the kind of internet
we want for the broad spectrum of folks?
COHN: Well, I think that people can engage in fact
checking and share that information.
I also think one of the things
that we should unpack a little bit
is this idea that Facebook especially is some kind
of neutral platform, and that supposed neutrality
is the problem, because Facebook isn't showing you
a neutral set of things when you log on to Facebook.
It has algorithms that are trying to decide what you
want to see based on what you've looked at before
and what people in your cohort have looked for
before, and using these big data techniques to try
to present you with information that will keep you on Facebook.
That's not neutrality. That's a very different thing.
And I think that talking about that, and
pushing Facebook
to get out of the business of reinforcing everybody's
biases by what they show them-
HEFFNER: Right.
COHN: That's actually a legitimate thing
to have a conversation with them about
and about letting people have more options than just Facebook's
algorithm's prediction of the kind of thing
that's gonna keep you outraged
and upset and thus on Facebook longer.
HEFFNER: But what about net neutrality,
which seems to be up in the air now that the new
administration has taken power? How would you
underscore the importance of the FCC's existing
rules, and how may they be reformed now?
COHN: Well, the worry is that
the people who control
what we like to think of as the pipes
of the internet will start using that power
to extract rents from people who want to serve you information.
So the idea is that AT&T goes to
Craigslist, or one of the other websites
that you might like to go to, Thirteen.org, and says,
well, nice website you've got there,
why don't you give us a little money
and we'll make sure it actually loads for people.
So that's what we're going to be watching for.
I mean, we're going to fight for the rules.
That's what the rules are attempting
to stop: to make sure that
your ability to access a website depends on your
desire to access the website, and not on the deals
that the intermediaries have made with each other
about what information you get to receive,
what information you don't, or how well it works.
This used to be called the end-to-end principle
before it was called network neutrality.
We're going to keep watching for that.
We've already been the leading organization
identifying situations in which that principle's
being violated, and we're going to continue to do so.
But it is nervous-making.
I think the FCC took a pretty bold step
to try to make sure that they enforce these rules for people.
And if that enforcement goes away,
I'm worried that the duopoly- Verizon and AT&T,
who control access to a large percentage
of people's broadband, and Comcast as well- will
start viewing their role as basically deciding
what you get to see if you're one of their customers.
HEFFNER: And finally, Cindy,
what about the prospect of deletion of content
on .gov sites and across the internet space?
COHN: Well, I think there's a worry about that.
There's a worry about the libel laws, given things
that Mr. Trump has said, and other things.
That's why organizations like the Internet Archive
have become really important. Brewster Kahle,
the founder of the Archive, is on my board,
and we have represented the Archive a lot
in various battles that they've had.
But we can do that.
HEFFNER: And he's moving offshore, to Canada.
COHN: Well, he's trying to make a copy of the Archive
that's not solely vulnerable to U.S. jurisdiction.
There's a piece of the Archive that's already
in Alexandria, at the Library of Alexandria in Egypt.
He's moving more and more of it to Canada,
just so it's distributed.
And one of the nice things that networks let us do
is distribute information so that if it goes down
in one place it's available in another still.
And I think it's wise for us to begin to think
about what's important information for us to be
able to share, keep, and not lose down a memory
hole and begin to make many copies of it
and spread it across the internet.
HEFFNER: And what does that mean for a lay
person, in terms of their storage on Google Drive or with Apple?
COHN: Well, for starters, as a consumer,
you should never just have something
on a third party's computer.
You should always keep a home copy of anything
that matters to you.
You can just buy a cheap hard drive,
plug it into your computer, and
save all your important documents locally,
so that if those services go down or shift dramatically,
you still have all of your
important information.
That's important to do regardless.
People should be doing that anyway because
those third party systems can go down.
They can have problems, they can- you know,
there's mergers and acquisitions.
Companies that were here today are not there tomorrow,
and they do not continue their services necessarily.
Uh. Anybody who's ever had music that they bought
from- you know, a Zune or something like
that understands this.
We should think of all of our important documents
and information in that same light and you should
always keep a local copy. Um. So that's-
HEFFNER: Right.
COHN: That's just straight consumer information,
I think, but that's important too, because
if those third parties go down,
then people are still going to have
this information.
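The "keep a home copy" advice can be sketched as a small script: mirror the files you care about onto a cheap local drive. A minimal sketch in Python; the function name and paths are illustrative, not a specific tool EFF recommends:

```python
import shutil
from pathlib import Path

def backup_documents(source: Path, backup_drive: Path) -> int:
    """Copy every file under `source` to `backup_drive`, preserving layout.

    Returns the number of files copied.
    """
    copied = 0
    for src in source.rglob("*"):
        if src.is_file():
            dest = backup_drive / src.relative_to(source)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)  # copy2 also preserves timestamps
            copied += 1
    return copied

# e.g. backup_documents(Path.home() / "Documents", Path("/Volumes/Backup"))
```

Running it periodically (by hand or from a scheduler) gives you exactly the local copy Cohn describes: if the cloud service disappears or changes terms, your documents are still on your own hardware.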
Then there are networks, like Tor,
the onion router, and other sorts of things,
where people who have a little bit more technical
knowledge can begin to make copies of critical
information and keep them in multiple places
so that it's available.
HEFFNER: Servers that are not just Palo Alto, and San Francisco.
COHN: Exactly. Servers that are available around the world.
Um- systems that are available around the world.
Human Rights groups have been doing this
for a very long time.
People are trying to get important information
out of places like China, and Iran and Iraq.
These are all systems that, if you go into the human
rights world, are kind of well established,
and it just may be that the rest of us need
to think about how to use them a little bit as well.
But the good news is that a lot of them exist
and they're fairly easy to use,
things like peer-to-peer systems as well.
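Distributed copies only help if you can tell an intact mirror from a corrupted or tampered one; a standard technique, used by archives and peer-to-peer systems generally, is comparing cryptographic hashes of the original and each copy. A minimal sketch, with illustrative function names:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hex SHA-256 digest of a file, read in chunks to handle large archives."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def copies_match(original: Path, mirror: Path) -> bool:
    """True when a mirrored copy is bit-for-bit identical to the original."""
    return sha256_of(original) == sha256_of(mirror)
```

Publishing the digest alongside the mirrors means anyone holding any copy, anywhere, can verify it independently, which is what makes "many copies in many places" a real safeguard rather than just redundancy.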
HEFFNER: Thank you so much for joining me today.
COHN: Thank you.
HEFFNER: And thanks to you in the audience.
I hope you join us again next time for a thoughtful
excursion into the world of ideas.
Until then- keep an open mind.
Please visit The Open Mind website at
Thirteen.org/openmind to view this program online
or to access over 1,500 other interviews and do
check us out on Twitter and Facebook @OpenMindTV
for updates on future programming.