The Misinformation Society

– So welcome. This is our final, our fifth class of Journalism Under Siege: Truth and Trust in a Time of Turmoil. I’m Dawn Garcia. I’m the co-host for this course and the director of the John S. Knight Journalism
Fellowships Program, and my other co-host is Michael Bolden, managing director for communications. This speaker series is a collaboration of the JSK Fellowships, and the continuing studies
program here at Stanford. Each of the five nights of this program has been devoted to issues that are important, key crucial issues that are part of the
JSK Fellowships program, and important issues in journalism. So tonight’s class is The
Misinformation Society. So the midterms are
just upon us, very soon, and we have a deluge of
information coming our way, and a lot of misinformation
coming our way, so it’s flooding the internet and the airwaves, and how does anybody make sense of it all? So this past week has been
a particularly awful one in terms of misinformation
and speech online, and what it can lead to. I just wanted to highlight one thing, and then we’ll move into
our session here tonight. There’s a story in The New York Times that said on Monday, if you did a search on Instagram, the photo-sharing site that’s owned by Facebook, you’d find a torrent of anti-Semitic images and videos uploaded in the wake of the very tragic shooting at the Pittsburgh synagogue last weekend. If you searched for the word Jews, you found 11,696 posts with the hashtag jewsdid911, which claimed that Jews orchestrated the September 11, 2001 terror attacks. So I think these posts point
to a real stark reality that we’re facing today, which is over, you know the
last decade in Silicon Valley, media companies have extended their reach, and influence all over the world, but one of the things
that’s glaringly apparent is they’ve not quite understood perhaps, or tried to figure out
and have not been able to deal with some of the
negative consequences, or they’ve tried but it’s
been very complicated, so we’re gonna talk about that tonight. You can’t really put the
genie back in the bottle, so there’s going to have
to be solutions found. Tonight I’m gonna moderate a
conversation with Alex Stamos, who’s here with us. He’s the former chief
security officer at Facebook, and he’s now working
to improve the security and the safety of the internet through his teaching and
research at Stanford University. The second part of the
evening will be a conversation with academics and journalists, and specialists who are
dealing with the effects of misinformation, and with the public’s trust in the media. Please check our Canvas site
for full bios of our speakers. As we do every evening, we’ll have a short break between the two sessions. We invite you, again as always, to write your questions on index cards. You guys are getting really good at those, and give them to Erika Bartholomew. Erika also has a scarf and a key that somebody left last week, so if they’re yours please claim them. A final reminder about Canvas. All the material we’ve been gathering for you is on the Canvas site. Books, links to websites,
articles, biographies. They’re gonna be available
for the rest of the quarter, but the website does turn into a pumpkin, and not on Halloween. December 28th at 11:59
apparently it disappears. So if you’d like to
read any of those things or download them, please
do that before then. We will have videos
posted on YouTube as well. Okay, moving on to our key event. So Alex Stamos is a cybersecurity
expert, business leader, and entrepreneur working to improve the security and safety of the internet, now through teaching at Stanford. He’s an adjunct professor at Stanford’s Freeman Spogli Institute, a William J. Perry Fellow at the Center for International Security and Cooperation, and a visiting scholar at the Hoover Institution. Prior to joining Facebook he was at Yahoo, where he was the chief
information security officer, and he led the company’s response to the Edward Snowden disclosures, so we might have a few
questions about that. So let’s begin. So you most recently were at Facebook, as chief security officer, where you coauthored a report, Information Operations and Facebook, which was an examination of the 2016 US presidential election. What were your biggest
takeaways from that report? – So that report was a kind of midpoint, a progress report on a research project that we kicked off after the 2016 election to try to dive into what is fake news? And if you remember, I mean fake news in the context of when
it was used in 2016. The term fake news has a
different connotation now, but at the time people
were talking about the, you know Pope endorses Trump stories, Hillary actually has cancer stories, all of those kinds of stories
that you saw on social media. There’s a question of this content, where does it come
from, who is driving it, what is their purpose? And there’s a big project to kind of dive into that and figure it out. And the truth is, from a volume perspective, and from the amount of stuff people saw, the vast majority of what people were calling fake news at the time was actually driven by people who were financially and ideologically motivated to push it. Not by Russians or any
other foreign actors. The vast, vast majority comes from that, but then as part of
that we had found groups that looked aligned
with Russian interests, and one of the things we
wanted to do with the paper is kind of tease apart the various, you know instead of just
using the fake news terms, we tried to come up with
definitions of misinformation, disinformation, information operation, and then we laid out the
model of what at that point we believed a Russian information
operation looked like. – So a couple questions with that. So what is the difference
between misinformation, disinformation, what was the third– – Information operation.
– Information operation. – Yeah so for our purposes, and this is a problem in
that people who work in tech, work in journalism, work
in kind of offensive cyber have very different
definitions of all this. For our purposes, we were
using the term misinformation to mean information that
is misleading or untrue that is shared without
knowledge of that fact. So you know, information
that is re-shared and pushed, not necessarily with the
intention of misleading. Disinformation is
information that’s pushed intentionally to mislead, and information operations
is the umbrella term for operations on behalf
of an organized group most likely aligned with state government, but not necessarily, and we
can talk a little bit about some of the post-2016 elections, where that definition
becomes more interesting, but with an organized group
to effect a geopolitical goal through manipulating the
information environment. So that could include
misinformation disinformation. It also includes offensive
cyber capabilities to feed into the misinformation
disinformation campaigns. – [Dawn] Right. So the misinformation operations, and you mentioned they’re not all Russian, but some of them were
supporting or tied to. Who are these people? – So, we’ll take a bit of a step back. There are really three categories
of Russian operations against the US election in 2016. The first was the pure
propaganda operations that happened mostly on social media, and the people behind that were generally people working for private organizations with a nebulous connection to the Russian state
infrastructure, but that are clearly trying to work on the same playbook as the Russian intel agencies. And so the umbrella term
for a bunch of these is called Project Lakhta,
which is the name of the street that the main office used to be on. People talk about the
Internet Research Agency. Internet Research Agency is
one of these shell corporations that was uncovered in 2014. They actually used a bunch of
different shell companies now, but effectively you can use
Internet Research Agency as an umbrella term for all the people working for this private organization owned by an oligarch who’s
actually a restaurateur and a friend of Putin’s,
and their goal is to push divisive narratives
in America generally, and around the election obviously to do so around the candidates and the topics that are in the election. We can come back to what
that model looks like a little bit more if you like. – [Dawn] Okay. – The second category of operation was the operation run by the GRU, which the GRU is the main
intelligence directorate of the Russian military, so we don’t have a direct equivalent of this organization in the United States. It’s something like a
combination of the CIA, the Defense Intelligence Agency, the DIA, and parts of the NSA, but these are people who work for the uniformed generals of the Russian military, and the GRU’s offensive cyber operations are often focused on weakening NATO and weakening enemies of
the Russian Federation as part of kind of hybrid warfare against the overall NATO alliance. The GRU activity was a
hack and leak operation. So the GRU hackers were the ones who broke into the DNC. They’re the ones that
spear phished John Podesta and stole John Podesta’s email, and then they took that information and because they now had the leaked emails and leaked information, they
had an incredibly potent tool to change the information
environment in the United States, and so they pushed that leaked information to specific journalists that they thought would be vulnerable to manipulation. Those journalists wrote the stories the GRU wanted: out of, you know, all of these emails they’ve stolen, we want to tell two or three stories to hurt the Hillary Clinton campaign, so they leaked it to those journalists, the journalists wrote their stories, and then the other side of the
GRU amplified those stories on social media and elsewhere. And then the third category was the part that is the most subtle, and is perhaps the most scary for future elections, which is there was a
campaign to break into the election systems of around 30 states. And in the end, it doesn’t look like they did anything there, and
that could have been a dry run to see what kind of capabilities they have to break into these systems, or in a alternate universe,
that might have been effective, and my thesis here is
I actually don’t think the Russians thought Trump was gonna win. I don’t think even the
Russian intelligence services are better than Nate Silver
at predicting elections, and if you look at, you know, the IRA was involved on both sides of trying to get Americans to hate each other, and the GRU activity was
not pro-Trump activity. It was activity to hurt Hillary, and to create a crisis in
her government from day one, and a conflict between her
and Congress on day one, and so there’s an alternate universe I think where Hillary wins that night. If you remember a couple of
weeks before the election, Trump started dropping
hints that it’s all rigged. He’s not gonna concede. So Hillary wins, Trump doesn’t concede, the next day Wikileaks or
one of these personas the GRU has created could dump
out, here are screenshots from inside the Nevada
Secretary of State’s office. Here’s a database of
voters that has been taken and manipulated from inside
the Ohio Secretary of State’s, and they could have said
we are pro-Hillary hackers and we swung the election for her, and what you would have
had is the FBI saying we’re gonna investigate, and a week later they
would probably say yes, there’s evidence that hackers were inside of all these networks. We don’t know what they did, and you would have
complete total chaos right? We’d then end up in a
Bush v. Gore situation, so those are kind of the three categories, and the first one is kind of
what a lot of our focus was on, ’cause that’s kind of the one that happened mostly within social media. – [Dawn] So all of that was going on– – Yeah. – And Mark Zuckerberg said it was ridiculous that Facebook had any role in problems with the election. – I’m not sure that’s
exactly what he said. I mean I don’t like to be held responsible for things that Mark Zuckerberg says because I don’t own
like a quarter of Kauai. So if I was responsible
for things that Mark said, if you gave me that much of
Kauai then I’d be cool with it. – I guess the question is why do you, I mean you’re not Mark, but why would he initially
dismiss the whole thing as, I don’t think he said
ridiculous, I forget the word– – No– – But I mean it was really– – And I think he believed
that it was ridiculous to say that Facebook threw the election, which is still an open, I don’t think it’s ridiculous to say it. I don’t think he should have said that. He was not at that time
briefed on the fact that our team had been finding
and stopping GRU activity since the start of 2016, right? So we had kind of an internal
communication issue here of him not understanding all of the stuff that was happening,
and it’s a big company. But he shouldn’t have said
it, but it’s also still, I think, an open argument: of the 20 things that happened in the last two months of the election, which of those carry the most weight? There are two books coming out right now, one of them from quantitative social scientists, and they have two totally different answers to that. So, but yes, I mean he said it and I don’t think he should’ve. – So– – Certainly I didn’t think
it was ridiculous, right? Like at that time, we were
deep in this investigation and we had already seen and
shut down a bunch of GRU stuff. – So what’s happening
now is it’s just escalated: the misinformation, the spread of it online, especially through social media. It has seemed, at least I think to readers, people watching from the outside, that Facebook and Google
seem a little lost in how to handle it, at least publicly. – Yeah. – What’s your view? You’ve worked for both Yahoo and Facebook. – Yeah, two very different companies. – Two very different companies. Do they want to fix it? Do they know how to fix it? Do they need help fixing it? Do they care? – Well it matters what
you mean by fix it, right? And that’s, I think
that’s part of the problem is defining what the problem
space is here, right? Do they want their
platforms not to be used for foreign influence on elections? Absolutely, yes. Do they want to be the arbiters of what is true? No, right? And one of the things
you gotta think about when you think about the
decision making processes of these companies is that
they are staffed by Americans, and a lot of the thinking that goes into kind of the
fundamental policy decisions is based upon a very American
idea of what free speech is, and the fact that you
and I can sit up here and criticize the current government and not worry about being
disappeared from our beds tonight. But 90% of Facebook’s users
are not American, right? – [Dawn] Right. – And of the 2.2 billion
people on Facebook platform, and then 2.5 billion people
on the family of apps, my guess is that around
half of those people either live in non-free countries, or live in emerging democracies that do not have protection for speech. And so that’s the other problem is that we take a very American
view of what is fake news, what is the misinformation problem, but any solution that’s put in place looks very very different in
Turkey or India or Thailand, and I think that’s one
of the things you see. It is true, the companies don’t know exactly how to fix this, right? And what you see from the outside is all of these different
groups and equities sometimes rising to the top
and winning some battle, and then them falling behind and somebody else winning, right? And I think that’s one of the
problems the companies have is that they’re not working off of a consistent set of principles
about what they want to do, and so it’s all extremely reactive to whatever the last emergency is, and it is not based upon some kind of reasonable, thoughtful
process of these are the problems we’re gonna work on, and we’re gonna set
these other ones aside, and we’re gonna be public
and transparent about that. – [Dawn] Right, so we had
talked a little earlier about, and you just mentioned now that it’s been pretty US-focused, some of this, but some pretty serious things have happened around the world, right? – Yeah. – Like fake news, fake stories
in India about child kidnappings, which then led mobs to murder more than a dozen people in a year, or Myanmar, where, I understand, doctored messages on Facebook stoked fear and anxiety about the Muslim Rohingya group. What do these kinds of
international information campaigns that spread like wildfire, what can be done about those? And what places in the world
are you most worried about? – So those are two
totally different problems it turns out, right? So the India problem
is WhatsApp has become the dominant platform there for
interpersonal communication. – For anybody who doesn’t know what WhatsApp is, maybe just– – Sorry, yeah because people in America, WhatsApp is basically the most popular chat app in the world, and not a lot of Americans use it. So it does also show
you kind of the future is not American domination
of the internet, and that this is an American company that belongs to Facebook now, but the vast majority of their users are outside of the United States. WhatsApp is a tool that
started with chatting, and then chatting in small groups, and can now do voice and video chat, but still a lot of it’s used for texting, and sending of, there’s
actually a decent number of WhatsApp users are
functionally illiterate, and use the app without
being able to read and write, and so they send voice
messages to each other. So there’s actually all these interesting emergent properties of these apps when they’re used around the world. But WhatsApp is a tool for that. WhatsApp made this unprecedented decision
a couple of years ago. So WhatsApp was bought by Facebook, and the two guys who founded WhatsApp were named Brian Acton and Jan Koum. They were actually Yahoo alumni, and they were super into privacy, especially in the post-Snowden era, and they had always been planning that they were going to
encrypt WhatsApp chats so that those chats
were outside of control of WhatsApp itself, and
after Facebook bought ’em they made that decision
and actually completed it, which is probably the largest uplift in interpersonal privacy
in the history of mankind that like over a couple month period, a billion people all the sudden were able to talk to each other without governments or
companies, in this case Facebook, being able to see any of it. So Facebook sees none of those chats. Facebook can’t moderate
those in the same way Facebook moderates the
Facebook app and Instagram, and so what has happened
in India and Brazil and a couple other places
where WhatsApp has taken over, is that the texture of
misinformation looks very different. It’s injected via true
believers of political parties, and then those true believers amplify it and send it into groups. So like in India, people will have groups of dozens or hundreds of people of their extended family
and cousins and all that, and so a true believer
in one of the parties, let’s say the BJP ’cause they’re probably the most active in this, a true believer in the PJB,
BJP, in Hindu nationalism will send a Hindu nationalist
anti-Muslim message that was fed to them via WhatsApp by somebody who works for the party, and they will amplify
it and they will send it to all the dozens of WhatsApp
groups that they’re part of, and then that will get amplified and the people will
forward it and forward it. And so from WhatsApp’s
perspective that is mostly opaque because all that
communication is encrypted, and so the WhatsApp problem
is a really difficult privacy versus information
integrity problem, ’cause effectively if you
wanted to completely stop that, you’d have to one, drop encryption, and two you’d have to build AI to spy on every single person’s
communications, right? And so that becomes a really creepy future versus the Myanmar problem
which is much closer to kind of the Russian
interference, except it’s domestic. The Myanmar government themselves are running fake accounts
and running profiles and then using it to push
anti-Rohingya sentiment. – Right. Are there parts of the world
that you’re worried about? – So when I think of the two elections I’m worried about next year, the Indian election is probably absolutely number one, right? It is a critical election
in their history. It is also that the BJP is both a major complainer about WhatsApp misinformation and, it looks like, the largest purveyor of it. So the call is coming from inside the house. And as anybody here who’s ever texted with an Indian teenager knows, the messages are a combination, in the Latin character set, of English, Hindi, Gujarati, Punjabi. You can see the future when you text with a young person in India, but it is also completely opaque. You can’t just run it through
Google translate, right? It’s not the kind of thing
that it would be easy for a foreign country to influence, except maybe Pakistan, and if I was in India I would be worried about Pakistan, because
of the shared languages and shared movies and culture
that make that possible, but it would be very difficult to build an information operation
in the United States to target India, because of that issue. But because it’s domestic and it’s being pushed by people who are aligned to the parties, it is also, I think, in some ways much more dangerous, because the messages are very finely tuned for the sentiments of very
particular places in India, and that makes it a really
difficult problem to deal with. – [Dawn] So India? – And then probably the other election I’m most interested in is the European Parliamentary elections, because–
– Talk about that. – There have been these
massive right-wing victories in Poland and Hungary at the state level, but there has yet to be an election where the populist alt right, as we call it here, has acceded to the European Parliament. The Russians would absolutely love that. The GRU activity is really targeted against the multinational institutions that pull Western countries together. They don’t like NATO,
they don’t like the EU, they don’t like the Five Eyes, the alliance the United States has with our anglophone allies, and so anything they can do
to fissure those alliances, they see as strengthening
their position in the world. And so getting people elected
from Italy, Hungary, Poland, maybe a couple other countries where there are regional parties that align with their interests, getting them to the European
Parliament would be a huge uplift. The other interesting
thing is this will be the first major set of elections after the passing of GDPR, which has a number of–
– And maybe say GDPR– – Sorry, yeah, right. So GDPR is the General Data Protection Regulation, a massive new overarching renovation of European privacy law. There are two issues with it. One, nobody knows what it means yet, because the way this works in Europe is the European Commission passes this very generic language, and then that language is made real by 28 different national
data protection authorities and then in Germany, because
of a historical artifact, they have 12 state data
protection authorities that get to make their
own interpretations. So that’s fun for people. That’s fine, that’s just like
a cost of regulation thing. The interesting thing about GDPR when it comes to elections is that it has a number of pro-privacy moves that can be weaponized,
and because of some quirks in the relationship between
European governments and US tech companies,
specifically the inability to share data with the governments without the US government
as an intermediary, there’s a real risk of the abuse of GDPR, particularly a part called Article 17 by organized actors
like the IRA or the GRU, and so it’ll be interesting
to see if they figure that out and are able to figure out
how to weaponize privacy law on their behalf to make it difficult to do these investigations. – [Dawn] One last election question, then I want to turn to what
you’re doing here at Stanford. Brazil. So? What happened there? How was social media involved
do you think in that? – Right so it’s hard to know overall because what you get are
these little snapshots from people who do research, and the best research has
been done by a university, the acronym of which is UFMG. I won’t try to pronounce the name in Portuguese and insult them, but what they did is they built a project where their students
infiltrated pro-Bolsonaro groups and measured how much of
the kind of image memes, the images, you know, like here’s an image and it’ll have some text on it, right? How many of those were misleading? You know, something like 50%. So they did some work, but it’s very hard, because of the encrypted nature, to understand how much reach that had and what the impact was. There are also people on the left who are pushing the same stories. See, that’s one of the things we’ve got to think about: every political campaign for the rest of our lives is going to have an online component. So if you look at Facebook or WhatsApp or any social network, if you’ve decided that
that technology is evil, you can really justify that to yourself, because you will only see the stuff that supports the other side. But the truth is, right now in the current election, Democratic candidates and Democratic groups have way
more engagement on Facebook than the Republican side. There’s a good New York
Times article about this, and so you gotta be careful
when you make these comparisons to understand that
there are two sides there. I mean, what happened with Bolsonaro, I’m not an expert in Portuguese, I’m sorry, in Brazilian history, but you know they have this history of military dictatorship. They have a huge crime problem. People are looking for the big-brother figure, and he won by 10%, so this was not a Trump-like victory of losing the popular vote and
winning by a technicality. This was a resounding victory, and my message to people in Brazil around things like WhatsApp encryption is you gotta be real careful in that Bolsonaro had a lot of
corporations behind him, people who owned newspapers,
people who owned TV stations. Because WhatsApp is encrypted, the head of the Brazilian office of Facebook has been arrested twice already, because Facebook has not dropped encryption to give data to the Brazilian state. And Bolsonaro now controls the Brazilian state: he has the intel agencies, he has the federal prosecutors’ office, he has the federal national police of Brazil under his control. The fact that they do not have access to people’s personal communications is gonna be one of the only asymmetries that is somewhat evened out, and it’s by an American tech company. So there’s this weird dystopian
future thing going on here, and we gotta be real careful about assigning blame
to private communication for this kind of stuff, because
I think that in the long run you don’t beat the powerful
by giving more governments and more corporations control
over people’s speech, right? – [Dawn] Yeah. I know I have a lot of friends in Brazil who are very worried right now. – Right, and I think they should be. I mean, he says horrible, crazy stuff, and we’ve lived through two years of Trump, and the things Trump says are shocking. Bolsonaro is shocking even after that, so it’s unbelievable. I mean, I don’t really understand it all, but my hope is that there are still some institutional, you know, brakes on what he can enact, and also every Brazilian president in the last couple of years ends up getting indicted for corruption.
that generally is bad, but in this case might keep him from putting a truly fascist state in place. – Right. So you come to Stanford now, with the intent to improve the security and safety of the internet. Just a small, tiny job.
– Yeah. – What can you do from
your perch at Stanford that you could not do
while you were at Facebook? – That’s a good question. How do I say this? When I was at Facebook, people would ask me, do you like your job? And my answer would be, I’m glad I’m in it, which is different than I like my job, right? And there is a benefit to being on the outside of, one, being able to think about these ideas not just in an emergency, right? I see now my friends on that
side who are still there who are jumping from critical emergency to critical emergency an they’re never able to
take a step back, right? So I have the ability of
having some of that experience and now being able to chill a little bit and not get a call at two a.m. that a child’s been kidnapped or that somebody’s been arrested by the Secret Police in Thailand. – So you get to go trick or treating with your kids tomorrow night. – Yes, for the first
time in like four years I get to actually go trick or treating. I was in Africa one
year getting yelled at. You know, yes it was, that is nice, but yeah. And so I do have the ability to kind of think about that kind of stuff. I think the other thing we’ve
got to do in academia is you know I come from a very traditional information security background, right? As effectively a teenage hacker, Berkeley double E, studied computer security academically, worked professionally, I started a company of
professional hackers. But then when I got to Yahoo, and then especially Facebook
it became readily apparent that the vast majority of bad things that happen to people online have no interesting
technical component, right? They’re not related to all
of the sexy security bugs that those of us in the security
industry are interested in. They are, because of what is termed abuse, which is the technically correct use of technology to cause harm. They are the bullying and
harassment of everybody, but especially women,
especially minorities. They are teenagers telling each other to commit suicide on Instagram. The sexual abuse of
children which is actually the worst thing that happens
any day on the internet, and nobody ever talks about it, but it’s just like a
horrible background level of lives being destroyed, and that happens all the time and it’s not an area
that people talk about, even though these companies
have huge teams working on it. And so there is no academic
discipline that studies that. For information security, computer science departments study it, but they’re really, you
know you don’t get a PhD in computer science for
studying spamming or phishing because it’s not technically
sophisticated enough, and so one of the things I’m
trying to do here at Stanford is to create a center
where we can pull people from different disciplines to work on it, and there’s a computer science component. There are people from communications, there are people from
psychology, area studies right? So for example, one of
the things we want to do is have a project to examine the misuse of
WhatsApp in the Indian election, and that’s going to require
computer science students to build the technology
that allows us to collect up information from all over the place. It’s gonna require people
from the Center for South Asia who understand Indian politics. It’s gonna rely on the fact that we have this incredibly diverse student body, so we can pay research assistants
to sit in their dorm rooms and to translate tweets for us. Like I said, there’s no way, we can’t run through Google translate the stuff that people
say on WhatsApp in India. And so those are the kinds of things that hopefully we can do in academia, and one of the things we
can do is you’re right, the companies don’t know
what they’re doing, right? And they’re making it up as they go along, and from the company’s side they feel like they’re besieged no matter what they do, they’re gonna get blamed right? If they don’t censor
enough, they get blamed, and if they censor too
much they get blamed, and there’s no way out, and part of the problem is
there’s not an infrastructure around the decision making processes that are happening in the companies. And effectively, all of the
important policy reactions in 2016 have happened inside of Facebook, Google, and Twitter. Congress has done nothing, right? The United States
Congress has done nothing. The executive branch has
done close to nothing to protect 2018. Those companies have defined
what is political advertising. They’ve defined the transparency rules around political advertising. They’ve defined the level of the standard that you have to meet. They have built information
sharing agreements between each other. – Do we want that to happen? – Well I think so. – I guess we want someone to do it. – Well somebody’s gotta do it. None of it’s legally required, and they’re just making
it up as they go along, and unlike when Congress makes a decision, and there’s think tanks writing papers, and there’s lobbyists coming
in to lobby for their side, they’re just kind of inventing it, and yeah in a mixed world
we can take a step back, and we can think about what
do we want these people to do? What are the principles that we want? What is the kind of
transparency that we want? And we can build an intellectual
framework within which people at Facebook and Google and Twitter and all these other
companies can make decisions that you know, they want to have the help to do that correctly. – Do they want that help? – Oh I think so, absolutely yeah. No and they’re absolutely
desperate for it right now, because right now no matter
what decision they make, they take a lot of crap from people, and that gets a little frustrating and it becomes like a
bunker mentality I think. They want some external, if there’s some external discussion of this is a reasonable set of
tradeoffs in these areas, and they can adapt to follow this reasonable set of tradeoffs from this group of academics, I think they'd be very
open to that kind of stuff, and that’s something that
we can do at Stanford in a way that other universities can't because we're so close to the companies. 'Cause we have the connections there. We have Hoover in DC, and so we can serve as a bridge between these different groups in a way that nobody is serving right now. – So you seem very passionate
about these issues. What drives you to do this work? – Yeah. I mean the reason I got into security is the puzzle aspect, right? It’s really fun to hack into stuff. It’s really fun to look at a big complex system and to break it. And I did that for years,
and that was great, and then I realized like, it's easy to break stuff. It's a lot harder to defend stuff, right? Terrorists can blow up buildings. They can't design them and
build them themselves, right? And I think that’s one of the switches we have to make in the security world is to have as much
emphasis on the building and the protection side. But then I took the job at Yahoo because I wanted to have
some direct consumer impact, and it turns out the world’s way worse than I thought, right? We just, in Silicon Valley
we don’t build technologies that are safe for people to use every day. We don’t build safe tech the
way Toyota builds safe cars. If anybody in this room
goes and clicks on a link, and gets tricked into
giving up their password the same way John Podesta did, John Podesta, the man who was
the White House chief of staff and had a top secret SCI clearance and who is incredibly paranoid, if anybody in this room fell for that, what the companies would
say is like oh, user error. That’s your problem. But if you drive your
car into a highway median at 20 miles per hour, it doesn’t just explode and
burn you to death, right? You can’t, Toyota can’t
just be like well yeah, you really shouldn’t
run into walls, right? Like, but that’s how we
act in Silicon Valley, and I think that really makes me angry that we don’t build technology that is safe for people by default, or safe in the way that they use it. We build stuff that’s secure
if used perfectly sometimes. We never build stuff that
is safe for normal people, and I think that’s just a massive failing on behalf of the industry, and it takes people from the industry I think to effect that change. – [Dawn] So two last questions for me, and then we’ll have some
questions from the audience. One, what keeps you up at night? And two, where do you see hope? – I mean I feel like,
the truth is as a society we’re never going to, we’re never going back to the
Walter Cronkite era, right? And we'll see from the next panel, but that's kind of implied when I talk to a lot of journalists and
people that work in journalism. The truth is we're living through this crazy transition in that
we had this mass media era which was honestly unique, right? Radio, television, newspapers
in everybody’s homes. From all of human history, the ability for a small number of
people to communicate with millions of people is actually an artificial time, right? People used to be illiterate, right? And there used to be no information that you didn't get
directly from other people for tens of thousands of years, and then we had this strange
time in the mass media era when 50 middle aged white guys decided what was newsworthy, right? And there seems to
be kind of an implied, you know there's a number of people who kind of want to go back to that era. They can't say that, but
that is implied in their goal of talking about what social
media’s done and stuff. The truth is the marginal
cost of moving information has gone to zero. It no longer costs any money to move information around the world. That means there are no
more newspaper oligopolies. There are no more TV oligopolies. There will never again be
an informational oligopoly so you can wipe out
Facebook, Twitter, Google, all those companies and nothing changes because the fundamental
fact is billions of people have the ability to move information. Other people will rise
up into their place, and so you just have
to kind of accept that and move on to: how, as a
society, do we live in a world where there are no longer any gatekeepers, and there never will be? Or where, if there are, they're horrible. I mean there are places with gatekeepers like the People's
Republic of China, right? We’re now in a place where if
you have those gatekeepers, it becomes an incredible
mechanism of control. And I’m really worried
about a couple things. I’m worried can our society
survive in that world where anybody can run a newspaper, anybody can be a TV station? There are upsides. Black Lives Matter, the Me Too movement, things that were totally
valid problems 30 years ago, but nobody talked about them because the editors of CBS News, ABC News, NBC News, CNN, The New
York Times, The LA Times were all middle aged white men, and that was not a
problem for them, right? But in a world where everybody
has a pocket supercomputer that is also a TV truck, with the
ability to broadcast that around the world, those kinds of problems can be things that our society deals with. So that's what makes me
hopeful is that there are these grassroots, bottom-up movements,
and emergent things that are actually positive. How do we have those while also not having these crazy populist movements take advantage of it? And I'm just, I am worried
about us not figuring that out, and I’m worried about an
overreaction where we end up building levers of control. I think this is where I differ from some of my other progressive friends. There’s a lot of people
on kind of the center left that I agree with
politically in a lot of ways who are very excited to control the speech of other people, right? Or to control the information
consumption habits of other people, specifically
Trump voters, right? And my fear is that we
create these levers, and a future populist who is smarter and more effective than
Trump, gets their hands on it, and so my message to people when they think about these controls is: don't think of the power you would give Silicon Valley companies
with Mark Zuckerberg and Sundar Pichai or any of them in mind. Mark is a socially progressive millennial from Long Island, right? Imagine Peter Thiel,
the Austrian Ubermensch who believes the world was
better when women couldn’t vote, and who sued Gawker out of existence. Think about him running
one of those companies. What kind of powers to define journalism, to define fake news, to
control people’s speech, would you want to have in his hands? And that’s like, you know, we can imagine a world
where people we agree with make these rules up, but the truth is it's not only people we agree with who are gonna have their hands on those levers, so we gotta be real careful creating those levers in the first place. – Right, okay. Okay. We have got a few questions here. Tim Cook called for our version of GDPR. Would it happen or was it disingenuous? – I think Tim completely agrees
it would be better if his corporate competitors had their
business models kneecapped. I do believe Tim thinks
that is a positive thing, but I think what’s happened
is that in the United States, we completely lack a competent privacy regulator in the way European countries have one. The Federal Trade Commission
handles privacy regulation, but it can only act in the context of an unfair business practice or a deceptive business practice. So effectively they go
around suing companies that lie about their privacy protections, which just incentivizes companies
to have terms of service that say we can do anything to screw you with your data whenever we want, right? And they're like well,
it wasn't deceptive. And because the United States does not have a functioning
privacy regulatory system, we have created this vacuum that the Europeans have stepped into, and the Europeans are not very good at passing laws about tech. They do not have very
technically sophisticated people in the European Commission. The European Commission is not very democratically accountable, and there’s a massive problem
of the European Commission not having national
security responsibility, and so what the state governments are asking tech companies to
do is completely incompatible with what the European Commission is asking them to do, right? And so, we could do better in America. I think we should pass our own GDPR, but with a single privacy regulator who defines what it means, a single set of rules across
the entire United States, and taking in all those equities, and I think that would then
reduce the chance of Europe controlling where we’re going, because the Europeans pretty clearly don’t know what they’re doing. – Mhmm. So the Facebook annual report notes that as the controlling– – Somebody typed that up
while we were sitting here? – Yes.
– Wow. – I don’t know what they’re printing on. Mark Zuckerberg can vote his
shares in his own interest which may not always be in the interest of our stockholders generally is a quote. In effect this means he could
pursue policies at Facebook that facilitate the operation
of democracy in the US, even if it did not maximize
long run profits at Facebook. Do you think he should
devote even more resources to making democracy work in the US, and if so, what would
you advise him to do? – So I feel like that’s two parts. So in the latter, the thing
I said in the Drell Lecture that I think’s been a huge
failure of the tech companies is not finding a sustainable
model for journalism, especially local journalism
in the 21st Century. – Yes.
– Right? We’re moving to this subscription model that’s working fine for The New York Times and The Washington Post, but one that doesn’t work for,
you know I’m from Sacramento. Sacramento Bee is the sad shell of what it used to be, right? It’s all stories from other,
it's just incredibly sad to go home and to read the Bee, which doesn't even get delivered on Monday through Wednesday or something. It's incredibly bad. And you go to Estonia: Estonia
is 1.3 million people, so that's about the size
of Sacramento County. They have like five daily
newspapers in Estonia. Like how is it that we can’t
do this in the United States? So the tech companies have to find a model that is not subscription based, ’cause the other thing we’re
doing with subscriptions is you’re gonna end up with rich people getting good journalism,
and everybody else is reading Brietbart, right? And that is not a good future
where a small number of people with Wall Street Journal, New York Times, and Washington Post subscriptions are the only people who are exposed to that kind of good journalism. On the first one on
Mark voting his shares, I don’t think that’s a correct
statement of the rules. I’m actually really afraid of a future where there’s a shareholder
revolt against Mark, and that's because I worked
at a tech company, Yahoo, that had an active investor, and during that period of time the company did everything it could to
make Wall Street happy, and a tech company
making Wall Street happy is completely orthogonal to a tech company living up to its security, safety, and privacy promises, right? In the middle of finding Russians, I had people being laid
off from my team, right? Which is eventually why I had to go. You can’t fix stuff
when that's the context in which you're working. – I missed that, why were people being laid off from your team? – Because midway through our CEO's tenure she picked up active investors who cared only about the
short term revenue, right? And so they were cutting employees from Yahoo very very quickly, and so– – Oh yeah. – Mark is beyond money.
– Yes. – He doesn’t care. Again, he owns a big chunk, like he’s gonna be, him and Priscilla are gonna be the Astors or the
Rockefellers of the Bay Area. People are gonna go to Zuckerberg Hospital next to Zuckerberg Place and take the Zuckerberg
Highline, you know. They’re at that stage. He cares about his historical legacy. That is good. I would hate for there
to be a CEO of Facebook that cares about making
the quarterly numbers good, ’cause that would be a terrifying thing, and so the fact that Mark has that control I think is a temporary respite. That kind of control has
never really been tested. We’ve never ended up in a situation where you’ve got like the
Vanguards and the Fidelities and the other people who
actually own Facebook, you know revolt against
the founder control, and in a world where there’s a CEO who’s just pumping the stock, I think a lot of the investment that needs to happen would not happen. – Why were hackers able
to recently harvest the private data of nearly
100 million Facebook users? – So hackers were able to get to some profile information
on 39 million users because they found a security flaw that was actually in a privacy feature. It was a privacy feature that let you see how your profile
looked to somebody else, so that's called
impersonation in software. It is a very very dangerous thing, and there were three security bugs that you could put together
so that they could get a token that let them then see
other parts of your profile, but not all of it. They were able to do that
because software is fallible. Because it is created by
humans, and humans are fallible. Born of man and mortal. You know Facebook probably has the third most sophisticated software
security program in tech, after Microsoft and Google, and still when you write
hundreds of millions of lines of code, you’re gonna end
up having bugs like that. – So just sort of spinning off of that, so it’s my understanding
this questioner says there’s no way to fix a problem with total security on
Facebook or Yahoo et cetera. When starting these companies, were they naive to assume
they would not need security, or was it about financial
gains and worldwide usage that was the driving force? – So, I don’t think any
companies think they don’t need, any of these companies think
they don't need security. The standard
problem in Silicon Valley is that people don't build security until they have their first incident. The Yahoo issue is that Yahoo has
been dying for 10, 12 years, and just like
when a huge star dies, it causes a lot of damage
to the stellar neighborhood, when a tech company that has the information of hundreds of millions of people dies, it has a humongous blast radius, and that's the Yahoo story. On the Facebook side I
think on the safety issues, the growth mindset was a problem, right? That Facebook expanded much more quickly than it could handle the issues. On the security side, very few companies have put as much money into secure software
development as Facebook has, and still there are going to be flaws, and I’m not sure what to say about that other than the tradeoff there
is not really about money, but about the speed at which you develop certain feature sets
and you roll those out, and I think that’s the balance that they’re probably rethinking now is on the development speed versus some of the software security stuff that might slow down that pipeline. – [Dawn] Mhmm. Let’s see, you have so many questions. Let me see if– – Any other typed ones? ‘Cause I feel like we
should give precedence to somebody who brought
a laser printer to a… (audience laughs) – [Dawn] If Facebook is just a platform, what is its lifespan? Won’t people eventually move on, like all of my nieces and
nephews almost all the time? – Yeah. Yeah.
– Yeah? – I mean, 101 is littered
with the bleached skulls of all the companies
that have come before. (audience laughs) So I think that is one of the silly things, people talking about these companies lasting forever. They have these life cycles
where they get big, or smaller, or they merge and the products change. I think what Mark is
trying to do at this point is to see competitors, this is probably his
superpower of all the things he’s able to do as CEO is
he’s been really really good at seeing emerging competitors and then buying them while
they’re still affordable. He bought Instagram for
a couple billion dollars and that’s worth 100
billion on its own now. He bought WhatsApp for like, I think it ended up being in the 20 to 22 billion dollar range, 'cause there's stock and it fluctuates. Yeah, well it seemed like a
huge amount of money at the time and now everybody’s like that was the smartest decision he’s ever made. Now, how does Facebook continue to do that in the current anti-trust environment, which I think will prevent them from ever buying a
social network of scale? I don’t know, and so I
think from his perspective he now has to have that kind
of creativity organically or Facebook’s gonna have to, like Facebook could not buy Snapchat, so effectively the company
copied Snapchat’s product and crushed Snapchat under
the wheels of Instagram, and so whether or not
they’re gonna be able to continue to do that at a speed in which they do not become a
bleached skull is a question. – [Dawn] So you gave a lecture, the Drell Lecture a few weeks ago, and you
were quoted as saying our society will not
be healthy if we don't have a good economic model for journalism. And that somehow, nobody wanted to destroy the
newspaper industry you said, but this was a consequence
of allowing anyone to be a journalist and taking away things like classified
advertising economics. Is it Silicon Valley tech companies’ jobs to fix the economic model of journalism? – I don’t think like in a big picture
kind of moral or ethical sense, it's not their job. I think practically if
the companies want to not be destroyed by the rest of society, they have to, right? – [Dawn] What should they do? – I think we’re probably gonna have to, so one of the real challenges here is the advertising model
has this incredible benefit of a small number of
consumers in rich countries subsidizing the creation of these platforms that reach billions of people, right? So Facebook released their
quarterly numbers today. I didn’t see the exact number,
but I'm guessing the ARPU,
the average revenue per user, which is the amount of money
Facebook makes per user, in North America is something
in the 25 dollars range. Every year, every American
Facebook user is worth 25 bucks. In Africa it's like a dollar, right? So people in Africa have access
to all of these platforms because there are people
who click ads and buy stuff in Europe, North America,
and the rich parts of Asia. And so the question is
how do you build a model that replicates some of that without it being, like
the advertising model is not working for journalism. The subscription model like I said, one it creates a two class
society within countries, and then globally it creates a
huge class difference, right? And so how do you replicate
that kind of model where you can support
people creating content all around the world? I think maybe it’s a micropayment model. Maybe it is through advertising,
but via shared revenue. I feel like it’s not gonna be the advertising model, honestly. ‘Cause the value per
user is not big enough, and driving value out of advertising if you have to make more, that means violating people’s privacy more, and I think that’s just a
direction that we’re not gonna go, and we shouldn’t go. And so, it’s probably gonna be something like a Spotify model, right? Like you pay Spotify
whatever, 15 bucks a month. – [Dawn] Spotify for news. – Yeah, and people play, you know you play a Lana Del Rey track, she gets one cent right? And in aggregate, the amount
of money, same with Netflix. Netflix collects billions and
billions of dollars per year, and then they divvy it out to the content that’s played the most, and I think we might have
to have a model like that. There’s a number of
companies working on that. I don’t think any of the small companies will ever be able to do it because they just don’t have the scale. Really Google, Amazon, Facebook are probably the only three companies that could fix that problem. – [Dawn] Okay so we’ll be back
in touch with you about that. – Oh sure, yeah. – So I want to thank
Alex Stamos for tonight, and thank you.
– Thank you very much. (audience applauds) – Thank you, that was great. – Thank you for being on the
journey with us this month, and for being here for this final panel. Alex and Dawn discussed
platforms, misinformation, and the effect on the news and information ecosystem
around the world. Now our final guests will discuss some of the actions news
organizations are taking, or can take, to counter misinformation and to rebuild your trust. Our moderator this evening is the 2019 JSK fellow. She’s seated in the center. It is Mandy Jenkins. She was the first editor
in chief at Storyful, where she oversaw a team
that worked with newsrooms to find, verify, and
publish eyewitness media and social insights from around the world. She is president of the
Online News Association’s board of directors. To your left, Dr. Meredith Clark. Dr. Clark is a former journalist and an assistant professor in the Department of Media Studies at the University of Virginia. Her research focuses on the intersections of race, media, and power, and she has done deep research into social media communities, and their attitudes on
trust and the media. Seated next to Dr. Clark is Sally Lehrman. Sally is senior director of
the Journalism Ethics Program at the Markkula Center for Applied Ethics at Santa Clara University. She leads the Trust Project, an international collaboration to strengthen public
confidence in the news. Many news organizations
as well as platforms such as Google and Facebook
are participants in that. She is also a 1996 alumna of the John S. Knight Journalism Fellowships. Seated next to Mandy on the
other side is Aaron Sharockman. Aaron is the executive
director of PolitiFact, the largest fact checking
organization in the United States, which won the Pulitzer Prize
for national reporting in 2009 for its coverage of the
2008 presidential election. If you’ve seen the Truth-O-Meter measuring the truth of stories, PolitiFact is the organization
that gives you that. And seated next to Aaron is Lynn Walsh. Lynn is project manager for a
project called Trusting News, an initiative at the
University of Missouri which tests strategies
for improving the trust readers have in news organizations. She serves on the National Board for the Society of
Professional Journalists, and served as president in 2016, 2017. That’s your panel for the evening, and your final panel for the class. Take it away, Mandy. – Thanks Michael. So, as we’ve seen over
the course of this class these past few weeks,
there are lots of obstacles standing between
journalists and the public that used to be the audience. We have economic forces that
are squeezing local newsrooms, we have crackdowns on
freedom around the world, and now we also have bad actors using the tools of mass media
to spread disinformation. Sometimes the barrier
is all of these things and none of these things at the same time. It’s a failing of trust between newsrooms and their audiences, and between people and the institutions that they used to trust a long time ago, and that’s our focus
for this panel tonight. So in 1976, Gallup found
that 72% of Americans had confidence in the news media, and since that time, the trust
in journalism has plummeted. So, I want to start with
Sally here to my right. You’ve done research on how the audience interacts with the news and have seen some of this over time, so how did we get here? How did we get to this place right now? – Boy, there’s a question. So I think back to 1997, when actually the seeds of
the Trust Project began, and I gathered together
a group of editors, and we were thinking about
well what is it that, what kind of journalism ethics should we be thinking
about in this digital era? And I heard a lot from
the right at that moment, so this is you know, 20 plus years ago, about their concerns that
the digital environment was degrading the quality of the news, and was degrading the ethics of the news, and you know, fast forward 20 years and I came to Santa Clara University and had a faculty position and we brought together a
similar group of editors from around the US, and they were talking about the same thing, and
so I thought well we really shouldn’t be talking about this anymore. We should be trying to
figure out what’s wrong, and find a way to address it, and their concern at that time and four years ago when I brought the same kinds of people back together was again that the digital environment that we were just hearing about was really supporting clicks. It was supporting the worst of journalism because everyone was out there rushing to get the clicks, was rushing to write the kind of headlines that would grab people’s emotions, and they’d be setting aside some of their ethics for that reason. So I was thinking, can’t
we flip the picture and use the digital environment toward supporting the greater good, or the kinds of good that
journalism really does? And so this problem of trust in the news is not new, as you said dating back to the ’70s. I believe that a lot of it did start or it worsened with the
digital environment, and these editors were really seeing that this was gonna happen, and now of course we have
these incredible forces. We’ve seen it manifest. Incredible forces of misinformation of still the clickbait problem. We have increasing uncertainty on the part of the public
because of the misinformation and all of that. People are
getting 2/3 of their news from social or search today, so they're not seeing
the brand necessarily or you’re not seeing the brand. You’re going right to an article and you don’t necessarily
know where it’s coming from, and it looks exactly like all
other kinds of information, so of course there's a lot of uncertainty about where is this coming from? And then finally there
are deliberate efforts to undermine the public’s
faith in the news. Now, the other piece of course is that journalism has not always
done right by the public and we do make a lot of mistakes, and there are populations
that we have not covered well at all for a long time. So combine all these forces
in the digital environment with the problems that have been within the process of
journalism for some time, the business model declining, and now we’ve got a
really bad toxic equation. So what we did was, as
I was thinking about well how do we flip the picture? Where do we begin to
address this with the Trust Project? I thought well we can't keep
talking among journalists. It’s time to go out and
really talk to the public, and so inspired in fact by what
I learned here at Stanford, coming back as an alum to the
Knight Fellowship program, we did empathy interviews with the public and went one on one with
people and asked them across Europe and the
US, across race, class, gender, generation, and geography, what is it you value in the news, when do you trust it, and when don't you? And out of that to me
came some encouraging discoveries, and I’ll just
tell you briefly what they are and then I’ll let you
ask some more questions, but so we were able to
find four user types, and one is the avid news user, and that’s probably
many of you in the room, and those are people that are out there checking and cross checking the news, and then pushing it out through your networks to help guide others. Then there’s the engaged news user who is interested in the news, maybe subscribes, talks about
it with friends and family, but is a little bit
overwhelmed and uncertain. Then we have the opportunistic, who is the type of person
who just doesn’t have time to pursue the news,
and so it’s just water. They’re there when it washes over them maybe in the break room,
it’s on the television, or they get a notification on their phone. And then the angry and disengaged, and those are the ones that we worry about most in journalism. The part that I feel
is encouraging is that, number one, across all those groups there was lots of expression that news matters. It's just from the angry
disengaged point of view well, it matters to us but we don’t
think it’s doing a good job, and then across the middle there was a lot of uncertainty
about what to trust or not, so I feel that we can through things like what we’re
doing at the Trust Project, we can engage those avid folks to help people down the line, and so what we do is provide
these trust indicators to the public, and I’ll tell
you more about them later, but essentially provide the tools to help people know who
and what is behind the news so that each group can help the other to do a better job of really
becoming an informed news user. – Thanks Sally. Thanks for giving us the foundation for where we’re starting
off here today, and actually that's a great segue. Lynn, I wanted to ask you: at Trusting News, you guys do a lot, especially
when we’re talking about research into transparency,
media literacy you know what are some findings
that you guys have found in your work so far about
essentially the problems we've been talking about here with Sally? – Yeah so at the Trusting News Project,
local television stations, online only news organizations, and with their help we were
able to talk to their audiences, so people like you who
might read the newspaper, watch the news. And what we found is that primarily, people said, news consumers said that they want ethical,
responsible journalism. They want journalism that is going to show opposing viewpoints. They want journalism that’s
going to show context. They want journalism that is going to include multiple sources, and there were some outliers on there that said most of journalism is biased, but for the most part
people wanted journalism, just good, ethical,
responsible journalism. And the things that they
said that they wanted, what’s interesting about
it is they’re things that journalists are
already doing day to day, so once we get a story
we’re making multiple calls. You may only see that one person on TV that we talk to, or you may only see one quote in the newspaper,
but there were multiple people that we talked to, so
that’s just one example. So what we’re trying to do
at the Trusting News Project is because journalists are doing what the public wants them
to do to be responsible journalists, but journalists are doing a lousy job, basically,
of showing the public how we do those things, and
that we even do those things. So we're working directly with journalists to get them to explain more
about how our process works, talk and be very very
specific about what is news, what is opinion. Also being very careful to
not use journalism jargon in everyday reporting. So basically just to be more transparent with everything that we’re doing so that the public is
very aware that we are doing these things that you want us to do, and the goal is that if they
know that we’re doing this, can we meet at a point then to then talk about the content that we’re sharing, instead of having these
assumptions that are made that generally are negative. – Thank you. I definitely want us to
return to that concept about transparency and media
literacy a little bit later, but I want to shift for a second ’cause we talk about the public, and that’s a lot of people. That’s a pretty wide look at that, but I’d like to shift to look
at some specific audiences we’re talking about trust. So Dr. Clark, you’ve done
some really great work looking at social media subcultures
and minority communities and I’d really like to
know how does a disconnect between what’s discussed
in those communities, and then what is reported
in mainstream media about those communities affecting the trust there? – Some of the things I’ve seen, I study specifically
one Twitter subculture called Black Twitter, and Black Twitter is comprised
of black users of Twitter who use the service in
a very distinct way. One that’s easily surveilled online. The way that Black Twitter users use the platform to
communicate with one another is heavily influenced by
culture and cultural references. It is embedded with symbolic meaning, and it has context that is
often missing from news stories. Some of the problems that
we’ve heard about historically have to do with a lack of
coverage of certain communities, including black communities, and so some of the things that
you see Black Twitter doing is that in the coverage that is deficient of the backstory, the context, the history that you would
want to see in journalism about your community, Black Twitter provides for one another. – So how much, and kind
of following up on that, you know how much is sort of the culture within these communities on social media I guess somewhat in conflict with the lack of diversity in newsrooms covering that same community? – Well I would definitely say that there is a correlation
between the two. I am currently running the American Society of News Editors
Newsroom Diversity Survey which was initially developed in 1978, 10 years after the Kerner
Commission report came out and made this indictment
of national news media for failing to cover black
communities in particular adequately and fairly, and since 1978, the survey
has tracked the number of journalists of color
in different newsrooms across the country and also women in newsrooms across the country. And some of the things that we’ve seen is that since about 2012,
there’s been kind of a plateauing of the number of journalists
of color in these newsrooms, so what we're seeing online
in these social media spaces is people making up for those deficiencies, providing news that the newsrooms, absent those journalists, can't. Some of the reporting that you've seen from places like Ferguson on the ground coming directly from
social media activists, some of the things that you’ve seen in terms of breaking down
social media movements like the Black Lives Matter
movement as they unfurl has come through these
communities on Twitter. So it’s people who are able to, because of their access to the platforms, the speed of the platforms, and their knowledge of very specific and specialized subjects are able to do some sort of reporting that is different than what we get from mainstream media. – Thank you. So you know, when we’re talking about trusted news organizations,
and I think that probably is one of those things that
depends on who you ask. I’m sure from what we
know here in this room versus maybe some other people out there. We know what it is that we trust. Now PolitiFact is often on that list, and so is Snopes. I know my mom who is a
big not truster of media, especially considering I’m in the media, but she loves Snopes. She calls it Snoops. She will send everything to
Snoops to make sure it’s real, and PolitiFact it really has been a big mover in that space as well, especially when it
pertains to political news, so Aaron I’m interested to hear from you, you know is it the case that
PolitiFact’s really trusted? Do you guys still kind of get charged with being the fake media these days? – Sure, yes. Well first thanks for having me, and since 2016, fact
checking is kind of in vogue, and so I get invited to
a lot of these panels. This is the first one where I’m the only guy
on the stage, and so. (audience applauds) Kudos to Michael and
Dawn for putting together a diverse panel, that’s excellent. Yeah, the answer is yes. So we just fact checked a claim by Ted Cruz earlier
this week or last week. He had claimed that since
the president’s tax cut bill had passed, tax revenues had gone up, and he was saying that this is evidence that the tax cuts are working. More people are having
more money in their pocket, and still yet more monies are
coming into the federal bank. The facts of this are nominally yes, there’s
a little bit more money that the government is taking in, but by every kind of
actual important measure, it’s the lowest growth in
revenues to the federal treasury in decades. When you adjust for inflation, it's actually down, things like that. So, we rated that claim half true, and when we published the
claim we said half true. The number is nominally
up, but when you kind of really peel back the layers
of the onion you’ll find a lot of problems with this. Ted Cruz gets on Twitter and
says liberal lyin’ PolitiFact out to get me again. They literally said revenues have gone up and they still say half
true, how could they? 10 minutes later, Paul
Krugman, New York Times, who you’d probably consider
him a progressive or liberal. I really don’t agree with
PolitiFact on this one. They were way too easy on Ted Cruz. (audience laughs) I kind of take that as a
badge of honor on both sides and say maybe we’re in the right place, but the point is when
it comes to politics, and in the partisan
space we’re in right now, it is very difficult I think for anyone to truly trust anything. At PolitiFact we try every day to be as transparent and as
fair and objective as possible. We were probably, and I
could be wrong on this, but as far as I know
when we started in 2007 we were probably the first website to literally have a bibliography with every newspaper article we wrote that listed every source we considered, every source we talked to, whether or not they were
included in the final article. We had hyperlinks to all
of our source material. I’m told this is terrible for SEO, but I really like a paragraph
that says something like if you want to see the report, click here. We do that so you can read it. We try in everything we do to
be as transparent as possible so you can help make decisions about who you’re gonna vote for,
what policies to stand for. That all adds up to still
we’re trusted a little bit, and sometimes not so much. After the 2016 election we did a lot of kind of soul searching and found that there are
truly pockets of people who defend us, and maybe some of you guys are here, thank you. But then there are, thank you. (scattered applause) But there are a lot of people who try to discredit our work, and do it for political gain, and so we are still looking, trust me, at new and different ways to
try to create and build trust because we don’t think,
we’re not even close. If we’re trustworthy in your eyes, great, but I think we have a lot more work to do. – So coming back to you Sally, I’m interested in hearing and I know you referenced it earlier, but about sort of the work that you’re doing with newsrooms specifically about this right now. We’re talking about solving
some of the problems, so what is it that you guys are doing, especially with those
metrics applied there? – Okay, great. Yeah I love telling
about the Trust Project. So one of the stories that I think really helps see why what
we're doing makes sense is that I travel a lot, and on one trip going back and forth to SFO, I talked to the
cab driver both ways, and on the way there I thought I had a pretty liberal cab driver, and
we got to talking about what I’m doing and he says
you just can’t trust the news, and I said well why? And he said because it’s
completely controlled by big business and government, and then on the way back I got a driver who I think was pretty conservative, and again got to talking about what I do and he says you can’t trust
the news, and I go why? Well because it’s completely controlled by big business and government. So what the Trust Project is doing is working with newsrooms around the world to help give the people
the information they need in order to understand
how journalism is distinct from all other kinds of information that really are controlled
by business or government. That this is the only kind of information that is truly designed to
serve the public interest. So what we do, we have right now we have 120 news sites showing these things called
the trust indicators, and they are transparency factors that provide information to you built from the user research we did, and from work done by senior executives from 80 different news organizations who came together to marry what users said would help them understand
whether or not to trust a site, married that with journalistic values, and so we have eight trust indicators and what they show to the public is information about who and
what is behind the news. So what are the principles and policies that this organization stands by? So what is their ethics commitment? What is conflict of interest? How do they deal with that? Do they have a commitment to
bringing in diverse voices? What do they do when they make a mistake? How do they make their corrections? Where and when do they place them? Information about the author. So who is this person
that’s producing the news with their expertise? Labels. Is it news, is it analysis, is it opinion, is it some sort of content that has been paid for by the advertiser? Because we heard a lot from
users that it felt like journalism was sort of
merging those things, and people were unhappy about that. Also, getting into what we heard a minute ago, how do you know what you know, journalists? So for some stories,
more information about the actual sources that
went into that story, just like PolitiFact is doing, and finally information about
is there local expertise built into this story? In a way is it locally sourced? Do they know about my demographic group? Do they know about my region? Were they actually there? And how much public engagement does this news organization
really engage in? So these are all shown to the public, and then there’s also signals that are provided to the
news distribution platforms connected to those things that you see, and the idea there is
because so much is seen through these social
media platforms and search that can we help them do
a better job of surfacing and lifting up trustworthy sources? And as you’ll see, the idea
here is to provide information to the public so that you
can make informed decisions about what you want to trust
and what you want to share as opposed to just sort of
handing you this evaluation based on a black box of yes
this is good, or no this is bad, and as a result I hope we can really shift the equation
so that the public can in fact own these trust indicators and start having a stronger say in holding news organizations accountable, and news organizations will hold themselves more accountable too. – Thanks. You know kind of taking off on that theme of the media
literacy and transparency and I know that that’s
something that both Sally, Lynn, I’m curious to know why is it
that we are very much invested in the idea that media
literacy and transparency is going to win back trust? – So I think it comes back
to what I mentioned earlier and the fact that the
public doesn’t understand how journalism works, and doesn’t know how a journalist does their job. And why would they? We have never told them. It’s just like I’ve never run
a kitchen at a restaurant, so how would I know how to do that, right? We haven’t kind of torn back
the pieces to show them, so the idea is that if we can explain, and if we can tell them
how we do our jobs, then instead of them
having to make assumptions about how we do it, which
normally turn out to be negative, we can meet at a place
of mutual understanding to maybe discuss how the
journalism could be better, things we maybe could
have done differently instead of this negative
assumption that comes into play. I'll give an example,
one that’s very common, the use of an anonymous source. I know anytime I ask
a member of the public that's a non-journalist what it means when someone has used an anonymous source, I get varied answers. A lot of times people assume that we've never even met the person and don't know exactly who that person is. Well, it's actually the complete opposite. If I'm an editor and my reporter wants to use an anonymous source, I probably know the name
of who that person is. The reporter definitely knows the name of who that is in most cases, but we’re concealing their
identity for a certain reason. So it’s not that we’re just picking someone up off of Twitter and just using the
information that they give us. So if we can kind of meet at a point where these assumptions
don’t need to be made because we are informing the
public of how we did the job, then hopefully we can kind of
have this mutual understanding to build off of. The other goal that we
really want to achieve is to have news organizations increase their engagement
with their audience. So this was mentioned
on the panel earlier, you know it used to be just
a bunch of old white guys that were making decisions
about what content was given. Well, in some cases that still is true and we need to be better, but can we then as journalists ask our communities what were we missing? Did we miss a certain
element of the story? Is there a community that we
left out of the conversation? And when we ask that and
we get that feedback, we can then decide to act on it and hopefully make the coverage better. Maybe go out and do an additional story involving that community
on that same subject and linking it all together. So it’s also this engagement
component as well. – Yeah, and you know I know
in the study that you did, Dr. Clark, there was
mention of this as well that really understanding
how the gears are working behind the scenes very much affecting sort of the minority community’s different responses to the media. Can you kind of enlighten me on that one? – Absolutely. It’s interesting that you mention not just taking a source from Twitter and kind of plopping it into a story, because that was one of the problems that the individuals that we talked to for our specific report
on Twitter subcultures, black Twitter, feminist
Twitter, Asian American Twitter mentioned when they talked about the process of making the news, that instead of source development being something they saw
with their communities and the potential for engagement on these social media platforms, what they saw was journalists swooping in, taking the conversations
that they were having, and using them without proper context. And so one of the problems
that they talked about was a lack of engagement that goes beyond what we know as the traditional
shoe-leather reporting where you got out, you met
people in the community, you spent time in the coffee shops and you went to the different houses of worship to meet people when you can see them all online, or at least the ones who use
those social media platforms. So they were looking for more
ways to engage with people to get them to actually not just take from those communities, but actually engage in
equitable relationships with those communities, give and take. Being able to exchange information and thus inform coverage that way. – Now this one I’m kind of interested in getting a quick take
from all of you on this one because it’s intriguing to me. Just last week Pew came out with a study that said that research is now suggesting that younger people are
better at discerning fact from opinion than older audiences, and I'm wondering if this
gives you guys some hope that some of this idea,
kind of the trust factor and the transparency, media
literacy, that it’s working and if there’s hope for
the future based on this. Who wants to take it first? – I’ll go first, no. No. I actually I still think
we have a big issue, and younger people have huge issues. Younger people, so y’all know
what The New York Times is. It means something to you. Young people, it doesn’t
have the same cachet. It doesn’t mean as much, and when you consume your news
on a platform like Facebook, now they don’t do it on Facebook
but Instagram or whatever, all the content looks identical. That’s what Facebook’s really good at. And platforms, they make the
content look exactly the same. If you search on Google,
the result for junknews.com versus highly trusted reputable
source look identical to you and in fact the bad guys
are getting really good at mimicking what the good guys try to do. And so I think we’ve really done a poor job educating middle schoolers and high schoolers how to be better at
discerning real from fake. The Stanford History Education Group here has done amazing research
in developing a curriculum, but I think they show
that this is a problem, and so I guess what I
would say is I trust Pew, I think they’re reputable, but if they say younger audiences are better than older audiences, I would say they’re both really bad. How about that? (audience laughs) – Everybody’s bad. – So I actually disagree
a little bit with that. I’m a millennial and I very much own that, and I think that the
younger audiences are better than the older audiences
because they know social, they know digital, they
know how to fact check, they’re used to fact checking, they’re used to seeing one article and then looking to
see where else it goes. That’s how they are consuming information. So to me when I saw that
study I wasn’t surprised. I do think it gives me some hope, but it actually really really worries me for the immediate future,
because I was doing some research about what media literacy
programs are out there for a story that I’m writing, and a lot of the projects
are based on K through 12, which is great. It’s good that it’s there, but there isn’t a lot that’s focused on once people are college
age, people my parents’ age, and my mother doesn’t
know how to use Facebook. I love her dearly, but
she doesn't know how to, so there is this audience
because it’s not something that they know how to use as well, and so can we reach them? And as far as what I was seeing with just the groups that
are working in this space, most are working with K through 12. A good thing, but we need
more to work with people that are not necessarily of school age. – Well I would just say, one, I resist any kind of
general characterizations by age group or any other group. The study itself I think really ended up evaluating more how attached
are you to your perspective as opposed to how well do you
discern news from opinion? Because it had to do with people assessing specific statements and is this fact or is this opinion? What we found in our user research, and as I said we went across race, class, gender,
generation, geography. We found these kinds of
problems at every level. We also found these avid news users in younger people, in older people. Sometimes we were really surprised, I mean unfairly surprised, about how amazingly
adept older people were at figuring out what they
were looking at online and using the social networks in order to gather information from a
lot of different places, check, cross check, and push it out there. Millennials were the same way, and some of the ones that we talked with, and really felt in a way it was their duty to get out there and correct the record. So I feel, I’m just
naturally optimistic anyway, but I think there’s a lot of reason to be optimistic across generations. – I’m always naturally cautious, and that caution comes from
working with young people at the collegiate level, and then also coming out of newsrooms
and always wondering what is coming next? And so on one end I’m very encouraged that young people can pick out statements and separate fact from fiction, but I’m concerned about technologies that may disrupt their ability to do so, and what I am thinking
about is heavy engagement with visually based technologies, specifically with social media, Instagram being one of the
most highly adopted platforms among younger people,
are being developed now that can change everything
from the appearance of a person to the way that their voice sounds. How do we train young people to be able to detect
what looks real to them? And so that’s one of my primary concerns. I would add that there is a study that was released very recently, I believe on October 12 by
Project Information Literacy. It was headed by Dr. Alison Head, and it looks at the way young people in college and high school in
particular challenge the news, how they process the news,
how they share the news. So there’s a little more
work that’s being done. It’s breaking news, I guess. About information literacy
among older populations. Older in college, but
there are a few things that are still out there
where people are working with the Gen Z I believe,
after the millennials. – So, it’s a mixed bag. We’ll take that, we’ll take that. – [Aaron] Half true. – It’s half true, exactly. (audience laughs) Well knowing this crowd,
having sat in this class over the last few weeks, I know everybody wants
to talk about politics. I know that that question’s
come up all the time. So Aaron, you know looking at
the coverage of US politics, and especially when we’re
talking about the midterms, has fact checking helped in
terms of all of these issues in terms of media literacy,
misinformation, trust, since it’s been around,
you know, for a while now? You guys have seen it all in a way. How much has it changed now? – So I guess my first answer is I hope so. So I’m optimistic about fact checking. No I think that you know
after the 2016 election, everyone kind of looked
at fact checkers and said, Trump lied way more than
Clinton, but Trump was elected. You guys failed at your job, right? We never saw that as our job. You know, I think we can go into the specifics of the election,
but I’ll set that aside. Our job as a fact checker is not to tell you who to vote for. I don’t want you voting for
the most honest candidate. I don’t think that’s
necessarily the best measure. It’s a measure, but it doesn’t
have to be the only one. You know I think we have
been effective over time. We started in 2007, so our
first election was 2008, and 2016 was our third
presidential election. I think we have kind of
really broken through as a type of journalism so that newsrooms across the country now are having a conversation
about when they tell a story, are they gonna do it through
a Q and A or a narrative or a video or a photo? I think now telling a
story as a fact check is something that a lot of
newsrooms are thinking about, which I think is really good. The research on fact
checking is amazing in that, you know it doesn’t change
how you vote, necessarily, but you as a reader when
you read a fact check you retain the information
better and longer than if you read a straight news story. And the second thing, which
is the president aside, the threat of being
fact checked often times makes politicians more
cautious and careful about what they say,
which is a good thing. I can’t measure it, but
there is research that we did in some of the states
where we have relationships where lawmakers were split into three groups.
group, nothing happened to them. Group two got a letter
that was kind of like League of Women Voters
style and it was like hey, you know it’d be really
good if you told the truth because it’s good for
democracy to tell the truth. Please tell the truth. The third group got a letter that said PolitiFact is in your neighborhood. They are gonna be watching
the things you say, and if you say something wrong, they will set your pants on fire. Don’t lie. They tracked fact checks for a year. The group that was least likely to tell the falsehood was
the one who got the threat. And so I think there’s
an effect we're having. It's not nearly enough. We are, Michael said we're the largest fact checking organization
in the United States. We are 12 people, okay? (audience laughs) And I’m here with you, and the president’s probably somewhere talking. (audience laughs) So we can’t do everything, so the other thing we
have to do is train you and everyone to be their own fact checker, and so like Sally’s trust indicators? One of the cool things that they also do is they kind of work in reverse. So if you look at her indicators and say what websites don’t have that stuff? They could make you say I should be a little skeptical of that. We need to be teaching you how to do reverse Google image searches, how to do lateral reading which is to kind of move from website to website, how to look for proper sourcing, how to see if you can see on a website oh I can contact the author. That looks like a real person. That’s a good sign, you know? So I think we have a lot
more to do in that regard, but I think we’ve made progress over the 11 years that we’ve
been doing it, and I'm happy about that. – Yep. I have one last one I'd like
to hear your guys’s take on before we open it up to some of the questions
from the audience here. And we have a study that
came out a few weeks ago about the spread of news deserts, which we talked a little bit
about here last week too. The loss of local news. Places where there’s not
necessarily a local newspaper, a local TV station, or
a local media anymore, and I’m curious to hear
especially you know, because you’re coming from
some different backgrounds and have talked to a lot
of people in the field, how much is that affecting trust and affecting the work that you’re doing? To whoever wants to go first. – One of the biggest
complaints that we hear, that our partners have heard is that people don’t understand if they’re reading their local newspaper why something comes across
and it says Associated Press, and that makes them either very angry or they don't understand you know, how that content got in there and how the local journalist, were they involved in the process? Did they have anything to do with it? Any of that. So, the news desert, right? If there is this news desert, if there are fewer journalists, you're going to be using more content that is coming from the Associated Press, that's coming from maybe
your corporate reporter that’s in DC, any of that, and what we found is
that this generally makes audiences sort of question, or maybe distrust, the
information that’s coming because it’s not coming
directly from their local news organization,
or that local reporter, and they wonder why didn’t
you send someone to DC, and so what we are seeing
is that’s a complaint and kind of a cause of distrust when they aren’t seeing
their own local reporters reporting the content. – Yeah, I just feel what’s happening to local news is devastating to not only the information system, you know, the quality news we need to be getting from every level, but
also to trust overall, because one of the things, when I told you that people are interested,
like they want to know does this journalist
understand my community? Does this journalist understand me? And as Lynn can tell
you, there’s strong trust in the local news organization. So if you’re just getting news
from a national site only, and you don’t see your own community, your own selves in the news, then you’re much less likely to trust it, and that’s something we
heard across the board, too. We heard people saying we want to hear from people like ourselves in the news. We want to hear from people
unlike ourselves in the news, not just at high levels of
business and government. So when you think about it, that's
what national news does best, and international news
does best: to bring you the voices of people at high levels in business and government,
but not the local voices, and national news tends
to pull from local news, so it really does undermine the quality overall of what we're getting.
our definition of local news and what we’re thinking about when we’re talking about local news. If it’s the papers that were
run by major corporations that used to be in our towns that have shrunk in size, that are no longer there, they’ve left, and it’s
not so much a news desert as it is a news ghost town. There were people that were there. This isn’t a naturally
occurring phenomenon. I’m curious about whether we’re including the perspective of ethnic newspapers, of specific language based newspapers, and organizations that have been there, that have been reporting
on the communities, but are overlooked because they are not what we continue to think
about as local news. So I'd start by complicating the question, and then I am not, I would say, as concerned from the perspective that this is something with a single solution
that we need to address. I think it has great
historical implications in that if news organizations had established stronger
ties with all of the communities that they serve, I ask the
question whether they would have more buy-in from people who
could potentially subscribe to the paper, who could support
the advertising revenues, and patronize the advertisers
that are trying to cater to a number of diverse audiences. And I think about that in terms of my own hometown’s newspaper,
the story that gets me is that in 2004, I was 24 years old, and the paper ran a
correction on the front page, and it was about 22 words. It said it's come to our attention that
the Lexington Herald-Leader has failed to cover the
Civil Rights Movement. (audience laughs) The editor regrets the error. (audience laughs)
– Wow. – So some 50 years later, admitting that there is a
whole segment of history that as Alex mentioned earlier, a few white guys in a room decided was not important enough to cover. That alienates an entire
generation of people. Those people are potential subscribers, they’re potential contributors, and having made those choices then, we’re seeing the effects of
them play out now in local news. – Well we just got, that was good. (audience laughs) I’ll simply say, I always gotta talk. In the fact-checking world, local fact checks are way more tangible, and so they do build trust, and so we’ve seen this
play out in the work we do. If I told you the traffic in
Palo Alto is just excellent, and the roads are always free and clear, you would throw tomatoes at my face because you know that’s not true. If I told you that there
was a caravan of you know, angry migrants coming to
knock down the border, you can’t see that with your own eyes, so then you have to come
to your political beliefs and say whether or not
you think that’s true. So I think the more we
can do at the local level, of course you can build trust
because you can participate in the stories in a way you can’t as we, when we don’t cover
local communities and issues. – All right, we’re gonna go to
some audience questions here. This is a great one for us to start with. So would you all address
the case of two sides? You know the binary he-said, she-said, this person said this, this person said that approach to news where it involves you know, discriminating exclusionary
racist issues especially. You know how that’s affecting our news, but also what the danger is for all of us when we’re focusing on this
he-said she-said situation. And this is to all of you, so whoever would like to take that one. – I can certainly speak
about concerns about this dichotomous approach to journalism that insists that there are
two sides to every story. I’ve taught my students for years now that there are many sides to every story. Herbert Gans, when he did his seminal study on American newsrooms, came
out with recommendations that talked about
multi-perspectival journalism, so providing multiple
perspectives on the news stories that we see, so that people
can use the information that we are producing to make decisions in their everyday lives. In order to make really
informed decisions, you need to have a multiplicity of voices involved in those stories. When you boil things down for the sake of reporting on deadline,
reporting to a particular space, and reporting for a very narrow audience, you miss so many parts of the story, and what you can wind up doing, and what I think we’re seeing now is normalizing voices that do not belong in mainstream media. Normalizing positions that are harmful, that use inflammatory rhetoric
to espouse their points, that just do not belong. They are not legitimate. They can be quite dangerous, but if we rely on this old trope that there are two sides to every story, then there’s no contesting
that because I got this one person who said something
that was totally radical, I now have to get someone
else who’s totally radical and I don’t get to work
in the middle at all. (audience applauds) – [Mandy] Someone else
want to add to that one? – I’ll simply say, sorry
I’m doing it again. PolitiFact was founded really to combat this idea of
he-said she-said reporting. The origin story of PolitiFact is in 2004, Republican National Convention, Zell Miller, a Democrat is giving a speech in support of George W. Bush. Lays into John Kerry
over defense spending. Our founder read the stories the next day: Zell Miller said John Kerry’s terrible on defense spending, quote quote quote. John Kerry pushed back saying this was all untrue, quote quote quote. Zell Miller defended what he said, quote quote quote, end of story. If you read it, you had
no idea who was right, and as a journalist I
think it’s incumbent on us when we can take the time to say who’s right and who’s wrong, and we don’t have to give equal weight to the wrong side, right? On matters of fact, and so
that’s the story of PolitiFact. – Well we’re gonna keep you on deck here, so get your water in fast. The audience would like
to know who decides what PolitiFact decides
to fact check and how? And who checks the fact checkers? – The second part is easy.
– Who watches the watchmen? – You, you all check the
fact checkers, trust me. If we get something wrong,
people will find out, will quickly alert us and
we will issue a correction. In fact if you really want
to see all our errors, you can go to PolitiFact.com
and look at our subject tags. We list every correction
we’ve ever issued. And so, when we make a mistake it’s often because we didn’t have information. It’s usually not because we were wrong. A lot of times politicians
will play games with us, they won’t participate. We’ll issue a verdict and they’ll say I can’t believe you missed this. Well it’s usually ’cause
we didn’t know it existed. How do we pick what we check
and the process real briefly, we choose what we check using news judgment. So our reporters every day look at the top of the television news, the front page of the newspaper, what are the big stories
people are talking about? And then we go in search of
things to fact check about them. We do not keep a tally where
it’s like a Republican today, a Democrat tomorrow. We do not keep a tally
if it’s like a true today it has to be a false tomorrow. We make news judgments about
what we want to fact check. However, I will say we do make
sure we fact check everyone. No one is off limits
from the Truth-O-Meter. We make that really clear. From there, the one
thing I want to highlight about our process is a reporter
goes through the process and does our fact checks. The reporter recommends
a verdict, a rating, but the reporter actually doesn’t decide. We have three editors who sit as a jury. They read every fact check. I’ve been on most of the
16,000 that we’ve published over 11 years. We read, sit, we have jury instructions like four questions that we ask ourselves. We go through that process
and we issue that verdict. 75% of the time we’re
in unanimous agreement. If it’s a two-to-one vote, so to speak, maybe it’s a case to do more reporting, or ask questions: where are the holes? Sometimes we just kind of agree
to disagree and we move on, but every process is pretty thorough. It takes at least a day
usually to do a fact check. Sometimes it can take a week or 10 days, and sometimes that conversation, that jury conversation can be 30 seconds. Some days it can literally go over a day, ’cause we’re kind of hanging
up the phone at each other arguing about what we
think it should be, so. – Well knowing that we only
have five minutes left, I feel like this is a good one that we can each give one example of. So, you know we’ve talked
a lot about what we do. What we in media are trying
to do to solve these problems, but I would like to know what
can the consumers of news do to solve this problem, to
do their own fact checking, to fight disinformation, all of the above? I know there’s a lot, so I’m
gonna say one from each person, and Lynn I’m gonna start with you. – So kind of building off what you were just talking about, something that we encourage and try to get news organizations to do is if you are in the public
and if you see something that’s incorrect, like please let that
news organization know. The thing about journalism is yes, we are reporting on the facts, but we’re reporting on
the facts from the time that we are writing the story
right in that moment, right? Things change, there might
be something that happened right after a deadline was pushed, and yes we should do an update and there needs to be an update, but the thing you were
reading is in a slice of time with what was known at that moment, and a lot of times we
may not get information. Talk about politicians in the government. They are like so hard to get
information from nowadays. It’s become really really difficult so if you ever see something
that you think is incorrect, please contact that news
organization, let them know. If you have a document or
just any kind of information, or someone they could talk to so they can correct it, like please let them know. That’s gonna go a long way. – I’ll go next. Thanks again for having me. I would say, how many people
know someone on social media who shares what you
would consider fake news? Interact with these people, okay? Trust me, I don’t want to do it either. However, everyone has the
crazy Uncle Joe, right? Who shares the kooky stories? Typically we just avoid the conversation because like oh, that’s
crazy Uncle Joe, you know? Just being crazy. What I would really suggest is this is a public health crisis,
and so we will do better the more we are all engaged, and so as difficult as it may seem, and as much as you might not want to, it might be helpful to say hey Uncle Joe, where’d you get that information from? And then can I show you
a couple other stories that might help you understand? You don’t have to do it in a mean way. You don’t have to say you’re
fired, you’re a liar, whatever. You know, have a nice conversation ’cause I think what we definitely know is the best way, so I
can’t change how you vote and I’m not trying, but the
best way people can change how other people think are based
on people they know, right? So people who know each
other having a conversation about issues and facts can have a much better resolution than if I was trying to come in, or if Facebook was trying
to come in, Lord help us, and say this is wrong. So, engage, talk. Tell crazy Uncle Joe I said hi. – Sally? – Well, and I want to echo the thank you. Great questions, great
moderation, great opportunity. I would ask you to really
own the trust indicators. So you can find these
transparency factors locally, on the San Jose Mercury News, you can find them on The Washington Post, you can find ’em on television
stations around the country, you can find ’em on BBC, and look at our website. It’s thetrustproject.org and
if you go to the list of, the FAQ, you can see what
the trust indicators are, and start looking for them. And then if you don’t see them, then there’s a reason for questions. If you see them, then think
about well how do these make sense to me? How can I spread these,
spread use of these? So can you talk to your
friends about them? Can you talk to your local librarian and get the library to
use them in various ways that librarians start
to build media literacy? Are there teachers you can talk with? Are there groups that you
might want to meet with to talk over the news and
use these trust indicators as a way to engage folks? So again, the idea here is
to really take ownership and be involved in
making informed decisions about the news and helping one another, and those ideas came from a
group very much like yours, so I think there’s plenty of opportunity, and I hope you take it. – One thing about engaging
with that crazy news, engage offline because
depending on the platform that you’re using, engaging online may actually incentivize the postings, so one thing to consider. But I like to think about strategies and approaches more than solutions. I think of solutions being very fixed, and when we’re talking
about media, technology, and social norms and how these
things are always shifting, you want to think about the way that we can be responsive to
the struggle of the moment. The challenges that we have right now are very different than the ones we experienced last
year, or 10 years before. I would go back to the
appeal that Lynn made about being involved, and
think of yourself as a source, looking around at the
coverage that you get, see who’s missing,
whose voices are missing from the coverage, contact news outlets to let them know who’s
missing from the content, and if you know someone
who could be an expert or who has a context or
perspective that is necessary for those stories, put folks in touch. Media is yours. You are a part of this whole system. It does not work without
your input, so be a source. – And I’m going to add in
as a moderator privilege, one more for you guys. I’ve studied disinformation
and misinformation for years, and you know you mentioned
earlier about the bad guys who are really good at
looking like the good guys, so the fake news sites, the sites that pop up out of nowhere. Sometimes they’re made by
Russia, sometimes they’re not, that are made to look like real news, and one of the things they
don’t do very well is sourcing. So if you’re seeing a story and the facts are according to who? Like you need to ask that
question all the time when you’re reading it. Who’s this supposedly coming from? ‘Cause often they don’t even say, or it’s not even citing
an anonymous source. It’s citing a nobody source. It’s just saying it like it’s a fact, and if there isn’t a source,
if there isn’t a quote, if there isn’t a link,
and if there is a link it might be a link to another site that’s just like that one. If you’re seeing yourself
going down a link rabbit hole, where there’s still nobody who is actually owning that information, there’s a 99.9% chance
that that is a fake story and a fake website and a fake reporter and you should never go
to that website again, and you should tell your friends that too. So, that concludes tonight. I’d like to thank our panel. (audience applauds) And also big thanks to
Dawn and to Michael too for bringing this all together
and bringing you all here. (audience applauds) – Well five weeks certainly
went fast, didn’t it? I was thinking about
the Carol Burnett song, I’m So Glad We Had This Time Together. (audience laughs) Just want to say a couple things. Over these five weeks, we’ve
covered a lot of territory. We had about 30 speakers, journalists, media analysts, media experts
from around the country, from around the world,
and around the block, and right here at Stanford
and in Silicon Valley. So we’ve talked about how independent press protects press freedom. How one of the most important
functions of a free press is to help hold the powerful
accountable to the people, whether that’s government,
corporations, or maybe Facebook. You’ve heard journalists
themselves talk about how they’re covering and navigating issues of personal identity while trying to inform the public on issues of bias,
intolerance, and injustice. We had some local people talking about how failing business models
and changing reader habits have really decimated the
local news organizations in communities here in Silicon Valley and really all around the
country, and how news startups are trying and emerging
to fill those gaps. Tonight we talked about the
flood of misinformation online and how it’s tainting the
quality of information that we need to make good decisions. You also got over your initial shyness about asking questions. (Dawn chuckles) You had some great questions
over these five weeks. So Michael Bolden and
I have really enjoyed building this class, and coming up with this journalism sessions for you. We greatly appreciate
your interest in learning about journalism and about
its role in democracy. I want to thank everybody
who’s made this class possible. My co-host Michael Bolden, yay! (audience applauds) All the talented people
at Continuing Studies. There are many, I’ll just name a few. Charlie Junkerman, Alexandra Argyropoulos, Jack Kirkner and Amy Tollafield. Thanks to the JSK staff,
especially Enrico Benjamin. He’s been up here doing
our live streaming, and social media, Erica Bartholomew who’s been the question whisperer, and in the back John and Mike are the Stanford video and audio staff who were with us each and every night to keep us sounding good and looking good. So finally I just want to end with, to echo the last panel
and their conversation about how media does not really exist or survive without you, so I want a call to action to ask for all of your help. So we live in tumultuous times. For our journalism, for our democracy, and really around the world. So access to relevant, truthful information is
really key to what we need to sustain our communities, wherever they are. Quality information is crucial
to supporting democracy, and we all know for certain there are such things
as facts and the truth. They do exist. So here’s what I’m gonna ask of you. Support your local newspaper
and news organization, your local radio station. Yes, read The New York Times, but more important your local media needs you more than ever. How are you going to know what’s going on if they just disappear? There’s a quote I read
recently that is a little dark, but I thought quite telling which it said first they came for the journalists, and I did not speak up because
I was not a journalist. We have no idea what happened after that. (audience laughs) So live your ideals, seek
out ways to help journalism. All of you are here
because you’re interested in journalism and democracy. Stay informed so you can fully participate in your community, in the life of your community. Keep an eye out for
other events like this. There will be others, not necessarily from us this next quarter, but the continuing studies
program has sessions, the JSK Fellowships, they have
some sessions in the future, and partners that we have who are actually in the audience tonight, the Stanford Journalism and
Democracy Initiative, the JDI, which is the Brown
Institute for Media Studies and the Stanford
Journalism Program and JSK will have occasional events. And then adopt a journalist. Find journalists, obviously we’re people that you can talk to. You can reach out to them
whether you have a question, whether you have a
correction, whether you have information about something
they should know about. And get to know them. Help us help journalism thrive, because our world depends on it. Thank you and goodnight and good luck. (audience applauds)

Author: Kennedi Daugherty
