18 pessimistic opinions on the next 10 years of fake news (and 5 optimistic ones)

Devin Coldewey
A topic like fake news, or more broadly the question of trust and verification on the internet, is a complex one — a land of contrasts. Sometimes you just have to poll the room and get a feel for what people are thinking before drawing any conclusions. That's what Pew Internet did, contacting thousands of experts in tech, internet, and social policy and asking how they thought things would go over the next decade. They were not optimistic!

Well, actually, 49 percent were "optimistic," in that they filled in the bubble saying the information environment will improve in the next 10 years; the other 51 percent filled in the opposite one. So in that sense it's split right down the middle. But the hundreds of comments they sent back lean a lot harder toward the pessimistic side of things. Their optimism is better described as a distant hope.

For reference, one of the themes identified by the study organizers is "Humans are by nature selfish, tribal, gullible convenience seekers who put the most trust in that which seems familiar." Wonderful!

There are lots (and I mean lots) of interesting opinions in this 92-page collection (PDF), but I sorted through many of them to find the best. Honestly I didn't deliberately exclude the ones from the "optimistic" side; most of them were just pretty hand-wavy. So without further ado, here is the cream of this dismal crop. (They really are worth reading, though.)

The Pessimists

(Authors' titles are as stated in the survey)

An institute director and university professor:

The internet is the 21st century’s threat of a ‘nuclear winter,’ and there’s no equivalent international framework for nonproliferation or disarmament. The public can grasp the destructive power of nuclear weapons in a way they will never understand the utterly corrosive power of the internet to civilized society, when there is no reliable mechanism for sorting out what people can believe to be true or false.

Clay Shirky, vice provost for educational technology at New York University:

‘News’ is not a stable category – it is a social bargain. There’s no technical solution for designing a system that prevents people from asserting that Obama is a Muslim but allows them to assert that Jesus loves you.

A respondent affiliated with Harvard University’s Berkman Klein Center for Internet & Society:

The democratization of publication and consumption that the networked sphere represents is too expansive for there to be any meaningful improvement possible in terms of controlling or labeling information. People will continue to cosset their own cognitive biases.

Mike DeVito, graduate researcher at Northwestern University:

These are not technical problems; they are human problems that technology has simply helped scale, yet we keep attempting purely technological solutions. We can’t machine-learn our way out of this disaster, which is actually a perfect storm of poor civics knowledge and poor information literacy.

Tom Rosenstiel, author, director of the American Press Institute and senior fellow at the Brookings Institution:

Whatever changes platform companies make, and whatever innovations fact checkers and other journalists put in place, those who want to deceive will adapt to them. Misinformation is not like a plumbing problem you fix. It is a social condition, like crime, that you must constantly monitor and adjust to.

Philip J. Nickel, lecturer at Eindhoven University of Technology in the Netherlands:

The decline of traditional news media and the persistence of closed social networks will not change in the next 10 years. These are the main causes of the deterioration of a public domain of shared facts as the basis for discourse and political debate.

Zbigniew Łukasiak, a business leader based in Europe:

Big political players have just learned how to play this game. I don’t think they will put much effort into eliminating it.

An internet pioneer and longtime leader at ICANN:

There is little prospect of a forcing factor that will emerge that will improve the ‘truthfulness’ of information in the internet.

Willie Currie, a longtime expert in global communications diffusion:

The apparent success of fake news on platforms like Facebook will have to be dealt with on a regulatory basis as it is clear that technically minded people will only look for technical fixes and may have incentives not to look very hard, so self-regulation is unlikely to succeed.

Dean Willis, consultant for Softarmor Systems:

Governments and political groups have now discovered the power of targeted misinformation coupled to personalized understanding of the targets. Messages can now be tailored with devastating accuracy. We’re doomed to living in targeted information bubbles.

A retired university professor:

Increased censorship and mass surveillance will tend to create official ‘truths’ in various parts of the world. In the United States, corporate filtering of information will impose the views of the economic elite.

Bill Woodcock, executive director of the Packet Clearing House:

There’s a fundamental conflict between anonymity and control of public speech, and the countries that don’t value anonymous speech domestically are still free to weaponize it internationally, whereas the countries that do value anonymous speech must make it available to all, [or] else fail to uphold their own principle.

A vice president for public policy at one of the world’s foremost entertainment and media companies:

The small number of dominant online platforms do not have the skills or ethical center in place to build responsible systems, technical or procedural. They eschew accountability for the impact of their inventions on society and have not developed any of the principles or practices that can deal with the complex issues. They are like biomedical or nuclear technology firms absent any ethics rules or ethics training or philosophy. Worse, their active philosophy is that assessing and responding to likely or potential negative impacts of their inventions is both not theirs to do and even shouldn’t be done.

An executive consultant based in North America:

It comes down to motivation: There is no market for the truth. The public isn’t motivated to seek out verified, vetted information. They are happy hearing what confirms their views. And people can gain more creating fake information (both monetary and in notoriety) than they can keeping it from occurring.

A vice president for stakeholder engagement:

Trust networks are best established with physical and unstructured interaction, discussion and observation. Technology is reducing opportunities for such interactions and disrupting human discourse, while giving the ‘feeling’ that we are communicating more than ever.

Karen Mossberger, professor and director of the School of Public Affairs at Arizona State University:

The spread of fake news is not merely a problem of bots, but part of a larger problem of whether or not people exercise critical thinking and information-literacy skills. Perhaps the surge of fake news in the recent past will serve as a wake-up call to address these aspects of online skills in the media and to address these as fundamental educational competencies in our education system. Online information more generally has an almost limitless diversity of sources, with varied credibility. Technology is driving this issue, but the fix isn’t a technical one alone.

Sally Wentworth, vice president of global policy development at the Internet Society:

It’s encouraging to see some of the big platforms beginning to deploy internet solutions to some of the issues around online extremism, violence and fake news. And yet, it feels like as a society, we are outsourcing this function to private entities that exist, ultimately, to make a profit and not necessarily for a social good. How much power are we turning over to them to govern our social discourse? Do we know where that might eventually lead? On the one hand, it’s good that the big players are finally stepping up and taking responsibility. But governments, users and society are being too quick to turn all of the responsibility over to internet platforms. Who holds them accountable for the decisions they make on behalf of all of us? Do we even know what those decisions are?

A research scientist for the Computer Science and Artificial Intelligence Laboratory at MIT:

Problems will get worse faster than solutions can address, but that only means solutions are more needed than ever.

The Optimists

Adam Lella, senior analyst for marketing insights at comScore Inc.:

There have been numerous other industry-related issues in the past (e.g., viewability, invalid traffic detection, cross-platform measurement) that were seemingly impossible to solve, and yet major progress was made in the past few years. If there is a great amount of pressure from the industry to solve this problem (which there is), then methodologies will be developed and progress will be made to help mitigate this issue in the long run. In other words, if there's a will, there's a way.

Irene Wu, adjunct professor of communications, culture and technology at Georgetown University:

Information will improve because people will learn better how to deal with masses of digital information. Right now, many people naively believe what they read on social media. When the television became popular, people also believed everything on TV was true. It's how people choose to react to and access information and news that's important, not the mechanisms that distribute them.

A longtime director for Google:

Companies like Google and Facebook are investing heavily in coming up with usable solutions. Like email spam, this problem can never entirely be eliminated, but it can be managed.

A sociologist doing research on technology and civic engagement at MIT:

Though likely to get worse before it gets better, the 2016-2017 information ecosystem problems represent a watershed moment and call to action for citizens, policymakers, journalists, designers and philanthropists who must work together to address the issues at the heart of misinformation.

Frank Kaufmann, founder and director of several international projects for peace activism and media and information:

The quality of news will improve, because things always improve.

That last one may count as the only real optimist of the bunch.