October 19, 2017

The Future of Truth and Misinformation Online

Theme 4: The information environment will improve, because people will adjust and make things better

Most respondents who expect an improvement in the information environment in the coming years put their faith in maturing – and more discerning – information consumers finding ways to cope personally and band together to effect change.

Fake news and information manipulation are no longer ‘other people’s problems.’ This new awareness of the importance of media will shift resources, education and behaviors across society.
Pamela Rutledge

Alexios Mantzarlis, director of the International Fact-Checking Network based at the Poynter Institute for Media Studies, commented, “While the risk of misguided solutions is high, lots of clever people are trying to find ways to make the online information ecosystem healthier and more accurate. I am hopeful their aggregate effect will be positive.”

Barry Chudakov, founder and principal at Sertain Research and StreamFuzion Corp., observed, “Globally, we have more people with more tools with more access to more information – and yes, more conflicting intent – than ever before; but, while messy and confusing, this will ultimately improve the information environment. We will continue to widen access to all types of information – access for citizen journalists, professionals, technical experts, others – so while the information environment becomes more diverse, the broader arc of human knowledge bends towards revelation and clarity; only mass suppression will stop the paid and unpaid information armies from discovering and revealing the truth.”

A North American research scientist replied, “I’m an optimist, and believe we are going through a period of growing pains with the spread of knowledge. In the next decade, we’ll create better ways to suss out truth.”

Sharon Tettegah, professor at the University of Nevada, commented, “As we learn more about the types of information, we will be able to isolate misinformation and reliable sources.”

Pamela Rutledge, director of the Media Psychology Research Center, noted, “Fake news and information manipulation are no longer ‘other people’s problems.’ This new awareness of the importance of media will shift resources, education and behaviors across society.”

Dariusz Jemielniak, professor of organization studies in the department of management in networked and digital societies (MiNDS) at Kozminski University, said, “There are a number of efforts aimed at eliminating fake news, and we as a society are going to make them work.”

Misinformation has always been with us and people have found ways to lessen its impact. The problems will become more manageable as people become more adept at sorting through material

Many respondents described the online realm as simply another step in the evolution of human communication and argued that history’s lessons here should be comforting. Previous information revolutions inspired people to invent new ways to handle problems with information overload, the proliferation of misinformation, and opportunities for schemers to manipulate the emerging systems. The more hopeful among these experts believe that dynamic will play out again in the digital age.

Society always adjusts to new media and responds to weaknesses and flaws. Individuals will adjust, as will the technology.
Anonymous journalism and communications dean

A professor of media studies at a European university wrote, “The history of technology shows repeatedly that as a new technology is introduced – whatever the intentions of the designers and manufacturers – bad actors will find ways to exploit the technology in darker, more dangerous ways. In the short run, they can succeed, sometimes spectacularly; in the long run, however, we usually find ways to limit and control the damage.”

A futurist/consultant replied, “We’re seeing the same kinds of misinformation that used to be in supermarket tabloids move online – it’s the format that has changed, not the human desire for salacious and dubious news.”

Robin James, an associate professor of philosophy at a North American university, wrote, “The original question assumes that things have recently gotten worse. Scholars know that phenomena like patriarchy and white supremacy have created ‘epistemologies of ignorance’ that have been around for hundreds of years. ‘Fake news’ is just a new variation on this.”

The dean of one of the top 10 journalism and communications schools in the United States replied, “Society always adjusts to new media and responds to weaknesses and flaws. Individuals will adjust, as will the technology.”

Lokman Tsui, assistant professor at the School of Journalism and Communication at The Chinese University of Hong Kong, commented, “The information environment will improve. This is not a new question; we had concerns about fake news when radio broadcasting and mass media first appeared (for example, Orson Welles’ ‘War of the Worlds’ broadcast). People will develop literacy. Standards, norms and conventions to separate advertising from ‘organic’ content will develop. Bad actors who profit from fake news will be identified and targeted.”

Adam Nelson, a developer at Amazon, replied, “We had yellow journalism a hundred years ago and we have it now. We’re at a low point of trust, but people will begin to see the value of truth once people become more comfortable with what social platforms do and how they work.”

Axel Bruns, professor at the Digital Media Research Centre at the Queensland University of Technology, commented, “Moral panics over new media platforms are nothing new. The web, television, radio, newspapers and even the alphabet were seen as making it easier to spread misinformation. The answer is media literacy amongst the public, which always takes some years to catch up with the possibilities of new media technologies.”

An anonymous respondent who predicts improvement replied, “Powerful social trends have a life cycle, and the pendulum typically swings back over time.”

An anonymous respondent said, “It is the nature of the technical development that politics and regulatory forces are only able to react ex post, but they will.”

A senior researcher at a U.S.-based nonprofit research center replied, “The next generation of news and information users will be more attuned to the environment of online news and will hopefully be more discerning as to its veracity. While there are questions as to whether the digital native generation can accurately separate real news from fake, they at least will have the technical and experiential knowledge that the older generations mostly do not.”

Many respondents expressed faith that technologists would be at the forefront of helping people meet the challenges of misinformation. A managing partner and fellow in economics predicted, “In order to avoid censorship, the internet will remain relatively open, but technology will develop to more effectively warn and screen for fact-inaccurate information. Think of it as an automated ‘PolitiFact’ that will point out b******* passively to the reader.”

An author and journalist based in North America said, “Social media, technology and legacy media companies have an ethical and economic incentive to place a premium on trusted, verified news and information. This will lead to the creation of new digital tools to weed out hoaxes and untrusted sources.”

Susan Price, lead experience strategist at Firecat Studio, observed, “There will always be a demand for trusted information, and human creativity will continue to be applied to create solutions to meet that demand.”

Dane Smith, president of the public policy research and equity advocacy group Growth & Justice, noted, “I’m an optimist. Truth will find a way and prevail.”

Louisa Heinrich, founder of Superhuman Ltd., commented, “The need to tell our stories to one another is a deeply rooted part of human nature, and we will continue to seek out better ways of doing so. This drive, combined with the ongoing upward trend of accessibility of technology, will lead more people to engage with the digital information environment, and new trust frameworks will emerge as old ones collapse.”

Michael R. Nelson, public policy executive at Cloudflare, replied, “Some news sites will continue to differentiate themselves as sources of verified, unbiased information, and as these sites learn how to better distinguish themselves from ‘fake news’ sites, more and more advertisers will pay a premium to run their ads on such sites.”

Steven Polunsky, writer with the Social Strategy Network, replied, “As with most disruptive events, people will adjust to accommodate needs and the changing environment.”

Liz Ananat, an associate professor of public policy and economics at a major U.S. university wrote, “It will likely get worse first, but over 10 years, civil society will respond with resources and innovation in an intensive effort. Historically, when civil society has banded together and given its all to fight destructive forces, it has been successful.”

Jane Elizabeth, senior manager at the American Press Institute, said, “The information environment will improve because the alternative is too costly. Misinformation and disinformation will contribute to the crumbling of a democratic system of government.”

A number of these respondents said they expect information platform providers to police the environment in good faith, implementing the screening of content and/or other solutions while still protecting rights such as free speech.

A principal network architect for a major edge cloud platform company replied, “Retooling of social networking platforms will likely, over time, reduce the value of stupid/wrong news.”

A senior solutions architect for a global provider of software engineering and IT consulting services wrote, “The problem of fake news is largely a problem of untrusted sources. Online media platforms delegated the role of human judgment to algorithms and bots. I expect that these social media platforms will begin to exercise more discretion in what is posted when.”

An anonymous respondent said, “Information platforms optimized for the internet are in their infancy. Like early e-commerce models, which merely sought to replicate existing, known systems, there will be massive shifts in understanding and therefore optimizing new delivery platforms in the future.”

An anonymous respondent wrote, “Google and other outlets like Facebook are taking measures to become socially responsible content promoters. Combined with research trends in AI and other computing sectors, this may help improve the ‘fake news’ trends by providing better attribution channels.”

Adam Gismondi, a researcher at the Institute for Democracy & Higher Education at Tufts University, predicted, “Ultimately, the information distributors – primarily social media platform companies, but others as well – will be forced, through their own economic self-interest and public pushback, to play a pivotal role in developing filters and signals that make the information environment easier for consumers to navigate.”

Anonymous respondents shared these related remarks: 

  • “Everything we know about how human ingenuity and persistence have shaped the commercial and military (and philanthropic) drivers of the internet and the web suggests to me that we will continue to ensure this incredible resource remains useful and beneficial to our development.”
  • “The tide of false information has to be stemmed. The alternative will be dystopia.”
  • “People will gain in sophistication, especially after witnessing the problems caused by the spread of misinformation in this decade. Vetting will be more sophisticated, and readers/viewers will be more alert to the signs that a source is not reliable.”
  • “I have hope in human goodness.”
  • “Over the next 10 years, users will become much more savvy and less credulous on average.”
  • “People will develop better practices for dealing with information online.”

Crowdsourcing will work to highlight verified facts and block those who propagate lies and propaganda. Some also have hopes for distributed ledgers (blockchain)

Some respondents expressed optimism about people’s capacity to improve the visibility of the most useful content, including through human-machine evaluation that identifies sources, grades their credibility and usefulness, and possibly flags, tags or bans propagators of misinformation. An anonymous respondent wrote, “AI, blockchain and crowdsourcing appear to have promise.”

There will be new forms of crowdsourcing – a radical kind of curation – participation in which will improve critical-thinking skills and will mitigate the effects of misinformation.
Jack Park

An assistant professor at a university in the U.S. Midwest wrote, “Crowd-based systems show promise in this area. Consider some Reddit forums where people are called out for providing false information … if journalists were called out/tagged/flagged by large numbers of readers rather than their bosses alone, we would be inching the pebble forward.”

But whose “facts” are being verified in this setting? Ned Rossiter, professor of communication at Western Sydney University, argued, “Regardless of advances in verification systems, information environments are no longer enmeshed in the era of broadcast media and national publics or ‘imagined communities’ on national scales. The increasing social, cultural and political fragmentation will be a key factor in the ongoing contestation of legitimacy. Informational verification merely amplifies already existing conditions.”

Richard Rothenberg, professor and associate dean at the School of Public Health at Georgia State University, said, “It is my guess that the dark end of the internet is relatively small but it has an outsized presence. … If nothing else, folks have demonstrated enormous resourcefulness, particularly in crowd endeavors, and I believe methods for assuring veracity will be developed.”

An anonymous research scientist based in North America wrote, “A system that enables commentary on public assertions by certified, non-anonymous reviewers – such that the reviewers themselves would be subject to Yelp-like review – might work, with the certification provided by Verisign-like organizations. Wikipedia is maybe a somewhat imperfect prototype for the kind of system I’m thinking of.”
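The reviewer-of-reviewers idea in the quote above can be illustrated with a small sketch. Everything here (the names, the reputation weights, the 0.9/0.1 blending factor) is an illustrative assumption, not part of any respondent’s proposal: certified reviewers rate public claims, readers rate the reviewers in turn, and each reviewer’s influence on a claim’s score tracks their own standing.

```python
# Hypothetical sketch of a reputation-weighted review system: reviewers
# rate claims, and reader feedback on the reviewers adjusts how much
# weight each reviewer's ratings carry. All parameters are illustrative.

class Reviewer:
    def __init__(self, name, reputation=1.0):
        self.name = name
        self.reputation = reputation  # adjusted by reader feedback over time

    def receive_feedback(self, score):
        # Blend new reader feedback (0..1) into the running reputation,
        # so a reviewer's standing decays or grows gradually.
        self.reputation = 0.9 * self.reputation + 0.1 * score


def claim_credibility(ratings):
    """Aggregate (reviewer, rating) pairs, weighting each rating (0..1)
    by the reviewer's current reputation."""
    total_weight = sum(r.reputation for r, _ in ratings)
    if total_weight == 0:
        return None  # no trusted signal available
    return sum(r.reputation * score for r, score in ratings) / total_weight


alice = Reviewer("alice", reputation=2.0)  # consistently well-reviewed
bob = Reviewer("bob", reputation=0.5)      # poorly rated by readers

# Two reviewers disagree about a claim; the aggregate leans toward alice.
print(claim_credibility([(alice, 0.9), (bob, 0.2)]))  # 0.76
```

The design choice worth noting is that the reviewers are themselves reviewable, which is exactly the “Yelp-like review” property the respondent describes: bad reviewers lose influence without needing to be banned outright.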

A Ph.D. candidate in informatics commented, “It is possible to create systems that are reliable and trusted, but probably not unhackable. I imagine there could be systems that leverage the crowd to check facts in real time. Computational systems would be possible, but it would be very difficult to create algorithms we could trust.”

Jack Park, CEO at TopicQuests Foundation, predicted, “There will be new forms of crowdsourcing – a radical kind of curation – participation in which will improve critical-thinking skills and will mitigate the effects of misinformation.”

Some respondents also pointed out that the rise of additional platforms where people can publish useful information could be a positive force. An anonymous respondent wrote, “The rise of more public platforms for media content (online opinion/editorials and platforms such as Medium) gives me confidence that as information is shared, knowledge will increase so that trust and reliability will grow. Collaboration is key here.”

Blockchain systems were mentioned by a number of respondents – a senior expert in technology policy based in Europe commented, “… use blockchain to verify news” – but with mixed support, as many hedged their responses. A journalist who writes about science and technology said, “We can certainly create blockchain-like systems that are pretty reliable. Nothing is ever perfect, though, and trusted systems are often hard to use.”

The president of a center for media literacy commented, “The technology capability [of potential verification systems] is immature and the costs are high. Blockchain technology offers great promise and hope.”

A journalist and experience strategist at one of the world’s top five technology companies said, “The blockchain can be used to create an unhackable verification system. However, this does not stop the dissemination of ‘fake news,’ it simply creates a way to trace information.”

A chief executive officer said, “Can P2P, blockchain, with attribution be unhackable? We need a general societal move to more transparency.”