Is the Web Breeding Ignorance?
In 1992, The Disposable Heroes of Hiphoprisy called television “the drug of the nation” that was “breeding ignorance and feeding radiation”. Sir Tim Berners-Lee, creator of the HTTP protocol that underpins the World Wide Web, once called the Web a “liberal artefact” that, in the spirit of the European Enlightenment, has democratised information. Has this vision given way, in reality, to a dysfunctional form of information anarchy that allows misinformation to flourish? Has the Web become the new drug of the nation?
It is said that the Web compares unfavourably with regulated media such as television and newspapers because online there are “no government or ethical regulations” (Eastin, Yang, & Amy, 2010, p. 211) or “traditional gatekeepers” (Hargittai et al., 2010, p. 469) responsible for “quality control standards” (Flanagin & Metzger, 2008, p. 12). This editorial vacuum is referred to as “disintermediation” (Flanagin & Metzger, 2008, p. 12). As a result of disintermediation, it is argued, the origin, quality, and veracity of online information are “less clear than ever before” (Flanagin & Metzger, 2008, p. 12). This leads to an indiscernible “blending of advertising and information” (Flanagin & Metzger, 2007, p. 320) and the spread of “misleading, questionable, and factually incorrect information” (Schwarz & Morris, 2011, p. 1). In the domain of health, for example, “the opinions of crusaders, critics, and conspiracy theorists” are said to “be potentially weakening messages from qualified experts” and ultimately undermining the definition of truth (Kata, 2011, p. 2). Many argue that, in the absence of an editorial authority, we are free to exercise a technologically augmented form of cultural cognition. Pariser (2011), for example, claims web technology helps us live inside what he calls “filter bubbles”: customised data feeds that reinforce our prejudices.
The counter-argument is that the Web is a culturally maturing technology: the ‘wisdom of crowds’ now corrects misinformation; expert participation in online communities is increasing; fact-checking websites are influencing public debate; Wikipedia’s accuracy compares favourably with that of its rivals; and an ambitious project known as Hypothes.is even promises us a “peer-reviewed Web”.
Moreover, search engines are becoming more sophisticated in reflecting notions of authority and reputation. According to Google’s Webmaster Central Blog, the company makes over 500 “improvements” to its search algorithms each year. Recently, in an effort to eliminate the “long tail” of “low-quality websites”, Google has incorporated, for example, “user feedback signals” and “sites that users block” into its ranking algorithms. The company claims its results prioritise content from official domains such as .gov, and it advises publishers on how to increase a website’s likelihood of topping its searches: for example, by ensuring that “the articles describe both sides of a story” and that the site “conforms to the same standards as a printed magazine, encyclopaedia or book” (Google, 2011).
Of course, sophisticated and popular websites can peddle nonsense, and one person’s misinformation is another’s truth. However, during my research I asked teenagers to test Google for its information values, capturing the results using proxy servers. I asked them to research, for example, whether aspartame causes cancer and the long-term psychological effects of smoking marijuana. I expected the young people’s answers to surface a lot of pseudo-scientific nonsense from less than reputable sources. Instead, when the teenagers queried Google, it offered them scientifically sound advice from authoritative guides such as NHS Direct and Talk to Frank. Similarly, for more ideologically charged questions about immigration and climate change, Google returned information that sometimes challenged the students’ preconceived ideas. I have treated the data with due caution; there are many contextual and technical factors at play, but it suggests the Web can no longer be considered a lawless information wild west.
A recent survey by the Royal Statistical Society showed the British public was “wrong about everything”: for example, it over-estimated the proportion of non-white UK residents by almost 300%. Perhaps the Web is also part of the solution rather than the problem?
Huw Davies & Lisa Sugiura
Eastin, M. S., Yang, M., & Amy, I. (2010). Children of the Net: An empirical exploration into the evaluation of Internet content. Journal of Broadcasting & Electronic Media, 37–41.
Flanagin, A. J., & Metzger, M. J. (2007). The role of site features, user attributes, and information verification behaviors on the perceived credibility of web-based information. New Media & Society, 9(2), 319–342. doi:10.1177/1461444807075015
Flanagin, A. J., & Metzger, M. J. (2008). Digital media and youth: Unparalleled opportunity and unprecedented responsibility. In Digital Media, Youth, and Credibility (pp. 5–27). MIT Press. doi:10.1162/dmal.9780262562324.005
Google. (2011). More guidance on building high-quality sites. Retrieved from http://googlewebmastercentral.blogspot.co.uk/2011/05/more-guidance-on-building-high-quality.html
Hargittai, E., Fullerton, L., Menchen-Trevino, E., & Thomas, K. Y. (2010). Trust online: Young adults’ evaluation of web content. International Journal of Communication, 4, 468–494.
Kata, A. (2011). Anti-vaccine activists, Web 2.0, and the postmodern paradigm – An overview of tactics and tropes used online by the anti-vaccination movement. Vaccine, 1–12. doi:10.1016/j.vaccine.2011.11.112
Schwarz, J., & Morris, M. R. (2011). Augmenting web pages and search results to help people find trustworthy information online. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’11).