Web 2.0: The Sleep of Reason, Part II - Britannica Blog


Expertise and high standards in scholarship and publishing are certainly translatable into the digital age, but there are many obstacles blocking the transition.  One chief obstacle is the notion that Jaron Lanier has called “digital Maoism” (in his May 2006 essay of that name on the Edge website).

He defines this “new online collectivism” as “nothing less than a resurgence of the idea that the collective is all-wise, that it is desirable to have influence concentrated in a bottleneck that can channel the collective with the most verity and force.”  This “wisdom of the crowds” and “hive mind” mentality is a direct assault on the tradition of individualism in scholarship that has been paramount in Western societies at least since the Renaissance and, before then, can be seen in the Church Fathers and the Greek philosophers, among others.

Digital Maoism is an unholy brew made up of the digital utopianism that hailed the Internet as the second coming of Haight-Ashbury—everyone’s tripping and it’s all free; pop sociology derived from misreading books such as James Surowiecki’s 2004 The Wisdom of Crowds: Why the Many Are Smarter Than the Few and How Collective Wisdom Shapes Business, Economies, Societies, and Nations; a desire to avoid individual responsibility; anti-intellectualism—the common disdain for pointy-headed professors; and the corporatist “team” mentality that infests much modern management theory.  Consider, for example, the computer company’s TV advertisement that shows a tweedy professor trying to explain the difficulties of publishing and being deflated by a student who explains that, because of computers, everything can be published and we are all authors now.

This neatly conflates derision of the professorial authority figure and the endemic confusion of means (computer technology makes it easy to produce books) and ends (the creation of worthwhile texts is neither helped nor hindered, except in the most banal aspects, by computer technology).  Publishers, developers of publishing projects, editors, fact-checkers, proofreaders, and the other people necessary to the publication of authoritative texts are all mustache-twirling villains to the digital collectivist.  Such people see “gatekeepers” as antidemocratic agencies that stunt human development rather than as persons or entities seeking to promote intellectual development by exercising judgment and expertise to make the task of the seeker of knowledge easier. 

The flight from expertise is accompanied by the opposite of expertise—the phenomenon that Andrew Keen has called, in his new book of the same name, “the cult of the amateur.”  This cult, says Keen, “worships the creative amateur: the self-taught filmmaker, the dorm-room musician, the unpublished writer. It suggests that everyone—even the most poorly educated and inarticulate amongst us—can and should use digital media to express and realize themselves.”  He is referring to the impulse behind Web 2.0, but his words have a wider resonance—a world in which everyone is an expert in a world devoid of expertise.

Perceived generational differences are another obfuscating factor in this discussion.  The argument is that scholarship based on individual expertise resulting in authoritative statements is somehow passé and that today’s younger people think and act differently and prefer collective to individual sources because of their immersion in a digital culture.  This is both a trivial argument (as if scholarship and truth were matters of preference akin to liking the Beatles better than Nelly) and one that is demeaning to younger people (as if their minds were hopelessly blurred by their interaction with digital resources and entertainments).  Some go even further—witness a comment on Mr. Lanier’s essay on the Edge website (it appears to be by John Brockman, but the citation is murky):

Now, another big idea is taking hold, but this time it’s more painful for some people to embrace, even to contemplate. It’s nothing less than the migration from individual mind to collective intelligence. I call it ‘here comes everybody,’ and it represents, for good or for bad, a fundamental change in our notion of who we are. In other words, we are witnessing the emergence of a new kind of person.

Leaving aside the understandable tendency to reject this as an extreme example of technophiliac rambling (despite its evocation of Joyce’s Finnegans Wake), there is something very troubling about the bleak, dehumanizing vision it embodies—this monster brought forth by the sleep of reason.  Is the astonishing spread of computer technology going to change not just our society and personal lives but also the very nature of human intelligence?  Google cofounder Sergey Brin has said that “the perfect search engine would be like the mind of God,” but most of us took that to be billionaire hyperventilating, not blasphemy.

Perhaps this view of an emerging collective human consciousness is also an ineffectively stretched metaphor, but, if it is put forward seriously, it (like the idea that the Internet itself is an intelligence apart from its users and the creators of its content) is antihuman and intellectually debasing.  The structures of scholarship and learning are based on respect for individuality and the authentic expression of individual personalities.  The person who creates knowledge or literature matters as much as the knowledge or the literature itself.  The manner in which that individual expresses knowledge matters too.  Good clear writing is more than a vehicle for conveying knowledge and information—it is an authentic expression of human personality.  Bad writing is, all too often, the outward manifestation of inward confusion and lack of clarity, as is bad organization or the lack of organization. 

An encyclopedia (literally, the “circle of learning”) is the product of many minds.  It is not the product of a collective mind.  It is an assemblage of texts that have been written by people with credentials and expertise and that have been edited, verified, and supplied with a scholarly apparatus enabling the user to locate desired knowledge.  It differs in almost all relevant particulars from one of the current manifestations of the flight from expertise—Wikipedia, which bills itself as “the free encyclopedia that anyone can edit” and to which everyone can contribute irrespective of whether they possess, or simply pretend to possess, credentials and expertise.  I will return to encyclopedias and Wikipedia in another blog next week and will content myself here by restating that the intellectual life of our society must continue to be based on respect for expertise, the scientific method, evidence-based texts, and, above all, the value of the individual scholar, author, and creator of knowledge. 




42 Responses to “Web 2.0: The Sleep of Reason, Part II”

  1. Benn Says:

    Very interesting article. Everyone is an expert.

    The web is a great tool, but finding authoritative information on it can be a real struggle.

    To be honest, I’m getting a little fed up with “Web 2.0”. I’m sick of opinion, opinion, opinion, so I tend to seek out websites that offer only expert comment. (Am I a hypocrite, commenting here?)

    The web is a very efficient machine at creating facts from hearsay, recycling the same opinions through a thousand mouths until they become “known”.

    The electronic structures that enable people to cross-link to other blogs and sites (RSS, etc.) are reminiscent of the mental structures that produce “confirmation bias” - it becomes too easy to believe and convince oneself of anything. Proof, analysis, precedence, become merely forms of paranoid self-reference.

    Having a true and meaningful “discussion” on a message board is almost impossible. The web is TOO democratic and open to provide a decent platform for discourse. At least in real life you can find people similar to yourself. Wading into unknown waters to share a fragile thought is not a good idea.

    Then there’s Wikipedia’s patent lack of real editorial control. It has its uses, but it’s not an encyclopedia.

    I’m an editor myself. A good 70% of my work is done on computer anyway, though we do still have to mark up manually. For my work to be posted on the Internet or on paper would make very little difference to what I do every day. The editing processes are effectively the same.

    The only inefficiencies occur when changes on paper have to be made electronic, and vice versa. The adaptation of a book to e-book form was the single most irritating and time-consuming project I’ve worked on.

    I’m seriously considering learning about some DTP and imaging programs, because future editors will sorely need to know about them. A future editor will have to deal with more than words, sadly, as companies realise how much time and money can be saved by cutting out the middlemen in Design/Production.

  2. Peter Williamson Says:

    Great post.

    This forum will definitely get the Web 2.0 folks in a lather, especially since they’ve long had it in for Gorman for speaking his mind. They’ll doubtless accuse him yet again of being a Luddite, or of not understanding, etc., etc., which is the usual comeback to stifle discussion. Defame with labels, and then you’ve blackened and tainted his voice in debate. Kudos to Gorman for the courage of his convictions.

    You may not (I don’t) agree with everything he argues, but his underlying concerns are the concerns of civilization, and they shouldn’t be dismissed lightly. The young, of course–the most giddy of the Web 2.0 enthusiasts–are seldom interested in the long view, the undercurrent, the structures beneath. It’s merely the facade that fascinates. We need the Gormans of the world for the occasional reality check, the slap in the face. And a slap, like a hard look in the mirror, is never pleasant, but it can, at times, be exactly what we, and a culture, need.

    Great forum.

  3. Gorman vs. the straw-people « John Miedema Says:

    […] June 13th, 2007 This week Michael Gorman posted an article in the Britannica Blog: Web 2.0: The Sleep of Reason, Part I and Part 2. Thanks Jessamyn West. […]

  4. jeffb Says:

    Back in 1989 I sold Britannica. I’ve purchased 2 sets of the printed works and bought the CD and DVD versions a half dozen times through the years. I’ve given the great books collection as a gift… I bought several of the children’s encyclopedias for my own children.

    On and off I’ve had free access to Britannica’s website (usually bundled with a purchase). I currently have the DVD version on my computer. I almost never use it. I love individual scholarship; there is no question that, word for word, Britannica’s writers’ articles are better than Wikipedia’s. However:

    1) You have about 1/20 or less the number of articles
    2) The articles are on average much shorter
    3) Wikipedia’s articles link off to additional materials in both their external links and in their reference section.

    The fact is I simply cannot find the materials I’m looking for nor the depth of coverage I’m looking for in Britannica. It isn’t remotely close to “big enough”. And Britannica is far and away the “biggest” encyclopedia in English.

    You are simply failing to address the core of the web 2.0 argument.

    1) The comparison should not be between an individual scholar and an amateur scholar but rather between individual brain cells and amateur scholars. Web 2.0 intelligence works by the sort of statistical averaging processes that go on inside your brain as the MECHANISM OF THOUGHT. The results of those processes are your thoughts, just as the results of Web 2.0 processes are the finished products.

    2) What other method exists for constructing and distributing massive quantities of information in a cost-effective manner? Even if using individual scholars is theoretically better, web 2.0 groups have been able to put together “good enough” materials for a fraction of the cost. They aren’t competing with the theoretical reality of what could be done, but rather with the practical reality of what is done.

  5. John Connell: the blog » Blog Archive » Fighting Talk….. Says:

    […] As Gorman explains, the intellectual life of our society is at stake. This is a critically serious debate that will determine the credibility and the very viability of our information economy. If we want our kids to be ignorant, then accept the fashionable inanities of Web 2.0. If not, join the cause. And fight against the flattening of our culture into a wasteland of collectivist nonsense. […]

  6. Ryan Deschamps Says:

    I think most individuals who blog or advocate for the Web 2.0 theme would reply as Jesus did to the soldier in the Passion:

    “If I have spoken wrongly, testify to the wrong;
    but if I have spoken rightly, why do you strike me?”

    I find that this essay submits arguments that veil ad hominem attacks against an enemy made largely of straw.

    Web 2.0 advocates I know strongly support the participation of experts in wider communities. They have disdain for professors — or anyone for that matter — who would be a “pointy head,” meaning those who would present a view as infallible because of some stated expertise. Their disdain is not for professors in particular.

    I think it is also a mistake to suggest that Web 2.0 advocates see information access as a zero-sum game. A recent Statistics Canada study demonstrated that Internet users also tend to be avid readers (I assume of published materials).

    As Gorman failed to point out, Surowiecki suggests that the crowd can only be wise when members act independently, have a diverse range of opinions, and are coordinated in a decentralized manner. When these conditions are met, the argument goes, the “crowd” will outperform even a group of “experts” working together. Surowiecki cites a number of studies that illustrate this point.
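
    A toy simulation (my own sketch in Python, not anything from Surowiecki’s book) makes the averaging mechanism concrete: many independent, individually noisy guesses, once averaged, land far closer to the truth than the typical individual does.

    import random

    # Illustrative "wisdom of crowds" sketch (hypothetical numbers).
    # Assumption: 1,000 estimators guess a true value independently,
    # each with a large individual error (standard deviation of 25).
    random.seed(42)
    TRUE_VALUE = 100.0
    guesses = [random.gauss(TRUE_VALUE, 25.0) for _ in range(1000)]

    crowd_error = abs(sum(guesses) / len(guesses) - TRUE_VALUE)
    mean_individual_error = sum(abs(g - TRUE_VALUE) for g in guesses) / len(guesses)

    print(f"typical individual error: {mean_individual_error:.1f}")  # roughly 20
    print(f"error of the crowd's mean: {crowd_error:.1f}")           # around 1 or less
    # If the guesses are correlated -- independence fails -- the advantage
    # collapses, which is exactly the condition Surowiecki insists on.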

    For every technophiliac rambling there are 10 commenters who will disagree (on the post or in their own blogs). Those disagreements are often accessed easily through search, and linked via a wide range of tags, RSS feeds and other communication devices. The technophiliac cannot control the actions of the other 10, unless there is some collusion. But even the scientific community cannot say that their research is immune to collective thought.

    To provide an example, I recently read a published book to my son that claimed that “The Great Wall of China is the only object visible from space.” This “fact” also came up in a high school “trivia” tournament I attended.

    It wasn’t until snopes.com and then Wikipedia that I was assured, in detail, that this is mostly an urban myth. In this case, my knowledge base is more accurate than it was in the 1980s, and my son is much better off reading the Wikipedia article than he is reading the published book I provided him. Further, had I gone to Wikipedia first, I would have been led to a variety of other print sources on the Great Wall that would (perhaps) be more authoritative.

    Another theory, though not tested to my knowledge, is that because online resources are naturally suspect and “unexpert”, individuals will approach them with a more critical eye. That is a healthy habit and one that could be used when accessing published material as well.

    In sum, we can harp at examples of poor writing and factual errors, but in the end, there is nothing stopping anyone from combining taught knowledge with their experience (I should add that being “taught” is an experience of its own) to come up with the “truth.” Rumours of the truth’s “death,” I would say, are largely exaggerated.

  7. the goblin in the library › Gorman With the Wind Says:

    […] The biggest irony in Michael Gorman’s two-part blog post entitled “Web 2.0: The Sleep of Reason” (part I and part II) is that he clearly doesn’t understand how the internet (including, but in no way limited to, Wikipedia) works, or he’s willfully misrepresenting how it works in order to make his point. Whichever the case, it means that he’s not an authoritative, reliable source, and his writings on the matter cannot be trusted. […]

  9. The flight from expertise at Arriving Somewhere Says:

    […] http://blogs.britannica.com/blog/main/2007/06/web-20-the-sleep-of-reason-part-i/ http://blogs.britannica.com/blog/main/2007/06/web-20-the-sleep-of-reason-part-ii/ […]

  10. LibrarySupportStaff.Org » Michael Gorman’s Sleep of Reason Says:

    […] Michael Gorman, former president of ALA, has riled up some people with his posts on Britannica Blog titled Web 2.0 : The Sleep of Reason (part I) and (part II). […]

  11. Martijn Kriens Says:

    The intelligence of crowds is not the holy grail, nor is it a silver bullet. But there are some aspects we should think about.

    The fact is that when you look at successful inventions there is always a group of talented people behind them, almost never one individual. Partly because talent attracts other talent, but also because people need others to sharpen themselves. Only through sharing knowledge with others, and taking it from them, are great ideas formed. This infrastructure of the group is an important aspect of real scientific life. That is why there are pockets of excellence in specific places.

    The Internet, however, makes us (partly) free from physical geographical boundaries. People can feel part of a virtual group, including the sharing and taking from each other, and therefore lift each other up.

    In group dynamics, size does matter. And the fact that you can get more highly talented people together does have an extra effect. Not supernatural, but real.

    Some processes simply do not happen when there are not enough people with enough talent.

  12. Greg Park Says:

    Mr. Gorman: you should give us all more credit. We are not inept at sifting good information from bad, and don’t need our hands held by “experts” to help us make those judgments. By letting the “experts” filter information for us, they can tell us what is important, and what we should ignore. If a genre of music is inconsequential to an expert, what business do any of us have knowing about it?

    As an example, let’s take Death Metal. If you’re curious what Death Metal is, research it. Encyclopedia Britannica doesn’t have an entry on it, but it does have an article on Heavy Metal (which may or may not mention Death Metal), which is 909 words long (8.25% of them are available without a credit card number). Wikipedia has nearly 4000 words on Death Metal alone, with extensive links for further research. And none of it costs a penny.

    On the Internet, you could have in-depth advice and live discussion on these topics within seconds, instead of days at a library. To “experts,” these items are trivial and worthless. The focus of the “experts” is, by definition, narrow and elitist. True, Web 2.0 may get things wrong, but I’d wager that the margin for error is close to that of the scholars. But what Web 2.0 may lose in accuracy, it makes up in inclusion, depth, and immediacy.

    The system that you defend, Mr. Gorman, was not established to protect the purity of information, but to keep academics fed.

  13. michael confoy Says:

    I am curious how Gorman “does not understand or is misrepresenting how the Internet or Wikipedia work.” I have been involved in the development of web applications for 10 years now; tell me, how does the Internet work? Pretty nonsensical question. The man is trying to have a serious conversation here, and readers (not sure what else to call them) are all up in arms about something, not sure what, but they deride him for using a blog, for forced expertise, etc.

    I spend my day using social, Internet-based technology as I work at home for a very large company that produces that technology. All I have noted on the use of this technology on the Internet is flame wars (see the responses on part 1 of this article especially) and the false feeling of a virtual group. Virtual is the right word though — not real. Real productivity with these tools only tends to occur when you have seen whom you are working with face to face. Perhaps web cams and meetings in a Second Life-like environment will prove to be beneficial here. But it won’t make up for the fact that Wikipedia tells me that the Ramones’ biggest musical influence was the Beatles.

  14. A.P Peters Says:

    Michael Confoy’s comments before me here are spot on.

    And why is Michael Gorman “unqualified” to speak to these issues? You may not agree with his opinions, but since when should opinion dictate a voice at the table? Of course, all this is the same ol’ story: the equalizers and levelers say they dislike authority and elites; that is, until it comes to the authority to stifle debate, limit discussion, and clamp down on who is and who isn’t “legitimate” to speak, and thereby install their own elite, an elite more akin to their ideological liking. Then, suddenly, authority matters.

    It’s the same simple power struggle to control the agenda and dictate results, all in the guise of greater “openness” and “democracy.”

  15. Benn Says:

    I think the main issues here are credibility, answerability, and academic practice.

    Someone’s IP address is not enough. Scholars, when publishing work, put their reputations on the line. Their names, positions and places of work are all known. Their experience and previous works are usually listed, or are at any rate available elsewhere.

    The academic process is one of rigorously circumscribed (and civil) debate, realised in the form of publications, seminars, peer review and letters.

    No one is claiming that Web 2.0 is trying to usurp the scholar. They are separate, though related, things. But the people BEHIND Web 2.0 should not show an arrogant disregard for the scholar. And vice versa.

    Genuine scholarship and the exchange of opinion and information in an answerable, credible fashion using the web could lead to something really liberating. Though I still contend that it would have to be a closed system that discloses participants’ identities.

    In other words, the technology makes great things possible, but the complete openness of the Web 2.0 project is its fatal weakness.

  16. The Invisible Library » Blog Archive » Gorman Rants, Again Says:

    […] Part 2 is up and Gorman’s not making any stronger arguments: He [Jaron Lanier] defines this “new online collectivism” as “nothing less than a resurgence of the idea that the collective is all-wise, that it is desirable to have influence concentrated in a bottleneck that can channel the collective with the most verity and force.” This “wisdom of the crowds” and “hive mind” mentality is a direct assault on the tradition of individualism in scholarship that has been paramount in Western societies at least since the Renaissance and, before then, can be seen in the Church Fathers and the Greek philosophers, among others. […]

  17. John Connell: the blog » Blog Archive » The Arrogant Imperium Says:

    […] Leading the charge is Michael Gorman. In his posts, “The Sleep of Reason” (Part I and Part II), he cites: “…evidence of a tide of credulity and misinformation that can only be countered by a culture of respect for authenticity and expertise in all scholarly, research, and educational endeavors.” […]

  18. Jonathan Says:

    Michael Gorman’s misunderstanding is that regardless of where the information comes from, it is still artificial. It is the work of another imperfect person. There might be levels of imperfection, but striving to have authoritative sources is really the choice of whom to believe. The best professors in the world allow their beliefs to govern their research (Michael Gorman himself is a good example of this). Because all people are imperfect, whether you choose to believe the professor or the uneducated blogger, it is your freedom to do so. Someone looking for something real and authoritative is not going to look in books or on the internet; they are going to take a walk outside.

  19. Thomas Says:

    Michael,
    Another aspect of “collective intelligence” that I’ve been worried about lately is that it also seems to be a shield for intellectual laziness. Instead of studying a topic or field of learning in depth, you can do a Google search or post a question on Yahoo! Answers and have other people just give you the answer. Supposedly. In fact, if you look at the quality of searches or “collective intelligence” tools like Yahoo! Answers, it’s clear that they fall exceedingly short of the hyperbole voiced by Sergey and Lanier. The reality does not match the theory. These tools do return some level of the “wisdom of the crowd”, but how wise is the crowd these days? Being a contrarian today may be more valuable than ever, considering the ability of the crowd to pursue more meaningless and ill-defined paths in today’s society. In other words, you’re right, and the people who understand that will be better served than the sycophants who refuse to do the intellectual work necessary to excel.

  20. Richard George Says:

    The advent of the internet reduces the profit margin for publishers, including those of the Britannica. The basic complaint against the established publishing houses is as follows:

    Look for any scientific article from the last century and you will find a Reed-Elsevier, ScienceDirect, Nature, etc. website demanding upwards of $30 to read a four-page PDF article.

    At these rates, scholarship is impossible without a copyright library to hand, or a few thousand to invest in PDF fees.

    It’s worth noting that none of that $30 goes to the original authors who invested months of their time performing experiments or deriving mathematical results, none goes to the reviewers whose expertise was actually pivotal in determining the merit of the paper and recommending it for publication.

    Gorman claims that the service publishing houses provide is a pre-filtering of wheat from chaff for the benefit of readers who are too naive to make such distinctions themselves, or who lack the time. No one would disagree that editing and peer review are valuable functions that must be performed.

    The argument of “Web 2.0” is that the production costs of such filtered articles are a one-time expense, while Britannica and its like attempt to charge ad infinitum for the fruits of the same labour.

    In academia, publishing houses are able to arbitrarily fragment their journals in order to increase the number of subscriptions that institutions must maintain. Working at Cambridge University, I frequently find I’m prevented from accessing an article because it is not included in the university subscription.

    Publishing houses’ ultimate responsibility is to generate money for their shareholders, a requirement that they must balance against a need to not annoy their customers ‘too much’.

    Fortunately, Web 2.0 is providing the means to circumvent their racket via new forms of peer review - for instance arxiv.org, which allows draft versions of a document to appear in a form accessible anywhere in the world without restriction or extracting a fee.

    The developing trend is that an abridged article will be submitted to one of the closed-access journals in order to gain the authenticity provided by peer review (the actual value that such a journal adds), while an extended ‘draft’ article can appear on an open site, so that interested people can actually read the work, which is what the author is interested in.

    Web 2.0 may yet force a rethink of strategy and a competitive pricing model out of publishing houses, which will be a good thing both to authors and to those who seek to learn.

  21. David Says:

    Very strange…why is it that so many are willing to suggest that the National Enquirer, obtained from a news stand, is a more authoritative source than the New York Times is when read on-line? The reality is that I can find, review and assess the credibility of 30 or 40 different sources on-line in relation to any given topic in less time than it used to take me to go to the library and find just one single source. The historical record contains plenty of peer reviewed hard-copy publications supporting the contention that the world is flat, the earth is the center of the universe and bleeding to restore the balance of humors is the best way to cure a fever.

  22. iCrowds.net » Blog Archive » Filters Says:

    […] Filters The Britannica is doing something very daring: they are starting a discussion about web 2.0 (like what is the relation between Wikipedia and the Britannica) on a web 2.0 platform (blogs). Sun Tzu, who is no doubt also in the Britannica, would have taught never to fight on the enemy’s terrain. Like I said: daring. […]

  23. Nic Wistreich Says:

    Absolutely second Jeffb and Richard George’s points above. A few further thoughts:

    The ‘expert’ is typically deemed so as a result of academic acumen, which invariably comes with some cultural baggage - class, wealth or nationality - and sometimes, though they may be the last to admit it, an inflexibility in their views, as academic circles demand (try to get 10 expert quantum physicists or psychologists or social scientists to agree). Meritocracies that display the best of wise crowds, such as open source development communities and the better-written Wikipedia pages, help to normalise these extremes. (Of course these web-connected crowds are still culturally biased - 9% of Africa having web access, compared with 90% in South Korea.)

    The dominant cultural expert voice is increasingly motivated not academically, but commercially. As consumers, we increasingly distrust the ‘expert’ - and rightly so given academia’s ever-closer relationship with private interests, be it the ‘dodgy dossier’ on Iraq’s weapons, global warming denial, creationism or, say, a Pfizer-sponsored study into the dangers of talk-therapy.

    If the crowd’s motive is to ‘do no harm’ or ‘always be neutral and accurate’ then it’s more likely to stay on course than an individual, whose interpretation of those ideas may wildly differ from the next person’s.

    There is the assumption in Gorman’s article that people will always continue to absorb whatever nonsense is presented to them without questioning its validity. Yet by encouraging more people to be creators (or amateurs as this article calls them), and as - like in this conversation - views and assumptions are challenged, such creators surely become more aware and in turn critical of the media and knowledge creation process. This can only increase the standard of debate and critical process, while the traditional expert will always be treated like a king - 100 people trying to describe a Higgs boson will embrace and respect someone who just completed her doctorate on the subject.

    Of course there are dangers with this shift, especially as a tiny imbalance at a small scale extrapolated to a million-mind crowd could be ghastly, and the ‘crowd’ at present is far from representative of the planet’s views. But most importantly it brings the discussion, creation and interpretation of substantial cultural work within the grasp of all of us who wish to partake, and can only increase learning and critical faculties, especially as the means for verifying accuracy, authority and accountability improve.

  24. Mark C. Rosenzweig Says:

    I have found the idea of the “hive mind” offensive in its irrationalism and anti-humanism since I first read about it, and that despite the fact that I am a collectivist.

    For me, the real root of this peculiar kind of collectivism Gorman rightly disdains and worries about is not Maoism (although there are some analogies to Maoist ideology, for which, in any case, I carry no brief, to be sure) but anarchism and new-age mysticism.

    Anarchism in the mode of self-styled cyber-libertarianism cannot generate a notion of the “social” (like Thatcher, they believe “There is no such thing as society”). Indeed, the social is its enemy, despite the meretricious coinage, “the social web.” They need to believe, however, after all, that their onanistic “virtual” activity actually adds up to something despite the triviality of its components and the fact that each of the constituents wishes to conceive of themselves as a self-sufficient monad in relation to everything else. This is because they wish to deny the creative tension between the individual and the social, which is a structure and a process they find inconvenient and arduous, and look at the “collective” as something which, spontaneously and in a transcendent, religious sort of move (this is the “new age” element), gives god-like qualities to the junk heap of babble they contribute their little bits to creating.

    This seems to set up a “super-individual,” the so-called “mind of God,” rather more to their taste than the more troublesome and dialectical notion of a society, with its reciprocal demands and responsibilities and norms; yet, nonetheless, they create with it, if only ideologically, a necessary whole-greater-than-the-sum-of-its-parts: a disembodied, objective super-intelligence without the drawbacks of human subjectivity, a knowledge-without-a-knower, forged automatically out of the unintended consequences of cybertechnologically-enabled information transmissions.

    It’s part sci-fi, part “new age,” part comic-book philosophy, and part cyber-anarchist anti-politics.

    There’s a kind of “invisible hand” argument here, but without the rational groundings of Adam Smith’s: one in which the narcissistic personality-types of our times boundlessly maximize their obsessive, repetitive, and sterile self-affirmations, and yet, through some quasi-mystical networking miracle, the result is some kind of cosmic and creative harmony rather than delusion, resentment and frustration writ large.

    The “anti-social ‘social’” of Web 2.0 ideologues is the analogue of the “anti-political ‘politics’” of anarcho-libertarianism. The naive belief that capitalism can transcend itself through technological changes, a kind of crude technological determinism, is a faith for a generation which wants to believe that real relations of power and domination, real structures of oppression and inequality, can somehow be ignored, and that the practices one is engaged in are already the change that one wishes to see; that they are, in effect, already a new world, rather than a simulacrum of the old transposed to a virtual level and imagined to be unfettered by the real relations of power on which it is nonetheless firmly based and utterly dependent.

    Mark Rosenzweig
    American Library Association, Councilor at large
    Progressive Libraians Guild, co-founder
    “Progressive Librarian”, co-editor

  25. Bradley Gardner Says:

    Am I the only one who noticed the irony of posting a critique of Web 2.0 on a blog?

  26. Jonquil Says:

    You quote Andrew Keen as saying, “It suggests that everyone—even the most poorly educated and inarticulate amongst us—can and should use digital media to express and realize themselves,” and you add that his words have a wider resonance—“a world in which everyone is an expert in a world devoid of expertise.”

    It seems unlikely to me that you are seriously supporting the idea that the gatekeepers should prevent the unworthy from expressing themselves. If so, you are expressing solidarity with the 16th-century Church hierarchy that was horrified by vernacular Bible translations because they allowed any believer to form theological opinions. The blogosphere, like any other group of writers, contains the brilliant, the competent, and the clumsy; do you really want to silence the first and second in order to avoid the presence of the third?

    It seems to me to be more fruitful to discuss ways of sifting the wheat from the chaff than to deplore the existence of the entire harvest.

  27. Brian H Says:

    The “irony” Bradley notes above has even more to it; note the misspelling of Librarian by Mark, giving his own title. Also, this whole exercise itself is the kind of feedback-correction activity that the Internet accelerates, almost beyond recognition. It is a hypertrophied form of the very critical commentary which has formed the backbone of qualification of expertise and received opinion of experts throughout the entire post-Gutenberg era, and even before (see Socrates et al.)

    Its speed and messiness are a problem; absence of brains and knowledge is not. There are a lot more of both the latter about than has ever previously been evident. The Wikipedia experiment-in-progress actually explicitly contains mechanisms for sifting through contributions for the most informative, complete, and “accurate”, given that knowledge and theory in so many areas are always in flux. The clichéd observation that the next couple of years will see the accumulation of more information than has been built up in all the previous history of man, all languages and all eras included, is not just a minor buzz-bomb. It requires hugely more efficient and rapid means of assessing, comprehending, and integrating than were ever even contemplated in the print-and-paper world. Can things go off the rails, and errors be perpetuated? Of course, and no guarantees exist that they will be corrected in any kind of timely manner, if at all. I am reminded of the cautionary physics quote: “The universe is not only stranger than we imagine, but stranger than we can imagine.”

    In the face of such huge explosions of exploration and efforts to understand into what may be an infinite expanse of fact and creative fantasy (the latter encompassing both the offensive drivel of C-Rap and the beautiful inscrutability of centuries of Japanese haiku and …) it is well not to explicitly or implicitly insist that the old grey mare of traditional scholarship and academia can and must maintain its mandate and authority to duly deliberate over each proposed interpretation and adjudge them right or wrong. It just can’t bear the load, no matter how hard it tries. What the internet’s “wisdom of crowds” brings to the table is lightning-fast dispersal and feedback adequate to the task of coping with the firehose information flow. It may finally break the historical pattern in, e.g., science, in which the passing of dominant and outmoded theoretical paradigms had to await the death of the lions of yesteryear who retained control of the research purse-strings and who appointed the peers who reviewed and deep-sixed unwanted challenges along with the inadequate research. (That process often had serious real-world consequences. The Australian doctor/researcher who first asserted and developed evidence that ulcers were a curable consequence of over-populations of Helicobacter pylori in the upper gut and stomach was blackballed and suppressed by the industrial complex of doctors and pharmaceutical companies who battened on the long-failed but lucrative treading-water mitigation approach which was the approved and authoritative treatment and theoretical flat earth upon which “medical science” then lived. In the 5-10 years it took him to get a basic hearing and honest checking and testing, thousands suffered and even died from what is now usually a condition treatable with a 2-week course of antibiotics. Go figure.)

    The value of expertise was summarized ably by Bertrand Russell, who said, paraphrasing, that it’s not intellectually safe to be certain of an idea contradicting unanimous experts, but absent such unanimity it’s not intellectually safe to be certain of any opinion.

    In the new world, expert is as expert does. It was always so, really, but it’s just more obviouser more quicker now. ;)

  28. Brian H Says:

    Excuse my “hypobole” above. It should have said, “tens of tens of thousands of thousands suffered and even died.”

  29. Laurie L Says:

    I would like to know the big “why” you, Gorman, have insinuated that a “sleep of reason” has sprung forth from freedom, a courtesy that has in the least been provided by the internet. In every field, except mathematics and the conceptual study of statistics, information is full of biases that spring especially from the ‘experts’. That which you call reason, “Gorman”, is a reality that you have unconsciously gobbled up from these so-called experts, who can be sparingly likened to Babylonian priests that pass around their beliefs, using their so-called “reason”, to bring about selfish and counterfeit results. This Euclidean reason is open to very few sides of debate because it is closed by specific clauses that cut out other viable impulses that emerge from society. In the end, its purpose is to legitimize the appetite of the elite by getting as many people to believe what they want us to believe. Gorman, you seem like an educated librarian. Let me ask you a question: have you ever opened up to the truth of books that have not met your standards of reason? Let me tell you something: the experts that you talk about are slowly being considered by the multitudes as “false gods”. Soon only librarians, old scholars and politicians, stupid people who have given in to your reason, and corrupt scientists, historians, and journalists will take your mumbo-jumbo seriously. Throughout your education you have questioned the legitimacy (not argumentative legitimacy, but the publication legitimacy that you so uphold) of those books sparingly, and as long as your professors said so and as long as they were written to your standards of reason, they inevitably suited your taste. And what happened to your professors who did not suit the status quo? Gorman, I know that you worship books, and are sexually captivated with the amount of legitimacy held by those books; for why else would your write of such blog (unless you were paid to do so). There’s more that I have to say on this, but on some other time. I don’t think that you’ll be waking up anytime soon anyway. (full-stop)

  30. Mark C Rosenzweig Says:

    Laurie L. says: “Gorman, I know that you worship books, and are sexually captivated with the amount of legitimacy held by those books; for why else would your write of such blog (unless you were paid to do so).”

    I guess Michael Gorman is a genuine “bibliophile”, and that _must_ be some kind of sexual perversion or other. At least it sounds like one! Besides, I doubt even Mr. Gorman would deny some libidinal element in the love of books. Either that, or else he’s only in it for the money, which also, pace Freud, has some distinctly erotic aspect! That’s pretty amusing!

    It seems this discussion is on the downward curve which is so characteristic of the dynamic of these kinds of web-based fora. What else would one expect, especially considering whose ox has been gored by Mr Gorman.

  31. SpragueD Says:

    Gorman’s charge of anti-professionalism and hyperbolic claims about “collective intelligence” from the social media acolytes somehow skirts the most obvious underlying dynamic: greed.

    The internet is ever more a place of business and the Web 2.0 moguls have hit on a business model that is rather ingenious — collect enormous amounts of money in seed capital and ad revenue from (free) user-generated content. “Expedient greed masquerading as populism,” I called it in a piece critical of so-called social media.

    When it comes to producing content that we can rely on, experts are costly in the short term but amateurs are costly long term.

  32. Steve Marquardt Says:

    Kindly consider us remote users in rural areas. Michael Gorman could walk about 70 steps in his CSU-Fresno university library to consult published reference works on the shelf, but I live 70 miles from the nearest university library. I would not welcome a return to exclusive reliance upon the printed page. Furthermore, there are items — authenticated government and professional association publications, even — on the Web that I (a former cataloger) do not recall as available in several of the five university libraries in which I toiled during my 34-year career.

  33. JeffN Says:

    Instead of railing on about the faults of Wikipedia, correct the factual errors and move on. Sure there is going to be a lot of junk out there but have some faith in the end-user using the grey matter between their ears to figure out if something is bs or not.

    Trust, but verify.

  34. Web 2.0 good or bad? « Eric Jennings Says:

    […] Web 2.0 good or bad? Posted June 20, 2007 Recently, Michael Gorman, former president of the American Library Association wrote a couple of blog pieces for Britannica Online called, “Web 2.0: The Sleep of Reason.” Here’s a quote which I think is especially appropriate for dissection: “This small example typifies the difference between the print world of scholarly and educational publishing and the often-anarchic world of the Internet. The difference is in the authenticity and fixity of the former (that its creator is reputable and it is what it says it is), the expertise that has given it credibility, and the scholarly apparatus that makes the recorded knowledge accessible on the one hand and the lack of authenticity, expertise, and complex finding aids in the latter. The difference is not, emphatically not, in the communication technology involved. Print does not necessarily bestow authenticity, and an increasing number of digital resources do not, by themselves, reflect an increase in expertise. The task before us is to extend into the digital world the virtues of authenticity, expertise, and scholarly apparatus that have evolved over the 500 years of print, virtues often absent in the manuscript age that preceded print.” […]

  35. Adra Says:

    Since I’ve been instructed to contextualize web 2.0 content against the author (his scholarly credentials, his expertise), I’m dismissing this whole essay in light of the obvious conflict of interest therein. In my corner of the “hive mind,” librarians have become increasingly superficial and arrogant “gatekeepers” of knowledge. It seems Mr. Gorman is trying to whine his way out of obsolescence.

  36. EdVentures in Technology » Diigo Links 06/22/2007 - The Online Predation and Blog-Bashing Edition Says:

    […] Web 2.0: The Sleep of Reason, Part II - Britannica Blog  Annotated […]

  37. re: web 2.0: the sleep of reason « Ghostfooting Says:

    […] woody evans opens his head « recent work re: web 2.0: the sleep of reason June 25th, 2007 I haven’t read every blog response to Michael Gorman’s recent two-parter about why Web 2.0 is bad for us (”It is this latter way of learning [learning through interaction with the human record, that vast assemblage of texts, images, and symbolic representations that have come to us from the past and is being added to in the present] that is under threat in the realm of digital resources.”)… so others may have said this first, but something immediately strikes me as wrong here… […]

  38. Fórum Web 2.0 « Jornalismo e Comunicação Says:

    […] Web 2.0 June 26, 2007 Posted by Luis Santos in Participação, Tecnologia, Internet. trackback The Britannica Blog is hosting a debate about Web 2.0. From the little I have read, it seems to be essential reading for anyone interested in the subject. From one of my favourite ‘bloggers’ - Nicholas Carr - I take two excerpts from his response to an earlier post by Michael Gorman: Contemplative Man, the fellow who came to understand the world sentence by sentence, paragraph by paragraph, is a goner. He’s being succeeded by Flickering Man, the fellow who darts from link to link, conjuring the world out of continually refreshed arrays of isolate pixels, shadows of shadows. (…) What’s happening here isn’t about amateurs and professionals. George Washington was an amateur politician. Charles Darwin was an amateur scientist. Wallace Stevens was an amateur poet. Talent cannot be classified; it’s an individual trait. What’s happening here isn’t even really about expertise or its absence. The decisive factor is not how we produce intellectual works but how we consume them. (…) It’s our mode of consumption that is going to shape our intellectual lives and even, in time, our intellects. And that mode is shifting, rapidly and inexorably, from page to web. […]

  39. Alycia Says:

    […] Publishers, developers of publishing projects, editors, fact-checkers, proofreaders, and the other people necessary to the publication of authoritative texts are all mustache-twirling villains to the digital collectivist.

    Gorman conveniently forgets that this tried-and-true(?) system of editorial checks and balances has recently failed (and in spectacular fashion). South Korean researcher Hwang Woo-suk’s papers in Science were the most notorious incident. New York Times staff reporter Jayson Blair resigned after his fraud and plagiarism were uncovered. And the best example yet: it was bloggers who exposed Reuters photographer Adnan Hajj, who used Photoshop to alter war photos. I hope Gorman enjoys the irony.

  40. how to prove Michael Gorman right « Ghostfooting Says:

    […] woody evans opens his head « patriot act librarian gagging how to prove Michael Gorman right June 27th, 2007 Recently writing about why Michael Gorman didn’t quite get it right with his Britannica blog post about Web 2.0, I referenced Gorman’s reference to Lanier’s “digital maoism” by pointing toward the tightly hyperlinked circle of friends that link in and out of boingboing lots. […]

  41. T. G. McFadden Says:

    As he so often does, Michael Gorman has hit the nail squarely on the head in his recent blog postings on Web 2.0. And, as they so often do, his critics can find neither the nail nor the hammer. Full disclosure: I rarely disagree with anything Gorman writes or publishes, especially his recent work on the nature and content of librarianship (including his book co-authored with Walt Crawford). So we can dispense with ad hominem arguments on both sides of the issue right off the bat.

    Perhaps the most interesting issue that Gorman raises is that of scholarly authority in an online world. And the question he poses succinctly is this: As we move to publishing online (in addition to print, for instance), do we face a new understanding of the nature of “authority”? Is what has traditionally constituted authority in these matters suddenly being transformed into something else entirely? Or, is the concept of “authority” simply doomed altogether in the hive world? Do we now need to speak of “collective intelligence” rather than, say, “collective ignorance” or “collective stupidity”?

    It might be useful to explore, just for a moment, the origin and core meaning of the concept of “authority” in this context. The primal origins of the concept and word, of course, derive from the idea of the “author”—that is, the person(s) in the best position to know about what he or she has written, claimed, said, or otherwise created. Over time, the concept has developed along two parallel lines: (1) the concept of having the power, or right, to do something or compel someone else to do something; (2) the concept of being in a position to have the final say (or at least the most credible say) about a matter. Thus we speak of “authoritative opinion” and “authoritative statement”, and of providing the “authority” for a claim or statement. It is clearly implied that a person or persons may acquire or forfeit authority in the latter sense as quickly as they may be purged from a position of authority in the first sense. But while political downfall, say, can be simply a matter of brute force or weight of numbers, such is presumably not the case with the loss of intellectual authority. Intellectual authority is not a democratic concept at all. This fact about the concept is quite independent of modes or formats of publication or dissemination. Just as the legal concept of an “expert witness” transcends time and place, so the concept of intellectual (or scholarly) authority does not depend on the mode of distribution of opinion and scholarship.

    There are two questions about authority that must be kept scrupulously distinct: (1) What confers authority, or constitutes authoritativeness, and (2) How can we tell who (or what) has it? I may be a saint, but perhaps no one can tell that from outward appearances. Gorman is asking us to consider the second question, really, rather than the first. Once we get beyond the idea that every opinion about any matter is equally worthy of our attention and respect, then we are still left with the question of how to decide whose opinion to trust. This is the question that Michael Jensen has raised in his Chronicle of Higher Education article on the “new metrics of authority” (CHE, 6/15/2007). As we move increasingly to the online publication and dissemination of scholarly and intellectual work, do we need to find new ways to measure authoritativeness? Or does the concept simply evaporate altogether in the hive world?

    Let’s assume for a moment that what constitutes authority, or authoritativeness, can be explained in a fairly straightforward, pre-analytic way—something like the following from Black’s Law Dictionary: “…by reason of education or special experience [having] knowledge respecting a subject matter about which persons having no particular training are incapable of forming an accurate opinion or making a correct deduction.” Surely we can all agree that, if we had to bet our lives on an opinion about something, we would want to bet on the opinion of someone who qualifies as an authority in more or less this way. But how are we going to recognize such an individual when we need him or her? By what markers or criteria are we going to identify a proper authority and not merely a pretender?

    In the halcyon days of print publication in the academic and scholarly worlds, we had fairly well understood ways of doing this (even if they sometimes failed to distinguish the good from the bad; witness the Alan Sokal affair, for which there is a Wikipedia entry, by the way). The distribution of not just “information” but actual scholarly and intellectual content and opinion on the Internet is not merely pushing the limits of traditional ways of marking authority, but rapidly moving beyond any semblance of possibility that we can continue to cling to traditional strategies. Nor should we want to. But that is a far cry from giving up the traditional concept of “authority” itself. We should agree with Jensen (and Gorman) that many of the values of scholarship are not well served yet by the Internet: contemplation, abstract synthesis, construction of argument. Authority by virtue of applause and popularity, Jensen observes, is not a satisfactory substitute. Gorman would doubtless agree that what is needed most urgently as we move on to Web 3.0 (or whatever) are new ways of identifying, publishing, and making available to critical analysis and review the results of scholarly and intellectual inquiry in ways that both preserve the traditional values of authority and yet recognize the sometimes tenuous and fluid nature of that authority.

  42. MediaBlog » Andrew Keen en het beroerde imago van internet Says:

    […] Interesting discussion of the same subject with The Sleep of Reason and From Contemplative Man to Flickering Man (Nick Carr). And a story, The blog dimmed tide is loosed, by Scott Rosenberg of Salon. […]

  43. Thoof: The End of Culture as We Knew It, cont. Says:

    […] UPDATE 6/16, via Matthew Ingram’s [blog], I found my way to the debate on the anti-expertise trends of Web 2.0 over on the Britannica blog: Michael Gorman on [The Sleep of Reason] and Nick Carr’s [response]. Resonates with issues I’m trying to raise here. […]
