It has become fashionable to worry about Google's market power and economic clout, as in the Business Week cover story on 9 April 2007: "Is Google Too Powerful?" It elaborates the question thus: "Will the vast commercial landscape of the Net, like so many other tech markets in the past, condense to one dominant force for the foreseeable future? Will we just Google everything?"
But we should be asking a bigger question about Google and the rest of the algorithmic web: what power do they have over human understanding?
Alain Finkielkraut, the French philosopher, makes the philosopher's case against Google, Wikipedia, the blogosphere, Digg et al in a March 2007 conversation on France Culture. The web is destroying "the work" - the article, the novel, the essay, the film - by bringing us information without any of the carefully crafted context that can transform information into knowledge or even wisdom. A Google search is the result of a populist, market-like algorithm in which the most popular "goods" (items of information) command the highest "price" (rank on the search page). Populist, atomistic, consumerist: as happens to other commodities, the reduction of "works" to "information" eliminates the subtleties that make cultural productions distinctive. As David Levi Strauss has it, web surfing is eliminating the trace, the permanent, and turning culture into mere momentary flow.
Also in openDemocracy on the "Google question":
"Google: search or destroy?" (16 December 2005) – a roundtable with Roger Clarke, Chris Cresswell, Michael Handler, Moyra McAllister, Matthew Rimmer, and Sarah Waladan
Andrew Brown, "What does Google know about you?" (24 January 2006)
Becky Hogge, "Some grown-up questions for Google" (1 February 2006)
The techniques we have of producing knowledge shape the knowledge that we come to have. Hand-copying of books in monastic libraries creates a culture of sparse commentary in the margins of a sacred canon - time is scarce, paper is scarce, and reproduction takes priority over original production. The economics of the printing press change all that: the fixed costs of the print-run create the rewards of mass media, and shape a world capable of consuming mass media: education, literacy and advertising all develop apace. This logic continues into film, television and radio: the world of the entertainment-industrial complex. It is the world of "the work", of the celebration of the author, the world for which Finkielkraut is already anticipating his nostalgia.
The internet has changed the economics of knowledge-production, and it will therefore transform what we know. Finkielkraut quotes Jorge Luis Borges's story "The Library of Babel", which imagines a library containing all that could ever be written.
"When it was proclaimed that the Library contained all books, the first impression was one of extravagant happiness. All men felt themselves to be the masters of an intact and secret treasure...
... As was natural, this inordinate hope was followed by an excessive depression. The certitude that some shelf in some hexagon held precious books and that these precious books were inaccessible, seemed almost intolerable."
Prescience from Borges: it is this hopelessness before the vast quantities of data on the web that the various indexing and search services are busy relieving. Google, del.icio.us, Digg and Wikipedia all use some form of "crowd wisdom" to give shape to information that would otherwise be un-navigable. PageRank, Google's patented and cashogenic algorithm (it earned the company $10 billion in the last three months), is a scheme to compute the number of votes that are cast for a web page. Every piece of content on the web is in a constant referendum, each link to a page representing some number of votes that depends on the stature of the linker. Digg is an even less subtle referendum: a story gets thumbed up or down by a click of the mouse and rises or falls in salience accordingly. A Wikipedia article is in a Darwinian battle: if a reader cares sufficiently that it should be mutated, she can cast a vote by modifying the article. Wikipedia works by the survival of the editorially resilient - like its natural counterpart, this evolution creates wonders as well as unlikely monsters of the deep.
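The voting mechanism behind PageRank can be sketched in a few lines. This is a toy illustration of the idea, not Google's actual implementation: each page splits its current score evenly among the pages it links to, and a damping factor (0.85, as in the original PageRank paper) stands in for a reader who sometimes jumps to a random page instead of following links. The three-page "web" below is invented for the example.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with an equal share of votes
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Votes arriving at p: each linking page q shares its rank
            # equally among its outgoing links.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * incoming
        rank = new
    return rank

# A tiny invented web: two pages both link to "hub", so "hub" wins the referendum.
toy = {"hub": ["a"], "a": ["hub"], "b": ["hub"]}
ranks = pagerank(toy)
```

Run on this toy graph, "hub" ends up with the highest rank and the unlinked-to "b" with the lowest: popularity among linkers, weighted by the linkers' own popularity, is the whole of the algorithm's notion of importance.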
The shaping of knowledge by crowds through these algorithms changes what we know and therefore changes us. When, aged 15, I scoured my father's library for anything on love, I found Werther, Stendhal and, somewhat camouflaged, Frank Harris (the Edwardian fantasist and pornographer). The library - for good and ill - told a story - in this case my father's, which was becoming mine - through its selection and organisation.
Repeating the exercise on Google, I find on Stendhal's page "Digg - Amazing Flash Animations about Love" alongside "Deus Caritas Est" from the Vatican and "Gay Men Spend $62m on Love". The next day, looking for a screenshot, I find that the ranking has changed: Stendhal's neighbourhood is now a different list of results.
There is no meaning to this list. It is ordered by PageRank, a sort of "price" in the bazaar of information. The contiguity of Kevin Kelly, Stendhal, Jason Mraz and Love Psychics has as much significance as the fact that in the market for goods, Penguin's edition of On Love costs $11.25 on Amazon, as does a half-pound of Kona coffee or an oriole feeder. Their only link is that they have the same weighting in the "crowd vote" that is the market.
The virtual bazaar
Google is a bazaar for information. The web-sceptic fears not just majority tyranny from PageRank, but the more subtle ways in which we are all changed by the technologies of knowledge. Change - in our lives as well as across the whole world - is fuelled by the divergence between realities and beliefs. We believe there is anthropogenic climate change; we believe that a better world should do X, Y and Z to address it. The divergence of what is and what ought to be produces change at every level. So the way that we produce knowledge has a double impact: it determines what we think reality to be as well as our beliefs on how it ought to be - the evaluative is itself a complex function of what we know. As I click around the Google pages "on love", a new reality is intimated.
Borges also predicts that the Library of Babel would produce the book-burners, those desperate to find and impose by force a line through the endless information. Their attempts are futile, says Borges, because the material really is endless: burn one book, and another, almost identical, is immediately there to take its place.
But despite Borges's prescience, should we follow his pessimism? Amazingly, Google shares some of it. Eliot Schrage, Google's man in Washington, raised an uncomfortable thought in his keynote address to George Washington University's annual Politics Online Conference in March 2007:
"Are we turning the Internet into a tabloid, destroying the truth? ... We're in an awkward position, because we don't assess truth or falsity."
This comment came in the week that "Hillary '1984'", an anonymous video on Google's YouTube, demonstrated the political consequences of the information explosion: candidates have lost control of their images, parties cannot deliver orchestrated access to media, because the message is shorn of any of the context that allows the viewer to assess credibility. The web and social search have been like a great detonation in the old machines of knowledge-making. They have spread a débris of information, atomised pieces strewn here and there. Who will put Humpty Dumpty back together again?
The community answer
Communities will: groups held together by interest, shared history and reputation. This has always been the source of meaning, and this is the strategy that Borges forgets in the Library of Babel. Being part of a community on the internet means understanding and appreciating, explicitly or implicitly, how it organises knowledge. BBC News - I think I understand the organisation that makes it, the reputation it needs to maintain and the motivations of those that work inside it - is where I will look for trustworthy up-to-the-minute information, now, for example, about the violence in Somalia. BBC online, in a sense, lives in the aura of its pre-internet brand, as do some of the newspapers and universities. The information does not speak for itself; the community around it does. I remember the broad lines of the Somali breakdown; the country's longest-"serving" lack of any government makes it mysterious and fascinating. But to understand it from the perspective of a self-critical insider, I naturally look to openDemocracy's coverage - again, no analysis speaks for itself, but the accumulation of history, archive and commentary creates credibility.
This works at every level of the web: when I look for advice on buying a new kite-surf, I need to find the energetic community of serious practitioners. If I really want to understand the advice, I need to lurk and participate, understand the tone of the discussion and distinguish the shrill from the serious voices. When I want a sense of whether stock-market turmoil is froth or fundamental, I want to find a community of insiders, like Roubini's and the commentary he attracts, and to gauge their tone today. Communities create understanding out of information as worms make compost.
The credibility of that understanding is a different matter. Communities often group around conspiracies and fundamentalisms with more vigour than they do around subtle distinctions and perspectives. The web and its algorithms empower communities of all sorts: credibility has no necessary correlation with truth or with the beliefs of a considered judge. But if we want to understand how we are changing our world through what is known, we need to understand the technology of social search together with the sociology of the communities that bring legitimacy to belief.
If algorithms such as PageRank are like the market, web communities are like the range of products and organisations that the market sustains. The same algorithm that matches supply and demand is equally compatible with a world of tulip-manias or property-manias, gas-guzzlers or "voluntary simplicitarians". The algorithm underdetermines the outcome.
Joseph Schumpeter, the Austrian evolutionary economist, was preoccupied by "das Adam Smith Problem": is the defence of laissez-faire economics in The Wealth of Nations consistent with Adam Smith's insistence on the social importance of personal virtues in The Theory of Moral Sentiments? Was Adam Smith's chair of logic inconsistent with his chair of moral philosophy? Do people have to be good for the invisible hand to do its work, or will the invisible hand work its magic even in the face of private vice?
Das Google Problem is directly analogous: will the algorithms organising information under Web 2.0 bring us credible, trustworthy information? To answer "yes" is to believe in an "invisible mouse" whose operation, unbeknownst to each link-maker and clicker, leads to a harmonious organisation of knowledge. But in The Theory of Moral Sentiments, Smith explores the necessary preconditions for the operation of the invisible hand: a world of virtues, some natural, some social and sustained by social practices.
The same applies to Das Google Problem: web communities are emerging within the new spaces shaped by social-search algorithms, communities in which epistemic virtues are developed and which will make the invisible mouse roar. But what are the preconditions for these communities to produce knowledge of the quality that the old hierarchies produced? And what new technologies will enable or hinder that progress? The big questions for the production of meaning have become sociological and technical: this is the environment that, one way or another, will put Humpty Dumpty together again. Into what shape, who knows.