The Democratic Republic of Cyberspace?

About the author
Bill Thompson, new media pioneer, has been working in, on and around the Internet since 1984. Formerly head of new media at the Guardian newspapers, he writes a weekly column, the BillBlog, for BBC News online.

The internet we know today traces its lineage back to 1969 and the ARPANET research network that first linked together computers at four US universities. Although it is often presented as a triumph of the free market and a prime example of how the invisible hand can create something of great social value, the net is in fact the product of public investment and an operational model that allowed levels of cooperation and consensus of which the private sector is simply incapable.

There are two reasons for its success. First, for ten years after it was created in 1983 the internet was generally ignored by politicians, policy-makers, campaigning organisations and almost everyone outside the circle of university researchers who were building it. And second, those who created the standards, built the physical network and wrote the code were interested in creating something that worked, not something that satisfied interest groups, promoted any particular agenda or met with the approval of anyone except themselves.

This article forms part of the “Peer Power: Reinventing Accountability” debate. AccountAbility, openDemocracy’s partner in this debate, will hold a major event, “Accountability 21: Reinventing Accountability for the 21st Century” on 3-5 October in London.

This is no longer the case. Over the last decade the net has gone from being a largely academic pastime to become a key part of the infrastructure of the burgeoning network society. The old mechanisms have broken down as the bodies defining the net’s technical architecture have become more distant from ordinary users. This has created a democratic deficit that leaves the future development of the network open to capture by two very powerful interests – private corporations and national governments – to the exclusion of civil society.

The conflict between them, both seeing themselves as the appropriate locus for the network’s development, is clearly shown in the debate which is hotting up ahead of November’s World Summit on the Information Society (WSIS). A “working group on internet governance” (WGIG) was convened after the December 2003 WSIS meeting, and it has come up with a range of proposals for the governance of the internet, governance it defines as

“the development and application by governments, the private sector and civil society, in their respective roles, of shared principles, norms, rules, decision-making procedures, and programmes that shape the evolution and use of the internet."

Most of the discussion circles around the future of ICANN, the body set up in 1998 by the US government to manage the domain name system and the allocation of IP addresses. As a result wider issues will remain largely unaddressed.

WGIG’s proposals assume that the technical aspects of managing the internet can be separated from policy issues. The idea is that international bodies can be left to agree the technical standards and network architecture while national governments deal with content regulation, community standards and other “political” aspects.

This is not a feasible proposition. In Code (Basic Books, 1999), internet law professor Lawrence Lessig showed how the characteristics of the software that creates the network constrain what we can do online. Lessig’s dictum that “code is law” cuts both ways, and a clear but largely unexplored implication is that our political decisions must be implemented in software if they are to have any effect.

It is not enough for politicians to believe that they can will the ends without engaging in the means: if the government of China wants to restrict access to websites promoting democracy it has to build (or buy) firewalls, routers and filters. This also implies that any political requirements have to be reflected in technical standards, so national governments will inevitably end up attempting to influence or drive the core technologies. Attempts to separate the technical aspects of the network’s operation from the political imperatives of individual governments will simply not work.
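The point that a political decision only takes effect once it is encoded in software can be made concrete with a toy sketch. The blocklist, hostnames and function names below are invented for illustration, not drawn from any real filtering system:

```python
# Toy illustration of "code is law": a policy to block certain sites has no
# operational effect until it is written into the software that handles
# traffic. All names here are hypothetical.

BLOCKED_HOSTS = {"democracy.example.org"}

def allow_request(hostname):
    # The filtering policy exists, operationally, only as this check.
    return hostname not in BLOCKED_HOSTS

allowed = allow_request("news.example.com")        # permitted
blocked = allow_request("democracy.example.org")   # refused
```

Change the set, and the policy changes; delete the check, and the policy ceases to exist. That is the sense in which the political requirement lives inside the technical standard.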

The internet is both a place and a way of communicating, and the space it defines is one that we find hard to govern. It is not separate from the real world, but it is connected to it in ways that challenge existing hierarchies and social structures. Instead of relying on these existing structures to find a solution for internet governance, we should aim to govern the internet in accordance with its own principles: those of distributed responsibility, disintermediation and peer review.

Old Models, New Structures

In this we can learn from the spirit of the early internet. This spirit is exemplified by one of its key technical bodies, the Internet Engineering Task Force or IETF. Formed in 1986, it remains the main technical body determining internet standards. All of the core network protocols, from the internet protocol that glues the networks together to the complexities of sending multimedia attachments with email (MIME), were decided there and promulgated to the wider net-using community through a series of documents called, with admirable humility, “Request for Comments” or RFCs (they can all be read at www.faqs.org/rfcs).

The IETF was, during its most important period, accountable to the entire net community by virtue of the fact that its meetings were open to all comers, with no need for accreditation or authentication. Its processes were all transparent and it reported in depth on the very network it was dedicated to creating. Anyone could propose, discuss, suggest modifications to and criticise any standard or proposed standard.

The IETF remains in place, but corporate interests and government demands have undermined it to the point where new standards are almost impossible to agree, or are fatally compromised by corporate selfishness. Current proposals to reduce spam email, for example, are unlikely to move forward because Microsoft refuses to accept that its own technology will not be adopted as the basis for a standard.

Government interference is also rife. Many of the issues the IETF currently faces have emerged because political requirements cannot be absorbed within the technical discussion. Those responsible for the core network architecture are either unaware of or actively opposed to any attempt to take social and political aspects into account. Clearly a way needs to be found to let these aspects into the debate.



Power to the People

The network makes collaboration easy and we have many examples of joint working, information-sharing and data distribution to inspire our new approach to net governance:

  • At the program level, peer-to-peer networks confound the efforts of the copyright police to enforce nineteenth-century models of intellectual property
  • Skype users rip apart the telephone companies’ business plans and offer only free calls in return
  • Members of large online communities manage contact lists that run into the thousands
  • Users of auction sites use reputation and prior history to decide whether to rely on a seller

Internet governance does not need a simple-minded direct democracy, where standards are proposed and voted on by the ill-informed masses, but a true deliberative forum that takes full advantage of the affordances of the internet itself and extends membership to all who wish to engage.

Such a forum is in danger of being dominated by the same interests that have already captured the existing structures, so we need to take the more radical but very desirable step of taking governments and corporations out of the technical space entirely. The simplest way to achieve this is to restrict membership of the deliberative community to individuals, with no corporate, governmental or other organisational representation possible.

Instead of representatives, we will have the people speaking without intermediaries: members of a massive, distributed online community which determines internet standards. It would be set up and managed by an international body (on which more later) which backs those standards with research, reference materials and test implementations of software, with advanced community management tools making the whole system work. In this way the net could be the key to its own salvation.

The forum will be a combination of eBay’s auctions, the Second Life virtual world and the Slashdot community, based around a reputation system that gives greater weight to the opinions of those who have provided sound advice and shown good judgement in the past.

Unlike blogging platforms, which measure only incoming links and cannot distinguish between a populist ranter and a reasoned debater, this reputation system will rank participants on many different axes and provide a means of deciding whether to listen to someone. Governments and companies will, of course, provide people who will join the group and argue their employer’s case, but every member’s affiliations, commercial interests and prior history will be clearly visible to all, and the scale of membership will make it unfeasible to manipulate its deliberations.
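The article proposes a multi-axis reputation system but does not specify an algorithm, so the following is only a minimal sketch of how reputation-weighted tallying might look. Every name, axis and number here is hypothetical:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of reputation-weighted voting: each participant
# carries per-topic reputation scores, and votes on a proposal are
# weighted by reputation on the axis relevant to that proposal.

@dataclass
class Participant:
    name: str
    affiliation: str  # visible to all members, per the proposal
    reputation: dict = field(default_factory=dict)  # axis -> score in [0, 1]

def weighted_tally(votes, axis):
    """Tally yes/no votes, each weighted by the voter's reputation on
    the given axis; newcomers get a small default weight."""
    total = 0.0
    for participant, vote in votes:
        weight = participant.reputation.get(axis, 0.1)
        total += weight if vote else -weight
    return total

alice = Participant("alice", "independent", {"routing": 0.9})
bob = Participant("bob", "Example Corp", {"routing": 0.2})
result = weighted_tally([(alice, True), (bob, False)], "routing")
# a positive total means the proposal carries under this toy scheme
```

A real system would need to address how reputation is earned and decays, but even this sketch shows the design choice at the heart of the proposal: influence follows demonstrated judgement rather than institutional affiliation.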

All discussions and all choices will be in the public record, so tools available to analyse voting patterns, like those at PublicWhip, will show whether a particular individual always votes the Microsoft line: perhaps they are an employee or a lobbyist with a declared interest, or perhaps they are just an enthusiast for Windows.
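The PublicWhip-style analysis the article has in mind amounts to measuring how often an individual’s recorded votes match a given bloc’s. The proposal identifiers and votes below are invented for illustration:

```python
# Hypothetical sketch: given public voting records, compute the fraction
# of shared divisions on which a member voted with a given bloc.

def agreement_rate(member_votes, bloc_votes):
    """Fraction of divisions, present in both records, where the
    member's vote matched the bloc's vote."""
    shared = [d for d in member_votes if d in bloc_votes]
    if not shared:
        return 0.0
    agree = sum(1 for d in shared if member_votes[d] == bloc_votes[d])
    return agree / len(shared)

member = {"rfc-001": "yes", "rfc-002": "no", "rfc-003": "yes"}
bloc = {"rfc-001": "yes", "rfc-002": "no", "rfc-003": "no"}
rate = agreement_rate(member, bloc)  # agrees on 2 of 3 shared divisions
```

A rate near 1.0 over many divisions would flag the kind of consistent bloc-voting the article suggests should be visible to every member.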

Once decisions are made then national governments will have to judge whether their political goals can be achieved using the international standards or whether they need to deviate from them. But at least the standards will be set in a way that does not allow government and corporate agendas undue influence.

This is not a perfect solution. Firstly, it fails to address one of the core problems facing any form of net governance: how to represent the interests of those who are not yet connected but will be in future, the next five billion users. Some method must be found to give them a voice.

Secondly, central guidance cannot be taken out of the equation altogether. Responsibility for the system must clearly lie with the United Nations (or one of the agencies in the UN system, such as the International Telecommunication Union), creating a direct connection between the highest levels of the UN system and ordinary users. Putting aside the current debate surrounding its future, those of us with faith in the UN must work for its reform and rehabilitation. Giving a UN agency responsibility for the net is one way of achieving this.

Without a way of reconciling the vastly differing interests of private companies, public bodies, governments and civil society, the prospects for the internet’s future as anything other than a heavily-censored, highly inefficient and privately run data network are poor indeed. It is time to strip out the intermediaries and the vested interests and return net governance to the people.