The Actually Existing Internet: The Disappearing Internet (1991–2010)

Liam Mullally

7th February 2024


In this blog I’m continuing to sketch the changing shape of the actually existing internet over the fifty or so years of its existence. This entry picks up where the previous one finished: the beginning of the 1990s. With ARPANET officially discontinued, PCs now a fixture of many homes and commercial ISPs on the rise, a new formation of the internet was beginning.

The 1990s brought significant changes to the software of the internet. Usenet, which had been around since the end of the 1970s, had been a major draw for home internet users, but it would soon be superseded by the arrival of the World Wide Web. The Web was first proposed by Tim Berners-Lee in 1989, made public in 1991 and finally open source in 1993; together these events signalled the beginning of the Web’s ascendancy and the beginning of the end of Usenet as a mainstream communications technology, although it remains accessible today. (And contrary to perceptions, Usenet traffic has continued to increase year on year since the early 90s – the Web is still not the only game in town!) Also in 1993, Mosaic was released, the first internet browser to support embedded sounds and images. Mosaic was soon displaced by Netscape Navigator, which became the first truly hegemonic internet browser before being superseded by Internet Explorer later in the decade, then by Mozilla Firefox (forked from the Netscape code) in the 2000s, and eventually by Google Chrome. From the mid 90s onwards the internet quickly became synonymous with the World Wide Web.

Figure 1: NSFNET traffic from 1988 to mid-1994; traffic builds steadily for several years before increasing rapidly in the early 1990s.

The rise of the web coincided with a reformation of the hard infrastructures of the internet. The rapid expansion of the internet’s reach via ISPs in the early 1990s brought a massive increase in internet users and their traffic. By 1991 the NSFNET had become the largest channel of internet traffic in the US, but was still (officially at least) restricted to use by government agencies and universities. Acknowledging this infrastructural role (and the difficulty of enforcement), restrictions on the network were dropped, giving access to private companies (including ISPs). In line with its new prominence, NSFNET received a major upgrade the same year; at the same time, Al Gore’s National Information Infrastructure plan laid out plans for growing private sector involvement in the internet. By 1994, traffic on the network had increased around 7.5 times (from approx. 8 billion packets in June 1991 to approx. 60 billion in June 1994) and even this upgraded backbone infrastructure was at capacity. The problem demanded significant planning and development; instead, the Clinton administration saw it as further justification for privatisation, and NSFNET infrastructure was divided up and handed to the private sector. Though little attention was given to this landmark at the time, by 1995 the US internet backbone had been fully privatised. From this moment on, internet infrastructure (or at least, American internet infrastructure) no longer operated on a mixed public/private model but as a fully privatised network, subject to some regulation (and less than that applied to telecommunications).

Figure 2: The division of the NSFNET into local backbones in 1995, after which it was replaced by competing commercial backbones.

Breaking up NSFNET

The breaking up of the NSFNET required some formalisation of relationships between different kinds of networks – competing regional backbones, internet service providers, and so on. In the US, this manifested materially in NAPs (Network Access Points), which joined up the regional networks of the NSFNET. After privatisation, NAPs were owned and operated by private companies, many of which had been ARPA contractors working on previously publicly owned infrastructure. Because NAPs were the locations at which separate networks (with different private operators) met, they became the sites at which new rules for inter-network exchange were developed. In a break from the principle of full transit which characterised the earlier internet, hierarchies of access were established: organisations with large amounts of important infrastructure began to charge smaller organisations – those selling internet access to home users or running organisational networks (and ultimately users, now conceived of as customers) – for access to their networks. If you imagine the internet as a road network, every road had become a toll road.

But this customer-provider model was not universal; for a smaller number of organisations with key infrastructure, the routine sharing of traffic was a technical imperative. These agreed to carry each other’s traffic without charge – what is today called settlement-free peering. For the biggest, mainly but not only American, infrastructure operators, full transit never went away. (This is effectively a form of network planning, albeit one mediated by the market.) The terms Tier 1, 2 and 3 (still used today) originate here, and characterise the financial relationships under which a network operates. Tier 1 networks transmit all their data via peering agreements and charge customers to carry data; Tier 2 networks have to pay some Tier 1 networks for transit but have their own customers; Tier 3 networks have no peering agreements and sit at the bottom of the hierarchy. One legacy of the move away from full transit is that deals can be made today for preferential treatment – so, for instance, a large media delivery platform like Netflix negotiates directly with infrastructure operators. Very quickly, any promise of “universality” from Gore’s National Information Infrastructure plan was gone, but the private sector involvement was not.
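The economics of these customer and peering agreements can be sketched in code. The network names and link labels below are hypothetical, not a map of any real topology; the check encodes the so-called “valley-free” routing rule that such relationships imply: a route may climb through providers, cross at most one peering link, then only descend through customers.

```python
# rel[(a, b)] describes the link as seen travelling from a to b:
#   "c2p"  - a is a customer buying transit from provider b (a pays b)
#   "p2c"  - a is a provider carrying traffic down to its customer b
#   "peer" - a and b exchange traffic settlement-free
REL = {
    ("tier3_isp", "tier2_isp"): "c2p",
    ("tier2_isp", "tier1_a"): "c2p",
    ("tier1_a", "tier1_b"): "peer",
    ("tier1_b", "tier2_other"): "p2c",
    ("tier1_a", "tier2_other"): "p2c",
    ("tier2_other", "tier1_b"): "c2p",
}

def valley_free(path):
    """Check whether a route respects the settlement hierarchy:
    climb through providers, cross at most one peering link,
    then descend through customers - never climb again."""
    links = [REL[(a, b)] for a, b in zip(path, path[1:])]
    seen_peak = False  # True once we have crossed a peer or p2c link
    for link in links:
        if link == "c2p":
            if seen_peak:
                return False  # climbing again after the peak: a "valley"
        else:
            if link == "peer" and seen_peak:
                return False  # at most one peering crossing allowed
            seen_peak = True
    return True

# A small ISP reaching another network via its providers and a peering link:
print(valley_free(["tier3_isp", "tier2_isp", "tier1_a", "tier1_b", "tier2_other"]))  # True
# A route that dips through a customer between two providers - the
# customer would be carrying its providers' traffic for free:
print(valley_free(["tier1_a", "tier2_other", "tier1_b"]))  # False
```

The second route is rejected because no one would pay for it: the middle network would be giving free transit to two of its own providers. Routing, in other words, now follows the money.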

The NAPs were very much key infrastructure, from which information infrastructure companies could leverage a preferential position in this hierarchy; throughout the 1990s, powerful infrastructure companies were able to replace a significant number of peering relationships with customer ones (Shane Greenstein’s book How the Internet Became Commercial (2015) is especially strong in its account of this). There was some controversy surrounding alleged preferential treatment of NSFNET contractors in this transition, including IBM but especially Advanced Network & Services (ANS), a nonprofit corporation founded by the NSFNET partners. NAPs were generally replaced by Internet eXchange Points (IXPs), third-party meeting points between networks which facilitate the interchange of traffic. IXPs remain an uncharacteristically visible (or at least easily rendered visible) feature of internet architecture today.

Figure 3: While from its exterior it looks like an ordinary office building, One Wilshire in downtown Los Angeles is a characteristic example of an Internet Exchange Point, a role it has played since 1992 (see more in Dourish, 2015).

The post-privatisation internet

The battles of the post-privatisation internet, with private organisations vying for predominance, depart greatly from the collaborative routing of the 1980s, although the protocols themselves did not change massively. Instead, the economic surroundings of the protocols were transformed, with private companies servicing the connection to a public/common resource and space – that of communication. One of the conceptual difficulties of this period is that the new communications technologies seemed to take on a communal character; at the time this was characterised as a digital third place. In other words, during the heyday of the “free and open internet”, the actual public infrastructures of the internet were being ruthlessly outsourced and privatised – with an overwhelming focus on and widespread praise of the rapid growth of internet connectivity, little attention appears to have been given at the time to how this change in ownership might come to undermine its free character.


Free-Nets struggled in this new landscape, gradually ending as the decade progressed. Without a regular or robust form of funding, the Cleveland Free-Net was one of the last to shut down in 1999. With it, hopes that the openness of the web might apply to hardware, not just software, dwindled.

Such growth, or its private mode, was not limited to the US. In the UK, work began on a superfast fibre-optic network, SuperJANET, in 1992. This academic network was roughly analogous – and connected – to the NSFNET, but didn’t become a backbone in the same sense. Instead, BT (then only recently privatised) and a number of new dial-up network providers began to expand a network largely piggybacking on existing telephone lines and routes. Fibre-optic cables were being laid in the UK during the 1990s, but for cable television, not internet connections (although these same lines would later be repurposed for internet connections, largely by Virgin Media).

Figure 4: A map of academic computer networking in London from May 1999.

Rapidly expanding international connection partially decentred the US within the internet, and a system for allocating AS numbers and IP addresses globally was formalised. The network remained skewed – most Tier 1 providers were based in the US – but it was not unipolar. The RIPE Network Coordination Centre, responsible for Europe, was established in the Netherlands in 1992, with equivalent organisations following for East Asia (1993), North America (1997), Latin America and the Caribbean (1999), and Africa (2004).
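The regional registry system described above can be summarised as a small lookup table. The registry names are those the organisations are known by today; the regions and founding years follow the text.

```python
# Regional Internet Registries (RIRs), which allocate AS numbers and
# IP address blocks within their regions. Years as given in the text.
RIRS = [
    {"name": "RIPE NCC", "region": "Europe", "founded": 1992},
    {"name": "APNIC", "region": "East Asia and the Pacific", "founded": 1993},
    {"name": "ARIN", "region": "North America", "founded": 1997},
    {"name": "LACNIC", "region": "Latin America and the Caribbean", "founded": 1999},
    {"name": "AFRINIC", "region": "Africa", "founded": 2004},
]

def registry_for(region_keyword):
    """Return the name of the RIR whose region matches the keyword."""
    for rir in RIRS:
        if region_keyword.lower() in rir["region"].lower():
            return rir["name"]
    return None

print(registry_for("Africa"))         # AFRINIC
print(registry_for("North America"))  # ARIN
```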

Mapping the shape of the internet also became increasingly difficult. The expansion of ISPs and the interconnection of the global internet made simple network maps like those of the 70s and 80s impossible to maintain. New stretches of internet architecture were being constructed every day, with the network growing, seemingly organically, at a rate which exceeded easy measurement. The kinds of network topology inference techniques used today had not yet been developed (the first internet-wide AS-level inference was carried out in 1997), and so the actual shape of the internet in this period is surprisingly undocumented today. Without records of its structure, it is hard to say anything at all about the internet infrastructures of the 1990s other than that they were expanding rapidly in a somewhat decentralised manner, propelled by the popularity of Usenet and then, even more so, of the World Wide Web.
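The AS-level inference mentioned here works, in its simplest form, by mining BGP routing tables: every advertised route carries an AS path, and each adjacent pair of networks in a path implies a link between them. A minimal sketch, using invented AS numbers from the private-use range:

```python
# AS paths as they might appear in BGP routing table dumps.
# These paths are invented for illustration only.
observed_paths = [
    [64500, 64501, 64502],
    [64500, 64501, 64503],
    [64504, 64501, 64502],
]

# Each adjacent pair of ASes in a path is an inferred link; sorting
# the pair means the link is recorded once regardless of direction.
links = set()
for path in observed_paths:
    for a, b in zip(path, path[1:]):
        links.add(tuple(sorted((a, b))))

print(sorted(links))
# [(64500, 64501), (64501, 64502), (64501, 64503), (64501, 64504)]
```

Real inference is far messier: it must contend with path prepending, route aggregation and the partial visibility of any single vantage point, which is partly why the 1990s internet is so poorly documented.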

Figure 5: A map of global Usenet traffic flow, 1993.

In fact, to look at visualisations from the period (and up to the present!), you might think that all of a sudden in the mid 1990s the internet evaporated and became immaterial. Yes, this is in part a problem of representation – strictly speaking, internet topology changes every time you plug a printer into a local network or connect a phone or laptop to a wifi network – but the absence of graphical representations of even high-level infrastructure clearly indicates a change in how these material networks were conceived. It is not a coincidence that the internet seemed to become immaterial at exactly the moment it was also made private, and that the last legible maps of the US internet backbone coincide with the period just after its privatisation. Here, as elsewhere, Capital looks to recede from prominent view.


In the UK the situation was very similar. ISPs and internet backbone operators maintained graphical maps of their infrastructures during the 1990s, but both the academic interest in monitoring these, and the industrial interest in making them public waned in the first few years of the 2000s.

In 2003, one of the first major acquisitions of private internet infrastructure occurred. Genuity, BBN’s ISP, was acquired by Level 3 Communications, itself a spin-off of a construction firm (Peter Kiewit Sons, Inc.) which had built fibre-optic cable infrastructure for DARPA in the 1980s. Through the deal, Level 3 obtained AS1, the first AS number ever allocated. Level 3 operated as an ISP, but its main business was not selling internet access to end users; instead, it brokered access between other networks. It quickly became one of a growing number of obscure private firms (“information services” companies) that own and operate huge amounts of core internet infrastructure. These actors might be distinguished both from older telecommunications companies (although they are in direct competition) and from ISPs, which sell access directly to consumers and are therefore much better known and more visible.

There is an irony that this period contains both the growth of public digital commons and the total privatisation of internet infrastructures. Much of the so-called “digital commons” can trace its origins back to the early cultures of the academic internet (which was hosted on public infrastructures). While the privatisation of internet infrastructures did allow for a rapid expansion of access, the “commons” only ever existed in software. Beneath the surface, the total reorganisation of the logics of the internet had already begun.

Selected bibliography

  • Rajiv Shah and Jay Kesan, “The Privatization of the Internet’s Backbone Network”, Journal of Broadcasting and Electronic Media 51 (2007), pp. 93–109
  • Shane Greenstein, How the Internet Became Commercial: Innovation, Privatization, and the Birth of a New Network (Princeton, NJ: Princeton University Press, 2015)
  • Paul Dourish, “Protocols, Packets, and Proximity: The Materiality of Internet Routing”, in Lisa Parks and Nicole Starosielski (eds), Signal Traffic: Critical Studies of Media Infrastructures (Urbana: University of Illinois Press, 2015)

Liam Mullally is a CHASE-funded PhD candidate in Cultural Studies at Goldsmiths, University of London. His research interests include digital culture, the history of computing, information theory, glitch studies, and the politics and philosophy of noise. Previously he has worked as a copywriter and tech journalist. He is working on several projects with Autonomy, from skills commissions to policy strategy.