Chapter One: The Origins and Growth of The Internet and the World Wide Web
Richard T. Griffiths (Leiden University)
The internet is an innovation (or rather a series of innovations) that enables communication and transmission of data between computers at different locations. It is a very recent development, but that does not mean that we cannot analyse it historically, using the concepts that we apply to other innovations in the past. Basically, the discussion among historians about the causation of inventions tends to boil down to two main approaches: the first emphasises the role of science and ideas, the second the material environment in which inventors worked.
The second approach, that of the material environment, in turn usually devolves into two sub-questions: did supply constraints (shortages and bottlenecks) drive the search for improvements, or did new and expanding markets (demand) make innovation worthwhile?
To illustrate this, let us turn to the early eighteenth century and the start of the industrial revolution.
The 'science school' faces the difficulty that many of the early inventions were not really scientifically based. The earliest textile machinery was made largely of wood and depended for its success on various combinations of levers, pulleys and spindles. The steam-engine was admittedly more intricate, but it was not until the early 19th century that its principles were correctly described. And the series of anonymous advances in the size and shape of blast furnaces, their linings and the various mixes of fuels and ores all took place within the existing corpus of knowledge among iron-masters themselves. Nonetheless, the science school emphasises the growth of learned societies, the rise of a new environment of experimentation and the growth of a corpus of experts working in these new areas (and the over-representation of non-conformists and Scots among them is explained by the fact that their religion excluded them from the higher echelons of traditional careers). It emphasises the rise of a 'scientific method' rather than the role of formal science: the asking of new questions, the methodical pursuit of experiments and the insistence on precise measurement. The material school that emphasises supply constraints has a much easier time of it. The earliest textile innovations came in weaving, which created a bottleneck in the spinning of yarn.
Figure 1: Flying Shuttle
This, in its turn, led to a cluster of innovations in spinning (and later the introduction of steam-engines).
Figure 2: Arkwright’s Water Frame (1769)
Figure 3: Crompton’s Mule (1779)
Figure 4: Hargreaves’ Spinning Jenny (1765)
which created a new bottleneck in the weaving sector. And this led, in the 1830s, to the invention, and continuous improvement, of power-looms.
The first steam-engines were pumping engines (water and air) introduced because the need for fuel meant the sinking of ever deeper mine shafts which had to be kept ventilated and unflooded. They were highly inefficient in terms of fuel consumption, which didn't really matter as long as they were situated at mine-heads.
Figure 5: Newcomen engine (1705)  Figure 6: Boulton-Watt engine (1776)
It was only when their efficiency was improved and they were adapted to rotary motion that they could be applied elsewhere.
The experimentation in blast-furnaces was equally dictated by fuel shortages (caused by the disappearance of forests) and the exhaustion of supplies of rich iron-ore. Thus, the argument goes, in each case shortages and bottlenecks drove up the prices of critical materials and stimulated the search for improvements.
Figure 7: Abraham Darby’s iron works at Coalbrookdale (1709)
Before leaving the 'supply school' it is worth adding a caveat - some of these inventions depended on innovations elsewhere before they could be realised. For example, it is all very well to invent a steam engine, but if you cannot make pistons and shafts to the right tolerances, cannot make sheet metal of consistent quality or do not have appropriate welding techniques, you cannot build it - the works will jam and the boiler will buckle and explode.
The 'demand-siders' would argue for the importance of new markets. They would point to the population growth and urbanisation in the United Kingdom in this period, to improvements in transportation and to the development of overseas markets. This works best, of course, when dealing with consumer goods. For textiles they would point to the greater number of British inhabitants to clothe and the reduced opportunities for household production. They would observe the shift in preferences from woollen textiles to cotton and to lighter fabrics in general, and they would certainly emphasise the coincidence of the development of spinning technology and the introduction of steam with the fact that the Napoleonic Wars had given the British not only unfettered access to their own colonial markets, but to those of France and the Netherlands as well.
The Industrial Revolution illustrates the problem for the historical analysis of inventions. Someone has to have the idea and, unless we are going to be satisfied with a 'heroic inventors' school of explanation, we will be driven to search for the social and scientific context. Yet an invention has to be feasible and it has to be applied if it is to have any effect. This means that it has to be worthwhile, either because it allows one to do something better or cheaper (supply-side) or because it allows one to make more of something, or something entirely new (demand-side). It is impossible to isolate one factor, though the balance in the explanation varies according to the circumstances. While the supply-side explanation seems to have the stronger claim to 'importance' in the so-called 'First Industrial Revolution', this is far less so in the 'Second Industrial Revolution', dating from the 1870s, which was based on chemicals and electricity and associated not only with new production processes but also with a whole new range of consumer goods, from motor vehicles to consumer durables, sustained by higher real incomes.
The Internet is a system for allowing computers to communicate with each other. It goes without saying that before we get the Internet we have to have computers. Much of the information for this section was derived from The Virtual Computing History Museum. The first step towards the modern computer was Samuel Morse's invention in 1844 of communication using electric impulses: a key and a special code that mapped sequences of pulses to the letters of the alphabet. We won't get bogged down in whether Morse was actually the inventor of the telegraph (or his partner Alfred Vail in 1837), since you can find all you ever wanted to know about the topic at The Telegraph Office page.
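Morse's scheme is, at heart, a lookup table from letters to sequences of short and long pulses. The sketch below is purely illustrative (the table uses the familiar International Morse values rather than Morse's original American code, and the `encode` helper is my own):

```python
# Illustrative sketch of Morse's idea: each letter maps to a fixed
# sequence of short ('.') and long ('-') pulses. The table covers only
# the letters needed for the demonstration message.
MORSE = {
    "A": ".-", "D": "-..", "G": "--.", "H": "....", "O": "---",
    "R": ".-.", "S": "...", "T": "-", "U": "..-", "W": ".--",
}

def encode(message: str) -> str:
    """Encode a message letter by letter, separating letters with spaces.
    Characters not in the table (including spaces) are simply skipped."""
    return " ".join(MORSE[ch] for ch in message.upper() if ch in MORSE)

print(encode("What hath God wrought"))
```

The operator's key simply transmitted such dot/dash groups as timed electric pulses, with pauses marking the boundaries between letters and words.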
Figure 8: Babbage’s Difference Engine (eventually built 1991)
Figure 9: Morse’s telegraph (1849)
The next step is to link this particular invention to another of man's perennial strivings, the creation of a calculating machine. Although calculators have existed since the wire-and-bead abacus first appeared in Egypt around 500 BC, one could say that the first main step towards the modern computer was Charles Babbage's experiments in the 1820s-1840s to build a "Difference Engine". I used the words 'one could say' deliberately because Babbage never actually managed to complete his engine - the Science Museum in Kensington built a working Difference Engine in 1991 to celebrate the bicentenary of his birth. The idea of digital calculation was taken a step further by Herman Hollerith, who developed digital processing machines to assist in compiling the 1890 US Census. Hollerith's firm was merged in 1911 into the Computing-Tabulating-Recording (C-T-R) Company, which was renamed IBM (International Business Machines) in 1924. Babbage's and Hollerith's ideas for digital computing, however, seemed to have led to a dead-end, with most scientists preferring to develop techniques for analog devices, based on slide-rule principles.
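The principle Babbage mechanised, the method of finite differences, can be sketched in a few lines. For any polynomial the n-th differences are constant, so once the first few values are computed directly, every further value needs only repeated addition - exactly the operation a geared machine performs well. The function name `tabulate` and the example polynomial below are my own choices for illustration, not Babbage's notation:

```python
def tabulate(values, steps):
    """Extend a table of polynomial values using only addition.
    `values` must contain enough seed entries (degree + 1) to fix the
    constant difference; `steps` is how many further values to generate."""
    # Build the difference table from the seed values.
    diffs = [list(values)]
    while len(diffs[-1]) > 1:
        row = diffs[-1]
        diffs.append([b - a for a, b in zip(row, row[1:])])
    # The last row is constant for a true polynomial; each new table
    # entry is produced by adding each difference row back up the table.
    for _ in range(steps):
        for i in range(len(diffs) - 2, -1, -1):
            diffs[i].append(diffs[i][-1] + diffs[i + 1][-1])
    return diffs[0]

seed = [41, 43, 47]       # p(0), p(1), p(2) for p(x) = x^2 + x + 41
print(tabulate(seed, 4))  # extends the table by four more values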
Analog machines, too, could get pretty big, as the Differential Analyzer built at the Massachusetts Institute of Technology (MIT) in the 1930s reveals. But machines of this size were also running up against the frontiers of their capabilities and, by the end of the 1930s, new interest was being shown in digital devices. By now a whole host of devices associated with the development of the telephone (switches, relays etc.) and radio (vacuum tubes) would extend the possibilities of any solution. But what accelerated developments was the outbreak of World War II.
The war produced two major bottlenecks that were solved by digital machines. In the US, the need for gun-firing tables, navigational tables and tracking and aiming devices for anti-aircraft guns resulted in 1944 in the development of the first large-scale automatic electromechanical calculator, the Harvard Mark I, built by IBM. Note that it did not have an inbuilt program; its operating instructions were fed in on paper tape. A second pressing need was to break the German (and Japanese) codes quickly enough to be useful. This work was undertaken by British scientists at Bletchley Park, and it culminated in the construction of the Colossus, which became operational in 1944. This was more advanced than the Harvard Mark I, but its subsequent impact was limited by the fact that its very existence remained a classified secret until 1970.
The War had produced a considerable advance in design technology, but basically we were still at the stage of large and complex calculating machines. The challenge was to produce a device with an internal stored memory, a leap that would take us from calculators to computers proper. The war had also created a pool of scientists with experience in digital computing, and work in advancing the technology proceeded rapidly on both sides of the Atlantic. If we are looking for the first modern computer, the credit should go to Manchester University, whose prototype, Baby, became operational in June 1948, followed soon by a full-scale operational model, the Manchester Mark I. The next major step, the incorporation of Random Access Memory, came three years later with the Whirlwind constructed at MIT.
Figure 14: Manchester Mark I (1949)  Figure 15: Whirlwind (1951)
Until now computer advances had been developed either for various branches of government or as prototype units within universities. In 1951 Remington Rand entered the market with the UNIVAC computer, largely in an effort to recoup the cost over-run on its contract with the US government, which had originally ordered the device for the census. A year later, it started producing ready-made software (although the term did not come into use until a decade later). IBM, which had previously specialised in punch-card systems, entered the market with its 700 series in 1953. Offering a 60 per cent discount for educational uses, IBM quickly came to dominate the university market. Computers were now spreading quickly through the business and scientific communities, becoming ever faster and ever more user-friendly. They were also becoming smaller. By the end of the 1950s, transistors were beginning to oust cumbersome vacuum tubes, and in 1958/59 the first 'integrated circuit' on a piece of silicon was produced: five components on a piece 1 cm long. The 'chip' was born, and it entered commercial production in 1961.
Figure 16: IBM 7090 (1961)
In 1961 MIT demonstrated its 'Compatible Time-Sharing System' (CTSS) on a modified IBM 7090/94, which allowed separate terminals in different offices to access the same hardware. The concept of "remote access" to a "host" computer had become reality. And if you could link to one computer from a desktop terminal, why not to another... why not to all?
Sources of figures