The Power of Knowledge: How Information and Technology Made the Modern World

Jeremy Black

Print publication date: 2014

Print ISBN-13: 9780300167955

Published to Yale Scholarship Online: September 2014

DOI: 10.12987/yale/9780300167955.001.0001


Information Is All

Chapter:
(p.337) 13 Information Is All
Source:
The Power of Knowledge
Author(s):

Jeremy Black

Publisher:
Yale University Press
DOI:10.12987/yale/9780300167955.003.0013

Abstract and Keywords

In the twentieth century, information was received and appreciated as facts and facts as information, with scientific facts becoming the most significant type of fact. This chapter discusses the authority of science and its significance as a focus of information. It describes how the appreciation of new technology and techniques transformed culture and society and the relationship between humans and machines.

Keywords:   information, facts, science, technology, techniques, culture, society, machines

‘We are so close. You’ll see it all within fifty years. Human cloning. Gene splicing and complete manipulation of DNA. New species. Synthesis of human blood and all the enzymes. Solution of the brain’s mysteries, and mastery of immunology.’

Lawrence Sanders, The Sixth Commandment (1979), a novel

Facts and Science

THE SEARCH FOR the future, to unlock, foster, force and present it, became more important to modern culture in the twentieth century. This impulse had been present in the nineteenth century, but was more apparent in the twentieth. A cult of youth, which eventually became worldwide, contributed to the stress on novelty as part of a series of interacting social factors that led to what has been termed a culture of anticipation.1 At the same time, embracing the potential of change appeared functionally necessary, in the competitive nature of international power politics, in response to rapidly developing technologies and in order to meet public expectations of rising living standards and the creation of social capital such as health services. This situation helped bring science to the fore.

So too did the cult of the fact, as, in a context of change that was not easy to understand, the scientist became a more prominent figure. Information was received and appreciated as facts, and facts as information, with scientific facts becoming the most significant type of fact. The cult of the fact again looked back to the nineteenth century and reflected broader cultural trends. The idea of society as composed of facts, represented by facts, understood through facts and symbolised by facts, challenged conventional hierarchies and traditional moralities, a process which, across the world, posed particular problems for conservative societies and élites. In place of conventional moralists came (p.338) authority for those who could classify and analyse facts, and, even more, those who could transform them.

This transformation was offered by politicians, most of whom were ideologues of some type or other, but they drew on the authority of the human and natural sciences, as these provided a sense of a kind of progress that must and could be shaped. In 1963, Harold Wilson, the leader of the Labour Party, promised the party conference that he would harness the ‘white heat’ of the technological revolution for the future of the country. Wilson saw scientific socialism as a commitment to modernisation. As Prime Minister, in 1964, Wilson founded a Ministry of Technology and renamed the Department of Education to encompass Education and Science.

At the same time as knowledge across the sciences became more specialised, technical and hard to understand, so public policy occupied an increasingly difficult position, at once indebted to technical expertise and yet facing a political class and democratic society that might be unwilling to accept this expertise. Both facts and policies could be unwelcome, and the very nature of information in a democratised age that was open to all encouraged the expression of different views and gave them an apparent position of equivalence. The controversy over climate change in the early twenty-first century is an all too relevant example.

There is also a longer history in which, alongside the tension between professional expertise and popular response, the role of institutional and social contexts is considered in the establishment of knowledge. This was the argument of the Polish medical microbiologist Ludwik Fleck (1896–1961) in his Entstehung und Entwicklung einer wissenschaftlichen Tatsache (Genesis and Development of a Scientific Fact, 1935).2 These institutional and social contexts varied greatly across the world. Alongside, and in part explaining, these variations, science – as a key constituent of modern culture and almost a definition of ideas of progress3 – linked practices of military competition, economic development and social progress with ideologies about politics and knowledge. The ability to secure technological change helped ensure substantial public and private investment in science, and across the world there were far more scientists and scientific institutions than ever before.

The authority of science, and therefore its significance as a focus of information, rose greatly. In his investigations of compliance with harsh orders described in Obedience to Authority: An Experimental View (1974), the American psychologist Stanley Milgram (1933–84) discovered that most of his subjects were willing to administer electric shocks to other subjects if told by an authority figure that it was in the cause of science. Indeed, the extent to which scientific exposition was moving in a less deterministic way, with quantum mechanics, Big Bang cosmology and the mathematics of chaos all (p.339) suggesting a degree of contingency and unpredictability in the real world, did not percolate through to much of the public.

Moreover, alongside science and scientific language came a continuing popular commitment to traditional forms of the irrational such as astrology, as well as increased backing for fashionable movements such as spiritualism. In large part, these beliefs were add-ons in a society suffused with technology, and often frightened, or at least uneasy, about it; but, as aspects of a broader anti-modernism, they could also be aspects of a more pointed criticism of science in part because it was attacked for leaving insufficient room for individuality.4 Changes in religious culture were also pertinent in the developing public culture, notably the relative decline of established churches and the greater emphasis both on sects and on individual choice.

The commitment to science as a product of, and guide to, objective reality, and to the best means to use this reality, also reflected the assertion of the necessity and importance of change, an element strongly present in controversies between (supposedly) scientific and other values.5 This assertion was seen across the political spectrum in the West, but also helped politicise science because the nature and direction of change were contentious. As a consequence, there was a political role in scientific activity irrespective of more blatant episodes of governmental interference.

A prime instance of the latter was Stalin’s Soviet Union where ideology was forcibly preferred to the autonomy of intellectual processes.6 Aside from this pressure, science was significant to Communism as it was presented as a union of reason and materialism, and thus opposed to the religious values held to be inherently irrational and to act as a brake on progress.7 Moreover, science appeared necessary if man was to overcome the constraints of the natural environment. However, what the Party line on progress and in science would be, and what that would actually entail, were often unclear.

Technological Change

In addition to the political dimension, and linked to it, new technology and techniques transformed culture and society across the world, and created an understanding of the modern based on an expectation of the new. Appreciating the new thus became an important aspect of information, as well as a key way by which it was presented for purposes of affirming national and sectional success, whether in totalitarian or democratic regimes.8

Technological change in the availability of information both altered human capabilities and transformed the relationship between humans and machines. From the perspective of the technology of the present, the discussion of change focuses on recent decades because it was then that the technologies of today (p.340) and the foreseeable future developed. Thus, attention is devoted to the computer and to related systems and machines. However, it is also pertinent to note the dominant, as well as changing, technologies and policies of information acquisition, transmission and storage in the first half of the twentieth century. The emphasis on change has to juxtapose the understandable interest in new forms, both their invention and their application, with the development of already existing forms.

Most prominently, despite the rise of long-distance verbal communication with the telephone from the late nineteenth century, the written word long remained more important, for both distant and local communication, and as part of a situation in which reading and writing retained the prominence they had gained in the second half of the nineteenth century with the rise of literacy. In 1900, only 5 per cent of American homes had telephones. Compared to three billion items in 1876, 31 billion were posted in Europe in 1928, a rate of increase far greater than that of the population; the volume of telephone calls in Europe did not exceed that of posted items until 1972. Literacy, indeed, became more prominent across the world, with continued state support via education being complemented by a demand for literacy to ensure economic opportunity and social mobility.9

Alongside, and as a major part of, the continuing importance, indeed growth, of established means of conveying information came the ongoing significance of conventional content. In defiance of the idea that the expansion of the mass media would necessarily lead to modernisation in the form of democratisation, social freedom and secularisation, groups not conventionally associated with those trends also proved able to embrace the mass media. This ability reflected the complexity and diversity of modernisation.10 Thus, in the USA, religious evangelists successfully turned to broadcasting, with figures such as Paul Rader, who described radio as a ‘new witnessing medium’, and Aimee Semple McPherson being particularly active in mid-century.11 ‘I Believe’ was one of the American hit songs of 1955 on the radio. It is worth listening to as an instance of the continued importance of the Bible and belief as sources and means of information, rather than those favoured by the ‘Doubting Thomases’ decried in the song. In referring to the latter, the songwriter employed the Bible itself as the basis for categorising alternative views. The evangelical usage of radio and television has remained important to the present day.

Offering another use of biblical information, Felix, one of the potential suitors for Sarah Nancy, a visitor in a Texan small town in Horton Foote’s one-act stage comedy Blind Date (1985), proposes, as a game, seeing who can name the largest number of books of the Bible. Moreover, Sarah Nancy’s aunt, Dolores, recommends to her niece a list of topics for conversations that illustrates the (p.341) fields in which opinion and information were required and believed appropriate, including which side was going to win a particular football game, whether there was enough rain for the cotton yet and what the best car was on the market. As a key definition of social links and aspirations, it was also necessary to ask a suitor what Church he belonged to.

At the same time, there were attempts, notably in the USA, to encourage a dialogue between science and religion. Thus, the Moody Bible Institute, opened in 1889 as the Chicago Bible Institute, founded the Moody Institute of Science and the American Scientific Affiliation in 1941.12 As another aspect of a link, many senior American clerics saw the development of atomic weaponry as necessary to the safeguarding of religious and political freedoms against Communism and defended it accordingly. An acceptance of scientific enquiry extended, moreover, to the past. In 1979, Pope John Paul II admitted in a speech that an injustice had been committed against Galileo. This admission led to the Vatican Study Commission and, in 1992, to another papal speech in which the attempt by senior clerics in effect to retry Galileo was closed, the implication being that a key lesson of the affair was the underlying harmony between science and religion.13

As far as changing technology is concerned, aerial photography has already been discussed in the previous chapter, and it again indicates the importance of the first half of the twentieth century. So too do the development of radio and the greatly increased use of the telephone. A sense of potential was seen with the coining in 1904 by Edouard Estaunié of the word telecommunication, tele meaning ‘distance’. Most dramatically, the technologies of human travel and the transportation of freight, including letters and newspapers, were transformed with the invention of powered human flight, which began in 1903. Aeroplanes accelerated the potential of moving not only people, but also messages in the form of post. More generally, they offered a potent image of modernity based on movement.

Electricity offered a still more insistent image of modernity, at the level of the household as well as in the economy, transport and information transmission. The use of electricity, encouraged by the introduction in the 1890s of alternating current (AC), was seen as a key aspect of modernisation, notably in the USA, the Soviet Union, Britain and Japan. This modernisation was depicted in terms of progress, especially in the 1920s and 1930s.14 The cult of the dam – for example, the Hoover Dam in the USA and the dams on the River Don in the Soviet Union – was an aspect of this iconography and ideology of electric power, for hydroelectric generation was regarded as cleaner than that using coal, and as taming nature. It therefore acted as a precursor to later hopes about nuclear power.

Electricity also promised a new domestic environment, one characterised by devices, such as the vacuum cleaner and the electric hob, and associated (p.342) with cleanliness and labour-saving. In a 1938 novel, John Rhode captured change in a British cottage: ‘Everything here’s absolutely up to date … all the latest gadgets – tiled bathroom, latest type of gas cooker, electric refrigerator, coke boiler for constant hot water … a labour-saving house.’ Radios were part of this new environment, which, focused on consumerism, looked towards the later expansion in the West of information technology in the context of electric-powered household devices.15

The telephone, invented in 1876, spread as a tool of government and business, as well as privately. By 1910, there were nine million telephones in the USA, and the number there rose rapidly in the following decades. Only 42 per cent of British households had a telephone in 1972, but the greater prosperity of the following decades ensured a major expansion in ownership. Telephone systems also developed in Europe, where, in 1925, it was decided to link them in order to create a long-distance communication system. In the developing world, however, the use of telephones remained small-scale and, during the 1930s, there was only a modest rise in their global use. Nevertheless, emulating the telegraph cables of the previous century, Europe was linked to the USA in 1956 by underwater telephone cable, the TAT-1 (Transatlantic Telephone Cable) installed by Bell. There were also attempts to spread telephone applications. Telephone banking was successfully introduced in the 1980s. However, the picturephone, developed in the USA by AT&T in the 1970s, proved a failure, partly due to its high cost, but also as a result of privacy concerns.16 Nevertheless, VoIP (Voice over Internet Protocol) enables picturephones and videophones today.

As with electricity, the greater use of aircraft, radio and the telephone encouraged the development of a system. This, more than the initial invention, attracted investment and emulation. Networks, moreover, helped in the increasing level of interaction between business people and scientists.

As with earlier information systems, there was also a pronounced political dimension. In particular, the desire for international communications competed with political and commercial interests. For example, in response to the apparent attempt by the Marconi Company to ensure a radio monopoly by refusing to interconnect with rivals, the 1906 World Radio Conference adopted the principle of universal intercommunication. However, rather than the focus being on global connections, new systems were frequently regarded as means to strengthen and align states and empires, and also served to link them to a sense of purposeful change. In the largest empire, the British, the sustained interest in cheaper cable and postal rates, a commitment by the Empire Press Union from its foundation in 1909, was joined by the promotion of air links, notably in the creation of an imperial airmail service from Britain to Australia via India. Flying boat traffic across the empire rose from three million letters in 1928 to 17.5 million by 1934. (p.343) At the same time, imperial links provided an opportunity for competing commercial strategies and political aspirations within the empire, notably in Australia, New Zealand and India. American competition in providing a route to New Zealand across the Pacific was also an issue.17

The emphasis on national and imperial interests was enhanced in the Depression of the 1930s as it reduced trade between international blocs more severely than that within them. As a result, the pre-First World War liberal internationalism that had been partly revived in the 1920s was superseded, with the use of new technology affected by a nationalistic corporatist patriotism. This process had already been seen in the formation in Britain in 1929 of what became Cable & Wireless; in 1932, state control was increased over German communication companies. The USA moved towards a ‘military-information complex,’18 the eventual global influence of which in part rested on the dismantling of the British form of imperial governance, notably through the Bermuda Telecommunications Agreement of 1945.19 There was also American resistance to British aviation interests. Similarly, in the 1930s, the USA used diplomatic pressure to resist German attempts to expand air services in Latin America and instead sought to make the continent’s air links an American monopoly. In parallel, the Japanese Ministry of Communications sought to further imperial control by extending telegraph, radio and telephone networks in East Asia.20

Radio

Alongside aircraft, radio technology rapidly improved. The initial longwave radio technology was supplemented by shortwave, which was developed in the early 1920s. International services began, from Britain to Canada, in 1926. Shortwave was faster, and the concentrated signal ensured that it was more reliable and less expensive to operate, although it was still slower and more expensive than telegraph cables. Crystal sets were replaced by valve sets in the 1930s, aiding reception. The use of radio greatly increased. Whereas there were two million radio sets in the USA in 1924, there were fifty million by 1940. As a result, revenues and profits for broadcasters and radio manufacturers rose. In 1929, the combined annual gross network revenue for the Columbia Broadcasting System (CBS) and the National Broadcasting Company (NBC) was $19 million; in 1935 the figure was $49 million, and in 1940 $92.6 million. In the USA, the power at stake was commercial, as advertising revenue was crucial and was linked to the sponsoring of particular programmes.

Like many forms of information, radio had a presence both in the consumer world and in that of government, with military needs playing a major role, specifically the goals of imperial defence, which entailed long-range communications. The British and American navies played significant roles in the (p.344) development of radio. Moreover, research, technology and goals were all linked in the related development of atmospheric physics. Cultural assessments were also significant in research, notably in the idea of discrete layers in the atmosphere, such as the ionosphere, an idea that appeared appropriate given the normative form of classification by means of different categories that were hierarchically structured.

These were not the sole processes that played a role in the development of radio. Domestic, political and social assumptions were significant. In Britain, radio broadcasts began in 1922 and the first political broadcasts in 1924. The British Broadcasting Corporation (BBC), a monopoly acting in the ‘national interest’, was established in 1926. The BBC helped give radio a national character, and the performances of individual politicians on it were regarded as significant politically.21 In part by supporting the establishment and consolidation of public broadcasting authorities around the empire, and by cooperating with them, the BBC also served as a way to integrate the empire, notably the settler diaspora.22

The creation by means of the media of common memories was also a feature in other countries, encouraging the development of national cultures,23 although regional identities could still remain significant.24 Advertising fostered national products as well as a desire for change, with many advertisements for cars and ‘white goods’, as well as other goods that were seen as aspects of the modern.25

In the USA, radio was seen as encouraging an active citizenry, although it did not engage adequately with African-American views.26 Carried over the airwaves, President Franklin Roosevelt’s ‘fireside chats’ played an important role in creating a sense of national community in the 1930s, and were also important in the evolution of the presidency. Radio news became more prominent, notably after the Munich Crisis of 1938, when radio passed newspapers as the preferred news source in the USA. The position of individual American radio journalists on the developing world crisis became of great significance and controversy. Radio’s ability to create an impression of nearness and, thus, of both danger and commitment, was significant. At a dinner in New York on 2 December 1941, hosted and broadcast by CBS, Archibald MacLeish, the Librarian of Congress, told Edward Murrow of his reporting on the German air attacks of the Blitz in 1940: ‘You burned the city of London in our houses and we felt the flames that burned it. You destroyed … distance and of time.’27 Aside from reporting and propaganda, radio was also militarily significant during the Second World War, notably in providing communication with distant units, such as submarines.

After the war, the invention of the transistor at Bell Laboratories in Murray Hill, New Jersey, in 1947 made smaller radios (and other equipment) possible, prefiguring (p.345) the process that was to be seen with the computer. Like the laser, the transistor reflected an understanding of the wave-like character of the electron that arose from research in the 1920s. Bell Laboratories, the research arm of the telephone company AT&T, played a central role in telephonic research and sought to provide ‘universal connectivity’.28

Long-range communications were celebrated as achievements and the means to a better future. In the early 1960s, British commemorative stamps looked back, with issues for the tercentenary of the establishment of the General Letter Office (1960), the Conference of European Postal and Telecommunications Administrations (1960), and the centenary of the Paris Postal Conference (1963). However, there was also a readiness to engage with technological developments. This was seen with stamps for the opening of the Commonwealth Trans-Pacific telephone cable between Canada and New Zealand (1963) and the opening of the Post Office Tower in London (1965), and with the depiction of a telecommunications network and of radio waves in the stamps for the centenary of the International Telecommunication Union (1965). In 1969, the commemorative Post Office Technology stamp series put an emphasis on new developments, especially the stamp for Telecommunications which showed Pulse Code Modulation.

Television

Television provided another rapidly developing form of technology. The world’s first regular public television broadcasting service was launched by the BBC in 1936. John Logie Baird, who developed the world’s first television set and, in 1926, gave the first public demonstration of the technology, relied on mechanical parts; in 1937, the BBC decided instead to use the rival Marconi-EMI system, which utilised electronic components in both television sets and cameras. National regulation and government control were key features in the development of television, but free-market societies proved more willing to accept competition. It was under Conservative governments that, in Britain, commercial television companies, financed by advertising, were established in 1955, and that the first national commercial radio station, Classic FM, followed in 1992. Commercial power played a major role, while also presenting the challenge for broadcasters of needing to sustain advertising to stay on air.

Affluence, credit and choice helped ensure that television ownership across the West shot up in the 1950s; in Britain, the numbers of those with regular access to a set rose from 38 per cent of the population in 1955 to 75 per cent in 1959. By 1994, 99 per cent of British households had televisions and 96 per cent had colour televisions. In the 1990s, the already increased number of terrestrial television channels was supplemented by satellite channels, the receiving dishes (p.346) altering the appearance of many houses, just as television aerials had done earlier. Satellites also provided new opportunities for newspapers by allowing dispersed printing and thus improved penetration of a number of markets. Moreover, by the 1990s, more than 70 per cent of British households had video recorders, giving them even greater control over what they watched.

Alongside the continued importance of newspapers, television succeeded radio as a central determinant of the leisure time of many, a moulder of opinions and fashions, a source of conversation and controversy, an occasion for family cohesion or dispute and a major household feature generally. A force for change, a great contributor to the making of the ‘consumer society’ and a ‘window on the world’, which demanded the right to enter everywhere and report on anything, television also increasingly became a reflector of popular taste. Just as radio helped to provide common experiences – royal Christmas messages from 1932, King Edward VIII’s abdication speech in 1936, the war speeches of Winston Churchill, prime minister from 1940 to 1945, heard by millions (as those in the First World War of David Lloyd George could not be) – so television fulfilled the same function, providing much of the nation with common visual images and messages.29 Over twenty million British viewers watched the Morecambe and Wise Christmas comedy specials annually on BBC1 in the late 1970s and early 1980s.

This process really began in Britain with the coronation service for Elizabeth II in 1953, a cause of many households purchasing sets or first watching. Thanks to television, the royals almost became members of viewers’ extended families, treated with the fascination commonly devoted to the stars of soap operas. The Royal Family documentary of 1969 exposed monarchy to the close, domestic scrutiny of television. Indeed, both the ‘New Elizabethan Age of Optimism’, heralded by Elizabeth II’s accession in 1952, and discontents in the 1990s about the position and behaviour of some members of the royal family, owed much to the media; the same had been true with Queen Victoria in the 1860s and 1870s.

Similarly, television became significant in the USA with the presidential debates in 1960, as much of the American public gained an impression of both Kennedy and Nixon through these televised discussions, the first of their kind. The debates have occurred regularly since 1976, and have been seen as of great significance, which has attracted attention to the details of screening. Ronald Reagan’s performance in 1980 against Jimmy Carter was important to his success. Like other forms of information, the very process of the presidential debates is scarcely value-free. In 1980, the exclusion of a third-party candidate, John Anderson, at the behest of the other two contenders killed off his chances of election. In 2004, a report issued by ten campaigning organisations argued that the Commission on Presidential Debates, established in 1987, had (p.347) ‘deceptively served the interests of the Republican and Democratic parties at the expense of the American people’ by ‘obediently’ agreeing to the major parties’ demands while claiming to be a nonpartisan institution. As a result, the report argued: ‘Issues the American people want to hear about are often ignored, such as free trade and child poverty. And the debates have been reduced to a series of glorified bipartisan news conferences, in which the Republican and Democratic candidates exchange memorized soundbites.’30

Especially before the advent of hundreds of competing channels, television also provided common memories, notably for the electoral monarchy of the presidency: 41.8 million Americans watched President Ronald Reagan’s inaugural address in January 1981. As a political form, State of the Union Addresses were organised for television.

The use of another form of information was seen in the definition of electoral boundaries, and this process underlined the relationship between power and information. The original ‘gerrymander’ took its name from Governor Elbridge Gerry of Massachusetts, who in 1812 approved a district, shaped like a salamander, with boundaries that favoured his own party. Other names for such districts also reflected power, including ‘bushmanders’ after President George W. Bush (2001–9). New technology, in the shape of GIS software, has simply taken the process of redistricting for partisan advantage forward, albeit in a more complex context due to legislative and judicial decisions and aspirations, notably fairness to minorities. The use of information for districting influenced the nature of society, not least in the USA, where school, health and other districts were affected.31

With a similar matrix of direct and indirect consequences, the radio and television media affected, and then came to mould, society around the world. In a marked restriction of Sabbath observance, the difference of Sunday was eroded, with Sunday cinema legalised in Britain in 1932.32 Television was central to much else: the trendsetting and advertising that are so crucial to the consumer society, and the course and conduct of election campaigns.

At the same time, there was a measure of reluctance in televising national legislatures. The French National Assembly was first broadcast by radio in 1947, but television coverage did not follow until 1993. The German Bundestag was not televised until 1999, when it relocated to the new capital, Berlin. Televised American Congressional hearings began in 1948, but telecasts of the floor proceedings of the House of Representatives and the Senate did not begin until 1977 and 1986 respectively. Television coverage of Parliament in Britain began in 1989, but was hedged by rules about what shots could be shown. Politics, however, in part became a matter of soundbites aimed at catching the evening news bulletins. Television, indeed, increasingly also set the idioms and vocabulary of public and private life. Thus, on 14 July 1989, the (p.348) then British prime minister Margaret Thatcher was attacked by Denis Healey of the Labour Party for adding ‘the diplomacy of Alf Garnett to the economics of Arthur Daley’: Healey knew that listeners would understand his references to popular television comic characters.

Satellite television brought cross-border influences that hit monopolies on control of information. Already terrestrial broadcasting had posed a challenge. The destabilising sense of a better world elsewhere that West German television brought to East Germany in the 1980s was seen in other countries. It is not surprising that Islamic fundamentalists sought to prevent or limit the spread of information about Western life, or that the Western model was perceived as a threat by them. Television was banned by the Taliban regime in Afghanistan.

The potency of televised images as a form of political information was also seen in the West, notably with images of violence in Vietnam, Chicago and elsewhere in 1968. The first two transfixed American politics and society, contributing to a sense of malaise that affected the presidency of Lyndon Johnson and led him to decide in 1968 not to seek re-election. The sense of 1968 as a year of change owed much to television images,33 and this impact looked towards that of images circulated during the collapse of Communism in Eastern Europe in 1989 and during the Arab Spring in 2011.

Offices and Computing

Meanwhile, the scale of the information available in, and from, a variety of formats led to enhanced pressure to establish and improve systems for storage and analysis. Although the twentieth century was one of fast-growing aural and visual media, each of which posed its own problems of record management, there was also a marked upsurge in written material. As demands for written records multiplied, so the nature of the office changed, as well as the tasks of office workers. Institutions, both private and public, accumulated information on their activities for commercial and administrative reasons, to deal with regulatory and tax requirements, and because such information provided a way to assess effectiveness and plan policy.

Alongside developing tasks came new machines, and newer types of machine, notably the typewriter, telephone and duplicating machine. Thus, the electric typewriter helped increase the speed of typing and the volume of typed communication. The photocopier made the fortunes of America’s Xerox Company, with the launching of the Xerox 914 copier in 1960 being particularly significant. Gender differences emerged clearly in tasks and conditions. Thus, office work as an information industry was linked to a marked degree of occupational segregation related to gender – for instance, (p.349) female typing pools – and reflected in pay, prestige and conditions, all of which were worse for women. Moreover, office machinery, notably the Xerox copier, helped structure the organisation of work spaces.

Meanwhile, the problems created by information overload became a standard theme. It was not to be a refrain comparable to environmental strain, but it drew on the same sense that development was not necessarily benign and, indeed, had become positively dangerous. The concept of overload was also applied to earlier periods, as with the discussion of Linnaeus’s classification of plants in the eighteenth century as responding to such a phenomenon (see p. 181).

In part, the idea of a paperless society, stemming from the ability of computer systems to process, store and transmit information by electronic means, appeared to address anxieties about the volume, and indeed intractability, of information. Computing machines had been developed in the nineteenth century using mechanical means; but the possibilities offered by electronic processes in the twentieth century were far greater. They included the concept of Big Data, the analysis of the vast amount of unstructured data that is collected routinely. Systems were developed to use this data.

Computers and War

As so often with the history of information, military purposes played a prominent role in the development of computers. The intellectual and practical innovations that led to the computer owed much to pressures and developments in the mid-twentieth century, most famously the Anglo-American need to break German and Japanese codes in the Second World War, although the link between computers and warfare went back to Charles Babbage’s difference engine in 1823–42: Babbage’s work was funded by the British Admiralty, which wanted accurate astronomical tables to give it an edge in navigation. Computing methods were already in use prior to the Second World War, notably in the electricity industry. However, wartime activities led to major changes. The origins of modern computers can be traced to codebreaking, as it required the capacity to test very large numbers of possible substitutions rapidly. The British and Americans made particularly good use of such techniques. Computers were also utilised to analyse wave movements to help in the planning of amphibious operations. At the same time, as frequently with information technology, the development of new institutions able to make use of such capabilities was very important. Intelligence organisations became a prime government consumer of new forms of information.34

As with atomic power, which became a source of inexpensive and supposedly clean energy in the 1950s, prewar and wartime work with computing led (p.350) to postwar developments. An all-purpose electronic digital computer, the American army-funded Electronic Numerical Integrator and Calculator (ENIAC), was constructed at the Moore School of Electrical Engineering of the University of Pennsylvania in 1946. In Britain, Alan Turing’s theoretical work in the 1930s and 1940s helped pave the way for the Manchester Small-Scale Experimental Machine, the first stored program computer, which went into action in 1948. This was followed in 1949 by the Manchester Mark 1.

At the same time, there is controversy over which was the first electronic computer, and which the first serially produced commercial computer. Moreover, there are differences of opinion over what actually constitutes a computer: does it have to contain a microprocessor or is it just the first general-purpose information-processing machine?35 Computers, indeed, went through a number of stages, variously defined in terms of working processes, representation and commercialisation. As with other changes in technology – for example, the steam engine – any attempt at an overall account proves too schematic.

Most of the relevant changes in information technology came not in the mid-twentieth century, but later, not least because technological application brought new capabilities within the scope of large numbers of people. Communications satellites provided systems for transmitting words and images rapidly, while the silicon microchip permitted the creation of more effective communication methods based on durable micro-instruments.

The net effect was to underline the nature of information and knowledge as a process of change, a sense for many at odds with their supposed meaning as fixed categories and contents. Moreover, at the same time as providing change, computers and other forms of technology worked only if this process was controlled, notably by standardising data and its use, for such standardisation was important to reducing the disparities between data sets that acted as a friction in the use of information.

The physical shift in information in the second half of the century was remarkable, creating problems of comprehension at least as great as those posed by the telephone. Initially, in the absence of miniaturisation, computers were an industrial product of great scale and cost. The use of computer time was very expensive, and it was employed for large projects. The importance and prestige of these large computers helped make the fortune of IBM (International Business Machines), the key American player. IBM was based initially on the sorting of data stored on punched cards by a mechanical tabulator designed by Herman Hollerith (1860–1929). Hollerith had invented mechanical means of processing census data for the 1890 American census. In the 1940s, IBM was producing electronic typewriters and accounting equipment and was moving into electronic calculating machines.

(p.351) IBM, however, went into the market for commercial computers in response to the intensification of the Cold War in the early 1950s, and became the key player in the computer industry. The IBM Defense Calculator, later renamed the 701, was produced in 1953, and was followed by a fully transistorised commercial computer. The largest computers ever built were developed at the Massachusetts Institute of Technology (MIT) for America’s SAGE (Semi-Automatic Ground Environment) Air Defense system, in which IBM played a major role. This system incorporated the point-and-click graphical interface introduced by an MIT group working on the Whirlwind computer. Launched in 1958, the SAGE system enabled the prediction of the trajectories of aircraft and missiles, and was part of America’s major investment in air defence at a time of real Soviet threat and of anxieties about this threat being greater than it actually was. At the same time, major efforts were put into the construction of early-warning stations in Canada designed to provide notice of Soviet attacks across the Arctic.

Developments in the 1950s occurred in part thanks to technological possibilities, but also due to the developing commercial context provided by the decline in the relative significance of military customers. Linked to the latter came a relaxation in the military security classifications associated with ideas and machines. Commercialisation was fostered by extensive reporting about added value and reliability, which encouraged investment. In the 1950s, major companies in both America and Britain purchased computers and found value from them, justifying their high cost. In 1951, the Lyons Electronic Office was introduced. This was the first British computer designed primarily for business purposes that operated on a ‘stored program’ principle, meaning that it could be rapidly employed to tackle different tasks by loading a new program. Built by Lyons, the computer ran a weekly program to assess the costs, prices and margins for that week’s production of bread, cakes and pies, and in 1954 it was used to calculate the company’s weekly payroll. British ingenuity, however, was not matched by commercial support, in part because the opportunities offered by business computing were not appreciated. Also in 1954, the British Automatic Computing Engine was employed to assess what had gone wrong when a Comet jet aircraft crashed into the Mediterranean, leading to great governmental and public concern about the viability of jet passenger services.

The key developments occurred instead in the USA, the centre of the world economy and the source of most investment capital and applied research. IBM’s ability to generate massive sales and large profits offered the possibility of investing in new products, such as the IBM System/360, which was announced in 1964 and introduced in 1965. This mainframe system provided an opportunity for upgrading without the need to rewrite all software programs. This flexibility set the industry model, and the IBM System/360, (p.352) which used integrated circuits and was designed for military and commercial purposes, helped reframe what a computer was assumed to be and do. Indeed, it became the industry standard, and made IBM substantial profits.36

The interaction of technologies was seen with the development of computerised telephone switching systems in the 1960s. These made the operation of telephones cheaper by ensuring that less skilled labour was required. Indeed, if technology is an aspect of modernity, then part of that modernisation is a significant degree of labour differentiation. The regulatory environment is also important. In the USA, the role of monopoly providers was hit in the 1960s when the Federal Communications Commission and the courts ended the system by which only Bell telephones could be used on Bell telephone lines. This new freedom encouraged entrepreneurial initiative, fresh investment and technological innovation, notably the development and use of the answering machine, the fax and the modem.

Computers and Society

Meanwhile, in the computer industry, the size of machinery changed dramatically and, with it, the ability to move to a coverage that would give computer systems the capacity to interact directly with much of society. The miniaturisation of electronic components made it possible to create complete electronic circuits on a small slice of silicon, which had been found to be an effective and inexpensive way to store information. The integrated circuit was invented in 1958, and the first hand-held calculator in 1966. The Intel 4004, the first microprocessor chip, was created in 1971. In 1965, Gordon Moore, later a co-founder of Intel, the company responsible for the 4004, predicted a dramatic revolution in capability, suggesting that the number of transistors per integrated circuit would double every year for the following decade; the forecast, subsequently revised to a doubling roughly every two years and often popularised as a doubling every eighteen months, became known as ‘Moore’s law’.37 The American military actively financed the development of semiconductors.
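The arithmetic behind the prediction is simple exponential growth, and its sensitivity to the assumed doubling period can be illustrated with a minimal sketch in Python. The starting figure of roughly 2,300 transistors (approximately the count of the Intel 4004) and the ten-year horizon are illustrative assumptions rather than figures from the text; the three periods correspond to the annual, eighteen-month and two-year versions of the forecast.

# Illustrative sketch of the exponential arithmetic behind Moore's prediction.
# The starting count and ten-year horizon are assumptions chosen for illustration only.

def projected_transistors(initial, years, doubling_period_years):
    """Project a transistor count that doubles every doubling_period_years."""
    return round(initial * 2 ** (years / doubling_period_years))

start = 2300  # roughly the transistor count of the Intel 4004 (1971)
for period in (1.0, 1.5, 2.0):  # annual, eighteen-month and two-year doubling
    count = projected_transistors(start, 10, period)
    print(f"doubling every {period} years -> after a decade: {count:,}")

On these assumptions, annual doubling multiplies the count roughly a thousandfold over a decade, whereas two-year doubling multiplies it only about thirty-twofold, which is why the choice of doubling period mattered so much to the forecast.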

From the late 1970s, computers became widely available as office and then household tools. The development of the Graphical User Interface (GUI) for interacting with the computer was important in reducing the degree of expertise necessary for its operation. The process used in the SAGE system was simplified with the development of the mouse, initially at the Stanford Research Institute. Research laboratories able to foster and supply innovation were an important element, as they were more generally in technological development across the century, a situation that underlined the need for investment capital. Xerox’s Palo Alto Research Center in what became known as Silicon Valley in California played a key role in creating the personal, desktop interactive computer.38

(p.353) However, the latter type of computer was established not by Xerox products such as the Star computer of 1981,39 but instead by the far less expensive and very successful IBM PC (personal computer), which was also launched in 1981. This utilised a microprocessor from Intel and operating-system software from Microsoft. However, the same constituents could be used by competitors, and IBM discovered that it had no edge in that field. In 1984, the Apple Macintosh offered a cheap, effective, easy-to-produce computer mouse to make the GUI simple to use.40 Improvements in specification followed rapidly, with increased memory, and an internal hard disc drive followed in 1987. Graphical interfaces for other machines followed, including IBM’s TopView, launched in 1985.

Improvements in capability ensured that computing power became cheaper, and thus more accessible. It was applied in concert with other technologies and increasingly played a role in production techniques. This shift affected other forms of information technology. Thus, from the 1960s, phototypesetting used computers rather than metal-setting machines, the electronics enhancing the mechanical operations of the earlier machines. From the 1970s, typesetters introduced cathode-ray tube scanning, creating letters from individual dots or lines. From the 1990s, photography no longer played a role in typesetting. Instead, light was used to create type, the laser-written image, reproduced from computer memory, being scorched directly onto film or paper.41 Computer technology turned the printer’s dream into a reality: to be able to print a text anytime, in any size of run, in any configuration of text and illustrations, in any language and in any font size, but at the same time allowing changes to be made without much human work.

Size, specifically miniaturisation, was to be a crucial element in the popularity of new consumer goods, such as mobile phones, laptop computers and mini-disc systems, as portability was an adjunct of the dynamic quality of modern mobile Western society. Keyboardless, handheld computers followed. Meanwhile, the growing number of business and personal computers facilitated the use of electronic mail and access to the Internet. Improvements in network computing, with programs running on different machines to coordinate their activity, ensured that interconnected machines could operate as a single much more powerful machine, removing the need for the cost of a supercomputer.

Developed in the 1990s, this technique anticipated the later ‘cloud computing’ method by which large numbers of machines were combined. ‘Cloud computing’ allowed users to draw on processing power and storage provided remotely over the Internet, the ‘cloud’, rather than having to own the physical infrastructure themselves. This practice was very helpful for those with small computers.

The range of technologies in play was considerable. Fibre-optic cables, another advance of the 1970s, increased the capacity of cable systems and the volume of telephone and computer messages they could carry. A single optical fibre could (p.354) carry ten billion bits per second. The capacity of the electromagnetic spectrum to transmit messages was utilised and, thanks to computers and electronic mail, more messages were sent and more information stored than ever before. Volume rose and costs fell. Whereas, in 1970, it cost $150,000 to send a trillion bits of data between Boston and Los Angeles, the cost in 2000 was twelve cents.

The Internet was developed and funded by the American Department of Defense in order to help scientists using large computers to communicate with each other. The initial link was between the University of California, Los Angeles, and the Stanford Research Institute. Run by the Defense Department’s Defense Advanced Research Projects Agency (DARPA), the Internet was seen as having value in the event of a nuclear attack. Email, user groups and databases played a modest role at this stage, but, once transferred from the military to private operation, the Internet was transformed in the 1990s, notably with the development of the World Wide Web (developed by Tim Berners-Lee, and launched in August 1991), web browsers and servers, and e-commerce. In 1994, an easy-to-use browser, Netscape, was launched.42 DARPA also developed a Strategic Computing Initiative that was responsible for advances in technologies such as computer vision and recognition, and the parallel processing useful for codebreaking.

New Products

Companies, such as the personal computer innovator Apple, founded in 1976, and Microsoft, the operating-system writer which launched Windows in 1985, created and transformed the industry, developing recent ideas, producing new products and offering new facilities at lower prices, and establishing a dynamic entrepreneurial context for the industry. The development of new facilities encouraged an emphasis on easy consumer interaction. Electronic books were invented in 1971, while Internet banking was introduced in the 1990s. The rise of small computers using standardised software hit IBM hard as its system of centralised machines and proprietary hardware and software no longer proved attractive. Apple’s sales rose from $2 million in 1977 to $600 million in 1981. Initial public offerings, such as Google’s in 2004 and Facebook’s in 2012, provided capital for new developments.

New technology contributed to economic and social optimism during the economic growth of the 1990s and mid-2000s, and was a huge enabler for other industries, such as telecoms and retailers. In turn, this economic growth helped ensure the profitability of this technology. Moreover, the West’s success in the Cold War encouraged market orientation in investment, and thus the shaping of technology in terms of consumerism. Products and specifications changed rapidly. In 1998, the iMac, launched by Apple, helped enhance the visual appeal of personal computers. The ability to send and receive emails in real time while (p.355) on the move came with the BlackBerry, launched in 1996, and usable also as a mobile phone from 2002, contributing significantly to the over five billion mobile phones in use by 2012.43

The Apple iPhone followed in 2007, with its multi-touch screen and effective combination of computer, phone, web browser and media player. The absence of a keyboard or stylus was important to its ease of use. In effect, the web was put in people’s pockets, so lessening the need to print documents. Other companies followed suit, emulating new methods and seeking to surpass rival specifications. User-friendliness was important in encouraging the adoption of new techniques. Moreover, the expectations of consumers in other industries changed, so that shoppers expected to be able to interact in the same way across their purchasing range.

Different technologies and information systems were brought together in these and other machines. Microprocessor-based technology was designed to communicate readily with external data networks, thus linking individuals to large-scale databases.44 Technologies were rapidly applied. Thus, smartphones rested on the development of mobile operating systems, such as Android, Google’s system.

In the 1990s, the deregulation of telecommunication networks provided business opportunities as the idea of public services under government control receded in favour of free-market solutions, a process encouraged by the desire of governments to gain revenue by asset and licence sales. State holdings in British Telecom (BT) and Cable & Wireless had already been sold in the 1980s under the Conservative government of Margaret Thatcher, with BT, the first tranche of which was sold in 1984, bringing in the greatest proceeds of all the British privatisations. These asset sales were copied across much of the world and were then followed by those of bandwidths. This process was enhanced by the West’s eventual success in the Cold War.

Moreover, the new economy of information technology recorded major growth. Intel became the world’s largest chipmaker, Cisco Systems the foremost manufacturer of Internet networking equipment and, after a long battle, Google the leading Internet search engine. Some companies, such as Microsoft, had networks and annual turnovers that were greater than those of many states, and attracted investment, notably at the end of the 1990s. Greatly overvalued at the height of the dotcom boom, Cisco was briefly worth more than $500 billion in 2000.

After the dotcom crash of 2000, there was a rapid process of adjustment and consolidation, but also fresh growth based on new products and on the availability of investment income. AT&T, the world’s largest telecoms firm, had a stock-market value in early 2006 of about £110 billion. In September 2012, after a major rise in earnings that drew on the global popularity of the iPod (p.356) (digital music player, 2001), touch-screen iPhone (2007) and iPad portable tablet (2010), Apple had the highest valuation on the American stock market, at over $630 billion, over 1.2 per cent of the global equity market. Cisco had a cash reserve of $47 billion in April 2012. Launched in February 2004, Facebook had reached 100 million users by August 2008 and 500 million by July 2010. By the spring of 2012, Facebook had 900 million total users and (in March) had 526 million daily users, although there were reports of many fake users.

The sense of potential was seen in April 2012 when cash-rich Facebook spent $1 billion in order to buy Instagram, a two-year-old company with only thirteen employees that let users share pictures on the web, but that helped Facebook in its aspiration to make its profitable means of sharing content more mobile. In this respect, Facebook suffered from its origins in the era of personal computers, whereas mobility has been taken forward by the use of smartphones and wireless broadband connections. By May 2012, 425 million users of Facebook accessed the network via their smartphones, which posed problems for Facebook in selling advertising. The purchase price of Instagram reflected the value of users, notably in targeting advertisements. As television sets are connected to the Internet, Facebook and other companies also look for future profitability from the large sums spent on television advertising.

The gap between innovation and widespread use had greatly diminished by the 2000s. Affluence and a sense of need fuelled the quest for the new and the more powerful in technology, and this quest was linked to a discarding of earlier models. In 2004, more than 130,000 personal computers were replaced in the USA daily.

Internet use increased greatly (from 26 million users worldwide in 1995 to 513 million by August 2001, and about two billion by late 2012), but differentially. The USA was the key site of development for the new information technology. In 1998, the Internet Corporation for Assigned Names and Numbers was established to manage the Internet, assigning the unique indicators essential for the address system. It was based in Los Angeles. Also in 1998, nearly half of the 130 million people in the world with Internet access were Americans, whereas Eritrea in East Africa only obtained a local Internet connection in 2000, in the process putting every African state online. Since the Internet only really became efficient when there were sufficient users to create a widespread system, the take-up rate was particularly important. Moreover, the Internet offered a range and capacity that were different from those of previous national, transnational and, in particular, global information and communications systems. American technology played a key role. By 2012, Google’s Android operating system ran on over half the phones sold globally.

The USA increasingly saw itself as a knowledge society. Culture, economics and politics were presented as dynamic, with information a crucial item and (p.357) ‘messaging’ a major form of interaction, work and opinion-formation. By 2006, about 70 per cent of Americans had mobile phones. Similar processes occurred in other advanced economies. By 2001, it was estimated that forty million text messages were being sent daily in the United Kingdom. By early 2008, Skype, a Europe-based telephone system routed entirely over the Internet, had over 275 million users.

Consumer choice was a key element in the major markets for new information technology, and this situation encouraged rapid improvements in products, as well as a concern to make designs attractive. This process was linked to a more general sense of consumer power that was also seen in conventional media, such as newspapers and radio, with readers/listeners/viewers more willing to change provider and also to question the product.45 Successive advances in technology were made, each moulded in part by the perception and then the reality of consumer interest.

The Internet permitted a more engaged and constant consumer response, with, as a result, consumers becoming users and users becoming producers, as categories were transformed. The user domain became more important with the development of both hacking and fan communities. Media content and software-based products became a matter of co-creation, and the media industry increasingly provided platforms for user-driven social interactions and user-generated content (like the dependence of eighteenth-century newspapers on items sent in), rather than being the crucial player in creating content.46 Wikipedia and Twitter were key instances of user-sourced content, with Twitter providing unfiltered real-time information.

At the same time, other media became dependent, in whole or part, on the Internet. An instance of the new form of authorship was provided by a book I consulted while travelling in Germany in June 2012. Romanesque Sites in Germany, published by Hephaestus Books, lacks, on the title-page, an author or place or year of publication. Readers are referred to a website for further information, but the back cover carries the following notice: ‘Hephaestus Books represents a new publishing paradigm, allowing disparate content sources to be curated into cohesive, relevant, and informative books. To date, this content has been curated from Wikipedia articles and images under Creative Commons licensing, although as Hephaestus Books continues to increase in scope and dimension, more licensed and public domain content is being added. We believe books such as this represent a new and exciting lexicon in the sharing of human knowledge.’ The style of the book is jerky and the coverage uneven, but, presumably, the economics of authorship in this case offers opportunities for book publication.

Although the situation was more problematic in authoritarian states, the process of user influence affected the response to the news, with an ability to (p.358) select news feeds that was much more pronounced than in the days of a limited number of terrestrial television channels. Moreover, the rapidly developing nature of news stories helped lend a freneticism and urgency to the news, sometimes described in terms of a ‘twenty-four-hour news cycle’. The manner in which news was produced and distributed contributed to the way the information was understood. To some, this situation represented information as chaos and crisis, but it also reflected the nature of society.

New technology challenged established spatial distinctions. Living in an area without a good bookshop or an art cinema became less important when books could be purchased, and music listened to, over the Internet, and films viewed online, on video or on DVD. Hollywood was brought to American television screens by DVDs from 1997. Compared to videos, they offered better picture and sound quality and greater durability, as well as consumer convenience in their smaller size and greater ease of use. The enhancement of home systems with wide-screen televisions and surround-sound systems added to this trend. In the USA in 2003, only $9.2 billion was spent at cinemas compared to $22.5 billion on DVDs and (to a lesser extent) videos.

Also in 2003, the launch by Apple of the iTunes online music store revealed the large size of the market for downloading music. In 2001, the iPod had already proved an easily used and successful handheld digital music player. The Apple empire, with its app store, reflected the appeal of integration and the potential that new technology offered for new forms of such integration. The iPod and the iTunes store were linked to the iTunes digital media player, also launched in 2001, and this vertically integrated music-distribution business greatly affected the music industry, helping to end the sale, and therefore the production, of the long-playing album. Apple also created a technology and platform that could be used by other companies. By late 2008, Apple had sold over 110 million iPods and over three billion songs via iTunes, and by October 2011 over sixteen billion songs. In the second quarter of 2012, Apple’s global revenues were $35 billion.

As an instance of the continued adaptability of new technology and the search for comparative commercial advantage, Apple developed a smaller version of its iPad in 2012, a device intended to compete with the Kindle Fire tablet (released in 2011), Google’s Nexus 7 (2012), and Microsoft’s Surface (2012).

A Range of Uses

The capabilities of writers, designers and others had been enhanced by computerised systems. Sound was changed with the development of electronic music. In 1964, Robert Moog’s transistor-based analogue synthesiser replaced the physical bulk of previous machines. The resulting opportunities were taken (p.359) up both by avant-garde classical musicians and their popular-music counterparts, and, in turn, were taken further as synthesisers became smaller and less expensive, with computer-based digital synthesisers being used from the 1980s.

At a very different level, computer animation transformed filmmaking, especially cartoons, and the American company Pixar was at the centre of this. Technological application was an aspect of the world of mixed media. Pixar was linked to Apple through Steve Jobs (1955–2012), who was head of Apple and a founder of Pixar.

Digitalisation, the key form for the reception, transfer and presentation of information, also served cultural ends. Not only did it provide access to more information than any earlier form, but, in addition, the ability to reproduce images readily offered the possibility of providing information on cultural trends and products. This reproduction was greatly enhanced by the capacity for mixed media. More generally, as the sensations and environments of experience changed, computers affected consciousness.47

The opportunities and choices on offer challenged established social norms and legal practices. Pornography, an industry in which the USA has long led the world, thanks to its wealth, sexual licence and freedom of expression laws, became more accessible as a result of the Internet. The spread of video cassettes and pornography was synergetic in the 1970s, leading to the success of Deep Throat (1972), which became one of the country’s most popular video cassettes. Video cameras were also significant as a technology fostering individualism, a process seen both in the production of home pornography and in the theme of Steven Soderbergh’s much-applauded film Sex, Lies and Videotape (1989). Access to pornography on the Internet hit the profits of larger commercial producers, resulting in a crisis in the American industry in the 2010s. In some respects, the means of production represented a new iteration of the impact of printing, and, more particularly, of the later changes in cheap, illustrated print, on the spread of sexual material.48

As another aspect of technological opportunities and their link to consumerism, individuals were assailed by unwanted phone calls, faxes and emails. By 2008, it was estimated that four-fifths of all emails were spam. Moreover, computer fraud increased.

Meanwhile, computers served new systems of information requiring hitherto unprecedented volume and speed in the assimilation, analysis and presentation of data: for example, the booking of seats on airlines and air-traffic control. Although the paperless office proved a myth,49 paper records did become less important. Military needs and applications continued to be important in the computer world, while computers were even more significant for the military. The first effective laptop-style portable computer, the GRiD Compass of 1982, which had an electroluminescent display and a robust magnesium case, (p.360) proved attractive to the American military. Adapted for the field, many were purchased by the army, while NASA put them into the space shuttle. More generally, decision-making systems involving the rapid processing of information and entailing speedy communications became of key importance.

In turn, network analysis was both an activity held to be of crucial significance for efficiency and also a subject for intellectual enquiry, product development and human training. The emphasis on information technology was important not only to manufacturing and trade, but also to other aspects of the world of work.50 Among the ten fastest growing occupations in the 2000s listed by the American Labor Department were network systems and data communications analysts; computer software engineers, applications; and computer software engineers, systems software. Digital-linked manufacturing became of greater significance, leading to talk in the 2010s of a new industrial revolution.

Cybernetics

The psychological, cultural and intellectual impact of developments was far less certain than the statistics of change. Explanatory intellectual strategies focused on the relationship between humans and machines – not a new theme, but one that became more insistent. Cybernetics, a term coined in 194751 to mean the study of control and communications systems, became prominent, not least as an aspect of the conceptualisation and practice of the Cold War.52 Military needs in the Cold War spread the new practice of systems thinking and related theories and rhetoric. The RAND Corporation was established to help the American Air Force analyse the likely nature of nuclear war.

The modelling of information systems was applied to both brains and computers, and, in a reranking of disciplines that echoed the earlier age of ‘political arithmetic’ and Newtonian physics, human and natural sciences were rethought in terms of the concise and precise language and methods of mathematics, the latter understood through, and displayed in terms of, computer simulation. Ideas such as entropy and feedback loops were taken from their inorganic background and applied to living organisms and societies: for example, in functional discussions of why wars break out.

Developed in the USA, cybernetics was applied more widely, as in the Soviet Union, where it was deployed in the 1950s as an aspect, during the period of de-Stalinisation from 1953, of the attempt to lessen Marxist-Leninist ideas. In the 1960s, the Scientific-Technological Revolution (NTR) was meant to cure all the ills of Socialism. It was hoped that computers might help with central planning and thus realise the great promise of Socialist progress. However, the possibility of new analogies and unauthorised answers offered by cybernetics led to its bureaucratic stifling in the Soviet Union, not least because (p.361) of a determination to restrict the unpredictable element of computers. This apparent triumph of dialectical materialism therefore was to be short-lived,53 and the resulting lack of engagement with the possibilities offered by computers was to be regarded as a major weakness of the Soviet economy, notably so by the mid-1980s.

Cybernetics was a prime instance of the extent to which, in a period in which behavioural science held sway,54 analogies between humans and machines were pressed. Man-as-machine had results in terms of the understanding of context, causes and consequences in human activity, and this analogy was linked to the emphasis on predictability and the rise of statistical analysis and advice. Both of the latter were long-standing elements in intellectual cultures that put a premium on mathematical information, but the computer helped make the emphasis more normative in Western societies, as well as providing far more data-handling capacity, and thus encouraging systems analysis and operations research. Cybernetics pursued enhanced control, and central control, by understanding and using the capabilities of information systems. However, cybernetics also proposed a degree of predictability that left insufficient room for contingency and the absence of linear progression. In part, this theoretical failure contributed to that of American operations research and systems analysis in the Vietnam War.55 Meanwhile, different analogies for human potential were offered by the development, in the 1960s–1970s, of speculation about, and research into, telepathy, extrasensory perception, mind control and hallucinatory drugs.

Artificial Intelligence

The capacity of machines to produce new knowledge was focused on the abilities of computer systems. Thus, Donald Michie (1923–2007), director of experimental programming (1963–6) and professor of machine intelligence (1967–84) at Edinburgh University, argued, in On Machine Intelligence (1974) and The Creative Computer (1984), that computers were not limited tools. Herbert Simon (1916–2001), another pioneer, remarked in 1964: ‘Machines will be capable within 20 years of doing any work a man can do.’ Such wildly overoptimistic predictions were a feature of this branch of science. In the 1950s, Simon, and other scientists including Marvin Minsky, had established Artificial Intelligence (AI) as a field of research, a field that the American military supported.

AI research addressed all functions of the human brain, including not only simple choices between options but also creative thinking. The attempt to employ reverse engineering and to apply its insights through robotics proved attractive as a way to demonstrate benefit, but only fairly simple processes (p.362) could be executed by robots. Indeed, the high hopes for AI proved flawed, in part because mathematical precision turned out to be an inappropriate model for complex behaviour. It was difficult to reduce such complexity to manageable chunks. However, from the 1950s, and notably in the 1980s, computers were used to advance AI, although their capacity to respond to unforeseen circumstances proved limited. In the 2010s, however, better techniques of pattern recognition made it easier for machines to learn responses.

Nevertheless, no machine learning yet comes close to the power of (wide-ranging, long-lived, perhaps open-ended) human learning. Moreover, the language used by humans depends heavily on context: nuanced interactions with others are difficult to emulate. In his 1950 paper on machine intelligence, Alan Turing predicted that, by the end of the century, a computer playing his Imitation Game, in which a human interrogator must decide whether the ‘system’ with which he is communicating is another person or a machine, would leave the interrogator with no more than a 70 per cent chance of identifying it correctly: in other words, that at least thirty out of every hundred attempts would end with the computer being mistaken for a person. In the yearly contests, the current rate of misidentification is zero.
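Turing’s figure can be restated as simple arithmetic (a gloss using probability notation that is not in the original):

\[
P(\text{correct identification by the interrogator}) \le 0.7
\;\Longleftrightarrow\;
P(\text{computer mistaken for a person}) = 1 - P(\text{correct identification}) \ge 0.3 .
\]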

In addition to research into AI, there has been much progress in studying the operations of the brain. The attempt to understand and alleviate brain-related disorders led to work on silicon-based devices that could be inserted into the brain: for example, artificial synapses, the junctions between brain cells.

The question of whether machines, notably computers, could transcend the limitations of their inventors (and the limits of their inventors’ intentions) moved from the bounds of science fiction. While machines cannot yet think like humans, they are able to draw on banks of online data and make calculations far faster than humans. As a result, computer-aided decision-making became more prominent across a range of spheres including medical diagnostics, financial trading and weather prediction. The rapid electronic provision of financial information transformed international financial markets.

Again, contention emerged, notably in the discussion of climate trends and global warming,56 while there was also controversy, especially in the 2000s, over the role of computer-programmed financial trading in causing damaging herd activity among market traders. The misguided anxiety about a ‘millennium bug’ that would cause computers to malfunction in 2000 indicated a more general disquiet about dependence on machinery; this anxiety in part arose from the fact that, due to their complexity, no significant IT system is ever completely understood.

The application of new information was important to research into a range of technologies, including biotechnology and fuel-cell technology, as well as (p.363) AI. Each was designed to increase the capacity of human society to overcome problems. The changing nature of AI was shown in the challenges set for machines supposedly at the cutting edge of capability. In 1997, in a much-publicised encounter, the Deep Blue computer defeated Garry Kasparov, the world chess champion. In contrast, in 2011, IBM entered a computer that proved able to win in the US television quiz show Jeopardy!, although two types of Jeopardy! question were excluded from the contest. A question-and-answer contest, Jeopardy! demonstrated the ability of computers to handle direct questions rapidly in the world of language.

The ability to grasp vocabulary and syntax is important to this skill, as is that of scrutinising a large amount of digital data. As large amounts of data were digitised in the 2000s, the ability to scrutinise databases became more valuable. Machines can do this, and can note unexpected links, more effectively and more rapidly than humans. IBM deployed this technology in offering more accurate diagnostic services in the medical and financial sectors.

Scrutinising was enhanced in the 2010s with the redesign of Google Search, the world’s leading Internet search engine, a redesign involving the fusion of its keyword search system with semantic search technology. This meant that, in place of producing a large number of links in response to a query, there would be a single answer, carefully related to the intentions of the questioner. Semantic information retrieval, indeed, has long been a goal, although the viable, comprehensive single-answer search engine has not yet arrived.

This aspiration raised concerns about the degree of direction on the web, not least the shutting-off of other options from the questioner, as well as posing problems for companies and other services seeking to be the answer to search requests. Moreover, the spate of litigation in the early 2010s over patents and intellectual property, variously involving Apple, Facebook, Google, Oracle and Samsung, indicated the extent to which the profitability of innovation both faced and created structural constraints, and suggested that profitability, consumerism, copyright and patents were in an unstable relationship. These problems were not new. In the 1970s, IBM faced an antitrust lawsuit from the American Department of Justice that lasted for thirteen years.

Conclusions: The Global Situation

The emphasis in this chapter has been on scientific and technological development, and the related increase in the capability of information systems. As throughout the book, it is appropriate to note the implications of all this in terms of political, social and cultural configurations. A geographical dimension, that of Western power, was considered in the previous chapter, while governmental consequences appear in the next.

(p.364) On the global scale, the situation varied greatly. New technologies played a major role in the changing relationship between regions and countries, notably the rise of Japan in the 1960s and, subsequently, of China. The spread thither of Western-style industry and capital in part resulted from international trade and organisations, government planning and policies, and the strategies of companies. Nevertheless, the change from ‘machinofacture’ to the new technologies, notably IT, was also important, not least because it opened windows for advancement in new places without great capital or infrastructure requirements. Having passed Britain, West Germany and France in GNP in the 1960s, Japan became the second wealthiest country in the world, after the USA, only then to be overtaken by China.

Although the Internet ensured that many information-processing tasks could be outsourced to educated workers elsewhere, notably in India, growth in this and other respects was very unequal. For example, by the end of 1999, 750,000 robots were in use in manufacturing around the world, their distribution reflecting the dominance of technology by the developed world, with not a single country in Africa, Latin America or South Asia in the list of the twenty countries with the most robots. Indeed, information poverty became newly prominent as an issue: in 1999, the United Nations Development Report highlighted the danger that the new digital technologies might accentuate disparities in economic growth due to differential skill and infrastructure resources. Moreover, in 1998, only 12 per cent of Internet users were in non-OECD (less-developed) countries, although by 2000 that percentage (of a much larger number of users) had risen to 21. By 2012, the world average was about a third, and some Third World regions had respectable uptake rates: in the Middle East, over a third of the population used the Internet. The technology provided particular opportunities for women in societies where they had only a limited public role, and was embraced in countries such as Morocco for the social opportunities it opened up. However, fewer than one in six of the African population was using the Internet by 2012.

New technologies offered opportunities to catch up by leapfrogging previous stages of development. This process was seen in particular with the use of mobile-phone systems in countries with weak terrestrial networks, such as many African states. The same appeared possible with computers: the One Laptop Per Child programme was launched in 2005 to provide cheap laptops for poor children. However, in Peru, the programme’s largest beneficiary, the test scores of children in reading and mathematics remained low, and a 2012 report by the Inter-American Development Bank was sceptical about its value. Furthermore, there are studies suggesting the limited educational benefits of personal computer use.57 More positively, the global uptake of the Internet was far faster and more widespread than the comparable uptake of the telephone a century earlier.

(p.365) It is also necessary to note the social implications of computer usage and of other aspects of developing technology. These implications varied by country, religious and ethnic considerations being more significant in some contexts than others, but were generally apparent in gender terms as the use of computers helped overcome social practices that kept women from a public role. The exclusion of women from senior posts in scientific and technological education and development was particularly notable in some Islamic countries, but was also for long an issue in Western counterparts. For example, in the USA during the Second World War, there was a major expansion in state-supported science, but women did not benefit in any proportionate fashion. Moreover, they had to express the information they could contribute in terms of a scientific language and professional code that offered nothing for the particular insights they could give: for example, on nutrition.58 By 2013, however, thanks to social changes and affirmative policies, the general situation for women in the West was considerably more benign.

On the global scale, a significant geographical aspect of access to information was provided by differential knowledge of modern contraceptive methods. Access to them, notably for women, was also an issue. Another geographical dimension was that of comparative literacy rates, which were far higher in the West than in Africa or the Middle East for most of the twentieth century, and which remain higher there still. A different form of geographical variety is provided by the loss of languages, as indigenous peoples succumb to the pressure of globalisation and, at the same time, the rise of pidgin and creole languages as languages are interleaved.59 The variety of languages is an issue for Internet providers. Opportunity comes from providing and improving translation software, with Google in Africa selling Zulu, Afrikaans, Amharic and Swahili software among others. REVERSO is a free Internet translation facility that will translate from almost any written language to another. In 2012, China Telecom launched the first Tibetan-language smartphone. At the same time, the pressure to use major languages, notably English, increases.

The relationship between different parts of the world is in part captured by the fate of languages. At the same time that modern systems of information have become more insistent, the number of distinct languages has fallen. Clearly, there is a relationship between the two phenomena, although the use of language and fate of languages have never been uniform or static.60 The prestige and power of the languages used by imperial powers have been succeeded by the strength of those languages, principally English, associated with information transmission. In early 2008, Wikipedia, the online encyclopaedia, contained over nine million entries in 250 different languages, but of these over two million were in the English-language version, the dominant one.

(p.366) The changing nature of language is also an issue. The Internet presents a new form of oral as well as written culture, as patterns of oral communication play a major role in it,61 notably with Twitter. New words have emerged, such as the noun and verb ‘podcast’, referring to audio programmes downloaded onto iPod players or similar portable devices. Software languages have also become part of the linguistic mix, with major ones proving influential around the world, notably Java, launched in 1995 by Sun Microsystems, the key maker of computer workstations. The speed of generational development in programming languages has proved a major feature in communication changes over recent decades. Thus, information and practices linked to new technology are affecting communications to an unprecedented extent.

Notes:

(1) . R. Panchasi, Future Tense: The Culture of Anticipation in France between the Wars (Ithaca, New York, 2010).

(2) . C. Bonah, ‘“Experimental Rage”: The Development of Medical Ethics and the Genesis of Scientific Facts. Ludwig Fleck: An Answer to the Crisis of Modern Medicine in Interwar Germany?’, Social History of Medicine, 15 (2002), p. 199.

(3) . J.T. Stuart, ‘The Question of Human Progress in Britain after the Great War’, British Scholar, 1 (2008), pp. 53–78.

(4) . M. Thomson, Psychological Subjects: Identity, Culture, and Health in Twentieth-Century Britain (Oxford, 2006).

(p.451) (5) . G. Ortolano, The Two Cultures Controversy: Science, Literature and Cultural Politics in Post-war Britain (Cambridge, 2009).

(6) . E. Pollock, Stalin and the Soviet Science Wars (Princeton, New Jersey, 2006).

(7) . J.T. Andrews, Science for the Masses: The Bolshevik State, Public Science, and the Popular Imagination in Soviet Russia (College Station, Texas, 2003).

(8) . D.E. Nye, Electrifying America: Social Meanings of a New Technology, 1880–1940 (Cambridge, Massachusetts, 1990).

(9) . D. Vincent, The Rise of Mass Literacy: Reading and Writing in Modern Europe (Oxford, 2000).

(10) . C. Ross, Media and the Making of Modern Germany: Mass Communications, Society, and Politics from the Empire to the Third Reich (Oxford, 2008).

(11) . T.J. Hangen, Redeeming the Dial: Radio, Religion, and Popular Culture in America (Chapel Hill, North Carolina, 2002).

(12) . J. Gilbert, Redeeming Culture: American Religion in an Age of Science (Chicago, Illinois, 1997).

(13) . M.A. Finocchiaro, Retrying Galileo, 1633–1992 (Berkeley, California, 2005).

(15) . J. Rhode, Invisible Weapons (London, 1938), p. 145; R.C. Tobey, Technology as Freedom: The New Deal and the Electrical Modernisation of the American Home (Berkeley, California, 1996).

(16) . K. Lipartito, ‘Picturephone and the Information Age: The Social Meaning of Failure’, Technology and Culture, 44 (2003), pp. 50–81.

(17) . J. Hill, Telecommunications and Empire (Urbana, Illinois, 2007); A. Anduaga, Wireless and Empire: Geopolitics, Radio Industry and Ionosphere in the British Empire, 1918–1939 (Oxford, 2009).

(18) . D. Winseck and R.M. Pike, ‘The Global Media and the Empire of Liberal Internationalism, circa 1910–1930’, Media History, 15 (2009), p. 49.

(19) . R.E. Collins, ‘The Bermuda Agreement on Telecommunications 1945’, Media History, 18 (2012), pp. 200–1.

(20) . D. Yang, Technology of Empire: Telecommunications and Japanese Expansion in Asia, 1883–1945 (Cambridge, Massachusetts, 2010).

(21) . L. Beers, Your Britain: Media and the Making of the Labour Party (Cambridge, Massachusetts, 2010).

(22) . S.J. Potter, Broadcasting Empire: The BBC and the British World, 1922–1970 (Oxford, 2012), pp. 78–9.

(23) . R.H. Claxton, From ‘Parsifal’ to Perón: Early Radio in Argentina, 1920–1944 (Gainesville, Florida, 2007).

(24) . A. Russo, Points on the Dial: Golden Age Radio beyond the Networks (Durham, North Carolina, 2010).

(25) . R. Marchand, Advertising the American Dream: Making Way for Modernity, 1920–1940 (Berkeley, California, 1985).

(26) . D. Goodman, Radio’s Civic Ambition: American Broadcasting and Democracy in the 1930s (New York, 2011).

(27) . D.H. Culbert, News for Everyman: Radio and Foreign Affairs in Thirties America (Westport, Connecticut, 1976), and ‘On the Right Wavelength’, History Today, 56, 2 (2006), p. 46.

(28) . J. Gertner, The Idea Factory: Bell Labs and the Great Age of American Innovation (London, 2012).

(29) . T. Hajkowski, The BBC and National Identity in Britain, 1922–53 (Manchester, 2010).

(31) . M. Monmonier, Bushmanders and Bullwinkles: How Politicians Manipulate Electronic Maps and Census Data to Win Elections (Chicago, Illinois, 2001).

(p.452) (32) . S.J.D. Green, The Passing of Protestant England: Secularisation and Social Change, 1920–1960 (Cambridge, 2011).

(33) . D. Culbert, ‘Television’s Visual Impact on Decision-Making in the USA, 1968: The Tet Offensive and Chicago’s Democratic National Convention’, Journal of Contemporary History, 33 (1998), pp. 419–49.

(34) . D. Kahn, The Reader of Gentlemen’s Mail: Herbert O. Yardley and the Birth of American Codebreaking (New Haven, Connecticut, 2004).

(35) . P. Atkinson, Computer (London, 2010), pp. 10–13.

(36) . V. W. Ruttan, ‘Is War Necessary for Economic Growth?’, Historically Speaking, 7, 6 (July–Aug. 2006), p. 17.

(37) . G.E. Moore, ‘Cramming More Components onto Integrated Circuits’, Electronics, 38, 8 (19 Apr. 1965).

(38) . M. Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (London, 2000).

(39) . D.K. Smith and R.C. Alexander, Fumbling the Future: How Xerox Invented, Then Ignored, the First Personal Computer (New York, 1988).

(40) . S. Levy, Insanely Great: The Life and Times of Macintosh, the Computer that Changed Everything (London, 1995); A. Hertzfeld, Revolution in the Valley: The Insanely Great Story of How the Mac Was Made (Sebastopol, California, 2005).

(41) . W.S. Shirreffs, ‘Typography and the Alphabet’, Cartographic Journal, 30 (1993), pp. 100–1.

(42) . T.P. Hughes, Rescuing Prometheus (New York, 1998); J. Abbate, Inventing the Internet (Cambridge, Massachusetts, 1999).

(43) . J. Agar, Constant Touch: A Global History of the Mobile Phone (Cambridge, 2003).

(44) . J. MacCormick, Nine Algorithms that Changed the Future: The Ingenious Ideas that Drive Today’s Computers (Princeton, New Jersey, 2012).

(45) . D. Hendy, Life on Air: A History of Radio Four (Oxford, 2008).

(46) . M.T. Schäfer, Bastard Culture! How User Participation Transforms Cultural Production (Manchester, 2011).

(47) . S. Turkle, Life on the Screen: Identity in the Age of the Internet (New York, 1995).

(48) . R. Porter and M. Teich (eds), Sexual Knowledge, Sexual Science: The History of Attitudes to Sexuality (Cambridge, 1994).

(49) . A.J. Sellen and R.H.R. Harper, The Myth of the Paperless Office (Cambridge, Massachusetts, 2003).

(50) . A. Blok and G. Downey (eds), Uncovering Labour in Information Revolutions, 1750–2000, supplement to International Review of Social History (2003).

(51) . N. Wiener, Cybernetics: or Control and Communication in the Animal and the Machine (Cambridge, Massachusetts, 1948).

(52) . M. Farish, The Contours of America’s Cold War (Minneapolis, Minnesota, 2010), pp. 147–92.

(53) . S. Gerovitch, From Newspeak to Cyberspeak: A History of Soviet Cybernetics (Cambridge, Massachusetts, 2002).

(54) . J. Isaac, Working Knowledge: Making the Human Sciences from Parsons to Kuhn (Cambridge, Massachusetts, 2012), p. 9.

(55) . A. Bousquet, The Scientific Way of Warfare: Order and Chaos on the Battlefields of Modernity (New York, 2009); M. Elliott, RAND in Southeast Asia: A History of the Vietnam War Era (Santa Monica, California, 2010).

(56) . P.N. Edwards, A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming (Cambridge, Massachusetts, 2010).

(57) . L. Cuban, Oversold and Underused: Computers in the Classroom (Cambridge, Massachusetts, 2001).

(58) . J. Jack, Science on the Home Front: American Women Scientists in World War II (Champaign, Illinois, 2009).

(59) . J. Holm, Pidgins and Creoles (Cambridge, 1988).

(60) . J. Nichols, Linguistic Diversity in Space and Time (Chicago, Illinois, 1992).

(61) . W.J. Ong, Orality and Literacy: The Technologizing of the Word. New Accents (New York, 1988).