History of the Internet: Core Network

Introduction

In the early 1970s, Larry Roberts approached AT&T, the huge US telecommunications monopoly, with an interesting offer. At the time he was the director of the Information Processing Techniques Office, the computing division of the Advanced Research Projects Agency (ARPA), a relatively young organization within the Department of Defense devoted to long-term research far removed from immediate practical needs. Over the previous five years, Roberts had overseen the creation of the ARPANET, the first significant computer network, which connected computers at some 25 sites around the country.

The network had turned out to be a success, but its long-term operation, and all the bureaucracy that came with it, did not fall within ARPA's mission. Roberts was looking for a way to offload the task onto someone else, and so he contacted AT&T's directors to offer them the "keys" to the system. After carefully considering the offer, AT&T turned it down. The company's senior engineers and managers believed that the fundamental technology of the ARPANET was impractical and unstable, and had no place in a system designed to provide reliable and universal service.

The ARPANET, of course, became the seed around which the Internet crystallized: the prototype of a vast information system covering the whole world, whose kaleidoscopic possibilities defy calculation. How could AT&T have failed to see that potential, to be so stuck in the past? Bob Taylor, who hired Roberts to oversee the ARPANET project in 1966, was later blunt: "Working with AT&T would be like working with the Cro-Magnons." But before we meet the unreasoning ignorance of these unknown corporate bureaucrats with hostility, let us take a step back. Since the subject of our story is the history of the Internet, it would help first to get a more general idea of what we are talking about.

Of all the technological systems created in the second half of the 20th century, the Internet has arguably become the most important to the society, culture, and economy of the modern world. Its closest rival in that respect is probably jet travel. Using the Internet, people can instantly share photos, videos, and thoughts, wanted and unwanted, with friends and family around the world. Young people living thousands of miles apart now constantly fall in love and even marry within the virtual world. An endless shopping mall is accessible at any hour of the day or night directly from millions of comfortable homes.

For the most part, all of this is familiar, and it is exactly so. But, as the author himself can attest, the Internet has also proven to be perhaps the greatest distraction, time sink, and source of mental corruption in human history, surpassing even television, and that was not easy to do. It has allowed all manner of cranks, fanatics, and conspiracy theorists to spread their nonsense around the globe at the speed of light; some of that information can be considered harmless, and some cannot. It has enabled many organizations, both private and public, to slowly accumulate, and in some cases quickly and ignominiously lose, enormous mountains of data. On the whole, it has become an amplifier of human wisdom and human stupidity, and the quantity of the latter is frightening.

But what is the object we are discussing, its physical structure, all the machinery that allowed these social and cultural changes to take place? What is the Internet? If we could somehow filter this substance into a glass vessel, we would see it separate into three layers. At the bottom would settle the global communications network. This layer predates the Internet by about a hundred years: it first consisted of copper or iron wires, which have since given way to coaxial cables, microwave relays, optical fiber, and cellular radio.

The next layer consists of computers communicating with each other through this system using common languages, or protocols. Among the most fundamental of these are the Internet Protocol (IP), the Transmission Control Protocol (TCP), and the Border Gateway Protocol (BGP). This is the core of the Internet itself, and its concrete expression comes in the form of a network of special computers called routers, responsible for finding a path for a message to travel from a source computer to a destination computer.

Finally, the top layer holds the various applications that people and machines use to work and play over the Internet, many of which speak their own specialized languages: web browsers, communication applications, video games, shopping applications, and so on. To use the Internet, an application need only wrap its message in a format that routers can understand. The message might be a chess move, a tiny piece of a movie, or a request to transfer money from one bank account to another; the routers do not care and will treat it all the same way.
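As a rough illustration of the layering just described (a minimal Python sketch, not taken from the article; the host name and port are hypothetical): the application only prepares a payload, the TCP/IP stack wraps it in packets, and the routers along the way forward those packets without ever looking at what they mean.

```python
import socket

def send_message(payload: bytes, host: str = "example.org", port: int = 8000) -> None:
    # Application layer: the payload could be a chess move, a slice of a movie,
    # or a bank-transfer request -- to the layers below it is only bytes.
    with socket.create_connection((host, port)) as conn:
        # Transport/network layer: the socket hands the bytes to TCP, which
        # splits them into segments and wraps them in IP packets carrying source
        # and destination addresses -- the only part the routers ever inspect.
        conn.sendall(payload)
    # Physical layer: copper, fiber, or radio actually carries the bits, but
    # neither the application nor TCP needs to know which.

# Usage (assuming something is listening at the hypothetical address above):
# send_message(b"e2e4")          # a chess move
# send_message(b"\x00\x01\x02")  # a tiny fragment of a movie -- same treatment
```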

Our story will weave these three strands together to tell the history of the Internet. First comes the global communications network; at the end, the whole splendor of programs that let computer users have fun or do something useful over the network; and joining them together, the technologies and protocols that allow different computers on the network to communicate with one another. The creators of those technologies and protocols built on the achievements of the past (the network) while groping toward a dimly imagined future (the programs to come).

Alongside these creators, one of the constant actors in our story will be the state. This will be especially true at the level of the telecommunications networks, which were either run by the government or subject to its strict supervision. Which brings us back to AT&T. However much they might have hated to admit it, the fate of Taylor, Roberts, and their ARPA colleagues was hopelessly bound to the telecommunications carriers that would form the backbone of the future Internet; their networks depended entirely on such services. How, then, to explain their hostility, their conviction that the ARPANET represented a new world inherently opposed to the retrograde bureaucrats who ran telecommunications?

In fact, the two groups were separated not by time but by philosophy. The directors and engineers of AT&T saw themselves as the caretakers of a vast and complex machine that provided reliable and universal service for connecting one person to another; the Bell System was responsible for all of the equipment. The architects of the ARPANET, in contrast, saw the system as a conduit for arbitrary pieces of data and believed that its operators had no business interfering in how that data was created and used at either end of the wire.

Therefore, we must begin by telling how, through the power of the US federal government, this impasse over the nature of American telecommunications was resolved.


One system, universal service?

The Internet was born in the specific environment of American telecommunications—telephone and telegraph providers were treated very differently in the United States than in the rest of the world—and there is every reason to believe that this environment played a formative role in the design and shaping of the spirit of the future Internet. So let's take a closer look at how this all happened. To do this, we will go back to the time of the birth of the American telegraph.

American anomaly

In 1843, Samuel Morse and his allies persuaded Congress to spend $30,000 to build a telegraph line between Washington, D.C. and Baltimore. They believed it would be the first link in a network of telegraph lines, built with government money, that would stretch across the continent. In a letter to the House of Representatives, Morse suggested that the government buy out all the rights to his telegraph patents and then commission private companies to build parts of the network, keeping separate lines for official communications. In that case, Morse wrote, "It will not be long before the whole surface of this country is furrowed with these nerves, which, at the speed of thought, will spread knowledge of everything that is happening on earth, turning the whole country into one large settlement."

It seemed to him that such a vital communication system naturally served the public interest and therefore belonged among the concerns of government. Providing mail service between the states was one of the few tasks of the federal government specifically named in the US Constitution. His motives, however, were not entirely a matter of public service: government control offered Morse and his backers a way to bring their enterprise to a successful conclusion, in the form of a single but substantial payment of public money. In 1845, Cave Johnson, the US Postmaster General under the eleventh president, James Polk, announced his support for the public telegraph system proposed by Morse, arguing that an instrument so powerful could not safely be left in private hands. But that was as far as it went. The other members of Polk's Democratic administration wanted nothing to do with a public telegraph, and neither did the Democratic Congress. The party disliked the Whig schemes that had the government spending money on "internal improvements", seeing in them an invitation to favoritism, venality, and corruption.

With the government unwilling to act, one member of the Morse team, Amos Kendall, began developing a telegraph network scheme backed by private sponsors. However, the Morse patent was not enough to secure a monopoly on telegraph communications. Within a decade, dozens of competitors had appeared, either licensing alternative telegraph technologies (chiefly Royal House's printing telegraph) or simply operating on shaky legal ground. Lawsuits were filed by the batch, paper fortunes rose and vanished, and failing companies collapsed or were sold to competitors after artificially inflating the value of their shares. Out of all this turmoil, one major player emerged by the late 1860s: Western Union.

Frightened talk of "monopoly" began to spread. The telegraph had already become essential to several aspects of American life: finance, the railroads, and the newspapers. Never before had a private organization grown to such a size. The proposal for government control of the telegraph gained new life. In the decade after the Civil War, congressional postal committees produced a variety of plans to pull the telegraph into the orbit of the postal service. Three basic options emerged: 1) the Postal Service sponsors another rival to Western Union, giving it special access to post offices and highways in return for limits on its rates; 2) the Postal Service launches its own telegraph to compete with WU and the other private operators; 3) the government nationalizes the entire telegraph and places it under the control of the postal service.

The postal telegraph plans gained a few loyal supporters in Congress, including Alexander Ramsey, chairman of the Senate postal committee. Much of the campaign's energy, however, came from outside lobbyists, notably Gardiner Hubbard, who had experience with public utilities as an organizer of Cambridge's water and gas-lighting systems (and who would later become a major early backer of Alexander Bell and a founder of the National Geographic Society). Hubbard and his supporters argued that a public system would provide the same beneficial spread of information that paper mail did, while keeping rates low. This approach, they said, would surely serve society better than WU's system, which was aimed at the business elite. WU naturally objected that the price of telegrams was determined by its costs, and that a public system that artificially lowered rates would run into trouble and benefit no one.

In any case, the postal telegraph never gathered enough support to become the subject of a real congressional battle. Every proposed bill quietly died. The abuses of the monopoly never reached a level that could overcome the fear of abuse by the government. The Democrats regained control of Congress in 1874, the reforming spirit of the immediate post-Civil War Reconstruction had faded, and the initially feeble push for a postal telegraph fizzled out. The idea of placing the telegraph (and later the telephone) under government control resurfaced from time to time in the following years, but apart from a brief period of (nominal) government control of the telephone during the war in 1918, nothing ever came of it.

This governmental neglect of the telegraph and telephone was an anomaly on a global scale. In France, the telegraph was nationalized even before it was electrified: in 1837, when a private company tried to set up an optical telegraph (using signal towers) alongside the existing government-run system, the French Parliament passed a law banning any telegraph not authorized by the government. In Britain, the private telegraph was allowed to develop for several decades, but public dissatisfaction with the resulting duopoly led to a government takeover in 1868. Across Europe, governments placed telegraphy and telephony under the control of the state post office, just as Hubbard and his supporters had proposed. [In Russia, the state enterprise "Central Telegraph" was founded on October 1, 1852. — Translator's note]

Outside Europe and North America, most of the world was run by colonial authorities and therefore had no say in the development and regulation of telegraphy. Where independent governments existed, they usually set up state telegraph systems along the European model. These systems generally lacked the funds to expand at the pace seen in the US and in European countries. For example, the Brazilian state telegraph company, operating under the wing of the Ministry of Agriculture, Commerce and Labor, had only 2,100 km of telegraph lines by 1869, while the United States, a country of similar area with four times the population, already had about 130,000 km by 1866.

New deal

Why did the US take such a singular path? One might point to the spoils system, which persisted until the last years of the 19th century, under which government posts were handed out to supporters of the party that won the election. The government bureaucracy, down to the postmasters, was made up of political appointments with which loyal allies could be rewarded, and neither party wanted to create vast new sources of patronage for its opponents - which is exactly what would have happened had the telegraph come under federal control. The simplest explanation, however, is the traditional American distrust of a powerful central government - the same reason the structures of American health care, education, and other public institutions differ so much from those in other countries.

Given the growing importance of telecommunications to public life and security, the US could not keep itself entirely apart from the development of communications. In the first decades of the 20th century a hybrid system emerged in which private communications companies were held in check by two forces: on one side, a bureaucracy that continuously monitored the companies' rates, making sure they did not exploit a monopoly position to extract excessive profits; on the other, the threat of being broken up under antitrust law for bad behavior. As we shall see, these two forces could come into conflict: the theory of rate regulation held that monopoly was, under certain circumstances, a natural phenomenon, and that duplication of services would be a needless waste of resources. Regulators therefore generally tried to minimize the downsides of monopoly by controlling prices. Antitrust law, by contrast, sought to destroy monopoly in the bud by forcibly creating a competitive market.

The concept of rate regulation was born on the railroads and was implemented at the federal level through the Interstate Commerce Commission (ICC), created by Congress in 1887. The law's main driving force was small businesses and independent farmers. They often had no choice in which railroad carried their goods to market, and they claimed the railroad companies took advantage of this to squeeze every last penny out of them while offering luxurious terms to large corporations. The five-member commission was given the power to oversee railroad services and rates and to curb abuses of monopoly power, in particular by prohibiting railroads from granting special rates to favored companies (a forerunner of what we today call "net neutrality"). The Mann-Elkins Act of 1910 extended the ICC's authority to the telegraph and telephone. The ICC, however, focused as it was on transportation, never took much interest in these new responsibilities and largely ignored them.

At the same time, the federal government developed an entirely new tool for fighting monopolies. The Sherman Act of 1890 gave the Attorney General the power to challenge in court any commercial "combination" suspected of "restraint of trade" - that is, of suppressing competition through monopoly power. Over the next two decades the law was used to break up several of the largest corporations, including in the 1911 Supreme Court decision that split the Standard Oil Company into 34 parts.

Standard Oil octopus from a 1904 cartoon, before separation

By then telephony, and its dominant provider AT&T, had eclipsed telegraphy and WU in importance and capability, so much so that in 1909 AT&T was able to buy a controlling stake in WU. Theodore Vail became president of the combined companies and began the process of stitching them together. Vail firmly believed that a benevolent telecommunications monopoly would best serve the public interest, and he promoted the company's new slogan: "One Policy, One System, Universal Service." All of which made Vail ripe for the attention of the trustbusters.

Theodore Vail, c. 1918

The arrival of the Woodrow Wilson administration in 1913 gave its progressive members a timely occasion to brandish the antitrust club. Postmaster General Albert Sidney Burleson leaned toward a full postalization of the telephone on the European model, but, as usual, the idea gained no traction. Instead, Attorney General George Wickersham took the position that AT&T's continuing acquisition of independent telephone companies violated the Sherman Act. Rather than go to court, Vail and his deputy, Nathan Kingsbury, negotiated an agreement with the government, which went down in history as the "Kingsbury Commitment", under which AT&T undertook to:

  1. Stop buying independent companies.
  2. Sell its stake in WU.
  3. Allow independent telephone companies to connect to the long distance network.

But after this moment of danger for the monopoly came decades of calm. The quieter star of rate regulation ascended, premised on the existence of natural monopolies in communications. By the early 1920s the restrictions had been relaxed, and AT&T resumed its acquisition of small independent telephone companies. This approach was enshrined in the Communications Act of 1934, which created the Federal Communications Commission (FCC) to replace the ICC as the regulator of wire communication rates. By that time the Bell System controlled, by any measure, at least 90% of America's telephone business: 135 of 140 million kilometers of wire, 2.1 of 2.3 billion monthly calls, 990 million of a billion dollars in annual revenue. But the FCC's primary goal was not to restore competition; it was "to make available, so far as possible, to all the people of the United States a rapid, efficient, nation-wide, and world-wide wire and radio communication service with adequate facilities at reasonable charges." If a single organization could provide such a service, so be it.

In the middle of the 20th century, local and state telecommunications regulators in the US developed a multi-tiered system of cross-subsidies to speed the spread of universal telephone service. Regulatory commissions set rates according to the perceived value of the network to each customer rather than the cost of serving that customer. Business users, who relied on telephony to conduct their affairs, therefore paid more than individuals (for whom the service was a social convenience). Customers in large urban markets, with easy access to many other users, paid more than those in small towns, despite the greater efficiency of large telephone exchanges. Long-distance users overpaid even as technology relentlessly drove down the cost of long-distance calls, while the profits subsidized local exchanges. This elaborate system of redistribution worked well enough as long as there was a single monolithic provider within which it could all operate.

New technology

We are accustomed to thinking of monopoly as a retarding force that breeds idleness and lethargy. We expect a monopoly to guard its position and the status quo jealously rather than serve as an engine of technological, economic, and cultural transformation. It is difficult, however, to apply this view to AT&T at its peak, as it churned out innovation after innovation, anticipating and accelerating each new breakthrough in communications.

For example, in 1922 AT&T installed a commercial broadcast radio station atop its Manhattan headquarters, just a year and a half after the opening of the first major station of this kind, Westinghouse's KDKA. The following year it used its long-distance network to relay President Warren Harding's address to numerous local radio stations around the country. A few years later AT&T also gained a foothold in the film industry, after engineers at Bell Labs developed a machine that synchronized film with recorded sound. Warner Brothers used this "Vitaphone" system to release Don Juan, the first Hollywood picture with a synchronized musical score, followed by The Jazz Singer, the first feature-length film with synchronized dialogue.

Vitaphone

Walter Gifford, who became president of AT&T in 1925, decided to rid the company of side businesses such as broadcasting and motion pictures, partly to head off antitrust investigations. The US Department of Justice had not threatened the company since the Kingsbury Commitment, but there was no point in attracting unwanted attention with moves that could be read as an attempt to abuse its monopoly in telephony to expand unfairly into other markets. So instead of organizing its own broadcasting, AT&T became the main provider of signal transmission for the Radio Corporation of America (RCA) and other radio networks, carrying programs from their studios in New York and other major cities to affiliate stations across the country.

Meanwhile, in 1927, a transatlantic radiotelephony service was launched, opened by a banal question that Gifford put to his counterpart at the British Post Office: "How's the weather in London?" It was hardly "What hath God wrought!" [the first official message transmitted by Morse's telegraph. — Translator's note], but it still marked an important milestone: the possibility of intercontinental calls, decades before the laying of an underwater telephone cable, albeit at enormous cost and with poor quality.

The most important developments for our story, however, concerned the transmission of large volumes of data over long distances. AT&T had always wanted to increase the traffic on its long-distance network, which was both its main competitive advantage over the few remaining independents and a source of high profits. The easiest way to attract customers was to develop new technology that lowered the cost of transmission - which usually meant cramming more conversations into the same wires or cables. But, as we have seen, the demands on long-distance communication were moving beyond traditional person-to-person telegraph and telephone messages. Radio networks needed channels of their own, and television, with a far greater appetite for bandwidth, was already on the horizon.

The most promising way to meet these new demands was coaxial cable, made of concentric metal cylinders [coaxial, co-axial: sharing a common axis. — Translator's note]. The properties of such a conductor had been studied back in the 19th century by the giants of classical physics: Maxwell, Heaviside, Rayleigh, Kelvin, and Thomson. As a transmission line it had enormous theoretical advantages: it could carry a broadband signal, and its very structure shielded it from crosstalk and interference from outside signals. With the development of television in the 1930s, no existing technology could provide the megahertz (or more) of bandwidth that high-quality broadcasts required. So Bell Labs engineers set out to turn the cable's theoretical advantages into a working long-distance, broadband transmission line, including all of the auxiliary equipment for generating, amplifying, receiving, and otherwise processing the signal. In 1936, AT&T, with FCC approval, field-tested more than a hundred miles of cable running from Manhattan to Philadelphia. After first proving out the system with 27 voice circuits, the engineers successfully transmitted video over it by the end of 1937.

By then another source of demand for high-capacity long-distance transmission was emerging: radio relay. The radiotelephony used for the 1927 transatlantic service relied on a pair of broadcast radio signals to create a single two-way voice channel on shortwave. Tying up two radio transmitters and receivers, and an entire frequency band, for one telephone conversation made no economic sense for terrestrial communications. But if many conversations could be squeezed into one radio beam, that would be another matter. Although each individual radio station would be fairly expensive, a few hundred such stations would be enough to relay signals across the entire United States.

Two frequency bands competed for use in such a system: ultra-high frequency (UHF, decimeter waves) and microwaves (centimeter waves). The higher frequency of microwaves promised greater throughput but also posed a greater technological challenge, and in the 1930s responsible opinion at AT&T leaned toward the safer UHF option.

Microwave technology, however, made a great leap forward during World War II thanks to its intensive use in radar. Bell Labs demonstrated the viability of microwave radio relay with the AN/TRC-6, a mobile system capable of carrying eight telephone channels to another antenna within line of sight. It allowed military headquarters to restore voice communications quickly after relocating, without waiting for cable to be laid (and without the risk of losing communications when a cable was cut, whether by accident or by enemy action).

A deployed AN/TRC-6 microwave radio relay station

After the war, Harold T. Friis, a Danish-born Bell Labs engineer, led the development of microwave radio relay. A 350 km trial line from New York to Boston opened at the end of 1945. The waves hopped across roughly 50 km stretches between ground towers, using a principle essentially similar to that of the optical telegraph, or even a chain of signal fires: up the Hudson to the Hudson Highlands, across the hills of Connecticut, to Asnebumskit Hill in western Massachusetts, and then down to Boston Harbor.
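Why hops of roughly 50 km? A line-of-sight link is limited by the radio horizon, which grows with antenna height. The snippet below is not from the article; it is a minimal sketch that applies the standard radio-horizon approximation (with the usual four-thirds-Earth correction for atmospheric refraction) to show why towers of modest height, placed on hills, comfortably span hops of this length.

```python
from math import sqrt

def radio_horizon_km(antenna_height_m: float) -> float:
    """Distance to the radio horizon for a single antenna, in kilometers."""
    # d ~= 4.12 * sqrt(h) assumes an effective Earth radius of 4/3 the true one,
    # the usual allowance for atmospheric refraction bending the beam downward.
    return 4.12 * sqrt(antenna_height_m)

def max_hop_km(tx_height_m: float, rx_height_m: float) -> float:
    """Longest line-of-sight hop between two antennas of the given heights."""
    return radio_horizon_km(tx_height_m) + radio_horizon_km(rx_height_m)

# Two 50 m towers on flat ground already allow a hop of roughly 58 km;
# adding, say, 100 m of hilltop elevation under each pushes it past 100 km,
# which is why the route hugs high ground from the Hudson Highlands onward.
print(round(max_hop_km(50, 50)))              # -> 58
print(round(max_hop_km(50 + 100, 50 + 100)))  # -> 101
```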

AT&T was not the only company with an interest in microwave communications and military experience in managing microwave signals. Philco, General Electric, Raytheon, and television broadcasters built or planned their own experimental systems in the postwar years. Philco beat AT&T by building a link between Washington and Philadelphia in the spring of 1945.

AT&T microwave radio relay station in Creston, Wyoming, part of the first transcontinental line, 1951.

For more than 30 years, AT&T had avoided trouble with antitrust and other government regulators. For the most part it was protected by the notion of natural monopoly - the idea that it would be terribly inefficient to build many competing, unconnected systems, each laying its own wires across the country. Microwave relay was the first serious dent in that armor, making it possible for many companies to offer long-distance communications without prohibitive capital costs.

Microwave transmission sharply lowered the barrier to entry for would-be competitors. Because the technology required only a chain of stations spaced roughly 50 kilometers apart, building a useful system no longer meant buying thousands of kilometers of right-of-way and maintaining thousands of kilometers of cable. Moreover, the carrying capacity of microwaves was far greater than that of traditional paired cable, since each relay station could pass along thousands of telephone conversations or several television broadcasts. The competitive advantage of AT&T's existing wired long-distance system was melting away.

For years, however, the FCC shielded AT&T from the consequences of such competition through two decisions made in the 1940s and 1950s. First, the commission refused to grant anything but temporary, experimental licenses to new communications providers that did not offer their service to the general public (serving, for instance, only the internal needs of a single enterprise). Entering the market therefore meant risking the loss of one's license. The commissioners worried about the same problem that had plagued broadcasting twenty years earlier and had led to the creation of the FCC in the first place: a cacophony of interference from a multitude of transmitters polluting the limited radio spectrum.

The second decision concerned interconnection. Recall that the Kingsbury Commitment required AT&T to let local telephone companies connect to its long-distance network. Did those requirements apply to microwave radio relay? The FCC ruled that they did only in places where adequate public communications coverage did not already exist. Any competitor building a regional or local network therefore risked being abruptly cut off from the rest of the country the moment AT&T decided to enter its territory. The only alternative for staying connected was to build a new national network of one's own - a daunting prospect under an experimental license.

By the late 1950s, then, there was only one major player in the long-distance telecommunications market: AT&T. Its microwave network carried 6,000 telephone circuits along each route and reached every state in the continental US.

AT&T's microwave radio relay network in 1960

However, the first significant obstacle to AT&T's complete and comprehensive control of the telecommunications network came from a completely different direction.

Further reading

  • Gerald W. Brock, The Telecommunications Industry: The Dynamics of Market Structure (1981)
  • John Brooks, Telephone: The First Hundred Years (1976)
  • M.D. Fagen, ed., History of Engineering and Science in the Bell System: Transmission Technology (1985)
  • Joshua D. Wolff, Western Union and the Creation of the American Corporate Order (2013)

Source: habr.com
