History of the Internet: the computer as a communication device


In the first half of the 1970s, the ecology of computer networks expanded away from its original ancestor, the ARPANET, in several different directions. ARPANET users discovered a new application, email, which became the dominant activity on the network. Entrepreneurs spun off their own variations on the ARPANET to serve commercial users. And researchers around the world, from Hawaii to Europe, developed new kinds of networks to serve needs or fix flaws that the ARPANET did not address.

Nearly everyone involved in this process moved away from the original purpose of the ARPANET: providing shared access to computers and programs across a diverse set of research centers, each with its own special resources. Computer networks became primarily a means of connecting people to one another, or to remote systems that acted as sources or sinks of human-readable information, such as information databases or printers.

Licklider and Robert Taylor foresaw this possibility, although it was not the goal they were pursuing when they launched the first network experiments. Their 1968 article, "The Computer as a Communication Device," lacks the energy and timeless quality of the prophetic milestones of computing history, Vannevar Bush's "As We May Think" or Turing's "Computing Machinery and Intelligence." Still, it contains a prophetic passage about the fabric of social interaction that computer systems would weave. Licklider and Taylor described a near future in which:

You will not send letters or telegrams; you will simply identify the people whose files should be linked to yours, and which parts of those files, and perhaps specify a coefficient of urgency. You will seldom make telephone calls; you will ask the network to link your consoles.

Available within the network will be functions and services to which you subscribe on a regular basis, and others that you call on as you need them. In the former group will be investment guidance, tax counseling, selected information from your field of specialization, announcements of cultural, sporting, and entertainment events that fit your interests, and so on.

(However, their article also described how unemployment would disappear from the planet, since eventually everyone would become a programmer serving the needs of the network, engaged in the interactive debugging of programs.)

The first and most important component of this computer-mediated future, email, spread like a virus across the ARPANET in the 1970s, and from there began to take over the world.

Email

To understand how email evolved on the ARPANET, you first have to understand an important change that swept over the computing systems of the entire network in the early 1970s. When the ARPANET was first conceived in the mid-1960s, the hardware and system software at each site had little in common. Many sites centered on special, one-of-a-kind systems, such as Multics at MIT, the TX-2 at Lincoln Lab, or the ILLIAC IV then under construction at the University of Illinois.

But by 1973, the landscape of networked computer systems had become remarkably uniform, thanks to the wild success of Digital Equipment Corporation (DEC) and its entry into the scientific computing market (the company was the brainchild of Ken Olsen and Harlan Anderson, built on their experience with the TX-2 at Lincoln Lab). DEC's PDP-10 mainframe, released in 1968, offered reliable time-sharing for small organizations, with a whole kit of built-in tools and programming languages that made it easy to tailor the system to specific needs. This was exactly what the scientific centers and research laboratories of the era wanted.

Look how many PDPs there are!

BBN, the ARPANET contractor, made this package even more attractive by creating the Tenex operating system, which added paged virtual memory to the PDP-10. This greatly simplified both administering and using the system, since the set of running programs no longer had to be fitted into the available memory. BBN shipped Tenex for free to other ARPA sites, and the OS soon became dominant on the network.

But what does all this have to do with email? Users of time-sharing systems were already familiar with electronic messaging, since by the late 1960s most such systems offered mailboxes of one kind or another. But these provided only a kind of internal mail: users of the same system could exchange letters with each other, and no one else. The first person to take advantage of the network to carry mail from one machine to another was Ray Tomlinson, an engineer at BBN and one of the authors of Tenex. He had already written SNDMSG, a program for sending mail to another user on the same Tenex system, and CPYNET, a program for sending files over the network. It took only a little imagination to see how the two could be combined into network mail. The earlier programs needed only a user name to identify a recipient, so Tomlinson hit on the idea of combining the local user name and the host name (local or remote), joined by the @ symbol, into an address unique across the whole network (before this, the @ symbol was rarely used, mainly for indicating prices: 4 cakes @ $2 each).
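Tomlinson's addressing scheme survives essentially unchanged today. As a toy illustration (the function names and the example host name below are invented for this sketch, not taken from any historical source), composing and decomposing such an address takes only a few lines:

```python
def make_address(user: str, host: str) -> str:
    """Combine a local user name and a host name into a
    network-wide mail address, the way Tomlinson did."""
    return f"{user}@{host}"

def split_address(address: str) -> tuple[str, str]:
    """Recover the (user, host) pair from an address;
    reject strings that lack the user@host shape."""
    user, sep, host = address.partition("@")
    if not sep or not user or not host:
        raise ValueError(f"not a network mail address: {address!r}")
    return user, host
```

The point of the @ convention was exactly this reversibility: any mail program on any host could tell, without a central directory, which part of the address named a user and which part named a machine.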

Ray Tomlinson in his later years, against the backdrop of his distinctive @ sign

Tomlinson began testing his new program locally in 1971, and in 1972 the network-enabled version of SNDMSG was included in a new release of Tenex, allowing Tenex mail to break out of a single node and spread across the network. The abundance of machines running Tenex gave Tomlinson's hybrid program immediate access to the majority of ARPANET users, and email was an instant success. Fairly quickly, ARPA's leaders made email part of their daily lives. Stephen Lukasik, the director of ARPA, was one of the early adopters, as was Larry Roberts, still the head of the agency's computing research office. The habit inevitably passed to their subordinates, and soon email was one of the basic facts of ARPANET life and culture.

Tomlinson's mail program spawned many imitations and new developments as users looked for ways to improve on its rudimentary functionality. Much of the early innovation focused on repairing the deficiencies of the mail reader. As mail moved beyond a single computer, the volume received by active users grew with the network, and the traditional approach of treating incoming mail as a single run of plain text was no longer effective. Larry Roberts himself, unable to cope with the flood of incoming messages, wrote his own program for working with his inbox, called RD. By the mid-1970s, however, the most popular program by a wide margin was MSG, written by John Vittal of the University of Southern California. We take for granted the ability to fill in the addressing fields of an outgoing message automatically, based on the incoming one, at the press of a button. But it was Vittal's MSG that first introduced this marvelous "reply" capability, in 1975; and it, too, was included in the Tenex suite of programs.

This variety of efforts called for standards, and this was the first, but by no means the last, case in which the networked computing community had to develop standards retroactively. Unlike the case of the basic ARPANET protocols, many variations of email were already in the wild before any standard existed. Inevitably, controversy and political friction arose, centered on the main documents describing the email standard, RFC 680 and RFC 720. In particular, users of non-Tenex operating systems were annoyed that the proposals were full of assumptions tied to Tenex features. The conflict never flared up too badly - all ARPANET users in the 1970s still belonged to a single, relatively small scientific community, and the differences were small - but it was a foretaste of battles to come.

The unexpected success of email was the most important development of the 1970s in the network's software layer, the layer most abstracted from the physical details of the network. At the same time, other people were venturing to redefine the underlying "communication" layer, in which bits flowed from one machine to another.

ALOHA

In 1968, Norman Abramson arrived at the University of Hawaii from California to take up a joint appointment as professor of electrical engineering and computer science. His university had a main campus on Oahu and a second campus in Hilo, as well as several community colleges and research centers scattered across the islands of Oahu, Kauai, Maui, and Hawaii. Between them lay hundreds of kilometers of water and mountainous terrain. A powerful IBM System/360 Model 65 ran at the main campus, but getting a dedicated line from AT&T to connect it to a terminal at one of the community colleges was nothing like as easy as on the mainland.

Abramson was an expert in radar systems and information theory, and had once worked as an engineer at Hughes Aircraft in Los Angeles. His new environment, with all the physical obstacles it posed to wired data transmission, inspired a new idea: what if radio was a better way to connect computers than the telephone system, which, after all, had been designed to carry voice, not data?

To test this idea and build the system he called ALOHAnet, Abramson obtained funding from ARPA's Bob Taylor. In its original form, ALOHAnet was not a computer network at all, but a medium for connecting remote terminals to a single time-sharing system on the IBM computer at the Oahu campus. Like the ARPANET, it had a dedicated minicomputer to process the packets received and sent by the 360/65: the Menehune, the Hawaiian counterpart of the IMP. However, ALOHAnet did not complicate its life with the point-to-point packet routing used in the ARPANET. Instead, any terminal that wanted to send a message simply broadcast it over the air on a shared frequency.

A fully deployed ALOHAnet in the late 1970s, with several computers on the network

The traditional engineering way to handle such a shared transmission band was to slice it into segments by time-division or frequency-division multiplexing, allocating one segment to each terminal. But to handle messages from hundreds of terminals that way, each would have to be confined to a small fraction of the available bandwidth, even though only a few of them might be active at any moment. Instead, Abramson decided not to prevent terminals from transmitting at the same time. If two or more messages overlapped, the central computer detected this via error-checking codes and simply discarded those packets. Receiving no acknowledgment, the senders would each try again after a random delay. Abramson calculated that such a simple protocol could support up to several hundred simultaneously working terminals and, despite the many overlapping signals, put about 15% of the bandwidth to use. His calculations also showed, however, that if the network grew beyond that, the whole system would collapse into a chaos of noise.
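Abramson's figure can be checked with a small simulation. In the classical analysis of this "pure ALOHA" scheme, a frame succeeds only if no other frame begins within one frame-time before or after it, which caps useful throughput at 1/(2e), about 18% of channel capacity, the same range as the figure quoted above. The sketch below is a Monte Carlo estimate of that ceiling; the function name and parameters are illustrative, not from Abramson's paper:

```python
import random

def aloha_throughput(offered_load: float,
                     n_frames: int = 200_000,
                     seed: int = 1) -> float:
    """Monte Carlo estimate of pure-ALOHA throughput.

    offered_load is the mean number of frame starts per
    frame-time (a Poisson process). A frame occupying
    [t, t+1] succeeds only if no other frame starts in
    (t-1, t+1); otherwise the transmissions overlap at the
    receiver and both are discarded. Returns the number of
    successful frames delivered per frame-time."""
    rng = random.Random(seed)
    starts = []
    t = 0.0
    for _ in range(n_frames):
        t += rng.expovariate(offered_load)  # Poisson inter-arrivals
        starts.append(t)
    successes = 0
    for i, s in enumerate(starts):
        clear_before = i == 0 or s - starts[i - 1] >= 1.0
        clear_after = i == n_frames - 1 or starts[i + 1] - s >= 1.0
        if clear_before and clear_after:
            successes += 1
    return successes / starts[-1]
```

Evaluating `aloha_throughput(0.5)` lands near the theoretical peak of 0.5/e, roughly 0.18, while pushing the offered load to 3 frames per frame-time drops useful throughput below 2%: the "chaos of noise" Abramson predicted for an overgrown network.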

The Office of the Future

Abramson's "packet broadcasting" concept did not cause much of a stir at first. But a few years later it was reborn, this time on the mainland. The occasion was Xerox's new Palo Alto Research Center (PARC), opened in 1970 right next to Stanford University, in an area that had recently acquired the nickname "Silicon Valley." Some of Xerox's xerography patents were about to expire, and the company risked falling into the trap of its own success, failing, willingly or not, to adapt to the rise of computing and integrated circuits. Jack Goldman, Xerox's head of research, convinced the top bosses that a new lab - removed from the influence of headquarters, set in a comfortable climate, offering good salaries - would attract the talent needed to keep the company at the forefront of progress by developing the information architecture of the future.

PARC certainly succeeded in attracting the best talent in computer science, thanks not only to the working conditions and generous salaries but also to the presence of Robert Taylor, who had launched the ARPANET project in 1966 as head of ARPA's information processing research office. Robert Metcalfe, a brash and ambitious young engineer and computer scientist from Brooklyn, was one of those brought to PARC through his ARPA connections. He joined the lab in June 1972, after working part-time for ARPA as a graduate student building an interface to connect MIT to the network. Even after settling in at PARC, he remained an ARPANET "facilitator": he traveled around the country helping to connect new sites to the network, and helped prepare ARPA's demonstration at the 1972 International Conference on Computer Communication.

Among the projects circulating at PARC when Metcalfe arrived was Taylor's plan to connect dozens, if not hundreds, of small computers into a network. Year after year, the cost and size of computers kept falling, obeying the indomitable will of Gordon Moore. The forward-thinking engineers at PARC foresaw that in the not-too-distant future every office worker would have a computer of their own. As part of this vision, they designed and built the Alto personal computer, a copy of which was given to every researcher in the lab. Taylor, whose belief in the usefulness of computer networks had only grown stronger over the previous five years, wanted to tie all these computers together as well.

Alto. The computer itself is located below, in a cabinet the size of a mini-fridge.

On arriving at PARC, Metcalfe took on the job of connecting the laboratory's PDP-10 clone to the ARPANET, and he quickly earned a reputation as the resident networking guru. So when Taylor wanted a network for the Altos, his assistants turned to Metcalfe. As with the computers on the ARPANET, the Alto computers at PARC had little to say to one another. So once again the network's most interesting application became communication between people - in this case, in the form of words and images printed with a laser.

The key idea for the laser printer originated not at PARC but on the East Coast, at Xerox's original laboratory in Webster, New York. There, physicist Gary Starkweather proved that a coherent laser beam could be used to discharge a xerographic drum, just like the diffuse light used in photocopying up to that point. Properly modulated, the beam could draw an image of arbitrary detail onto the drum, which could then be transferred to paper (since only the uncharged parts of the drum pick up the toner). A computer controlling such a machine could produce any combination of images and text a person could dream up, rather than merely reproducing existing documents like a photocopier. However, Starkweather's wild ideas found no support from his colleagues or superiors at Webster, so in 1971 he transferred to PARC, where he found a far more receptive audience. The laser printer's ability to render arbitrary images pixel by pixel made it an ideal partner for the Alto workstation, with its bitmapped monochrome graphics. With a laser printer, the half-million pixels on a user's display could be put directly onto paper with perfect clarity.

A bitmapped image on the Alto. No one had seen anything like it on a computer display before.

Within about a year, Starkweather, with the help of several other PARC engineers, had solved the main technical problems and built a working prototype laser printer on the chassis of the Xerox 7000, a workhorse copier, printing pages at a resolution of 500 dots per inch. A character generator built into the printer produced text in predefined fonts. Arbitrary images (beyond what could be generated from the fonts) were not yet supported, so the network did not have to feed the printer 500 megabits per second. Even so, keeping the printer fully occupied would have demanded incredible bandwidth for the time - an era when 50 kilobits per second was the limit of the ARPANET.

Second generation PARC laser printer, Dover (1976)

Alto Aloha Network

And how was Metcalfe to bridge that gap in speed? This is where we return to ALOHAnet, for it turned out that no one understood packet broadcasting better than Metcalfe. The previous summer, while in Washington working for Steve Crocker at ARPA, he had been leafing through the proceedings of the fall joint computer conference when he stumbled on Abramson's paper on ALOHAnet. He immediately recognized the genius of the basic idea, and also that its execution left much to be desired. With a few changes to the algorithm and its assumptions - for example, having senders listen first and wait for the channel to fall silent before attempting to transmit, and having them back off for exponentially growing retransmission intervals when the channel was congested - he could achieve bandwidth utilization of 90%, rather than the 15% of Abramson's calculations. Metcalfe took a short trip to Hawaii, and then incorporated his ALOHAnet ideas into a revised version of his doctoral dissertation, after Harvard had rejected the original for lack of theoretical grounding.
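The second of those changes, later standardized in Ethernet as truncated binary exponential backoff, is easy to sketch. The names and constants below are illustrative rather than historical; in particular, the 51.2 µs slot time and the 16-attempt limit are taken from classic 10 Mbps Ethernet, not from Metcalfe's memo:

```python
import random

SLOT_TIME_US = 51.2  # slot time of classic 10 Mbps Ethernet

def backoff_delay_us(collisions: int, rng=random) -> float:
    """Truncated binary exponential backoff: after the n-th
    successive collision on a frame, wait a random whole
    number of slot times drawn uniformly from
    [0, 2**min(n, 10) - 1] before retransmitting. The
    doubling range spreads retransmissions out as congestion
    grows, which is what keeps utilization high under load."""
    if not 1 <= collisions <= 16:
        raise ValueError("after 16 collisions the frame is dropped")
    max_slots = 2 ** min(collisions, 10) - 1
    return rng.randint(0, max_slots) * SLOT_TIME_US
```

After the first collision a sender waits zero or one slot; after the second, up to three slots; and so on, so that the expected delay doubles with each failure until the range is capped at 1023 slots. This adaptive retreat is exactly what ALOHAnet's fixed random delay lacked.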

At first, Metcalfe referred to his plan to bring packet broadcasting to PARC as the "ALTO ALOHA Network." Then, in a memorandum of May 1973, he renamed it the Ether Net, in reference to the luminiferous ether, the nineteenth-century physical notion of a substance that carries electromagnetic radiation. "This will promote the spread of the network," he wrote, "and who knows what other media of transmission will prove better than cable for a broadcast network; perhaps it will be radio waves, or telephone wires, or power lines, or frequency-multiplexed cable television, or microwaves, or combinations of these."

Sketch from Metcalfe's 1973 memo

From June 1973, Metcalfe worked with another PARC engineer, David Boggs, to turn his theoretical concept of a new high-speed network into a working system. Instead of sending signals over the air, as ALOHA did, they confined them to coaxial cable, which dramatically increased the available bandwidth compared to the slice of radio spectrum available to the Menehune. The transmission medium itself was completely passive and needed no routers to direct messages. It was cheap, it could easily connect hundreds of workstations - PARC engineers simply ran coaxial cable through the building and added taps as needed - and it could carry three million bits per second.

Robert Metcalfe and David Boggs, 1980s, a few years after Metcalfe founded 3Com to sell Ethernet technology

By the fall of 1974, the complete prototype of the office of the future was up and running in Palo Alto: the first batch of Alto computers, with drawing software, email, and word processors; Starkweather's prototype printer; and an Ethernet tying it all together. The only shared resource was a central file server that stored data too large for the Altos' local disks. PARC originally offered the Ethernet controller as an optional accessory for the Alto, but once the system was running it became clear that it was an essential part: a constant stream of messages flowed across the coax, much of it destined for the printer - technical reports, memos, and scientific papers.

In parallel with the Alto's development, another PARC project tried to push the idea of resource sharing in a new direction. POLOS, the PARC On-Line Office System, designed and implemented by Bill English and other refugees from Doug Engelbart's oN-Line System (NLS) project at the Stanford Research Institute, consisted of a network of Data General Nova minicomputers. But instead of devoting each machine to a particular user's needs, POLOS shuffled work among them so as to serve the interests of the system as a whole as efficiently as possible. One machine might generate images for users' screens, another handle ARPANET traffic, a third run word processing. But the complexity and coordination costs of this approach proved excessive, and the scheme collapsed under its own weight.

Meanwhile, nothing demonstrated Taylor's turn away from the resource-sharing approach to networking better than his embrace of the Alto project. Alan Kay, Butler Lampson, and the Alto's other creators put all the computing power a user might need into an independent desktop machine that did not have to be shared with anyone. The network's role was not to provide access to a heterogeneous collection of computing resources, but to carry messages between these independent islands, or to store them on some distant shore, for printing or long-term archiving.

Although both email and ALOHA emerged under the auspices of ARPA, the advent of Ethernet was one of several signs appearing in the 1970s that computer networking was becoming too large and too diverse for any single organization to dominate the field - a trend we will follow in the next article.

What else to read

  • Michael Hiltzik, Dealers of Lightning (1999)
  • James Pelkey, The History of Computer Communications, 1968-1988 (2007) [http://www.historyofcomputercommunications.info/]
  • M. Mitchell Waldrop, The Dream Machine (2001)

Source: habr.com
