Internet History: The Age of Fragmentation; part 1: load factor

By the early 1980s, the foundations had been laid for what we know today as the “Internet”: its core protocols had been developed and field-tested, but the system remained closed, under the nearly complete control of a single entity, the US Department of Defense. That would soon change: CSNET would extend access to computer science departments across many institutions, the network would keep growing through academia, and in the 1990s it would finally open up fully to commercial use.

But it was far from clear in the 1980s that the Internet would become the very center of the coming digital world, the much-touted “information society.” Even to people who had heard of it, it remained just a promising scientific experiment. The rest of the world, however, did not stand still, holding its breath, waiting for its arrival. Instead, many contenders for delivering online services to the mass consumer competed for money and attention.

Personal Computing

Around 1975, breakthroughs in semiconductor manufacturing led to a new type of computer. A few years earlier, engineers had figured out how to cram a computer's core processing logic onto a single microchip: the microprocessor. Companies such as Intel had also begun to offer fast, short-term memory on chips to replace the magnetic-core memory of the previous generation of computers. As a result, the most important and expensive parts of a computer came under the sway of Moore's law, which over the following decades relentlessly drove down the cost of processor and memory chips. By the middle of the decade, this process had already made these components cheap enough that a member of the American middle class could reasonably consider buying and assembling their own computer. Such machines became known as microcomputers (or sometimes personal computers).

There has been a fierce contest over the right to be called the first personal computer. Some point to Wes Clark's LINC, or to Lincoln Labs' TX-0, since, after all, each could be used interactively by a single person. Questions of primacy aside, any candidate for first place, judged by the historical sequence of events, must give way to one obvious champion: no other machine was as catalytic as the MITS Altair 8800 in the microcomputer explosion of the late 1970s.

Altair 8800 standing on an additional module with an 8" drive

The Altair became the seed crystal for the electronics hobbyist community. It convinced hobbyists that a person could build their own computer at a reasonable price, and these hobbyists began to gather in communities to discuss their new machines, such as the Homebrew Computer Club in Menlo Park. These hobbyist cells seeded a much more powerful wave of commercial microcomputers built on mass-produced machines that required no electronics skills, such as the Apple II and the Radio Shack TRS-80.

By 1984, 8% of US households owned their own computer, roughly seven million machines. Meanwhile, businesses were acquiring fleets of personal computers at a rate of hundreds of thousands per year, mainly IBM 5150s and their clones. At the more expensive end of single-user computing, a market was growing for workstations from Silicon Graphics and Sun Microsystems: more powerful computers with advanced graphical displays and networking hardware, intended for scientists, engineers, and other technical users.

Such machines had no entrée into the exclusive world of ARPANET. Yet many of their users wanted access to the promised fusion of computers and communications that theorists had been describing in the popular press since Licklider and Taylor's 1968 article “The Computer as a Communication Device,” and some even earlier. As early as 1966, the computer scientist John McCarthy promised in Scientific American that “the technology already demonstrated is enough to imagine computer consoles appearing in every home, connected to public utility computers by telephone.” The range of services such a system could offer, he said, was impossible to enumerate, but he gave several examples: “Everyone will have access to the Library of Congress, and of a better quality than what librarians currently have. Full reports of current events will be available, whether it be baseball scores, the Los Angeles smog index, or a description of the 178th meeting of the Korean Armistice Commission. Income taxes will be calculated automatically thanks to continuously accumulated records of income, deductions, contributions, and expenses.”

Articles in the popular press described the possibilities of e-mail, digital games, and services ranging from legal and medical advice to online shopping. But what exactly would it all look like? Many of the answers turned out to be far from the truth. Looking back, that era looks like a shattered mirror. All the services and concepts that characterized the commercial Internet of the 1990s, and many others besides, appeared in the 1980s, but in fragments, scattered across dozens of different systems. With a few exceptions, these systems did not intersect; they stood apart. There was no way for users of one system to interact or communicate with users of another, so attempts to draw more users into any one system were largely a zero-sum game.

In this article, we'll look at one subset of the players in this new digital land rush: companies selling time-shared access that tried to break into a new market on attractive terms.

Load Factor

In 1892, Samuel Insull, a protégé of Thomas Edison, went west to head a new division of Edison's electrical empire, the Chicago Edison Company. In this position he consolidated many of the key principles of modern utility management, in particular the concept of the load factor: the average load on an electrical system divided by its peak load. The higher the load factor, the better, because any deviation from the ideal 1:1 ratio means waste: capital needed to handle peak loads sits idle during the dips in the schedule. Insull decided to fill in the gaps in the demand curve by developing new classes of consumers who would use electricity at different times of day (or even in different seasons), even if he had to sell it to them at a discount. In the early days of electric power, electricity was used mainly to light homes, and mostly in the evening. So Insull began promoting the use of electricity in industrial production, raising daytime consumption. That still left dips in the mornings and evenings, so he convinced the Chicago transit system to convert its streetcars to electric traction. In this way Insull maximized the return on his invested capital, even though he sometimes had to sell electricity at a discount.
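
Insull's metric is simple arithmetic. A minimal sketch in Python, using hypothetical hourly demand figures (the numbers are illustrative, not drawn from the historical record):

```python
# Load factor = average load / peak load over some period.
# Hypothetical readings (arbitrary units) sampled across one day.
hourly_load = [40, 55, 90, 100, 70, 45]

average_load = sum(hourly_load) / len(hourly_load)  # 400 / 6 ≈ 66.7
peak_load = max(hourly_load)                        # 100
load_factor = average_load / peak_load

# A value closer to 1.0 means less capacity sitting idle off-peak.
print(f"load factor: {load_factor:.2f}")  # prints "load factor: 0.67"
```

Adding off-peak demand, as Insull did with streetcars and daytime industry, raises the average toward the peak and pushes the ratio toward 1.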

Insull in 1926, when he was featured on the cover of Time magazine

The same principles applied to investment in computers nearly a century later, and it was this drive for load balancing, with discounts offered during off-peak hours, that allowed two new online services for microcomputers to launch almost simultaneously in the summer of 1979: CompuServe and The Source.

CompuServe

In 1969, the newly formed Golden United Life Insurance Company of Columbus, Ohio incorporated a subsidiary called the Compu-Serv Network. Golden United's founder wanted to build a state-of-the-art, high-tech company with computerized record keeping, so he hired a young computer science graduate student, John Goltz, to lead the project. However, a DEC sales manager talked Goltz into buying a PDP-10, an expensive machine whose computing power far exceeded Golden United's current needs. The idea behind Compu-Serv was to turn this mistake into an opportunity: the plan was to sell the surplus processing power to customers who could dial into the PDP-10 from a remote terminal. In the late 1960s this time-sharing model of selling computing services was thriving, and Golden United wanted a piece of the pie. In the 1970s the division was spun off as its own company, renamed CompuServe, and built its own packet-switched network to offer low-cost, nationwide access to its computer centers in Columbus.

The national market not only gave the company access to more potential customers, it also smoothed the demand curve for computer time by spreading it across four time zones. Still, there remained a big gap between the end of the working day in California and the start of the working day on the East Coast, not to mention weekends. CompuServe CEO Jeff Wilkins saw a chance to fill it with the growing fleet of home computers, since many of their owners spent evenings and weekends on their electronic hobby. What if they were offered access to email, bulletin boards, and games on CompuServe's computers at a discounted rate on evenings and weekends ($5/hour versus $12/hour during business hours)? [about $24 and $58, respectively, in today's money]

Wilkins launched a trial service called MicroNET (deliberately kept at arm's length from the main CompuServe brand), and after a slow start it gradually grew into a remarkably successful project. Thanks to CompuServe's national data network, most users could reach MicroNET with a local phone call and avoid long-distance charges, even though the computers they were actually dialing into sat in Ohio. Once the experiment was judged a success, Wilkins retired the MicroNET name and brought the service under the CompuServe brand. The company soon began offering services aimed specifically at microcomputer users, such as games and other software that could be purchased online.

By a wide margin, however, the most popular services were the communication platforms. For long-running discussions and posted content there were forums, with topics ranging from literature to medicine, from woodworking to pop music. CompuServe generally left the forums to the users themselves, some of whom took on the role of "sysops" and handled moderation and administration. The other flagship messaging platform was CB Simulator, sketched out over a single weekend by Sandy Trevor, one of CompuServe's executives. It was named after the then-popular citizens band (CB) radio hobby, and let users join real-time text chats on dedicated channels, a model similar to the "talk" programs available on many time-sharing systems. Many users spent hours in CB Simulator, chatting, making friends, and even finding lovers.

The Source

Hot on the heels of MicroNET came another online service for microcomputers, launched just eight days later, in July 1979. It was aimed at almost the same audience as Jeff Wilkins's service, even though it developed along a completely different path. William von Meister, the son of German immigrants whose father had helped organize airship flights between Germany and the United States, was a serial entrepreneur. He would jump into a new venture as soon as he lost interest in the old one, or as soon as disappointed investors stopped backing him. It would be hard to imagine someone less like Wilkins. By the mid-1970s his greatest successes were Telepost, an electronic messaging system that sent messages electronically across the country to the exchange nearest the recipient and covered the last mile as next-day mail, and TDX, a system that used computers to optimize the routing of telephone calls, cutting the cost of long-distance calling for large enterprises.

Having predictably lost interest in TDX, von Meister threw his enthusiasm in the late 1970s into a new project, Infocast, which he wanted to launch in McLean, Virginia. It was essentially an extension of the Telepost concept, except that instead of using the mail to cover the last mile, it would use the FM sideband (the technology that delivers the station name, artist, and song title to modern radios) to carry digital data to computer terminals. In particular, he planned to offer it to geographically dispersed enterprises with many branch locations that needed regular updates from a central office: banks, insurance companies, grocery chains.

Bill von Meister

What von Meister really wanted to build, however, was a nationwide network for delivering data into homes via terminals, for millions of people rather than thousands. But it is one thing to convince a business to spend $1,000 on a custom FM radio and terminal, and quite another to ask private consumers to do the same. So von Meister went looking for other ways to bring news, weather, and other information into the home, and he found it in the hundreds of thousands of microcomputers sprouting like mushrooms in American offices and homes, homes already equipped with telephone lines. He partnered with Jack Taub, a wealthy and well-connected businessman who liked the idea so much that he wanted to invest in it. Taub and von Meister first called their new service CompuCom, in the typical clipped-and-compounded naming style of computer companies of the day, but then settled on a more abstract and evocative name: The Source.

The main problem they faced was the lack of technical infrastructure to carry the idea out. To get it, they struck deals with two companies whose combined resources rivaled CompuServe's: between them they had time-sharing computers and a nationwide data network, and both of these resources sat practically idle on evenings and weekends. The computing power came from Dialcom, headquartered near the banks of the Potomac in Silver Spring, Maryland. Like CompuServe, it had started in 1970 as a time-sharing provider, though by the end of the decade it offered many other services. Incidentally, it was at a Dialcom terminal that Eric Emerson Schmidt, future chairman and CEO of Google, first encountered computers. The communications infrastructure came from Telenet, a packet-switched network spun off earlier in the decade from Bolt, Beranek and Newman (BBN). By paying discounted rates for Dialcom and Telenet services during off-peak hours, Taub and von Meister were able to offer access to The Source for $2.75 per hour on nights and weekends, with a $100 up-front fee (about $13 per hour and $480 up front in today's dollars).

Beyond the pricing model, the main difference between The Source and CompuServe lay in what each expected of its users. CompuServe's earliest services included email, forums, CB, and software sharing. Users were expected to create their own communities and build their own superstructures on top of the underlying hardware and software, just as corporate users of time-sharing systems did. Taub and von Meister had no experience with such systems. Their business plan was built on supplying a wealth of information to upscale professional consumers: the New York Times database, news from United Press International, stock data from Dow Jones, airfares, reviews of local restaurants, wine prices. Perhaps the most telling difference was that The Source greeted its users with an on-screen menu of available options, while CompuServe greeted its users with a command line.

In keeping with the personal differences between Wilkins and von Meister, the launch of The Source was as splashy an event as MicroNET's was quiet. Isaac Asimov was invited to the launch party to announce in person that science fiction had become science fact. And, typically for von Meister, his tenure at The Source did not last long. The company immediately ran into financial trouble, with costs far outstripping income. Taub and his brother held a large enough stake in the business to force von Meister out, and in October 1979, just a few months after the launch party, they did exactly that.

The decline of time-sharing systems

The last company to enter the microcomputer market on load-factor logic was General Electric Information Services (GEIS), a division of the electrical engineering giant. GEIS had been founded in the mid-1960s, when GE was still trying to compete in computer manufacturing, as part of an attempt to dislodge IBM from its dominant position in computer sales. GE tried to convince customers that instead of buying computers from IBM, it would be easier to rent computing from GE. The attempt barely dented IBM's market share, but the business made enough money for GE to keep investing in it into the 1980s, by which time GEIS owned a worldwide data network and two large computing centers, one in Cleveland, Ohio, and one in Europe.

In 1984, someone at GEIS noticed how quickly The Source and CompuServe were growing (the latter by then had more than 100,000 users) and worked out how to put the data centers to work outside peak hours. To build its consumer offering, GEIS hired CompuServe veteran Bill Lowden. Lowden, irritated by the way corporate salespeople had begun muscling in on the increasingly attractive consumer business, had left CompuServe with a group of colleagues to try to build his own online service in Atlanta, called Georgia OnLine. They tried to turn their lack of access to a national data network into an advantage by offering services tailored to the local market, such as targeted advertising and local event listings, but the company went bust, so Lowden gladly accepted the offer from GEIS.

Lowden named the new service GEnie, a backronym for the General Electric Network for Information Exchange. It offered all the services The Source and CompuServe had developed up to that point: chat (a CB simulator), bulletin boards, news, weather, and sports information.

GEnie was the last personal computing service born of the time-sharing industry and its load-factor logic. As the number of small computers grew into the millions, digital services for the mass market gradually became an attractive business in their own right rather than merely a way to sweat existing capital. In the early 1980s, The Source and CompuServe were tiny companies serving a few thousand subscribers. Ten years later, millions of subscribers in the US were paying monthly fees, and CompuServe stood at the forefront of the market, having absorbed its former competitor, The Source. The same process made time-sharing access less attractive to businesses: why pay for communications and access to someone else's remote computer when it was so easy to equip your own office with powerful machines? This logic would not reverse until fiber-optic links dramatically lowered the cost of communications.

But companies offering time-sharing access were not the only ones who could enter this market. Instead of starting with large mainframes and looking for ways to keep them fully loaded, other companies started with technology that was already in millions of homes and looked for ways to connect it to a computer.

What else to read

  • Michael A. Banks, On the Way to the Web (2008)
  • Jimmy Maher, “A Net Before the Web,” filfre.net (2017)

Source: habr.com
