This article was initially published in Today's Engineer in November 2011.
For a century after Alexander Graham Bell invented the telephone in 1876, all telephone messages traveled as modulation of electric currents transmitted over copper wire or, to a lesser extent, as modulation of radio waves transmitted through the air. Within that framework there were multiple generations of improvements, many of them developed by AT&T’s Bell Telephone Laboratories, including metallic two-wire circuits, loading coils, vacuum-tube amplifiers, coaxial cable, and microwave radio relay systems. Together, these innovations achieved the interrelated goals of better sound quality, greater distances, greater capacity, and lower costs, as the telephone evolved from a local medium used by businesses and the well-to-do into a global medium and an integral feature of contemporary life. Until the 1960s, all transmission was analog: the modulations were electrical analogs of the original sounds being transmitted.
In 1938, Alec Reeves at the ITT (International Telephone and Telegraph Company) research laboratory in Paris conceived the idea that telephone signals could be converted into digital form for efficient transmission and then back into analog form for delivery. In this system, known as Pulse Code Modulation (PCM), the amplitude of an analog signal is periodically sampled, and each sample is translated into a binary code. However, he could not reduce the idea to a practical device using the technology of his day. With the availability of solid-state electronics, PCM was finally used in 1962 for AT&T’s T-1 digital transmission system. Digital transmission was more efficient, in large part because it could operate at higher frequencies. Digital circuits therefore had greater information-carrying capacity, which helped the telephone industry keep up with ever-increasing demand, not only from telephone calls but from transmission of television programming, and increasingly from transmission of digital computer data.
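The sample-and-quantize step Reeves described can be sketched in a few lines of Python. This is a minimal illustration only: it assumes 8-bit uniform quantization at 8,000 samples per second, whereas real T-1 carriers used 8-bit companded (non-uniform) PCM.

```python
import math

# Minimal PCM sketch. Uniform 8-bit quantization is an assumption for
# illustration; actual T-1 systems used companded (mu-law) encoding.
SAMPLE_RATE = 8000   # samples per second, enough for ~4 kHz voice
BITS = 8             # bits per sample
LEVELS = 2 ** BITS   # 256 quantization levels

def pcm_encode(analog_signal, duration=0.001):
    """Periodically sample an analog signal (a function of time) and
    translate each sample into an integer code word in [0, LEVELS-1]."""
    codes = []
    n_samples = int(SAMPLE_RATE * duration)
    for n in range(n_samples):
        t = n / SAMPLE_RATE
        sample = analog_signal(t)  # amplitude assumed in [-1.0, 1.0]
        level = round((sample + 1.0) / 2.0 * (LEVELS - 1))
        codes.append(level)
    return codes

def pcm_decode(codes):
    """Map each code word back to an approximate analog amplitude."""
    return [code / (LEVELS - 1) * 2.0 - 1.0 for code in codes]

# A 1 kHz test tone stands in for the "analog" voice input.
tone = lambda t: math.sin(2 * math.pi * 1000 * t)
codes = pcm_encode(tone)
recovered = pcm_decode(codes)
```

The decoded amplitudes differ from the originals only by quantization error (at most half a quantization step), which is why the analog signal can be recovered intelligibly at the far end.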
The invention of the laser in the early 1960s spawned a field of engineering known as optoelectronics, which grew steadily in the 1980s and 1990s. Many engineers believed that the laser would be useful for transmitting information through the air, but they soon learned that clouds, rain, and other atmospheric conditions sometimes blocked the beam. An alternative was to send laser light along glass fibers (similar to the way electric signals are sent along copper wires).
Researchers throughout the industry continued to look for a still higher-frequency, and hence higher-capacity, transmission medium. The idea of using glass fibers for telecommunication was first proposed in 1966 by Charles K. Kao and George A. Hockham at ITT’s Standard Telecommunications Laboratory in England. Kao demonstrated that there was no theoretical reason preventing a sufficiently pure glass fiber from having low enough attenuation to serve as a medium for information-bearing light waves. Light waves have much higher frequencies than the microwaves that were then the highest-frequency waves used in telephone transmission. Kao received the Nobel Prize in Physics in 2009 for this work.
However, although glass fibers were already in use to transmit light over short distances for purposes such as medical diagnostics, no fiber with sufficiently low attenuation for longer distances existed. In 1970, a team led by Robert Maurer with Donald Keck and Peter Schultz at Corning Glass developed the first suitable glass, which Corning then continued to improve. That same year, a team at Bell Labs developed the first semiconductor laser capable of continuous operation at room temperature, providing a practical pulsing light source suitable for a digital optical system. However, it remained difficult until the early 1980s to make glass fibers that could carry pulses of light over very long distances without weakening the signal.
Engineers installed the first experimental fiber optic systems in 1976. Using a gallium-arsenide semiconductor laser, AT&T laid an experimental 2,000-meter (1.25-mile) fiber optic cable under the streets of Atlanta, Georgia.
Test systems in several countries were quickly followed by field trials with customers. GTE installed a test fiber optic cable system in Long Beach, California, in 1977. AT&T quickly followed with one in Chicago, and the British Post Office with a system at Martlesham Heath. Meanwhile, in 1976, J. Jim Hsieh at MIT Lincoln Laboratory developed a laser that emitted light at a wavelength of 1.3 micrometers, the same wavelength that a fiber developed by Masaru Horiguchi at NTT could transmit with especially low loss, providing for a higher-capacity, lower-loss, and more efficient system. Other advances followed over the next few years. In 1979, AT&T installed a public demonstration system in Lake Placid, New York, which was used with great success to carry multiple television signals during the 1980 Winter Olympics. In 1983, U.S. long distance company MCI, working with Corning, opened a commercial 1.3-micrometer fiber-optic cable system between New York and Washington, which AT&T soon followed with a competing line. Since fiber optic transmission was digital, it was particularly well suited to the ever-increasing quantity of digital computer data being sent over the world’s telephone lines.
Beginning in the mid-1980s, fiber optic installations expanded rapidly all over the globe, and generations of improved systems followed quickly one after the other. Fiber had enormously higher capacity, which increased even further with each generation, and much lower operating costs. For example, the last copper transatlantic cable, TAT-7, opened in 1978 with a capacity of 4,000 calls; the first fiber cable, TAT-8, opened in 1988 with a capacity ten times greater. That was just the beginning of a massive increase in capacity; by the late 1990s, new generations of fiber optic systems could carry millions of calls, though in practice by this time most of what was transmitted was data, not conversation. To put it in data terms: coaxial copper cable carried millions of bits, or megabits, per second; early-1980s fiber optic cable, hundreds of megabits; 1990s fiber, gigabits; and 2000s fiber, terabits.
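As a rough sanity check on these orders of magnitude, a line rate can be converted into simultaneous voice calls using the standard 64 kbit/s digital voice channel (a DS0). The specific line rates in this sketch are illustrative round numbers for each era, not the measured capacities of particular cables.

```python
# Relate line rate to the number of simultaneous 64 kbit/s voice
# channels. The era rates below are illustrative round numbers chosen
# to match the orders of magnitude in the text, not exact capacities.
VOICE_CHANNEL_BPS = 64_000  # standard PCM voice channel (DS0)

def voice_channels(line_rate_bps):
    """How many 64 kbit/s voice channels fit in a given line rate."""
    return int(line_rate_bps // VOICE_CHANNEL_BPS)

era_rates = {
    "coaxial copper (megabits/s)": 10e6,
    "early-1980s fiber (hundreds of megabits/s)": 400e6,
    "1990s fiber (gigabits/s)": 2.5e9,
    "2000s fiber (terabits/s)": 1e12,
}

for era, rate in era_rates.items():
    print(f"{era}: ~{voice_channels(rate):,} voice channels")
```

At terabit rates, the quotient runs into the tens of millions of voice channels, consistent with the jump from TAT-7's 4,000 calls to the millions of calls quoted for late-1990s systems.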
In the 1980s, engineers assumed that optical cables would replace more expensive copper cables for telephone service, saving money in the process. When use of the Internet exploded in the 1990s, there was suddenly great demand for cables that could carry heavy loads of digital data. Optical fiber fit the bill perfectly, and many thousands of miles of new cable were laid all around the world.
Fiber optics rendered all previous telephone network transmission media obsolete. By 2000, copper wire for the most part persisted only in local loops that ran between telephone exchanges and individual subscribers, and microwave systems had been largely decommissioned. The cost of transmitting a phone call to any place on Earth within reach of a fiber-optic cable rapidly approached zero, thus knitting the planet more closely into a single instant communications web, greatly facilitating global commerce. Among other things, the widespread adoption of fiber optics made the global internet possible.