About John Mayo
Mayo’s early interest in amateur radio led him to the study of electrical engineering. His master’s thesis was on the use of silicone-bonded silica as an electrical material, and his PhD thesis was on a digital device for a computer, used for processing noise data. He went to work at Bell Labs, where his first work was the use of the TRADIC transistorized digital computer for control systems, especially radar applications. He then became supervisor for the T1 carrier project, department head, director of ocean systems (military applications), executive director of the #4ESS project (commercial telecommunications), executive vice president of Bell Labs in 1979, and president of Bell Labs in 1991, reaching mandatory retirement age in 1995. He had a role in the creation of a new R&D paradigm that set the stage for Lucent Technologies, formed in 1996. His career also included work on pulse code modulation systems, the Telstar satellite, and wireless communications.
He comments that the transistor is the crucial invention of the telecommunications revolution of the last fifty years, and that it won the Cold War. The three “killer technologies” have been electronics, photonics (lasers and glass fibers), and software. Nothing comparable has been invented since, and he sees future development as largely the exploration of those three technologies. He discusses significant developments including the Electronic Switching System, the Operations Support Systems, and the slow process of bringing wireless communication to market despite significant economic and regulatory hurdles. He describes how the break-up of AT&T affected research, not so much by shifting the areas of research as by introducing competition among researchers and increasing commercial pressures on costs and schedules. Other recent R&D changes include a focus on quality (inspired by the Japanese), a switch from serial to parallel design stages, a preference for reusable designs and assets, and more involvement of customers in the design process.
He discusses his involvement in the IEEE and the Communications Society. He thought the IRE and IEEE Proceedings were wonderful publications, served on various standards committees, and was involved in the Solid State Circuits Conference.
About the Interview
JOHN MAYO: An Interview Conducted by David Hochfelder, IEEE History Center, 9 December 1999
Interview # 383 for the IEEE History Center, The Institute of Electrical and Electronics Engineers, Inc.
This manuscript is being made available for research purposes only. All literary rights in the manuscript, including the right to publish, are reserved to the IEEE History Center. No part of the manuscript may be quoted for publication without the written permission of the Director of IEEE History Center.
Request for permission to quote for publication should be addressed to the IEEE History Center Oral History Program, 39 Union Street, New Brunswick, NJ 08901-8538 USA. It should include identification of the specific passages to be quoted, anticipated use of the passages, and identification of the user.
It is recommended that this oral history be cited as follows:
John Mayo, an oral history conducted in 1999 by David Hochfelder, IEEE History Center, New Brunswick, NJ, USA
Interview: John Mayo
Interviewer: David Hochfelder
Date: 9 December 1999
Place: Chatham, New Jersey
Milestones in the history of communications
What do you see as the milestones in the history of communications in the last fifty years?
It’s interesting that you specify fifty years, which takes us right back to the invention of the transistor. Clearly, the greatest milestone in communications in the last 50 years was the creation of solid-state electronics based on the transistor. Other great milestones came as a consequence of the potential of solid-state electronics. These include the breakup of the telephone monopoly, the merging of communications and computing, the recent restructuring of the industry, and the increasing availability of advanced services and electronic commerce.
Incidentally, the transistor changed a lot more than communications. It changed the world and ways of life. It was also a major force in winning the Cold War. Japan and the West benefited enormously from the economic power of solid-state electronics. Much of the world wanted to emulate Japan and Silicon Valley. Communism did not provide the opportunity to do that.
The three killer technologies are solid-state electronics, photonics, and software. Software is the means for putting electronics to work. Photonics based on the solid-state laser and glass fibers is probably the second greatest creation of the last 50 years.
Invention and development of the transistor
Tell me more about the environment leading up to the invention of the transistor and how you got involved in follow-up developments?
Communications has been steadily improving for thousands of years. It moved into the beginning of what we call the Information Age about 125 years ago with the invention of the telephone. Telephony gave us instant communications and that started an initiative that rapidly gained momentum. By fifty years ago the technologies providing telecommunications were barely able to support the increasing demands on them. Wires were growing in number at such a tremendous rate that telephone central offices found it extremely difficult and costly to manage all those wires. A point came where the Bell System had so many vacuum tubes in service that reliability level and power consumption were severe limitations to orderly evolution of the telephone network. It became apparent that the technology of vacuum tubes and wire was not going to meet the demand of telecommunications in the second half of the twentieth century.
The pressure for research to find better solutions to meet the needs of telecommunications led to the invention of the transistor. The transistor quickly became a promising technology for enhancing reliability and eliminating wire congestion. It also promised to reduce the power consumption and size of electronic equipment in both outside plants and central offices. Its success was so spectacular that modern telecommunications began with the invention of the transistor. With that came the opportunity to develop the full-blown Information Age in a way that did not excessively burden the economics of the nation.
The transistor was rapidly improved. It started out as a point contact transistor and quickly moved into junction devices. It then moved from germanium to silicon and became the technology underlying most electronic systems. At the time it didn’t seem like it was happening that fast. It took about ten years from the invention of the transistor to the time that telecommunications truly benefited from it. At the same time the technology was being applied to radio, television, and information processing. The computer of that day was usually a stand-alone computer in an information center. Later the computer became a vital component in telecommunications equipment. A notable milestone was the introduction of stored-program, electronic switching in telephone central offices.
I was fortunate to arrive on the scene at about the time that transistors showed great promise but did not yet have the capability to fully open up the Information Age. In those days one of the goals was to get the cost of a transistor down to one dollar. But even at that cost, the transistor was powerful enough to break the back of the wire-congestion problem in central offices and under the streets of metropolitan areas.
T1 carrier system and Pulse Code Modulation (PCM) systems
I worked on the T1 carrier system, which was an early user of the transistor. T1 carrier put twenty-four voice channels onto a pair of telephone wires. This gave a 12:1 congestion relief on wires running under the streets and into central offices.
When did T1 become operational?
The T1 carrier was put into commercial service in 1962 and was very successful. It was clear from the outset that its characteristics were much more important than just reducing wire congestion. For example, we were facing the challenge of connecting our large computer centers at Whippany and Murray Hill, two New Jersey locations about 15 miles apart. T1 lines were installed between these computers in the very early days of T1. We immediately saw a tremendous advantage in having an enormous bit rate, 1½ megabits per second, on each wire pair. We could instantaneously switch from voice to data, and later we could have voice on some of the channels and data on others. These links between Murray Hill and Whippany showed the far-reaching impact digital transmission would have on the evolution of telecommunications.
T1 carrier converts each voice channel to 64 kilobits per second. It took a lot of creativity to do that. Many people thought analog carrier was still going to be the cheapest method because of all the circuitry that digital conversion required. However, given the flexibility of digital transmission and the large number of channels T1 could carry on a wire pair, it prospered even before semiconductor technology moved into the domain of integrated circuits and very low cost.
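The rates quoted here follow from simple arithmetic. The sketch below uses the standard PCM figures (8,000 samples per second, 8 bits per sample); the single framing bit per frame that brings the line rate to 1.544 Mb/s is an added detail for context, not something stated in the interview.

```python
# Back-of-envelope check of the T1 rates described above.
SAMPLE_RATE_HZ = 8_000    # one sample per voice channel every 125 microseconds
BITS_PER_SAMPLE = 8       # companded PCM
CHANNELS = 24             # voice channels per T1 line

channel_rate = SAMPLE_RATE_HZ * BITS_PER_SAMPLE   # 64,000 b/s per channel
payload_rate = CHANNELS * channel_rate            # 1,536,000 b/s of voice
line_rate = payload_rate + SAMPLE_RATE_HZ         # plus one framing bit per frame

print(channel_rate)  # 64000
print(line_rate)     # 1544000, the "1.5 megabits per second" of the interview
```

The division of labor is visible in the arithmetic: the 64 kb/s figure is pure PCM, while the jump to roughly 1.5 Mb/s is just twenty-four such channels time-multiplexed onto one line.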
Where did the work on T1 carrier lead you?
We went on to work on high-speed Pulse Code Modulation (PCM) systems. Multiple 1½-megabit signals, multiple groups of twenty-four channels, were put onto what at that time was a very high-speed digital transmission system using coaxial cable. We built a model operating at 224 million bits per second. It carried television as well as voice and data, and it was operating by 1963 in the laboratory. Television was encoded at 108 megabits per second. The coaxial cable could carry a couple of television channels or about three thousand voice channels. The telephone network of the day had analog multiplexed channels called Mastergroups. A Mastergroup was 600 voice channels multiplexed into one broadband signal and carried in the telephone network over coaxial cable and radio. We could digitize that entire group of 600 channels and put multiple Mastergroups through the system. By 1963 discrete transistors, augmented by tunnel diodes, were the highest-speed devices. We were able to show that these high-speed digital systems carrying encoded television, voice, and large amounts of data were feasible.
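The channel counts mentioned here are easy to reproduce. The division below is illustrative only and ignores framing and multiplexing overhead, which is why the interview says "about three thousand" voice channels rather than the raw figure.

```python
# Rough capacity arithmetic for the 224 Mb/s coaxial PCM model.
VOICE_RATE = 64_000               # b/s per PCM voice channel
TV_RATE = 108_000_000             # b/s per encoded television channel
MASTERGROUP = 600 * VOICE_RATE    # 600 channels digitized: 38.4 Mb/s
LINE_RATE = 224_000_000           # b/s on the coaxial cable

print(LINE_RATE // VOICE_RATE)    # 3500 voice channels before overhead
print(LINE_RATE // TV_RATE)       # 2 television channels
print(LINE_RATE // MASTERGROUP)   # 5 digitized Mastergroups
```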
Was your work limited to coaxial cable?
We had two initiatives. During the early ’60s we worked first on coaxial cable. There was also work on intercity waveguides. These waveguides operated at millimeter wavelengths and were about two inches in diameter. The one that received the most attention was a round guide with a spiral copper lining: copper wire was wound and bonded to the inner surface of a pipe. It could be placed in the ground, much like large coaxial cables, and it had the ability to transmit an enormous number of channels. It was a complex microwave system, and the research people were always looking for better guides and better solutions. That included the so-called dielectric guides, and they dreamed of more elegant solutions.
That dream came true in a way that was almost unthinkable in the ’60s. The wavelength went all the way up to light, and the size of the guide came down to microns of active area. Glass, a material that man has a tremendous history of manipulating and manufacturing, was drawn into ultra-pure fibers, and they became the transmission medium of choice. The work on high-speed PCM systems, which had languished somewhat for lack of an excellent, low-cost transmission medium, took on new life. By the mid 1970s fiber guides were installed between Washington and New York, and they carried high-speed digital signals. The rate on each guide at that point was only 45 million bits per second. Forty-five million bits per second in the mid 1970s was not as dramatic as 224 million was in the early 1960s, but having it in real service rather than just in the lab was a tremendous accomplishment.
Evolution of telecommunications through transistors and software
Transistors were the primary technology of the telecommunications revolution of the last fifty years and they created the digital technology that set the stage for the laser and the optical fiber. The transistor moved on to integrated circuits, and we can put many millions of transistors on a chip. That solved the issue of the complexity of digital technology versus analog. Laser and fibers moved on to even higher speeds and increased functionality. Today, we can send tens of gigabits per second around the globe on those tiny fibers. These two technologies, transistors and photonics have relegated analog technology to short distances and specialized terminal equipment and made digital technology the choice for telecommunications.
What other technologies were critical to progress?
The third technology that evolved during this time was software. Modern telecommunication systems are based on the three technologies of the integrated circuit, the laser-fiber combination, and software. In the early work on the T1 carrier there was virtually no software. At that time the Bell System was essentially hardwired equipment and terminal blocks. Service changes took time, for they required wiring changes. Such systems were not very flexible in dealing with enhanced services and data. Many things that were being accomplished by manually installing equipment and rearranging wires could be done best in software. That led to two major initiatives.
Electronic Switching System (ESS); telecommunications service, reliability
The first initiative was the #1 Electronic Switching System (ESS), a software-controlled switching system for telephone offices based on a digital processor and magnetic switches. By the late 1960s we had an increasing amount of electronic switching and a lot of inter-exchange transmission by T1 carrier. Looking at the toll switching market, it was concluded by Earle Vaughan and others that it made sense to build an all-digital toll switch that would accept telephone conversations in the digital format of T1 carrier. Out of that came the #4ESS in 1976, which was a leading-edge, all-digital switching system for telephone central offices. I was fortunate to manage that project for a number of years. It was highly successful. It’s a system that is continually modernized and remains a workhorse to this day. A major advantage of the all-digital #4ESS was that it greatly simplified network management and maintenance by putting much of it under software control. And that brings me to the second major initiative in software.
Starting with the wire congestion of the post-World War II era, there was a lot of pressure put on the issue of maintenance and quality of service. There was a service crisis in the 1960s. It was not easy to get a dial tone in some places because the technology wasn’t able to handle the demand. That put a strong focus on the growing use of computers and software to manage the network. There was a major initiative, known as Operations Support Systems, to automate operations; it started in AT&T and moved to Bell Labs. These systems greatly multiplied the productivity of craftspeople, thereby greatly reducing time, effort, and costs. That eventually led to today's concept of pre-provisioning services and automated networks that operate basically under their own surveillance. If a circuit pack fails it tells the computer, and the computer informs the craftsperson that a replacement is needed. Automation of account handling and billing also became possible. Operations Support Systems established the concept that more services can be provided and maintained without having to drive trucks and move wires. That work can be done before capacity is exhausted. These are very simple points, but as you go into depth it’s a very complex technology.
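The self-surveillance idea can be pictured with a toy sketch: equipment reports its own faults, and the support system turns each report into a work order without anyone inspecting the hardware. Every name below is hypothetical, invented purely for illustration; none of it is drawn from any real Bell System operations support system.

```python
from dataclasses import dataclass

@dataclass
class FaultReport:
    office: str   # which central office
    shelf: str    # which equipment shelf
    pack: str     # which circuit pack failed

class OperationsSupportSystem:
    """Toy OSS: turns equipment fault reports into craftsperson work orders."""
    def __init__(self):
        self.work_orders = []

    def receive(self, report):
        # The circuit pack "tells the computer"; the computer tells the craftsperson.
        order = (f"Replace circuit pack {report.pack} "
                 f"in shelf {report.shelf} at {report.office}")
        self.work_orders.append(order)
        return order

oss = OperationsSupportSystem()
oss.receive(FaultReport(office="Chatham CO", shelf="S3", pack="CP-17"))
print(oss.work_orders[0])
```

The point of the design is that the work order is generated before any truck rolls: the craftsperson is dispatched with the failed pack already identified.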
Thus, I think the technology is so rich that the telecommunications network today is not limiting the development of the Information Age. There are a lot of companies that want to provide new services, and the Internet is a fertile domain for them. However, the local loop is still somewhat limiting because of the economics of bringing versatile, broadband, high-speed capability to individual customers.
Is that the “last mile”?
It’s been referred to as the “last mile problem.” I like to think of it as the “last mile challenge.” We have the technology to do it. Fiber, coaxial cable, and broadband radio can be brought into every home. Coaxial cable is already in a large fraction of homes. The challenge is to do this in a way that appeals to the customer and the community. The streets can’t all be dug up, nor can everything be hung on poles. As a nation we have to deal with the fact that this is a very expensive part of the network, because it’s largely one connection per customer. There is a large embedded investment in what’s out there, and not all of that investment has been recovered. This issue remains a major force in telecommunications regulation and the positioning of various industries. Again, we have powerful technologies that can bring whatever bandwidth is needed to the user's fingertips, ears, or eyes. We don't have the economic, social, or political issues resolved completely. But, I must say, we have made a lot of progress in recent years.
What other technologies did you work on?
I also had the good fortune of working on the command system for the Telstar satellite. When that technology was ready to proceed there was a regulatory delay of significant proportions. The Bell System was not allowed to get into that business. Satellite communication became a very important part of global communications, but was to some degree eclipsed by optical fiber. From the very beginning there was a concern about the delay in getting up to and back from synchronous satellites. Various technologies help accommodate that delay, but even today there is a difference between fiber optic and satellite channels due to that delay. It’s a very important technology, but after Telstar it was never a mainstream activity at Bell Laboratories. As a manager at Bell Laboratories I was involved in contracting for satellites from outside contractors. We had groups that worked both in Bell Laboratories and at the contractor sites to achieve the global high-density communication satellites that AT&T needed.
A technology I was involved with as a manager but not as an engineer is wireless communications. It’s transistors and software put to work to give mobility without an umbilical cord, which is very important. It was invented in Bell Labs about 1947, and the research was largely completed by 1957. There were many years of delay because it required significant radio spectrum. Radio spectrum was the easy way to communicate. Broadcasting licenses and various military and other applications were extremely high leverage and extremely important, and many spectrum holders were not readily willing to see spectrum allocated to an upstart service. Some people didn’t see the difference between an improved mobile car telephone and the cellular wireless radio concept. We technologists at Bell Labs did, and AT&T management did. Getting 25 or 40 MHz of bandwidth in order to make this system viable was a very serious national issue that was not taken seriously for almost a decade. AT&T almost backed out of it altogether because they had little reason to spend more money on a system that required spectrum they couldn’t get. Spectrum was finally freed up, and the first system was put in service in Chicago as a field trial with actual service to a select customer base. I was one of the customers. It immediately became obvious that this wasn’t just a better telephone in the trunk of a car, although in the early days that’s exactly what it was. It was not an improved mobile telephone system, but something different. It was much better. It was much more capable, more flexible, and much higher quality. That opened the door to wireless communications, which has expanded with the same kind of exponential growth that the transistor, laser, and software showed. It could be called a new technology, but it’s fundamentally transistors and software put together.
What year was the mobile communication system set up in Chicago?
The first service was in 1978. Commercial systems went into service starting in 1985.
My understanding is that AT&T was poised to enter the wireless market. Is that correct? I also understand that McKenzie recommended that AT&T not get involved because the market was too small.
I think you should go to a primary source for business issues. Bell Labs people are not a primary source for business issues. As a bystander, because I was not responsible during that period, what I saw from AT&T was an initial enthusiasm for mobile communications. From Bell Laboratories, I saw the selection of cellular as the preferred means. There were many other proposals, like running wires along highways and things of that sort. But, by 1960 Bell Labs and AT&T had settled on cellular radio as the technology of choice for mobile telephony. AT&T pressed the FCC for spectrum to get on with allowing people to benefit from the technology. A limitation the FCC placed on AT&T was that AT&T could not make terminal equipment. Bell Laboratories then worked with outside suppliers. Motorola, Johnson and Oki were the three providers of terminal equipment. Of course later, with divestiture, things changed altogether.
Through the early 1980s the growth rate of wireless was significant, but the size of the wireless equipment business was still modest. There was a question, particularly on the equipment manufacturing side, as to what volume of business could be anticipated. There was a divergence of opinion. If that is the subject of the McKenzie study you reference, I would urge you to talk to McKenzie. That’s only fair. I believe what McKenzie said was that as long as the terminal equipment cost $1,500 the volumes would remain low. What this translates into for equipment businesses is that with multiple suppliers no supplier would do more than a few hundred million dollars of business a year.
I believe that there was nothing wrong with the McKenzie Study given the time frame in which it was produced. But they missed what we all missed. That is, the innovative capabilities of sales and marketing people. They came up with the notion that a service provider can get substantial revenue by incrementally adding a new customer to an existing customer base. Therefore, a finder’s fee could be given to the terminal equipment supplier, who could use it to offset the cost of the cellular phones. The result was that the price of phones plummeted. Though there are some very advanced phones that still sell for $1,500 or more, people like me buy much cheaper ones. On occasion people get their terminals for free simply by signing up for the service. That marketing innovation leading to very low cost phones was key to creating the exponential growth in wireless telephony that we enjoy today. It has made both the service business and the equipment business very attractive.
Thanks for pointing that out.
Those of us involved in the design and manufacture of cellular equipment had to manage it as a small business for a significant number of years. It was very difficult, because any powerful new technology requires a lot of money to get started. Equipment sales did not build up right away. That meant money invested might not earn returns for a long time. The level of expenditures we were making was quite high compared to any other stand-alone small business. If a low-priced terminal had been available from the outset, things would have been different.
Do you think that wireless communications business might have in that case taken off a decade sooner?
No. Regulatory delay cost us a decade. High priced terminals only slowed the initial build up. If we had gotten a low priced terminal a few years earlier, it would have helped business cases. Even so, a lot of companies, including AT&T, spent a lot of money in getting cellular systems in service. The study to which you referred did not delay cellular technology. It made some of the early work harder to justify.
Thanks for that clarification. Tell me about how you got interested in communications and what jobs you had at Bell Labs.
I went to grade school in the 1930s and high school in the 1940s. That was a time when radio broadcasting had just begun to mature. By World War II most communities of any size had a radio broadcasting station. In that day, and continuing to a lesser extent today, there were radio initiatives quite like aspects of the Internet. There’s a cycling of that phenomenon. People used amateur radio as a chat medium. One had to be licensed to do it. The chatting was mostly about the technology, and particularly the home-built transmitter and home-built receiver. I got caught up in that, and my tie to electronics was a fascination with amateur radio. I knew without any doubt that I wanted to go into electrical engineering and that I wanted to go through to a Ph.D. My first contribution as a student was my Master’s thesis, where I worked on materials. Silicone had just come out at that time, and I worked on the use of silicone-bonded silica as an electrical material. After finishing my Master’s thesis, I wanted to continue in something closer to the mainstream of communications and information processing. My Ph.D. thesis was done in the early to mid-1950s and had to do with a digital device for a computer. At that time computers were analog, and I was involved in the processing of noise data recorded from all over the world. We tried to automatically capture that data and deal with it on a hybrid analog-digital computer. The goal of the project was to derive statistical measures that define the essence of radio noise recordings.
Work with TRADIC at Bell Labs
When I came to Bell Laboratories I was fascinated by the first transistorized digital computer, TRADIC. It had been built using point contact transistors. It could do fascinating things, and it opened up a new digital world. I got involved in the use of that computer for control systems, particularly in the use of the digital computer in radar ranging and in the pointing of radar antennas. It was very interesting work. The TRADIC computer was done under contract with the Air Force. When they heard of the transistor they immediately saw a chance to put computers on airplanes, because the transistor is such a low-power and lightweight device. Not in my department but in a sister department, there was a project called TRADIC Flyable whose goal was to put our computer into a more capable system inside an airplane. Back on the control systems work, I recall advancements ushered in by the computer at that time. In particular, there was the use of the computer to run real-time system simulations every few milliseconds and use the simulations to turn a motor on or shut the motor off so it would come to a halt at just the right point. That was my life as an engineer.
Management of T1 carrier project, high-speed pulse code modulation
What did you do next?
I was offered the opportunity to work on the T1 carrier project as a supervisor. The repeater was my responsibility, and we had to figure out how to transmit 1½ million bits per second over ordinary telephone wires. The problems of transmission at those high frequencies were cross-talk, pulse retiming and regeneration, level control, and equalization, as well as installation and maintenance in the outside plant environment. That work led to the T1 carrier system. The final development of the T1 carrier system was done by the Merrimack Valley branch of Bell Labs. I was involved in exploratory development and the field experiment.
I then became a department head and we moved on to high-speed pulse code modulation work. We built the 224 million bits per second system discussed earlier. We also put 6 million bits per second on wire pairs, the so-called T2 system.
Telstar Satellite System; ocean systems
During that time frame the Telstar satellite was being developed. Because we had digital experience, we worked on the design of the command decoder for the Telstar Satellite System. After a period of excellent service, Telstar was damaged by radiation, probably due to nuclear testing. We had a very interesting little event where service was restored by earth-bound diagnosis and correction of the satellite malfunction.
I was offered the opportunity to go back into military work as a director and was responsible for what we called ocean systems. Incidentally, that is a wonderful business. The work involved ocean cables and ocean sonar, directed primarily at military applications. We worked on very interesting scientific applications of acoustics and transmission, digital technology, and particularly the processing of data. I think the use of minicomputers to deal with data was pioneered with that kind of work. We used minicomputers to support the operations of the sonar systems, creating some of the earliest minicomputer-based operations support systems.
What did you do next?
I was then offered the opportunity to return to commercial telecommunications as executive director of the #4ESS project. The work there was primarily at Naperville, Illinois and Columbus, Ohio, with the systems engineering work being done in a separate group at Holmdel, New Jersey. I had the development and systems design responsibilities for #4ESS and the interaction with the manufacturer, Western Electric. Bell Labs was heavily involved in bringing the first few systems into service. Then I was offered the opportunity to become vice president of electronics technology. I spent four years there overseeing and providing the basic technologies of the Information Age. The focus was on the integrated circuit and the laser, as well as power, energy, and printed wiring. I worked through four years of the technology’s vibrant growth. That was very valuable experience.
Service as vice president, executive vice president of Bell Labs
In 1979 I became executive vice president of Bell Laboratories and was responsible for the systems and technologies that went into the telephone network. That included electronics technology, switching, transmission and operations support systems. Later, as AT&T was divested, my role expanded to include communications services as well. Those were critical years, because of the repositioning of AT&T and the resulting impact on Bell Laboratories. Ian Ross was the president of Bell Laboratories at the time and he did a tremendous job of repositioning Bell Laboratories.
When did you become president of Bell Labs?
In 1991 I was asked to be the president of Bell Laboratories. I managed the research and development activities of Bell Laboratories for four years. I was fortunate, because that was the time when the impact of the new communications environment was really being set. The 1984 break-up of the Bell System only set the stage. What eventually became the Telecommunications Act of 1996 basically opened up everything to competition. I reached mandatory retirement age in 1995. AT&T chose to form Lucent Technologies in 1996. What we did in the first half of the 1990s helped position Lucent Technologies and Bell Laboratories for the tremendous success they have enjoyed. Importantly, we created a new R&D paradigm. We brought R&D closer to customers, and we found ways to do things much faster and cheaper. We put the quality methodology to work. It was a challenging and very productive time, and it’s gratifying to see how well it positioned the launch of Lucent Technologies.
Bell divestiture and Bell Labs
Would you talk a little more about how the breakup of the Bell System affected innovation at Bell Labs?
There were two events. In 1983 the FCC "computer inquiry" directives put partitions between parts of telephony. In 1984 Judge Greene split off local service. The net effect of those two actions was that a new environment for R&D was created. However, what R&D was doing was not significantly impacted. Research still needed to go after advancements in software, microelectronics, lasers, fibers, and create new innovations. Development continued to improve electronics technology and design new transmission and switching systems as well as new terminals and operations support systems.
The equipment market stayed pretty much the same. AT&T had started buying a lot of equipment from the general trade, but a large fraction of the equipment in the divested operating companies, and in the new AT&T, was from Western Electric. The divested companies were major customers of the AT&T manufacturing units. Bell Labs continued to be oriented around electronics technologies, switching systems, transmission systems, terminals, and operations support systems. The business units came in to manage how the technology went to market. Western Electric was broken into business units and disappeared as an entity. Bell Labs remained an entity doing the same kind of work on the same kinds of systems. Reporting assignments were altered, but I’m hard pressed to think of a single system that the R&D community stopped working on as a result of the 1984 decree. However, Bell Labs split in 1984 in order to provide an R&D capability to the divested companies.
The regional companies formed their own R&D and began doing their own research, systems development and systems engineering. We transferred about 10% of Bell Labs to Bell Communications Research, the new R&D unit for the divested companies. That was done by having people follow their work. Work on local switching engineering and work on large operations support systems moved almost entirely out of Bell Labs along with a selection of researchers, transmission staff, and other people.
Through the 1980s, Bell Laboratories evolved the relationships under which it did much the same thing as before. People continued to work on what they would have been doing anyway. Customers became more important and every year the priorities were reexamined. The typical engineer’s job continued, but the competitive environment put more pressure on cost and schedules. We were always challenging ourselves to do things faster, at lower cost and with higher quality. We focused on the product realization process and how to lead as strongly in product realization as the Bell System had led in telephone service. It was a different mindset. It also evolved that the customer had to become a part of the process, whereas that was not the case in the Bell System days. That was a change for the R&D community. However the real change in the R&D paradigm, the way in which R&D is done, came in the early ‘90s as a maturing of the quality initiative.
Quality initiative and new R&D paradigm
Please talk more about the quality initiative and the new R&D environment.
The quality initiative was born as a solution to competitiveness. The Japanese perfected it as their means to compete in our environment by providing us things we like at attractive costs. American industry began to question the very way in which it was doing R&D. A new paradigm emerged where time to market was a critical factor. Things had to be done in parallel by multi-function teams for many projects. For example, the optical amplifier was a tremendous enhancement to undersea fiber optic cables. To achieve that, we brought together the optics research, optics component development, systems design and systems development people as well as the manufacturers and installers as a team. We worked as a team and did as much as we could in parallel. We tried to get rid of serial handoffs. A lot of delay and cost is caused by serial handoffs. Even though there was risk in doing everything in parallel, we caught problems early enough that we reduced the risk to an acceptable level. In the old days design fixes would be sent back upstream to the appropriate link in the serial process. In those days people worked very hard to see that there was no problem so they wouldn’t have to reiterate, but when serious problems arose, it could take a year or more to go back, redesign and make things right. Thus through multi-function teams and reiterating in the small we were able to dramatically reduce the time to bring the technology to market.
Another tool of the new R&D paradigm was reuse of designs. For example, software development had gotten to the point of consuming enormous resources, controlling time to market and being the source of most customer complaints. The new R&D paradigm preferred the concept of reusable assets. Rather than design software for each system, we could create a "warehouse" of reusable software components. The components were designed once and reused across many systems. It improves quality, saves a lot of effort and time, and brings in a certain amount of standardization.
In the new R&D paradigm, the end-user customer is brought closer to R&D. So we pursued the notion that customers should not be kept in the dark until the product was already developed. We began to couple them better to the design and problem-solving process. In many companies in the past, products went off to market at great expense only to find that the customer would not buy them.
Did you have any projects like that?
The first big project in the Bell System that I saw that was like that was the picture phone. R&D worked hard to get everything right. A full motion picture phone was designed in about 1963. When it was brought to market, the customer didn't buy it. Willingness to pay was not a sufficient part of the design process. If that had been done through teams and with the customer as a part of the team, perhaps the project would have ended differently, I don’t know. My guess is that the project would have been stopped earlier. After all it is now forty years later and even with so many technological advances, there is still no real picture phone. The customer hasn’t embraced it yet.
The picture phone is an interesting example. My understanding is that it was unveiled at the New York World’s Fair in 1964 or 1965.
Yes. We were fully prepared to market the product, and we introduced it at the 1964 World’s Fair. I had it on my desk for quite a few years. We used it internally. It added a new dimension to telecommunications, but it never really caught on somehow. If you asked how many of my calls were picture phone calls, I would say about 25 percent. I tended not to use the picture phone when I was in a hurry and just had a simple question to ask. It’s a generational thing too. If I’d had a picture phone all my life, I probably couldn’t have lived without it.
I remember my first color television. We got it as an anniversary present. And we said, “Well, we know we don’t need it. Color doesn’t do anything, but that’s what anniversary gifts are all about -- opportunities to spend money on things that are not really needed and don’t make any sense.” Well, about three weeks later we didn’t want to look at a black and white set again, and never did. A picture phone somehow doesn’t catch on in three weeks. Nobody knows why.
IEEE, Communications Society
I’ve been a fan of the IEEE for many years. There was something in my generation and in me about a thrust for excellence that was overpowering. It may also exist in today's generation. The reason I came to Bell Laboratories was because the Bell Labs books I used as textbooks were excellent. The book by Bode was one of those textbooks. It opened my eyes to the depth that can be achieved over and beyond regular textbook material. That’s a way to lead into something that will show my age now. The IRE Proceedings, and later the IEEE Proceedings, were wonderful publications. I was intrigued by the caliber of their technical articles and what could be gotten out of a really good paper written by an expert in one’s field of interest.
Bell Laboratories provided both support and a high standard of excellence. Management at Bell Laboratories was very supportive of work for the IEEE. I was very impressed with the [[IEEE History|IEEE]] (IRE at the time), so I got involved. Since my first job entailed using computers for automatic control, I joined the subcommittee dealing with that technology and standards. It was a very important subject for the [[IEEE History|IEEE]] because computers in control were a hot topic of that day. I attended many meetings at headquarters in New York City relating to that work.
When involved with the T1 carrier, I was very much involved in semiconductor devices. If you ask who made T1 carrier a success, I’d have to say it was the device community. Without good devices the T1 carrier is nothing. Without integrated circuits, digital systems would be nothing. So, I got involved with the Solid State Circuits Conference. That conference then was held in Philadelphia every year at the University of Pennsylvania in the snow. I was on the program committee for the Solid State Circuits Conference for almost ten years, was program chair one year and general chair one year. I was also involved in the Northeast Electronics Research Meetings (NEREM), especially in organizing sessions.
I joined the IEEE group working on standards for integrated circuits and operational amplifiers, and got involved in standards on a broader front. I’ve served on IEEE awards committees and basically got involved in the IEEE where I could provide a service. I’ve been fortunate to have my work recognized by the IEEE. That’s a matter of record.
Predictions on the future of telecommunications
What do you see for the future in telecommunications over the next ten to twenty years? What might be some of the technical challenges?
<flashmp3>383 - mayo - clip 5.mp3</flashmp3>
I’ve given many talks on this subject, so you can find my views by looking up some of the papers and talks on the future of telecommunications. In summary, nothing in the foreseeable future is going to displace the three killer technologies that have brought us the Information Age. Microelectronics is going to continue to evolve for a number of years, even though we may be reaching maturity on feature size. For the last fifty years innovation in microelectronics has largely been putting more circuitry on a chip. That means circuits that are faster, cheaper and use less power per function. There will be innovations on other fronts, such as getting more software onto the chips and more one-chip systems. However I don’t see anything that’s going to challenge the silicon chip for mainstream Information Age circuitry. I see fiber optics and photonics continuing to advance rapidly in both capacity and distance. Wavelength multiplexing and other innovations will add flexibility to fully use capacity. There will certainly be more innovations in structures for lasers and laser arrays. So far photonics has been used primarily for transmission and storage. Early work at Bell showed photonics can also do computing functions such as logic and data manipulation. Photonic logic circuits look primitive compared to the integrated circuit chip, but I wouldn’t write it off. I believe that someday it will enter that domain. The resulting computers will be massively parallel, very different from today's massively serial computers. People correctly argue that it’s tough to control light in small dimensions, but my response to that is to point out that we form integrated circuits with light. If light can be controlled to form the features of the integrated circuit, then maybe it can be controlled to do what the integrated circuit does.
In regard to software, we have a long way to go. I don’t see the technological limits for software being reached for many, many decades to come. Even if the hardware technologies mature, software will continue to prosper around those mature technologies. The Information Age may mature in another ten or twenty years to a point where hardware and software engineering looks a lot like civil engineering. That is, we’ll keep building new systems, but largely off standardized charts, computer aids and other reusable assets.
We have a challenge with mobility. The radio spectrum is the only way to get mobility right now, and that’s limiting. Radio propagation is not as nice as fiber propagation. We still have problems with size, but we can build many things as small as humans can handle. There are still many challenges with wires, boards and physical design. Batteries running down and wires that break are some of the biggest impediments. I don’t think those problems will be solved in ten years. It’s sort of like cars breaking down. Cars are made better and better, but there is no technology that stops breakdowns.
After Information Age hardware and software technology matures, the frontier will move into how these systems and technologies are used for the good of society. Bold new frontiers will open up, and the Internet is at the leading edge. I was just out Christmas shopping and every time I got back in the car I told my wife, “The Internet is the answer.”
Thus far the Information Age has been primarily a technology or capability concept, but it’s going to turn into a “doing it” concept. Internet is just the beginning of a new way of life. It opens doors to all kinds of new frontiers, making life more interesting, rewarding and challenging. Perhaps the biggest challenge is to smooth out the bumps on the road to this new way of life. The Industrial Revolution had its negative side and the Information Age will have its negative side. We will have to deal with that. Some of it will have to come through laws and law enforcement. A lot will also come through changing generations. A lot happens to make our children better than we are. That’s the good side of evolution, and laws aren’t needed for that. New generations can pick the best, go with it and can make the change to a better world.