About Jack Sipress
Sipress received his Bachelor's, Master's, and Doctorate degrees at Brooklyn Polytechnic. He then spent his career at Bell Laboratories, which he joined in 1958. The first part of his career, through 1976, was devoted to the development of digital transmission, particularly the T1-T4M systems. He then spent two years working on satellite systems, particularly with American Bell International, trying to set up a satellite system for the Shah of Iran. In 1978 he went to work on Bell Labs' fiber optic system, particularly its installation in submarine cables. He notes that the installation of fiber optics was in part an end-run by AT&T around the FCC's favoring of satellites over submarine cables.
About the Interview
JACK SIPRESS: An Interview Conducted by David Hochfelder, IEEE History Center, 10 September 1999
Interview # 365 for the IEEE History Center, The Institute of Electrical and Electronics Engineers, Inc.
This manuscript is being made available for research purposes only. All literary rights in the manuscript, including the right to publish, are reserved to the IEEE History Center. No part of the manuscript may be quoted for publication without the written permission of the Director of IEEE History Center.
Request for permission to quote for publication should be addressed to the IEEE History Center Oral History Program, IEEE History Center at Stevens Institute of Technology, Castle Point on Hudson, Hoboken, NJ 07030 USA or firstname.lastname@example.org. It should include identification of the specific passages to be quoted, anticipated use of the passages, and identification of the user.
It is recommended that this oral history be cited as follows:
Jack Sipress, an oral history conducted in 1999 by David Hochfelder, IEEE History Center, Hoboken, NJ, USA.
Interview: Jack Sipress
Interviewer: David Hochfelder
Date: 10 September 1999
Place: IEEE History Center, New Brunswick, New Jersey
Communications engineering; digital transmission
Give a brief outline of the history of communications engineering since World War II.
I'm not an expert in the history of communications since World War II; I can only talk about my involvement with it. I got involved with communications in approximately 1960, a couple of years after I joined Bell Laboratories. I was first involved for a number of years in developing digital transmission facilities operating on copper cables, starting with wire-pair cable. That's the era when T1 was first starting. I got involved with T2 in approximately 1962, which operated at six megabits. Most of this was just exploratory work, including some at 274 megabits, but these systems never found much application.
In the early '60s there was a series of transmission facilities that were analog, on wire-pair cable and coaxial media. Also in the early '60s, the digital era started with one and a half megabits on wire-pair. By the late '70s, people were talking about forty-five to ninety megabits on optical fiber. Roughly around 1980, serious work got started on deploying optical technology on undersea cables. That work started out at 280 megabits on optical fiber under the ocean, and currently people are talking about several terabits on a cable under the ocean.
Would you explain the terms T1, T2, T4, and T4M?
T1 was the first incarnation of a digital transmission facility, intended primarily as a facility to go between local telephone company central offices. It used wire-pair cables, copper wires. It operated at one and a half megabits, twenty-four voice channels. It was the first major digital communications facility developed, and it was a great success and still has widespread use; it hasn't been retired yet. T2 was an attempt to extend that technology to six megabits. It was a development effort at Bell Laboratories in the mid-sixties timeframe and concluded around '66. It never found widespread use, with only a few applications in North Carolina in the mid-sixties. T2 operated at the second level of the digital signal hierarchy, the DS2 level, six megabits, which was four times the DS1 one and a half megabit level. T3 is the DS3 level, which is seven times DS2, approximately forty-five megabits. That never had a copper wire application associated with it. T4 in the mid-'70s operated at the DS4 level, which was six times DS3, 274 megabits. T4 found a couple of applications. The major one was in New York City in '73 or so. The "M" in T4M referred to "metropolitan," which meant it was a metropolitan application of the T4 technology as opposed to a long haul or Long Lines application.
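The multiplex factors in the DS hierarchy he describes can be checked with simple arithmetic. The sketch below uses the nominal line rates; the small excess of each level over the pure multiple of the level below is framing and bit-stuffing overhead, a detail not discussed in the interview:

```python
# North American digital signal (DS) hierarchy line rates, in Mbit/s.
# Each level carries several signals of the level below plus framing
# and stuffing overhead, so the rate slightly exceeds the pure multiple.
DS1 = 1.544    # 24 voice channels (T1)
DS2 = 6.312    # 4 x DS1 (T2)
DS3 = 44.736   # 7 x DS2 (T3)
DS4 = 274.176  # 6 x DS3 (T4/T4M)

print(4 * DS1)  # 6.176   -> DS2 is 6.312; the difference is overhead
print(7 * DS2)  # 44.184  -> DS3 is 44.736
print(6 * DS3)  # 268.416 -> DS4 is 274.176
```

Each payload multiple comes out just below the next level's line rate, consistent with the "four times," "seven times," and "six times" factors Sipress quotes.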
So your entire career, basically, has been involved with digital transmission?
With the exception of a couple of years, yes. Roughly the ’76–’78 timeframe.
Technical challenges of digital transmission
Would you talk about some of the technical challenges involved in digital transmission?
My first involvement was with T2, which was an attempt to apply the same kind of technology as was used on T1. Moving from one and a half megabits to six megabits on a wire-pair cable rapidly got us into a crosstalk problem. After various approaches to attempt to alleviate that, the solution that was chosen was to develop a new so-called low-cap or low-capacitance cable. That never found very much application. The next attempt that I was involved with was to apply 274 megabits to coaxial cable. Back in the early ‘60s we were very happy with the first experimental results that we had gotten. We were able to get transmission of approximately one mile at 274 megabits before we had to regenerate, which was quite a breakthrough. Again, that system had little application.
There were two reasons these systems found little application. One was that the radio spectrum was a free resource. You didn't have to pay for the right-of-way, so the radio systems were quite a bit cheaper. Digital systems on cable required right-of-way, which was very, very expensive. So they had problems competing with the advanced radio systems. The second reason was that analog systems were able to achieve larger capacity on the same transmission medium than digital systems could in those days. The overriding perspective within the Bell system in those days, which was driving all of this, was minimum cost. On a per voice channel basis the cost of analog systems was less than that of digital systems, and digital systems never really got off the ground because of that. A number of years later the Bell system had to write off $6 billion worth of investment in analog systems because of customer requirements for digital technology. But it did take the development of fiber systems to provide cost-effective digital transmission.
Now is this for data transmission or voice telephone conversations?
Data was such a minimal requirement in those days that voice was the overriding requirement which then drove the per channel cost of the systems. It took the development of fiber technology plus the increase in demand for digital data before the analog systems no longer could cut it.
Pulse code modulation
Let’s talk about some of the challenges involved in digitizing voice communications. I mean, we normally think of a telephone conversation as an analog signal. Can you talk about how you convert that?
I was not directly involved in that aspect. This is so-called pulse code modulation, in which we took samples of the voice signal. We had to use the appropriate mathematical sampling theory to be sure we sampled at a high enough rate, so information wouldn't be lost. Then the information is quantized into a binary format to get to 64 kilobits for a voice channel. The sampling rate was eight kilohertz because the bandwidth of the voice signal in the Bell system was typically a three-kilohertz channel. Previous technology in analog channel systems was used to limit the bandwidth of the voice signal to three kilohertz, so we sampled at eight kilohertz, something greater than twice the bandwidth. Actually, I think the band was .3 to 3.3 kilohertz. We used a seven-bit code; lots of testing was done at Bell Labs and other places to determine how many bits per sample was appropriate. Seven bits was selected as the appropriate coding word length, and an additional bit was added for signaling purposes, which is how you got eight bits. So eight kilohertz times eight bits, or 64 kilobits, is the standard 64-kilobit signal that has been used since the late '50s and early '60s. That's where the 64 kilobits came from. Very simply, it was a sampling rate somewhat greater than twice the bandwidth of a .3 to 3.3 kilohertz signal, times seven code bits selected through listening tests, plus an additional bit per sample for signaling, to get to 64 kilobits.
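The derivation of the 64-kilobit channel rate can be written out as a quick check, using the figures Sipress gives:

```python
# Derivation of the standard 64 kbit/s PCM voice channel.
voice_band_khz = 3.3 - 0.3         # ~3 kHz voice band (0.3 to 3.3 kHz)
sampling_rate_khz = 8              # greater than 2 x bandwidth (Nyquist)
code_bits = 7                      # word length chosen via listening tests
signaling_bits = 1                 # one extra bit per sample for signaling
bits_per_sample = code_bits + signaling_bits  # 8 bits total

channel_rate_kbps = sampling_rate_khz * bits_per_sample
print(channel_rate_kbps)  # 64
```

Eight thousand samples per second times eight bits per sample gives the 64 kbit/s channel that became the worldwide standard.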
In 1980 you started work on lightwave systems?
I started work on lightwave systems in 1978. After I concluded the work on T4M coaxial systems, I went off to work on satellite systems for a couple of years, which involved two aspects. One was that AT&T had a contract with the Shah of Iran to help him develop communications facilities for Iran. A new company was formed, American Bell International Incorporated. Part of the work they were doing for Iran was the development of a multi-purpose satellite system: educational television, military intelligence gathering, etc. That work was subcontracted to Bell Laboratories, and I was asked to be head of the department working on that. Also, part of that work involved implementing earth stations for connecting Iran to the Intelsat system. And the department I was in at Bell Laboratories was also responsible for various earth stations used by AT&T.
In 1978 there was a fiber development program under way at Bell Labs that had run into trouble. I was asked to come back and take over that work because of my digital transmission background. The work started out at six megabits on each fiber. A small application in Trumbull, Connecticut was the first real commercial system application deployed using fiber. That then led to the development of a forty-five megabit system on fiber, which was to be applied in the northeast corridor between New York, Philadelphia, and Washington, D.C. That was subsequently changed to 90 megabits because technology was moving so quickly. Then I was asked to go over and head up the development work for submarine cable systems.
As I was leaving digital transmission in 1976, my boss in those days, Ira Jacobs (who was a really great boss, and a great teacher), pulled together a meeting between himself, his department heads, and the supervisors in the organization, in which he presented a lecture on fiber optic transmission. Ira of course already knew I was leaving the organization to go to work on satellites. I listened to this and I walked out of that room shaking my head, saying to myself, "Boy, am I glad to be leaving this organization. I'm going to be long retired before any of this makes any sense." I don't know anybody who was ever optimistic enough about how rapidly the optical transmission technology would develop. Instead of me being long retired before the technology would find an application, two years later I was back working on a specific development, and by the time I retired we were talking about two terabits on an optical submarine cable.
Optic transmission development
Would you talk about some of the technical challenges involved, and would you talk about optic transmission as opposed to coaxial, twisted pair?
You name it: we had to have new components, devices, optical detectors, lasers, and the new transmission medium, fiber. We had the capability of putting signals on step index fibers, which were useful only for low rates, and had to move to graded index fiber for higher rates than anybody was using, higher speed technology. We had to deal with glass chemistry, manufacturing techniques, factory techniques, and all the other non-component issues, all of the transmission technology. How do you transmit the pulses? What kind of coding schemes do you use? How do you generate and amplify optical pulses and equalize them? How do you do this? You name it.
More specifically, could you talk about some of the challenges that you’ve faced in developing these systems? What are some of the big hurdles?
With the six and forty-five megabit systems it was getting the components, getting the designs. In those days with forty-five megabits, the silicon barely made it. We had to deal with the development of amplifiers, the decision circuits, the timing circuits that would operate at those rates. The involvement of my organization was the development of the system, working to find someone to get you the components that permitted you to work at those speeds. I then went off to work on submarine cable. To answer your question about technology, starting with fiber: how do you get a fiber that is strong enough to get it onto the bottom of the ocean? How do you design a cable that protects the fiber, so it doesn't see the stresses of the deep sea? How do you get it in the water? How do you get it back out of the water if you've got to take it out to repair it? If there's a failure down below, you have to pick it up. It's got to be strong enough to take the strain to get it back up again. In those days for fiber, you're talking about two percent strain, 200-kpsi fiber. Typical fibers in those days were maybe one-tenth of that, which was a real technical challenge. Great work was done at Bell Laboratories in Murray Hill and Bell Laboratories in Atlanta. How do you splice the fiber together with a splice strong enough that it would survive, within the cable, the rigors of the ocean? Some very high-strength splicing techniques had to be developed. Great work was done at Murray Hill on the chemistry of fusion, and techniques were developed to do that. So we had a major program on fiber, cable, extraction techniques, splicing techniques. Then you get over into devices, where you have the lasers and laser technology. Detector technology. A decision had to be made on what wavelength to select for operating under the ocean. Terrestrial systems operating on dry land were using short wavelength, 820, 830 nanometers.
And we rapidly concluded from the analysis of the undersea system that that wavelength really couldn't cut it. We could not use that kind of technology. So we decided to use 1.3-micron single wavelength technology, which then meant the development of fibers that not only had the strength requirements but also had graded index rather than step index. Lasers were all short wavelength; we needed long wavelength lasers. Bell Laboratories could not be depended upon for the laser technology. We were up to our eyeballs trying to make the short wavelength laser. There were actually very few people available to work on long wavelength lasers, so we went to Japan to get the lasers, which was previously unheard of for Bell Laboratories. To put it into perspective, up until that time, all undersea cables were basically relatively minor modifications of existing dry land technology. We decided in this 1980 timeframe that the existing dry land fiber transmission technology at short wavelength just wouldn't make it for submarine cable, so we decided to leapfrog the dry land technology and develop technology that went well beyond that being used by the dry land folks. A lot of people thought it couldn't be done, but we decided we had no choice. In fact we had to have completely new lasers and detectors. We had to have silicon that ran at 280 megabits. And silicon didn't exist in dry land systems running at that speed in those days. The devices also had to be developed with a reliability that was appropriate for the bottom of the ocean, which meant component reliability of a few FITs, failures in ten to the ninth hours, which was unheard of for silicon in those days. Also unheard of for lasers and detectors. For the lasers we had to develop new reliability approaches and new protection approaches. We used multiple lasers. We used protection switching under the ocean for the first time.
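To put "a few FITs" in perspective, here is a rough back-of-the-envelope sketch. The component count and system life below are illustrative assumptions, not figures from the interview:

```python
# A FIT (failure in time) is one failure per 10**9 device-hours.
FIT = 1e-9  # failures per device-hour

fit_rate = 3          # "a few FITs" per component
n_components = 1000   # hypothetical count of critical devices across the system
years = 25            # assumed design life of an undersea system
hours = years * 365 * 24

expected_failures = fit_rate * FIT * n_components * hours
print(expected_failures)  # well under one expected failure over the system life
```

At a few FITs per device, even a thousand critical components accumulate less than one expected failure over decades, which is the kind of budget an unrepairable deep-sea repeater chain demands.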
Federal Communications Commission; satellite and cable transmission
We had challenges. Why were we so aggressive? That is another story that is not too well known. In those days, the Federal Communications Commission had an industrial policy in support of U.S. industry. They would deny it, but in a sense they did have an industrial policy. It manifested itself in supporting satellite transmission, which was the competitor of undersea cables. The reason they were supporting it was that the U.S. had a technological and industrial lead in satellite transmission. So by encouraging satellite transmission, they were supporting U.S. industry. They kept the books on how you calculate the cost of transmission by cable versus satellites in a way that favored satellites. Since the FCC in those days had the role of making sure the Bell system would provide the cheapest facilities possible to minimize the cost of international communications, they wouldn't let the Bell system put in new cables.
The other aspect of it was that the FCC had a social policy. The way the cost of the satellite transmission was calculated, it cost the same to transmit a signal from New York to London as it did from Iowa to the heart of Africa. The cost of the terrestrial dry land and wet circuit from Iowa to the middle of Africa was very, very expensive compared to that satellite circuit. The cable circuit from New York to London was a lot less expensive than that for the satellite. The bulk of the traffic was big routes: New York, London, circuits like that. The FCC didn't look at it that way, and prevented the Bell system from putting in ocean cables. So a couple of people, particularly a guy named Frank Tuttle at AT&T Long Lines, had great foresight, great vision, and decided the only way the Bell system was ever going to get another cable was to provide the FCC with competing technology. So Frank went to the FCC and proposed undersea cables using fiber optics, and the FCC bought in. Bell Laboratories in effect told him he was crazy. He wouldn't take no for an answer. He came to Bell Labs in effect with an open checkbook and said, "Make it happen. This is the right technology, and this is the right thing to do." The net result was a development program started in 1981, and in '88 we were putting in cables across the Atlantic and Pacific Oceans.
So you expect the undersea cables to be around for quite a while?
Satellites and undersea cables are complementary. Undersea cables are ideal for large traffic streams between major population centers. Satellite circuits, on the other hand, are well suited for mobile communications. Cable is not going to compete with satellite in that application. Cable is also going to have a difficult time competing, although they're beginning to do so, for one-to-many transmission, broadcast kinds of transmission, and also to provide transmission on very low-density routes, Iowa to the middle of Africa. So they're complementary technologies. The satellite folks have concluded that also. In fact, in the '95-'96 timeframe I was meeting with some folks from Comsat, and it dawned on me for the first time that we could get more bandwidth on one fiber than the total capacity of the Intelsat system, even with all of the technological advances they were forecasting for satellites. So there was just no competition. Optical transmission cables will be around for a long time.
Fiber systems bandwidth and development
I’m curious about a technical issue you raised. What’s the advantage of using 1300 over 800?
Higher bandwidth and lower cost. The loss on the fiber is substantially less at 1.3 microns, less than a dB per kilometer versus a few dB at the shorter wavelength. And since the need for regeneration is determined by the loss on the fiber, in the 1.3 case the repeaters are much further apart. You have less noise accumulation, fewer timing accumulation problems, and you have a much less expensive system and a much more reliable system.
An interesting story goes with that. Back in the ’80–’81 timeframe, when serious development started on undersea cable, we assumed that we would get across the Atlantic with repeater spacing of about 20 kilometers. After some analysis we realized if we really pushed it, we might get even 40 or 50. By the time it went in the water, we were doing somewhere between 60 and 70. This just indicates how rapidly the loss performance of fiber at 1.3 improved, as well as the power capability of the lasers and the sensitivity of the detectors.
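The relationship he describes, with repeater spacing set by the fiber loss, can be sketched with a simple power budget. The launch power, receiver sensitivity, and margin below are illustrative assumptions, not figures from the interview:

```python
# Repeater spacing is roughly the allowable span loss divided by the
# fiber attenuation. All numbers here are illustrative.
def repeater_spacing_km(launch_dbm, sensitivity_dbm, margin_db, loss_db_per_km):
    """Maximum span length for a simple optical power budget."""
    budget_db = launch_dbm - sensitivity_dbm - margin_db
    return budget_db / loss_db_per_km

# With loss of a few dB/km (short-wavelength fiber of the era):
print(repeater_spacing_km(0, -35, 5, 3.0))   # 10.0 km
# With loss well under 1 dB/km at 1.3 microns, as performance improved:
print(repeater_spacing_km(0, -35, 5, 0.45))  # ~66.7 km
```

The same power budget stretched from roughly ten kilometers to several tens of kilometers as fiber loss fell, which is the trajectory from the assumed 20 km spacings of 1980-81 to the 60-70 km achieved by the time the cable went in the water.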
Was it comparable repeater spacing to copper?
No, much greater. The copper in those days had spacings of ten kilometers, fifteen kilometers, something like that. And that was for a system that had a capability of about 4,000 channels. We are talking about fiber systems that have a capability of at least twice that. Cables that instead of a couple of inches in diameter were less than an inch in diameter, which had all kinds of implications. For installation, we could load almost the entire transatlantic cable on one ship, which means a much less expensive installation, which means less expensive systems. So between additional channel capacity, longer repeater spacings, and smaller cable with less material, we ended up with a system that was substantially less expensive per channel.
All under development within seven or eight years.
Yes. Well there was research work that had been previously done as well as exploratory development in the late ‘70s. The basic cable patent, for example, was issued in ‘76. The firm decision to start the development was in ‘81.
Education: Brooklyn Polytechnic
Would you talk a little bit about your education at Brooklyn Poly, and some of the people that you worked with there, and some of the associations you formed?
I arrived at Brooklyn Poly in 1952. It had a very, very interesting student body in those days. The distribution was bipolar. We had some very, very outstanding students. Most of the students in those days, as is still the case at Brooklyn Poly, were first-generation children of immigrants. Most Brooklyn Poly students continue to be that way; only the faces have changed. Likewise, I was a first-generation child of immigrants. I was able to get a New York Regents scholarship, which enabled me to come up with the funds to pay the tuition at Brooklyn Poly so I wouldn't have to go to City College. Those bright students who did not get scholarships went to City College, and those who couldn't make it to City College came to Poly. So you had this bipolar distribution with a group of outstanding students, and then you had a hole, and then you had everybody else in terms of capability of the students. It was a great, great school in terms of the people they had on the faculty to teach. Also lots of honors programs, special courses. So I did my undergraduate work there. Then, in between my junior and senior years, I worked a summer at Bell Laboratories. And then I went on to get my Master's at Brooklyn Poly and was a research assistant at the Microwave Research Institute, which was formed in the World War II days. The research group leader was Isaac Horowitz, a very, very capable person. And his boss was John Truxal, an outstanding instructor, an outstanding person. I finished up my Master's, getting my BEE in '56 and my MEE in '57, and then started on my doctoral program. In the spring of 1958, I sort of raised my head up out of the sand and realized that John Truxal was leaving to become a Dean at Stony Brook and Isaac Horowitz was leaving to go to the University of Colorado. The leadership of the research group, which was working on active filters, control theory, things like that, was disappearing.
It no longer made sense for me to stay at Brooklyn Poly. So I contacted a person I had worked for at Bell Labs. Several weeks went by and I hadn't heard anything. I called Bell Labs and asked what was going on, and they said, "Oh, you haven't heard from us?" And I said, "No." A couple of days later I got a letter saying, "No, thank you." A day later I got a phone call from Frank Blecher, who was also very active at Brooklyn Poly, saying, "There are two jobs at Bell Labs. Are you interested?" I said, "Sure." One of them was working in Frank Blecher's own group on the same thing that I had done my Master's thesis on and was doing my doctoral thesis on. It was an interesting interview. I knew more about the subject than the guy who was interviewing me. So I got the job with Bell Labs. I finished up my doctorate part time. I did my thesis for Jim Angelo, who was an absolutely great thesis advisor. I finished up the work in '60, but had neglected to take care of the language exam, so I got the degree in '61. Until I left for Bell Labs, I was a graduate assistant and research assistant at Poly and also did some teaching.
So the professors there were great people, people like John Truxal; Isaac Horowitz; Jim Angelo; Mischa Schwartz, who was a great professor; Papoulis, who I never had a course with. So they had a very powerful EE department. It was one of the highest rated in the country, and they turned out some very, very good students.
That was my impression also, just from talking to other folks who graduated from there. Can you talk a little bit about some of your classmates, and what they went on to?
Most of the classmates that I was close with -- I suspect you don’t know them. As a graduate student I shared an office with three other graduate students. Two stayed on at Brooklyn Poly to teach, Joe Bongornio and Dick Haddad, who both taught at Poly for their entire careers. I forget the name of the fourth person. I sort of lost track of him. He went off to industry. Other people who were students at Poly went on to careers in which they made major contributions.