Oral-History:Donald Cox

From ETHW

About Donald Cox

Donald C. Cox

Cox received his Bachelor’s in Electrical Engineering (1959) and Master’s in Microwave Electronics (1960) at the University of Nebraska. He spent three years as a Communications Research and Development Officer at Wright Patterson Air Force Base, working on the X-20 Dynasoar, then went to Stanford University for his PhD (1967). He worked at Bell Labs from 1967 until the AT&T divestiture in 1983, then at Bellcore until 1993, where he managed the radio research activity, and then moved to Stanford as a Professor of Electrical Engineering. Cox’s career research focus has been on mobile wireless communication. At Bell Laboratories Cox took multipath measurements of the effects of buildings, houses, trees, etc., on the propagation of radio waves; measured time delay and Doppler shift in multipath; studied dynamic channel allocation for mobile systems; and looked for ways of using smaller cells and coverage areas, taking propagation measurements and studying co-channel interference and antenna diversity. At Bellcore, Cox developed digital mobile systems, particularly the Wireless Access Communication System (WACS), a.k.a. a wireless loop. WACS is now the (as yet unimplemented) US standard known as the Personal Access Communication System (PACS) standard. Cox’s research at Stanford includes network and circuit issues regarding mobile phones (particularly multicarrier transmission), interference cancellation, error detection, amplifier technology, and smart antennas.

Cox discusses the effects of the AT&T divestiture on the nature of research and development nationwide. He discusses some of his colleagues at Bell Labs/Bellcore, including Bob Lucky, Bob Wilson, and Bill Jakes. He discusses the different environments of the private sector and the university. He mentions his involvement with the IRE, the AIEE, the IEEE, the Communications Society, and the IEEE Journal of Selected Areas in Communications. He discusses the importance of the Communications Society for the mobile wireless field, and the importance of the mobile wireless field for the Communications Society. He discusses changes in communications engineering during his career, focusing on digital computer signal processing and network control, microprocessors, and computer random logic. He speculates that wireless communications will be increasingly important, but always dependent on a core of wired communications. He mentions particular technical challenges such as network integration and hand off issues.

About the Interview

DONALD COX: An Interview Conducted by David Hochfelder, IEEE History Center, 28 September 1999

Interview #364 for the IEEE History Center, The Institute of Electrical and Electronics Engineers, Inc.

Copyright Statement

This manuscript is being made available for research purposes only. All literary rights in the manuscript, including the right to publish, are reserved to the IEEE History Center. No part of the manuscript may be quoted for publication without the written permission of the Director of IEEE History Center.

Request for permission to quote for publication should be addressed to the IEEE History Center Oral History Program, IEEE History Center, 445 Hoes Lane, Piscataway, NJ 08854 USA or ieee-history@ieee.org. It should include identification of the specific passages to be quoted, anticipated use of the passages, and identification of the user.

It is recommended that this oral history be cited as follows:

Donald Cox, an oral history conducted in 1999 by David Hochfelder, IEEE History Center, Piscataway, NJ, USA.

Interview

INTERVIEW: Donald Cox

INTERVIEWER: David Hochfelder

DATE: 28 September 1999

PLACE: IEEE History Center at Rutgers University, New Brunswick NJ

Education and career overview

Hochfelder:

Let’s start with your undergraduate and graduate education.

Cox:

I received a Bachelor’s degree in Electrical Engineering from the University of Nebraska in 1959. My major was Electrical Engineering, with an option in Electronics. I stayed on there, and in 1960 got a Master’s degree, mostly in Microwave Electronics.

Hochfelder:

Did you move on to Stanford directly from there?

Cox:

No, I spent three years as a Communications Research and Development Officer at Wright Patterson Air Force Base. I was on a communications project for a space program called Dynasoar that never got off the ground.

Hochfelder:

The X-20 Dynasoar?

Cox:

Yes. The communications for X-20 was rather exotic for the time. It was going to be in X band. When I finished my ROTC commitment tour of duty at Wright Patterson I went out to Stanford to graduate school and worked on my Ph.D. until I finished in the last quarter of 1967.

Hochfelder:

Is it correct that one of your earlier areas of research was in phased array antennas?

Cox:

Yes. I built a phased array antenna at Stanford for use in tropospheric scatter, that is, over-the-horizon propagation studies. The antenna was somewhat unique at that time in that we sampled the signal at each of the array elements with a computer and stored the samples on tape. We had to do the processing later because we couldn’t process in real time with the computers available in those days.

Hochfelder:

After that you began work on mobile wireless communications.

Cox:

When I left Stanford I went to Bell Laboratories at Crawford Hill Lab and started doing mobile radio research. In those days we called it mobile radio. That was several word changes ago.

Hochfelder:

PCS was a distant concept.

Cox:

Yes. It wasn’t even called cellular in those days.

Multipath

Hochfelder:

How did you make the transition from electromagnetic theory to mobile communications, if there in fact was a transition?

Cox:

The transition was fairly smooth, because mobile radio uses electromagnetic waves. My first activities in mobile radio were studies on the effects of buildings, houses and trees in urban areas on the propagation of radio waves.

Hochfelder:

Was that multipath?

Cox:

Yes, we made multipath measurements. I set up a rake receiver, which is used in a spread spectrum system or CDMA system these days, to make measurements of the delay spreads in a multipath medium. We made measurements around Holmdel, New Jersey and then went into New York City and made some of the earlier measurements that were made in that type of environment.

Hochfelder:

What is multipath and why is it a problem?

Cox:

In the case of mobile radio you usually have cars down on streets, which is not a very good environment from the radio standpoint. In those days base stations would be put on towers or sides of buildings. Frequently the direct signal is blocked by buildings, trees or hills, and the way the radio signals find their way to the vehicle in the street or the person with a handset is by reflections off of any objects that are along the way. These different reflections create different paths between the base station and the mobile, and they have different time delays. The different time delays associated with the different paths cause the signal to interfere with itself in the receiver, and this makes demodulation difficult.
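The self-interference Cox describes can be illustrated with a toy two-path channel (a hypothetical sketch with made-up amplitudes and delays, not a measurement from this work). A direct path and a single reflection arriving one microsecond later add as phasors, so nearby frequencies see wildly different channel gains:

```python
import cmath
import math

def two_path_gain(freq_hz, delay_s, a1=1.0, a2=0.8):
    """Magnitude of a channel with a direct path and one reflection
    arriving delay_s seconds later with relative amplitude a2/a1."""
    # Each path contributes a phasor; the extra delay rotates the second one.
    h = a1 + a2 * cmath.exp(-2j * math.pi * freq_hz * delay_s)
    return abs(h)

# With a 1-microsecond delay spread, frequencies half a megahertz apart
# swing between constructive reinforcement and a deep fade.
delay = 1e-6
for f in (900.0e6, 900.25e6, 900.5e6):
    print(f / 1e6, "MHz ->", round(two_path_gain(f, delay), 3))
```

The swing from reinforcement to a deep fade within a fraction of a megahertz is what makes demodulation of wideband signals difficult without countermeasures.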

Hochfelder:

What are some of the techniques used to overcome the problems due to multipath?

Cox:

The three dominant ways used to overcome the problems with multipath today are (1) transmitting at a low enough data rate so that it doesn’t cause a problem; (2) multipath adaptive equalizers, which are used in several of the digital mobile radio standards, including the North American Time Division Multiple Access standard and the GSM; and (3) a spread spectrum technique or rake receiver that is used in the CDMA system.

Bell Labs

Hochfelder:

Did you go to work for Bell Labs immediately after obtaining the Ph.D.?

Cox:

Yes. I was with Bell Labs at Crawford Hill until the AT&T divestiture in 1983. Parts of Bell Labs went with the operating telephone companies to a company that was at that time called Bellcore. I managed the radio research activity there from about 1983 until 1993 when I left for Stanford.

Hochfelder:

What were some of your technical accomplishments in the 25 years or so you were with Bell Labs?

Cox:


(Audio: 364 - cox - clip 1.mp3)


The earliest thing I did was measuring time delay differences in the multipath. I also measured the Doppler shift associated with those, which had not been done before. That work became very fundamental and very useful when digital cellular systems began to be looked at in the 1980s. Later on one of my colleagues and I created what was some of the first work on dynamic channel allocation for mobile systems. We assigned the channels based on demand rather than the fixed pattern that’s used in many cellular systems today. Dynamic channel allocation has been used in some of the advanced systems, including the PHS system in Japan and the DECT system in Europe, but I don’t believe it has been implemented in any of the cellular systems yet.
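As a rough illustration of demand-based assignment (a toy model invented here, not the actual algorithm Cox and his colleague developed), a cell can simply be handed the first channel not already in use by any interfering neighbor, rather than a channel from a fixed reuse pattern:

```python
def assign_dynamic(requests, channels, interferers):
    """Toy dynamic channel allocation: give each requesting cell the
    first channel not already in use by an interfering (nearby) cell.
    requests: cell ids in arrival order
    interferers: dict mapping cell -> set of cells it interferes with
    """
    in_use = {}  # cell -> assigned channel
    for cell in requests:
        busy = {in_use[n] for n in interferers.get(cell, set()) if n in in_use}
        free = [ch for ch in channels if ch not in busy]
        in_use[cell] = free[0] if free else None  # None means call blocked
    return in_use

# Cells A and B interfere with each other; C is far from both,
# so C can reuse channel 1 even though A holds it.
neighbors = {"A": {"B"}, "B": {"A"}, "C": set()}
print(assign_dynamic(["A", "B", "C"], [1, 2], neighbors))
# -> {'A': 1, 'B': 2, 'C': 1}
```

The reuse of channel 1 by the distant cell C is the payoff: channels follow demand and geography instead of a fixed plan.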

I took a tour away from mobile radio and into satellite systems from 1973 to the late 1970s. During that period I did an experiment looking at the effects of rain and ice on millimeter waves transmitted from satellite beacons. We built some receiving equipment and used a large antenna to make some measurements. I went back into the mobile radio business on my own in the late ‘70s and started looking at the possibility of using much smaller cells and coverage areas and did a number of propagation measurements and co-channel interference studies. We also did some diversity studies with computer simulations of antenna diversity while I was still at Bell Labs.

Bellcore; Wireless Access Communication System

Cox:

In 1983 or ‘84 when I went to Bellcore, I managed a radio research group there. In that group we investigated a lot of different ways to build digital mobile systems or digital personal communication systems. We looked at different problems, including multipath propagation, types of modulation, channel assignment issues, and virtually all the aspects that need to be considered in configuring a new digital radio standard. The work we did there eventually evolved into a Bellcore set of requirements called Wireless Access Communication System (WACS), sometimes referred to as a wireless loop. We were perhaps the first to use the term wireless loop, but not in the same context it’s used today. We considered a system with mobility at the ends of the telephone loops rather than a fixed loop that some people talk about today. The WACS became a U.S. standard after I went to Stanford. It’s now known as the Personal Access Communication System (PACS) standard. Hardware was developed by several manufacturers in the U.S. and Japan. That system has never been deployed, but it’s still being considered by various groups, including the Taiwanese.

Hochfelder:

Why hasn’t it been implemented?

Cox:

PACS is a full mobility standard. It wasn’t implemented in Japan because they generated their own standard that is in some ways similar and in some ways different, referred to as the Personal Handyphone System (PHS). PHS is not a full hand off system, whereas PACS is a full mobility standard. In the beginning PHS enjoyed a high degree of success in Japan, but because its capability to do hand off was limited it did not continue to be as popular. About the same time we were developing PACS, the Europeans were working on what has become known as Digital European Cordless Telephone (DECT). There is probably another name that goes with that acronym. The Europeans were already in the process of attempting to deploy the DECT standard when PACS came along. It’s hard to say why PACS was not implemented in the U.S. It was not a cellular standard, so the cellular companies were not particularly interested. At the time it was aimed mainly at local exchange carrier activities, and those carriers belonged to the seven regions and GTE. One can speculate as to the reasons it was not deployed, but certainly the wireless and radio activity in companies in the regions was turned over to cellular entities, and local exchange carriers were not permitted to get deeply involved in the wireless business.

Research at Stanford

Hochfelder:

Since your move to Stanford in 1993 and in your current position there as Professor of Electrical Engineering, what kind of research have you been doing?

Cox:

When I came to Stanford I was awarded the Harald Trap Friis Chair. Friis was one of the early radio pioneers at Bell Labs who did some of the very early work in the point-to-point microwave area and in HF. That was an interesting coincidence to me, since I had used Friis’ equation for many years to predict the performance of radio links.
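Friis’ free-space transmission formula, which Cox mentions using to predict radio-link performance, relates received power to transmitted power, antenna gains, wavelength, and distance. A minimal sketch (the example numbers are hypothetical):

```python
import math

def friis_received_power(pt_w, gt, gr, freq_hz, dist_m):
    """Friis free-space equation: Pr = Pt * Gt * Gr * (lambda / (4*pi*d))**2.
    Gains are linear (not dB); result is in watts."""
    wavelength = 3.0e8 / freq_hz  # speed of light / frequency
    return pt_w * gt * gr * (wavelength / (4 * math.pi * dist_m)) ** 2

# Example: 1 W transmitted at 900 MHz over 1 km with unity-gain antennas.
pr = friis_received_power(1.0, 1.0, 1.0, 900e6, 1000.0)
print(f"{10 * math.log10(pr / 1e-3):.1f} dBm")  # free-space path loss ~91.5 dB
```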

I started a research group at Stanford, and we looked at many different problems associated with mobile systems, cellular systems, personal communications systems – whatever you want to call them today. Our research spanned from network issues to circuit issues. Generally I have one or two Ph.D. students doing research with me on any particular topic. I have students doing work on mobility management, databases and protocols for large scale mobile networks with emphasis on looking at ways to implement personal numbers that could be used anywhere at any time. If you change your location from California to New Jersey, for instance, the concept is that you would keep the same telephone number. This makes it convenient for the person, but very difficult for the network. That’s one extreme of the things we do.

I’ve had students looking at hand off algorithms for cellular and PCS systems, making use of information that will be measured and stored as the system operates. The idea is to improve the ability to do hand off using pattern recognition and pattern classification. We also have projects on resource allocation. We’ve had a student doing work on dynamic channel allocation and power control in a mobile environment looking at various algorithms. Mobility is an extreme challenge to these algorithms, and he generated some algorithms that were very effective in the mobile environment for channel assignment and power control jointly. I have another student following up on that, adding in the adaptive antenna arrays or so-called smart antennas. The three techniques of channel assignment, power control and smart antennas attempt to cope with the interference environment caused by reusing frequencies. They all attempt to maximize the reuse of frequencies and minimize interference so they interact. The effects are not additive. It’s not yet clear what the proper algorithm is for a system with the three working in combination.

We have had students looking at adaptive equalization in various contexts for mobile systems and wireless local area networks. I’ve had a student looking at synchronization and carrier recovery issues for multicarrier transmission. Multicarrier is sometimes referred to as orthogonal frequency division multiplexing. In the wire line context it’s called discrete multitone, and it’s yet another way to get around the multipath. You break a bandwidth or a band into a number of sub-bands and transmit at lower data rates over a number of narrow band channels, then combine them at the other end. This essentially gets around the multipath by creating a bunch of channels which are not as affected by it.
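The splitting and recombining Cox describes can be sketched with a toy DFT-based multicarrier round trip. This is only the core idea under simplifying assumptions; real OFDM systems use FFTs, cyclic prefixes, and pulse shaping:

```python
import cmath

def idft(symbols):
    """Inverse DFT: map one symbol per subcarrier onto a time-domain block."""
    n = len(symbols)
    return [sum(s * cmath.exp(2j * cmath.pi * k * t / n)
                for k, s in enumerate(symbols)) / n
            for t in range(n)]

def dft(block):
    """Forward DFT: split the received block back into per-subcarrier symbols."""
    n = len(block)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                for t, x in enumerate(block))
            for k in range(n)]

# Four QPSK-like symbols, one per narrow sub-band.
tx = [1+1j, 1-1j, -1+1j, -1-1j]
time_block = idft(tx)   # transmitted as one multicarrier block
rx = dft(time_block)    # receiver recovers the sub-band symbols
print([complex(round(s.real), round(s.imag)) for s in rx])
# -> [(1+1j), (1-1j), (-1+1j), (-1-1j)]
```

Each subcarrier carries data slowly enough that a given multipath delay spread barely distorts it, which is the point of the technique.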

Hochfelder:

Is it a spread spectrum technique?

Cox:


(Audio: 364 - cox - clip 2.mp3)


It’s not really like spread spectrum. Spread spectrum tends to take the channel and spread everything out over an extended bandwidth, whereas this takes bandwidths and divides them into pieces. It’s analogous to taking a number of frequency channels on AM radio and transmitting different parts of different messages over different radio stations. I do have a student studying multiuser detection for CDMA systems. I had a student look at what I’ve sometimes referred to as semi-smart antennas for use in handsets. I call them semi-smart because the algorithms are not particularly esoteric but designed for low complexity and low power consumption for canceling interference between two antennas. Two antennas seem to be about the most that can be put on a handset. This work was aimed at improving performance by reducing the impact of interference.

We also worked on interference cancellation. In that case we did computer simulations and built a large scale integrated circuit to do the signal processing. We designed and fabricated the chip and ran additional tests on the signal processing chip. I had another student look at multi-standard signal processing architectures. There is quite a bit of activity in that in Silicon Valley now. The idea is to create a signal processor that will work with more than one standard at a time. You can do that with a general purpose digital signal processor, but that’s not a very efficient way to do it with some of the necessary algorithms. Another way to build a very efficient algorithm or processing chip is to build an application-specific chip to do a particular standard. That’s very efficient in terms of how it processes the data signals, but very inflexible. We worked in between to try to configure a signal processor flexible enough to deal with multiple standards but not so flexible that a lot of efficiency would be lost.

If you look at what has to be done in a digital wireless system, digital cellular system, digital personal communication system, there are a number of activities that are done over and over again on the signals. One thing is speech processing. They all use some sort of speech compression algorithms, and these all do similar mathematical operations on the speech signal. There are modulators and demodulators. All the systems have some type of error control, forward error correction or error detection, usually a combination of the two. These are very specifically different in the different standards. All of the standards take the same kind of processing to cope with multipath in one way or another. Some use adaptive equalizers and some use spread spectrum techniques. These can be formulated in such a way that they can be processed with a common signal processing architecture by downloading software to deal with specific standards. Since we did that, there has been a lot of work going on aimed at this same thing. The popular term for that is software radios, at least the signal processing part of it. We didn’t do any of the RF part of it, which is another problem.

I have a student that’s looking at signal processing techniques for improving the efficiency of linear amplifiers. Linear amplifiers are needed to efficiently amplify the types of signals that are in the U.S. digital standards. Increases in amplifier efficiency directly impact battery life. That work is going on now. That covers most of what I can remember that we’ve done. Since I came to Stanford in ’93 I’ve graduated twelve Ph.D. students and I have ten students in my group now.

Hochfelder:

It’s surprising to hear that there is room to improve the efficiency of linear amplifiers, considering that it’s such an old technology.

Cox:

Linear amplifiers have been around for a long time and a lot of work has been done to improve their linearity, but in this case we are attempting to use nonlinear amplifier techniques and signal processing to improve efficiency. In other words, the aim here is improving efficiency, not linearity. For digital cellular standards you don’t need the 40 dB dynamic range that some applications require. The variations are not that great. You need enough linearity for 10-15 dB signal variations, and it looks like we can do that.

Hochfelder:

What is a smart antenna?

Cox:

In my mind a smart antenna is an antenna array. To start with, we can look at it as a receiving array. In today’s technology, signals are digitized on each of the antennas in an array. Then an algorithm is applied to signals received on various array elements to combine the signals to improve the signal to noise plus interference ratio. There are a lot of different ways the signals can be processed. Many algorithms have been created. Some are effective in mobile environments, but most are more effective in static environments because it takes time for them to converge. These types of antennas – smart antennas (adaptive arrays) – are similar to the type of thing I did in my Ph.D. work.
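One standard textbook rule for combining the digitized signals from array elements is maximal-ratio combining; a minimal sketch (the weights and channel gains here are hypothetical, and this is only one of the many algorithms Cox alludes to):

```python
def mrc_combine(samples, channel_gains):
    """Maximal-ratio combining: weight each antenna's sample by the
    conjugate of its complex channel gain, then sum. Stronger branches
    contribute more, and all branches are rotated into phase, so the
    combined signal-to-noise ratio improves."""
    return [sum(h.conjugate() * x for h, x in zip(channel_gains, row))
            for row in samples]

# Two antennas see the same symbol through different complex gains.
h = [0.9 + 0.2j, 0.3 - 0.5j]
symbol = 1 + 0j
received = [[g * symbol for g in h]]          # one noiseless snapshot
combined = mrc_combine(received, h)[0]
print(round(combined.real, 3), round(combined.imag, 3))
# -> 1.19 0.0  (|h1|^2 + |h2|^2, with both branches phase-aligned)
```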

Hochfelder:

Can it be steered electronically?

Cox:

Yes, some of the algorithms do just that. Some algorithms that are being implemented generate beams and attempt to steer beams and nulls as you would in a conventional phased array. That’s what we did in the one that I built. Some don’t take the view of a particular beam being generated; they combine signals in ways to generate multiple beams in multiple directions, and multiple nulls in multiple directions, in an attempt to null the interference.

AT&T divestiture and work environment

Hochfelder:

How did the work environment at Bell Labs change after the breakup of AT&T?

Cox:

I left Bell Labs at the breakup of AT&T and went to Bellcore. From what I could see there was a move in the research area towards more applied projects. I had always worked on relatively applied projects anyway, so it probably wouldn’t have had much effect on me had I stayed. As I understand it, some of the theoretical and less applied activities got reoriented.

Hochfelder:

Such as radio astronomy?

Cox:

Yes. My old friends, Arno Penzias and Bob Wilson. The radio astronomers did contribute to some applied activities. They contributed very significantly to some of the early looks at the impact of rain on earth space paths for millimeter waves.

Hochfelder:

That makes sense.

Cox:

Bob Wilson built a sun tracker that tracked the sun when we didn’t have satellites to track. He measured the effects of rain on the signal from the sun. He also built radiometers that were used to estimate attenuation.

Hochfelder:

There were some practical applications of their work.

Cox:

There certainly were.

Bellcore projects

Hochfelder:

I have been under the impression that Bellcore inherited the mantle, if you will, of Bell Labs as a research entity. Would you agree?

Cox:

We liked to think we did, at least for the ten years I was there. We had a wide variety of activities in research similar to what had been done at Bell Labs. We had activity in radio research, fiber optics and switching, and it was done pretty much the way we did it at Bell Labs. And there was a math group. Many of us in the research area at Bellcore came from the research area in Bell Labs.

Hochfelder:

Were the types of projects you worked on at Bellcore similar to those you had worked on at Bell Labs?

Cox:

The major project that my group worked on at Bellcore was what became the PACS standard for PCS, and that was a carryover of work I had started at Bell Labs. I just carried it over to Bellcore, and managed to support and staff it. Some of the work in my group on satellites and point-to-point microwave eventually died for lack of interest. It was really the wireless loop, the personal communications work, that continued. And yes, it was a continuation of the work I had been doing. However it was greatly expanded, with more people working on it.

Hochfelder:

I understand that Bob Lucky was your boss. Did you have a close or distant working relationship with him?

Cox:

Bob Lucky was always a couple of levels above me. He had the position of executive director at Bell Labs for only a short time before I left for Bellcore. A few years later when Bob Lucky came to Bellcore as vice president of research, he wasn’t there too long before I departed for Stanford. Although I worked for Bob Lucky, I never worked for him for very long.

Hochfelder:

Who are some of the other talented people you worked with at Bell Labs or Bellcore and what sorts of work did they do?

Cox:

I worked with Bob Wilson pretty closely on the satellite propagation experiment at Bell Labs. He was the project engineer, designer and implementer of a large antenna that sits on Crawford Hill today. It’s got a huge mass of concrete under it.

Hochfelder:

Too big to move?

Cox:

Probably. It’s built like a battleship. Wilson oversaw the antenna. If you talk to him about it, he’ll tell you it was a radio telescope. It was a joint use instrument. On clear days they used it for radio astronomy, and on rainy days we used it for satellite propagation research. It was a relatively compatible activity. I worked with various other people at Bell Labs over the years. Did you have anyone specifically in mind?

Hochfelder:

No, I just wanted to get a sense of the other engineers you’ve worked with over the years that you feel are deserving of recognition.

Cox:

I should have prepared a list before I came. A name that floats around the industry is Bill Jakes, my first boss at Bell Labs. Bill was the department head of radio research when I went there, and mobile radio research was under him. His name has gone down in history because he edited the first really good technical textbook on microwave mobile radio. It was first put out in the early ‘70s, sold out and was reissued in 1994 or ’95 by the IEEE Press. People that have been in the business for a few years generally refer to Jakes’ book as being the Bible on early mobile radio. There were a lot of good people at Bell Labs that I worked with from time to time. Most of them have gone elsewhere, and some are now at AT&T Labs or off on their own or at universities. I hesitate to mention any name, because if I mention one person I’ll forget someone else.

Comparison of private and academic research environments

Hochfelder:

How would you characterize the difference between the work environments in the private sector of Bell Labs and Bellcore versus working in a university environment?

Cox:

I’ve often been asked that question. There are many similarities and a few differences. One similarity that engineers probably dislike the most is the task of raising money. At Bellcore, I had to raise money to support our research activities, or at least had to justify them. At Bell Labs it wasn’t so apparent who decided what on those things, at least it wasn’t to me from working in the trenches. As a manager at Bellcore, I had that activity, and at Stanford I have to grub for money to support my students. The type of research work we do is similar in some cases. We carried the projects much closer to implementation or applications at Bellcore, and also at Bell Labs with some things. At the university, none of my projects have been carried as far. We did design and fabricate an integrated circuit chip that came pretty close. Another big difference I see is that in industry we hired people who had just finished their education, mostly Ph.D.s and a few Masters people, whereas universities get people much earlier in their careers. At Stanford I’d get them after they have a Bachelor’s degree. One’s perspective and view of the world is considerably different when it’s not just a homework assignment you can finish by the end of the week. Some go into near panic, having always been in a situation where they had a well thought out assignment and knew that someplace there was an answer book. Nobody knows the answer to a Ph.D. project, and if they do it isn’t a good project. Many adapt very well, however, and at Stanford we have a lot of very bright students. It’s a joy to work with them. I am able to work more closely with students than I could with individuals I managed at Bellcore. Another major activity that’s different is that I teach classes. That’s very different from anything I did in industry. Overall, though, there is less difference than one might think.

IEEE and Communications Society

Hochfelder:

Let’s talk about your involvement with the IEEE throughout your career, and particularly the Communications Society.

Cox:

I joined the IRE and the AIEE, the predecessors of the IEEE, when I was an undergraduate at Nebraska. There was always a little bit of competition between the two organizations, and collaboration too. Our student chapter was a joint chapter of the two. I became a member of IEEE when it was formed. I don’t remember when that was.

Hochfelder:

I believe it was formed in 1962.

Cox:

I transferred my membership over. I had joined a special technical group on communications in the AIEE, and IRE had some specialty groups with which I was involved. I think there was one group on antennas and propagation. I became a member of the Communications Society very early on. I have been active in the IEEE, the Communications Society and other societies throughout my career. I became particularly active after I went to Bellcore. There was not a home for personal communications. Cellular radio had a home in vehicular technology. There was a radio committee in IEEE in the Communications Society, but it was largely point-to-point microwave. There was also a satellite committee.

We thought that the Communications Society was a good home for this type of activity, and spent some effort in the 1980s trying to get sessions at ICC and Globecom on this kind of wireless communications, referred to at the time as personal communications. We were somewhat successful, and eventually a special group was formed under the Communications Society that was oriented to personal communications. I’ve been involved in reviewing papers from day one, served a stint as associate editor in antennas and propagation, and was co-guest editor of two or three JSAC issues of the Communications Society.

Hochfelder:

JSAC?

Cox:

I should remember what JSAC stands for. It’s a journal that puts out special issues on particular topics. I don’t remember what the acronym stands for. The C is Communications. [IEEE Journal of Selected Areas in Communications]

Hochfelder:

And the J is Journal.

Cox:

Yes, the J is Journal. I participated in organizing special issues, and have organized and chaired a number of sessions at many IEEE conferences, Globecoms and ICCs.

Hochfelder:

You were one of the first people to push for the inclusion of personal communications topics in the larger conferences like Globecom and ICC.

Cox:

Yes. I encouraged my people to submit papers, went to some of the committee meetings and tried to get people interested. Fred Andrews, who was at Bellcore at the time, was fairly heavily involved in IEEE Communications Society activities, and we managed to get his receptive attention. We probably never interacted directly at any meetings on it, but our indirect involvement probably helped with the infusion of personal communications into the Communication Society. I pursued that fairly heavily in the early to mid-‘80s, and when I got too busy with the work in my department, a couple of my other people pursued it pretty heavily. Pete Arnold was one.

Hochfelder:

Were the Communications Society and conference organizers receptive to including personal communications topics?

Cox:

In the early days there was significant resistance. When it became evident in the late ‘80s that this was becoming an important aspect of communications, the resistance died and a lot of support came. It was hard to swing anything for personal communications by itself, and I remember significant criticism in the Communications Society. We had managed to get a session approved first on personal communications issues that was joint between the radio group and one of the networking groups. One painful memory was when a gentleman came to one of the committee meetings and made a statement that we shouldn’t have any more such sessions. I had a somewhat heated debate with him and cited the large attendance that we had. That defused the issue, because the significant increase in interest was obvious.

Hochfelder:

That went on until about ten years ago, didn’t it?

Cox:

The dates are kind of fuzzy, but yes, until ten or fifteen years ago.

Hochfelder:

My impression is that today papers on aspects of personal communications make up a significant portion of the work presented at ICC or Globecom.

Cox:

They certainly do. In fact it was not long before it became difficult to get anything in there due to the volume of papers being submitted. Globecom held a mini-conference within the conference itself that was dedicated to just that. It’s hard to follow these threads through the organization because names change all the time. We called it cellular for a while, then tetherless communications, then personal communications, and now we call it wireless communications. There are little differences in the branches, but the central theme has been the same: wireless access to large-scale networks.

Hochfelder:

How important do you think the Communications Society has been in advancing the state of the art in communications technology in general?

Cox:

I think it’s been very valuable. The Transactions, the major meetings that they hold and the other meetings they support have been very important. They also support a lot of specialty meetings that have been very helpful, such as the ICUPC and the PIMRC. The PIMRC is not an IEEE-sponsored conference, but it’s one to which the IEEE lends their name. The involvement of IEEE has been extremely important to the technology. Without Communications Magazine, Personal Communications Magazine, and the Transactions, it would be much harder to disseminate the information.

Major developments in communications engineering

Hochfelder:

Generally speaking, how has the field of communications engineering changed throughout your career?

Cox:


Audio File
MP3 Audio
(364 - cox - clip 3.mp3)


The major change has been brought about by the same thing that’s changed most of electrical engineering, and that’s the processing that we have, the digital computer. The way we process signals and control networks today is vastly different from what it was in the early days with relays and discrete components. In the early days of cellular at Bell Labs the radio people thought that the cellular radio concept was insane because they couldn’t see how we would ever have the capability of handing one radio channel off from one base station to another and changing the connection of the land network at the same time. The switching people who worried about the network side thought it was insane because the circuits we were trying to hand off from one base station to the other were flaky. With discrete logic boxes, it was an extreme stretch.

Of course the microprocessor came along and solved the problem from both the mobile and fixed standpoints. A lot of intelligence gets built into these networks. Channel assignment and hand off are very processor intensive. The signal processing, adaptive arrays, equalization, rake receivers and spread spectrum are all done on the radio links, and those things make use of significant amounts of signal processing. With the ability to sample and digitize a signal, store it and then work on it, you have a little bit of time to go back over the signal again before you have to make a commitment. These things have changed the way we look at building the various parts of a communication system.

Hochfelder:

Has digital signal processing been the major breakthrough?

Cox:

Digital signal processing is one side of it. The control side, the microprocessor and computer random logic aspect, has been equally important. These are two different sides of the same coin. The digital signal processor is working on the same kind of problem over and over again at a high speed. The random logic is a control function: dealing with what you find, doing computations on it and deciding what to do. In most of the advanced digital cellular and PCS sets you find both a microprocessor and a digital signal processor, or at least the two functions. You find them in base stations too.

Hochfelder:

Would any of the developments you worked on personally have been possible without this increase in computer power?

Cox:

The things that I’ve done while at Stanford would not have been possible without it. I started long enough ago to have worked on some of these kinds of problems without this advantage.

Evolution of the communications engineering profession

Hochfelder:

How has the profession changed as a whole throughout the course of your career?

Cox:

That’s a good question. One thing that had a significant impact on communications engineering was the breakup of AT&T. Bell Labs was a very dominant force in communications, and there’s no way to deny that. As a result of some of the constraints that were on AT&T and Bell Labs, there was a policy of free publication. Papers were looked at before they went out, but most things got published, and in a fairly timely manner.

Hochfelder:

Through the Bell System Technical Journal?

Cox:

And through IEEE publications. Many of the articles published in the Communications Society journals have been written by Bell Labs authors. Bell Labs people participated very heavily in the Communications Society and in the IEEE in general and were generally permitted to do that as part of their work. Since then there has been somewhat of a retrenchment on free publication. Articles tend to be more cryptic, less revealing and less timely.

Hochfelder:

Is this because Bell no longer has a monopoly?

Cox:

That’s right. There is competition there. I doubt if there is data to support it, but there’s probably less free labor provided to the IEEE because competition has caused a tighter control over time and money. It’s much harder now for people to devote the quality of activity they once did, though there are individual cases that are an exception.

Hochfelder:

Do you think the breakup of Bell had beneficial as well as detrimental effects on research and communications engineering?

Cox:

It certainly had a negative impact on the exchange of information. Others will say it had a positive impact on innovation because there are more groups working on these problems now.

Hochfelder:

Based on what you’ve said, it seems like the emphasis shifted in the field as a whole from more pure research to more applied research, and that was one of the consequences of the breakup of Bell. Would you say that’s a fair statement?

Cox:

There’s been a shift in that direction.

Hochfelder:

Your remarks dovetail with what other people in the field have said, those who have worked through Bell Labs and Bellcore and people outside those organizations.

Cox:

It used to be that people from outside the Bell System who would come into Bell Labs or even Bellcore would be told everything. They might not be told about what was worked on yesterday, but certainly what was worked on the day before yesterday. I don’t think that’s true anymore.

Hochfelder:

Is this because of the need to keep proprietary information confidential?

Cox:

Yes, the need to keep proprietary rights until they’ve taken advantage of them. People have always said there wasn’t any competition before the Bell System breakup, but there was a significant amount of internal competition between different groups within Bell Labs.

Hochfelder:

Was research money easier to obtain back then because of the 1 percent that used to be awarded?

Cox:

That’s my perception. I’m not a good person to comment on that, since most of my work at Bell Labs in the early days was as a research engineer. Later on when I worked as a manager my perception was that it was much more difficult for me to get money and justify it than it had been for my bosses at Bell Labs.

Predictions for telecommunications research

Hochfelder:

What do you see as significant innovations that may take place in telecommunications in the next twenty years?

Cox:


Audio File
MP3 Audio
(364 - cox - clip 4.mp3)


I just gave a talk last week where I pointed out that to attempt to predict the future is rather foolish, because you can only be wrong. But fools tend to rush in where angels fear to tread. I don’t have a crystal ball. I think there is going to be continued use of wireless technology attached to networks. There is really no doubt about that at this point. I think that a lot of wire line phones are going to be replaced by wireless. We are seeing some of that already. I don’t think wire lines will ever be completely replaced because there are needs to have telephones at specific locations. We will probably continue to keep wire phones, but I think we are going to see a continuing proliferation of wireless attachments to the network.

I think we are going to see most of our voice communications carried on with wireless in some part. I think we are going to see low data rate messaging, e-mail, and you are going to be able to get at least short messages on your wireless communicator. I think we are going to see increased reliability in performance as the firestorm subsides. About all that’s been going on in the last ten years is trying to get as much wireless equipment in the field as possible, because it’s been very difficult for the industry to keep up with demand. I think that wireless is going to have a significant impact on the backbone networks. There is no doubt that the basic infrastructure is going to be fixed network. Wireless is just a network attachment, an access technology, but if you have mobile users everywhere that has a significant impact on the processing, data storage and protocols that are used in the network. That’s going to have a profound effect on the way these networks are built and operated. In a mobile network the user essentially resides only in a database, which is very different from a wire line network. There’s wire that starts at its own terminal point in a central office, works its way through cables and things to the end point, and users can be identified at any point along the way. In a mobile network you can’t identify users that way. They could be anywhere. That has a very profound effect on switching, routing and management, not knowing where users are and what they do. This requires a lot more processing.
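Cox's observation that in a mobile network "the user essentially resides only in a database" is the core of mobility management (GSM's home and visitor location registers work this way). A minimal sketch of the idea, in Python; all names here (`LocationRegister`, `register_user`, `route_call`) are illustrative inventions, not any real system's API:

```python
# Hedged sketch: a location register mapping a user's permanent
# number to a current attachment point. In a wireline network a
# user is found by tracing the wire; here, only by a lookup.

class LocationRegister:
    """Maps a permanent number to the current point of attachment."""

    def __init__(self):
        self._location = {}  # permanent number -> base station / switch

    def register_user(self, number, attachment_point):
        # Called whenever the mobile re-registers after moving.
        self._location[number] = attachment_point

    def route_call(self, number):
        # Routing a call requires a database query, not a wire trace.
        return self._location.get(number)  # None if unreachable

hlr = LocationRegister()
hlr.register_user("+1-555-0100", "base_station_17")
hlr.register_user("+1-555-0100", "base_station_42")  # user moved
print(hlr.route_call("+1-555-0100"))  # base_station_42
```

Every movement of every user generates an update, which is why Cox says mobility puts so much more load on processing, storage and protocols in the fixed network.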

Hochfelder:

Processing will continue to be important?

Cox:

Right. People who come from fixed network switching are always unhappy using their switches in a mobile environment because the switch won’t process nearly as many mobile calls as on a wire line network. I’ve heard some cite this as a big disadvantage of mobile networks. It is a disadvantage from the standpoint of switching, but when you look at what you’re getting from the switching intelligence, you’re getting what people want, which is the ability to be anywhere with it. The solution to that is to put more processing power in the switch. That is what is happening in the switches that are being adapted more generally for mobile systems. More processing has to be done before a connection is made, so the work shifts toward processing and away from connection-oriented activities.

Hochfelder:

You see a continuing role for wired forms of communication technologies. Some people say that the future will be wireless.

Cox:

I always ask them what they mean by that. We have gone from attempting to use wireless or radio for the entire connection: HF radio between continents, satellites between continents. In the case of cellular or PCS we’ve gone to using wireless as an access technology, as an endpoint to a fixed network. A cellular network has a lot of fixed interconnect. All the base stations are connected to fixed switching locations and all the switching locations are interconnected in the fixed network. I think part of the networks will always be fixed. I’m somewhat biased here, but in my view the trend is in the direction that our PACS activity was. If you look at what’s going on in the cellular and PCS environment, there is a trend toward using smaller base stations spaced closer together to increase capacity and overcome blockage of radio paths by buildings, hills, etc.

I have been in several different cities in the last couple weeks, and found it fascinating to see the old towers that were put up in the early days. They are high, with big antennas on top to cover large areas. Then you see a lot of fill-in that has been installed, antennas and base stations on the corners of large buildings; shorter antennas on masts sticking up alongside of highways in valleys; and in the cities at street corner intersections. In some places base stations have even been installed in hotel lobbies to take care of the surge that happens when a session lets out at a conference. There’s a tendency toward more and more wireless attachment points in the network, and when that happens you’re going to have more and more fixed interconnect.

Some will probably say we’re going to interconnect all these base stations with wireless, but I don’t believe it. Some of them will be connected with point-to-point microwave, and they are today, but most of it is going to be fixed network interconnection and connection to switching of some kind. There’s a lot of activity in packet oriented switching versus the classic type of switching, but it’s still fixed network switching.

Hochfelder:

Is that circuit based switching?

Cox:

Either circuit or packet based, they’re still oriented to fixed networks. I don’t share the vision of some that we’ll have 10 megabits to sets in our pockets.

Hochfelder:

That does seem high.

Cox:

It’s a little outlandish. Considering the system margins that are required to do that, it can never be done with the kind of battery technology we have today. If we get base stations every 100-200 feet, we might be able to do that, but I don't see things going that far. I don’t know what you do with 10 megabits you can put in your shirt pocket.
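The system-margin argument Cox makes can be illustrated with a back-of-the-envelope link budget. The sketch below uses purely assumed numbers (a ~250 mW handset, a 30 dB fading/shadowing margin, 2 GHz, free-space propagation) to show how quickly margin eats a pocket device's power budget as base-station spacing grows:

```python
import math

# Illustrative link-budget arithmetic. All figures are assumptions
# for the sketch, not measurements from any deployed system.

def free_space_path_loss_db(distance_m, freq_hz):
    # Friis free-space path loss: 20*log10(4*pi*d / wavelength)
    c = 3.0e8
    wavelength = c / freq_hz
    return 20 * math.log10(4 * math.pi * distance_m / wavelength)

tx_power_dbm = 24.0    # ~250 mW handset (assumed)
fade_margin_db = 30.0  # margin for shadowing and multipath (assumed)
freq = 2.0e9           # 2 GHz PCS band

for d in (60, 600):    # roughly 200 ft vs 2000 ft to the base station
    rx = tx_power_dbm - free_space_path_loss_db(d, freq) - fade_margin_db
    print(f"{d} m: received power after margin = {rx:.1f} dBm")
```

Each factor of ten in distance costs another 20 dB even in free space (more with real-world clutter), which is the arithmetic behind Cox's point that very high rates to a battery-powered pocket set would require implausibly dense base stations.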

Hochfelder:

That’s another interesting question.

Cox:

I may be at odds with some in the industry with this view. I think we’re going to see a lot of wireless, but I don’t think it’s going to be a broadband medium. I’ve spent a lot of time down the hall from the fiber optics people, and I can’t help but believe that fiber is the way to provide very high bandwidth to a fixed location. If you’ve got a large screen you’re not going to move around anyway, you might as well plug it into a fiber.

Hochfelder:

It sounds like the more wireless phones or personal digital assistants people have, the more need there will be for a wired backbone network.

Cox:

I think so – a smart wired backbone that can handle people being in different places.

Hochfelder:

What do you think some of the technical challenges will be over the next ten to twenty years? At the beginning of the interview you talked about lifelong phone numbers and the challenges that presents for switching.

Cox:


Audio File
MP3 Audio
(364 - cox - clip 5.mp3)


I see things like lifelong numbers. That’s part of the mobility management, data management part of the network. It’s the intelligence in the network. I see lots of room for advances there. When I started the project I mentioned earlier at Stanford, my collaborator was a professor in the computer science area who was a database specialist. I had the problem and she had the solution. We managed to tailor the database and protocol techniques into wireless mobility management techniques in a very profitable way, and there’s definitely a need for that. There’s a need for better resource management, channel assignments and power control whether or not you use smart antennas.

Having been a year or two in this business, I’ve seen a large influx of people into the wireless business who have historically concentrated on the link itself – equalization, coding, modulation, detection, all the things that are done on a single radio link. A lot of people have spent a lot of effort on that. There are a lot fewer people working on the radio systems issues and dealing with the millions of users and tens of thousands of base stations interacting with each other. We have a lot of room for more effort there, and it’s beginning. There has been a lot of effort put into the network side too, because a lot of fixed network people have come into this business. We’ve got people on both ends, and in the middle we need to deal with this new concept of a radio system with many, many base stations. We have to deal with the interactions of the users and the base stations.

One thing that is very different in cellular mobile systems today is that we deliberately reuse frequencies in a way that subjects some users to interference, and work is done to make sure that this interference is insignificant enough that the system is still acceptable. With the frequency reuse and either dynamic channel allocation or fixed allocation among closely-spaced base stations we incur the need for hand off, and there is a lot of room for improvement in the way we deal with hand off. Nowadays hand off is dealt with as a zero-memory process. You drive down the street and the handset keeps looking around at the various signals that it sees and says, “Oh, what I’m on isn’t so good, it hasn’t been so good for a while, and that one looks better. Go.” In the next block it may be the other way around. Hand off algorithms designed to cope with that situation tend to create large delays in the hand off, which can create dropped calls, or a lot of hand offs, which is a lot of work for the network that doesn’t accomplish much.
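The zero-memory hand off Cox describes, and the standard remedy of adding a hysteresis margin, can be sketched in a few lines. This is a toy illustration with invented signal traces, not any deployed algorithm:

```python
# Sketch: counting hand offs between two base stations as a mobile
# moves. With hysteresis_db == 0 this is the zero-memory scheme
# (switch whenever the other signal is momentarily stronger); a
# positive margin suppresses the ping-ponging.

def count_handoffs(sig_a, sig_b, hysteresis_db=0.0):
    serving = 0   # start on base station A (index 0)
    handoffs = 0
    for a, b in zip(sig_a, sig_b):
        strengths = (a, b)
        other = 1 - serving
        if strengths[other] > strengths[serving] + hysteresis_db:
            serving = other
            handoffs += 1
    return handoffs

# Invented signal traces (dBm) that cross back and forth block by block:
sig_a = [-70, -72, -75, -73, -76, -74, -78, -80]
sig_b = [-75, -74, -73, -75, -72, -75, -71, -70]

print(count_handoffs(sig_a, sig_b, 0.0))  # memoryless: ping-pongs
print(count_handoffs(sig_a, sig_b, 6.0))  # with margin: far fewer
```

On this trace the memoryless rule hands off five times while the 6 dB margin hands off once, which is exactly the trade Cox describes: delay versus churn.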

There are intermediate things that can be done, like the soft hand off that’s used in CDMA systems. Similar things can be done in TDMA systems and other types of systems. In deciding when to invoke soft hand off and when to be prepared to do a hand off, we can make use of information that a system can collect. The large-scale variations don’t change: buildings are always in the way and always do the same shadowing, hills are always in the way, and you’re always the same distance in a particular street from the same base station. Those large-scale effects are repeatable but small-scale effects aren’t, so you have to average them out. Those large-scale effects could be measured and stored. There is a lot of communication between a base station and a mobile to do that, additional communication or signaling, and that’s a lot of information to be stored in the base station and a lot of processing. This is where the computer industry is taking us down a very steep slope. There’s a lot of benefit to be accrued there. Control of the system is the key thing.
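Averaging out the small-scale effects to expose the repeatable large-scale ones is typically done with a sliding window over the received signal strength. A toy sketch of that averaging step, with an invented trace (window length and data are assumptions for illustration):

```python
import math

# Sketch: sliding-window local mean that smooths small-scale fading,
# leaving the large-scale (path loss + shadowing) trend a system
# could measure once and store per location.

def local_mean(samples_db, window):
    out = []
    for i in range(len(samples_db)):
        lo = max(0, i - window // 2)
        hi = min(len(samples_db), i + window // 2 + 1)
        chunk = samples_db[lo:hi]
        out.append(sum(chunk) / len(chunk))
    return out

# Invented trace: a slow -0.1 dB/sample trend plus fast fading wiggles.
raw = [-70 - 0.1 * i + 5 * math.sin(i * 1.3) for i in range(50)]
smoothed = local_mean(raw, window=11)
# The smoothed curve follows the trend, not the wiggles.
```

Because the smoothed curve is repeatable along a given street, it is a candidate for the "measure and store" approach Cox mentions, at the cost of the signaling, storage and processing he notes.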

Hochfelder:

Do you have any concluding comments or thoughts?

Cox:

Not unless you have more questions. I came completely unprepared.

Hochfelder:

Thank you very much. I appreciate your time.