Oral-History:Bede Liu
About Bede Liu
Dr. Bede Liu was born in Shanghai, China in 1934. His family moved from Shanghai to Chungking, and returned after the end of World War II. In 1949 his family moved to Taiwan, where he attended National Taiwan University and received his bachelor's degree in electrical engineering. He then immigrated to the United States, joining his parents who had already moved there, and began working at the Western Electric Company. Liu began graduate school at Brooklyn Polytechnic Institute, where he received his EE master's degree and doctorate in 1956 and 1960, respectively.
He was a member of the technical staff at Bell Laboratories from 1959 to 1962, and then joined the faculty of Princeton University. His pioneering digital signal processing research has encompassed floating-point error analysis, the design and implementation of digital filters and filter design programs, image processing for HDTV and medical imaging, and especially the development of discrete-time signal and system theory, for which he was elected an IEEE Fellow (1972) [fellow award for "contributions to discrete-time signal and system theory and engineering education"]. He has received many other awards, including the IEEE Centennial Medal, the Signal Processing Society's Technical Achievement Award (1985), and the Circuits and Systems Society's Education Award (1988). He has been active on many ASSP Society and IEEE general committees, served on the IEEE Board of Directors (1984, 1985), and served as President of the Circuits and Systems Society.
The interview covers Liu's pioneering career in digital signal processing. Liu describes his childhood in China and the educational experiences that led to his career at Princeton University. He recalls receiving his graduate degrees from Brooklyn Polytechnic, working at Western Electric, and being part of the Bell Labs technical staff. He describes his early work with computer circuit networks and how it led to his focus on digital signal processing. Liu explains the challenges of working with floating-point arithmetic, early digital filters, digital computational systems, signal processor chips, and problems of image processing. He discusses important books and articles on digital signal processing, as well as the contributions of various pioneers in the field. Liu gives his own definition of digital signal processing, and weighs it against others' definitions. The interview concludes with Liu's discussion of digital signal processing's applications in the biomedical field.
Related interviews on the history of signal processing include Leo Beranek Oral History (1996), Anthony Constantinides Oral History, James W. Cooley Oral History, Alfred Fettweis Oral History, James Kaiser Oral History, Sanjit Mitra Oral History, Takao Nishitani Oral History, Tom Parks Oral History, Hans Wilhelm Schuessler Oral History, and Stanley A. White Oral History.
About the Interview
BEDE LIU: An Interview Conducted by Frederik Nebeker, Center for the History of Electrical Engineering, 10 April 1997
Interview # 333 for the Center for the History of Electrical Engineering, The Institute of Electrical and Electronics Engineers, Inc.
Copyright Statement
This manuscript is being made available for research purposes only. All literary rights in the manuscript, including the right to publish, are reserved to the IEEE History Center. No part of the manuscript may be quoted for publication without the written permission of the Director of IEEE History Center.
Request for permission to quote for publication should be addressed to the IEEE History Center Oral History Program, IEEE History Center, 445 Hoes Lane, Piscataway, NJ 08854 USA or ieee-history@ieee.org. It should include identification of the specific passages to be quoted, anticipated use of the passages, and identification of the user.
It is recommended that this oral history be cited as follows:
Bede Liu, an oral history conducted in 1997 by Frederik Nebeker, IEEE History Center, Piscataway, NJ, USA.
Interview
Interview: Bede Liu
Interviewer: Frederik Nebeker
Place: Princeton, N.J.
Date: 10 April 1997
Childhood, family, and education
Nebeker:
Let's start with your background, where and when you were born, and a little about your family.
Liu:
I was born in Shanghai in 1934, before the Second World War. We moved inland from the coastal provinces step by step as the battle front was getting closer and closer. After the war, we returned to Shanghai. Then in '49 my parents took us to Taiwan. I went to National Taiwan University. By the time I finished, my parents were already in the United States.
Nebeker:
Your father was an engineer?
Liu:
Yes, I'll tell you a little bit about him. My grandfather was a farmer. He had his own land and was able to hire help to till it. My father went to an old-fashioned tutor for a couple of years, studying the Confucian classics. He had to give it up to work in the fields. When he was seventeen or eighteen, he went to the equivalent of eighth grade in Nanking. It was a very poor school, and every night he had to go outside the city to find a tutor to help him. A year later, he skipped a grade and got into the best high school in Nanking as a tenth grader. When he finished high school he got into the best engineering school, Chiao Tung University in Shanghai. He finished in 1930. He worked at various power plants and in administrative positions.
Nebeker:
His field was electric power?
Liu:
Electric power engineering. In that sense, I had no choice but to become an electrical engineer. My brother rebelled, and became a mathematician! As I told you a little earlier, my father was sent by the government to the United States in 1944 during war time. In fact, on New Year’s Eve in ‘45 he was at Times Square. He told us about the crowd. I was still in Chungking, in mainland China.
After the war, we all returned to Shanghai. He came back and worked there until the spring of '49, when the whole family moved to Taiwan. I finished my last year of high school in Taiwan, and then studied at National Taiwan University. He immigrated to this country in '52. He started work as an electrical engineer, designing chemical plants for American Cyanamid and maybe a couple of other places. I think one other company was called the Chemical Construction Company (CCC) at that time; I don't know if it's still there or not. I came here after college.
Immigration to the U.S.; Western Electric
Nebeker:
You had a bachelor’s degree in hand?
Liu:
Yes. Most of my classmates came here as students, but I came as an immigrant. That means I had to register at the draft board and be classified as 1-A and go through all that. I started working at Western Electric.
Nebeker:
While you were a student?
Liu:
Yes. Let me go back. I arrived here in New York City at Idlewild airport, the previous name for JFK. It was a Thursday night. On Sunday, my father showed me the New York Times and asked me to look through it. Monday morning I took the paper, got on the subway, and looked for a job in the city.
Nebeker:
Was your English good at that point?
Liu:
It was terrible, but I could get by. I was close to some of the Jesuit missionaries and some Benedictine sisters at Taiwan College. That helped my English, but my command of the language was still rather poor. The first place I walked in was ninety-nine Church Street, in New York City. Western Electric Company had a branch there. I walked in and showed them my transcripts. One hour later, I had a job at something like $365 a month. That was in November, less than a week after landing in New York.
That February I started graduate school at Brooklyn Polytechnic Institute. I remember walking into the office of Tony Giordano. He looked at my transcripts and had me sign various forms to start my graduate education.
Nebeker:
This was while you were still working at Western Electric?
Liu:
Yes, I took the maximum number of credits allowed for a part-time student. In addition to the thirty-seven-and-a-half-hour work week, there were roughly ten hours of overtime. I did that for one year, until February of '56, when I decided I really had to go to school full-time. I could keep up with the courses, but I really didn't have time to think deeply, to get a really good understanding of what I wanted to do. I went to see Giordano and a couple of other professors, and I was made a teaching fellow. I resigned from Western Electric.
Brooklyn Polytechnic Institute
Liu:
As a teaching fellow, I had to teach, I believe, a five-credit sophomore circuit course. I was paid $550 for a semester. I got my master's in '56. My father and I got our degrees at the same time; I think it was the first [time] that happened in the history of the college. We got our picture in the paper and all that. It was the 102nd or 101st commencement.
Nebeker:
You were in electrical engineering?
Liu:
Yes, mostly electronics applications. My father was in power, so we never overlapped. By the time I started, he was done with course work and was doing his thesis.
After I got my master's, I continued my Ph.D. work. That summer I took a job with Allen B. DuMont Laboratories in their instrumentation division, working on distributed amplifiers, which are one way to get bandwidth out of vacuum tubes. I continued with my work at Poly.
Bell Labs
Liu:
I applied for and was fortunate enough to be given a Bell Laboratories fellowship, to begin in September 1957. In the summer of 1957, before the fellowship started, I had a summer job at Bell Labs in Murray Hill.
Nebeker:
What were you doing there?
Liu:
What was called a transversal filter, using transmission lines.
Nebeker:
What was the application of that?
Liu:
That division was interested in data communications, or pulse transmissions. These were the days before data communications really got into full swing—there were a number of circuit problems that people needed to look at.
Nebeker:
This was some kind of filter using analog electronics?
Liu:
Yes, digital was nowhere in sight. That’s not true—digital was beginning to show up. That was the first summer. I also worked there in the summer of ‘58, the second summer; I forget what I did.
Ph.D. thesis
Liu:
I finished my thesis in December of ‘59.
Nebeker:
Your degree was granted then in 1960?
Liu:
Yes. Poly grants degrees once a year. I was cutting it quite close because my fellowship ran out after two years. I was married and a child was on the way. I had to pass my language examination. We had only one language requirement, so I studied French off and on for one month, and then intensely in the last week. We were allowed to use a dictionary to translate, so I trained myself how to flip the pages fast. I somehow managed to squeeze through that. I handed in the draft of my thesis at the end of August, and started work for Bell Labs.
Nebeker:
What was your thesis?
Liu:
It was on time-domain approximation of network responses. At that time, the IRE had many professional groups, what are now called technical societies. The group I was associated with was called the Professional Group on Circuit Theory (later, the Circuits and Systems Society). As far as circuit networks were concerned, it was very popular to synthesize networks from a given specification. Very elegant theory, this network synthesis. I had the good fortune of studying this topic under some of the world leaders, including Ronald Foster. When synthesis first started, the specification was primarily in the frequency domain. But at that time, people began to feel that pulse transmission was important. So the time response, or transient response, became important. That's how I chose my thesis.
My advisor was John Truxal, for whom I have a lot of admiration. I also worked very closely with Professor Papoulis, from whom I learned a great deal. I studied several subjects under him. When I was trying to pick a thesis advisor I had two choices: one was to do circuits work with Truxal, the other was to do E & M under Leo Felsen. I took the graduate E & M course from him—a great teacher and easy for a student to admire. But I picked Truxal to do my thesis. Let's see, where was I? At Bell Labs.
Bell Labs Exploratory Development Department
Nebeker:
What was your initial assignment at Bell Labs?
Liu:
I have to think. The department I joined used to be called the Exploratory Development Department—halfway between research and development. It was concerned with transmission circuits. At that time, there needed to be some planning for the demand for communication channels. The conclusion was that they needed much wider band facilities. The candidate at that time was a circular wave-guide system because of the bandwidth. So the whole department was pushed in that direction.
Nebeker:
This would be for trunk lines?
Liu:
- Audio File
- MP3 Audio
(333 - liu - clip 1.mp3)
Yes, long-haul telephone systems. This waveguide system was never put into service; the technology was not quite ready. It was supposed to be for long-distance PCM. That was about the time the T-1 was first put into service. T-1 was developed by a neighboring department—Bob Aaron, Eric Sumner, that group of people. At that time there was a lot known about analog transmission, but not much about digital, particularly with respect to PCM. There was the question of timing, for example, in the repeaters. When you use multiple repeaters, the timing error accumulates. I was put on that problem a little. It was felt that analysis alone probably would not carry us all the way to a solution; we should complement the work using computer simulation. Digital computers at that time were not quite to the stage they are now, so analog computers were used. Talk about hybrid computers!
Talking about computers—when I was in the last stage of writing my thesis, I had to do some computation. Bell gave Poly an old relay computer they had retired. It had twelve registers and a whole room full of equipment, and I used that to do some of the computation. That was my first time doing that.
Nebeker:
How long did you stay on that project?
Liu:
I think I was supposed to always be on that project. I was there for about three years and I really didn’t do much for them at all. It was very interesting, looking back. I guess it’s not uncommon with a Ph.D. When one gets a Ph.D., one feels very special, that the world owes them a great deal. So, I felt that I should be able to do whatever I wanted and not be responsible for anything.
Bell Labs work environment
Nebeker:
Did you like those years at Bell Labs?
Liu:
I liked it very much in the sense that it helped me to mature, but I don’t think Bell liked me because I did very little for them. I remember a supervisor that I had, he was a little older and gave me good advice—that one has to deal with the real world. If something has to be done, somebody in the company has to do it. I did manage to publish one chapter of my thesis, but I was very frustrated because I did not have a clear-cut goal.
Nebeker:
You mean in your career?
Liu:
Yes. I did not realize that the Ph.D. is a training ground. From getting your degree to becoming a more mature researcher, there’s a big gap. It takes nurturing to grow. You have to have a good environment, and you also have to understand yourself—your strengths and weaknesses.
Nebeker:
Were you unenthusiastic about the project you were working on?
Liu:
Yes, that's part of it. I'll tell you what my supervisor told me. He said, you know, if the director of the lab were told to do something, he would give the assignment to the next-level manager. Then the next-level manager would go down to an MTS [member of the technical staff] and assign it to him. If that MTS could not do it, then his manager would have to do it himself. If that manager couldn't do it, the next level up would have to do it. The important thing was that somebody had to get it done. This was hard for a fresh Ph.D. to accept, particularly one thinking the world owed him a good living. Certainly a living, maybe even a good living.
Nebeker:
Wasn't there a practice at Bell Labs of allowing people to choose their own projects to a large degree? Certainly in the research division.
Liu:
I think Bell Labs is this way. I was not in the research division. I do not know how much difference it makes. If you talk about Bell Labs as a whole, basic research is a small part. There are many other divisions because it is basically the technical arm of AT&T. But somebody has to do the system work, somebody has to come up with the design that is eventually implemented. That's where the income comes from. The reputation of Bell Labs in the research world was made primarily by a small group of researchers, and this was hard to get through to a fresh Ph.D.
Princeton Univ.; signal sampling
Nebeker:
What happened after three years?
Liu:
I was fortunate to receive an offer to come here to Princeton and teach; that was in September of ‘62. I guess the three years at Bell Labs really helped me to grow up. I was able to take responsibility much better. Took a cut in salary, but I have enjoyed every minute of it.
Nebeker:
Tell me about your research in the first years here.
Liu:
I was very fortunate to have a senior professor, John Thomas, who gave me a lot of good advice and guidance. We collaborated on some work, as well. I had gotten away from circuits work during the three years of communications at Bell Labs. When I came here, the first thing that John Thomas and I did was some joint work on signal sampling.
- Audio File
- MP3 Audio
(333 - liu - clip 2.mp3)
Thomas told me, “You really should get involved with advising graduate students. Here is a person that I am advising. Why don’t we do it jointly?”
I was very fortunate to be given that opportunity. The person was extremely talented, Bob Kahn. We are about the same age, but he was probably more mature than I was, even though I was faculty and he was a graduate student. Bob Kahn was really the driving force behind Arpanet and Internet. I learned a great deal working with John and Bob. We did some very interesting work together on signal sampling.
Bob essentially wrote two theses. It was his misfortune to be supervised by two professors with somewhat different interests. With me, his work was primarily on the sampling of signals. With John Thomas he did some work on modulation effects, particularly simultaneous amplitude and frequency modulation. Bob should have gotten two degrees. In '63 we recruited another faculty member, Ken Steiglitz, who was one of the earliest people to work on digital filters. One chapter of Ken's thesis from NYU is on using the bilinear transformation to design digital filters.
Fast Fourier Transform; digital filters
Liu:
During the summer of 1966, there was a COSINE Committee meeting at Princeton. The COSINE Committee was an NSF committee. Mac Van Valkenburg was very active at NSF doing things of this sort, and I was involved in various discussions. Most of it went way above my head. A colleague of mine said, "Bede, at night we have a discussion group on this thing. Why don't you handle one of the sessions?" I said, "What do you want me to discuss?" He said, "Here is this new paper on something called the FFT. Why don't you talk about that?" I said, "I never heard of anything like this." So I quickly read up—it was the classic paper of James Cooley and John Tukey. I found it fascinating. I was able to understand it, present it to a room full of people, and convince them of its merits. I think it was the summer of 1966 or '67.
Nebeker:
Cooley-Tukey came out in '65.
Liu:
I can tell you a little bit later on of my view of the development of signal processing. You might as well get a biased version someplace.
At the 1967 Allerton Conference, a very innovative, original researcher at Bell Labs named Irwin Sandberg presented a paper on the analysis of the accuracy of digital filters using floating-point arithmetic. At that time, the hardware was very primitive, so most digital filters were implemented on the computer. The algorithms used mostly floating-point arithmetic; fixed-point arithmetic is easier to analyze for accuracy—the round-off errors and so forth. Floating point is much more difficult, certainly at that time. Now, we all know how to do that. It's interesting to go back a little bit. The first digital filter was built at Bell Labs, in H. S. McDonald's group.
Nebeker:
What was that for?
Liu:
- Audio File
- MP3 Audio
(333 - liu - clip 3.mp3)
I don’t think it had a specific application, but people saw that down the road it would be very useful. I think it’s primarily for channel-dropping filters, and maybe even for touch-tone dialing. Jim Kaiser can give you a very good history of that. Well, they built the filter, and everybody thought digital circuits would be just ones and zeros: you flip a switch and it should work like a charm. But they turned the switch and it didn’t work. They checked it out, and no mistakes had been made. It just didn’t work. Some very smart people got into it and found out it was really the effect of the round-off error. It causes oscillations, limit cycles, all sorts of problems.
Work was done to correct that. You know that in the digital world, a number is represented by a finite number of bits. When you take two eight-bit numbers and multiply them, you get sixteen bits. But you only have an eight-bit register. You have to squeeze it in and drop something out. What's left over is called round-off error, and it can play havoc. Now, this is a fixed-point thing, but when you implement things on the computer, typically you use floating-point arithmetic. With floating point—without getting into detail—the errors are introduced through a slightly different mechanism. Well, it is the same mechanism, but it enters in a different way, which makes the analysis very hard. Irwin Sandberg presented a very good paper at the Allerton Conference which pointed out the problem and ways to handle it. The approach he took was a deterministic approach. I guess you can call it classical error analysis, or the numerical analysis approach.
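[Editorial note: a minimal Python sketch of the round-off mechanism Liu describes; the word length and operand values are illustrative assumptions, not from the interview.]

```python
# Multiplying two 8-bit fixed-point numbers gives a product up to 16 bits
# wide; to fit it back into an 8-bit register, the low bits must be
# dropped. What is dropped is the round-off error Liu describes.
a = 0b01011011        # an 8-bit operand
b = 0b00110101        # another 8-bit operand

full = a * b          # exact product, up to 16 bits wide
kept = full >> 8      # squeeze back into 8 bits: keep only the top 8
error = full - (kept << 8)   # the discarded low bits = round-off error

print(f"exact product: {full:016b}")
print(f"kept 8 bits  : {kept:08b}")
print(f"round-off    : {error}")
```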
I found it very interesting. I came back and talked to Toyohisa Kaneko and said the proper way to look at the problem is through statistical probability. I said, “This is a very interesting paper, but the underlying problem is one of many, many small errors being introduced, and we should be able to analyze the problem using this probabilistic model.” We talked about this for a little while, and a few days later Kaneko said, “Yes, everything carries through fine.” So we wrote it up and sent the abstract to Allerton again for the ‘68 conference. One week before the conference he came in and he said, “No, it didn’t work." Sure enough, it did not work; we overlooked something. We quickly tried to do a lot of things, and finally I was able to find a way to analyze the problem, which actually made the problem much more interesting. It got the result out. And that’s my first work on digital filters.
Nebeker:
Can you explain that in layman’s terms?
Liu:
Okay, in fixed-point arithmetic, the whole problem can be treated as an additive noise. That is, the error and the quantity you are interested in (the signal) are more or less independent of each other. In floating point, the round-off error is dependent on the signal. That makes the problem nonlinear, and so a lot of the linear analysis tools simply do not apply.
Nebeker:
And you are also bringing to it this probabilistic analysis, is that right?
Liu:
You can use probabilistic analysis on the fixed-point round-off error; people did that right around that time. The independence of the error from the signal makes it easier to analyze. Those linear analysis tools do not apply to floating point, because there the error is dependent on the signal, and not in any obvious way. If your system works reasonably well, the error should be small. When the error is that small, we can make certain approximations which can be very good. With those approximations, the problem again becomes a linear problem, and you are able to analyze it.
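[Editorial note: a small numerical illustration (the editor's, not Liu's analysis) of the distinction he draws: fixed-point round-off is additive and essentially independent of the signal, while floating-point round-off is relative, growing with the signal magnitude.]

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(scale=100.0, size=100_000)

# Fixed point: quantize to a uniform grid; the error size is set by the step.
step = 0.25
e_fixed = np.round(x / step) * step - x

# Floating point: round the mantissa to 6 bits; the error scales with |x|.
mant, expo = np.frexp(x)
e_float = np.ldexp(np.round(mant * 2**6) / 2**6, expo) - x

# The floating-point error is strongly correlated with the signal magnitude;
# the fixed-point error is not.
print("corr(|x|, |e_fixed|):", np.corrcoef(np.abs(x), np.abs(e_fixed))[0, 1])
print("corr(|x|, |e_float|):", np.corrcoef(np.abs(x), np.abs(e_float))[0, 1])
```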
Nebeker:
Was the success with floating-point error analysis influential?
Liu:
I don’t know, it depends on how modest you are. When you finish a particularly frustrating problem, you think it’s the world’s most important thing.
Nebeker:
Was it cited much?
Liu:
I think quite widely. Next, we said, "Okay, we did this digital filter, so let's look at [the FFT], because it is used very widely." Again, it turned out to be a very interesting problem. The referee report on the article that came out of it was very interesting.
Nebeker:
Who were you collaborating with on this?
Liu:
A graduate student, Kaneko, from Tokyo University. We sent an article on it to The Journal of the ACM. I remember two reviews, both very short. One reviewer said, “This is very significant work. I have already presented it to my class.” Another reviewer said, “This is the way the problem should be analyzed, and this is the kind of work that the journal will be known for.” Just two opinions!
At that time, there was a lot of interest in putting these things in hardware. Digital hardware then was very primitive compared with what we can do now. One of the bottlenecks was the multiplying of the signal. We thought of various schemes to eliminate the hardware multipliers.
Delta filter project; distributed arithmetic
Liu:
- Audio File
- MP3 Audio
(333 - liu - clip 4.mp3)
In 1967 or so, I proposed to NASA a way to implement digital filters using delta modulation, without the multiplier. In delta modulation, it is easy to do multiplication by one bit. That was not funded. But, two or three years later, I was working with Abe Peled and I asked him to look at it again. He came back a couple of days later and said, “This is a great idea. It works.” We continued working on it and he said, “Oh yes, you can do this sort of thing; that will be very good.” It is now the scheme most people call distributed arithmetic.
The idea is this: in signal processing, you do not need the full multiplier. A full multiplier takes two arbitrary numbers and forms their product. In digital signal processing, what you have is the signal, which is arbitrary, multiplied by a constant, a coefficient of some sort. You don't need a full-fledged multiplier to do that, and this can be used to advantage. Of course, we thought it was great work. It turned out some others had thought of the same idea around that time, maybe even a little ahead of us, but for some reason it did not receive wide publicity. Stan White at Rockwell was very interested in this. He dubbed the scheme the "Princeton Multiplier." I worked on this for a little while—I really should say my students and I—because a lot of the ideas really came from them, not me.
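[Editorial note: a minimal sketch of the distributed-arithmetic idea as Liu explains it—the coefficients are fixed, so the inner product can be done with a precomputed lookup table and shift-adds instead of a general multiplier. This is an illustrative reconstruction, not the Peled-Liu implementation.]

```python
from itertools import product

coeffs = [3, -5, 7, 2]   # fixed filter coefficients (assumed integers here)
BITS = 8                 # word length of the unsigned input samples

# For every possible pattern of one bit from each input, precompute the
# sum of the coefficients whose corresponding bit is 1.
lut = {bits: sum(c for c, bit in zip(coeffs, bits) if bit)
       for bits in product((0, 1), repeat=len(coeffs))}

def inner_product(samples):
    """Compute sum(c*x) with table lookups and shift-adds, no multiplier."""
    acc = 0
    for b in range(BITS):                        # one pass per bit plane
        bits = tuple((s >> b) & 1 for s in samples)
        acc += lut[bits] << b                    # lookup, then shift-add
    return acc

x = [17, 200, 3, 99]
assert inner_product(x) == sum(c * s for c, s in zip(coeffs, x))
print(inner_product(x))   # -730
```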
Nebeker:
Who were your students on the delta filter project?
Liu:
I can give you a list; I have had forty-three successful Ph.D. students so far. A lot of them worked on digital signal processing problems.
Nebeker:
I was wondering about the distributed arithmetic.
Liu:
Abe Peled. He later went to work for IBM and became the vice president in charge of research at Yorktown. A very interesting person. If you want, I can talk about him for a little while.
Nebeker:
I saw Stan White’s review article on distributed arithmetic some years back in a Signal Processing magazine.
Liu:
That was some years back, how did you find it?
Nebeker:
Just looking through old issues. I interviewed him a couple of years ago, actually. A very interesting fellow.
Liu:
We looked along the same lines at using adaptive DPCM for the implementation of digital signal processors.
Nebeker:
What was the direction of your efforts? Was it the bigger applications that seemed to need a certain type of filter?
Liu:
Who can say why the spirit moves him in a certain direction? You know, you think about an area, you see what has and has not been done. Wouldn’t it be nice to have this thing done or that thing?
DSP and control
Nebeker:
So this is more internally directed?
Liu:
By and large that is the case. Dave Munson, for example, did some very interesting work with it. Here's one thing that Dave and a couple of others have done. Because there is error, you want to reduce it. There is a scheme called error spectrum shaping. Various people worked on it—Sid Burrus, Arimosha Agamoff at Rice did some very interesting work on it. Essentially, if you are able to take the error, modify it a little bit, and put it back into the filter, it becomes a self-cancellation mechanism. With that, your life becomes a lot easier.
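[Editorial note: a generic first-order error-feedback quantizer, sketched only to illustrate the error-spectrum-shaping idea Liu describes; it is not taken from any of the papers he mentions. The previous quantization error is fed back so that the output error is high-pass shaped and largely cancels in the signal band.]

```python
import numpy as np

def quantize(v, step=0.1):
    return np.round(v / step) * step

def shaped_quantize(x, step=0.1):
    y = np.empty_like(x)
    e = 0.0
    for n, xn in enumerate(x):
        u = xn - e              # feed the previous error back in
        y[n] = quantize(u, step)
        e = y[n] - u            # error made at this step
    return y                    # output error y - x = e[n] - e[n-1]

t = np.arange(4000)
x = np.sin(2 * np.pi * 0.01 * t)          # slow, in-band test signal
err_plain = quantize(x) - x
err_shaped = shaped_quantize(x) - x

lp = np.ones(32) / 32                     # crude low-pass = "signal band"
rms = lambda e: np.sqrt(np.mean(np.convolve(e, lp, mode='same') ** 2))
print("in-band error, plain :", rms(err_plain))
print("in-band error, shaped:", rms(err_shaped))
```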
Now all of this has to do with the technology, because at that time, the hardware was expensive. Every bit costs you something, so you try to squeeze the maximum out of whatever hardware you have. Right now, the economic consideration is very different, you say, “Three or four more bits, fine.”
Nebeker:
Economics has changed even the way theoretical work is done?
Liu:
Yes. Maybe I can go back a little bit to my view of what DSP (digital signal processing) is. If you go back to the early days—before Euler numerical integration, before Houghteger formulas—and you take the computation away from engineering applications and then ask where DSP began, I would have to say that it began with the sampled-data system, digital control. That was in the '50s. That's where the z-transform was developed. Maybe you called them hybrid systems, but some of these certainly had digital signal processing embedded very well. This is why Ken Steiglitz was doing digital filter design. His advisor, Sheldon Chang, was one of the pioneers in sampled-data systems.
Nebeker:
This was data sampling for control systems?
Liu:
Yes. At that time, people thought in terms of digital logic, digital computer—you can already find the terminology of computer control systems coming in.
Nebeker:
There were much older control systems, entirely analog. What changed was the availability of digital computation. So then you moved to some kind of ADC in order to be able to use the digital computation?
Liu:
Yes, a very primitive ADC. You wouldn't even call it computation, more a simple manipulation. In terms of communication, the signals were analog. The T-1 system is PCM. Bill Bennett did a lot of the PCM work in the 1950s. His classic paper on the quantization of signals came out of that sampling and quantization work. The optimal quantizer papers by Lloyd and Max came around that time. If you were thinking, "What can I do with it?"—the hardware was very primitive. You were left with some very isolated application areas, audio being one, digital control another, and some other things.
Digital filter researchers, 1960s; Arden House
Liu:
Around that time, the Cooley-Tukey paper was published. There's really no inherent reason why digital filters and the FFT should be tied together, any more than a lot of other fields should be tied together. But a lot of people who were interested in the FFT were also interested in digital filters. So after Cooley and Tukey's paper, a bunch of people organized a workshop at Arden House. I think that the push to publicize DSP really owes a lot to this group of people. Cooley was there, Jim Kaiser, Al Oppenheim, Ken Steiglitz—a whole bunch of people did a lot. It was a very good atmosphere; people went there just to talk about it.
If you look at that point in time, there were a lot of things people did not know, like how to design digital filters in a systematic way. People did not understand the kind of accuracy you could get out of the thing, or the limitations. Some digital filter design is straightforward: take an analog filter design and change it into a digital filter design, the so-called recursive or IIR filter design. You can more or less piggyback on the analog filter design. The non-recursive filter, or FIR filter, design is really a series approximation. The Fourier series is an L-2 norm approximation, but engineers do not like to use L-2; they like L-infinity. So that becomes a Chebyshev approximation problem. The classic paper is really the paper by Tom Parks and Jim McClellan. Tom Parks is at Cornell, a wonderful person, and Jim McClellan is very clever. Their algorithms were later refined by others. Larry Rabiner also helped a lot. Talk about impact—that filter program is probably the most widely used filter design program of any sort. After that paper, most people would have some idea how to design an FIR filter, at least at first cut.
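[Editorial note: today the Parks-McClellan equiripple design Liu describes is a library call. A short illustration using SciPy's remez, a modern descendant of that program, not the original 1973 code:]

```python
import numpy as np
from scipy.signal import remez, freqz

# 65-tap linear-phase low-pass FIR filter; frequencies are fractions of
# the sample rate: passband 0-0.20, stopband 0.25-0.50.
h = remez(65, [0.0, 0.20, 0.25, 0.50], [1.0, 0.0], fs=1.0)

w, H = freqz(h, worN=2048, fs=1.0)
print("max passband ripple:", np.abs(np.abs(H[w <= 0.20]) - 1.0).max())
print("max stopband level :", np.abs(H[w >= 0.25]).max())
```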
Nebeker:
In the mid-to-late 1960s, who were the people interested in digital filters? What were the communities?
Liu:
One group was the academicians, particularly the analog filter design people, who were looking for interesting problems to investigate. I don't know what Tom Parks' Ph.D. thesis is on, but I am sure there is something in there on analog systems. You find people like that. Another group is the audio people. I don't think the speech processing people were quite there yet. I think just around 1967 people began to talk about LPC.
Nebeker:
So speech processing is one community.
Liu:
Yes, plus the circuit people, because of the word “filter.”
Nebeker:
These would be the academic circuits people?
Liu:
Yes.
Nebeker:
Are there application areas for this?
Liu:
No. As far as digital filtering is concerned, the serious applications had not emerged yet. By serious applications, I mean ones you cannot do without, or ones that are put into a large number of products. For large quantities, you have to turn to the consumer. Another group is the radar people. At that time, they used chirp radar. I think the paper on this "chirp" radar was published in the late 1950s in the BSTJ. Chirp radar has a linear frequency change, and it's very hard to do in analog. So once the Cooley-Tukey paper was published, you would find it in a couple of places. Actually, the early FFT hardware came out of that area.
Nebeker:
Was that in the radar field?
Liu:
Yes, radar processing, and you also found sonar people interested. That was the group for serious applications. It was a serious application because people actually built it to put in systems for that purpose. Speech at that time was mostly in the research stage, but they did have an application goal.
Emergence of DSP; DSP communities
Nebeker:
You were telling me about the beginnings of digital signal processing. What I’ve heard credits its start more to the speech processing people: Bell Labs, the MIT group. You’re telling me that the control-systems work was important. Maybe I've been talking to the speech processing people too much.
Liu:
The z-transform was developed there.
Nebeker:
You were mentioning the radar and sonar people—if one looks at that field of digital signal processing, do you think of it as emerging in the mid-‘60s?
Liu:
Yes, in the following sense: from ‘65 to ‘70, a lot of people found it interesting, which means they saw potential in it. There were also issues that we didn't understand. Right now we have a very complete understanding (okay, a biased comment of mine).
- Audio File
- MP3 Audio
(333 - liu - clip 5.mp3)
I think signal processing is becoming so popular for two reasons. It is application-driven, particularly as seen now, and it is very much tied with technology. So you can identify applications; you have needs to actually build systems to do certain tasks.
Nebeker:
Are you saying that this class of mathematical problems is attractive to people because there are immediate applications?
Liu:
I would like to modify the comment. I think digital signal processing involves no profound mathematics. There are clever techniques to solve these things, but discrete systems are easier to analyze than continuous systems. For example, there is a concept called statistical stationarity. It means the statistical properties do not change in time. Now, as far as a linear system is concerned, everybody understands the concept. There is also a concept called time invariance, or a stationary system: the properties of the system do not change. It is well known that if you have a random signal that is stationary and you put the signal through a linear, time-invariant system, then the statistical properties of the output signal also do not change.
If the system is time-varying, then this property is not preserved, in general. However, there is a special class of time-varying systems where such a property is preserved. That's a very interesting problem. The proof of that for the analog system is much deeper and more difficult than for the digital system. You can always say that whatever you can do with an analog system you can do with a digital system (with some exceptions, because the study of analog goes back so many years; there's much accumulated information). Although the math of most digital systems is equivalent to analog, there are different techniques involved in solving specific problems—you can almost trace the counterparts. I may have a minority view on this.
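[Editorial note: in symbols (the editor's notation, not Liu's), the fact he cites is that a wide-sense stationary input to a linear time-invariant system yields a wide-sense stationary output:]

\[
y(t) = \int h(a)\,x(t-a)\,da
\quad\Longrightarrow\quad
R_y(t_1, t_2) = \iint h(a)\,h(b)\,R_x\!\big((t_1 - t_2) - (a - b)\big)\,da\,db ,
\]

which depends on \(t_1 - t_2\) alone, so the output is again stationary. For a time-varying \(h\) the argument fails in general, except for the special class Liu mentions.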
DSP and mathematics
Nebeker:
Maybe you can explain to me something of the makeup of the signal processing community. It seems to me that a lot of it looks like applied mathematics—devising algorithms and proving their properties.
Liu:
I think devising algorithms is different from proving things about them. You will find the signal processing papers published in Circuits and Systems or Signal Processing (and ASSP, the predecessor of SP) to be far less mathematical than some of the other IEEE Transactions. They are far less theorem-and-proof. People would like to state things in theorem-proof form because they think it's elegant, but there's no need to do that. People are interested in the signal processing papers because at the end there's a use.
Nebeker:
Their eyes are fixed on the utility of the algorithm?
Liu:
Or its likely utility.
Nebeker:
But still functioning in a formal mathematical world?
Liu:
I would not call it a mathematical world. I would call it the analytic approach to these things. You have to make it quantitatively analyzable to do this. For example, if you look at the paper on design of say FIR digital filters, what is the theory behind that? It's the Chebyshev approximation theory, and that’s from some time ago.
Nebeker:
So it is more a case of the application of well known mathematics to particular problems.
Liu:
That’s my belief.
Nebeker:
So by inference, numerical analysis or applied mathematics have not made large contributions to the work in signal processing?
Liu:
I think that is a true statement. By the way, I think that numerical analysis is applied mathematics. The emphasis to me is on the word “applied” rather than on the word “mathematics.”
Nebeker:
I'm acquainted with different fields of applied mathematics, like operations research, and it seems to me that often, at least among academics, they work very much as mathematicians have traditionally functioned—that is, making assumptions, proving theorems, demonstrating the quality of approximations. You are telling me this is not a typical activity in signal processing?
Liu:
I think the central thrust of signal processing is not mathematical development, but rather specific potential uses. Let me fall back a little bit. I think there are topics in signal processing that have no counterpart in the analog world, at least not obviously. For example, decimation and interpolation—so-called multi-rate signal processing—is a very interesting problem that has no obvious analog counterpart, although you can stretch it.
Signal processing applications and theory, 1970s
Nebeker:
I’m remembering someone—I think maybe it was Don Johnson—who wrote in a recent Signal Processing article that in the early years, digital signal processing was developed from ideas outward, whereas in recent years, it has been more application driven. Data compression has stimulated some work on algorithms, rather than the opposite. Does that make sense?
Liu:
I would put it slightly differently. I think signal processing has always been application-driven, but now we are seeing the fruits of it. Around 1980, the first programmable signal processor came out, so you were able to do that. I can give you an example of how some of the, quote, "theory" does not apply: FIR digital filters have symmetric coefficients. If you manipulate the equation a little bit, you will find you can cut the number of multiplications in half. The number of additions stays the same. You say, "Great, I can reduce my computations." On the other hand, if you put this digital filter on a signal processor, the processor has just one multiplier and one adder. So if you cut down the number of multiplications without cutting additions, you end up with the same thing, and because you have to do this extra manipulation, you end up losing time. Now, do you call this theory? I don't. I think it's just a question of implementation.
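[Editorial note: the coefficient-symmetry manipulation Liu refers to, written out in the editor's notation. For a linear-phase FIR filter of even length \(N\) with \(h[k] = h[N-1-k]\):]

\[
y[n] = \sum_{k=0}^{N-1} h[k]\,x[n-k]
     = \sum_{k=0}^{N/2-1} h[k]\,\big(x[n-k] + x[n-(N-1-k)]\big),
\]

so only \(N/2\) multiplications are needed instead of \(N\), while the number of additions stays the same—a saving in dedicated hardware but, as Liu notes, no saving on a processor with one multiplier and one adder.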
So if I may go back to the very beginning, the various methods—the Z transform, the numerical analysis, and in the ‘60s, the digital filters and the FFT—these do not really have any reason why they should be put together, but people are interested in them together. This group of people who publicized it really did a service to the community, and they eventually took over the Audio Society and moved it in that direction. I think around 1970, the circuit theory group changed to the Circuits and Systems Society. At that time, I proposed to some people who have the power to do things that signals really should be included in the Circuits and Systems Society. But signals was a no-no at that time in that group of people, even though you find people doing digital filter design, or FFT (look at my papers published at both places). I think after the digital filter people knew how to design digital filters, they knew the basic algorithms. The next step was really the implementation, and that’s a hardware issue. The programmable signal processor was really a major step to make the thing work.
DSP chips
Nebeker:
That was around 1980?
Liu:
Yes. A little bit earlier, there was I think the Rockwell 6502 or something of the sort. The Intel 2920, I think, was also earlier. It was very primitive, and you could not do multiplies; you did shift-and-add instead. The fully programmable signal processors were the NEC parts and the Bell Labs device, and later on, TI came in.
Nebeker:
What happened when these chips became available?
Liu:
People underestimated how difficult it was to program them. They did not realize that you need a whole support structure to do it. It's not that you can plug it in, twiddle a few knobs, and it works. Assembly language programming was not widely taught, and there were no compilers. You could not write in a high-level language and then compile; that came much later. There were some transitions to go through. I think something like the Intel 2920 was actually very useful for a while, but the story I heard is that they did not realize that each of these products required multi-million-dollar support. So some of them failed. I think Texas Instruments succeeded because of this. Once the chip was out, they had a development board, an interface board; they had a whole system you could use. That was in the mid-'80s—it took roughly five years for these to become useful.
IEEE Circuits and Systems Society
Nebeker:
I'm interested in your suggestion that organizationally, the IEEE missed a chance to combine the Circuits and Systems Society with the Audio Society. What I find remarkable about the Professional Group on Audio was how quickly it grew and took up new directions. There was quite a transformation of the society in the 1960s. In '62, the fiftieth anniversary of the Proceedings, they had this huge anniversary volume, and all seven of the PGA chapters were on audio—mainly magnetic recording. And yet, a decade later, the formerly audio-only group had embraced signal processing and become the Audio and Electroacoustics Group. But you're saying that Circuits and Systems should have been incorporated into that branch as well, instead of being a separate society?
Liu:
I was on the board of the IEEE for a little while, and it was always a cantankerous issue. To some extent, it does not matter, as long as the big umbrella of the IEEE covers everything and nothing falls through the cracks. There are, however, problems. As a technology matures, new pursuits emerge within its professional society. The people involved are usually dynamic types who want to push forward, and the typical path is for them to form a new society, or their interests rub off and come to permeate the old group. Some traditional members are very concerned about whether a change in direction is healthy, so how does the whole IEEE address this sort of thing? That's the question. Well, it really doesn't matter where a technology is grouped, so long as it is included somewhere, even with some overlapping.
Nebeker:
What is your personal history with the IRE and IEEE? When did you join, and which groups?
Liu:
I was always in the Professional Group on Circuit Theory. In fact, I am studying the history of it a little. I’ve been drafted to give a keynote speech at a Circuits and Systems symposium in Hong Kong in June. I don’t know what to say yet, but anyway.
Nebeker:
We may have some archives that could help you.
Liu:
I really want to look into them, particularly the first issues of Circuits and Systems. I remember as a student I subscribed to that. There was one issue on sequential circuits, where you would say, “Gosh, those are computer society things.” There was also the issue on graph theory, the one on active circuit design, and a whole bunch of things of that sort.
But back to my own history. When I worked at Bell Labs, my work shifted more to communications. When I came to Princeton, I was still a member of the circuits, communications, and information theory societies. My choice of memberships really has to do with where I publish papers. After getting involved with digital filters, then audio and electrical acoustics, I did some work on HDTV. I became a member of the Broadcast Society, and that sort of thing. It changes. Do you have a follow-up question?
Nebeker:
I’m wondering about this relationship between the Circuits and Systems and Signal Processing Societies. How do you see that evolving over time?
Liu:
You find a lot of people active in both. At one point, I was simultaneously an AdCom member (now it's called a Board of Governors) of both societies. I think it's very useful to bring the perspective of one society to another. When I was nominated there were serious objections over whether I would be loyal to one society or the other. I believe I have not betrayed the trust of either. I was the Technical Program Chairman for the Circuits and Systems symposium in New York in '78. It's interesting, because Jim Kaiser was supposed to be Technical Program Chairman, and most people would associate him more with signal processing. But he was too busy and would only help, so somehow I got to do it. Then I became a vice president of the Circuits and Systems Society and later president. I served on the IEEE Board of Directors for two years, including the centennial year. I thought it would be fun to do it in the centennial year, put on a penguin suit and get marched here and there.
Nebeker:
I read your comments about the many meetings.
Liu:
I keep on reminding my fellow directors that an IBM Board of Directors meeting typically lasts less than an hour, or so I'm told. Anyway, I’m active in both societies.
Nebeker:
Is there a turf battle there?
Liu:
Not really.
Nebeker:
Aren't there papers that both societies would like to publish?
Liu:
Oh yes, a lot of papers like that. Dave Munson was vice president of the Circuits and Systems Society and became vice president of the Signal Processing Society. I have students actively associated with both societies.
Nebeker:
An outsider might say there should be a merger or some sort of realignment.
Liu:
There's overlap, but also things that are very different. Since the early '70s there have been switched-capacitor filters, a typical circuits problem, so I probably wouldn't want to touch it. But probably the Circuits and Systems Society wouldn't want to touch audio and speech.
Nebeker:
It also seems, again to an outsider, that there are some unusual topics in the Signal Processing Society: ultrasonics, underwater sound.
Liu:
That’s a remnant of the audio interest. Underwater sound can fit into several things. You’ll find papers published by one Transactions that could go to three or four other places.
Nebeker:
Do you think things are working reasonably well, or are structural changes called for?
Liu:
If we lived in an ideal world, I'd think it should be reorganized. But we do not, and there are a lot of personalities involved. The IEEE is a volunteer organization. It's easy to create something, but much more difficult to make something die.
Nebeker:
Are there clashes between Circuits and Systems and the Signal Processing societies?
Liu:
I don’t believe so. For example, the society meetings the year before last at Atlanta were coordinated to be back-to-back.
Nebeker:
Are there complaints from engineers that, “I’ve got to join two societies to get the publications I need?”
Liu:
I don't know. I think there are some, and that's another one of my pet peeves. I think engineers simply do not do enough to keep themselves up to date. They do not want to be challenged intellectually. Joining a society costs something like ten or twenty dollars. What's twenty dollars to an engineer? Learning is a lifelong endeavor. I think, if anything, this is a serious failure in education: by the time an engineer graduates, we have not convinced him to keep learning systematically. Let me go back into the history a little bit. After the early 1980s, ASIC design was becoming important. So a lot of people were doing VLSI implementations of digital signal processing algorithms. I did some of that, and so did a lot of other people.
Nebeker:
There we have an overlap with the Computer Society.
Liu:
Yes. When we talk about turf issues, the Circuits and Systems and Computer Societies were involved in a big turf fight when I was the president of Circuits and Systems. It was because of two conferences: ICCAD and ICCD. From then on there were a lot of other applications, a lot of people. I think it's very interesting to look over the titles of papers in the Transactions and how they branch out. The Circuits and Systems and Signal Processing Societies branch out quite a bit, mostly in applications rather than the underlying algorithms. That's my view.
Video image processing, 1980s-90s
Liu:
Starting around the middle of the 1980s, my interest turned to video image processing.
Nebeker:
How did your interest turn in that direction?
Liu:
Well, from 1-D to 2-D, it’s sort of natural.
Nebeker:
So after some of the work you did earlier, you thought, “Oh, we can do a similar thing for images?”
Liu:
Yes, and no. Perhaps. I’ve forgotten how I got into it now. Initially it may have been part of that, but when you move from one-dimensional signal processing to two-dimensional, it's not just an extension. Each field has its own way of looking at things, its own emphasis, and for good reasons. You cannot simply say, “I did this here, I can pick it up and put it there.” It will not work. I should go back a little bit. In the late ‘70s, I also did some work on digital holography. You can use computers to generate holograms.
Nebeker:
You give a mathematical description of a scene to the computer?
Liu:
That’s right. If I look at it carefully, it becomes a signal quantization problem, and that’s interesting. A couple of my students followed that line of inquiry. From there, it’s natural to lead into half-toning of images.
Nebeker:
Did other people find applications for the digital holography?
Liu:
I don't believe so. I think the applications that people would like to see in holography will not happen for a long time. It is simply not easy to work with optics. I got into holography because I took a summer job at Bell Labs, and that group wanted to do digital holography. I thought that was fun, so I went there to do it. Again, I didn't do anything for them, but after I came back, I started research activities here. So Bell ought to feel proud that at least they spun off something, even though they didn't get anything directly useful from it. I feel bad about that. Then there's the binary image problem in printing, for example. So half-toning evolved from that work. The person who did that thesis was Jan P. Allebach, who is now at Purdue. I've been fortunate to have worked with really some of the best minds that there are.
Nebeker:
Has almost all of your research been done together with graduate students?
Liu:
Yes, I'll give you a list of my students. The titles of their theses will be very informative. Neil Gallagher is another person. I mention him because his thesis here had to do with optical signal processing, but after he graduated and went to Purdue, he and another student of mine, Ed Coyle, did some work on median filters. One of the fundamentals of median filtering is the concept of the root signal; Gallagher and another graduate student from here (not one of mine), Gary Wise, did that. Conceptually, that is a fundamental paper. He and Ed Coyle, who did his thesis on computer networks, also did some work on the decomposition of the median filter into something else. That's a signal processing problem that they picked out.
Back to image processing. Some of the work I did at first really shook things up. What I thought would have some impact later on was the work on motion-compensated coding, the so-called fast motion search. That paper won a best paper award for video technology—a Circuits and Systems Society award. That was in '92. Two years later, another student and I did a paper on the extraction of information from compressed video without having to fully decompress it. That won another best paper award. Right now I'm moving further and further away from signal processing per se, or video processing per se. I'm looking at issues of video libraries, intellectual property rights, digital watermarking, how you search all this new computer information, index browsing. By the way, digital watermarking will turn out to be a signal processing problem, in a very disguised way.
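[Editorial note: a generic coarse-to-fine block-matching search, sketched only to illustrate the kind of fast motion search Liu refers to; it is not the algorithm from his paper. Instead of testing every displacement, it tests a sparse pattern and halves the step around the best match.]

```python
import numpy as np

def sad(cur, ref, y, x, dy, dx, B):
    """Sum of absolute differences between a BxB block of `cur` and a
    displaced BxB block of `ref` (assumes indices stay inside the frame)."""
    return np.abs(cur[y:y+B, x:x+B].astype(float)
                  - ref[y+dy:y+dy+B, x+dx:x+dx+B].astype(float)).sum()

def three_step_search(cur, ref, y, x, B=16, step=4):
    """Estimate the (dy, dx) motion vector of one block, coarse to fine."""
    best = (0, 0)
    while step >= 1:
        cands = [(best[0] + sy * step, best[1] + sx * step)
                 for sy in (-1, 0, 1) for sx in (-1, 0, 1)]
        best = min(cands, key=lambda d: sad(cur, ref, y, x, d[0], d[1], B))
        step //= 2
    return best

# Smooth synthetic frame shifted by a known amount.
yy, xx = np.mgrid[0:64, 0:64]
ref = np.sin(xx / 5.0) + np.cos(yy / 7.0)
cur = np.roll(ref, (2, -3), axis=(0, 1))    # block in cur matches ref at (-2, 3)
print(three_step_search(cur, ref, 24, 24))  # expected: (-2, 3)
```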
Influential signal processing texts
Liu:
I mentioned the Allerton conferences. One of the earlier collections is by Alan Oppenheim. He collected a number of papers on this. This one here was published by MIT Press in 1969. Ken Steiglitz has a paper in there, as do Gold and Rader, Oppenheim, and Jim Kaiser. The paper on round-off error that Kaneko and I did is in there. Cooley and Tukey are in here. Another thing I thought would be interesting is this book, again from '69. This I would say is the earliest textbook on digital signal processing; it's by Gold and Rader.
Nebeker:
That one is famous.
Liu:
Let me also give you another volume. This is Benchmark Papers, Volume 12, Digital Filters and the Fast Fourier Transform, 1975. It collects what I considered to be the reasonably important papers from that time.
Nebeker:
If we could look for just a minute at some of the things we put down in the History of the Signal Processing Society. In 1967 or ’68 was the first Arden House Workshop. If we look at the way that we’ve periodized signal processing, we have this new field by the early ‘70s, although it’s true that mid-‘60s on, there was activity in digital filters and FFT. But for the larger engineering community, do you think it’s fair to say that signal processing was emerging as a recognizable field in the early ‘70s?
Liu:
Yes. Before 1970, a lot of people had not heard of the FFT or digital filters. Talk to McClellan—in 1973 he published "A Unified Approach to the Design of Optimum FIR, Linear Phase Digital Filters." That was in the IEEE publication on circuit theory.
Nebeker:
A related question. The IEEE history center is trying to identify the big topics for each of the periods. Where would you put image processing? When did it become an area where a lot of people were working?
Image processing as a field
Liu:
I don’t know if I can give you a precise answer. Image processing is an important field not because it has such deep theory, but because it is useful, and because we have the technology to do it. Look at things like medical imaging—tomography, ultrasound—things along that line.
Nebeker:
That is a side of image processing where they had to create an image in unusual ways.
Liu:
- Audio File
- MP3 Audio
(333 - liu - clip 6.mp3)
Yes, once you create an image you try to extract information from it. From that point of view, another field that looks into that is really pattern recognition. The Transactions on Pattern Analysis and Machine Intelligence, computer vision—all of these have to do with image processing. And now there is video.
You asked when a lot of people got into image processing. The progression was similar to a lot of things we have been talking about. You cannot do image processing on paper. A purely objective criterion, such as signal-to-noise ratio, doesn’t always make sense—you really have to look at the image. Doing that is difficult because you need the facilities.
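Part of the appeal, and the danger, of an objective criterion is that it is trivially cheap to compute compared with looking at images. A minimal sketch of peak signal-to-noise ratio, a standard objective criterion for images, assuming 8-bit data (my illustration, not from the interview):

```python
import numpy as np

def psnr(original, processed, peak=255.0):
    """Peak signal-to-noise ratio in dB. Two images with the same
    PSNR can look very different, which is Liu's point."""
    mse = np.mean((np.asarray(original, float) - np.asarray(processed, float)) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```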
This gets into another one of my pet peeves. At a private university like Princeton, it takes $40,000 to support a graduate student for one year, and that’s bare bones. It takes four to five years to get a Ph.D., costing $150,000 to $200,000. It takes less than $5,000 nowadays to get a reasonably powerful workstation, and that should last the life of a student's studies. It was not too many years ago that the ratio was the other way around.
Before reasonably priced color computer monitors became available, it was impossible for a lot of people to do image processing. When the Sun 3/60 with a color monitor came out about ten years ago, I would bet that from then on a lot of papers on image processing started popping up. Once you have the ability to do the processing and display the result, everybody jumps in, and there’s bound to be some good work popping out. Video is another example. Digital video is a fairly new game; it was prompted by digital HDTV, which has an interesting history. Most people could not afford to do video work, even after color workstations became popular, because the memory requirements were too high. Right now it costs fifty dollars or so for a gigabyte of disk. When a graduate student starts to do video work, his or her demand for disk space skyrockets. Now that disks are cheap and processing power is increasing, people are doing a lot of video processing. The tools and facilities are now available.
Nebeker:
So it’s not only that we have some emerging application areas, like HDTV and video disks and so on, but you have the facilities to do it at the research end. That’s very interesting.
Liu:
Look back at the early video work—Bell Labs had people doing it way, way back.
Nebeker:
Picture phones!
Liu:
By what we know now it was very primitive, although that’s not surprising. There were very few people who could afford to do that work. The amount of equipment! Frank Moss built a frame buffer. I can’t imagine how many resources he needed to build it, but now it’s trivial.
Nebeker:
I’m very glad to get your comments on that. We want to explain why in the ‘80s image processing became a big field and attracted a lot of research. It’s nice to have some explanation.
Liu:
I caution you—you get a pretty biased opinion from me.
Computers
Nebeker:
What about workstations? They were affordable in the ’80s, and obviously integrated circuits changed things drastically. Even in the late ’50s and ’60s, laboratories were getting digital computers and then minicomputers.
Liu:
Minicomputers came in the ’60s. They were very, very costly. We were thinking of getting one, but it was simply too much.
Nebeker:
The electrical engineering department at Princeton?
Liu:
Yes.
Nebeker:
When did you get your first computer?
Liu:
Very late; I'd have to think about when. Let's see, Dick Lipton in computer science joined the department along with the 750 UNIX machine. That was a workhorse. We must have gotten our first in the early '80s.
Nebeker:
That late?
Liu:
But the university had one.
Nebeker:
Yes, you had access to the big ones. I know it was in the late ’60s that the research department at Bell Labs started getting minicomputers, and people said that had a big influence on their research.
Liu:
Yes, Larry and Jim Flanagan were looking at the inflection of speech and trying to reproduce it.
Charge-coupled devices
Nebeker:
In your own work, have charge-coupled devices or surface acoustic-wave devices been important? They appear around 1970 on the chronology I’m familiar with.
Liu:
I think charge-coupled devices are really a technology that competes with digital filters. For imaging, CCDs are a different story. But for filtering, it is very much a competing technology, and it simply cannot compete, just as analog computers could not compete with digital computers.
Nebeker:
Explain that a little more; how were digital filters and CCDs competing?
Liu:
They are both FIR filters, finite-duration impulse response filters. To digress a little, I think competing technologies are very interesting. Almost two years ago I visited a friend who is a computer scientist. In his office I saw three computers: a UNIX workstation, a PC, and a Mac. I asked, “What about the future?” Between the Mac and the PC, he thinks the Mac is going to die; the PC wave is just too strong. Between the UNIX workstation and the PC it is more interesting, but he believes the PC will ultimately win the war. UNIX workstations will not be there anymore. I know of research labs that have changed their platform from UNIX to PC.
I think the economic force is very real in this. I don’t mean that there are no niches for CCD filters. There are. And there is certainly a niche for surface acoustic-wave (SAW) filters, particularly in some frequency ranges. As technology moves forward, the boundaries between these keep changing. A simple example: we usually think of the front end of a radio receiver as an analog filter, but it’s moving into digital now—the hardware can operate that fast.
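Whether realized as a CCD analog delay line or in digital hardware, an FIR filter computes the same finite convolution: each output sample is a weighted sum of the most recent input samples. A minimal digital sketch; the moving-average coefficients and test signal are illustrative only:

```python
import numpy as np

def fir_filter(x, h):
    """FIR (finite-duration impulse response) filter: each output is
    a weighted sum of the len(h) most recent input samples."""
    y = np.zeros(len(x))
    for n in range(len(x)):
        for k, hk in enumerate(h):
            if n - k >= 0:
                y[n] += hk * x[n - k]
    return y

h = np.ones(4) / 4                            # 4-tap moving average
x = np.sin(2 * np.pi * 0.05 * np.arange(64))  # test sinusoid
print(fir_filter(x, h)[:8])
```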
Expansion of DSP field
Nebeker:
Isn't that a general trend, that digital signal processing is expanding its domain?
Liu:
Yes. People ask me, “What are you working on?” I say, “I’m doing digital signal processing,” and I say it with such pride, and complacency. It’s a little bit of the herd instinct: I belong to the blue tribe, which is better than the others. It’s almost like computers. Computers are now so prevalent; they are almost everywhere. You say, “I work on computers,” and it doesn’t give that much information. Signal processing is getting to the point where it will be almost everywhere. It’s probably unnecessary to toot the horn for signal processing—it’s proven!
Nebeker:
But still, when people look back on this time a hundred years from now, surely it will be a noticeable trend that digital signal processing expanded its realm. Where signals used to be processed in analog form, more and more we're seeing A-to-D conversion at the front end, everything digital.
Liu:
If you say digital signal processing will be more prevalent as times goes on, the answer is, yes.
Nebeker:
It’s not only that it’s taking over old territory, but there’s just more information processing of all sorts.
Liu:
For example, pattern recognition, speech recognition. Is that really signal processing? I don’t believe it is. If you let each of these terms take on a broader meaning, then it’s questionable why you would want to call it that. Taken in the narrow meaning, you have a signal and you manipulate it; that’s signal processing. But I think speech recognition is very different from that.
Nebeker:
Can you give a narrow definition of signal processing?
Liu:
I think it is simply the manipulation of digital signals, without referring too much to the ultimate goal of the manipulation.
Nebeker:
You want to rule out some types of information processing where you’re doing computations, and I assume you want to rule out pattern recognition.
Liu:
It's easy to say what it is not, but it's hard to say what it is, to give a positive definition in just a few words.
Nebeker:
Maybe we can stumble onto something reasonable. You create a signal, and you want to get the information in that signal somewhere else. For reasons of economy or whatever, you’ve got to do something to that signal to preserve it in time, or send it across space.
Liu:
I think that’s a very good point, but I would not use the phrase “information contained in the signal.” I would put the emphasis on attributes of the signal, rather than the meaning of the signal. The meaning of the signal we get through the information.
Nebeker:
So what you want to do is convey or preserve certain attributes of a signal.
Liu:
Or enhance or use them. But even making use of them begins to touch upon something else. In some ways, the signal processor is someone who is told to “do this” with the signal but does not ask why we want to do that to the signal. For example, medical diagnosis uses a lot of signal processing techniques: it enhances the contrast, extracts the edges, finds segments and different textures. But you do not question what each texture represents. The texture is an attribute; edges are attributes, and so are the contrast and colors. It has more to do with the signals themselves than with the meaning of a signal.
Nebeker:
I was interested to see the Princeton electrical engineering web site where it describes your areas of work. It has digital signal processing early on the list and image processing toward the end. Is one a subset of the other, or something different? Do you think of image processing as a specific type of signal processing, or does the term "signal processing" mean just one-dimensional content?
Liu:
Yes. Well, let me think. It may be the general interpretation of the word. If I tell one hundred people in the field, "I’m working on digital signal processing," most will not infer that I work on images. So it has its historical factors.
Nebeker:
Maybe even more so today, since the Transactions on Image Processing has split off from the Transactions on Signal Processing.
Liu:
Yes, it’s a chicken-egg question. If you say, “I’m working on image processing,” most people will not include video or video coding. Why is that? Is that logical? Probably not. But that’s a fact.
Biomedicine and signal processing
Nebeker:
From your perspective, when did the biomedical field begin to overlap significantly with signal processing?
Liu:
Is the emphasis on medical?
Nebeker:
I guess so. What I’m thinking of is this type of signal processing that the Biomedical Engineering Society would also be interested in—computerized tomography and ultrasound. That’s a branch of signal processing today.
Liu:
Biomedical imaging is certainly very close to image processing. Biomedical signal processing could also include something such as prosthetics. You use computer control for that. The artificial larynx—you can think of a lot of useful possibilities for that.
Nebeker:
What’s been the history of that? Has the artificial larynx always been in the Signal Processing Society? I know Flanagan was involved with it.
Liu:
If you talk about the mainstream of the society, no. Now, the mainstream is not even audio.
Nebeker:
But are a lot of people who are doing medical imaging members of the Signal Processing Society?
Liu:
Yes. Someone who works on medical imaging most likely belongs to both the medical society and the Signal Processing Society.
Nebeker:
Is this something that sort of happened in a rush because of some new technologies, or is it something that’s always been there, just gradually developing?
Liu:
It’s a little bit of both. A lot of it has to do with human nature. Throughout IEEE history, have you ever seen two societies merge into one?
Nebeker:
No.
Liu:
Have you ever seen a society die?
Nebeker:
Let me rephrase the question. Most of the speech processing people came out of the acoustical society; the one field emerged directly out of the other. Did a similar thing happen with medical imaging? Did a lot of signal processing people suddenly turn to medical work?
Liu:
I don't think so. You meet people who do technical work and gain respect in the field, and you meet people who spend a lot of time organizing a society. The transformation of the Audio Society has very few parallels. At that time it was a small society; it’s easier to effect a coup.
Nebeker:
Thanks so much for the interview.