# Oral-History:James W. Cooley

## About James W. Cooley

Dr. James Cooley was born in 1926. He received a bachelor of arts degree from Manhattan College in 1949. He then earned his M.A. in mathematics and Ph.D. in applied mathematics from Columbia University in 1951 and 1961, respectively. Cooley is a pioneer in the digital signal processing field, having developed the fast Fourier transform (FFT), which has been used in atmospheric studies and to analyze signals sent to the earth from outer space. He developed the FFT through mathematical theory and applications, and has helped make it more widely available by devising algorithms for scientific and engineering applications. Cooley was a member of IEEE's Digital Signal Processing Committee from 1965 to 1979, and helped plan the Arden House conferences on digital signal processing. He has also been actively involved in other committees of the IEEE Acoustics, Speech, and Signal Processing (ASSP) Society, and from 1992 to 1993 he was Associate Editor for the IEEE Transactions on ASSP. His professional awards include the ASSP Meritorious Service Award in 1980, the ASSP Society Award in 1984, and the IEEE Centennial Medal in 1984. Cooley became an IEEE Fellow in 1981 [Fellow award for the development of the FFT].

The interview focuses almost wholly upon Cooley's career, particularly his development of the fast Fourier transform (FFT). He describes his early interest in digital signal processing and covariance problems, and explains how this led to his work with FFT algorithms. He discusses his work on the IEEE Digital Signal Processing Committee as well as various uses for the FFT. Cooley recalls his work to make the FFT available to increasing numbers of scientists through developing appropriate computer software. The interview closes with Cooley's thoughts on the changing interests of the Digital Signal Processing Committee and its predecessors.

Other interviews detailing the emergence of the digital signal processing field include Ben Gold Oral History, Wolfgang Mecklenbräuker Oral History, Russell Mersereau Oral History, Alan Oppenheim Oral History, Lawrence Rabiner Oral History, Charles Rader Oral History, Ron Schafer Oral History, and Tom Parks Oral History.

## About the Interview

JAMES W. COOLEY: An Interview Conducted by Andrew Goldstein, Center for the History of Electrical Engineering, 11 March 1997

Interview #327 for the Center for the History of Electrical Engineering, The Institute of Electrical and Electronics Engineers, Inc., and Rutgers, The State University of New Jersey

## Copyright Statement

This manuscript is being made available for research purposes only. All literary rights in the manuscript, including the right to publish, are reserved to the IEEE History Center. No part of the manuscript may be quoted for publication without the written permission of the Director of IEEE History Center.

Requests for permission to quote for publication should be addressed to the IEEE History Center Oral History Program, Rutgers - the State University, 39 Union Street, New Brunswick, NJ 08901-8538 USA. They should include identification of the specific passages to be quoted, anticipated use of the passages, and identification of the user.

It is recommended that this oral history be cited as follows:

James W. Cooley, an oral history conducted in 1997 by Andrew Goldstein, IEEE History Center, Rutgers University, New Brunswick, NJ, USA.

## Interview

INTERVIEW: James W. Cooley

INTERVIEWER: Andrew Goldstein

DATE: 11 March 1997

PLACE: Yorktown Heights, NY

### FFT and numerical analysis

**Goldstein:**

Could you tell me about how you got interested in convolution and covariance problems?

**Cooley:**

Yes, it was about ways of applying the FFT to problems in numerical analysis.

**Goldstein:**

For example?

**Cooley:**

I wasn't working on any specific application project, but I worked on the algorithms and methods for using the fast Fourier transform (FFT) to do convolution calculations. I also worked on factorizations of the FFT and some refinements of the bit reversal and the mapping algorithms. Lots of people were working on FFTs back then, and I worked on getting better programs. But after a while, with so many people working on programs and the best ones getting wide circulation, I felt there was little point in going further. Still, the FFT work got me invited to nice places and brought a few awards—there's nothing like that to stimulate one's interest. I got invited to the DSP committee, got into IEEE work and conferences, and traveled and met a lot of people.

### Digital signal processing community, 1960s

**Goldstein:**

Were you very familiar with the applications of the math in signal processing? How well connected to engineers, such as in the DSP committee, were you?

**Cooley:**

I wasn't connected at all until I got involved in the fast Fourier transform. An important part of my education was that people always thought I was in digital signal processing and knew a lot about it, but I wasn't. So in the beginning, when programs weren't available, people would call asking for programs, and I got to talk to them. I learned from them while they thought that they were learning from me. It was a good way to get started in digital signal processing.

**Goldstein:**

So you came to signal processing from square one in the mid-'60s. How did you see the field? Where was it going?

**Cooley:**

I had very little appreciation of the field until I got involved with the DSP committee and saw things Tom Stockham was doing, and also Charlie Rader, and Larry Rabiner at Bell Labs—if I start naming names I know I'm going to slight some important people. I got to meet most of the important people in digital signal processing at the time. From the DSP committee and the conferences, I became aware of most of the good work going on. Now they are all way ahead of me; I haven't stayed in much contact with this in recent years.

### FFT in mathematics and engineering

**Goldstein:**

It sounds almost like two separate worlds: mathematical and engineering. Did the FFT develop independently in these two worlds, or was there a lot of communication? For example, the engineer Charlie Rader translated a mathematical algorithm so that it was accessible to engineers. Did that algorithm take on a life and development of its own once it entered the engineering world, or did the innovation continue to come from the math side?

**Cooley:**

Well, there are several kinds of innovations. There were lots of innovations in engineering because engineers had great incentives: the computers were getting big and people were building special-purpose computers. In the mathematical world, Ephraim Feig, whom you just met, and his professor Louis Auslander did a lot of work in pure mathematics, i.e. group theory. They looked back into some theorems in group theory that had the FFT structure. From this they developed important new applications in new fields, such as crystallography. In crystallography, you have to describe crystals by their group properties—the groups of transformations which leave them invariant. One of the big problems in crystallography is the computation of the Fourier transform. It's a large three-dimensional Fourier transform of data from a crystal—a calculation which would be really huge. But you can use the symmetries, so they developed some very good Fourier transforms that used the symmetry of the data of crystals. So I took the group theory methods from the mathematical side and combined them with numerical methods, and developed new and better Fourier transform algorithms for data with symmetry.

**Goldstein:**

You say engineers had great incentive to work on it, what were some of the advances that came from that side?

**Cooley:**

They are so numerous they could fill many books. One big field was Fourier spectrometry. I really got to appreciate that shortly after the FFT came out. I had visitors from France who flew all the way over here to find out about the FFT. One was Janine Connes, who was head of the computing lab for the University of Paris. Her husband was Pierre Connes, a very important astrophysicist who was studying the atmospheric composition of the planets. His method was to pass infrared light through a spectrometer (like the old Michelson-Morley experiment). This splits the light beam apart, brings the beams back together, and gives an interference pattern. As you remember from school, you can determine the frequency by counting the peaks in the reinforcement pattern.

The important thing here is that the pattern can get very complicated; it's more than just counting the peaks and their separation. So you do its Fourier transform. This is actually a correlation function, so you use the Fourier transform in the same way as for the covariance in statistics. You do the Fourier transform of the interferogram to get the spectrum of the reflected light from the planet. That spectrum will have a set of peaks characteristic of each diatomic molecule—each corresponding to a certain vibrational frequency in the electromagnetic spectrum.
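The procedure Cooley describes (take the FFT of the interferogram, then read the molecular lines off the peaks of the resulting spectrum) can be sketched with a toy example. The record length, line positions, and amplitudes below are illustrative assumptions, not values from the interview:

```python
import numpy as np

# A toy interferogram: two spectral lines, at 5 and 12 cycles per record,
# sampled at 256 points. (All numbers are made up for illustration.)
n = np.arange(256)
interferogram = (np.cos(2 * np.pi * 5 * n / 256)
                 + 0.5 * np.cos(2 * np.pi * 12 * n / 256))

# The Fourier transform of the interferogram is the spectrum; its peaks
# sit at the frequencies of the lines that produced the interference.
spectrum = np.abs(np.fft.rfft(interferogram))
peaks = np.argsort(spectrum)[-2:]          # bins of the two largest peaks
print(sorted(int(p) for p in peaks))       # [5, 12]
```

A real interferogram is far messier (finite path difference, noise, apodization), but the core step is exactly this one transform.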

I gave Janine Connes some programs and described the methods. They went back and reprogrammed their calculations and produced a huge book with the spectra of all of the planets. It's now in a little display case upstairs. They invited me to a conference in Vienna on Fourier spectroscopy. They had a lot of talks about its applications and a big manufacturer's display of machines that analyzed emission spectra. With the machines, you took a specimen of something—the user could be a physicist, a policeman, textile processor, anything at all—put the specimen in the machine and it produced an emission spectrum. It calculated the spectrum, compared it with templates of spectra for various materials, and then determined the composition of the specimen.

For several days they had people from many different careers talking about the FFT and looking at these machines. The conference celebrated three things: the experiment of Michelson-Morley, the interferometer, and the discovery that you could use the FFT on the interference spectrum. All of these machines had an FFT built into them to analyze the interference pattern. There were machines bigger than an automobile, and some like a desktop copying machine—small enough to carry in a Volkswagen. I was really impressed—it was a big field covering many, many applications.

Gauss used the method too, although I didn't know it then; Herman Goldstine found this out. Goldstine was the first Chairman of the Department of Mathematical Sciences here, and he was also Director of the Mathematics project at the Institute for Advanced Study. He was the man who hired me for my first electronic computer job. Anyway, he was writing a book on the history, and he found a paper by Gauss. It was written in classical Latin, so there weren't too many readers, but he told me about it. I couldn't read it myself, but I could see the formulas; Gauss had this idea of factoring the Fourier series, and he was using it (but of course, N wasn't very big there, either). So we wrote another history paper, and finally Heideman, a student of Don Johnson at Rice University, wrote a paper in which he described Gauss and the early FFT. He managed to get a translation of Gauss' paper. The idea was there, but it wasn't that useful because N wasn't big. However, Gauss did manage to find ways of using it. I might mention Sidney Burrus too; he did a lot of very important work on FFT algorithms.

### Digital Signal Processing Committee; Arden House conferences

**Goldstein:**

I'm interested in your work on the Digital Signal Processing Committee. What actions did the committee take while you were a part of it, and what was your role in them?

**Cooley:**

The first big action was the Arden House conference I mentioned, which brought together a lot of people interested in the FFT to explain what they were doing. They talked about the program, what was needed, and so on. It was a great stimulus to the field—a lot of good papers came out of it. There was another conference two years later; I think we called it FFT and Digital Filtering. Then there were a few more Arden House conferences on digital signal processing, and the people at these conferences decided to have an international conference, which became ICASSP. The first one was in 1976 in Philadelphia, and many of the papers were on digital signal processing, FFTs, digital filtering and so on.

**Goldstein:**

What was the goal of the workshop—to be sure that everybody understood FFT and what it could do for them?

**Cooley:**

No, the people who were invited had already been programming or using the FFT. It was so they could bring in their ideas and explain to each other how they were using it. We got the names of people to invite from the list of people who had been calling and asking for programs and article reprints and so on. I had quite a few names, and so did everybody on the committee. The important people were there—the MIT people, Tom Stockham, Charlie Rader, the Bell Labs people, Bridge Kaenel, Larry Rabiner, and of course Ron Schafer.

### FFT applications and computers

**Goldstein:**

Was the interest more in the mathematics of FFT or the applications?

**Cooley:**

I think everyone there was involved in important applications. They were interested in seeing how they could get programs to do these various tasks. Charlie Rader was into both the algorithm programming and the applications.

**Goldstein:**

Did you personally get involved in any of the applications?

**Cooley:**

I can't say I did.

**Goldstein:**

That brings us back to the issue of whether development was stimulated by the math side or applications. It sounds like you're saying from both.

**Cooley:**

Yes. Well, it just came naturally because people needed the results. Of course, when computers became larger you could do more DSP applications. For example, when Tom Stockham worked on the Caruso records, he used a big machine. I think it was a DEC mainframe machine. I said to him, "Would it be practical in the future to do Fourier transforms of real-time speech, and music?" He said, no, it wasn't practical. It may not have been practical then, but later he went on and founded Soundstream, one of the first digital signal processing companies. Now they are doing FFTs, filtering, and transforming speech and music in real time.

**Goldstein:**

This issue of practicality is an important one. When people were working on these algorithms, were there different tiers of work—one for the most advanced super-computing facilities available and others for widely-available computers? I'm wondering if some people worked on algorithms for more common, lower-level computers, while others developed higher-level algorithms that took advantage of the capacity of big machines.

**Cooley:**

Higher level meaning very large Fourier transforms?

**Goldstein:**

Yes.

**Cooley:**

The largest Fourier transforms I have heard of are at Stanford, in California, where they are used to study signals from outer space for signs of intelligent life. They sample a very, very wide spectrum, and if they want the details of a signal, they need very, very large FFTs. They have written some papers (I can't remember the names of them) on how they do factorizations of N. When N gets too large for the computer, they have to break the data into blocks.

The largest I had heard of before that was used by the astronomers in Paris who were doing infrared plotting. Again, they were looking at extremely wide spectra, yet had to have very fine detail to identify the positions of the peaks in the spectrum of the radiation from diatomic molecules.

**Goldstein:**

Here's the situation I had in mind—help me with it if this is not the best way to look at it: Say it's 1968 and you are working on FFT algorithms. You know that some computers have 40-bit words or so of memory, but there are very few of these machines and time on them is expensive. But there are some smaller computers, like PDP-8s, that are more available. Did you have different people working on different algorithms for these two levels of computing power?

**Cooley:**

Well, there are variations on the algorithm which use this idea of breaking a big Fourier transform into smaller blocks which can be processed separately on a computer. If you are able to compute Fourier transforms of only size A or size B, you break a transform of size A times B into blocks: A blocks of size B and B blocks of size A.
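The factorization Cooley describes can be sketched in Python with NumPy. This is a minimal modern illustration of the idea (the function name, variable names, and sizes are ours), not the historical code: an N-point transform with N = A*B is computed from B transforms of size A, a twiddle-factor multiplication, and A transforms of size B.

```python
import numpy as np

def blocked_fft(x, A, B):
    """N-point DFT (N = A*B) via the Cooley-Tukey factorization:
    B FFTs of size A, twiddle factors, then A FFTs of size B."""
    N = A * B
    assert len(x) == N
    # Arrange the data as an A x B matrix: element [n1, n2] = x[B*n1 + n2].
    m = np.asarray(x, dtype=complex).reshape(A, B)
    # B FFTs of size A, one down each column (index n1 -> k1).
    m = np.fft.fft(m, axis=0)
    # Twiddle factors exp(-2*pi*i * k1 * n2 / N) between the two stages.
    k1 = np.arange(A).reshape(A, 1)
    n2 = np.arange(B).reshape(1, B)
    m *= np.exp(-2j * np.pi * k1 * n2 / N)
    # A FFTs of size B, one along each row (index n2 -> k2).
    m = np.fft.fft(m, axis=1)
    # Output index is k = A*k2 + k1: read the matrix out column by column.
    return m.T.ravel()

x = np.random.default_rng(0).standard_normal(12)
print(np.allclose(blocked_fft(x, 3, 4), np.fft.fft(x)))  # True
```

Because each stage touches only one row or column at a time, the same scheme lets a small machine (or a cache level) hold just one block of the data while the rest sits on tape or disk, which is exactly the point Cooley makes next.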

**Goldstein:**

Were people at all levels of computing power able to process the signals they needed to, or was FFT really only available to the more powerful computers?

**Cooley:**

It was available to smaller computers too, by breaking the data into blocks. You could keep all your data on tapes. The smaller computer would process a block of data, put it on a tape, then bring in another block. There were some papers on that, including one in the Arden House conference. A man named Singleton designed it with two tapes containing all the data. The data would come in on one tape, then get processed by blocks and put on the other tape.

Some of these block-processing ideas were used later with cache memories. In fact, in our engineering scientific sub-routine library, a lot of the sub-routines we wrote had to make efficient use of the cache, again by breaking it into blocks that fit in the cache.

**Goldstein:**

Was breaking it into blocks a fairly straightforward problem, or was it tricky and did it require real insight?

**Cooley:**

It got very tricky. There were lots of papers written on all the tricks that could be used. Image processing has become important too, because images have a huge amount of data. The question was whether it could be done on a small computer, and it could. It would work very efficiently even on a desk calculator, if you could organize the algorithm. And Lanczos did it.

**Goldstein:**

So it sounds like an economics problem. If somebody's building a signal processing system, they decide how much computing power they can afford, and then how much pre-processing will be required.

**Cooley:**

Or you can choose hierarchical storage. You can have several levels by using high-speed memory and the disks for cache. You schedule the algorithm to use these several levels of storage, but operate at optimum speed in the high-speed memory.

**Goldstein:**

Over time, the availability of computing power has increased dramatically. Under Moore's Law, computers become cheaper and more powerful. So far as you are aware, has that hardware development influenced the direction of the FFT algorithms?

**Cooley:**

Definitely. It gave some incentive for designing these nice scheduling algorithms for breaking the FFT into blocks and scheduling the blocks of calculation. Whole papers have been written just on the scheduling of data through hierarchical storage.

### Digital Signal Processing Committee, 1970s-1980s

**Goldstein:**

How long were you involved with the Digital Signal Processing Committee?

**Cooley:**

I tapered off after the New York ICASSP conference in 1988. I was registration chairman for that, and also for ICASSP 1977 at Hartford. I went to the Toronto meeting after that, and then I started working at the University of Rhode Island.

**Goldstein:**

That's all the way into the '80s. Had the interests of the committee changed during the '70s and during the '80s?

**Cooley:**

Oh, yes. Before I got on the committee, it was writing standards for the measurement of the spectra of bursts. Then after the FFT came along, the committee was devoted almost entirely to FFT algorithms and applications. Then came digital filtering, digital speech processing, and it broadened out into the whole field of digital signal processing. In fact, it wasn't even called the DSP committee in the beginning, it was Audio and Electroacoustics.

**Goldstein:**

And it became Audio, Speech and Signal Processing.

**Cooley:**

Yes, that shows the change.

**Goldstein:**

I have a picture of the rush of activity in the 1960s, but it is less clear for the '70s. It wasn't called the DSP committee then, but do you know what they were promoting, and what were the most exciting areas of research in the '70s?

**Cooley:**

I think it was broadening out into very general areas. Speech recognition was important—people were writing very nice filtering algorithms.

One committee project was to compile a set of the best programs in digital signal processing and put them through a certain set of standards. They were all in FORTRAN, and there was a program that tested to see if they conformed to a set of basic rules. They were put through the machine at Bell Labs and then all the copies were put onto a tape available to anybody. All the best programs were put in one thick book, including digital filtering, FFTs, and filter design programs.

Later on, they decided to do it again, but it was much harder to pick the best programs because there were so many more available. So the second one became very big and it became practically impossible to get everything in.

**Goldstein:**

What impact did the availability of these algorithm books have?

**Cooley:**

I haven't heard how many people ordered it, but I'm sure they sold lots. It was extremely valuable to have. Labs anywhere could have the best programs written by the top people in the field.

**Goldstein:**

Was there any feedback from the audience?

**Cooley:**

I'm sure there was lots of feedback, but the people who were doing the book were, with their co-workers, among the chief users. They worked for places where these things were used, so feedback could come directly from them.

After that, a lot of the committee work was running these ICASSP conferences every year. That was quite a lot of work, and it dominated.

**Goldstein:**

Did the committee try to give direction to the field through the selection of papers or the programming of the sessions at the ICASSP?

**Cooley:**

I don't think they tried to give direction; I think they tried to follow what was important and what people were interested in.

### Career highlights

**Goldstein:**

We've been talking mainly about the FFT because that's what you are best known for. Is there other work of yours that is important to talk about?

**Cooley:**

No, I don't think I really did anything else that was very important. Well, let's see, I did a lot of work on nerve simulation, and some on solving the equations of quantum chemistry, including Schrödinger's equation, and I published some articles on that. I did some work on diffusion calculations with people in the lab who had various research problems—diffusion of contaminants through insulators was one. I also did some quantum mechanical calculations for lasers. What else? Towards the end of my career at IBM, I got into the group which did the engineering scientific sub-routine library for the IBM vector machine, and then later for the RISC machine, the RS/6000. Then I left to take a teaching job and go into semi-retirement.

**Goldstein:**

All right, that wraps it up.

**Cooley:**

Good, I hope it's of interest.