First-Hand: Real-time Telemetry Processing
Real-time Telemetry Processing Evolution
Submitted by Bill Rymer, 50-year member, IEEE.
At the end of WW II, rudimentary telemetry was in use for missile and air vehicle testing. Early work used FM/FM encoding with a single measurement per subcarrier. These efforts were soon followed by PAM (pulse-amplitude modulation) and PDM (pulse-duration modulation) methods that allowed acquisition of greater numbers of measurands. By the early 1960s, vacuum-tube versions of PCM "decommutators" were built for NASA. In 1968 the first successful solid-state stored-program decoms were produced by the Stellametrics Corp. in response to Navy requirements. Mainframe computers were rapidly being developed that allowed small numbers of derived measurements to be computed in real time. A hot topic in that era was "What is real-time?" My definition, which stood the test of time better than any other I found, was this: real-time processing occurs when each sample of data is processed prior to the arrival of the next sample of that measurement.
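As a minimal sketch of this definition (the sample rate and processing budget below are illustrative assumptions, not figures from any system described in this account), real-time reduces to a per-measurement deadline test: the worst-case time to process one sample must be shorter than that measurement's sampling interval.

```c
/*
 * Minimal sketch of the real-time definition above. All rates and
 * timing values are illustrative assumptions, not taken from any
 * system described in this account.
 */
#include <stdio.h>

/* Hypothetical measurement sampled at 1,000 samples per second. */
#define SAMPLE_RATE_HZ   1000.0
#define SAMPLE_PERIOD_S  (1.0 / SAMPLE_RATE_HZ)  /* 1 ms between samples */

int main(void)
{
    /* Assumed worst-case time to fully process one sample: 400 us. */
    double worst_case_processing_s = 400e-6;

    /* Real-time per this definition: each sample is fully processed
     * before the next sample of that measurement arrives. */
    if (worst_case_processing_s < SAMPLE_PERIOD_S)
        printf("Real-time: %.0f us of processing fits in a %.0f us period.\n",
               worst_case_processing_s * 1e6, SAMPLE_PERIOD_S * 1e6);
    else
        printf("Not real-time: processing overruns the sample period.\n");
    return 0;
}
```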
After the lunar lander had been tested using raw data, without real-time processing, the Grumman Corp. tasked Control Data and Astrodata with building the "Automated Telemetry Station" (ATS), aimed primarily at aircraft flight testing. ATS, delivered in 1969, was to have computer-driven graphics, real-time conversion to scaled floating-point data, and real-time writing of engineering-unit data to digital tape. In that era, a mainframe computer (CDC 6400) was housed in a sealed room and cooled by chilled water piped under the floor from a massive air-conditioning plant. Front-end equipment such as PCM decoms and engineering-units conversion hardware occupied several equipment racks. Computer-driven graphics was in its infancy, hardcopy from the CRT was difficult and expensive, and human interfaces were a challenge.
In 1972 the Xerox Corp. built the first fully successful real-time telemetry processing system (RTPS) used at DoD test ranges, for the Navy at Patuxent River, MD. In those days hardcopy from computer-driven graphics tended to be "brown and yellow and wet," with various kludges used to copy images for test engineers. Large, expensive stroke-writing CRTs were the best computer-driven graphics available and the only devices capable of real-time display in that context. During development of RTPS, Sony Trinitron TVs were adapted as displays, fed by character generators and interfaced with keyboards, to produce what amounted to user terminals.

One challenge was that even very large, high-cost mainframes were barely capable of real-time calculations such as converting raw samples to engineering-unit floating-point data, applying calibrations, and the like. In some systems developed at that time, IBM had built microinstructions in firmware for the 360/55 or 360/65 mainframes, but these tasks still encumbered large fractions of the mainframe's capacity. Memory in those days was still magnetic core, and a machine with 256,000 words of core was considered a large mainframe computer. Gigabyte disk farms were just coming online at huge expense, and the "washtubs" of disk units occupied on the order of 100 feet of linear space to achieve 1 GB. You now carry orders of magnitude more in the SD card in your camera. These large disk farms were generally not involved in, or accessed by, real-time telemetry processing systems.

For RTPS in 1972, Xerox built special-purpose computers called Programmed Algorithmic Units (PAUs) capable of converting 50,000 samples per second of raw data to single-precision floating point while applying 5th-order fits to non-linear instrumentation calibrations for around 20% of that data (a brief illustrative sketch of this conversion appears below). This technology (mainframes, special-purpose PAUs, stroke-writing graphics, PCM and FM front ends, A/D converters, strip-chart recorders, hardcopy units) was integrated for its first actual use in an aircraft flight test in March 1973. One of my senior mentors always said that "How good it is is how much it is used." Over the next decade RTPS handled in excess of 12,000 test flights involving every airframe type in the Navy inventory and several other vehicles. The basic concepts and tools were, of course, rehosted on newer and faster hardware over many generations of processors and remain in use today.
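To make that PAU workload concrete, here is a minimal sketch of the core per-sample operation: converting a raw telemetry count to single-precision engineering units through a 5th-order polynomial calibration, evaluated in Horner form. The coefficients, the 12-bit word size, and the Horner-form evaluation are illustrative assumptions; none of these values come from RTPS.

```c
/*
 * Minimal sketch of a PAU-style engineering-units conversion.
 * The coefficients, word size, and Horner-form evaluation are
 * illustrative assumptions; they are not taken from RTPS.
 */
#include <stdio.h>

/* Hypothetical 5th-order calibration coefficients c0..c5 for one measurand:
 * eu = c5*x^5 + c4*x^4 + c3*x^3 + c2*x^2 + c1*x + c0 */
static const float cal[6] = { -0.12f, 3.3e-3f, 1.1e-6f,
                              -4.0e-10f, 2.5e-13f, -6.0e-17f };

/* Convert one raw PCM count to single-precision engineering units. */
static float raw_to_eu(unsigned raw_count)
{
    float x  = (float)raw_count;
    float eu = cal[5];
    for (int i = 4; i >= 0; --i)  /* Horner's method: 5 multiply-adds */
        eu = eu * x + cal[i];
    return eu;
}

int main(void)
{
    unsigned raw = 2048;  /* e.g., mid-scale of a 12-bit PCM sample */
    printf("raw count %u -> %.4f engineering units\n", raw, raw_to_eu(raw));
    return 0;
}
```

As a rough feel for the load: at the quoted 50,000 samples per second, applying such a fit to 20% of the stream means about 10,000 polynomial evaluations, roughly 50,000 multiply-adds, every second, over and above the linear scaling of the remaining samples, which suggests why special-purpose hardware was needed in 1972.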