A Brief History of the U.S. Federal Government and Innovation (Part I): From Independence to World War I (1787 – 1917)
Introduction
In recent years, a time of tight budgets, there has been great debate in the United States about the role of federal government investment in the economy. Some political thinkers suggest that the private sector is always better than the central government at directing resources, and that when government involvement is necessary, it is preferable at the state level (with the possible exception of defense). The followers of this school of thought single out technology as an area where the free market is clearly better at picking winners and allocating funds than any centralized bureaucracy. Others, of course, disagree. We therefore thought that it would be interesting to briefly outline the history of U.S. government involvement in technological innovation, which turns out to be long, broad and deep.
These sorts of political arguments go back to the founding of the republic. The United States Constitution of 1787 represents a brilliant compromise between those who favored a strong central government with broad powers (who came to be known as the Federalist Party), and those who favored states’ rights and a narrow central government (who came to be known as the Democratic-Republicans). Part of the compromise rests in what is not said in the Constitution: it remains a framework on which subsequent generations can build a democratic government. For example, the Constitution gives the President the right to appoint (with the advice and consent of the Senate) the “principal Officers in each of the executive Departments,” but does not name the departments. Originally, there were four departments (the equivalents of what today are called State, Defense, Treasury, and Justice); now there are 15 such departments (not counting a number of “cabinet-level” organizations).
With all of the arguments about the scope of federal power, and the tendency toward non-specificity, it is interesting to note that at the Constitutional Convention, on 5 September 1787, based on earlier discussions, the Committee of Eleven proposed that Congress have the power to “promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries.” Despite all the months of bickering in other areas, less than two weeks later (17 September) this intellectual property clause passed unanimously. In his very first Presidential message, in January 1790, George Washington, who was considered above party politics, urged Congress to make it a top priority to pass a patent law that would bring the clause into practice and “give effectual encouragement…to the introduction of new and useful inventions.” The law was passed within weeks, and the first patent was issued by August. There is no doubt that the consensus of America’s founders was that the government should promote innovation. As the examples in the rest of this article and the two other articles in the series will show, this consensus was to last for the 200 years from 1787 to 1987 (25 years ago being a good cut-off for an analysis meant to be historical and not political; that is the criterion for IEEE Milestones) and contribute greatly to the welfare of America and the world.
Armory Practice
One could, of course, argue that patent law, like corporate law, exists only so that the government can be a fair referee on a neutral playing field, and that the impetus for invention still rests in the private sector. The government, however, did not stay in that position for long. First of all, there was broad political agreement that a strong defense was necessary for the young country, and in those days (as today) military success depended on advanced engineering. Originally, engineering meant what we would today call civil engineering — the design and building of roads, bridges, tunnels and fortifications — plus the science of attacking such works with artillery and other siege devices. So, in 1802 President Thomas Jefferson established the U.S. Military Academy at West Point, New York, so that military officers would not need to rely on foreign training in these matters. The Army Corps of Engineers was considered the most elite unit in the U.S. military.
Of course, military success in the 19th century came increasingly to rely on weapons technology. As early as 1794, President Washington asked Congress to designate two armories that had previously only stored munitions — one in Springfield, Massachusetts, the other at Harpers Ferry, Virginia — to manufacture weapons, so that the U.S. would not have its defense depend on foreign trade. Although the armories were part of the War Department, Congress established civilian positions there with the authority to hire subcontractors on site. In parallel, Congress initially gave grants to private entrepreneurs to develop weapons factories to improve production, most notably the $5,000 advance given to the famous inventor Eli Whitney in 1798 to build a gun factory at Hamden, Connecticut.
Beginning with Col. Roswell Lee at Springfield in 1815, however, the superintendents of the armories realized that by giving employment to inventors they could not just produce arms within the government but also push the envelope of weapons technology. In 1819 Lee brought in Thomas Blanchard, who was just beginning to establish himself as one of the great inventors of the day. Blanchard soon invented a lathe that could reliably turn out identical, interchangeable gun stocks, followed by a number of other devices that sped up and regularized production and enabled semi-skilled laborers to do the work of artisans. His counterpart at Harpers Ferry, John H. Hall, was engaged in similar work, and Lee developed management techniques to oversee the new modes of production. This “armory practice” not only gave a boost to the young country’s defense but before long spread to other industries, such as New England clockmaking, accelerating the industrial revolution. In 1854, the British Parliament sent a delegation to its former colony to investigate reports of new manufacturing techniques that surpassed anything in Europe. The delegation reported back that there was indeed a new “American System of Manufacture,” a name that stuck and a method that then spread to Britain and around the globe. This radical transformation in the global mode of production was the direct result of U.S. government funding and management, and it illustrates two key points that will be seen throughout the examples of government innovation. First, the government can be most effective when it intervenes in areas where innovation is needed but the risk for any individual entrepreneur to invest in the research and development would be too great. Second, under the proper conditions the results of this government investment can benefit society well beyond its original narrow goals.
The Morse Telegraph
While weapons were of course of paramount importance for defense, both the military and mercantile sectors of the country also relied on efficient transportation and communication. In the nineteenth century, roads and canals were generally in the hands of the states, which used a mixture of granting private monopolies and providing public funding to create a transportation network that also led to improved technologies. Technological innovation thus stayed in private hands or in the hands of the states. The building of the New York State-owned Erie Canal, for example, served as a hands-on classroom for many of the great civil engineers of the age, and when the thrust of the transportation system moved to the steam locomotive, the advances were encouraged by the same mixture of private and state funding. John Jervis, who designed the nation’s first operating railroad for the Delaware and Hudson canal system, had been trained on the Erie Canal.
Increasingly, however, interstate commerce meant that the federal government also needed to get involved in the transportation network. The invention and spread of the steamship — high tech for its day — led to the development of law that allowed the federal government to regulate not only transportation lanes but also the public safety of the machinery itself (i.e., steam engines). As with patents, however, the government was protecting the public interest and serving as a referee, not directly influencing invention. The role of “sparking” innovation, begun in the armories, was next enhanced by a new invention — the first to harness electricity — the electric telegraph.
In 1832 the young American artist Samuel F. B. Morse, along with his partners Leonard Gale and Alfred Vail, was one of several inventors around the world trying to develop electric telegraphy. He convinced the U.S. Congress that it was in the nation’s interest for this technology to be perfected, and Congress allotted $30,000 (a great sum of money in those days) to build a demonstration line between Washington, DC, and Baltimore, Maryland. Many of the technological innovations that would be incorporated into the ultimately vast national telegraph grid were developed on that initial government project — including something as basic as using telegraph poles and suspended wires rather than burying wires underground. Further government funding, culminating in the Pacific Telegraph Act of 1860, led to the spread and improvement of telegraphy.
Congress acted similarly in the case of the major inland transportation technology, the railroad. Although railroads were private affairs, the government wanted to ensure that they were interconnected into a national transportation grid. The Pacific Railroad Acts of the 1860s authorized the issuance of government bonds, gave away federal land, and established standards for the industry.
On the eve of the Civil War that temporarily divided a still young country along political lines, technology — in the form of the telegraph and the railroad — had joined it culturally and economically. After the hiatus caused by the war, the work was finished. On 10 May 1869, a ceremonial spike was driven in a track in Utah to show that a transcontinental railway system was complete and — in what some have called the first nationwide media event — the word “done” was telegraphed around the country.
Such transportation and communication systems were vital for a continent-sized nation-state (unlike many smaller European countries), and this union was the direct result of investment by the central government.
The Hollerith Machine
Another area where the U.S. government had unusual needs was in counting its citizens. The idea of counting citizens goes back to ancient times, and several European states pioneered the modern census in the 18th century. The representative democracy of the United States, however, required regular and accurate counting on an unprecedented scale. Article I, Section 2 of the Constitution called for a full enumeration every 10 years, with the first to be held within three years of the seating of the first Congress; the first census was conducted in 1790. The rapid growth of the young country posed increasing challenges, and the Census Bureau — the agency charged with carrying out the count and publishing the data — had an active research arm and developed a series of advanced techniques and devices. Then, following the 1880 census, a Census Bureau employee named Herman Hollerith, who was specifically given the task of improving mechanization, had an idea for a new way to automate the tallying based on what he had learned at the Bureau. After four more years of development work directly for the Bureau, he left and developed and patented an electric tabulating machine based on punched cards (the use of punched cards to encode data goes back to the automated looms of the beginning of that century). He was then able to establish a firm to produce his machines and lease them back to his former employer, where they were successfully employed for the 1890 census. By 1891 Canada, Norway and Austria were also leasing Hollerith machines for their census activities. The technology spread to other sectors of the economy, such as railroad fare tabulation.
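To make the principle concrete, here is a minimal modern sketch, in Python, of what Hollerith-style tabulation amounts to: each card records a respondent as a set of punched positions, and the tabulator makes a single pass over the deck, advancing one counter per hole. The card layout and field names below are invented for illustration and bear no relation to Hollerith’s actual 1890 card design.

    from collections import Counter

    # Hypothetical card layout: punched position -> meaning. (Hollerith's
    # real 1890 cards had far more positions, covering age, sex,
    # birthplace, occupation, and so on.)
    FIELDS = {0: "male", 1: "female", 2: "foreign-born", 3: "farmer"}

    def tabulate(deck):
        """One pass over the deck; each punched hole advances one counter,
        which is the essence of electric tabulation."""
        totals = Counter()
        for card in deck:  # each card is the set of its punched positions
            for position in card:
                totals[FIELDS[position]] += 1
        return totals

    # Three "respondents": a male farmer, a female, a foreign-born male farmer.
    print(tabulate([{0, 3}, {1}, {0, 2, 3}]))
    # -> male: 2, farmer: 2, female: 1, foreign-born: 1

The point of the design is that the count accumulates in a single mechanical pass, card by card, rather than requiring clerks to re-read and cross-classify the returns by hand for every question.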
The government was not through, however. The expense of the Hollerith leases in 1890 and 1900 led the Bureau’s research and development arm to devise its own machines, which improved on Hollerith’s while not infringing his patents. These Powers machines were employed in the 1910 census (James Powers later left the Bureau to form his own company). Meanwhile, Hollerith’s company struggled even as it expanded into new markets. In 1914, Thomas J. Watson joined the firm — which was to become known as IBM — and developed important new business models. Those business models, plus the technology developed for the U.S. census, began the march toward a technological revolution as important as the American System of Manufacture — the computer revolution!
Federal Research Begins
Private nonprofit organizations for the advancement of science and engineering, such as the American Philosophical Society and the American Academy of Arts and Sciences, had existed for some time (the former from before the founding of the Republic!). In 1836, the U.S. Government accepted a bequest from the wealthy British scientist James Smithson and, supplying some additional resources, in 1846 established a national museum, the Smithsonian Institution, which also carried out research in the natural sciences. Something was still missing from the national scene, however.
The U.S. Civil War was one of the earliest conflicts in which inventors consciously believed that innovative technology could turn the tide of battle. The Gatling gun, surveillance balloons, the Henry repeating rifle, and the ironclad ship all came into use. These conditions enabled a group of scientists and engineers to successfully lobby Congress to fulfill one of their long-time dreams: the establishment of a national body for advising the government on “any subject of science or art,” with a broader mandate than the machine shops of the armories. On 3 March 1863, President Abraham Lincoln signed the National Academy of Sciences into being. The NAS did not receive direct funding, but any of the branches or departments of government could supply it with appropriations to carry out specific projects. Over the years the NAS was involved in various activities, growing to the point where it spun off the National Research Council (1916) and the National Academy of Engineering (1964).
War was not the only human activity becoming increasingly technologized in the late 19th century. As many areas of basic production — including agriculture, the most basic production of all — became increasingly mechanized, the U.S. government grew concerned that not enough citizens were sufficiently versed in science and engineering to carry out the activities that were needed. In a unique nod to the republican system, Congress, in the Morrill Acts of 1862 and 1890, offered free federal land to states that would then establish schools teaching “such branches of learning as are related to agriculture and the mechanic arts.” The results of these acts came to be known as “land-grant colleges” (or “land-grant universities” or “land-grant institutions”). Immediately after the passage of the first Morrill Act in 1862, the state of Iowa accepted the Morrill provisions to expand and better fund the already existing Iowa State Agricultural College (now Iowa State University). Other states with existing agricultural and mechanical colleges followed suit. In 1863, Kansas established Kansas State Agricultural College (now Kansas State University), the first institution created from scratch under the act.
States soon recognized that the teaching of the practical arts had to go hand-in-hand with cutting-edge research in those fields. Several states established agricultural experiment stations to advance the frontiers of science-based farming, often giving grants to private colleges (the first was at Wesleyan University in Connecticut). Congress realized that greater investment was needed. The Hatch Act of 1887 provided federal funding to states to establish and run agricultural experiment stations at their land-grant institutions. As of 2008, there were 76 land-grant institutions, with at least one in every U.S. state and territory (and the District of Columbia). Over the years the research at these public universities, funded by a combination of federal, state and private resources, has been responsible for countless discoveries, inventions, innovations, and patents in every field of agriculture, engineering and science.
Medical Advances
It was not just a question of swords vs. ploughshares, however. Even the military’s interest in its soldiers extended beyond simply supplying them with weapons; a key concern was providing medical care, both while they served and afterwards, when they were veterans. As far back as 1798, Congress established a Marine Hospital Fund to provide health care for merchant seamen by contracting with various hospitals; in 1871 this was centralized into the Marine Hospital Service. In 1887, recognizing that science was revolutionizing medical care, the Marine Hospital Service established a “Hygienic Laboratory.” Within months its first staff member, Dr. Joseph J. Kinyoun, identified microscopically the bacillus that causes cholera, putting American medical science on the road to parity with that of Europe.
In 1902, Congress expanded the scope of the Marine Hospital Service in response to growing public health concerns, spinning it off from the military and renaming it the Public Health and Marine Hospital Service. This change gave the Service jurisdiction over the testing and regulation of vaccines; this is part of the government’s public-safety and level-playing-field roles. It also gave it broader funding and license to do basic medical and biological research, resulting in a great number of medical discoveries. For example, in 1914, Joseph Goldberger identified the cause of a then prevalent disease, pellagra, as a dietary deficiency (later shown to be a lack of niacin), and was therefore able to determine an inexpensive and widely available cure. The Public Health and Marine Hospital Service eventually evolved into the National Institutes of Health, probably the greatest medical research operation in the world today.
The physical sciences and engineering were not ignored in this period either. Led by Thomas Edison, private entrepreneurs established research and development laboratories to deal with the ever-increasing complexity of technological systems. There remained, however, the question of which standards these technological systems should use, and the risks if multiple systems were not compatible. Private, nonprofit associations had tried to fill the void, notably the American Institute of Electrical Engineers (AIEE, predecessor to the IEEE). The AIEE, founded in 1884, issued its first electrical standards (for copper wire) in 1893. Congress decided that the federal government needed to be involved, again at least as guarantor of a level playing field, and in 1901 established the National Bureau of Standards (NBS; today known as the National Institute of Standards and Technology). The research at NBS soon moved beyond mere refereeing, however, and resulted in a great many innovations. For example, much of the work on early aviation instrumentation was conducted at NBS.
Thus a network of federal laboratories in a wide range of fields was in place on the eve of what many consider the first modern technological war.