
Rhetoric and Reality: Comparing Obama and Romney’s Talking Points versus the Real Role that the Government Plays in Innovation


For about the last century, the role of the Federal Government in funding scientific research and development (R&D) has gone largely unchallenged. From Democratic presidents like FDR to Republicans like Richard Nixon, the U.S. Government has invested in and promoted R&D. However, with mounting budget deficits and concerns over government spending, the bipartisan consensus on government funding for R&D is starting to show cracks. For example, the 2010 reauthorization of the America COMPETES Act faced numerous procedural hurdles in the Senate, and the bill passed only after its R&D funding was scaled back. The original America COMPETES Act was a priority of then-President George W. Bush and sailed to bipartisan, nearly unanimous passage.

This issue largely stayed out of the presidential arena until it came up in the second presidential debate. President Obama, answering a question about improving the economy, rightly focused on the important role government investment in R&D plays in our present and future economy. In response, Governor Romney gave a generally correct answer: the government doesn't create jobs. What Governor Romney missed, however, and a core reason why the U.S. has the most innovative economy in the world, is that when it comes to R&D funding, the government understands it is neither the entrepreneur nor the scientist. (He did get the message right in the final debate, clarifying that he does not oppose the government's role in R&D; his objection is to the government directly investing in companies.)

This needs further explanation, so here is a quick summary of how basic government R&D eventually leads to commercialization. The U.S. Government authorizes programs that fund R&D. These programs exist throughout the government, including at the Department of Defense (DOD), Department of Agriculture (USDA), Department of Health and Human Services (HHS), Department of Commerce and others. Agencies within these departments have scientific research as part of their missions. Many are familiar to the public: the National Institutes of Health (NIH), the National Aeronautics and Space Administration (NASA) and the National Science Foundation (NSF) are well known. Others, such as the Defense Advanced Research Projects Agency (DARPA), originally the Advanced Research Projects Agency (ARPA), are less known. (The National Institute of Standards and Technology also plays an important role in promoting innovation and U.S. competitiveness by working with the private sector to facilitate technological standards and conduct essential baseline research for a wide variety of industries.)

Generally, one of two things happens. Either these agencies employ in-house scientists who use R&D money to do research within the agency, or the agencies give grants to private laboratories, universities or even private sector companies to do basic research. Sometimes this research succeeds and leads to a key discovery; other times it does not. It is what happens with the successes, however, that is at the core of America's innovation economy. Rather than the government, or our universities and labs, trying to commercialize these discoveries and figure out ways to make money off of them, the technology is licensed to private sector entities that try to develop it into commercial applications. The process is not always clean, and there are certainly challenges, but the key point is that it is a pharmaceutical company or a startup technology company, not NIH or DARPA, that takes the fruits of research and brings them to market.

Beyond funding R&D, the government often plays a second role here: purchaser of technology. The U.S. Government is, arguably, the largest purchaser of technology in the world. Between IT systems and our military, our government has been a sophisticated customer at the edge of technology for a century. Because of this, the needs of the U.S. Government have often fueled some of the biggest technological advances and disruptions in the marketplace.

The result of this system, and its effect on our economy, is profound. In December 2010, the Breakthrough Institute released “Case Studies in American Innovation,” a compilation of innovations the government has played a direct role in enabling, whether through funding or as a customer. It is a laundry list of innovations where the government played a key role, if not the key role, in the development of a technology. The list includes many classic disruptive innovations, including modern agriculture, railroads, aviation, jet engines, gas turbines, nuclear power, synthetic fuels, commercial wind power, solar power, printable solar cells and biotech. Also included are four key high-tech disruptive technologies, which are worth examining further:

1. The Internet

Most people know that the Internet was invented by ARPA (which became DARPA). The story has been told many times, but it is worth repeating. The U.S. Government, embroiled in arms and space races with the Soviet Union, was looking for a way to help computers talk to each other and share information. To that end, ARPA funded J.C.R. Licklider's research at MIT on time sharing, a system in which a central computer could be accessed from remote terminals over a telephone connection. Another government-funded institution, RAND, was working on durable communications switching and developed packet switching and the idea of a true peer-to-peer network.

With the design for the Internet in place thanks to RAND, the Air Force went to AT&T to build the infrastructure. AT&T turned it down, but a non-American government institution, the British Post Office, stepped up and, with U.S. government funding, demonstrated the viability of RAND's packet-switching, peer-to-peer design.

ARPA then took over the project, attempting to get IBM to build the computers that would support the network, but IBM declined because it saw the Internet as a disruptive threat. Instead, ARPA hired a small Massachusetts firm to build those computers and the network itself. This project, called ARPANET, became the world's first packet-switching network and eventually matured into the Internet as we know it today.
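To make the packet-switching concept concrete, here is a minimal, hypothetical Python sketch, illustrative only: real protocols add headers, checksums, routing and retransmission. The idea is that a message is split into independently deliverable packets that can arrive in any order and are reassembled by sequence number at the destination.

    import random

    def packetize(message, size=8):
        # Split a message into (sequence number, chunk) packets.
        return [(i, message[i:i + size]) for i in range(0, len(message), size)]

    def reassemble(packets):
        # Restore the original message regardless of arrival order.
        return "".join(chunk for _, chunk in sorted(packets))

    packets = packetize("Packets can take different paths through the network.")
    random.shuffle(packets)      # simulate out-of-order arrival
    print(reassemble(packets))   # prints the original message, intact

Even in this toy, the appeal to a Cold War military is visible: no single path or arrival order is critical to getting the message through.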

2. Semiconductors/Microchips

Microchips were invented by the private sector without government backing. However, it was the government, as a customer, that drove the development of the industry. The first big microchip customer was the U.S. Air Force, which bought them for the Minuteman II missile; that program helped build the microchip industry in the U.S. NASA's Apollo program accelerated disruption in the microchip industry in the 1960s, driving prices down from $1,000 per unit to $20-$30 per unit.

3. Personal Computing  

The development of the personal computer was fueled by the government. The first electronic computer was built in 1945 to crunch numbers for the Army Ballistics Research Laboratory. Later, the Army Signal Corps funded much of the early semiconductor research. Among the first purchasers of supercomputers in the 1950s were NASA, DOD, the National Center for Atmospheric Research, the U.S. Weather Bureau and the Social Security Administration.

One of the biggest disruptions in computer technology was the development of cheap methods of memory manufacturing, a process developed by the Air Force's SAGE air defense project. Another massive disruption, the computer keyboard, was also a product of SAGE.

The software that runs computers is no different. DOD-funded R&D led to early programs and computer languages. In the 1970s, DOD was responsible for over half of all academic computing research, and grants from ARPA established computer science programs at MIT, Stanford and Carnegie Mellon.

These ARPA-funded researchers became the leaders of the personal computer revolution. Many of them ended up working at Xerox's Palo Alto Research Center (PARC), where the graphical user interface and the first modern PC were invented. These developers later left PARC to create the products we know and love today, playing key roles in the success of Microsoft, IBM and Adobe.

4. Global Positioning System (GPS)

GPS has become a staple of location and mapping services on the modern smartphone, changing the way we consume, access and share information. GPS began as a government technology, run out of DOD, for tracking missile submarines and aircraft. In 1983, DOD decided to make GPS public and allow the development of commercial technologies on the system; in 1995, this became a reality.
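For a sense of what the system actually computes, here is a toy Python sketch of the core idea, not the deployed GPS algorithm (which solves a least-squares problem over four or more satellites and must also account for receiver clock error). A receiver converts measured signal travel times into ranges (range = speed of light x time) and finds the position consistent with all of them; a brute-force grid search over synthetic 2D data stands in for the real solver.

    import math

    C = 299_792_458.0  # speed of light, m/s

    # Synthetic 2D data: satellite positions (metres) and the travel
    # times a receiver at (3000, 4000) would measure. The "true"
    # position is used only to generate the measurements.
    true_pos = (3_000.0, 4_000.0)
    sats = [(0.0, 20_000_000.0),
            (15_000_000.0, 10_000_000.0),
            (-10_000_000.0, 12_000_000.0)]
    times = [math.dist(true_pos, s) / C for s in sats]

    def position_fix(times, sats, span=10_000, step=500):
        # Find the grid point whose distances to the satellites best
        # match the measured ranges (travel time * speed of light).
        best, best_err = None, float("inf")
        for x in range(-span, span + 1, step):
            for y in range(-span, span + 1, step):
                err = sum((math.dist((x, y), s) - t * C) ** 2
                          for s, t in zip(sats, times))
                if err < best_err:
                    best, best_err = (x, y), err
        return best

    print(position_fix(times, sats))  # recovers (3000, 4000)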
