Economic Benefits of the Global Positioning System (GPS)
by Ahmad Issa
1. Introduction

The Global Positioning System (GPS) is a network of monitoring stations and satellites that distributes a signal used for positioning, navigation, and timing (PNT). This signal is free, ubiquitous, reliable, accurate, and extremely precise. These attributes make GPS a platform for innovation. Originally launched for military use, in the years since it was made available for private-sector use, it has enabled innovators to develop a host of applications, services, and products, increasing efficiency, productivity, and personal enjoyment. From people driving to someplace new to multinational corporations coordinating complex logistics networks, hundreds of millions of users rely on GPS every day for navigation and positioning. Its precision timing capability supports industries as diverse as finance, electricity, mining, and telecommunications. Even the term GPS has entered the vernacular to mean one's specific location at a specific point in time.

The focus of this report is on the valuation of the economic benefits of GPS since it was first made available for private-sector use. GPS has been widely adopted by many industries, including 14 of the 16 industries deemed to be critical infrastructure ("Presidential Policy Directive," 2013). Benefits were measured relative to a counterfactual in which GPS was not available and existing PNT systems and technologies continued to be used. Setting aside for the moment questions of cost, quality, and availability: where an alternative technology could have met a particular need, our valuation approach treats the benefits of GPS for that application as negligible. Where needs could not have been met, the incremental precision and accuracy provided by GPS were critical, and benefits were quantified. Netting out the value that could have been delivered by GPS alternatives prevents gross overestimation of benefits. The important role GPS plays in the U.S.
economy has given rise to questions about service disruptions and available alternatives. In consultation with other federal agencies, the sponsor of this work, the National Institute of Standards and Technology (NIST), expanded the initial scope of this study to add a research question: what is the potential impact of a 30-day disruption of GPS service? A 30-day disruption seems unlikely; the impact of a disruption would certainly differ on Day 1 and Day 30; and devices capable of receiving GPS signals can also acquire signals from other GPS-like satellite constellations. However, from a policy and planning perspective, understanding the relative magnitude of potential impacts is important for making informed decisions about investments in back-up systems and contingency plans.

This report also charts the evolution of GPS's development, including its emergence from technologies and concepts pioneered in U.S. national laboratories and the role of different agencies and laboratories in working with different industries to take advantage of GPS's potential.

1.1 Defining Positioning, Navigation, and Timing

The ability to measure time intervals and frequencies extremely precisely is what allows GPS users to pinpoint their location anytime, anywhere in the world. The launch of Sputnik in 1957 and the resulting space race led the United States to accelerate scientific efforts deemed essential for national security and spaceflight capability, including the creation of what is today the Defense Advanced Research Projects Agency (DARPA). An important breakthrough occurred when U.S. researchers discovered they could discern the location of ground-receiving stations based on Sputnik's radio transmissions and accurately determine the satellite's orbit. That realization catalyzed research and development programs within the U.S.
national laboratories, the military, and contractors in the private sector and academia to develop satellite systems to further American geopolitical and defense interests.

The comparison of multiple timing signals allows three general applications. Position applications leverage the ability to determine the precise location of a feature or object. Navigation is the comparison of the current position to the desired position and the ability to apply the necessary course, altitude, and speed corrections to bring an object into the desired position. Timing is the ability to acquire and maintain accurate and precise time. Although position and navigation are perhaps GPS's best-known uses, they are enabled by GPS's timing attribute. The timing attribute of the GPS signal is used for precision timing services by a variety of industries. For example, the telecommunications sector relies on GPS to synchronize the flow of data and voice traffic across the network (see Section 4), financial markets use GPS to timestamp transactions for high-frequency trading (see Section 7), and electric utilities use GPS to increase the efficiency of the transmission grid (see Section 6).

1.2 How GPS Works

The current GPS infrastructure comprises three segments: space, control, and user. The space segment currently consists of more than 30 satellites in multiple orbital planes. GPS satellites complete an orbit every 12 hours. Each satellite contains four atomic clocks, a radio transmitter, and at least two antennas to communicate with ground control stations (Federal Aviation Administration [FAA], 2014a). Table 1-1.
Defining Positioning, Navigation, and Timing (PNT)

Term | Definition
Position | The ability to accurately and precisely determine one's location and orientation referenced to a standard geodetic system
Navigation | The ability to determine current and desired position (relative or absolute) and apply corrections to course, orientation, and speed at various altitudes
Timing | The ability to acquire and maintain accurate and precise time from a standard (Coordinated Universal Time, or UTC), anywhere in the world and within user-defined timeliness parameters; includes time transfer

Source: U.S. Department of Transportation.

Each satellite's transmitter broadcasts a PNT message. The message contains location, status, and a highly precise timestamp of when the message was transmitted. All GPS satellites are synchronized to Coordinated Universal Time (UTC),1 which allows messages from different satellites to be reliably compared. The U.S. Air Force operates and maintains the GPS through the Global Positioning Systems Directorate, a unit within the Space and Missile Systems Center, Air Force Space Command, at Los Angeles Air Force Base. The directorate is responsible for the acquisition, development, and production of GPS satellites, ground systems, and military user equipment. In 2004, President George W. Bush directed the establishment of an interagency board to provide "guidance and implementation actions" for space-based PNT "programs, augmentations, and activities for the U.S. national and homeland security, civil, scientific, and commercial purposes" (GPS.gov, 2004). Figure 1-1 presents the current structure.

The control segment consists of 15 global monitoring stations, including a Master Control Station located at Schriever Air Force Base in Colorado Springs. Six monitor stations are managed by the Air Force and nine by the National Geospatial-Intelligence Agency.
The master control station is responsible for the overall management of the monitoring station system. The individual monitoring stations continually check the altitude, position, speed, and operational health of the satellites and feed this information to the master control station (FAA, 2014b).
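As a quick plausibility check on the orbital figures above, Kepler's third law ties the satellites' roughly 12-hour period to their altitude. The sketch below is illustrative only; it uses the standard value of Earth's gravitational parameter and the period as quoted in the text:

```python
import math

GM = 3.986004418e14          # Earth's gravitational parameter, m^3/s^2
EARTH_RADIUS = 6_371e3       # mean Earth radius, m
T = 12 * 3600                # orbital period from the text (~12 hours), s

# Kepler's third law: T^2 = 4*pi^2 * a^3 / GM, solved for the semi-major axis a
a = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)
altitude = a - EARTH_RADIUS
```

The result, about 20,200 km, matches the commonly cited altitude of the GPS constellation (the true period is actually closer to half a sidereal day, 11 h 58 min).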
GPS relies on precise, synchronized time to provide accurate PNT. The U.S. Naval Observatory is tasked with maintaining the time and frequency standard for all Department of Defense (DoD) activities, including GPS.2 A secondary master clock is located at Schriever Air Force Base. The master clocks incorporate multiple cesium atomic clocks (described later) and hydrogen masers.

The user segment consists of GPS receivers used in myriad military, government, commercial, and civilian applications. To pinpoint location, GPS receivers use messages from a minimum of four satellites.3 By measuring and comparing the timestamps on messages from multiple satellites, the GPS receiver can determine its position in three dimensions. See Figure 1-2 for an illustration of this concept. Basic, unassisted GPS service is accurate to roughly 8 meters 95% of the time anywhere on or near the Earth's surface. However, augmentation techniques such as Assisted GPS (A-GPS), the Wide Area Augmentation System (WAAS), and Real-Time Kinematic (RTK) positioning can extract greater precision from the GPS signal and improve performance in other ways.

1.3 Analysis Scope and Objectives

To better understand the value of GPS for the private sector, the National Institute of Standards and Technology (NIST) sponsored this analysis. It had three major objectives:

- Present a detailed qualitative and quantitative analysis of the retrospective economic impacts resulting from the availability of the GPS signal for use by the private sector.
- Present a detailed qualitative and quantitative analysis of the potential economic damages resulting from a 30-day GPS outage.
- Identify and characterize federal research and technology transfer activities that supported the development and deployment of GPS.

As mentioned above, this study initially began as a retrospective analysis to estimate the economic benefits of GPS relative to other sources of position and timing information.
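Before turning to scope, the four-satellite positioning idea described in Section 1.2 can be made concrete. The sketch below is a toy model, not receiver firmware: the satellite coordinates, receiver position, and clock bias are invented values, and the solver is a plain Gauss-Newton iteration for the four unknowns (x, y, z, and the clock-bias range c·b) from four simulated pseudoranges:

```python
import math

C = 299_792_458.0  # speed of light, m/s

# Invented example geometry: four satellite positions (m), a receiver
# location near Earth's surface, and a receiver clock bias of 100 us.
SATS = [(15_600e3, 7_540e3, 20_140e3),
        (18_760e3, 2_750e3, 18_610e3),
        (17_610e3, 14_630e3, 13_480e3),
        (19_170e3, 610e3, 18_390e3)]
TRUE_POS = (1_112e3, 4_556e3, 3_223e3)
TRUE_BIAS = 1e-4  # seconds

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Pseudorange = geometric range + c * (receiver clock bias)
pseudoranges = [dist(s, TRUE_POS) + C * TRUE_BIAS for s in SATS]

def solve(pseudo, sats, iters=20):
    """Gauss-Newton for the unknowns (x, y, z, c*b) from four pseudoranges."""
    x = [0.0, 0.0, 6_371e3, 0.0]  # initial guess: a point on Earth's surface
    for _ in range(iters):
        rows, resid = [], []
        for s, p in zip(sats, pseudo):
            d = dist(s, x[:3])
            resid.append(p - (d + x[3]))
            rows.append([(x[i] - s[i]) / d for i in range(3)] + [1.0])
        # Solve the 4x4 linear system rows * dx = resid by Gauss-Jordan
        # elimination with partial pivoting, then apply the update.
        M = [rows[i] + [resid[i]] for i in range(4)]
        for col in range(4):
            piv = max(range(col, 4), key=lambda i: abs(M[i][col]))
            M[col], M[piv] = M[piv], M[col]
            for i in range(4):
                if i != col:
                    f = M[i][col] / M[col][col]
                    M[i] = [a - f * b for a, b in zip(M[i], M[col])]
        dx = [M[i][4] / M[i][i] for i in range(4)]
        x = [a + b for a, b in zip(x, dx)]
    return x

est = solve(pseudoranges, SATS)  # est[:3] -> position, est[3] -> c*b
```

With exact (noise-free) pseudoranges the iteration recovers the simulated position and clock bias almost exactly; real receivers must additionally contend with measurement noise, atmospheric delays, and satellite geometry.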
The potential impact of a 30-day outage was later added to the scope following discussions with several federal agencies about industries' reliance on GPS and the potentially significant economic impacts resulting from natural disruptions (such as solar flares) or nefarious activities by bad actors. The 30-day outage period was set by this study's sponsor in consultation with other parties within the Department of Commerce.
We selected industries whose use of GPS provides benefits that other technologies could not have delivered, or could not have delivered at the same level. Before GPS was made available for commercial use, the Loran system (reviewed later in this report) delivered a signal that was readily available, if not as robust or precise. If GPS had not been made available for commercial and civilian use, Loran likely would have been expanded over time as technological advances required greater access to more precise sources of position and timing information. Thus, not only did we measure benefits relative to a counterfactual in which other technologies would have been available, but we also included only industries that developed a reliance on the incremental precision offered by GPS.

Ten industries were included in this analysis. As we review later in the methodology section, we considered several others but determined that they did not require the precision delivered by GPS over and above what was available from other technologies. Of course, if such industries are using GPS today and an outage were to occur, they could be adversely affected; a limitation of our initial scope definition is that they are not included. This also means that the summary results for the 30-day outage scenario should be interpreted as an underestimate of likely impacts.

Because of this limitation, the Department of Homeland Security (DHS) provided additional funding for the maritime sector to be included in our 30-day outage analysis. DHS is completing a technical assessment of GPS vulnerabilities and available back-up systems and is leveraging the results of this economic analysis. GPS's simplicity and ubiquity have led to widespread use in the maritime sector. Although the benefits of GPS relative to the methods mariners previously used are modest, mariners have migrated to GPS, and the impacts of a 30-day outage could be significant.
The maritime sector is an excellent example of this study's limitations, and all benefits monetized and presented here may be underestimated as a consequence.

Table 1-2. Industry Sector Coverage

Sector | Sector-Specific Analytical Focus
Agriculture | Precision agriculture technologies and practices
Electricity | Electrical system reliability and efficiency
Finance | High-frequency trading
Location-based services | Smartphone apps and consumer devices that use location services to deliver services and experiences
Mining | Efficiency gains, cost reductions, and increased accuracy
Maritime | Navigation, port operations, and recreational boating
Oil and gas | Positioning for offshore drilling and exploration
Surveying | Productivity gains, cost reductions, and increased accuracy in surveying
Telecommunications | Improved reliability and bandwidth utilization for wireless networks
Telematics | Efficiency gains, cost reductions, and environmental benefits through improved vehicle dispatch and navigation
Our approach brought together information about the prevalence of GPS use by industry, its adoption history, industry trends, and available alternatives. We interviewed almost 200 experts in industry, academia, and government and conducted two surveys. The first survey was fielded to surveyors with the support of the National Society of Professional Surveyors. The second was fielded to a representative sample of American smartphone users to understand the extent to which they rely on their phones' location services for emergency services, navigation, games, check-ins, and other activities. Although other studies have quantified impacts associated with GPS, they were either specific to an application (e.g., precision agriculture [Schimmelpfennig, 2016]) or generalizations that relied on desk research to assess impacts (e.g., Leveson, 2015; Pham, 2011).

Note that we do not assess the geopolitical and national defense value of GPS. Our scope was civilian and commercial use, with an emphasis on the role GPS plays in meeting private-sector needs for precision PNT information. We present the results for each industry as a separate case study. How GPS adds value differs by industry, as does the method used to quantify that value. Providing a case study for each industry ensures that each industry's use is appropriately contextualized and described.
2. The History of GPS Technology: National Laboratory Innovation and Technology Transfer

The technology comprising today's GPS has progressed through multiple technology life cycles over the past 60 years. Military objectives drove the initial development of space-based navigation systems, and as such, the national laboratories, with Department of Defense (DoD) support, took the lead role in research and development of the enabling technologies. For example, research at the Naval Research Laboratory (NRL) and the Air Force Research Laboratory (AFRL) established satellite orbital planes, discovered passive-ranging techniques to account for signal delays, and developed R&D and testing procedures for critical equipment designed for space travel. Through multiple early systems, the underlying technologies and infrastructure were developed and proven, moving capabilities from two-dimensional (latitude and longitude) to three-dimensional (adding altitude) positioning. In 1973, the NAVSTAR program integrated the existing programs to produce the more accurate and robust system in use today. The network of technology developers ultimately included the national laboratories, government agencies, research universities, and private-sector contractors. Once GPS was made fully available to civilian users in the mid-1990s (with Selective Availability discontinued in 2000), the private sector took the lead in developing the technologies needed for most of today's commercial applications, building on the technology base of earlier years.

The timeline in Table 2-1 illustrates how research objectives and the roles of government and industry have evolved. This section describes how today's GPS evolved from national laboratory technologies and programs. Our focus is on noting key programs and milestones; many comprehensive histories of GPS development exist. For reference, Table 2-2 presents a timeline of significant milestones.
2.1 Project Vanguard

The International Geophysical Year (IGY)—from July 1957 to December 1958—was an international cooperative engagement to study the geophysical properties of Earth. The Naval Research Laboratory established Project Vanguard in 1955 to represent the United States in the IGY. On December 6, 1957, Project Vanguard's first satellite launch failed, ending in a near-immediate explosion at Cape Canaveral, Florida. The failed launch, dubbed "Flopnik" by the press, was carrying satellite TV3 (Test Vehicle 3), whose purpose was to conduct an orbital analysis and collect information on environmental (radiation) effects on the vehicle. Four months later, in March 1958, Project Vanguard successfully launched a replacement: Vanguard 1, the first solar-powered satellite in space. Vanguard 1 ultimately met the project's goals by collecting valuable information on Earth's physical, atmospheric, and environmental properties. Project Vanguard ended with the launch of Vanguard 3 in 1959.
Table 2-1. U.S. Satellite Navigation Systems, Programs, and Manufacturers

Program | Owner | Years Active | Key Technology Capabilities
Vanguard | U.S. Navy | 1955–1959 | Used solar cells to power radio transmitter; collected novel information about satellite orbits as well as geophysical characteristics of Earth
Transit | U.S. Navy | 1959–1996 | Established orbital patterns and predictions
System 621B | U.S. Air Force | 1963–1973 | Developed "pseudo-random noise" signal to resist jamming
Timation | U.S. Navy | 1964–1973 | Developed passive-ranging technique using high-stability clocks and a time reference for positioning
NAVSTAR GPS | U.S. Air Force, JPO | 1973–present | Installed atomic clocks onboard GPS satellites; delivered civil and military signals; ground, control, and space segments maintain timing integrity

GPS Satellite Block | Manufacturer | Launch Period | Key Technology Capabilities/Improvements
I | Rockwell International (Boeing) | 1978–1985 | Design life of 5 years; two L-band navigation signals; served as concept testing series
II | Rockwell International (Boeing) | 1989–1990 | Nuclear detection sensors; designed to operate for 14 days without contact from control segment
IIA | Rockwell International (Boeing) | 1990–1997 | Durability improvements; designed to operate for 180 days without contact from control segment; 7.5-year design lifespan
IIR | Lockheed Martin | 1997–2004 | Replacement satellites for Block II; 7.5-year design lifespan
IIR-M | Lockheed Martin | 2005–2009 | Included military signal (M-code) and new civil signal (L2C); 7.5-year design lifespan
IIF | Boeing | 2010–2011 | Included third civil signal (L5) and inertial navigation systems; 12-year design lifespan
IIIA | Lockheed Martin | 2014 onwards | Includes a fourth civil signal (L1C), higher broadcasting power, navigation enhancements, improved interoperability, and greater jamming resistance; 15-year design lifespan

Source: Whitlock, R.R. & McCaskill, T.B. (2009); Pace, S. (1995); GPS.gov (2017b).

Table 2-2.
Notable Milestones in the Development of GPS

Year | Achievement
1954 | The utility of space-based satellites is under review by various scientific agencies; a study is proposed to NSF
1955 | DoD recommends the Naval Research Laboratory Scientific Satellite Program—which became Project Vanguard
1957 | Soviet Union launches Sputnik I and II satellites. Attempt to launch Project Vanguard's first satellite (TV3) is unsuccessful
1958 | United States launches first satellite into orbit—Explorer 1—under the direction of the Army Ballistic Missile Agency. Project Vanguard successfully launches Vanguard 1 satellite
1959 | Transit satellite navigation system developed at Johns Hopkins Applied Physics Laboratory
1963 | System 621B, a navigation system developed by the Air Force, is established
1964 | Timation is established by the Naval Research Laboratory and led by Roger Easton
1968 | DoD establishes steering committee—NAVSEG (Navigation Satellite Executive Steering Group)—to coordinate satellite navigation efforts
1973 | In April, DoD further pushes for coordination, naming the Air Force to lead a new initiative called the Defense Navigation Satellite System (DNSS), overseen by the Joint Program Office (JPO) with continued NRL involvement. In December, the NAVSTAR GPS concept is approved by the Defense System Acquisition and Review Council (DSARC), and Phase 1 of the GPS program begins, intended to confirm the concept of space-based navigation
1974 | First NAVSTAR satellite—Navigation Technology Satellite (NTS)—is launched: a refurbished Timation satellite built by the NRL carrying the first atomic clock in space, a rubidium atomic standard. NRL expands cesium clock development for use on future satellites
1977 | NTS-2 satellite is launched carrying the first cesium atomic clock into space
1978 | First of 11 Block I satellites launched between 1978 and 1985
1983 | After a Korean airliner is accidentally shot down by the Soviet Union, President Reagan announces his intention to make GPS available to civilian aircraft for free once the system is operational
1989 | The U.S. Coast Guard assumes responsibility as the lead agency for the Civil GPS Service within the Department of Transportation. The first five GPS Block II satellites are launched; from 1989 to 1997, 28 satellites are launched, the last 19 being updated versions (Block IIA)
1991 | GPS sees its first combat use in the Persian Gulf War, enabling U.S. military forces to validate its usefulness in the featureless Iraqi desert
1994 | GPS is announced as operational and integrated into the U.S. air traffic control system. FAA announces the implementation of WAAS to improve GPS integrity and availability for civil users in all phases of flight
1996 | Transit satellite system ceases operation on December 31 at 2359 GMT
2001–2003 | Combat following the 9/11 attacks and during Operation Iraqi Freedom further demonstrates the precision of GPS in military conflict
2005 | First "modernized" GPS satellite (IIR-M) is launched, transmitting a second civilian signal for enhanced performance
2008 | U.S. Air Force announces award to Lockheed Martin for the development and production of GPS III satellites
2010 | Russian GLONASS system completes its constellation of 24 satellites and becomes fully operational. U.S. Air Force announces an award to Raytheon to develop the next-generation Operational Control System (OCX)
2012 | BeiDou reaches regional Asia-Pacific coverage
2016 | The EU's Galileo achieves Early Operational Capability with 18 satellites in orbit
2.2 Transit

Transit was initially developed to provide accurate navigation data for Polaris missile submarines and other ships at the ocean surface. After the unexpected launch of Sputnik 1, researchers studied the satellite's radio signals and found they could determine the satellite's location in orbit (Guier & Weiffenbach, 1998). This research contributed to Transit's development in 1958 through a joint effort between DARPA and the Johns Hopkins Applied Physics Laboratory (Aerospace Corporation, 2010). After the failed launch of the first satellite (Transit 1A) in 1959, the second attempt (Transit 1B) was successful. In 1964, the system was transitioned to the Naval Research Laboratory. By 1968, Transit was fully operational with 36 satellites in orbit. Transit operated for 28 years, until 1996, when the Defense Department replaced it with the current GPS. Transit was initially designed to provide accuracy within about 0.5 nautical miles (926 meters) but eventually reached an accuracy of 0.1 nautical miles (185 meters) ("Transit—US Navy Navigation Satellite System," n.d.). The system was two-dimensional and thus did not measure altitude. Transit was significant in proving that space-based navigation was possible, and it contributed its orbits and orbital prediction methods to later navigation systems.

2.3 System 621B

System 621B originated at the Aerospace Corporation and was supported by the U.S. Air Force. It was the first satellite navigation system to feature three-dimensional navigation, which was needed to monitor aircraft positioning ("Evolving Solutions," n.d.). Another contribution of 621B was its use of a signal called pseudo-random noise to resist jamming. The signal was tested on aircraft between 1968 and 1971 and was ultimately verified in 1972 at White Sands Proving Ground in New Mexico (Stanford, 1995). System 621B's signal structure and frequency were ultimately used in the first iteration of GPS (Pace, 1995).
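The Doppler observation that inspired Transit can be sketched numerically. The pass geometry below is invented (a straight-line pass at 7.3 km/s, 1,075 km from the receiver at closest approach); the point is that the received frequency sweeps through the broadcast frequency exactly at closest approach, and the shape of that sweep encodes the range:

```python
import math

C = 299_792_458.0   # speed of light, m/s
F0 = 400e6          # Hz; Transit broadcast on 150 and 400 MHz carriers

# Invented pass geometry: the satellite moves in a straight line at 7.3 km/s
# and passes 1,075 km from the receiver (at the origin) at t = 0.
V, D = 7_300.0, 1_075e3

def received_freq(t):
    """First-order Doppler-shifted frequency seen by the receiver at time t."""
    r = math.hypot(V * t, D)        # slant range to the satellite
    v_r = (V * t) * V / r           # radial velocity, d|r|/dt
    return F0 * (1 - v_r / C)
```

The received frequency sits above F0 while the satellite approaches, crosses F0 at closest approach, and falls below it afterward. Guier and Weiffenbach inverted this curve to locate Sputnik; Transit ran the same idea in reverse to locate ships from known satellite orbits.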
2.4 Timation

Timation (short for Time and Navigation) was a program developed by the Naval Research Laboratory in 1964. The program was instrumental in the history of navigation systems because of its emphasis on precise time references for accurate positioning. Although atomic clocks were not yet employed in satellites, Timation satellites used high-stability clocks that were regularly updated and synchronized with a master clock on the ground (Beard, Murray, & White, 1986). The program proved that three-dimensional navigation (latitude, longitude, and altitude) was possible through its "passive ranging" technique. The U.S. Naval Observatory was also involved in developing the timing equipment used in Timation satellites and was active in research toward atomic time standards. The Timation program launched only two satellites—in 1967 and 1969—but was instrumental in establishing the use of time references to pinpoint locations on Earth ("Navigation Technology Satellites," n.d.). In 1973, the program merged with System 621B, and its third satellite (Timation III) was redesigned under the new NAVSTAR program and launched in 1974.
2.5 Atomic Clock Development

The most accurate clocks rely on a frequency source whose oscillation period is well characterized, resistant to external disruption, and highly stable (Lombardi et al., 2007). Mechanical clocks used pendulums that swung at relatively constant rates to measure time to within about one-hundredth of a second per day. In 1927, a breakthrough in timekeeping arrived with clocks driven by the frequency of quartz crystals. The quartz clock's accuracy exceeded that of pendulum-driven clocks because of the piezoelectric properties of the crystal, which vibrates at a precise frequency when an electric current is applied. Quartz clocks are still abundantly available today in watches, clocks, and appliances, and this reliance on a stable frequency holds even in the most advanced clocks.

The development of advanced atomic clocks by NIST (then called the National Bureau of Standards [NBS]) and the UK's National Physical Laboratory provided one of the most critical technology components of satellite navigation. In 1949, the NBS built the world's first atomic clock using ammonia absorption; however, this clock was primarily experimental and was never used for practical purposes. "Atomic," in more recent terms, refers to timekeeping based on measuring the resonance frequency of an atom—most commonly cesium or rubidium. In 1955, researchers at the National Physical Laboratory in the UK built the first cesium atomic clock. Although the early research was performed at national laboratories, private companies innovated and improved the atomic clock for practical purposes, such as space-compatible clocks to be launched aboard satellites. In 1974, the NTS-1 satellite carried the first atomic clock into orbit. Continued research improved timing standards and produced a series of new atomic clocks, which were incorporated into the GPS as it evolved.
Table 2-3 presents notable milestones in atomic clock development.

Table 2-3. Notable Milestones in Precision Timing

Year | Achievement
1949 | World's first atomic clock is built by NIST (then the NBS); it used ammonia absorption
1951 | Cesium atomic beam device is completed at NBS with Office of Naval Research (ONR) funding
1952 | First atomic clock using cesium atoms for frequency is built by NIST and named NBS-1, although it is not accurate enough to be a time standard
1955 | Louis Essen at the UK's National Physical Laboratory builds the first atomic clock accurate enough to be a time standard. ONR contracts the National Company of Malden, Massachusetts, to produce a military atomic clock based on that of Jerrold R. Zacharias of MIT, with engineering characteristics set forward by the Navy Bureaus of Ships and Aeronautics and the Naval Research Laboratory
1956 | The National Company produces the Atomichron, the first commercial cesium atomic beam clock
1958 | Commercial cesium clocks developed by the National Company become available, costing $20,000 each
1959 | NBS-1 becomes NIST's primary frequency standard
1960 | First atomic hydrogen maser (frequency standard) is built at Harvard. NBS-2 is developed at NIST's laboratories in Boulder, Colorado
1963 | NBS-3 is developed, offering improved accuracy and stability
1964 | Cesium atomic beam tubes are developed by Varian Associates for Hewlett-Packard
1967 | The 13th General Conference on Weights and Measures defines the second in terms of the vibrations of the cesium atom, replacing astronomical timekeeping
1968 | NBS-4 is developed as the world's most stable clock; it is used into the 1990s as part of the NIST time system
1972 | NBS-5 is developed and serves as the new primary standard
1975 | NBS-6 is developed; it is accurate to within 1 second in 300,000 years
1993 | NIST-7 is developed; it is 20 times more accurate than NBS-6
1999 | NIST-F1 begins operation; it is accurate to 1 second in 20 million years
2014 | NIST launches NIST-F2, an atomic clock accurate to 1 second in 300 million years

Source: Pace, S. (1995); Whitlock, R.R. & McCaskill, T.B. (2009); Bhaskar et al. (1996); Lombardi, M.A. (2012).

The atomic clock has become perhaps the most critical technology component of satellite navigation because of the level of precision required. The signals sent and received by satellites, ground stations, and GPS receivers rely on coordinated time standards so that accurate positioning is maintained. If timing coordination were lost, or if timing standards were less accurate by even a few orders of magnitude, positioning errors of meters to hundreds of meters could result. Given that four satellites must be simultaneously visible to provide location information, it is critical that timing and signal delivery between satellites, ground-based clocks, and receivers remain synchronized to support today's applications (Piester et al., 2011).

The Time and Frequency Division at NIST, through Defense Advanced Research Projects Agency (DARPA) funding and collaboration, developed the chip-scale atomic clock (CSAC), which is now being developed and marketed by private companies.
Weighing just 35 grams and consuming 115 mW of power, CSACs enable portable applications for military and commercial uses (Fetter, 2013). For military uses, the CSAC's low size, weight, and power consumption complement other gear required in the field, such as improvised explosive device jammers. Its highly accurate timing synchronization also helps to prevent self-jamming and allows GPS signals to be tracked more quickly and with less visibility (i.e., a minimum of three satellites in view versus the normal four). Unmanned aerial vehicles also benefit from the CSAC's small physical footprint as well as its ability to improve signal detection and jamming resistance ("Quantum SA.45s CSAC," 2016).

Although NIST-F2, the United States' primary civilian time and frequency standard, only entered service in 2014, NIST researchers are already developing significantly more precise atomic clocks. The Naval Observatory oversees GPS time, but NIST manages civilian timing applications, and its R&D activities will contribute to GPS timing and frequency standards in the future. In 2015, a strontium lattice atomic clock demonstrated that it would "neither lose nor gain one second in some 15 billion years." For comparison, this level of performance is 50 times more precise than the NIST-F2 cesium fountain atomic clock. An example of the strontium atomic clock's practical applications is measuring gravitational time shift based on marginal changes in altitude—as marginal as 2 centimeters (NIST, 2015). Additional research at NIST has combined two experimental atomic clocks based on the frequency of ytterbium atoms. The double clock has further increased precision and stability by eliminating a distortion in the laser frequency that synchronizes the atoms (NIST, 2016a). Like the strontium atomic clock, the double clock could improve the current performance of PNT and enable new capabilities.
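The arithmetic behind these precision requirements is stark: a ranging error is simply the speed of light multiplied by the accumulated clock error. The figures below use representative, order-of-magnitude stabilities for quartz and cesium clocks, not specifications of any particular device:

```python
C = 299_792_458.0    # speed of light, m/s
DAY = 86_400         # seconds per day

def ranging_error(fractional_frequency_error, elapsed_seconds):
    """Ranging error from a clock running fast or slow, left uncorrected."""
    accumulated_time_error = fractional_frequency_error * elapsed_seconds
    return C * accumulated_time_error  # meters

quartz = ranging_error(1e-6, DAY)   # wristwatch-grade quartz (~1e-6): ~26,000 km/day
cesium = ranging_error(1e-13, DAY)  # cesium-beam clock (~1e-13): ~2.6 m/day
```

This gap is why the satellites carry atomic clocks while receivers can make do with inexpensive quartz: the receiver's clock bias is solved for as a fourth unknown rather than trusted.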
2.6 NAVSTAR GPS

With Navy and Air Force satellite-based navigation programs advancing in parallel and limited by the availability of resources, the DoD directed that the existing systems be consolidated into one comprehensive system led by the Air Force (Pace, 1995). In December 1973, the Joint Program Office (JPO) approved the concept of NAVSTAR (Navigation System with Timing And Ranging) GPS, which incorporated the best features of Transit, Timation, and System 621B (Rip & Hasik, 2002). The new program launched its first satellite, a refurbished Timation satellite designated Navigation Technology Satellite 1 (NTS-1), in July 1974. NTS-1 carried the first atomic clock into orbit, a rubidium frequency standard. The second (and last) of the NTS series carried the first cesium atomic clock into space (Pace, 1995). GPS was originally planned as a constellation of 24 satellites, which were to be launched in phases, or "Blocks." Eleven Block I satellites, launched between 1978 and 1985, served as the initial developmental satellites to establish feasibility ("Global Positioning System," n.d.). The Block II satellites were designed and launched to establish operational capability. The first Block II satellite was launched in February 1989 and featured significant improvements over the Block Is, including radiation-hardened electronics, selective availability and anti-spoofing capabilities, and automatic error detection for certain conditions (Pace, 1995). Between 1989 and 1996, 27 Block II and Block IIA (Advanced) satellites were launched. Figure 2-1 presents the number and timeline of GPS satellite launches by Block. All satellites launched before 1996 are now retired. GPS provides two levels of service that operate on different frequencies: 1) the Precise Positioning Service (PPS), restricted to U.S. Armed Forces, federal agencies, and selected allied armed forces and governments, and 2) the Standard Positioning Service (SPS), available for civil and commercial use (NASA, 2012).
The first military test of GPS came during Operation Desert Storm in the Persian Gulf War in 1991. In the vast expanse of the Iraqi desert, military personnel used GPS to navigate the featureless terrain (Space and Missile Systems Center, 2016). The precision and movement capabilities it gave their weapons and forces were considered crucial to success in the conflict. Not only did GPS prove valuable for military purposes, but it also began to be used in humanitarian operations, such as delivering relief supplies through airdrops. In 1996, President Clinton made good on a promise President Reagan had made after Korean Air Lines Flight 007 was shot down in 1983 for straying into Soviet airspace: to make GPS available for civilian use at no cost ("Korean airliner 'shot down'," n.d.). The system's availability for civilian use sparked new industry segments and applications. Selective availability was turned off in 2000.
3. Methodology Overview

The methodology for valuing the economic benefits of GPS's provision of space-based positioning, navigation, and timing (PNT) signals is specific to each industry sector included in the study. This methodology overview discusses our general approach, common assumptions, and areas of overlap. Each sector then has a stand-alone case study with methodological notes specific to the valuation of the benefits GPS delivers to it. To reiterate a point made in the introduction, our goal is not to value PNT services themselves, but to value PNT as it is delivered by the GPS system. This framing allows us to consider other PNT delivery systems as potential alternatives to GPS when developing counterfactual scenarios for benefits estimation. When GPS became available, a variety of delivery systems were in place that provided PNT services, including NIST time, Loran-C, and OMEGA.

3.1 Conceptual Approach to Valuing Economic Benefits

The economic assessment characterizes the benefits of GPS by how it improves production methods, improves product attributes, or both. To illustrate these distinctions, consider the following three technology examples:

1. Improved production: A mining company uses GPS positioning to increase the efficiency of its transportation and hauling activities. The process now produces an identical commodity at a lower cost.
2. Improved product: The telecommunications industry uses GPS precision timing to synchronize its towers. This reduces or eliminates dropped calls and increases bandwidth, enabling more advanced networks such as 4G LTE and 5G.
3. Both improved: The electricity industry uses GPS frequency to synchronize its phasor measurement units, which in turn reduce transmission and distribution losses (improved production of electricity) and increase system reliability (improved product).
Figure 3-1 provides a simple graphical depiction of the three scenarios, illustrating how the market impacts of these technological innovations differ. In the first example, production costs fall, shifting the supply (marginal cost) curve to the right. In the second example, the net benefit to consumers is greater, shifting the demand curve to the right. In the final example, both curves shift to the right; we refer to this sort of technological innovation as a "market-spanning" innovation because it changes both the supply and demand curves in the market. Each of the three examples increases total welfare, measured by the area above the supply curve and below the demand curve. Thus, conceptually, the benefit of GPS is measured by the incremental welfare area generated by the shift in the curves. The graphs in Figure 3-1 illustrate improvements in production and/or in quality for an existing product or service. For example, prior to GPS, the United States had a highly functional electricity system serving all customers. GPS then lowered the cost and increased the quality of electricity service.
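The incremental-welfare logic described above can be sketched numerically. The sketch below assumes hypothetical linear demand (P = a − b·Q) and supply (P = c + d·Q) curves; all parameter values are illustrative, not taken from the study:

```python
# Sketch of the welfare comparison in Figure 3-1 using hypothetical linear
# demand (P = a - b*Q) and supply (P = c + d*Q) curves. Numbers illustrative.

def total_welfare(a: float, b: float, c: float, d: float) -> float:
    """Area between the demand and supply curves up to the equilibrium quantity."""
    q_eq = (a - c) / (b + d)      # equilibrium quantity where demand meets supply
    return 0.5 * (a - c) * q_eq   # triangle: consumer surplus + producer surplus

base = total_welfare(a=100, b=1.0, c=20, d=1.0)
supply_shift = total_welfare(a=100, b=1.0, c=10, d=1.0)  # lower production cost
demand_shift = total_welfare(a=110, b=1.0, c=20, d=1.0)  # higher willingness to pay
both = total_welfare(a=110, b=1.0, c=10, d=1.0)          # "market-spanning" case

# The benefit attributed to the innovation is the incremental welfare area:
print(supply_shift - base, demand_shift - base, both - base)
```

Note that in the market-spanning case the welfare gain exceeds the sum of either shift alone, which is why conflating the two channels would misstate benefits.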
However, in some instances, it can be argued that certain products or services would not have been possible or would not have been developed without GPS. For example, most of the location-based apps popular with consumers today would not exist without the free and ubiquitous precision location capabilities provided by GPS. In this instance, the entire welfare triangle above the supply curve and below the demand curve can be attributed to GPS. As we have already noted, the exact method for quantifying the benefits of GPS differs for each sector and may also differ for specific products and services within a given sector. Thus, each sector chapter has its own economic benefits methodology section.

3.2 Counterfactual A: In the Absence of the Availability of GPS for Civilian Use

Economic impacts are measured relative to a counterfactual scenario that describes what otherwise would have been in place or would have occurred in the absence of the technology being analyzed. For this study, developing a counterfactual means answering two key questions:

1. What did each sector of interest use before GPS was available (if anything)?
2. In the absence of GPS, are there other technologies that would have evolved or been invented to provide some of the same services that GPS provides?

To answer these questions, we conducted research and scoping interviews with sector-specific GPS experts to understand the technology landscape in the late 1980s and early 1990s, when the private sector first began leveraging GPS for commercial applications. In general, the feedback was that some industries (e.g., agriculture) would have continued using the same technologies for their PNT needs that had been used before GPS was available. Other sectors (e.g., telecom) were actively exploring alternative technologies at the time GPS was adopted. Both scenarios provide insight into what technologies might otherwise have been used in the absence of GPS.
One such alternative technology is Loran-C, a land-based PNT system that was originally developed for marine navigation purposes (Justice et al., 1993). Additionally, as recently as the 2000s, both government and the private sector were researching and testing an enhanced Loran system known as eLoran, although it was never made operational. Both Loran-C and eLoran provide the same kind of timing and frequency signal as GPS, but in most cases, Loran is less accurate and precise (see Table 3-1). Additionally, the evolution of Loran-C into eLoran over time would have been different from GPS, potentially affecting the development of some commercial applications. For most of the sectors, our counterfactual assumption is that in the absence of GPS a Loran-based network (similar to Loran-C) likely would have received more investment to fully cover the U.S. This would have been used for many of the same applications that rely on GPS today. Note that it is possible that many applications across several sectors would be able to leverage a Loran-C signal to achieve the same benefits that are experienced using GPS today, effectively eliminating the benefits of GPS for those applications under this counterfactual scenario. 3.3 Counterfactual B: An Unexpected 30-Day Outage of the GPS System Under a 30-day outage of GPS scenario, we assumed that neither Loran-C nor eLoran would be available as a backup. Neither one of these systems is operating today, nor could these systems be implemented within the 30-day time window of our analysis. We also assumed that other international global navigation satellite systems (GNSS), such as GLONASS or Galileo, would also not be available for use within the 30-day time window. Hence, the counterfactual for the 30-day failure of GPS is simply the quality/reliability of each sector’s current backup system. These systems include backup timing systems and the associated level of holdover these clocks/systems may have. 
For most positioning applications, the location-based GPS functionality would not be possible at all, forcing the sectors to revert to pre-GPS alternative processes.

3.4 Approach for Selecting Industry Sectors

Because of the vast and growing number of applications reliant on GPS, the study needed to down-select to the key sectors whose use comprises the bulk of the economic benefits.

Table 3-1. Precision and Accuracy Performance of Loran-C, eLoran, and GPS

                      Loran-C                 eLoran                  GPS
Frequency             1 x 10^-11 stability    1 x 10^-11 stability    1 x 10^-13 stability
Timing                100 ns                  10–50 ns                10 ns
Positioning (m)a      18–90 m                 8–20 m                  1.6–4 m

a The positioning accuracy of each of these technologies varies widely by type of receiver and augmentations being applied. The accuracy quoted here for GPS is from the GPS Wide Area Augmentation System (WAAS) 2008 Performance Standard.
Sources: Narins et al. (2012); Curry (2014); Celano et al. (2003); GPS.gov

Our approach to selecting sectors was based on the following criteria:

Need for precision: For both position and timing, the level of precision needed for the GPS application in the industry/sector was ranked as high, medium, or low as follows:
– Position
  • High: less than +/− 1 meter
  • Medium: +/− 1 meter to 10 meters
  • Low: greater than +/− 10 meters
– Timing
  • High: less than +/− 1 microsecond
  • Medium: +/− 1 microsecond to 1,000 microseconds
  • Low: greater than +/− 1 millisecond

Alternatives: In the absence of GPS, are there technology, behavioral, or process options available to achieve the associated function?
– Yes: Alternatives to GPS are available. They might be less efficient, but the industry/sector would not be dramatically affected.
– No: The function or application that GPS enables would not be possible.
– Costly: Alternatives are available but at a significantly higher cost or loss of efficiency/functionality.

Scale: What is the size of the industry or market for GPS applications?
– Large: Large industry/application size with significant market penetration.
– Medium: Either industry/application size or market penetration is modest.
– Small: Both industry/application size and market penetration are modest or small.

Table 3-2 summarizes the assessment of the criteria for the industries with significant GPS applications. This table was based on an assessment and review of available literature, and in some instances, individual rankings (high, medium, low) were changed based on further research and scoping interviews. We finalized the key industries to be included in the detailed analysis and present the methodology for quantifying economic impacts, along with detailed counterfactuals, for each selected industry. Based on our assessment and discussions with industry and government agencies, the following focus sectors were selected: agriculture, electricity, financial services, location-based services, maritime, mining, oil and gas, surveying, telecommunications, and telematics.
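The precision-ranking criteria above can be sketched as a simple classifier. The thresholds come from the text; the function names and example calls are ours, for illustration only:

```python
# Sketch: rank an application's precision need using the thresholds in the text.
def rank_position_need(error_m: float) -> str:
    """High/Medium/Low need based on required position accuracy (+/- meters)."""
    if error_m < 1:
        return "High"
    if error_m <= 10:
        return "Medium"
    return "Low"

def rank_timing_need(error_us: float) -> str:
    """High/Medium/Low need based on required timing accuracy (+/- microseconds)."""
    if error_us < 1:
        return "High"
    if error_us <= 1000:
        return "Medium"
    return "Low"

# e.g., centimeter-level surveying -> High position need;
# a base station needing ~3 microsecond timing -> Medium timing need
print(rank_position_need(0.05), rank_timing_need(3))
```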
Table 3-2. Summary of Precision Need, Alternatives, and Application Scale by Industry Sector

Industry/Sector                      Need for Precision                Alternatives to GPS   Potential Scale of Impacts
Aviation                             Low (position), Medium (timing)   Yes                   Medium
Maritime transportation              Medium                            Yes                   Large
Rail transportation                  High                              No                    Medium
Road navigation/telematics           Medium                            Yes                   Large
Agriculture                          High–Medium                       No                    Large
Conservation                         Low                               Yes                   Small
Forestry                             Low                               Yes                   Small
Surveying                            High                              Costly                Large
Public safety and disaster relief    Low                               Yes                   Medium
Mining                               Medium                            Yes                   Medium
Oil and gas                          Medium                            Yes                   Medium
Electricity                          High                              Costly                Large
Construction and mining              Medium–Low                        Yes                   Large
Space                                Medium                            No                    Small
Finance                              Low                               Yes                   Large
Telecommunications                   High                              Costly                Large

3.5 Approach for Quantifying Economic Benefits by Sector

Because of the variety of sectors included in the analysis, we employed several different methods to estimate the benefits delivered by GPS. These valuation approaches can generally be grouped into the following categories:

Changes in production costs: Additional labor, capital, materials, or energy is needed to produce the same product or service. For example, GPS improves vehicle fleet management and logistics, reducing fuel costs and increasing utilization of the existing fleet.

Changes in productivity and/or revenue: For example, precision agriculture increases crop yield, which can be valued at market prices.

Willingness to pay (WTP): WTP is a stated preference approach in which individuals or businesses are asked to value a service, activity, or product attribute. For example, what are consumers willing to pay for location-based services/apps on their smartphones?
In almost all sectors, GPS helps lower production costs by delivering high levels of precision at very low cost. In some sectors, GPS enables totally new products and services, which can be valued by increased revenue. If it is likely that new services are generating significant consumer surplus above market price, a willingness-to-pay approach is used. Some sectors (such as maritime) used multiple approaches to value different benefits in different subsectors (commercial fishing: lost revenue; recreational boating: WTP; navigation in seaways: increased operating costs). Details on individual valuation approaches are provided in each sector section. All dollar values are presented in real 2017 terms except where otherwise noted.

Summary of Benefits Valuation Approach by Sector

Sector                    Changes in Production Costs   Changes in Productivity and/or Revenue   Willingness to Pay
Agriculture               X                             X
Surveying                 X
Telematics                X
Location-based services                                                                          X
Mining                    X
Oil and gas               X                             X
Telecommunications                                                                               X
Electricity               X                             X
Financial services        X
Maritime                  X                             X                                        X

4. Telecommunications Sector

The telecommunications industry relies on continuous, error-free information transfer across a large country and a myriad of independent network operators, all of which requires sophisticated synchronization systems. Today, this synchronization is predominantly accomplished by leveraging precision time and frequency signals from GPS. GPS functions as a common source of synchronization for the entire industry. Because GPS was critical to unlocking advanced wireless networks with the implementation of 4G LTE, we estimate the economic impact of GPS in the sector to range from $81 billion (based on firms' willingness to pay for spectrum) to $686 billion (based on consumers' willingness to pay for increased bandwidth and speeds). During a 30-day outage of GPS, we estimate the economic loss would range from $5.5 billion to $14.2 billion.
This analysis considers the role of GPS in both the wireline and wireless telecommunications networks but does not consider other telecommunications services such as home internet services, cable television services, and broadcast radio and television. 4.1 Sector Introduction and Overview Although telecom network operators have used other sources of precision time and frequency in the past and still use atomic clocks extensively, the network infrastructure has evolved to rely heavily on GPS. GPS is a free, ubiquitous signal that can be captured with relatively inexpensive equipment from anywhere in the world—something that cannot be said for any other source of precision timing. As the demand for ever more sophisticated and high-performance telecom services has grown, the technology has evolved around GPS. This is especially true in wireless networks, which often do not have access to a precision timing signal from the wireline network. Although other technologies exist to meet the needs of network providers, none are widely implemented or available. The result is a critical infrastructure (telecommunications) that is heavily reliant on a single source of precision timing. 4.1.1 The Role of Precision Timing in Telecom Precision timing enables a number of telecom services, including the synchronization of traffic between carrier networks and across wide geographic areas, initializing calls between wireless handsets, wireless handoff between base stations, carrier aggregation, directional antennas, and adaptive transmission power control, and billing management. On wireless networks, higher levels of precision timing enable service providers to increase bandwidth and handle more devices within the same infrastructure and wireless spectrum as technology evolves. 
Table 4-1 details the level of precision timing required for both wireline and wireless networks by standard-setting bodies, including the International Telecommunications Union (ITU), the European Telecommunications Standards Institute (ETSI), and the Alliance for Telecommunications Industry Solutions (ATIS).4

4 This is an abbreviated version of a table in ATIS (2017), a report that details the timing requirements of the telecom sector and examines the vulnerabilities posed by reliance on GPS without a backup system.

Table 4-1. Timing Precision Requirements in Telecommunications

Application                                        Precision Needed
Wireline (sources of timing)
  PRTC (primary reference time clock)              ±100 ns with respect to Coordinated Universal Time (UTC)
  ePRTC (enhanced primary reference time clock)    ±30 ns with respect to UTC
Wireless
  CDMA2000                                         ±3–10 µs
  TD-SCDMA                                         ±3 µs
  WCDMA-TDD (NodeB TDD mode)                       ±2.5 µs
  W-CDMA MBSFN                                     ±12.8 µs
  LTE MBSFN                                        < ±1 µs (spec. still under study)
  W-CDMA (Home NodeB TDD mode)                     Microsecond-level accuracy
  WiMax                                            ±1–1.43 µs
  LTE-TDD (wide area base station)                 3 µs

Source: Curry (2010).

Table 4-7. Wireless Base Station Timing—Holdover Capability (FDD Systems). [Table body not recovered. Legend: X = failure of GPS would cause failure within the indicated time period; ○ = failure of GPS may cause degradation of service within the indicated time period; ● = failure of GPS would not affect service within the indicated time period.]
The most important qualifier in this analysis is that there is a large degree of uncertainty around the impact of a 30-day outage, for two reasons. First, such a catastrophic failure has never happened before, making the outcome unpredictable. Second, the resilience of different parts of the network will vary because the GPS receiver and holdover equipment installed varies depending on several factors, including the network operator's risk tolerance, the criticality of maintaining service in some areas (e.g., New York City vs. a remote desert region), and the vintage of the receiver and holdover equipment. Despite the uncertainties, some key areas of agreement emerged among the subject matter experts. First, 62% of experts who responded to questions about a 30-day outage agreed that the wireline network would remain largely unaffected by a GPS outage. GPS is used in the wireline infrastructure primarily to discipline cesium and rubidium clocks. Although rubidium clocks cannot supply holdover for a full 30 days, cesium clocks can provide well over 30 days of holdover. Additionally, according to industry estimates, 95% of GPS receiver equipment installed on telecom networks supports wireless infrastructure rather than wired infrastructure (ATIS, 2017). Because we expect that the wireline network will be mostly unaffected, we excluded it from our analysis of the economic impact of a 30-day outage. The second key area of agreement is that most wireless infrastructure is equipped with oscillators with sufficient holdover to run for 24 hours unaffected by a GPS outage.
This is necessarily a generalized conclusion; the exact timing control algorithms and equipment employed are typically proprietary and closely guarded by telecom network operators (ATIS, 2017). The exception to this general conclusion is a wireless base station operating LTE-TDD, which is more heavily reliant on GPS and may begin to degrade more quickly. As mentioned previously, reliable data on LTE-TDD penetration is unavailable, but subject matter experts felt that LTE-TDD makes up a relatively small portion of the infrastructure. The third area of agreement concerns how impacts would manifest early in the outage. All of our subject matter experts agreed that, after the first 48 hours, wireless base stations would see a degradation in the quality of service characterized by failed handovers from one base station to another, increased dropped calls and lost video frames, and a general slowdown in data speeds as base stations cope with a degrading timing signal and mobile handsets can no longer draw data from two base stations to increase bandwidth. All experts agreed that, eventually, all handovers would fail and a user would have to remain stationary to have any hope of maintaining a (still degraded) connection. Handovers rely on two base stations being able to agree on the precise time to complete a successful handoff. Finally, after some period of steady degradation in the quality of service, wireless service would cease to function altogether. Perhaps the most significant area of uncertainty and disagreement after the first 24 hours of an outage is the pace at which wireless service would degrade over the remainder of the 30-day outage and whether it would fail altogether before or after the 30-day mark.
One expert thought that wireless networks would fail completely after about 2 weeks, while several others thought that some service (most likely voice and text service only) would still remain at the end of 30 days. To translate qualitative descriptions of what might happen to wireless networks in the event of a 30-day outage into something useful for making quantitative estimates of economic impact, we took the following steps:

1. Seven of the individuals we interviewed were willing to offer opinions on what might happen in the event of a 30-day outage. For each of these, we mapped their qualitative inputs to individual curves representative of the condition or functionality of the wireless network as a percentage of normal service levels over the 30-day period.
2. We averaged the resulting curves together to derive an average estimate of the impact of a 30-day GPS outage on wireless networks.
3. We calculated one standard deviation above and below the average to represent the range of uncertainty across the subject matter experts who offered opinions.

Figure 4-4 illustrates how we expect the impact to progress over the course of the outage period on average. The shaded area on either side of the bold line represents the range of uncertainty in our analysis. On average, we expect very few impacts on wireless infrastructure for the first 48 hours, after which service quality will degrade quickly as handovers become less reliable and data speeds continue to drop. After approximately 4 days, we expect service to continue to degrade, albeit at a slower pace. It is important to reiterate that these findings are highly uncertain estimates of what might happen in the event of a catastrophic outage that has never happened before. We made every attempt to represent the range of uncertainty when possible, and results should be treated cautiously.
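The three-step procedure above can be sketched in a few lines. The expert curves below are hypothetical placeholders (the study's actual curves are proprietary expert inputs); only the $552.3 million daily-revenue figure comes from the text:

```python
# Sketch of the curve-averaging procedure described in the text, with
# made-up expert functionality curves for illustration.
import statistics

DAYS = 30
DAILY_REVENUE_M = 552.3  # wireless revenue per day, $ millions (from the study)

# Step 1 (hypothetical): each expert's qualitative view mapped to a curve of
# network functionality as a fraction of normal service, one value per day.
expert_curves = [
    [1.0] * 2 + [max(0.1, 1.0 - 0.05 * d) for d in range(1, DAYS - 1)],
    [1.0] * 2 + [max(0.3, 1.0 - 0.03 * d) for d in range(1, DAYS - 1)],
    [1.0] * 3 + [max(0.0, 1.0 - 0.07 * d) for d in range(1, DAYS - 2)],
]

# Step 2: average the curves day by day. Step 3: one standard deviation
# above and below the average as the uncertainty band.
avg = [statistics.mean(day) for day in zip(*expert_curves)]
sd = [statistics.stdev(day) for day in zip(*expert_curves)]

# Damages per day = expected daily revenue * share of functionality lost,
# the valuation logic applied in the following subsection.
damages_avg = [DAILY_REVENUE_M * (1 - f) for f in avg]
print(f"Total 30-day damages (avg curve): ${sum(damages_avg):,.0f}M")
```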
We used average revenue per wireless user as a proxy for WTP for wireless service, estimating that wireless telecom providers earn $552.3 million in revenue every day. Using this data point, we estimated the economic loss associated with a 30-day outage of GPS by reducing the expected daily revenue based on the estimated functionality of the network, which we describe in Section 12.5.1. Table 4-9 presents the estimated damages per day over the outage period. Figure 4-5 graphically represents the same data in cumulative form. Measured by ARPU as a proxy for WTP, a GPS outage would result in $5.5 to $14.2 billion in damages.

Figure 4-4. Impact of 30-Day GPS Outage on Wireless Network Functionality (based on expert qualitative opinion). [Line chart: wireless network functionality (%) versus days since loss of GPS signal.] Source: RTI estimates

Table 4-9. Estimated Economic Damages in Telecom due to 30-Day Outage

Day     Damages per Day (million)
        Low         Average     High
1       $0.0        $37.3       $108.0
2       $0.0        $61.3       $143.4
3       $36.1       $151.5      $266.9
4       $41.3       $221.9      $402.6
5       $62.3       $234.2      $406.2
6       $82.3       $246.5      $410.7
7       $101.2      $258.8      $416.4
8       $118.9      $271.1      $423.3
9       $135.2      $283.4      $431.7
10      $149.9      $295.7      $441.5
11      $163.1      $308.0      $453.0
12      $174.6      $320.3      $466.0
13      $184.6      $332.6      $480.7
14      $193.0      $344.9      $496.9
15      $200.0      $357.2      $514.4
16      $211.3      $364.3      $517.3
17      $221.7      $371.3      $520.9
18      $231.3      $378.4      $525.5
19      $239.9      $385.4      $531.0
20      $247.5      $392.5      $537.4
21      $254.2      $399.5      $544.8
22      $260.0      $406.6      $553.1
23      $264.8      $413.6      $562.4
24      $268.7      $420.7      $572.6
25      $272.4      $422.3      $572.2
26      $276.1      $424.0      $571.9
27      $279.7      $425.7      $571.7
28      $283.2      $427.3      $571.5
29      $286.6      $429.0      $571.4
30      $290.0      $430.7      $571.3
Total   $5,529.90   $9,816.21   $14,156.66

Source: RTI analysis

Figure 4-5.
Estimated Economic Damages in Telecom due to 30-Day Outage. Source: RTI analysis

As discussed earlier, telecommunications is considered a general-purpose technology with broad applicability across the economy that unlocks significant productivity gains. Thus, it is reasonable to expect that the loss of such services would result in a significant negative economic impact.

4.5 Future Applications

The telecom industry is currently in the early stages of implementing the next-generation 5G wireless network standard. The first spectrum auctions were scheduled for the end of 2018, and the first commercially available 5G-capable mobile devices are expected in 2019. 5G technology marks a significant advancement in the performance and underlying structure of the wireless telecommunications networks in the United States. 5G will leverage higher-density placement of smaller cells operating at high frequencies to maximize bandwidth availability. To coordinate an increasing number of base stations and make efficient use of spectrum, 5G networks will require even more stringent precision timing and will continue to reduce wireless networks' reliance on frequency synchronization while increasing their reliance on precision time.

4.6 Concluding Remarks

GPS came along at a time of significant evolution in the telecom sector and played a critical role in the digitization of telecom infrastructure and the advent of wireless technology. We estimate the economic impact of GPS in the telecom sector to be $81 billion to $686 billion. In the event of a 30-day outage, our research suggests that while wireline networks would remain largely unaffected, wireless networks would quickly degrade in quality of service, resulting in anywhere from $5.5 to $14.2 billion in damages over the course of 30 days.
Looking forward, wireless technology continues to evolve in ways that increase its reliance on highly precise timing, which in turn increases reliance on GPS. Multiple technological trends, from autonomous cars to the internet of things, will stretch wireless technology to new limits in the coming years; recognizing the role that precision timing plays in enabling the next decade of technological development in telecom highlights just how important the reliability and security of GPS is.

5. Precision Agriculture

The agricultural sector uses the precision location information provided by GPS to improve agricultural mechanization and efficiency. In agriculture, efficiency refers to the ability to produce more food, feed, and fiber per unit of labor and other inputs (e.g., seeds, chemicals). Precision agriculture, abbreviated as PA in this case study, refers to farmers' ability to conduct site-specific management: to observe, measure, and respond more precisely to inter- and intra-field variability in crops. Before GPS was available for commercial use, farmers had few technologies that allowed them to proactively manage their fields according to the fields' spatial characteristics. GPS played an essential role in the advent and continued adoption of PA technologies and methods. The retrospective impact of GPS, net of adoption costs, is conservatively about $5.8 billion. If GPS were not available for civilian use, farmers would have continued managing their operations as before, planting and harvesting as they have done historically without the benefit of automated steering or the ability to make decisions based on site-specific information. This counterfactual is not entirely speculative; many farmers today do not use GPS and still farm this way.
An alternative system with less accuracy, such as eLoran, could have helped farmers take advantage of some PA technologies, such as aerial spraying, coarse yield and soil mapping, or certain kinds of variable-rate technologies (e.g., applying fertilizer to more precisely meet site-specific crop needs), but without GPS these would have been less effective and provided fewer benefits. These technologies would also likely have taken longer to develop. The proportion of farmers using PA technologies has increased steadily over the last three decades, and these technologies are now used on the majority of U.S. farmland. In the event of a 30-day failure of GPS today, there would be a significant planting delay and adverse impact on yields for many farmers, especially the large, mechanized farms that have fully embraced PA. Many tractors, combines, and other pieces of equipment have GPS technologies integrated into their systems. These farmers would face a steep learning curve and significant efficiency losses trying to either retrofit or operate this equipment without GPS, or they would return to earlier ways of applying inputs. The impacts would be highly dependent on the time of year, with the largest impacts expected during planting seasons. We estimate that in a worst-case scenario the economic loss would be more than $15 billion if it occurred during the planting season.

5.1 Sector Introduction and Overview

GPS-assisted PA technologies allow farmers to manage inputs such as seeds, agrochemicals, and fuel more efficiently, increase yields, and reduce farmworker fatigue and errors. The three most common categories of these technologies are yield and soil mapping, machinery guidance and control systems, and variable-rate technologies (see Table 5-1).

Table 5-1.
Three Most Common Categories of GPS-Enabled Precision Agriculture Technologies

Yield and soil mapping
  Precision needed: 10 m
  Co-technologies: GPS + combine yield monitor; GPS + soil sampling data; GPS + mapping software
  Benefits: Helps farmers more intensively manage their fields; allows farmers to make more informed planting and input application decisions, including how much and where to apply agrochemicals, plant seeds, and irrigate
  Counterfactual: Collecting yield data using sensors without mapping, or mapping using alternatives to GPS
  Technical impact metric: Changes in crop yield, input costs (e.g., seeds, fertilizer), and overhead costs (e.g., labor, capital); environmental benefits include reductions in greenhouse gas (GHG) emissions and nutrient loads in waterways
  Economic value metric: Additional net returns on adoption vs. non-adoption in an area where the technology was applied; value of ecosystem services from applying fewer agrochemicals
  Potential magnitude of impacts: Medium

Tractor and combine guidance system
  Precision needed: 5 cm–1 m depending on use
  Co-technologies: GPS + navigation tool (e.g., parallel swathing)
  Benefits: Allows farmers to more precisely apply inputs and harvest crops while reducing overlap and/or skips within a field; also reduces machine operator error, operator time, operator fatigue, and multitasking
  Counterfactual: Manual steering of tractors and combines; applying inputs manually based on markers such as a mechanical marker on a planter or harvester or a foam marker on a sprayer
  Technical impact metric: Changes in crop yield, input costs (e.g., seeds, fertilizer), and overhead costs (e.g., labor, capital); environmental benefits include reductions in GHG emissions and nutrient loads in waterways
  Economic value metric: Value of additional net returns on adoption vs. non-adoption area; value of ecosystem services from applying fewer agrochemicals
  Potential magnitude of impacts: Medium

Variable-rate technology
  Precision needed: 10 cm–1 m depending on use
  Co-technologies: GPS + variable-rate planter drive; GPS + variable-rate spreader drive; GPS + variable-rate applicator
  Benefits: Allows farmers to apply inputs (e.g., seeds, agrochemicals) at predetermined rates at different locations in a farmer's field
  Counterfactual: Adjusting inputs manually or applying at one rate throughout the field
  Technical impact metric: Changes in crop yield, input costs (e.g., seeds, fertilizer), and overhead costs (e.g., labor, capital); environmental benefits include reductions in GHG emissions and nutrient loads in waterways
  Economic value metric: Additional net returns on adoption vs. non-adoption in an area where the technology was applied; value of ecosystem services from applying fewer agrochemicals
  Potential magnitude of impacts: Medium

GPS-assisted yield and soil mapping quantifies and maps information pertaining to yield and/or soil variability throughout a field. Farmers can use this information along with other farm-specific information (e.g., soil, climate, pests) to diagnose issues within the field and respond proactively. GPS-assisted machinery guidance and control systems automatically steer farm equipment in a predetermined path to help farmers reduce overlaps or skips, or have built-in input control valves to avoid applying inputs where they are not needed (e.g., headlands). Aerial spraying, or crop dusting, is another GPS-assisted technology that has transformed the way that agrochemicals are applied to agricultural fields. GPS-assisted variable-rate technologies enable farmers to vary the timing and rate at which they apply inputs such as seeds and agrochemicals to more precisely meet their crops' needs. Many different technologies fall into one of these three broad categories. These categories are also not mutually exclusive; farmers frequently employ these techniques in combination with each other. PA in large-scale farming in the United States became possible when the NAVSTAR GPS system became available for civilian use in the early 1990s.
The development of the first PA technologies preceded the civilian availability of GPS, but they were limited to small field boundaries marked with posts or flags. For a short time before GPS, radar positioning systems were used as location devices for agricultural applications (mostly for research), but the systems were cumbersome and needed radar posts to function (Tillett, 1991). When GPS became available for civilian use, there were some limits to its precision. Differential GPS (DGPS) was introduced in the late 1990s, improving location accuracy, thereby paving the way for increased precision in agrochemical applications and enabling automated steering in farm vehicles. Because some applications require higher precision than others (Table 5-1), DGPS became essential to the widespread adoption of PA. The annual PA dealership surveys of crop input dealers, sponsored by CropLife and Purdue University, detail the current state and trends of the industry (Erickson et al., 2017). Retailers expect their market areas to expand for all PA uses (Table 5-2); they expect some categories of technologies to expand more than others, including variable-rate technologies and some new and emerging GPS-enabled technologies such as unmanned aerial vehicles, satellite or aerial imagery, and data storage and analysis.

Table 5-2. Producer Use of Precision Technologies, Current and Projected Market Area

Precision Agriculture Technology | Category | Estimated Market Area 2017 (%) | Estimated Market Area 2020 (%) | Projected 3-Year Growth (%)
Guidance/autosteer | Tractor and combine guidance system | 60 | 72 | 20
Field mapping | Yield and soil mapping | 45 | 61 | 36
Grid or zone soil sampling | Yield and soil mapping | 45 | 62 | 38
VRT lime application | Variable-rate technology | 40 | 51 | 28
VRT fertilizer application | Variable-rate technology | 38 | 54 | 42
Planter adaptations to improve precision | N/A | 22 | 37 | 68
Satellite or aerial imagery | Yield and soil mapping | 19 | 33 | 74
Cloud storage of farm data | N/A | 14 | 32 | 129
Variable down pressure on planter | Variable-rate technology | 14 | 28 | 100
Variable-rate seeding | Variable-rate technology | 13 | 30 | 131
Any data analysis service | N/A | 13 | 30 | 131
Soil electrical conductivity mapping | Yield and soil mapping | 9 | 17 | 89
Variable hybrid placement within fields | Variable-rate technology | 7 | 19 | 171
UAV or drone imagery | Yield and soil mapping | 6 | 22 | 267
Y drops on fertilizer applicator | N/A | 6 | 16 | 167
Telematics | N/A | 5 | 12 | 140
VRT pesticide application | Variable-rate technology | 3 | 13 | 333
Chlorophyll/greenness sensors for nitrogen management | Variable-rate technology | 3 | 10 | 233
Source: Adapted from Erickson et al. (2017).

The U.S. Department of Agriculture's (USDA's) Agricultural Resource Management Survey (ARMS) has been surveying farmers about PA adoption for several major crops since 1996 and is the best source of data for tracking national-level PA technology adoption rates over time. The most comprehensive study to quantify the net benefits of adopting PA across the United States was conducted by USDA's Economic Research Service (ERS) (Schimmelpfennig, 2016). This study used national-level data from ARMS to calculate the net benefits of the three categories of PA technologies, finding that net returns for corn and soybean farmers increased by 1 to 2% with the introduction of PA, depending on the technology.
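For reference, the "Projected 3-Year Growth" column in Table 5-2 is the relative change between the 2017 and 2020 estimated market-area shares. A minimal sketch of that arithmetic follows; the function name is ours, not from the source:

```python
# Sketch of how Table 5-2's "Projected 3-Year Growth" column is derived:
# the relative change between the 2017 and 2020 market-area shares.
def projected_growth(share_2017: float, share_2020: float) -> int:
    """Percentage growth in market area, rounded to the nearest whole percent."""
    return round((share_2020 - share_2017) / share_2017 * 100)

# A few rows from Table 5-2 (2017 %, 2020 %):
rows = {
    "Guidance/autosteer": (60, 72),        # table reports 20% growth
    "Field mapping": (45, 61),             # table reports 36%
    "VRT pesticide application": (3, 13),  # table reports 333%
}
for name, (s17, s20) in rows.items():
    print(f"{name}: {projected_growth(s17, s20)}%")
```

This reproduces the published growth figures to the nearest whole percent.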
Table 5-3. Percentage Change in Profits from Adopting Specific Technologies

Technology | GPS Soil/Yield Mapping | Guidance Systems | Variable-Rate Technologies
Net returns (including overhead) impact of precision technology | 1.8% | 1.5% | 1.1%
Operating profit impact including farm size scale effect | 2.8% | 2.5% | 1.1%
Source: Schimmelpfennig (2016).

Another ERS study from the same year used ARMS data to estimate savings in variable production costs from PA. This study estimated per-acre variable cost savings from variable-rate technologies of $13 to $21 depending on the scenario, but these estimates did not take into account capital investment costs or yield effects (Schimmelpfennig & Ebel, 2016). Other studies have quantified the benefits of PA in specific regions or with respect to crops in different time periods.10 Other estimates of economic impacts are extrapolated from site-specific data. Recent estimates of GPS's impact on the commercial agricultural sector include Pham (2011) and Leveson (2015); the former estimated that GPS provided the agricultural sector benefits of $19.9 billion per year in 2010, and the latter estimated a range from $10 to $17.7 billion in 2013, both extrapolating from other original data collection efforts. Both studies are countrywide, but neither accounted for investment costs, which are significant. Also, at the time these studies were developed, no comprehensive analyses or models of adoption across the United States existed, so both had to use simplified assumptions derived from a range of studies based on various time ranges, crops, and geographic contexts.

5.2 Sector Applications

PA technologies are often used in conjunction with each other. Farmers use soil and yield mapping to gain insights into the relationship between soil and land characteristics and yields to make decisions about their input use for the following season.
This technology is sometimes used in conjunction with variable-rate technology and/or equipment auto-guidance systems to vary input rates according to spatial characteristics and to automate their application. The market penetration rates of the three categories of PA technologies are based on ARMS data and Schimmelpfennig (2016). These rates are shown in the addendum to this section and can be summarized as follows: Guidance systems are used on 45 to 50% of acres for all crops surveyed except cotton and winter wheat, making them the GPS application with the widest adoption.11 Yield mapping has been adopted on 30% or more of cropland acres for corn and soybeans as of 2012,12 although adoption is 20% or lower for peanuts, rice, spring wheat, and cotton. Soil mapping has been highly variable over time, with adoption mostly decreasing from 2000 to 2005, only to increase again from 2005 to 2013.13 VRT adoption was above 20% for corn, soybeans, and rice in the most recent survey years but lower for peanuts, spring wheat, winter wheat, and cotton.

10 A compilation and analysis of 108 research studies found that 63% of precision farming applications had positive net returns, 11% had negative returns, and 26% had mixed results (Lambert & Lowenberg-DeBoer, 2000).
11 Based on data available from 2007–2013. It is likely that guidance systems were applied to more than 45% of acres for all crops after 2014 based on extrapolated ARMS data, Erickson et al. (2017), and stakeholder interviews.
12 Based on extrapolated ARMS data.
13 ARMS tracks the adoption of yield and soil mapping separately, but Schimmelpfennig (2016) categorizes them together and calculates their net benefits conjunctively.

Figure 5-1 shows adoption on planted acres over time for corn and soybeans for the three high-level categories of GPS-assisted technologies: GPS-enabled soil and yield mapping, guidance or autosteer, and VRT, as reported by the USDA's ARMS survey.
Farmers have increased adoption over time. Guidance systems were the most widely adopted GPS-enabled PA technology, followed by yield and soil mapping and variable-rate technology.
5.3 Methodological Notes

To quantify the economic impacts of GPS for the agricultural sector, we first quantified the additional net benefits that GPS-assisted technologies provide to adopters vs. nonadopters and applied those benefits to the percentage of acreage where PA has been applied over space and time. The approach is relatively straightforward because there are many PA adopters and nonadopters in the United States, and the USDA has recently compared the net returns between the two groups.

Figure 5-1. Adoption Rates of Precision Agriculture Technologies in Corn and Soybeans

Note: Data past the vertical blue line (2010) are extrapolated for corn; data past the vertical orange line (2012) are extrapolated for soybeans (a value of 30% was used for soybean yield mapping, as depicted in Figure 3 of Schimmelpfennig (2016)). Linear extrapolation was used, capped at the highest recent estimated use of PA technologies based on either ARMS or Erickson et al. (2017, Figure 9). Yield and soil mapping are shown separately in the figure but combined into one category for the economic analysis to be consistent with the way that Schimmelpfennig (2016) reports PA impacts on profits.

5.3.1 Approach for Quantifying Retrospective Benefits

The counterfactual for this scenario assumes that without GPS, PA would have progressed only marginally with Loran and would have been much more limited in its scope of applications. We assumed that in the absence of GPS, farmers would primarily continue to farm using currently available technologies in the way that nonadopters do today. Note that we do not value benefits for PA technologies that do not rely on GPS or do not require the high levels of precision and accuracy that GPS offers.14 Also, because a counterfactual technology would have had time to evolve in the absence of GPS, we assumed no increase in price volatility or disruption in global agricultural markets.
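The capped linear extrapolation described in the Figure 5-1 note can be sketched as follows; the adoption series and cap below are illustrative placeholders, not the ARMS or Erickson et al. (2017) values used in the analysis:

```python
def extrapolate_adoption(years, rates, target_year, cap):
    """Linearly extrapolate an adoption-rate series past its last observation,
    capping the projection at the highest recent estimated use of the
    technology (per the Figure 5-1 note). Inputs are illustrative."""
    # Slope taken from the last two observed points.
    slope = (rates[-1] - rates[-2]) / (years[-1] - years[-2])
    projected = rates[-1] + slope * (target_year - years[-1])
    return min(projected, cap)

# Illustrative series: adoption observed through 2010, extrapolated to 2017.
# The uncapped projection (0.73) exceeds the cap, so the cap (0.60) is returned.
print(extrapolate_adoption([2005, 2010], [0.25, 0.45], 2017, cap=0.60))
```

The cap keeps a steep historical trend from projecting adoption shares above the highest level actually observed in recent survey or dealership data.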
However, it is likely that reduced yields could have led to higher bulk commodity prices and an associated decrease in consumer welfare. The USDA's ERS commissioned the first study that used an empirical model to estimate the net returns and operating profits of PA from a nationally representative sample of corn farms (Schimmelpfennig, 2016). The study used a robust model that considers total net returns, including overhead costs, input costs, and yield changes of PA adopters vs. nonadopters. It found that the additional net returns to three PA technologies, namely GPS soil/yield mapping, guidance systems, and variable-rate technologies, were 1.8%, 1.5%, and 1.1%, respectively, for 2010 corn.15 The use of GPS for agricultural applications has steadily increased over time as the technologies have improved and farmers have benefited from adopting them. Because the USDA ERS net benefits are specific to 2010 corn, we surveyed agricultural experts on how these percentages have changed by crop and over time. We then scaled these net benefits (as well as net returns by crop) using the historical adoption rates of PA technology by crop. We used historical data on adoption rates from the ARMS database16 and historical net returns from agricultural census data published by USDA's National Agricultural Statistics Service (NASS). These data were aggregated into a spreadsheet model that included a time series of PA adoption by crop and by year, as well as net returns by crop and by year. Table 5-4 summarizes the data sources used for the analysis.

14 Some sensor-based technologies, such as real-time sense-and-apply systems (e.g., GreenSeeker variable-rate technology), miscellaneous laser technologies, and improvements in aerial spraying either do not require accurate positioning or require much less accuracy than GPS offers. These exceptions would have a marginal effect on our results.
15 These percentages include overhead, labor, and capital costs from investing in GPS technologies.
16 Accessible at https://data.ers.usda.gov/reports.aspx?ID=17883.

Table 5-4. Key Data Sources

Variable | Source | Source of Download | Years
Total area harvested by crop | NASS | https://quickstats.nass.usda.gov/. Used USDA/NASS (n.d.) QuickStats; selected: period: all years; geo level: national; state: U.S. total; commodity: select crop; data item: acres harvested; domain: total | Annually, 1998 to present
PA adoption rates | ARMS | https://data.ers.usda.gov/reports.aspx?ID=17883. Used USDA/ERS (2019b); selected: report: precision agriculture; filter by all survey states; row group: all farms; subject: commodity; from year: earliest year available; subgroup: all farms | Variable years (see the addendum to this section)
Net benefits of using PA technologies, 2010 | USDA/ERS, Commodity Costs and Returns | https://www.ers.usda.gov/data-products/commodity-costs-and-returns/. Accessed USDA/ERS (2019a) "Recent Costs and Returns" for corn, cotton, peanuts, soybeans, rice, and wheat on 6/25/2018 | Annually, 1998 to present
Net benefits of using PA technologies, scaled up to 2017 | Schimmelpfennig (2016) and expert interviews | N/A. Used Schimmelpfennig's net revenues for 2010 and before; scaled to expert feedback for 2017 | N/A

5.3.2 Approach for Quantifying the Potential Impacts of a 30-Day Outage

In the case of a 30-day catastrophic failure of GPS, farming operations could experience significant delays, depending on the timing of the outage, but it would not prevent farmers from planting. Given the high learning costs of an alternative, for this aspect of the analysis we assumed that an alternative system to GPS could not be deployed in sufficient time to take its place. Farming is a seasonal enterprise, so the impact of a GPS failure on the agricultural sector depends on the timing of the failure. A GPS failure during planting season would have a significantly different impact than a failure during the harvesting period or winter months.
Experts in the agricultural industry were unanimous that a 30-day outage of GPS would have severely negative impacts depending on when it occurred, with the most severe impacts occurring during planting season. We asked them how much a GPS outage would affect revenues if it occurred during spring and fall planting and then modeled how those economic impacts might affect the agricultural sector. To model the productivity impacts on the agricultural sector, we multiplied experts' estimates of average impacts for a bundle of crops (corn, soybeans, spring wheat, winter wheat, rice, peanuts, and cotton) for a 30-day outage by a 5-year average of the value of the proportion of those crops where PA was adopted. We calculated value by multiplying the total crop production quantities by the average prices for each crop using NASS data.

Agricultural impact = (Qc × Sc × Ac) × Pc

where
Qc = Five-year production average (2013–2017), by crop
Sc = Mean 30-day yield impact estimates from experts
Ac = Extrapolated PA adoption by crop, 2017 (ARMS data)17
Pc = Five-year price average, by crop

The results give the value of the estimated productivity loss from a 30-day GPS outage for corn, soybeans, spring wheat, rice, peanuts, and cotton, assuming that the outage occurred during the spring planting season, when it would have the biggest impacts.

5.3.3 Expert Interviews

Interviews were critical for obtaining rich descriptive qualitative information on impacts, as well as for validating and/or adapting our methodology, data, and sources for estimating impacts. We also used the interviews to solicit both quantitative and qualitative feedback on the impacts of a 30-day disappearance of GPS. Our data collection was limited in scope because of the wide variety of analyses and data already available on PA technologies and adoption.
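As a worked sketch, the outage-impact formula in Section 5.3.2 can be implemented directly, summed over crops. The crop inputs below are illustrative placeholders, not the NASS production and price data or ARMS adoption rates used in the analysis:

```python
# Sketch of the Section 5.3.2 outage-impact formula, summed over crops:
#   impact = sum_c (Q_c * S_c * A_c) * P_c
# where Q_c is the 5-year average production, S_c the mean expert estimate of
# the 30-day yield impact, A_c the extrapolated PA adoption share, and P_c the
# 5-year average price. All numbers below are placeholders for illustration.
def outage_impact(crops):
    return sum(q * s * a * p for q, s, a, p in crops.values())

crops = {
    # crop: (Q_c bushels, S_c fraction, A_c share, P_c $/bushel) -- illustrative
    "corn": (14e9, 0.17, 0.83, 3.60),
    "soybeans": (4e9, 0.17, 0.73, 9.50),
}
print(f"${outage_impact(crops) / 1e9:.1f} billion")
```

The study applies this calculation across all seven crops in the bundle using the actual NASS and ARMS series.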
We interviewed 22 experts in total, from six universities (largely land-grant universities), 10 private-sector firms, two government agencies, and an advocacy group. These experts included agronomists, agricultural economists, equipment providers, consultants, and technology developers. Informal conversations were also held at the InfoAg conference, a premier event for PA, held in St. Louis, MO, from July 25–27, 2017.

5.4 Retrospective Economic Benefits Analysis

PA experts agreed that GPS has become a critical component of modern agriculture in the United States. They indicated that although PA uses many co-technologies that work together with GPS, GPS was the enabling technology that allowed PA to come into existence at scale. They also believe that it would be appropriate to attribute most of the economic benefits of PA to the availability of GPS. However, respondents disagreed on the extent of those impacts. When presented with ERS's table on the net benefits from 2010 corn, all but one of the respondents felt that the net benefit estimates were lower than they would have expected, too conservative, and likely to have increased significantly since 2010. When asked about present-day net returns, most provided their own estimates of the current net benefits of PA for a bundle of crops (Table 5-5). Experts who worked in the private sector thought that the additional net returns from PA were higher than university researchers or government employees thought.

17 Our analysis used extrapolated values for the highest adoption rates of a PA technology for each crop in 2017 unless the most recent overall PA adoption rate was higher. See also the addendum to this chapter.
Table 5-5. Expert Estimates on Net Returns from Precision Agriculture

Statistic | Guidance Net Return Estimate (N=21) | VRT Net Return Estimate (N=20) | Soil/Yield Mapping Net Return Estimate (N=20)
Mean of expert estimates (standard error) | 5.69% (0.0117) | 4.85% (0.0114) | 5.05% (0.0111)
Minimum of expert estimates | 1.5% | 1.1% | 1.5%
Maximum of expert estimates | 20% | 20% | 20%
Note: Experts were asked to consider corn, soybeans, wheat, cotton, peanuts, and rice.

Experts also believed that although net benefits could vary significantly from one crop to another, the benefits would be higher for higher-value crops and lower for lower-value crops. Because the crops for which USDA collects adoption rates include a mixture of low- and high-value crops, most experts thought that biases from applying the PA net benefits for corn to the other major crops where USDA collects adoption rates (e.g., corn, soybeans, wheat, rice, cotton, peanuts) would be largely offset; therefore, it would be reasonable to apply them similarly across crops in the absence of more specific data. PA experts also described other benefits that are relevant but challenging to quantify. Some cited health impacts such as reduced stress or physical injuries, and many discussed how PA allows farmers to continue working later into their lives and delay retirement. PA allows farmers to apply inputs more quickly and accurately, reducing the health impacts related to the long hours of driving and concentration associated with operating large farm machinery.18 These benefits are likely even greater during a rainy spring planting season when farmers have less time for planting. GPS-enabled PA, and in particular automated guidance systems, allows farmers to operate equipment around the clock and plant more quickly than with traditional technologies, which relieves stress and reduces the risk of missing the planting season window because of extreme or ill-timed precipitation or low temperatures.
Lastly, experts discussed several environmental benefits that PA enables, including using VRT to shut off agrochemical application in environmentally sensitive areas and increasing the efficiency of input use so that less fertilizer and fewer other agrochemicals enter the soil, water, and air. Guidance allows farmers to minimize overlaps where agrochemicals are applied in the same area twice, and VRT allows farmers to increase the percentage of fertilizer that is used by the crop rather than lost to the environment. In particular, nitrogen fertilizer is one of the primary direct contributors to GHGs from the agricultural sector, and VRT is one tool that allows farmers to apply it more efficiently, thereby reducing GHG emissions. Experts believed that although the majority of PA benefits will come from the six crops analyzed in this study, there are some benefits from PA for higher-value horticultural crops. Most experts were reluctant to estimate the potential impacts of PA on these crops because of the differences between crops and the high spatial and temporal variability as it relates to PA. One expert reported that lettuce in California is probably close to 100% adoption: "They started ~20 years ago. Open-air lettuce around major cities on the East Coast has much lower adoption since they started 15 years later. How can you compare lettuce in California (3–5 crops/year) with lettuce in the East with one crop per year? And how can you compare kale with tomatoes, onions with carrots?"

18 The ERS estimates account for the economic impacts of the difference in labor hours and other overhead costs or production differences associated with PA but not for any health impacts, including stress, or the risk-avoidance benefits of planting more quickly during an abnormally rainy planting season.
Others pointed to specific geographies and crops where farmers are gaining significant profits from using PA, such as wine producers in California that use PA to optimize grape harvesting, maximizing the quality of, and selling price for, their wine. One expert argued that the three categories of PA that may be applicable for row crops would not be applicable to horticultural crops: "Let's not ask about Guidance, Variable Rate, and even Yield Mapping. We should be talking about much more rudimentary items—Data Logging, Constant Rate, and Real-Time Tracking—as these are the items that will dramatically increase efficiency and productivity while decreasing input costs and management needs." We estimate that the total additional net returns since 1998 from adopting PA technologies for corn, soybeans, wheat, rice, peanuts, and cotton are $5.8 billion. This estimate takes into account the adoption of yield and soil mapping,19 guidance systems, and VRT (see Tables 5-6 and 5-7). These benefits assume that farmers who applied PA realized the Table 5-3 increase in net benefits from 1998 to 2010 and that net benefits then increased linearly until reaching, in 2017, the net benefit percentages reported in Table 5-5. Guidance systems for corn represent the highest single crop/GPS-assisted PA technology combination ($1.3 billion), and corn represents the highest net benefits overall ($2.9 billion), followed by soybeans ($1.6 billion), spring and winter wheat combined ($927 million),20 cotton ($224 million), rice ($96 million), and peanuts ($60 million).21 By technology, guidance systems led to the highest benefits ($2.9 billion), followed by yield and soil mapping ($1.7 billion) and VRT ($1.2 billion). These totals are lower than previous estimates, but they are net of adoption costs, which is important because PA can be a substantial investment.
These estimates build on a national-level peer-reviewed study on the impacts of PA that implicitly considers the counterfactual because it compares adopters with nonadopters.

19 Yield and soil mapping adoption rates were averaged to conform to the Schimmelpfennig (2016) PA technology categories.
20 Area harvested for spring and winter wheat were pulled separately, but net returns were the same.
21 Example calculation: Net returns for corn = $139.19 per acre. Total acres = 81,446,000. Adoption rate = 45.17%. Additional % returns for guidance systems (2010) = 1.50%. Change in net returns due to guidance systems for 2010 corn = (139.19 × 1.50%) × (45.17% × 81,446,000) = $76.8 million. In cases where farmers' net returns were negative, the increase in net returns was calculated as being less negative.

Table 5-6. Net Returns from Precision Agriculture Technologies, by Crop, Since 1998

Crop | Yield and Soil Mapping ($ million) | Guidance Systems ($ million) | Variable-Rate Technology ($ million) | Total ($ million)
Corn | $973.6 | $1,342.5 | $623.7 | $2,939.7
Cotton | $34.2 | $162.5 | $27.0 | $223.7
Peanuts | $12.9 | $34.6 | $12.3 | $59.7
Rice | $19.1 | $57.3 | $20.0 | $96.4
Soybeans | $508.6 | $734.2 | $340.9 | $1,583.7
Spring wheat | $59.1 | $215.8 | $53.9 | $328.8
Winter wheat | $77.2 | $364.2 | $156.0 | $598.3

Table 5-7.
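The example calculation in footnote 21 can be reproduced directly; the figures below are those given in the footnote:

```python
# Footnote 21: change in net returns for 2010 corn from guidance systems.
net_returns_per_acre = 139.19   # $/acre, 2010 corn
total_acres = 81_446_000        # total corn acres
adoption_rate = 0.4517          # guidance-system adoption share (45.17%)
additional_return = 0.015       # 1.50% additional net returns from guidance

# (net returns x additional %) applied to the acres where guidance was adopted.
delta = (net_returns_per_acre * additional_return) * (adoption_rate * total_acres)
print(f"${delta / 1e6:.1f} million")  # -> $76.8 million, matching the footnote
```

The same per-crop, per-technology, per-year calculation underlies the totals in Tables 5-6 and 5-7.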
Net Returns from Precision Agriculture Technologies, by Year

Year | Yield and Soil Mapping ($ million) | Guidance Systems ($ million) | Variable-Rate Technology ($ million) | Total ($ million)
1998 | $18.4 | — | $8.9 | $27.3
1999 | $41.7 | — | $19.2 | $60.9
2000 | $46.3 | — | $18.6 | $64.9
2001 | $35.8 | $7.4 | $13.4 | $56.7
2002 | $13.7 | $3.1 | $6.4 | $23.1
2003 | $12.0 | $5.4 | $6.0 | $23.4
2004 | $6.9 | $9.1 | $3.9 | $19.9
2005 | $34.1 | $36.5 | $16.6 | $87.2
2006 | $25.2 | $39.0 | $13.1 | $77.4
2007 | $23.0 | $31.1 | $10.4 | $64.5
2008 | $70.5 | $104.9 | $34.4 | $209.8
2009 | $31.2 | $56.9 | $16.4 | $104.4
2010 | $84.7 | $124.4 | $43.1 | $252.3
2011 | $190.6 | $285.9 | $112.5 | $588.9
2012 | $216.9 | $369.6 | $149.8 | $736.3
2013 | $131.4 | $246.1 | $102.3 | $479.8
2014 | $172.7 | $368.5 | $146.1 | $687.4
2015 | $148.4 | $351.2 | $143.1 | $642.7
2016 | $214.3 | $475.5 | $204.1 | $893.9
2017 | $166.9 | $396.6 | $166.3 | $729.8
Total | $1,684.7 | $2,911.2 | $1,234.6 | $5,830.4

5.5 Potential Impacts of a 30-Day GPS Outage

The impact of a 30-day outage in the agricultural sector is highly dependent on the time of year at which it occurs. PA experts mentioned different potential impacts at different points in the year. All agreed that the most damaging impacts would occur during planting season because farmers could be so delayed that they could potentially miss the planting window or plant at a suboptimal time, causing significant yield losses. Loss of VRT would affect the ability to apply fertilizer, and planting speeds would have to decrease, causing further delays. Impacts at other times of the year might affect agrochemical applications or the data collected by yield monitors during the harvest, but these impacts would be much smaller than those during planting. Experts also agreed that a 30-day GPS outage during the planting season would be highly damaging or "devastating"; however, most agreed that farmers are quite independent and capable and would eventually figure out how to plant, even if it meant a yield loss and additional input costs.
Experts estimated revenue losses if GPS were to shut off during the planting season. On average, experts estimated revenue losses of 17%, with a ±6% margin of error, across corn, soybeans, wheat, rice, peanuts, and cotton (Table 5-8). Many large growers that have adopted GPS-assisted PA technologies have adopted larger equipment that is harder to manage without GPS. Experts discussed how many of the planters are not equipped with markers and are too big for drivers to easily track their rows. In the event of a 30-day outage, it would be very challenging for farmers to retrofit their equipment, and many operators do not have recent experience farming this way, causing numerous overlaps, skips, and over- or under-applications of inputs. Without VRT, farmers would have to return to single-rate application of seeds, fertilizer, and other inputs, increasing operational costs and leading to lower yields. For many places and crops, there is only a 15-day planting window, and if a farmer plants outside that window, yields will not be optimal. In terms of quantity, one expert estimated that farmers lose a bushel per acre when they plant 1 day outside the planting window.

Table 5-8. Estimated Yield Impacts from Unexpected 30-Day GPS Outage

Statistic | Spring Outage Yield Loss (N=22) | Fall Outage Yield Loss (N=18)
Mean of expert estimates (standard deviation) | 17% (0.14) | 10% (0.07)
Minimum of expert estimates | 5% | 0%
Maximum of expert estimates | 50% | 25%
Note: Estimates are across corn, soybeans, wheat, rice, peanuts, and cotton.

PA technologies are a nascent industry that depends on building consumer confidence, so it is likely that a 30-day failure would have a negative effect beyond the 30-day window because farmers would be skeptical of adopting GPS-based systems in the future and would have to invest heavily in backup systems.
Taking the 17% average loss estimated by experts and applying it to the average adoption rates for the six major crops to which PA is applied indicates that farmers would lose an average of $15.1 billion in revenue if the outage occurred during the spring planting season (see Table 5-9). The greatest impacts would be for corn and soybeans ($8.5 billion and $5.1 billion, respectively), because they are the highest-value crops in the United States, followed by spring wheat, cotton, rice, and peanuts. These figures assume that the impacts are all related to productivity losses, when in fact revenue losses would most likely come from both lost inputs (e.g., seeds, fertilizer, labor) and losses in yields. They also mask the fact that some farmers might lose their entire crop for the year. Furthermore, unlike some other industries, which might be able to make up some lost productivity over time, agriculture is a seasonal enterprise, so any yields (and associated revenue) lost from the agricultural sector could not be recovered through increased productivity in subsequent months.

Table 5-9. Revenue Loss If GPS Failed during the Spring Planting Season

Crop          PA Applied (A)  GPS Outage Yield Shock (B)  Overall Yield Shock (A × B = C)  Avg. Annual Value (D) ($B)  Shock Value (C × D = E) ($B)
Corn          83%             17%                         14%                              $60.6                       $8.49
Cotton        55%             17%                         9%                               $5.52                       $0.50
Peanuts       69%             17%                         12%                              $1.35                       $0.16
Rice          75%             17%                         13%                              $2.89                       $0.38
Soybeans      73%             17%                         12%                              $42.5                       $5.12
Spring wheat  83%             17%                         14%                              $3.52                       $0.49
Total                                                                                      $116                        $15.1

Note: Avg. Annual Value is the 5-year average of production revenues.

5.6 Notes on Technology Transfer

Federal and local governments have supported the transfer of GPS technology to the agricultural sector in several ways. The USDA's Agricultural Research Service (ARS) Office of Technology Transfer manages USDA's technology transfer activities with an explicit focus on moving USDA's agricultural research into the marketplace. This work includes administering patent and licensing information for all intramural research conducted by USDA.

One of the public-private partnerships documented during interviews with stakeholders and in the literature was a successful research effort between USDA/ARS, the University of Missouri, and DuPont Pioneer to develop the concept of Environmental Response Units (ERUs), which help farmers "optimize their seed and fertilizer inputs to match production potential within fields" (Bobryk et al., 2016; USDA/ARS, 2016). According to the USDA/ARS Annual Report on Technology Transfer,

Soil classification with ERU soil maps better delineates soil and landscape characteristics within fields and can better guide the use of precision agriculture variable-rate technologies. Farmers can use these findings to optimize seed and fertilizer inputs that match production potential within fields. Matching input applications to a better-characterized soil resource improves the cost-effectiveness of agricultural production and minimizes field losses of agrichemicals, which furthers production sustainability and natural resource protection. (USDA/ARS, 2016, p. 295)

Currently, USDA/ARS is also using remote sensing technologies (which require GPS) to monitor crop production and provide actionable information to farmers. One example is research that enables grape producers to monitor water stress and optimize precision irrigation for their vineyards (USDA/ARS, 2016). USDA has also supported technology transfer through technical assistance and conservation incentive funding through the Natural Resources Conservation Service (NRCS) Environmental Quality Incentives Program (EQIP).
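The arithmetic behind Table 5-9 can be reproduced directly: each crop's overall yield shock is its PA adoption rate times the 17% outage shock (rounded to whole percentages, as in the table), and the shock value applies that rounded shock to average annual revenue. A sketch using the figures from the table, which reproduces the ~$15.1 billion total to within rounding:

```python
# (PA adoption rate, 5-year avg annual revenue in $ billions) from Table 5-9
crops = {
    "Corn":         (0.83, 60.6),
    "Cotton":       (0.55, 5.52),
    "Peanuts":      (0.69, 1.35),
    "Rice":         (0.75, 2.89),
    "Soybeans":     (0.73, 42.5),
    "Spring wheat": (0.83, 3.52),
}
shock = 0.17  # mean expert estimate of yield loss

total = 0.0
for crop, (adoption, revenue) in crops.items():
    overall = round(adoption * shock, 2)  # A x B = C, rounded as in the table
    loss = overall * revenue              # C x D = E
    total += loss
    print(f"{crop:12s} shock {overall:.0%}  loss ${loss:.2f}B")

print(f"Total: ${total:.1f}B")  # about $15.1 billion
```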
EQIP offers this support in two areas related to precision farming: nutrient management and pest management. EQIP provides technical assistance and financial incentives for variable-rate application of fertilizer and for GPS-enabled guidance systems for pesticide application. Through this program, local NRCS conservationists provide technical assistance directly to farmers on how to implement these technologies on their farms. No publicly available data show exactly how much technical assistance and funding are provided for precision farming applications, but data show that, of the dozens of conservation practices supported by the program, the two most widely supported by USDA/NRCS are nutrient management and pest management (USDA, n.d.). The number of supported acres for these practices has steadily decreased over time while PA adoption has increased (nutrient management decreased from 1.8 million acres in 2009 to 642,000 acres in 2017), perhaps suggesting that farmers have increasingly adopted these technologies on their own, obviating the need for additional incentive payments. Although less direct, USDA and many state departments of agriculture provide salary support for university-based agricultural extensionists at land grant universities. Agricultural extension workers provide direct support to farmers to translate technology and research into practical applications, including PA. Extensionists have close contact and relationships with farmers, in part through farmer workshops, farmer field days, and farm-focused research. As researchers, extensionists strive to provide science-based, unbiased information, and they are uniquely trusted by farmers.
PA involves intensive research on expensive and complex technologies, so this kind of close linkage between university researchers and farmers has been important for the successful transfer of GPS-based technologies, particularly for small- and medium-scale farmers who are less able to take risks on new, sometimes expensive technologies. PA experts pointed to other examples of federally funded research and infrastructure that have supported GPS technology transfer to the agricultural sector:

- Federally supported infrastructure to test new PA equipment. Multiple stakeholders pointed to the Nebraska Tractor Test Laboratory (https://tractortestlab.unl.edu/), which allows companies to evaluate the performance of new equipment. This service is fee-based, but the infrastructure and capital costs were publicly funded. The lab has been designated by the U.S. Department of Commerce as the Designated Authority responsible for the U.S. tractor test program.
- The role of the National Science Foundation (NSF) in funding university programs and curriculum development for degree programs in geospatial technologies in the agricultural sector (particularly in the Midwest).

5.7 Concluding Remarks

GPS enabled the development of PA at scale and has become an integral part of farming in the United States. Almost every new tractor and combine on the market today comes equipped with GPS technology, and precision farming is now widespread. Large commercial farmers rely heavily on GPS to manage their inputs more precisely and increase their yields. The pervasiveness of PA has produced significant monetary benefits for farmers over time, including $5.8 billion in net revenues for six major crops from 1998 through 2017. These benefits have accrued over time and represent only a portion of the many benefits that PA has provided to farmers, consumers, and the environment.
Farmers who use PA can plant more quickly and avoid the health and stress impacts associated with operating heavy machinery for long hours. Consumers benefit from lower food prices, and the environment benefits as farmers limit the amount of agrochemicals lost to air and water sources. However, many farmers have also become reliant on GPS technology and are vulnerable to a disruption or outage of GPS. It would be difficult for these farmers to revert to more traditional farming techniques in a short period of time. Farmers rely so heavily on these technologies that the entire value GPS has added to the industry could be wiped out if an outage were to occur during the planting season: we estimate that the agricultural industry could lose $15.1 billion across corn, soybeans, spring wheat, cotton, rice, and peanuts if such an ill-timed outage were to occur. Both the monetary benefits of GPS to agriculture and the potential losses in the case of an outage are likely significantly larger than these estimates because of the role that GPS technology and PA have played in supporting additional crops and farming logistics in recent years. The agriculture industry is generally increasing its adoption of PA, and several experts discussed how they expect GPS to be used more in the future; several pointed toward the potential for better managing logistics and enabling the deployment of autonomous vehicles. Many horticultural producers have adopted GPS-assisted PA to improve their efficiency, and others have used GPS to better manage their logistics and operations. The adoption rates and benefits in agriculture are very crop-, use-, and geography-specific; as such, they are not well suited to generalizable monetization. Nonetheless, USDA has collected enough data through ARMS and other sources to allow us to estimate the additional benefits that PA provides to several major crops in the United States.
These benefits are significant and are likely to increase in the future as a result of the growing PA industry, growing adoption, and expansion of GPS-assisted PA into additional uses.

Addendum. Historical Adoption of Precision Agriculture Technologies by Crop (Percentage of Crop Planted Acres)
Historical Adoption of Precision Agriculture Technologies by Crop (Percentage of Crop Planted Acres) (continued)

Variable-rate Application Technology

Crop          1998 1999 2000 2001 2002 2003 2004 2005 2006 2007 2008 2009 2010 2011 2012 2013 2014 2015 2016 2017
Corn             8   12   12    9   10   11   11   12   14   16   18   20   22   25   27   29   31   33   36   38
Cotton           3    4    5    6    6    6    6    6    6    5    5    5    5    5    5    5    5    5    5    5
Peanuts          -    1    2    2    3    3    4    5    7    8   10   11   13   15   16   18   19   21   23   24
Rice             -    -    -    -    -    -    -    -    6    8   10   12   15   17   19   21   23   25   27   29
Soybeans         7    9    7    6    5    6    7    7    8   10   13   15   17   20   22   24   26   29   31   33
Spring wheat     5    3    0    2    5    7    9   10   11   12   13   14   15   16   17   18   19   20   21   22
Winter wheat     2    2    3    4    5    7    8    9   10   11   13   14   16   18   20   23   25   27   30   32

Note: Reported data are from the Agricultural Resource Management Survey (ARMS) and from Schimmelpfennig (2016), shown in blue; values shown in green were capped because linear extrapolation was inappropriate; extrapolated values are shown in black. ARMS data can be found here: https://data.ers.usda.gov/reports.aspx?ID=17883.

6. Electricity Sector

The electricity sector uses the precision timing provided by GPS to synchronize electrical waves in the power grid and to detect potential problems and faults in the transmission infrastructure. GPS has been a key factor in making phasor measurement units (PMUs) cost-effective and pervasive in the United States' electricity infrastructure. In the absence of GPS, the electric utility system would likely have continued to rely on its existing supervisory control and data acquisition (SCADA) systems. However, the use of PMUs enabled by GPS has led to a slight (1 to 2%) decrease in the probability and duration of outages and enhanced generation testing and modeling, resulting in economic benefits of approximately $15.7 billion since 2010. In the event of a 30-day outage of GPS today, a major disruption of the electrical system is unlikely because of safeguards and contingency plans in place. However, the probability of outages might increase.
In addition, faults occurring from natural or non-natural events would take longer to identify and repair, increasing the duration of outages. The economic loss from a 30-day GPS outage with little to no physical damage to the system is estimated to be approximately $274.8 million.

6.1 Sector Applications

Electricity suppliers can use the precision timing provided by GPS to monitor the daily operations of the power grid down to the nanosecond. This monitoring is conducted by PMUs, which evaluate electrical waves to detect potential problems and faults in the power distribution infrastructure. To realize this real-time monitoring and analysis, a large number of synchronized PMUs, or synchrophasors, are linked to a common time source that enables them to time-stamp the dynamics of the electrical system. That time source is Coordinated Universal Time (UTC), which is provided via the GPS system (Coppolino, D'Antonio, Elia, & Romano, 2011). Historically, power systems using SCADA have depended on its relative time clock features to perform daily monitoring operations. Relative time refers to the timing of triggering events, such as a fault or lightning strike, relative to their starting points; that is, the zero point is precisely when the event in question occurred (North American Synchrophasor Initiative [NASPI], 2017). This timing capability allows SCADA's scan time (the frequency of data collection) to range from 1 to 2 seconds up to 10 seconds or more (NASPI, 2017). These scan times work well for localized events on small, decentralized systems, such as a single generating source supplying a stand-alone grid, but they do not work well across a wide-scale interconnected grid. For many years, it was recognized that the power system data captured by different SCADA systems could be far more valuable if the systems all used a common time standard to "time tag" their measurements.
Beginning in the late 2000s, GPS-based PMUs began to be installed in the electrical system to augment the SCADA-based systems for state estimation. Because PMUs collect data at a much higher sampling rate than SCADA systems, the granularity of the data helps reveal new information about dynamic stability events on the grid. By 2015, the number of installed PMUs had reached approximately 1,800, offering nearly 100% coverage of the transmission system (NASPI, 2017). Figure 6-1 shows the current locations of PMUs on the electricity grid. Although the SCADA system is still the backbone of most system applications in the power sector, with GPS supporting ancillary (nonessential) system operations, this is evolving over time, and PMUs are likely to become increasingly integrated into system operations, resulting in efficiency and reliability gains. Synchrophasor technology uses absolute time to gauge the state of the system. This timing approach both time-synchronizes and time-stamps data against UTC, or local time, available through GPS (NASPI, 2017). As a result, synchrophasors make possible the monitoring of the electrical grid at 30 to 120 time-tagged samples per second, approximately 100 times faster than SCADA (NASPI, 2017). When data will be used to provide automatic control actions, it is imperative that timing remain as accurate, secure, and reliable as possible (NASPI, 2017). Given that these data must possess an absolute time precision of 1 μs (NASPI, 2017), highly accurate real-time monitoring of power line dynamics is attainable only by using GPS UTC time stamps.

Figure 6-1. Phasor Measurement Units in the North American Power Grid (Source: NASPI, 2015b)

Today's wide-area grid is highly interconnected. PMUs using GPS have helped make this possible while maintaining resilience.
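The core computation a PMU performs, turning time-stamped voltage samples into a synchrophasor (a magnitude and an absolute phase angle referenced to UTC), can be sketched with a one-cycle DFT at the fundamental frequency. This is an illustrative reconstruction, not a specific vendor's algorithm; the sampling rate and test-signal parameters are assumptions:

```python
import math, cmath

f0, fs = 60.0, 1440.0                 # nominal grid frequency; assumed sampling rate
N = int(fs / f0)                      # samples per cycle (24)

# Assumed test signal: 120 V peak at 60 Hz with a 30-degree phase angle
amp, phase = 120.0, math.radians(30)
samples = [amp * math.cos(2 * math.pi * f0 * n / fs + phase) for n in range(N)]

# One-cycle DFT at the fundamental; the 2/N factor recovers peak magnitude
ph = (2 / N) * sum(samples[n] * cmath.exp(-2j * math.pi * n / N) for n in range(N))
print(f"|V| = {abs(ph):.1f} V, angle = {math.degrees(cmath.phase(ph)):.1f} deg")
# |V| = 120.0 V, angle = 30.0 deg
```

Because every PMU time-stamps its samples against the same GPS-disciplined UTC reference, phase angles computed this way at distant substations are directly comparable, which is what makes wide-area monitoring possible.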
For example, analyses of event observations have shown that during a generator trip, frequency is reduced proportionally. This reduction, monitored at one point, quickly spreads across the transmission lines, showing up at other sites with a certain delay. With the synchrophasors' ability to evaluate propagating generator trips, the exact location of the event and the resulting power imbalance can be identified, and the necessary countermeasures can be taken. This capability allows for the forecasting of serious events such as blackouts and lets remote power suppliers ready stored energy sources (Coppolino, D'Antonio, Elia, & Romano, 2011).

6.2 Methodological Notes

Our approach to assessing the economic impacts was to begin by identifying the sector's precision timing and synchronization needs, determining which are currently being provided by GPS, and determining what alternative precision timing systems are, or could be, available. We conducted initial scoping interviews to identify preliminary hypotheses regarding the counterfactual scenarios and the potential technical impacts. The preliminary assessment was then verified and refined through a more extensive set of interviews with industry experts.

6.2.1 Precision Timing Needs

Table 6-1 summarizes the precision timing needs and applications for the electricity sector. Applications range from nanosecond-level needs for traveling-wave fault detection through millisecond-level needs to less demanding uses. Event reconstruction and system time/frequency are the applications with the greatest precision timing needs. The precision timing needs for different applications drive the benefits associated with GPS. For example, time-of-use billing requires time stamps, but the level of precision needed is minimal. In contrast, fault detection requires extremely accurate time stamps because electricity flows at close to the speed of light.
Table 6-1. Electricity Sector Precision Timing Uses and Needs

Event reconstruction — precision needed: 1 ms
  Benefits: Accurate time tags greatly speed up event reconstruction, helping to prevent future events
  Counterfactual: Manual time stamping and longer event reconstruction time
  Technical impact metric: Frequency, magnitude, and duration of blackouts
  Economic value metric: Economic impact of outages
  Potential magnitude of impacts: High

Phasor measurements — precision needed: 5–6 μs
  Benefits: Monitors grid instability and increases grid efficiency
  Counterfactual: Less efficient grid system
  Technical impact metric: More efficient dispatch and reduced transmission losses
  Economic value metric: Fuel and increased capacity requirements
  Potential magnitude of impacts: High

System time and frequency — precision needed: 5–50 ms
  Benefits: Line frequency is used by end users as a time standard (clocks in appliances)
  Counterfactual: Less accurate clocks; not an issue for appliances but would affect other applications
  Economic value metric: Increased cost for some applications needing time standards
  Potential magnitude of impacts: Low

Billing and power quality incentives — precision needed: 50 ms (billing); 1 ms (power quality harmonics)
  Benefits: Customers typically monitor themselves, and utility bill estimates need to match; thus, accurate time is key
  Counterfactual: Less reliable M&V for harmonics incentive programs
  Technical impact metric: Impacts due to incentive program (partial attribution)
  Economic value metric: Value of load shifting and improved power quality
  Potential magnitude of impacts: Low

Traveling-wave fault detection — precision needed: 0.1 μs
  Benefits: More precisely locate the point of fault
  Counterfactual: Longer ground-based search time
  Technical impact metric: Speed time to identifying and fixing faults on large transmission lines (300– 500 M)
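Table 6-1's 0.1 μs requirement for traveling-wave fault detection follows from the physics: fault-generated waves travel at nearly the speed of light, so each 0.1 μs of timing error shifts the location estimate by roughly 15 m. A sketch of the standard two-ended traveling-wave calculation (the line length, fault position, and propagation speed here are assumptions for illustration):

```python
C = 299_792_458.0        # speed of light, m/s
v = 0.98 * C             # assumed wave propagation speed on the line
L = 300_000.0            # assumed 300 km transmission line

x_true = 80_000.0        # assumed fault location, 80 km from terminal A
t_a = x_true / v         # wave arrival time at terminal A
t_b = (L - x_true) / v   # wave arrival time at terminal B

# Two-ended traveling-wave location: x = (L + v*(tA - tB)) / 2
x_est = (L + v * (t_a - t_b)) / 2
print(f"estimated fault location: {x_est / 1000:.1f} km from terminal A")

# Sensitivity: a 0.1 us time-tag error moves the estimate by v*dt/2
err = v * 0.1e-6 / 2
print(f"location error per 0.1 us of timing error: {err:.1f} m")  # ~14.7 m
```

Without GPS-synchronized time tags at both line terminals, the arrival-time difference cannot be measured to this precision, and crews face the longer ground-based search noted in the table.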
6.2.2 Approach for Quantifying Retrospective Benefits

The most likely counterfactual scenario in which GPS was not made available for civilian use is that the SCADA system would have continued to be used to meet timing needs in the electricity sector. Alternatively, the sector might have migrated to a Loran-based system (had it been expanded) for some timing needs. In either case, the decrease in precision would have inhibited many of the current applications that rely on GPS. During our interviews with industry experts, we investigated which system (or combination of systems) would have been more likely to meet timing needs and the potential impact of using that system. Even though SCADA technology has evolved significantly since its inception and is currently capable of providing reliable measurements and results, its readings are not as accurate as those obtained through GPS (North American Electric Reliability Corporation [NERC], 2012) because the measurements that SCADA typically evaluates are not time-aligned and therefore cannot display real-time changes and angle evaluations (Coppolino, D'Antonio, Elia, & Romano, 2011). Highly accurate real-time monitoring of power line dynamics is attainable only by using GPS's UTC time stamps. The impact would be similar when comparing GPS with an expanded Loran system. Thus, the analytical focus is on the difference in precision timing between GPS and the SCADA system or an expanded Loran system. The primary technical impacts associated with the decrease in precision timing are a decreased ability to monitor system status and hence a slight increase in the probability of an event/outage; increased time to identify, trace, and mitigate or correct faults, leading to longer outages; increased cost and downtime for generation model verification; a potential increase in transmission line loss; and less interconnectivity and hence a less efficient grid system.
Once the technical impact metrics are validated, the next step is to estimate the associated economic impacts. In general, these economic impacts fall into several major categories:

- lost economic activity due to disruptions in power supply or quality;
- impacts on household welfare in terms of inconvenience or health/injury;
- electricity system costs, including
  - additional costs of operating or switching to a SCADA or Loran system;
  - additional costs associated with identifying, tracing, and fixing system faults;
  - increased generation costs associated with increased line loss; and
  - increased system costs of ensuring reliability and resilience.

The impacts described above are the technical (system) impacts resulting from the loss or unavailability of GPS. The next step is to value these technical impacts to estimate the economic impacts on service providers and their customers. We estimated economic impacts by valuing/monetizing changes in the cost of service, in terms of increased expenditures on labor, capital, and fuel, and in the quality of service, in terms of the cost of increased outages to customers. In most instances, we used the expert interviews to identify and verify technical impacts and then the published literature to calculate the economic impacts. We estimated a time series of impacts to capture the share of the national electricity grid system using GPS as it was adopted over time. This time series is based on the penetration/installation of PMUs and their pervasiveness throughout the electricity system. The literature suggests that PMUs were initially integrated into the East Coast grid system starting in 2010 and were being used throughout the entire system by 2014 to 2015.
As noted below, although PMUs have not replaced the SCADA systems for real-time operation, they do provide ancillary benefits that are valued, and it is these benefits that are scaled to the national level.

6.2.3 Approach for Quantifying the Potential Impacts of a 30-Day GPS Outage

Most published studies conclude that widespread grid failures are not to be expected from a major disruption of the GPS signal (NERC, 2012). The electrical system is highly distributed, and the existing SCADA system could be engaged quickly to serve as an adequate backup system for any GPS-supported functions. This capability reduces the likelihood that a large-magnitude event such as widespread cascading outages would occur. However, the loss of GPS would affect system monitoring operations and effectiveness, leading to a slightly increased probability of adverse events. The impact would be similar to the retrospective scenario (but more short-lived) and would lead to the following impacts over the 30-day GPS outage period: increased time to identify, trace, and mitigate/correct faults; increased probability and duration of small-scale outages; and increased probability (albeit low) of large-scale blackouts. Long-term impacts such as infrastructure damage are unlikely.

6.2.4 Interviews with Sector-Specific GPS Experts

We interviewed industry experts with a range of expertise and perspectives. These groups, along with key topic areas, are summarized in Table 6-2. Electricity sector interviewees were identified through publications, conferences, workshop speaker lists, and referrals. We identified and contacted 55 experts who specialize in the role of PNT in the electricity sector, and 23 (41%) agreed to participate in the interview process. The largest group interviewed comprised utility and system operators because they were able to provide information on how PMUs are actually being used (as opposed to conceptual benefits).
Potential Impact of a 30-Day GPS Outage

Sector                    Sector-Specific Analytical Focus                                                                         Potential Losses ($ million)
Electricity               Electrical system reliability and efficiency                                                             $275
Finance                   High-frequency trading                                                                                   Negligible
Location-based services   Smartphone apps and consumer devices that use location services to deliver services and experiences      $2,859
Mining                    Efficiency gains, cost reductions, and increased accuracy                                                $949
Maritime                  Navigation, port operations, fishing, and recreational boating                                           $10,411
Oil and gas               Positioning for offshore drilling and exploration                                                        $1,520
Surveying                 Productivity gains, cost reductions, and increased accuracy in professional surveying                    $331
Telecommunications        Improved reliability and bandwidth utilization for wireless networks                                     $9,816
Telematics                Efficiency gains, cost reductions, and environmental benefits through improved vehicle dispatch and navigation  $4,137
Total, Excluding Ag.      If the outage were not to occur during critical planting seasons                                         $30,298
Agriculture               Precision agriculture technologies and practices                                                         $15,122
Total, Including Ag.      If the outage were to occur during critical planting seasons                                             $45,420

Note: The range of potential losses is $16 to $35 billion, before accounting for losses of about $15 billion if a 30-day outage were to occur during critical planting seasons for U.S. farmers.

The maritime sector had technologies and systems available that complemented mariners' skills. GPS's availability meant that the Loran system was no longer necessary, and the signal was turned off. This means that although the historical benefits relative to technology alternatives are negligible, more than $10 billion could be lost over 30 days if GPS were lost today. This loss estimate underscores the critical role GPS has come to play in economic activity.
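The table's two totals can be verified directly by summing the sector estimates:

```python
# Sector loss estimates ($ million) from the table above
losses = {
    "Electricity": 275, "Finance": 0, "Location-based services": 2_859,
    "Mining": 949, "Maritime": 10_411, "Oil and gas": 1_520,
    "Surveying": 331, "Telecommunications": 9_816, "Telematics": 4_137,
}
excl_ag = sum(losses.values())
incl_ag = excl_ag + 15_122            # add agriculture if the outage hits planting season
print(excl_ag, incl_ag)               # 30298 45420
```

Finance is entered as zero here because its potential losses are reported as negligible rather than as a dollar figure.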
14.3 Final Observations

The comprehensive costs of GPS are difficult to characterize because of the system's emergence from multiple R&D programs over seven decades. Since 2010, expenditures have averaged roughly $1.3 billion per year (2017$). This estimate includes both defense and civilian development, procurement, and operations. The long history of investments, the commingling of defense and nondefense funding to sustain and operate the system, and the large number of laboratories and agencies involved make estimating a benefit-to-cost ratio (or another return-on-investment measure) difficult. The Department of Transportation receives appropriations for GPS's civilian use case, but most funding for GPS is provided by Congress to the Air Force. One could compare GPS's comprehensive costs to only its private-sector benefits from 2010 through 2017. This produces a benefit-to-cost ratio of about 100 to 1. Because the civilian-use portion is a fraction of total expenditure, the ratio is likely an underestimate. If one assumes that 25% of GPS expenditures were related to civilian use cases, the ratio is about 400 to 1; if one assumes 40%, the ratio is 250 to 1. If one were to assume instead that spending on GPS has been more or less constant since President Reagan first permitted civilian use, the ratio is closer to 10 to 1. This does not mean that other investments or programs will have such a high impact; it is simply what we observe within an 8-year window in the 2010s from a program launched in 1973 that itself has roots in programs from the 1960s. It would be a mistake to assume that a comparable investment would achieve these results. The math is less important than the outcome: making GPS available for private-sector use was a good idea.
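The sensitivity of the benefit-to-cost ratio to the civilian cost share reduces to simple division: attributing only a fraction s of total expenditures to civilian use scales the baseline ratio by 1/s. A sketch of the figures quoted above:

```python
base_ratio = 100.0   # benefits vs. total 2010-2017 GPS expenditures

# Attributing only a civilian share s of costs scales the ratio by 1/s
for share in (1.00, 0.40, 0.25):
    print(f"civilian share {share:.0%}: about {base_ratio / share:.0f} to 1")
# civilian share 100%: about 100 to 1
# civilian share 40%: about 250 to 1
# civilian share 25%: about 400 to 1
```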