
AFRL-RB-WP-TR-2008-3163 



AUTONOMOUS APPROACH AND LANDING 
CAPABILITY (AALC) DEMONSTRATION 

Delivery Order 0018: Opportune Landing Site (OLS) Software Field 
Demonstration and Validation of Capability to Identify Landing 
Sites and Low Incidence of False Positives 


Carol Ventresca, Victoria M. Althoff, Kenneth R. Eizenga, and Capt Justin R. Rufa 
SynGenics Corporation 


SEPTEMBER 2008 
Interim Report 


Approved for public release; distribution unlimited. 

See additional restrictions described on inside pages 


STINFO COPY 


AIR FORCE RESEARCH LABORATORY 
AIR VEHICLES DIRECTORATE 
WRIGHT-PATTERSON AIR FORCE BASE, OH 45433-7542 
AIR FORCE MATERIEL COMMAND 
UNITED STATES AIR FORCE 







NOTICE AND SIGNATURE PAGE 


Using Government drawings, specifications, or other data included in this document for any 
purpose other than Government procurement does not in any way obligate the U.S. Government. 

The fact that the Government formulated or supplied the drawings, specifications, or other data 
does not license the holder or any other person or corporation; or convey any rights or permission to 
manufacture, use, or sell any patented invention that may relate to them. 

This report was cleared for public release by the USAF 88th Air Base Wing (88 ABW) Public Affairs Office 
(PAO) and is available to the general public, including foreign nationals. Copies may be obtained from the 
Defense Technical Information Center (DTIC) (http://www.dtic.mil). 

AFRL-RB-WP-TR-2008-3163 HAS BEEN REVIEWED AND IS APPROVED FOR PUBLICATION IN 
ACCORDANCE WITH ASSIGNED DISTRIBUTION STATEMENT. 


//signature//


//signature// 


JUSTIN R. RUFA, Captain, USAF 
Technical Monitor 
Control Systems Development and 
Applications Branch 


DANIEL B. THOMPSON 
Technical Advisor 
Control Systems Development and 
Applications Branch 


//signature// 


JEFFREY C. TROMP 
Senior Technical Advisor 
Control Systems Development and 
Applications Branch 


This report is published in the interest of scientific and technical information exchange, and its 
publication does not constitute the Government’s approval or disapproval of its ideas or findings. 



REPORT DOCUMENTATION PAGE 


Form Approved 
OMB No. 0704-0188 


The public reporting burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of 
information, including suggestions for reducing this burden, to Department of Defense, Washington Headquarters Services, Directorate for Information Operations and Reports (0704-0188), 1215 Jefferson 
Davis Highway, Suite 1204, Arlington, VA 22202-4302. Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to any penalty for failing to comply with a 
collection of information if it does not display a currently valid OMB control number. PLEASE DO NOT RETURN YOUR FORM TO THE ABOVE ADDRESS. 



Standard Form 298 (Rev. 8-98) 

Prescribed by ANSI Std. Z39-18 














Table of Contents 


Section Page 

List of Figures.iv 

List of Tables.vii 

1. Executive Summary.1 

2. Introduction.2 

2.1 Summary of Approach.2 

2.2 Identifying Candidate OLSs.3 

3. Methods, Assumptions, and Procedures.4 

4. Results and Discussion.6 

Visit 1.7 

Visit 2.8 

Visit 3.9 

Visit 4.10 

Visit 5.11 

Visit 6.12 

Visit 7.14 

Visit 8.15 

Visit 9.19 

Visit 10.23 

Visit 11.26 

Visit 12.28 

Visit 13.29 

Visit 14.31 

Visit 15.32 

Visit 16.33 

Visit 17.34 

5. Conclusions.36 

List of Acronyms, Abbreviations, and Symbols.37 































List of Figures 

Figure 1. Demonstration Area within St. Clair County.3 

Figure 2. B1 Shown on Map.7 

Figure 3. B1.7 

Figure 4. B1 Another view.8 

Figure 5. B1 Software Output.8 

Figure 6. B2 Shown on Map.8 

Figure 7. B2 (a) .8 

Figure 8. B2 (b) Another View.8 

Figure 9. B2 (c) From Another Direction.8 

Figure 10. B2 (d) Acceptable Approach.9 

Figure 11. B2 (e).9 

Figure 12. B2 (f).9 

Figure 13. B2 Software Output.9 

Figure 14. B3 Shown on Map.9 

Figure 15. B3.9 

Figure 16. B3 Another View.10 

Figure 17. Orthophotoquad of B3.10 

Figure 18. B3. Software Output.10 

Figure 19. B1 and J15 As Shown on Map.10 

Figure 20. J15.10 

Figure 21. Orthophotoquad of J15.11 

Figure 22. J15 Software Output.11 

Figure 23. JX1 Shown on Map.11 

Figure 24. JX1.11 

Figure 25. JX1 From Another Direction.11 

Figure 26. JX1 Another View.11 

Figure 27. J8 and JX2.12 

Figure 28. J8.12 

Figure 29. J8 A Closer View.12 

Figure 30. J8 and JX2.12 

Figure 31. JX2 (a).13 

Figure 32. JX2 (b) .13 

Figure 33. JX2 (c).13 

Figure 34. JX2 Repeat of Orthophotoquad.13 

Figure 35. JX2 (d).13 

Figure 36. J8.13 

Figure 37. Map Showing Location of J7.14 

Figure 38. J7.14 

Figure 39. J7 Another View.14 

Figure 40. J7 A Different Perspective.14 

Figure 41. Orthophotoquad of J7.14 

Figure 42. J4, J5 and J6 Shown on Map.15 

Figure 43. J5 Orthophotoquad.15 

Figure 44. J5 A Good OLS.15 

Figure 45. J5 Area.15 

Figure 46. J5 Another View.16 

Figure 47. Map showing B4, B5, B6, JX3 Areas .16 

Figure 48. B5 Similar to J4 .16 

Figure 49. B5 Looking North.16 

Figure 50. Map Showing OLSs B4, B5, and B6.16 

Figure 51. OLS Software Image of B4 or JX3.16 

Figure 52. Magnified Image of B4 or JX3.17 
























































List of Figures (Continued) 

Figure 53. B5 Software Output.17 

Figure 54. B6 Software Output.17 

Figure 55. Map of J4, J5, and J6 (Repeated from P. 15).17 

Figure 56. Map of Shiloh Valley Area, J4, J5, J6, JX3, B4, B5, B6, and B30.17 

Figure 57. J4 Across the Creek from B5 .17 

Figure 58. J4 Another View.18 

Figure 59. J4 A Different Perspective.18 

Figure 60. Image of J4.18 

Figure 61. J4, J5, J7, and JX3 Orthophotoquad.18 

Figure 62. J4 Software Output.18 

Figure 63. J6 Software Output.18 

Figure 64. Map showing B24, B32, B33, J13, and J16.19 

Figure 65. Orthophotoquad of J13, J16, and J17.19 

Figure 66. Orthophotoquad of J13 and J16.19 

Figure 67. J16 (a), J13, J17, B24, B29, B32, and B33.19 

Figure 68. J16 (b) Another View.20 

Figure 69. J16 (c).20 

Figure 70. J16 (d) From Other End.20 

Figure 71. J16 (e) A Different View.20 

Figure 72. J16 (f) Another View.20 

Figure 73. Orthophotoquad of J17.20 

Figure 74. J17 (a).21 

Figure 75. J17 (b) A Different View.21 

Figure 76. J17(c).21 

Figure 77. J17 (d) Another View.21 

Figure 78. J17(e).21 

Figure 79. J17 (f).21 

Figure 80. J17(g).22 

Figure 81. Orthophotoquad Showing J13, J16, J17.22 

Figure 82. B29.22 

Figure 83. B29 Another View.22 

Figure 84. B24 Software Output.22 

Figure 85. B32 Software Output.22 

Figure 86. B33 Software Output.23 

Figure 87. J13 Software Output.23 

Figure 88. J16 Software Output.23 

Figure 89. J17 Software Output.23 

Figure 90. J11 and J12 Shown on Map.24 

Figure 91. J11 (a).24 

Figure 92. J11 (b) A Different View.24 

Figure 93. J11 (c) Another View.24 

Figure 94. J11 (d) A Different Perspective.24 

Figure 95. J11 (e).24 

Figure 96. J11 (f) A Different View.25 

Figure 97. J11 (g).25 

Figure 98. J11 Software Output.25 

Figure 99. J12 Shown on Map.25 

Figure 100. J12 Parallel to Road.25 

Figure 101. J12 Paralleling Road.25 

Figure 102. J12.26 

Figure 103. Orthophotoquad of J12.26 

Figure 104. J12 Software Output.26 
























































List of Figures (Continued) 

Figure 105. B10 Software Output.26 

Figure 106. Map of B13 and B14.26 

Figure 107. B14 (a) .26 

Figure 108. B14 (b) Another View.27 

Figure 109. B14 (c) A Different Perspective.27 

Figure 110. B14 (d).27 

Figure 111. B14 (e).27 

Figure 112. B14(f).27 

Figure 113. B14 (g).27 

Figure 114. B13 Software Output.28 

Figure 115. B14 Software Output.28 

Figure 116. J2 Shown on Map.28 

Figure 117. J2 Orthophotoquad.28 

Figure 118. J2 (a) Rolling Terrain.28 

Figure 119. J2 (b) A Different View.28 

Figure 120. J2 (c).29 

Figure 121. J2(d).29 

Figure 122. J2 (e) Another View.29 

Figure 123. J2 Software Output.29 

Figure 124. Map of J14.29 

Figure 125. J14.29 

Figure 126. J14 (a).30 

Figure 127. J14 (b) A Slightly Different View.30 

Figure 128. J14 (c) Another view.30 

Figure 129. J14 (d).30 

Figure 130. J14 (e).30 

Figure 131. B16 As Shown on Map .31 

Figure 132. B16(a).31 

Figure 133. B16 (b) Different Perspective.31 

Figure 134. B16(c).31 

Figure 135. B16(d).31 

Figure 136. B16(e).31 

Figure 137. B16 (f) Another View.32 

Figure 138. B16(g).32 

Figure 139. B16(h).32 

Figure 140. B16.32 

Figure 141. B37 Shown on Map.32 

Figure 142. B37 (a) A Short Approach.32 

Figure 143. B37 (b).33 

Figure 144. B37 (c) .33 

Figure 145. B37 (d).33 

Figure 146. B37 Software Output.33 

Figure 147. B36 Shown on Map.33 

Figure 148. B36.33 

Figure 149. B36 Another View.34 

Figure 150. B36 Software Output.34 

Figure 151. B35 Shown on Map.34 

Figure 152. B35 (a).34 

Figure 153. B35 (b).34 

Figure 154. B35 (c) Another View.34 

Figure 155. B35 Software Output.35 























































List of Tables 


Table 1: OLS Sites.6 



Acknowledgment 

This work was funded by the United States Transportation Command (USTRANSCOM), managed by 
the Air Mobility Command (AMC), and executed by the Control Systems Development and Applications 
Branch of the Air Vehicles Directorate of the Air Force Research Laboratory (AFRL/RBCC). 

Support for SynGenics was provided through the Simulation Technology Assessment (STA) Contract, 
Prime Contract Number F33615-01-D-3105/0018, Delivery Order 18, under Subcontract Number D00058- 
D6SC0578 to General Dynamics Advanced Information Systems (GDAIS). 

The authors wish to express their appreciation to USTRANSCOM, AMC, AFRL, GDAIS, the Army's 
Engineer Research and Development Center-Cold Regions Research and Engineering Laboratory 
(ERDC-CRREL), and the Boeing Company for their participation in activities leading up to and 
comprising this field demonstration. 

Special thanks go to Dr. Charles E. Ryerson, ERDC-CRREL, for providing the photographs included 
in this document. 




1. Executive Summary 

The objective of the Opportune Landing Site (OLS) Software Demonstration and Validation was to 
enable and demonstrate the capability to locate possible suitable Landing Zones (LZs) that are smooth, 
flat, firm, free of obstructions, and strong enough to support mobility aircraft operations. The Boeing 
Company developed the OLS system software application to aid the warfighter in achieving global 
access to the battlespace. The application currently comprises four separate modules of computer-coded 
algorithms. One module uses satellite imagery to identify candidate landing areas that are large enough, 
flat enough, and suitably free of vegetation, standing water, and obstacles. This module is referred to 
in this report as the runway-finding software. Another module uses topographic data and historical 
databases to determine soil type. A third module uses weather data and soil type to determine soil 
moisture content, and the fourth module uses soil type and moisture content to determine soil strength. 

A field demonstration and assessment of the OLS runway-finding software was held in St. Clair 
County, IL, on 5 June 2007. The purpose of this portion of the OLS field demonstration was to assess the 
capability of the runway-finding software. The field reviewing team comprised personnel from the Air 
Force Research Laboratory (AFRL), the Air Mobility Command (AMC), the Army's Engineer Research 
and Development Center-Cold Regions Research and Engineering Laboratory (ERDC-CRREL), General 
Dynamics Advanced Information Systems (GDAIS), the Boeing Company, and SynGenics Corporation. 

For the field demonstration, Boeing obtained LANDSAT imagery for St. Clair County, IL, collected 
in May 2007. Boeing ran the OLS “flatness” software on the LANDSAT image to determine suitable 
landing areas dimensioned 1,000 feet by 90 feet after the software was not able to find suitable landing sites measuring 3,500 feet by 90 feet, the dimensions of the original requirement. Boeing then provided the results of the software analysis of the region to AMC. AMC designated a single trained Special Tactics Team (STT) representative to identify all suitable landing areas dimensioned 3,500 feet by 
90 feet in the same area (St. Clair County, IL) using aerial photography, topographic maps, digital 
topographic elevation data (DTED), and other typically used means. Currently, the conventional "boots-on-the-ground" method is used by STTs to review possible landing sites. The data used by the STT 
representative was not of the same time frame as the satellite image, but was older by several years. On 
5 June 2007, the field reviewing team visited many of the sites designated by the OLS runway-finding 
software in order to assess the accuracy of the software and its capability to find suitable landing sites. 
The team also visited 16 of the 17 sites proposed by the STT representative. The team went to 35 sites, 
some of which contained clusters of candidate OLSs identified by either or both methods. 

Of the 23 software-designated sites reviewed, all were considered potentially acceptable OLSs, although they were shorter than the length initially sought. Of the 17 STT-determined sites, 14 were considered potentially suitable. One site appeared to cross power lines (although this was difficult to determine from the dirt road to which the team had access). Two sites crossed a construction site that was not reflected in the old DTED data; additionally, one of those two also crossed a ditch (on the opposite end from the construction site location). Adjusting the software to require a 
longer runway length may rule out some of the software-designated sites, but, based on the requirements 
made for this demonstration, all of the software-designated sites proved acceptable. The implication is 
that the runway-finding software module of the OLS System may provide an excellent tool in helping 
the warfighter to achieve global access to the battlespace. Other modules of the system were not 
demonstrated. 

Future steps may include further scientific investigation and refinement of this software module. 
Additionally, the OLS Project Team continues to define potential uses for the OLS software and to 
consider whether it should be distributed as a package or a service and who should maintain the database 
of information upon which it relies, adding to and/or upgrading that database as situations change. 

One issue is georegistration. That issue may provide a good candidate for future work under the OLS 
Technology Maturation Plan. 




2. Overview 


The OLS Software Demonstration Plan describes a means to validate the utility and accuracy of the OLS 
software application to locate and evaluate natural terrain LZs for airlift aircraft. The OLS application 
uses satellite imagery to scan for obstacle-free, water-free and heavy-vegetation-free areas for evaluation 
as candidate LZs. It then uses myriad data sources to infer soil type, and it uses mesoscale atmospheric 
modeling and soil moisture modeling to infer soil strength. Areas that pass threshold values for openness; 
absence of heavy vegetation, standing water, and obstacles; smoothness; and soil strength are identified 
as opportune landing sites. 
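The module chain described in the preceding paragraph can be pictured as a simple data pipeline. The sketch below is illustrative only; the function names, arguments, and placeholder return values are assumptions made for this report and do not represent the actual Boeing OLS interfaces.

    # Illustrative sketch of the four-module OLS chain described above.
    # Every name and signature here is a hypothetical placeholder, not the OLS API.

    def find_candidate_sites(satellite_image, min_length_ft=1000, min_width_ft=90):
        """Module 1: scan imagery for open, flat, obstacle- and water-free areas."""
        return []  # placeholder

    def infer_soil_type(site, topo_data, historical_db):
        """Module 2: estimate soil type from topographic data and historical databases."""
        return None  # placeholder

    def infer_soil_moisture(soil_type, weather_data):
        """Module 3: estimate soil moisture content from weather data and soil type."""
        return None  # placeholder

    def infer_soil_strength(soil_type, moisture):
        """Module 4: estimate bearing strength from soil type and moisture content."""
        return None  # placeholder

    def evaluate_ols(satellite_image, topo_data, historical_db, weather_data):
        """Chain the modules: imagery -> candidate areas -> soil type -> moisture -> strength."""
        results = []
        for site in find_candidate_sites(satellite_image):
            soil = infer_soil_type(site, topo_data, historical_db)
            moisture = infer_soil_moisture(soil, weather_data)
            results.append((site, soil, moisture, infer_soil_strength(soil, moisture)))
        return results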

A proven OLS System will aid the warfighter in achieving anywhere-anytime access to the 
battlespace. This technology will aid in conducting military operations from semi-prepared or 
unprepared locations to effect a wide range of military options. Currently, these sites are evaluated 
physically by military personnel before the planned operations begin. These evaluations may be 
performed under hostile conditions. The OLS application was developed as an alternate method of site 
evaluation. The OLS application will initially augment these physical site evaluations by prescreening 
candidate areas, providing the benefits of reducing the initial search time, and limiting the number 
of necessary physical site evaluations to the fewest areas. As technology and sensors improve, this 
application is expected to eliminate the need for the physical evaluations. 

A practical demonstration program highlighted the utility and accuracy of this module of the 
application, with final results briefed to AMC in August 2007. The final report of the demonstration and 
a Technology Maturation (Tech Mat) Plan were provided to AMC. The purpose of the demonstration 
was to exercise the OLS software with respect to a set of criteria that represents a checkpoint along the 
path toward a useful capability for airlift operations. This report covers a portion of the demonstration 
program, describing the efforts on 5 June 2007. 

2.1 Summary of Approach 

The purpose of this portion of the OLS demonstration was to assess the capability of the runway-finding 
software. The software was used to identify all suitable runways within an area of St. Clair County. In 
addition, a manual inspection was performed using current conventional means, that is, identification of 
sites by hand using satellite images and topographic maps. Sites were assigned numbers for identification. 
Sites identified by inspection are designated with the prefix "J", while those determined by the software 
are named with a "B" prefix. There is no relationship implied between sites having the same numbers 
but different prefixes. On 5 June 2007, the observation team drove to most sites identified and visually 
inspected/verified their suitability as a landing zone. Each stop was identified by its quadrant number. 
This report details the results of this portion of the demonstration and compares the findings of the 
observation team with respect to each candidate OLS visited. 

The 5 June demonstration was intended to showcase the capabilities of the OLS software to the 
AMC staff, demonstrate the current state of the technology, and reveal the potential of the technology 
for further development and fielding. Further objectives were to prove that Key Performance Parameters 
(KPPs) and exit criteria for the OLS software demonstration and validation program had been met and 
to lay the foundation for the technology maturation and risk-mitigation way forward. 

2.2 Identifying Candidate OLSs 

The OLS algorithms make some specific assumptions about the physics of reflected electromagnetic 
radiation to find suitable landing sites. Appreciating these assumptions is important in understanding the 




capabilities and limitations of the application. Multispectral and hyperspectral satellite imagers measure 
the electromagnetic radiation emitted from the sun and reflected by a given area (pixel) of the earth’s 
surface. The reflected component also includes atmospheric scattering of solar radiation, and, as the 
spectrum approaches the IR region, the radiation at the sensor includes earth- and atmospheric- 
emitted radiation. This radiation is formatted by the imager into separate images based on the wavelength 
of the radiation. 

The OLS algorithms are based on the assumption that variations of the earth’s surface reflectance are 
caused by physical (spatial) and material (spectral) characteristics, which can be used to discriminate 
the spatial and spectral properties of the terrain for a given area (pixel). These variations are used 
to identify standing water, areas containing heavy vegetation (high chlorophyll), and uneven terrain 
(combined spatial/spectral inhomogeneity). Conversely, areas with highly uniform reflectance (spatial 
and spectral homogeneity) are assumed to be flat areas of like material substance (dirt, grass, rock, etc.). 
The algorithms reject areas with large variations in reflectance, such as those caused by sharp contrast 
between the asphalt of a road or runway and the surrounding soil or vegetation; the OLS application 
looks solely for areas of homogeneous natural terrain. 
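As an illustration of the homogeneity assumption described above, the following minimal sketch flags pixels whose local spatial and spectral variation is small, assuming the multispectral image is held as a NumPy array of shape (bands, rows, columns). The window size and threshold are arbitrary illustrative values, not those used by the Boeing software.

    import numpy as np

    def homogeneity_mask(image, window=5, threshold=0.02):
        """Flag pixels whose surrounding window has low per-band variation.

        image: reflectance values, ndarray of shape (bands, rows, cols).
        A pixel is kept when the standard deviation of every band within the
        window is below `threshold`, mimicking the assumption that uniform
        reflectance indicates flat terrain of like material; heavy vegetation,
        standing water, and sharp material boundaries produce high local
        variation and are rejected.
        """
        bands, rows, cols = image.shape
        half = window // 2
        mask = np.zeros((rows, cols), dtype=bool)
        for r in range(half, rows - half):
            for c in range(half, cols - half):
                patch = image[:, r - half:r + half + 1, c - half:c + half + 1]
                local_std = patch.reshape(bands, -1).std(axis=1)  # one value per band
                mask[r, c] = bool((local_std < threshold).all())
        return mask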

The red rectangle on the map below indicates that portion of St. Clair County in which the 
demonstration took place. 




Figure 1. Demonstration Area within St. Clair County 





































































3. Methods, Assumptions, and Procedures 

It was agreed that the objective of the OLS Software Demonstration and Validation Program would be 
shown to have been met if the team were to demonstrate that the documented exit criteria were met. The 
purpose of this portion of the demonstration was to assess the OLS Software against two performance 
criteria, one of which was an exit criterion and a KPP: 

• KPP P01: Capability to identify suitable landing sites in a specified area, given that suitable landing 
sites exist. Suitable is defined as having an area of the specified dimensions that is flat and free of 
obstacles, standing water, and heavy vegetation. Bearing strength is not a consideration for suitability 
in this context. Exit criterion: at least 50 percent of OLSs found. Objective: 100 percent. 

• P03: Low incidence of false positives. Probability of designating an unsuitable landing site as a suitable 
OLS—a measure of accuracy expressed as the percentage of OLSs identified by the software that were 
unsuitable. Suitability as defined for this criterion excludes bearing strength. The value with respect 
to this desirement was to be assessed through comparison of the software analysis results with field 
inspection and observation results for St. Clair County. The goal was 0 percent. No upper bound was 
set at this stage. 

Results are highlighted in Section 4, Results and Discussion. KPPs P01 and P03 comprised the focus of the 5 June 
effort. Boeing obtained LANDSAT imagery for St. Clair County collected in May 2007. Boeing ran the 
OLS “flatness” software on the LANDSAT image and determined suitable landing areas dimensioned 
1,000 feet by 90 feet. Boeing used those dimensions with AMC approval, after reporting that the software 
did not find any suitable landing sites 3,500 feet in length. Boeing then provided the results of the 
software analysis of the region to AMC. 

In parallel with the software analysis, AMC tasked an STT representative to identify all suitable 
landing areas dimensioned 3,500 feet by 90 feet in St. Clair County using aerial photography of the same 
area as the LANDSAT imagery, topographic maps, DTED, and other means typically used by STTs. This 
method is henceforth referred to as "inspection". The data used by the STT representative was not of the 
same time frame as the satellite image, but was older by several years. For example, MidAmerica Airport 
was under construction in the STT data, yet was operational by the time the test was conducted. The STT 
representative searched only eastern St. Clair County, looked for OLSs measuring 3,500 feet by 90 feet, 
and found some longer ones as well. 

An AMC-designated referee was tasked to compare the software results with those of the STT 
representative using inspection and to calculate the percentage of correct sites (P01) and the incidence 
of false positives (P03). Other participants in the demonstration included representatives of AMC, AFRL/ 
RBCC, GDAIS, SynGenics, ERDC-CRREL, and the Boeing Project Manager. They confirmed sites 
by observation. AFRL/RBCC representatives along with ERDC-CRREL served as impartial observers 
and adjunct referees. SynGenics served as observer and recorder, and the AMC representative was 
the photographer. As required by the Demonstration (demo) Plan, the team obtained vantage points 
as close as possible to the location of the alleged OLS and ascertained by observation whether the site 
was a suitable LZ. The percentage of suitable LZs was to be recalculated based on these findings. LZs 
identified by the software, missed by inspection, but subsequently confirmed by observation would 
contribute to both the numerator and the denominator of this calculation. The software performance was 
to be considered successful if it found at least 50 percent of the suitable sites. 

Although not a KPP for the demonstration, a low incidence of false positives was desired. A false 
positive occurs when the software designates an area as a suitable OLS when, in fact, it is unsuitable. The 
occurrence of false positives would be calculated based on the comparison of software-identified LZs 
with inspection and observation results. 
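Once the field results are reduced to sets of site identifiers, both measures follow mechanically from the counting rule above. The sketch below is a minimal illustration with made-up identifiers; it is not data from the demonstration.

    def score_p01_p03(software_sites, confirmed_suitable):
        """Sketch of the P01 and P03 computations described in this section.

        software_sites: set of site IDs the software designated as OLSs.
        confirmed_suitable: set of site IDs confirmed suitable by inspection or
        observation, however they were originally found (so a software-found,
        inspection-missed, observation-confirmed site enters both the numerator
        and the denominator of P01).
        """
        designated = set(software_sites)
        suitable = set(confirmed_suitable)
        p01 = len(designated & suitable) / len(suitable)    # fraction of suitable sites found
        p03 = len(designated - suitable) / len(designated)  # fraction of designations that were unsuitable
        return p01, p03

    # Hypothetical example: three suitable sites are known; the software found
    # two of them and designated no unsuitable areas.
    print(score_p01_p03({"B1", "B2"}, {"B1", "B2", "J1"}))  # (0.666..., 0.0)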




4. Results and Discussion 


The demonstration team followed the Demonstration Plan as detailed in Section 5.1 of the Demo Plan, except 
that they visited nearly every site found by either the software or the AMC-trained individual, the STT 
representative, and that they visited only eastern St. Clair County. 

• KPP P01: Capability to identify suitable landing sites proved difficult to quantify because it was 
unknown how many suitable sites exist in the region chosen for the demonstration of this desirement. 
The OLS runway-finding software identified 40 sites, whereas an individual using the standard manual 
method identified only 17 sites in the region. It could be argued that the software scored 235 percent 
(the arithmetic behind that figure is shown after this list). While the exact score is unknown, there is 
agreement that the exit criterion of at least 50 percent was certainly exceeded, and it could be argued 
that the objective of 100 percent was met. The lesson learned is that properly defining the measurand 
and the method of collecting the data to support quantification against that measurand is important. 

• P03: Incidence of false positives was 0, meeting the objective. 
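The 235-percent figure above is simply the ratio of software-identified sites to manually identified sites, taking the manual count as the denominator for want of a better estimate of the true number of suitable sites:

    \[
    P_{01} \approx \frac{40\ \text{software-identified sites}}{17\ \text{inspection-identified sites}} \times 100\% \approx 235\%
    \]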

The following pictures depict sites visited. They are listed in the order visited. Table 1, OLS Sites, 
summarizes the visits. Information in each header includes the site designation; coordinates at the 
northwest corner of the landing zone; runway magnetic heading (degrees); and length (feet). Pictures 
comprise 1) the National Geographic map, 2) photo(s) of the field, 3) orthophotoquad (for J numbers), 4) 
OLS software output. Text reflects findings of the observation team concerning the site. The runway¬ 
finding software identified 40 candidate LZs. Some included clusters of possible runways 1,000 feet or 
longer, for a total of 54 potential OLSs. The inspection method identified 17 possible runways that were 
at least 3,500 feet long. Of the 16 STT-determined sites visited, 13 were considered potentially suitable. 
One of the sites appeared to cross power lines (although it was difficult to determine from the dirt road to 
which the inspection team had access). Two of the sites crossed a construction site that was not reflected 
in the old DTED data. Additionally, one of those two also crossed a ditch on the opposite end from the 
construction site location. The combined results of the evaluation for both capability to identify landing 
sites and the occurrence of false positives indicated that the runway-finding software performed very well 
in the portions of the demonstration that have been completed and, in combination with other tools, could 
provide an excellent means of finding potential OLSs. 
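For readers who want to manipulate the site records in Table 1 programmatically, the short sketch below parses the header format used throughout this section (site designation, degrees-minutes-seconds coordinates of the northwest corner, and a reciprocal pair of magnetic headings such as 180|360). The regular expression and function names are assumptions made for this report; they are not part of the OLS software, and records that give a single heading (e.g., 174°) would need a small extension.

    import re

    # Matches records such as: B1: 38° 38' 04.53" N 89° 47' 01.88" W, 180|360
    RECORD = re.compile(
        r"(?P<id>[A-Z]+\d+):\s*"
        r"(?P<lat_d>\d+)°\s*(?P<lat_m>\d+)'\s*(?P<lat_s>[\d.]+)\"\s*N\s*"
        r"(?P<lon_d>\d+)°\s*(?P<lon_m>\d+)'\s*(?P<lon_s>[\d.]+)\"\s*W,?\s*"
        r"(?P<hdg>\d+\|\d+)")

    def parse_site(text):
        """Return (site ID, decimal latitude, decimal longitude, heading pair)."""
        m = RECORD.search(text)
        if m is None:
            raise ValueError("unrecognized site record: " + text)
        lat = int(m["lat_d"]) + int(m["lat_m"]) / 60 + float(m["lat_s"]) / 3600
        lon = -(int(m["lon_d"]) + int(m["lon_m"]) / 60 + float(m["lon_s"]) / 3600)
        headings = tuple(int(h) for h in m["hdg"].split("|"))
        return m["id"], lat, lon, headings

    print(parse_site('B1: 38° 38\' 04.53" N 89° 47\' 01.88" W, 180|360'))
    # ('B1', 38.634..., -89.783..., (180, 360))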

Table 1 indicates sites that the team visited and documented, in the order visited. 


Table 1: OLS Sites

Visit | J Number | B Number | Viable OLS? | Page Number
1 | - | B1: 38° 38' 04.53" N 89° 47' 01.88" W, 180|360 | Yes | 6
2 | - | B2: 38° 37' 8.31" N 89° 41' 19.86" W, 180|360 | Yes | 6-7
3 | - | B3: 38° 33' 31.54" N 89° 42' 34.80" W, 180|360, 3500 ft. | Yes | 8
4 | J15: 38° 39' 9.6" N 89° 45' 16.4" W, 100|280, 4600 ft. | B28: 38° 39' 18.65" N 89° 46' 54.82" W, 90|270 | Yes | 9
5 | JX1: 38° 33' 43.4" N 89° 48' 24.2" W, 100|280, 3419 ft. | - | - | 9-10
6 | J8: 38° 33' 35.93" N 89° 49' 5.34" W, 120|300, 3118 ft.; JX2: 38° 33' 35.5" N 89° 49' 4.5" W, 110|290, 3003 ft. | - | No | 10-12
7 | J7: 38° 31' 34.7" N 89° 52' 55.4" W, 150|330, 3000 ft. | - | Yes | 12-13
8 | J4: 38° 30' 39.1" N 89° 53' 29.1" W, 90|270, 3600 ft.; J5: 38° 30' 09" N 89° 52' 47" W, 180|360, 3100 ft.; J6: 38° 35' 29.57" N 89° 29' 19" W, 80|260, 4000 ft.; JX3: 38° 31' 10.49" N 89° 51' 24.57" W | B4: 38° 31' 10.49" N 89° 51' 24.57" W, 90|270, 1000 ft.; B5: 38° 30' 34.55" N 89° 51' 21.91" W, 180|360; B6: 38° 30' 04.64" N 89° 51' 51.68" W, 180|360 | Yes | 13-17
9 | J13: 38° 29' 39.3" N 89° 42' 44.3" W, 174°, 4200 ft.; J16: 38° 29' 16.4" N 89° 43' 40.4" W, 47°, 3500 ft.; J17: 38° 28' 46.6" N 89° 43' 23.1" W, 144°, 3800 ft. | B24: 38° 25' 16.47" N 89° 47' 15.59" W, 060|240; B29: 38° 31' 26.06" N 89° 47' 22.24" W, 090|270; B32: 38° 30' 32.13" N 89° 43' 32.47" W, 090|270; B33: 38° 30' 23.38" N 89° 43' 32.14" W, 090|270 | Yes | 17-22
10 | J11: 38° 33' 48.4" N 89° 48' 24.2" W, 100|280, 3500 ft.; J12: 38° 32' 29.3" N 89° 47' 5.4" W, 196°, 3100 ft. | B10: 38° 32' 34.64" N 89° 46' 21.88" W, 180|360 | Yes | 22-25
11 | - | B13: 38° 22' 52.29" N 89° 48' 3.21" W, 360|180; B14: 38° 22' 54.14" N 89° 46' 4.61" W, 360|180 | Yes | 25-26
12 | J2: 38° 16' 43.1" N 89° 45' 57" W, 180|360, 3800 ft. | - | Yes | 27-28
13 | J14: 38° 16' 42.8" N 89° 42' 32.3" W, 109°, 3800 ft. | - | Yes | 28-29
14 | - | B16: 38° 17' 34.78" N 89° 43' 26.76" W, 180|360 | Yes | 29-31
15 | - | B37: 38° 20' 49.33" N 89° 48' 18.24" W, 90|270 | Yes | 31-32
16 | - | B36: 38° 23' 16.01" N 89° 54' 27.38" W, 90|270 | Yes | 32
17 | - | B35: 38° 24' 37.58" N 89° 48' 35.72" W, 90|270 | Yes | 33


Visit 1—B1: 38° 38' 04.53" N 89° 47' 01.88" W, 180|360 

B1 was deemed a good OLS. 



Figure 2. Bl Shown on Map 



Figure 3. Bl 




























































Figure 5. B1 Software Output 


Visit 2—B2: 38° 37' 8.31" N 89° 41' 19.86" W, 180|360 

B2 was very good. It could be oriented 90|270 or 180|360 degrees. A good approach was noted. The STT 
representative said he did not find it because it was an east-west runway that extended beyond the eastern 
edge of the area of consideration. 

























































Figure 10. B2 (d) Acceptable Approach 


Figure 11. B2 (e) 




Figure 13. B2 Software Output 


Visit 3—B3: 38° 33' 31.54" N 89° 42' 34.80" W, 180|360 

B3 was also a good OLS. 



Figure 14. B3 Shown on Map 



Figure 15. B3 






























































Figure 16. B3 Another View Figure 17. Orthophotoquad of B3 



Figure 18. B3. Software Output 


Visit 4—J15: 38° 39' 9.6" N 89° 45' 16.4" W, 100|280, 4673 ft. 

B28: 38° 39’ 18.65" N 89° 46' 54.82" W, 90|270 

These runways are both oriented roughly east-west and are equivalent in terms of landing suitability. J15 
and B28 are located approximately one field apart. 



Figure 19. B1 and J15 As Shown on Map 

Note: See red 15 



Figure 20. J15 








































































Figure 21. Orthophotoquad of J15 



Figure 22. J15 Software Output 


Visit 5—JX1: 38° 33' 43.4" N 89° 48' 24.2" W, 110|280, 3419 ft. 

JX1 is parallel to JX2 and J8. See Visit 6. 



Figure 23. JX1 Shown on Map. 

Note: Fingers Point to the OLS 



Figure 24. JX1 



Figure 25. JX1 From Another Direction 



Figure 26. JX1 Another View 























Visit 6—J8: 38° 33' 35.93" N 89° 49' 5.34" W, 120|300, 3118 ft. 

JX2: 38° 33' 35.5" N 89° 49' 4.5" W, 110|290, 3003 ft. 

These areas are under construction. There is a ramp from I-64 into what will be Hayden Retail Office 
Park. The construction began after the image used to identify the site was taken, so it is reasonable that the 
STT would not have ruled the site out because of construction. However, JX2 crosses a ditch, which 
makes the site unacceptable (see Figure 30), both because of the ditch and because of working with 
old data that did not show the construction site. The STT representative affirmed that he took a chance on this one, 
thinking the ditch might be a road. A higher-resolution image would have revealed the truth. "Which is 
why you put boots on the ground," the STT representative commented. The software output shows the OLS crossing 
the runway at MidAmerica Airport, whereas Figures 32, 34, and 36 show that it does not, illustrating the 
georegistration problem. Figure 32 shows that the OLS approaches, but does not cross, the airport runway. 




Figure 29. J8 A Closer View 






















Figure 31. JX2 (a) 


Figure 32. JX2 (b) 



Figure 33. JX2 (c) 


Figure 34. JX2 Repeat of Orthophotoquad 




Figure 35. JX2 (d) 


Figure 36. J8 

Note: Software Output Falsely Indicating OLS Runway 
Crossing at MidAmerica Airport, an Illustration of Geo¬ 
registration Problem 





































Visit 7—J7: 38° 31' 34.7" N 89° 52' 55.4" W, 150|330, 3000 ft. 

The site has a ditch, but the OLS runs east of the ditch. A house is situated at the corner of the LZ area, but 
neither the ditch nor the house renders the area unacceptable. 




Figure 38. J7 

Note: Picturing OLS Running East of the Ditch and There¬ 
fore Acceptable 



Figure 39. J7 Another View 


Figure 40. J7 A Different Perspective 




Figure 41. Orthophotoquad of J7 

Note: The site has a house at the corner of the LZ area, but 
still the area is acceptable 













Visit 8—J4: 38° 30' 39.1" N 89° 53' 29.1" W, 90|270, 3600 ft. 
J5: 38° 30' 09" N 89° 52' 47" W, 180|360, 3100 ft. 
J6: 38° 35' 29.57" N 89° 29' 19" W, 80|260, 4000 ft. 
JX3: 38° 31' 10.49" N 89° 51' 24.57" W, 180|360 
B4: 38° 31' 10.49" N 89° 51' 24.57" W, 90|270, 1000 ft. 
B5: 38° 30' 34.55" N 89° 51' 21.91" W, 180|360 
B6: 38° 30' 04.64" N 89° 51' 51.68" W, 180|360 

Sites B4, B5, B6, JX3, and J4-J6 were all very close together. The software-identified sites might have 
been missed by the STT representative because he was looking only for areas at least 3,000 ft. long. J4 
is one of the OLSs the STT representative spotted on his way to work; it was not among the OLSs in his 
workbook. The STT representative said the best way to land is on a 180° heading. B6 is 
only 1,000 to 2,000 feet long. The runway-finding software did not like the creek or dip at the south end. The 
Boeing PM pointed out the dip at the far end of the runway and questioned whether the dip was why the 
software rejected this as a candidate LZ even when it was looking for only 1,000 feet. 

J4 is not precisely the same as B5; it is in the same field but on the other side of the creek. The 
team did not find OLSs west of these sites because of the boundary of the search space. J7 has a ditch, 
but the OLS runs east of the ditch; it is nearly the same as B4. One team member observed that J7 is 
dangerously close to the tree line, which is on the other side of the road and not very visible in either 
image. The photos of B4, B6, and JX3 washed out due to a camera malfunction; hence, there is no 
photographic documentation of B4-B6. All were good sites. B4 and J7 align, B5 and J4 align, and B5 and J6 align. 
































Figure 48. B5 Similar to J4 

Note: Different Part of Same Field 



Figure 47. Map showing B4, B5, B6, JX3 Areas 

Note: Focus Is on B5, Section 4 



Figure 49. B5 Looking North 



Figure 50. Map Showing OLSs B4, B5, and B6 




Figure 51. OLS Software Image of B4 or JX3. 

Note: Software Utility Indicates a Cluster. 























































Figure 52. Magnified Image of B4 or 
JX3 



Figure 54. B6 Software Output 



Figure 56. Map of Shiloh Valley Area, J4, J5, J6, JX3, 
B4, B5, B6, and B30 



Figure 53. B5 Software Output 



Figure 55. Map of J4, J5, and J6 (Repeated from P. 15) 

Note: J7 Indicated by Arrow 



Figure 57. J4 Across the Creek from B5 










































































Figure 60. Image of J4 


Figure 61. J4, J5, J7, and JX3 Orthophotoquad 



Figure 62. J4 Software Output 


Figure 63. J6 Software Output 





































Visit 9—B24: 38° 25’ 16.47" N 89° 47' 15.59" W, 060|240 
B29: 38° 31' 26.06" N 89° 47' 22.24" W, 090|270 
B32: 38° 30' 32.13" N 89° 43' 32.47" W, 090|270 
B33: 38° 30' 23.38" N 89° 43' 32.14" W, 090|270 
J13: 38° 29' 39.3" N 89° 42' 44.3" W, 174°, 4200 ft. 

J16: 38° 29’ 16.4" N 89° 43’ 40.4" W, 47°, 3500 ft. 

J17: 38° 28' 46.6" N 89° 43’ 23.1" W, 144°, 3800 ft. 

Areas B24, B29, B32, B33, J13, J16, and J17 are quite near each other and so are grouped here. B17, J13, 
J16, and J17 are near Mascoutah. 




Figure 66. Orthophotoquad of J13 and J16 


Figure 67. J16 (a), J13, J17, B24, B29, B32, and B33 

Note: Sites are close to each other 









































Figure 74. J17 (a) 



Figure 75. J17 (b) A Different View 


Figure 76. J17 (c) 



Figure 77. J17 (d) Another View 




Figure 78. J17 (e) Figure 79. J17 (f) 































Figure 80. J17 (g) 



Figure 82. B29 



Figure 84. B24 Software Output 



Figure 81. Orthophotoquad Showing J13, J16, J17 




Figure 85. B32 Software Output 









































Figure 87. J13 Software Output 


Figure 86. B33 Software Output 



Figure 88. J16 Software Output 



Figure 89. J17 Software Output 


Visit 10—J11: 38° 33' 48.4" N 89° 48' 24.2" W, 100|280, 3500 ft. 

J12: 38° 32' 29.3" N 89° 47' 5.4" W, 196°, 3100 ft. 

B10: 38° 32' 34.64" N 89° 46' 21.88" W, 180|360 

J11 and J12 are near B10 along with J8. JX1 and JX2 are parallel to J8. J11 parallels the road on which 
the team stood. The church was photographed in the wrong direction at first. The team was initially 
concerned that power lines might cross the OLS; they could not confirm this from the roadway but 
ultimately deemed the site acceptable based on the orientation of the OLS. The SynGenics representative 
asked why the software did not find the field photographed; it turned out that there was a six-inch-high 
patch of vegetation in part of the area. J12 is north-south, parallel to the road, and is a good site. 






































Figure 90. J11 and J12 Shown on Map 



Figure 92. J11 (b) A Different View 



Figure 91. J11 (a) 

Note: Shown Here Near B10, J8, and J12 



Figure 93. J11 (c) Another View 



Figure 94. J11 (d) A Different Perspective Figure 95. J11 (e) 


















Figure 96. J11 (f) A Different View 


Figure 97. J11 (g) 



Figure 98. J11 Software Output 



Figure 100. J12 Parallel to Road. 



Figure 99. J12 Shown on Map 



























































Figure 102. J12 



Figure 104. J12 Software Output 



Figure 103. Orthophotoquad of J12 



Figure 105. B10 Software Output 


Visit 11—B13: 38° 22' 52.29" N 89° 48' 3.21" W, 180|360 
B14: 38° 22' 54.14" N 89° 46' 4.61" W, 180|360 

B14 was very good. The STT representative did not find OLSs to match these sites. There was high corn on the 
site at the time of the visit, so the team could not see the field well. 































































Figure 108. B14 (b) Another View 


Figure 109. B14 (c) A Different Perspective 



Figure 110. B14 (d) 



Figure 111. B14 (e) 




Figure 112. B14 (f) 


Figure 113. B14 (g) 























Figure 114. B13 Software Output 


Figure 115. B14 Software Output 


Visit 12—J2: 38° 16' 43.1" N 89° 45' 57" W, 180|360, 3800 ft. 

The Boeing software did not find this one; this was the third failure among the STT-designated landing sites. 
Power lines and telephone lines, as well as rolling terrain and vegetation, are the reasons it was not considered 
viable and the reasons the Boeing software did not designate it. 



Figure 116. J2 Shown on Map 



Figure 117. J2 Orthophotoquad 



Figure 118. J2 (a) Rolling Terrain 


Figure 119. J2 (b) A Different View 










































































Figure 122. J2 (e) Another View Figure 123. J2 Software Output 


Visit 13—J14: 38° 16' 42.8" N 89° 42' 32.3" W, 109°, 3800 ft. 

Lehr Road has a suitable landing zone. The software ruled it out because of vegetation (a winter wheat 
crop in April). A solution might include (1) reducing the vegetation threshold or (2) georectification (the 
software output is roughly 200 yards off). 
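The roughly 200-yard offset noted here is the same georegistration issue seen at Visit 6 and revisited in the Conclusions. For context only, the sketch below shows one way a measured, constant offset could be removed from reported coordinates; the flat-earth approximation and the example numbers are assumptions for illustration and are not part of the OLS software.

    import math

    YARDS_TO_METERS = 0.9144
    EARTH_RADIUS_M = 6371000.0

    def shift_latlon(lat_deg, lon_deg, north_yards, east_yards):
        """Apply a small measured north/east offset (in yards) to a lat/lon pair
        using a local flat-earth approximation, which is adequate for corrections
        on the order of a few hundred yards."""
        dn = north_yards * YARDS_TO_METERS
        de = east_yards * YARDS_TO_METERS
        dlat = math.degrees(dn / EARTH_RADIUS_M)
        dlon = math.degrees(de / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
        return lat_deg + dlat, lon_deg + dlon

    # Hypothetical correction: pull a reported OLS corner 200 yards west.
    print(shift_latlon(38.2786, -89.7090, 0.0, -200.0))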



Figure 124. Map of J14 



Figure 125. J14 

Note: Software Illustrates Georegistration Problem 




















































Figure 126. J14 (a) 



Figure 127. J14 (b) A Slightly Different View 



Figure 128. J14 (c) Another view 



Figure 129. J14 (d) 



Figure 130. J14 (e) 









Visit 14—B16: 38° 17' 34.78" N 89° 43' 26.76" W, 180|360 

B16 is a good landing zone, and it is very wide. The software output shows a cluster of suitable landing zones five 
pixels in width, indicating numerous potential sites. 



Figure 131. B16 As Shown on Map Figure 132. B16 (a) 














Figure 137. B16 (f) Another View 



Figure 139. B16 (h) 



Figure 138. B16 (g) 



Figure 140. B16 

Note: Arrow Indicates Cluster of LZs in Software Output 


Visit 15—B37: 38° 20' 49.33" N 89° 48' 18.24" W, 90|270 

B37 is in the Kaskaskia River flood plain. The team scored this as a good site based on what the software 
was designed to do. The site avoided power lines because it was beside the road. However, per the referee, 
the site is surrounded by trees. It was suggested that future versions of the software should consider 
approach and departure space, in which case this site might no longer be considered suitable. 



Figure 141. B37 Shown on Map 



Figure 142. B37 (a) A Short Approach 























































Figure 143. B37 (b) 




Figure 145. B37 (d) 



Figure 144. B37 (c) 

Note: Arrow Indicates Fence in Lower Right Corner. 


Figure 146. B37 Software Output 



Visit 16—B36: 38° 23' 16.01" N 89° 54' 27.38" W, 90|270 

B36 is an east-west site. The team could not get very close, but the site looked good. The team photographed 
one OLS that starts near the big barn and one that starts beyond the cornfield. 



Figure 147. B36 Shown on Map Figure 148. B36 


































Figure 149. B36 Another View 



Figure 150. B36 Software Output 


Visit 17—B35: 38° 24' 37.58" N 89° 48' 35.72" W, 90|270 

B35 was a very good site, and there were other candidate areas on the other side. While searching for B35, the team 
went past a flooded strip-mining area, which the software did not identify as an OLS. 



Figure 152. B35 (a) 



Figure 151. B35 Shown on Map 



Figure 153. B35 (b) 



Figure 154. B35 (c) Another View 





























































5. Conclusions 


The inspection team members viewed 40 sites during the day on 5 June 2007. In general, the software was 
judged to have performed quite well. Of the 23 "B" sites reviewed, all were considered potentially acceptable 
OLSs, although they were shorter in length than what was initially sought. The inspection method identified 
17 possible runways that were at least 3,500 feet long, of which three were rejected. Two had become 
construction sites in the period between the taking of the satellite images and the inspection. Another was 
considered unsuitable because of the presence of power lines and a ditch. Adjusting the software in order 
to require a longer runway length might have ruled out some of the "B" sites, but, in accordance with the 
requirements for this demonstration, all of the software-designated sites proved acceptable. Based on this 
sampling and its results, it would appear that the OLS runway-finding software could be used to identify 
potential OLSs at least as well as the methods currently being used by STTs, given that the software database 
is properly maintained. 

It was agreed that the objective of the OLS Software Demonstration and Validation Program would 
be shown to have been met if the team were to demonstrate that the documented exit criteria were met. 

In terms of KPP P01: Capability to identify suitable landing sites in a specified area, given that suitable 
landing sites exist, the runway-finding software identified 40 sites, whereas an individual using the 
standard manual method identified only 17 sites in the region. It could be argued that the software scored 
235 percent. While the exact score is unknown, there is agreement that the exit criterion of at least 50 
percent was certainly exceeded, and it could be argued that the objective of 100 percent was met. The 
lesson learned is that properly defining the measurand and the method of collecting the data to support 
quantification against that measurand is important. P03, the incidence of false positives, was 0, meeting the 
objective. This result exceeded expectations. 

The performance of the software vastly exceeded the exit criteria. Moreover, it met the stretch goals 
established in the Demonstration Plan. 

Future steps may include further scientific investigation and refinement of the software module. The 
OLS Project Team continues to explore potential additional uses for the OLS software and its capabilities. 
Other issues to be resolved include whether it should be distributed as a package or a service, and who 
should maintain the database upon which it relies, adding to and/or upgrading that database as situations 
change. Resolving the issue of georegistration is a necessary step for possible future development under 
the OLS Technology Maturation Plan, AFRL-RB-WP-TR-2008-3064 (AD number B336859). A more 
complete view of recommendations to move toward an OLS Initial Operational Capability can be found 
in that document. 




LIST OF ACRONYMS, ABBREVIATIONS, AND SYMBOLS 


ACRONYM         DESCRIPTION

AFB             Air Force Base
AFRL            Air Force Research Laboratory
AFRL/RB         Air Vehicles Directorate of AFRL
AFRL/RBC        Control Sciences Division of AFRL/RB
AFRL/RBCC       Control Systems Development and Applications Branch of AFRL/RBC
AMC             Air Mobility Command
DTED            Digital Topographic Elevation Data
ERDC-CRREL      Engineer Research and Development Center-Cold Regions Research and Engineering Laboratory (Army)
GDAIS           General Dynamics Advanced Information Systems
IR              Infrared (spectral region)
KPP             Key Performance Parameter
LANDSAT         Land Remote Sensing Satellite System
LZ              Landing Zone
OLS             Opportune Landing Site
STT             Special Tactics Team
Tech Mat        Technology Maturation (Plan)

