Photonics Article

Target Acquisition Paper by Russell Lombardo

Abstract

This paper provides a brief overview of the target acquisition process. It presents both qualitative and quantitative examples of various target acquisition criteria, including detection, recognition, and identification. Some history of the genesis of the target acquisition criteria is also discussed, along with general “rules of thumb” for designing an optical sensor. Using a standard optical sensor (either a thermal imager or a video camera), fields of view are parametrically estimated for various detection/recognition tasks.

Introduction

A relationship exists between the optical sensor’s field of view (FOV), its magnification, and its capability to perform visual discrimination tasks.  The criteria that all target acquisition comparisons are based upon will be introduced in the background section.  Figure 1 shows the elements of target acquisition[1].

Figure 1.  The Target Acquisition Process and Its Elements.

Designing an optical system typically begins with a set of target requirements.  These criteria include the following data:

1.   Target dimensions

2.   Range requirements for both detection and recognition

3.   Target contrast for visual sensors, contrast and brightness for image intensifiers, and target delta T (ΔT) for thermal imagers.

4.   Atmospheric attenuation.

Using these few parameters, a general description of the sensors needed for both day and night target detection and recognition can be developed. Secondary details regarding the sensor performance, e.g., first or second generation thermal imager, and other issues can be determined on a cost and requirements basis. The modern standards for most military applications are a direct view optical telescope for day viewing (or a video camera when viewing is done from a remote location, as is the case for the Crusader turrets) and a thermal imager for night target acquisition.

Another consideration is the weapon’s effective range. If the platform is a light armored vehicle with a 25 mm gun that has a nominal range of 2 km, there is little added value in installing sophisticated, high-performance sensors capable of recognizing targets at 4 km. Cost is typically the major driver in vehicle and optical sight design. Additionally, if a higher magnification sight is used, other factors must be considered, including the need to add (or improve) the sensor stabilization system. Increased magnification also brings a reduced field of view and reduced overall sight utility, which should be evaluated as well. Conversely, larger fields of view (desirable in surveillance situations) reduce detection range because the sensor’s resolution is decreased. Some examples of typical systems will be presented following a description of the development of the target acquisition criteria.

Background

Target acquisition is generally concerned with the detection of points of interest (POIs), and their subsequent recognition and identification.  These criteria were originally quantified by John Johnson in the 1950s.  He investigated the relationship between the ability of the observer(s) to resolve bar targets (one black bar and one white bar equate to one cycle) through an imaging device and their ability to perform the tasks of detection, recognition, and identification of military vehicles through the same optical sensor.[2]  The empirical relationship that Johnson developed serves as the foundation for the de facto standard prediction methodology to be presented here.

Johnson observed a basic spatial resolution dependence for target acquisition tasks.  Table 1 shows the original Johnson criteria for several targets and three discrimination tasks, as noted below.

1.   Detection - “There’s something out there.”

2.   Recognition - “It’s a tank.”

3.   Identification - “It’s a T72 tank.”

From these various visual tasks, and for nine different targets, generally accepted values were determined. In the years since the Johnson criteria were developed, some minor modifications have been made, but the general information has remained unchanged. The changes to the visual tasks are: the detection criterion has changed from one cycle to 0.75 cycles; the recognition task was split into an optimistic value of 3 cycles and a conservative value of 4 cycles; and identification is generally accepted to be 6 cycles.

Table 1.  Johnson Criteria - The number of just resolvable cycles required across a target’s critical dimension for various discrimination tasks.

The human eye’s visual acuity (or resolving power) is about 1 minute of arc[3] (equivalent to about 291 µrad). Comparing this resolution to the required sensor instantaneous field of view (IFOV) provides an estimate of the magnification necessary to achieve the required visual task. For instance, a task requiring a resolution of 50 µrad would require at least a 6× magnification (291 µrad ÷ 50 µrad ≈ 6×).
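As a minimal sketch of this rule of thumb (in Python), the lines below divide the 291 µrad eye acuity quoted above by a task’s required IFOV to estimate the necessary magnification; the 50 µrad value is simply the worked example from the text.

EYE_ACUITY_URAD = 291.0  # eye resolving power, about 1 minute of arc (from the text)

def required_magnification(task_ifov_urad):
    # Magnification needed so the unaided eye can resolve a detail that
    # requires an angular resolution of task_ifov_urad microradians.
    return EYE_ACUITY_URAD / task_ifov_urad

print(f"{required_magnification(50.0):.2f}x")  # 291 / 50 = 5.82, i.e. roughly 6x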

Using the Johnson criteria, the sensor resolution necessary for any visual acquisition task can be determined at any range.  With these criteria and the other data described herein, a general sensor description, including field of view, resolution, etc., can easily be generated.

Analysis

It is fairly simple to compute the sensor performance including field of view and required resolution.  Once the target size and range are established, the other data can be calculated.  Two standard targets will be used in this analysis.  They are a standard tank as defined by the U.S. Army’s Night Vision Laboratories (NVL) and a standard man-sized target.  The dimensions for these targets are presented in Table 2.

Table 2.  Standard targets to be used in the subsequent analysis.

The only other data necessary are the ranges at which these targets are required to be detected, recognized, and identified. For the sake of clarity, it will be assumed that the target contrast, ambient illumination, and target ΔT are sufficient to permit acquisition. Atmospheric attenuation effects on the target’s contrast/ΔT are also included. All these various reduction factors will be assumed to equate to a 50% reduction in overall sensor resolution[4]. Nominal distances for these targets are typically in the 0.5 to 4 km range. Tables 3 and 4 show the required sensor resolution and fields of view for the tank and man targets and the three visual tasks of detection, recognition, and identification.

Table 3.  Tank-Sized Target Sensor Spatial Requirements.

 

Table 4.  Man-Sized Target Sensor Spatial Requirements.
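To make the arithmetic behind tables like these concrete, the Python sketch below estimates the maximum allowable sensor IFOV for each task from the Johnson cycle criteria, the target’s critical dimension, and the range. The two-samples-per-cycle convention and the doubling of the cycle requirement to represent the 50% degradation factor are assumptions made for illustration, so the numbers will not necessarily match the paper’s tabulated values exactly.

JOHNSON_CYCLES = {"detection": 0.75, "recognition": 3.0, "identification": 6.0}  # cycles, from the text (optimistic recognition)
SAMPLES_PER_CYCLE = 2.0    # assumed: two pixels per resolvable cycle
DEGRADATION_FACTOR = 2.0   # assumed handling of the 50% resolution reduction[4]

def required_ifov_mrad(critical_dim_m, range_m, task):
    # Largest per-pixel IFOV (mrad) that still places enough samples
    # across the target's critical dimension for the given task.
    target_subtense_mrad = critical_dim_m / range_m * 1000.0
    samples_needed = JOHNSON_CYCLES[task] * SAMPLES_PER_CYCLE * DEGRADATION_FACTOR
    return target_subtense_mrad / samples_needed

for rng in (500, 1000, 2000, 4000):
    print(f"tank recognition at {rng} m: IFOV <= {required_ifov_mrad(2.3, rng, 'recognition'):.3f} mrad")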

Once the sensor resolution is calculated, the maximum sensor field of view for any given task is easily determined. For example, let’s determine the day-viewing detection FOV for a tank target. For the sensor, we will assume a video camera with a standard 640 horizontal × 480 vertical pixel array. To compute the sensor FOV, simply look up the required horizontal instantaneous field of view (IFOV) for the desired task and multiply it by the number of horizontal pixels, as shown below.

FOV_S = 640 pixels × 0.383 mrad

FOV_S = 245 mrad ≈ 14°

Hence, the maximum video camera FOV to detect a standard tank target at 3.5 km is about 14 degrees.  This calculation can also be performed for a thermal imaging system (TIS) in much the same way.  A scanning TIS such as the first generation Hughes Infrared Equipment (HIRE) system has an effective horizontal pixel count of 418.  Using the same target and other parameters for the first generation HIRE results in a sensor FOV of 9.2°.  Table 5 shows the maximum FOVs for both a first and second generation HIRE and a standard video camera.  A second generation sensor (e.g., the 2nd generation HIRE), has a horizontal pixel count of 575, thereby increasing the allowable FOV by 38%.
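A short Python sketch of the same calculation is shown below; the 0.383 mrad detection IFOV and the 640, 418, and 575 pixel counts are taken directly from the example above, and the degree figures follow from a straight radians-to-degrees conversion.

import math

def max_fov_deg(horizontal_pixels, ifov_mrad):
    # Maximum horizontal FOV (degrees) = horizontal pixel count x per-pixel IFOV.
    return math.degrees(horizontal_pixels * ifov_mrad / 1000.0)

DETECTION_IFOV_MRAD = 0.383  # tank detection IFOV from the worked example above

for name, pixels in (("video camera", 640), ("1st-gen HIRE", 418), ("2nd-gen HIRE", 575)):
    print(f"{name}: {max_fov_deg(pixels, DETECTION_IFOV_MRAD):.1f} deg")
# Prints roughly 14.0, 9.2, and 12.6 degrees; the 575/418 ratio is the ~38% FOV gain noted above.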

Table 5.  Maximum Sensor Fields of View for the Target Cases.

 

Sensors may have more than one FOV to optimize performance for more than one visual task.  Most thermal imagers and many video cameras have at least two FOVs: wide for target detection and narrow for target recognition and identification.  The ratio of the wide to the narrow FOV is typically the same as the ratio of the cycle criteria between detection and recognition (nominally 3:1).  Nearly all tactical ground vehicle thermal sensors have this WFOV/NFOV ratio.  (As a side note, infinitely variable zoom lenses are not typically used due to the problems of boresight wander when optically zooming, causing alignment problems that are difficult and expensive to address.)  Day sights (direct view optics) typically have only one field of view and use unity periscopes to provide wide angle target acquisition.

Parametric Analysis

These techniques can be used to plot field of view versus performance range for two tasks: detection and recognition. Charts 1, 2, and 3 present these data for first and second generation thermal imagers and for a standard video camera. All analysis was done for a standard tank target 2.3 meters wide. To use these graphs, simply find the desired range for a visual task (either detection or recognition) and determine where the curve crosses this range; the corresponding value on the y-axis is the sensor field of view. Conversely, if a desired FOV is known, the corresponding sensor performance can be found by determining where the curve intersects that FOV and reading off the range.

Chart 1.  Maximum sensor field of view for a tank target using a first generation thermal sight.

Chart 2.  Maximum sensor field of view for a tank target using a second generation thermal sight.

 

Chart 3.  Maximum sensor field of view for a tank target using a standard video camera.

These brief examples show the first-order methodology that can be applied to determine the required sensor resolution and fields of view for a given task and target. Although these examples are not exhaustive, they do represent nominally standard sensors for both visible (video camera) and thermal (HIRE) imaging. Higher resolution/performance detector arrays are available at increased cost. An increased number of detectors increases the sensor field of view on a 1:1 basis. For example, if the video camera array size were doubled from 640 to 1280 pixels, the maximum FOV for a given set of target parameters would increase by a factor of two as well. This is one method of achieving a larger FOV while maintaining the same resolution and performance capability. The upper limit of sensor field of view is typically about 45 degrees. Larger FOVs require sophisticated optical systems and impose cost and performance penalties that make them undesirable; they are seldom used.
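As a rough sketch of how curves like those in Charts 1 through 3 can be generated, the Python loop below sweeps range for the tank detection case. It anchors the required IFOV to the 0.383 mrad at 3.5 km value from the earlier example and scales it inversely with range (since the angular subtense of a fixed-size target falls off as 1/range); the 1280-pixel column simply illustrates the 1:1 scaling of FOV with detector count described above.

import math

ANCHOR_RANGE_KM = 3.5     # from the worked example: tank detection at 3.5 km...
ANCHOR_IFOV_MRAD = 0.383  # ...requires roughly a 0.383 mrad IFOV

def detection_fov_deg(pixels, range_km):
    # Required IFOV scales as 1/range for a fixed-size target and a fixed cycle criterion.
    ifov_mrad = ANCHOR_IFOV_MRAD * ANCHOR_RANGE_KM / range_km
    return math.degrees(pixels * ifov_mrad / 1000.0)

for range_km in (1.0, 2.0, 3.0, 4.0):
    print(f"{range_km:.1f} km: 640 px -> {detection_fov_deg(640, range_km):5.1f} deg, "
          f"1280 px -> {detection_fov_deg(1280, range_km):5.1f} deg")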

Summary

This paper provided a brief overview of the system engineer’s methodology in choosing an optical sensor for a military vehicle. Once the weapon’s maximum range and the target size(s) are established, a general sensor description can be determined. The balance of performance, cost, and capability must be carefully weighed when designing an overall optical sighting system. All criteria are inter-related, and less important criteria can be traded off to permit more important requirements to be met.

Cost allocations for the visible and infrared sensors need to be considered. If cost is a secondary concern, then a higher performance, increased resolution sensor provides additional benefits: either longer-range detection or a larger field of view (or a combination of the two). When cost is a primary concern, the target and range parameters, combined with the lower-cost sensors, drive the overall FOV and subsequent performance.

Implications for Further Study

For specific performance predictions, detailed software performance prediction models that are considered industry standards should be used to obtain more exact data.  Sensor performance can be predicted for specific targets, atmospheric conditions, and for both day and night viewing.  Before a final determination is made, it is strongly recommended that these models be used to provide exact performance instead of the estimates provided in this paper.

 

Note: This article authored by Russell Lombardo was published in the July 1998 Photonics magazine.

Copyright Lombardo Technical Services

[1] The Infrared and Electro-Optical Systems Handbook, James D. Howe, 1993, Environmental Research Institute of Michigan and The Society of Photo-Optical Instrumentation Engineers, Volume 4, page 61.

[2] The Infrared and Electro-Optical Systems Handbook, James D. Howe, 1993, Environmental Research Institute of Michigan and The Society of Photo-Optical Instrumentation Engineers, Volume 4, page 92.

[3] The eye’s resolving power quoted here, 1 minute of arc, is a minimum value; the actual real-world figure is perhaps closer to 1.5 minutes of arc.

[4] This factor of two (a 50% reduction in effective sensor resolution) accounts for many attenuation factors, e.g., target ΔT or contrast, atmospheric attenuation, sensor performance, optical attenuation, light levels, etc. Although this is only a rough estimate, calculations using more sophisticated models approximate this factor using many algorithms that are beyond the scope of this paper.