WO2020165581A1 - Night vision device testing - Google Patents

Night vision device testing

Info

Publication number
WO2020165581A1
WO2020165581A1 (PCT/GB2020/050320)
Authority
WO
WIPO (PCT)
Prior art keywords
image
bars
night vision
vision device
test
Prior art date
Application number
PCT/GB2020/050320
Other languages
French (fr)
Inventor
Laurence Durnell
Original Assignee
Fenn Night Vision Limited
Priority date
Filing date
Publication date
Application filed by Fenn Night Vision Limited filed Critical Fenn Night Vision Limited
Priority to GB2113029.9A priority Critical patent/GB2596009B/en
Publication of WO2020165581A1 publication Critical patent/WO2020165581A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing

Definitions

  • the present invention relates to night vision devices (NVDs), such as night vision goggles (NVGs), and in particular to the testing of NVDs.
  • NVDs night vision devices
  • NVGs night vision goggles
  • Night vision devices including binoculars and monoculars as well as goggles, are much used in military and other environments to provide improved vision in low light environments. They use well known image intensification methods to produce, from a low intensity image which the human eye cannot easily see, a higher intensity image which it can.
  • NVDs do not produce perfect images, and various methods are known for testing the quality of the images that they do produce.
  • Common problems with the images include poor resolution, the presence of blemishes in the images, i.e. small features in the intensified image which do not correspond to anything in the original image, and also the presence of a hexagonal ‘honeycomb’ or ‘chicken wire’ pattern in the intensified image which arises due to the structure of the micro-channel plate used for image intensification and becomes visible under some conditions.
  • Known NVD testing systems are also limited to performing tests using static input images, which is not representative of the use of NVDs and limits the parameters of the NVD that can be tested.
  • Known NVD testing systems are also generally configured to perform only one type of test at a time, requiring an operator to specify exactly which test is to be performed and to initiate each different test in turn, thus increasing the time required to perform a comprehensive test of a NVD.
  • a system for testing a night vision device comprising detection means arranged to detect an image produced by the night vision device and to output image data encoding the image; image generating means arranged to generate an input image so that the image produced by the night vision device is an output image reproducing the input image; processing means arranged to analyse the image data to determine whether the image encoded in the image data meets at least one criterion, and to generate an output indicative of whether the at least one criterion is met; and a mount arranged to support the night vision device and to align the night vision device with the detection means and the image generating means.
  • the detection means, the image generating means, the mount and optionally also the processing means may be assembled together to form an integrated system configured to receive the night vision device for testing.
  • the system may be arranged such that the night vision device is optically coupled with the image generating means and the detection means when it is supported by the mount.
  • the system may further comprise a casing.
  • the detection means, the image generating means, the mount and optionally the processing means may be arranged within the casing so as to receive the night vision device for testing.
  • the casing may be configured to be transported by hand.
  • the casing may be in the form of a briefcase or suitcase.
  • the casing may be rigid.
  • the system may be arranged to form a cavity for receiving the night vision device.
  • the cavity may be located between the detection means and the image generating means.
  • the mount may be arranged to support the night vision device within the cavity.
  • the image generating means may comprise an active screen, such as a digital display, e.g. an LCD, LED or OLED screen, and the processing means may be configured to control the image displayed by the active screen.
  • the processing means may be configured to perform a series of tests in response to a single input command. Each of the tests may be configured to evaluate whether the image encoded by the image data meets a different criterion.
  • the processing means may be configured to cause the image generating means to generate a series of images for performing the tests.
  • the detection means may comprise an objective lens arranged to focus the output image output by the night vision device for recording by the detection means.
  • the system may be configured to autofocus the objective lens of the detection means based on a focus metric calculated by the processing means from the image data.
  • the system may be configured to autofocus the output image of the night vision device for imaging by the detection means by adjusting the focus of the objective lens of the detection means based on a focus metric calculated by the processing means from the image data.
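The autofocus described above depends on a focus metric calculated from the image data. The patent does not specify which metric is used; the sketch below uses the variance of a discrete Laplacian, a common sharpness measure, with a `box_blur` helper (an assumption, used only to simulate a defocused image):

```python
import numpy as np

def focus_metric(image):
    """Variance of a discrete Laplacian: larger when edges are sharper."""
    lap = (-4.0 * image[1:-1, 1:-1]
           + image[:-2, 1:-1] + image[2:, 1:-1]
           + image[1:-1, :-2] + image[1:-1, 2:])
    return float(lap.var())

def box_blur(image, k=5):
    """Simple mean filter, used here only to simulate a defocused image."""
    pad = np.pad(image, k // 2, mode='edge')
    out = np.zeros_like(image)
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return out / (k * k)

# A sharp vertical step edge scores higher than the same edge defocused.
sharp = np.zeros((32, 32))
sharp[:, 16:] = 1.0
defocused = box_blur(sharp)
```

An autofocus loop would then adjust the objective lens so as to maximise this metric over successive frames.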
  • the processing means may be arranged to cause the image generating means to generate an input image comprising a moving feature.
  • the processing means may be arranged to calculate a spatial resolution based on the intensity profile defining an edge of the moving feature in the image encoded by the image data.
  • the processing means may be arranged to cause the image generating means to generate at least one input image comprising a plurality of sets of equally spaced parallel bars, preferably on a plain background, each of the sets of parallel bars having a different predetermined spacing between the bars.
  • the processing means may be arranged to measure, for each of the sets of parallel bars, the contrast between the bars and the spaces between the bars in the image encoded by the image data.
  • the processing means may be arranged to calculate a modulation transfer function based on the measured contrast values for the plurality of sets of bars and the predetermined spacing of the bars.
  • the processing means may be arranged to identify the presence of at least one feature in the image.
  • the at least one criterion may be the presence of a feature, or the presence of at least a predetermined number of features.
  • the system may further comprise a display screen.
  • the system may be arranged to provide an indication on the display screen of whether the at least one feature is detected.
  • the system may be arranged to display the image on the display screen, and may be arranged to highlight the feature on the image.
  • the input image may be a plain image, for example a plain white or other pale colour image.
  • the processing means may be arranged to identify features in the output image which are not present in the input image.
  • the processing means may be arranged to identify features meeting at least one feature criterion, and to determine the number of such features in at least an area of the image.
  • the features may be parts of a hexagonal ‘chicken wire’ pattern, in which case the number of them that are identified will indicate the extent or strength of the pattern. Alternatively they may be blemishes, which can be identified by various criteria.
  • the processing means may be arranged to control the display so as to highlight each of the features that meet the at least one criterion.
  • the input image may comprise a plurality of features of the same shape but of different sizes, or different orientations, or both, and the processing means may be arranged to determine which of the corresponding features in the output image meet the at least one criterion. For example, a set of criteria may be used to define the feature, which may itself comprise a number of sub-features, such as a number of bars or lines. The size of the smallest feature that can be identified will be an indication of the resolution of the NVD.
  • the processing means may be arranged to measure a parameter of the output image and to calculate from the parameter the gain of the device.
  • the processing means may be arranged to measure a parameter of the image or the output image, and to calculate from the parameter the level of gross distortion produced by the device.
  • the processing means may be arranged to detect discontinuities produced by the device in the image or the output image.
  • the processing means may be arranged to measure the magnification of the device.
  • the processing means may be arranged to measure alignment between two sides of a binocular night vision device.
  • the processing means may be arranged to measure flicker in the image or the output image.
  • the processing means may be arranged to measure signal to noise ratio in the image or the output image.
  • the processing means may be arranged to measure a characteristic of a halo surrounding a feature in an input image.
  • the present invention further provides a system for aiding a user in focusing an optical device, the system being arranged to acquire a sequence of images output by the device, calculate a focus metric for each of the images, and generate a user-perceptible output indicative of the focus metric.
  • the output may be arranged to vary in real time during adjustment of the focus of the device.
  • the output may be a plot of the metric against time, for example a line plot or a bar graph.
  • the system may be arranged to identify a maximum value of the metric as the device passes through a point of maximum focus, and then to detect when the value of the metric returns to that maximum value and in response to generate a further user-perceptible output.
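The peak-tracking behaviour described above can be sketched as a small state machine. The `FocusAid` class name and the `tolerance` parameter below are illustrative assumptions, not terms from the patent:

```python
class FocusAid:
    """Tracks a focus metric over time; fires once the metric, having
    fallen away from its running maximum, climbs back to (near) it.
    The tolerance band is an assumed implementation detail."""

    def __init__(self, tolerance=0.02):
        self.best = float('-inf')
        self.passed_peak = False
        self.tolerance = tolerance

    def update(self, metric):
        if metric > self.best:
            self.best = metric          # still climbing towards the peak
            return False
        # Metric has dropped noticeably below the best seen: peak was passed.
        if metric < self.best * (1 - self.tolerance):
            self.passed_peak = True
        # Fire once the metric returns to within tolerance of the maximum.
        return self.passed_peak and metric >= self.best * (1 - self.tolerance)

# Simulated focus-metric readings as the user turns the focus ring past
# best focus and then back again.
aid = FocusAid()
readings = [0.2, 0.5, 0.9, 0.6, 0.4, 0.7, 0.89]
fired = [aid.update(m) for m in readings]
```

Only the final reading, which returns to within the tolerance band of the earlier maximum, triggers the further user-perceptible output.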
  • a method for testing the dynamic spatial resolution of a night vision device, the method comprising: generating a moving input image for the night vision device, the moving input image comprising a moving feature; and calculating a dynamic spatial resolution based on the intensity profile defining an edge of the moving feature in an output image generated by the night vision device.
  • the output image may be a static or still image (i.e. a single still frame) or may be a moving image (i.e. a sequence of frames).
  • the method may comprise calculating a spatial resolution for each of a plurality of output images generated by the night vision device at different times, and averaging the plurality of calculated spatial resolutions to give an average spatial resolution.
  • the edge used to calculate the spatial resolution may be a straight edge.
  • the straight edge may extend in a direction perpendicular to the direction of movement of the moving feature.
  • the edge used to calculate the spatial resolution may be a leading or a trailing edge of the moving feature.
  • the input image may comprise a plain background.
  • the moving feature may be square or rectangular in shape.
  • the moving feature may have a uniform intensity or shading across its area.
  • a method for measuring the modulation transfer function of a night vision device, the method comprising: generating at least one input image for the night vision device, the at least one input image comprising a plurality of sets of equally spaced parallel bars, preferably on a plain background, each of the sets of parallel bars having a different predetermined spacing between the bars; obtaining at least one output image generated by the night vision device, the at least one output image reproducing the at least one input image; measuring for each of the sets of parallel bars in the at least one output image the contrast between the bars and the spaces between the bars; and calculating the modulation transfer function based on the measured contrast values for the plurality of sets of bars and the predetermined spacing of the bars.
  • the measured contrast values may be representative of the maximum difference in intensity between the spaces and the bars as measured from the output image.
  • the measured contrast values may be the maximum measured contrast between the spaces and the bars.
  • Calculating the modulation transfer function may comprise normalising the measured contrast values with respect to the theoretical maximum contrast or the actual contrast in the input image between the bars and the spaces between the bars.
  • the theoretical maximum contrast is the maximum contrast that could be recorded by the night vision device for bars that have an infinite thickness and which are infinitely spaced apart.
  • the maximum theoretical contrast is the maximum measurable contrast between an infinitely sized area corresponding to the spaces and an infinitely sized area corresponding to the bars, i.e. as would be measured by a night vision device with a perfect spatial resolution.
  • the at least one input image may comprise a plurality of input images, each of the input images comprising at least one of the sets of parallel bars.
  • each of the input images may comprise one of the sets of parallel bars.
  • the measured contrast between the bars and the spaces between the bars for each of the sets of parallel bars may be an average of the contrast between the bars and the adjacent spaces between the bars in that set.
  • a night vision device testing system configured to perform any of the methods described above.
  • the system may comprise detection means arranged to detect an image produced by the night vision device and to output image data encoding the image; image generating means arranged to generate an input image so that the image produced by the night vision device is an output image reproducing the input image; and processing means arranged to analyse the image data.
  • Figure 1 is a schematic diagram of a night vision device testing system, which may be used in accordance with the methods of the invention;
  • Figure 2 shows an example of a test image as displayed on the screen of the system of Figure 1;
  • Figure 3 shows an example of a further test image as displayed on the screen of the system of Figure 1;
  • Figure 4 is a chart showing the steps of a test procedure performed by the system of Figure 1;
  • Figure 5 is a display produced by the system of Figure 1 during the test procedure of Figure 4;
  • Figure 5a shows the variation in intensity over one of the bars of the image of Figure 2, as measured in the procedure of Figure 4;
  • Figures 6, 7, 8, 9 and 10 are charts showing the steps of further test procedures performed by the system of Figure 1;
  • Figure 11 shows a test image used in the test procedure of Figure 10;
  • Figure 12 shows an output image from the device, analysed in the test procedure of Figure 10;
  • Figure 13 is a chart showing the steps of a further test procedure performed by the system of Figure 1;
  • Figure 14 shows an output image from the device, analysed in the test procedure of Figure 13;
  • Figures 15, 16, 17, and 18 are charts showing the steps of further test procedures performed by the system of Figure 1;
  • Figure 19 is a schematic top view of a system according to a further embodiment of the invention.
  • Figure 20 is a schematic section through the system of Figure 19.
  • Figure 21 shows a graph of the modulation transfer function (MTF) plotted as a function of spatial frequency of parallel test bars.
  • Figure 22 illustrates aspects relating to the modulation transfer function.
  • the left-hand side of Figure 22 shows an input image and the relative intensity profile of the input image and the right-hand side of Figure 22 shows an output image generated by a night vision device reproducing the input image and the relative intensity profile of the output image.
  • Figure 23 shows an input image for a halo test.
  • Figure 24 shows a perspective view of an integrated portable testing system according to the invention.
  • Figure 25 shows a schematic top-down view of the system shown in Figure 24.
  • a test system comprises an image generation device 100, which includes a display screen and is arranged to display one or more images, and may comprise a mounting 102 arranged to support an NVD 104 so that the image displayed on the image generation device 100 is in the field of view of the NVD 104 and forms an input image for the NVD 104.
  • the image generation device preferably comprises an active screen, such as a display screen capable of generating a number of different images. Examples of such screens include LCD, LED and OLED screens.
  • the system further comprises an image detection device, such as a digital camera 106. This may be, for example, an IDS UI-1490SE, or other suitable machine vision camera.
  • the system may further comprise a further mounting device or connector 108 which is arranged to connect the camera 106 against the eyepiece 110 of the NVD so as to detect and record the output image generated by the NVD 104, and to output image data encoding the image.
  • a processing system or computer device, in this example a tablet 112, may be arranged to receive image data from the camera or cameras 106, and may include a processor 114, memory 116 and a display screen 118. It may also be arranged to control various aspects of the operation of the camera or cameras 106, such as the frame rate and exposure time of the camera.
  • the processing system may also be arranged to control which of the detectors from the full array of the camera are used for each frame, so that, for example, each frame can include the maximum number of pixels corresponding to the total number of detectors in the camera, or a smaller number of pixels using only some of the detector array. This can allow smaller images to be acquired at a higher frame rate than is possible for the maximum image size.
  • the test system may be purely for monocular testing, and may generate only a single input image at one time. Alternatively it may be for binocular testing, and may therefore generate two images simultaneously, one for each side of a binocular NVD. Operation will be described below for a single monocular system, but for each test it will be appreciated that both sides of a binocular system may be tested in the same way, either simultaneously, or sequentially but without moving the NVD. A sliding mounting for a binocular NVD may also be incorporated in the system so that the two sides can be tested sequentially using the same input image.
  • the test system, in particular the image generation device 100, may be arranged to perform a number of functions, but for the purposes of the present invention the relevant functions may include the display of various types of input image: one of which may be a plain image, for example a white image with a number of rings defining different regions of the image, against which blemishes and chicken wire artefacts can easily be detected; one of which may be a resolution checking image which may contain features of a range of sizes, for example as described in more detail below; and one of which may be a rectangular or square grid.
  • the tablet 112 may be arranged to run a software program or application which is arranged to analyse the image data from the camera 106 to test the NVD 104 to determine whether the data, and therefore the output image, meets at least one test criterion.
  • the program may provide a number of functions which can be selected by a user, for example using icons 120 displayed on the screen 118 of the tablet.
  • the embodiment shown provides all of the functions which will be described below, but other embodiments may offer only some of these, and may offer others.
  • the image generation device 100 may generate an image, shown in Figure 2, which comprises a number of sets of bars 200.
  • Each bar 200 is a solid rectangle or other suitable shape, of black or another suitable colour, and each set 200a, 200b may comprise three identical bars arranged parallel to each other with a small gap between them so that the whole set covers an approximately square area.
  • the sets 200a, 200b vary in size, but for each size there may be a pair of two sets 200a, 200b one with the bars arranged vertically and one with the bars arranged horizontally.
  • the pairs may be arranged in groups of four pairs, and in Figure 2 two groups 202, 204 are shown, with the largest sets of bars in one group 204 being smaller than the smallest sets in the other group 202.
  • one of the icons 120 can be selected by a user to test the resolution of the NVD, and this is selected when the image generation device 100 is displaying the resolution testing image shown in Figure 2.
  • the program may then be arranged, using image processing techniques such as edge detection and shape recognition, to detect as many sets 200a, 200b of bars 200 as it can, or to measure the resolution of the bars to determine which of them are imaged with at least a threshold level of resolution. For each set that it detects, or detects with sufficient resolution, it may be arranged to identify that set, for example by highlighting it on the screen; in the example shown, this is done by displaying a box 206 around it.
  • the results of the test can be displayed in a number of ways.
  • the highlighted sets 200a, 200b can simply be displayed so that a user can determine which ones have been identified, and therefore determine the resolution of the NVD in a similar way to when the user is looking at the sets of bars and determining themselves which ones can be seen and which not.
  • the program can be arranged to display details of the smallest set or sets, or the smallest group and all of the identified sets within that group, for example just indicating them by number.
  • a display area 210 on the screen indicates that sets a, b and c from Group 2 are the smallest sets that have been identified. This has the advantage over known methods that the user does not have to decide which of the sets is clear enough to be distinguished and which not. An example of how this test can be performed is described in more detail below with reference to Figure 6.
  • the resolution test described above can be performed at several different locations in the field of view (FOV) of the NVD so as to evaluate how the resolution varies across the FOV.
  • FOV field of view
  • the sets 200a, 200b of bars 200 used to determine the resolution may be located towards the edges of the FOV to determine the edge resolution, which often differs from the resolution in the central region of the NVD.
  • the resolution test may be performed for a central region of the FOV and for at least one edge region of the FOV and the results of these tests output by the test system.
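The per-set decision at the heart of the resolution test above could be implemented as a simple modulation check on each bar-set region of the output image. The `set_resolved` name and the threshold value below are illustrative assumptions; a real implementation would first locate each set 200a, 200b by the edge detection and shape recognition described above:

```python
import numpy as np

def set_resolved(region, threshold=0.2):
    """Judge a bar-set region 'resolved' if the modulation between the
    darkest (bar) and brightest (gap) column means exceeds a threshold.
    The name and threshold are illustrative, not from the patent."""
    profile = region.mean(axis=0)          # average along the bar length
    lo, hi = float(profile.min()), float(profile.max())
    return (hi - lo) / (hi + lo + 1e-9) >= threshold

# Synthetic regions: a clearly resolved bar set and a washed-out one.
resolved = np.tile([0.1, 0.1, 0.9, 0.9], 8).reshape(1, -1).repeat(8, axis=0)
washed_out = np.full((8, 32), 0.5) + 0.01 * resolved
```

The smallest set for which this check still passes then indicates the resolution limit, and running it on sets placed near the edges of the FOV gives the edge resolution.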
  • the test system may also be arranged to measure the modulation transfer function (MTF) of the NVD.
  • MTF modulation transfer function
  • the MTF is another measure of spatial resolution and effectively describes how the measured contrast varies as a function of the spacing between adjacent features, for example as a function of the spacing between adjacent parallel bars such as those found in the resolution test images described above and shown in Figure 2.
  • the MTF is a measure of the ability of a NVD to transfer contrast at a particular resolution (i.e. the spatial frequency of repeating features) from the input image to the output image. As the spacing between adjacent bars in the input image decreases, it becomes increasingly difficult for the lens to transfer the contrast between the dark areas of the bars and the light areas between the bars, and, as a result, the MTF decreases.
  • the MTF is expressed as the relative contrast (normalised relative to the actual contrast in the input image, or the maximum possible contrast measurable in the case of the adjacent bars being spaced infinitely far apart) between the measured intensity of the bars and the measured intensity of the gaps or spaces between the adjacent bars, expressed as a function of the spacing between the bars, or the spatial frequency of the bars, e.g. in lines per unit distance (e.g. lines per mm) or expressed as an angular spatial frequency (e.g. cycles/mrad) relative to the observation point.
  • the relative contrast is the maximum measured contrast between the bars and the spaces between the bars, as illustrated in Figure 22.
  • the relative contrast is the difference in intensity between the maxima 130 of the peaks and the minima 132 of the troughs corresponding to adjacent spaces and bars, respectively.
  • as the spatial frequency of the bars increases, the relative contrast between the bars and the spaces decreases.
  • the test system is arranged to display an input image comprising one or more sets of equally spaced parallel bars on a plain background.
  • the width of the spaces between the bars should be the same as the width of the bars themselves.
  • the bars may be black and the background may be white.
  • the test system may be arranged to display a resolution testing image, such as that shown in Figure 2, comprising a plurality of sets 200a, 200b of parallel bars 200.
  • the bars in each of the sets are equally spaced apart with a known spacing or frequency, but the spacing between the bars in different sets is different.
  • the test system may display the different sets of parallel bars in different images.
  • the test system may display a series of images on the image generating means, each of the images comprising one or more sets of bars having different bar spacings or spatial frequencies.
  • the processing means is arranged to measure, for each of the sets of parallel bars, the contrast between the bars and the spaces between the bars in the image encoded by the image data output by the detection means.
  • the processing means is further arranged to calculate the contrast between the minimum and maximum intensity of the areas corresponding to the bars and the areas corresponding to the spaces between the bars in each of the sets of parallel bars.
  • the measured contrast values are then normalised relative to the maximum theoretical contrast between the bars and the spaces between the bars, and the processing means calculates the modulation transfer function based on the normalised measured contrast values and the known spacing between the bars in each of the sets of parallel bars.
  • the test system may then display a plot of the MTF as a function of spatial frequency of features or line spacing and may output and display MTF values for one or a selection of different spatial frequencies or line spacing.
  • the test system may be arranged to output a pass or fail indicator based on whether the MTF satisfies a predetermined MTF requirement. For example, in order to pass the MTF test the test system may require the MTF values at one or a plurality of different spatial frequencies to be above a predetermined threshold for that spatial frequency.
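Taken together, the MTF steps above (measure the contrast for each bar spacing, normalise by the input contrast, then compare against per-frequency thresholds) might be sketched as follows. The function names, the use of Michelson contrast, and the threshold values are illustrative assumptions:

```python
import numpy as np

def contrast(profile):
    """Michelson contrast between bar troughs and gap peaks in a profile."""
    lo, hi = float(profile.min()), float(profile.max())
    return (hi - lo) / (hi + lo + 1e-9)

def mtf(profiles_by_freq, input_contrast=1.0):
    """Normalise each measured contrast by the known input-image contrast,
    keyed by the spatial frequency of that bar set."""
    return {f: contrast(p) / input_contrast
            for f, p in sorted(profiles_by_freq.items())}

def passes(mtf_curve, thresholds):
    """Pass only if the MTF clears its threshold at every tested frequency."""
    return all(mtf_curve[f] >= t for f, t in thresholds.items())

# Simulated intensity profiles: measured contrast falls as frequency rises.
profiles = {
    1.0: np.array([0.0, 1.0, 0.0, 1.0]),   # widely spaced bars
    4.0: np.array([0.3, 0.7, 0.3, 0.7]),   # finely spaced bars
}
curve = mtf(profiles)
```

The resulting `curve` is what the test system would plot as MTF against spatial frequency, and `passes` implements the per-frequency pass/fail indicator.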
  • the test system may also be arranged to perform a dynamic spatial resolution test using a moving input image.
  • the test system may display a moving input image comprising a moving feature, e.g. a shaded shape on a plain background.
  • the moving feature may travel across the input image at a predetermined speed and processing system 112 may be configured to calculate the dynamic spatial resolution of NVD 104 based on the sharpness of the intensity profile defining an edge of the moving feature in the image encoded by the image data.
  • the edge may be a leading or trailing edge of the moving feature, which will become blurred due to the movement of the feature, the inherent static resolution of the NVD, and the response time of the NVD.
  • the intensity profile, e.g. the sharpness of the leading or trailing edge, taken along the direction of movement of the feature across the input image provides a measure of the dynamic spatial resolution of the NVD.
  • the leading or trailing edge of the moving feature extends perpendicularly to the direction of movement of the feature in the input image.
  • where the moving feature is a rectangle or a square, the feature preferably moves across the image in a direction perpendicular to two opposing sides of the rectangle or square.
  • the test system may be arranged to measure the dynamic spatial resolution at a plurality of discrete times as the feature moves across the image and to average the measured values to provide an average dynamic spatial resolution.
  • the test system may be configured to continuously measure the dynamic spatial resolution for a predetermined period of time as the feature moves across the input image and to output an average value, for example an arithmetic mean, measured over the predetermined period.
  • the test system may measure the dynamic resolution in perpendicular directions, preferably in the horizontal and vertical directions, by first generating a moving input image in which the feature moves in a first direction and then generating a moving input image in which the feature moves in a second direction perpendicular to the first and calculating the dynamic spatial resolution in both directions.
  • the test system may also be configured to measure the dynamic spatial resolution for a plurality of moving speeds of the moving feature.
  • a first dynamic spatial resolution may be calculated with the moving feature moving at a first speed and a second dynamic spatial resolution may be calculated with the moving feature moving at a second speed across the input image.
  • the test system may then display a plot of the dynamic spatial resolution as a function of moving speed of the feature. Measuring the dynamic spatial resolution is particularly useful because NVDs are generally used in dynamic situations and the FOV being imaged typically includes a large number of moving objects. It is therefore more relevant to the real world performance of the device to measure the dynamic spatial resolution than the static spatial resolution.
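The dynamic-resolution calculation above turns on quantifying the sharpness of the moving edge's intensity profile. The patent leaves the exact measure open; one plausible choice, sketched below, is the 10%-90% rise distance of the profile taken along the direction of movement, averaged over several frames (the names and the metric choice are assumptions):

```python
import numpy as np

def edge_width(profile):
    """10%-90% rise distance (in pixels) of a dark-to-light edge profile;
    a smaller width means a sharper edge and better dynamic resolution."""
    lo, hi = float(profile.min()), float(profile.max())
    t10 = lo + 0.1 * (hi - lo)
    t90 = lo + 0.9 * (hi - lo)
    i10 = np.argmax(profile >= t10)   # first sample above the 10 % level
    i90 = np.argmax(profile >= t90)   # first sample above the 90 % level
    return float(i90 - i10)

# A sharp static edge versus the motion-blurred edge of a moving feature.
sharp = np.array([0.0, 0.0, 1.0, 1.0, 1.0])
blurred = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])

# Average over (here, identical) profiles sampled at several times.
frames = [edge_width(blurred)] * 3
average_width = float(np.mean(frames))
```

Repeating this for features moving at different speeds and in perpendicular directions would give the per-speed and per-direction results the patent describes, with a wider average edge indicating a lower dynamic spatial resolution.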
  • the test system may also be arranged to perform a halo measurement test.
  • when the NVD images a bright feature, a “halo” of intensity is seen surrounding the feature (identified by the white circles in Figure 23).
  • These image halos are a common problem experienced with NVDs and it is therefore useful to measure them.
  • the test system may be arranged to display an input image 134 comprising one or more light or bright features 136 against a plain dark background 138.
  • features 136 may be white circles and the background may be a plain black background.
  • Processing system 112 is arranged to measure the diameter of halo(s) 140 surrounding the one or more features in the image encoded in the output image data, for example using a threshold brightness criterion or edge detection algorithms.
  • the diameter of halo(s) 140 is then compared to the diameter of the corresponding feature 136 in the input image, which is predetermined and known by the test system that generates the input image. The diameter, size or other parameter characterising the halo, for example the increase in diameter of the feature caused by the halo, is then determined and output.
  • the halo test can be performed for different contrast levels between features 136 and background 138 to provide a measure of how the halo varies as a function of contrast.
  • the halo measurement can also be performed at different locations within the NVD field of view to determine how the halo varies within the FOV or the results of the halo test at different locations can be averaged to provide an average (e.g. mean) halo parameter.
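The threshold-based halo measurement described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the image representation (a 2D list of grey levels), the single-row scan and the specific threshold are all assumptions made for the example.

```python
def measure_halo(image, centre, feature_diameter_px, threshold):
    """Estimate halo growth around a bright feature (hypothetical sketch).

    image: 2D list of grey levels; centre: (row, col) of the feature;
    threshold: brightness above which a pixel counts as lit by the
    feature or its halo. Returns the increase in diameter (in pixels)
    beyond the known input feature diameter.
    """
    row, col = centre
    line = image[row]
    # Walk outwards from the centre until brightness falls below the threshold.
    left = col
    while left > 0 and line[left - 1] >= threshold:
        left -= 1
    right = col
    while right < len(line) - 1 and line[right + 1] >= threshold:
        right += 1
    lit_diameter = right - left + 1
    # The halo is reported as the growth in diameter beyond the known feature.
    return lit_diameter - feature_diameter_px
```

A real implementation would average over several directions or fit a circle via edge detection; the single-row scan here only shows the comparison of lit diameter against the known feature diameter.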
  • the test system is arranged to display the plain white test input image with circular regions defined on it as described above.
  • a completely plain image could be used instead.
  • the tablet is then arranged to use image processing techniques to identify two specific features in the output image: hexagonal patterns, commonly referred to as ‘chicken wire’ or ‘honeycomb’ patterns, and blemishes.
  • the chicken wire pattern 300 is very distinctive and the program is arranged to detect individual lines 302 including their position and orientation, and then to analyse them to determine whether they are part of ‘chicken wire’ patterning.
  • the tablet is arranged to do this by detecting the presence of, for example, groups of three lines 304a, 304b, 304c meeting at a point at angles of 120° to each other. Once a sufficient number of these has been detected, the program is arranged to determine that the ‘chicken wire’ pattern has been detected.
  • the strength of the pattern can be determined by, for example, the number of instances of the required combination of lines being detected in the whole image, and the thickness or darkness of the lines compared to the background.
  • the program can then compare the detected pattern against one or more threshold criteria and give a pass or fail indication. Alternatively, or in addition, it can be arranged to grade the severity of the chicken wire pattern, for example on a scale of 1 to 6. An example of how this can be done is described in more detail below with reference to Figure 7.
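The geometric test for a ‘chicken wire’ vertex (three lines meeting at roughly 120° to each other) can be sketched as follows. The angle tolerance is an assumption for illustration; the patent does not specify one.

```python
def is_chicken_wire_junction(angles_deg, tol=10.0):
    """Check whether three line orientations meeting at a point are
    consistent with a hexagonal 'chicken wire' vertex, i.e. mutually
    separated by roughly 120 degrees (sketch; tolerance is an assumption)."""
    if len(angles_deg) != 3:
        return False
    a = sorted(x % 360 for x in angles_deg)
    # The three angular gaps around the junction, including the wrap-around gap.
    gaps = [a[1] - a[0], a[2] - a[1], (a[0] + 360) - a[2]]
    return all(abs(g - 120) <= tol for g in gaps)
```

As the text describes, the detector would count such junctions across the image; once a sufficient number is found, the pattern is deemed present, and the count itself can feed the severity grading.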
  • the results can then be indicated to the user, for example in a results area 306 on the screen 118.
  • the program is arranged to identify, again using edge detection, shape recognition and other suitable algorithms, any features (other than the hexagonal pattern) which are present in the image, and to determine their size. Once the number and size of the blemishes has been determined, this can be displayed to the user, either by simply highlighting them so that the user can count them, or by giving a direct indication of the number in a dedicated area of the screen 118. Obviously a variety of criteria can be used to determine what should be counted as a blemish, for example it might include only shapes over a particular area and/or length.
  • each detected blemish can either just be identified as a blemish, or it may be categorized, for example by size and/or shape, and the number of each category of blemish indicated to the user, either just as a number for each category indicated in the results area 306 on the screen, or by highlighting different categories in different colours.
  • An example of how the test for hexagonal patterning is performed is described below with reference to Figure 8. While it is possible for the chicken wire detection and blemish counting to be performed for the whole of the image, it can be more useful for a separate analysis to be carried out for different parts of the image.
  • the image may be divided into a plurality of different areas 402, which may be indicated on the image, for example by concentric rings 403 included in the white test image as shown in Figure 3, or rectangles or squares or segments.
  • the determination of the presence or absence and categorization of chicken wire patterning, and the number and nature of blemishes, is then performed for each area, and the results indicated separately for each area in the results area 306 of the screen.
  • These lines are used if the test is done by eye, but, as will be described in more detail below, they can also be used for other automated tests performed by the system.
  • the eyepiece 110 of the NVD is typically manually adjustable to adjust the focus of the output image produced by the NVD.
  • the system may be arranged to perform a focus assist function which aids a user in focusing the NVD so that other tests can be performed on it.
  • the program is arranged to perform a calculation of focus on an output image, for example generated using the resolution test image of Figure 2, and to repeat this for subsequent images over a test period while the user adjusts the focus, and indicate the result of the focus calculation, and how it varies over time, to the user on the screen 118.
  • the result may be displayed in real time, for example on a plot of focus against time, which may have time on a horizontal axis and focus, as determined by a focus metric, on the vertical axis.
  • the user can then watch the plot as the focus is adjusted to determine when the optimum focus is reached.
  • the program may be arranged to acquire an initial set-up image data set for an image of the test pattern with a reference exposure, and from that data set to calculate the best exposure settings.
  • the program is then arranged to communicate those optimum settings to the camera which is then arranged to use them for the rest of the test.
  • the program selects a test frame rate which is as fast as possible, whilst ensuring that the contrast is sufficient for the focus of each image to be determined. This may comprise testing the contrast between the lightest and darkest parts of the set-up image and ensuring that it is above a predetermined threshold, which is sufficiently great for the focus measurement to work. Generally the greater the frame rate, the lower the contrast will be.
  • the program is arranged to acquire a series of test images at the test frame rate, and for each test image, to calculate the value of a focus metric and display the value on the screen as shown in Figure 5.
  • the value of the metric for each frame may be displayed as soon as it is calculated so as to generate a real-time plot of the focus. As the value of the metric changes over time, the plot will rise and fall, and the user can monitor this while adjusting the focus to determine where the best focus is.
  • the focus might increase towards the maximum, and then decline again, which will be seen as a rising then falling curve 500. From this the user can see approximately where the maximum value 502 of the focus metric occurs, and move the eyepiece back towards the maximum focus. Then as the focus passes through the maximum again at point 504 this can be seen and the focus adjusted back again to the maximum. Obviously in practice the user may move the focus through the optimum any number of times, but with practice the optimum focus can be found very quickly.
  • the program may also monitor the variation in focus metric over time and may detect the maximum value of the focus metric, for example by identifying the top 502 of the curve as the focus moves through that point. It may then be arranged to store the maximum value, and generate an output to the user when that maximum value is again reached, for example as an audible feedback, or visual feedback on the screen 118. This can aid the user in determining when the optimum focus has been reached.
  • the focus metric may be calculated in a number of ways, but may for example comprise plotting the intensity or grey level of the output image of one of the bars 200a in the test image of Figure 2, which will generally have the shape shown in Figure 5a with a high level 510 outside the bar and a low level 512 within the bar. If the focus is good the plot will be a sharp, deep, rectangular dip, whereas if the focus is poor, the dip will be rounder and more shallow. Therefore the focus metric may be the difference in intensity between the two levels 510, 512, or may include a measure of the sharpness of the intensity dip.
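The dip-based focus metric just described can be sketched as a simple function of a one-dimensional intensity profile across a bar. Combining dip depth with edge steepness is one of the options the text mentions; the exact weighting here is an assumption.

```python
def focus_metric(profile):
    """Focus metric from a 1-D intensity profile across a dark bar (sketch).

    A well-focused image gives a sharp, deep, rectangular dip (high level
    510 outside the bar, low level 512 inside); a poorly focused one
    gives a shallow, rounded dip. The metric combines dip depth with the
    steepness of its edges; the equal weighting is an assumption.
    """
    high = max(profile)
    low = min(profile)
    depth = high - low
    # Sharpness: the largest single-step intensity change approximates edge slope.
    sharpness = max(abs(profile[i + 1] - profile[i])
                    for i in range(len(profile) - 1))
    return depth + sharpness
```

With this metric, a crisply focused bar scores higher than a blurred one, which is the property the real-time focus plot relies on.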
  • the test system may advantageously be arranged to automatically focus the objective lens of the image detection device, for example the objective lens of digital camera 106. This removes the requirement to focus the NVD by hand.
  • Computer device 112 may be arranged so as to control the focus of the objective lens of image detection device 106 and to calculate a focus metric. Computer device 112 then autofocusses the objective lens of image detection device 106 based on the focus metric.
  • the test system may prompt the operator to adjust the NVD objective lens appropriately, for example by displaying a command on display screen 118, to bring the optimum focus within the focus range of the objective lens of image detection device 106.
  • Computer device 112 then autofocusses the objective lens of the image detection device.
  • the focus of the NVD objective lens should be set at infinity prior to the autofocussing procedure and the test system may therefore be arranged to instruct the operator to set the NVD objective lens to infinity before commencing the autofocussing procedure.
  • the system can perform a number of other tests.
  • a sequence of images of the test image of Figure 2 is acquired.
  • the program may then be arranged to average the sequence of images, for example by calculating an average over all the images for the value of each pixel. It may then be arranged to normalise the average image to correct for any variation in brightness over the area of the image, which is typically brighter in the centre than towards the edges, thereby producing a sample image. It may then be arranged to locate one or more large features in the image, such as the largest group of bars, and determine the exact position within the image of those features.
  • the program may calculate the position and scale of the image, i.e. the exact position and scale within the area imaged by the camera, of all of the features of the test image. From this it may determine where within the sample image all of the features of the test image should be.
  • the program may then test the contrast of a number of features in the sample image, and from the results provide an indication of the resolution of the NVD.
  • the program may be arranged to measure the contrast of one of the bars 200 of the test image as it appears in the sample image by determining the brightness at a point in the centre of the bar, and the brightness of a point well outside the bar.
  • the contrast will be high as the centre of the bar will be black and the area just outside the bar will be white.
  • the centre of the bar may not be completely black as the image will be somewhat blurred. Therefore the comparison between the two brightness levels gives a measure of the contrast for that bar.
  • the centre of larger bars will be darker than the centre of smaller bars, so the program may be arranged to set a threshold contrast, and then measure the contrast for each of the bars, and determine the smallest bars for which the contrast is above the threshold level. This result can then be indicated to the user as described above.
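The final step of the resolution test, finding the smallest bars whose contrast still exceeds the threshold, can be sketched as below. The data layout (contrast per bar group, keyed by bar width in pixels) is an assumption for the example.

```python
def limiting_bar_group(bar_contrasts, threshold):
    """Return the width of the smallest bars whose measured contrast
    exceeds the threshold, or None if no group passes (sketch).

    bar_contrasts: dict mapping bar width in pixels (hypothetical
    layout) to the contrast measured between the bar centre and the
    background just outside the bar.
    """
    passing = [width for width, contrast in bar_contrasts.items()
               if contrast > threshold]
    # The smallest passing bars indicate the limiting resolution.
    return min(passing) if passing else None
```

The returned width corresponds to the limiting resolution that the text says is then indicated to the user.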
  • the honeycomb pattern test may be performed as follows.
  • the tablet may first acquire a single sample image of the white test image. If the test image includes circles or other lines defining areas of the test image, then the program may be arranged to locate those and remove them to form a plain sample image.
  • the program may then be arranged to calculate the extent of the honeycomb pattern in the plain sample image, and to allocate the extent of the pattern to one of a plurality of categories, for example on a scale of 1 to 10. This may then be repeated for a number of separate test images.
  • the scale value of the honeycomb pattern for the images may then be averaged, and the program may then determine whether the NVG passes or fails this test based on the average value.
  • results can be displayed to the user on the screen 118 either as a simple pass/fail indication, or as a graphical slider on a scale so that the strength of the honeycomb pattern is indicated, either on a continuous scale or as one of a number of finite steps or categories, so that a user can tell whether the test is passed or failed, and how marginal the result is.
  • the program may first be arranged to acquire a sequence of sample images of the plain test image, and then to average those images to form an averaged sample image, which may then be normalised as for the resolution test described above.
  • the images may be acquired using high ‘gamma’ which allows features in the darker parts of the image to be relatively easily identified.
  • the average sample image may then be converted to a binary image in which each pixel has one of only two values, which can be represented as black and white. This will generally result in a sample image which is mostly white with black blemishes of various sizes and shapes in it.
  • All areas of black may then be identified by the program, which may be arranged to categorize them by size and shape, in a ‘blob labelling’ step. Any circles or other lines intentionally present in the plain test image may then be identified by their size and shape, and excluded from the calculation, or they may be left in, for example if the blemishes are to be identified to a user for counting. Once all the blemishes have been categorized, they may be identified by category, for example by defining a plurality of size categories, assigning a colour to each category, and highlighting all the blemishes in each category in the colour associated with that category.
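The ‘blob labelling’ step on the binary image can be sketched with a simple connected-component search. The 4-connectivity and the binary representation (a 2D list where 1 marks a blemish pixel) are assumptions; the patent leaves the labelling algorithm unspecified.

```python
def label_blobs(binary):
    """Simple 4-connected blob labelling on a binary image (sketch).

    binary: 2D list where 1 marks a blemish pixel. Returns a list of
    blob sizes, one per connected blemish, which could then be
    categorised by size as the text describes.
    """
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] == 1 and not seen[r][c]:
                # Flood fill to collect one connected blemish.
                stack, size = [(r, c)], 0
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(size)
    return sizes
```

Each returned size would then be binned into a size category and, per the text, highlighted in the colour associated with that category.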
  • the program may be arranged to determine the number of blemishes, in total or in each category, in each area, and indicate this numerically.
  • the program may be arranged to compare the number of blemishes in each area with a reference threshold number for that area, and if one or more of the thresholds is exceeded to determine that the NVD fails the test, but if none of the thresholds is exceeded, to determine that the NVD has passed the test.
  • the pass/fail test result may be indicated to the user on the screen in any suitable manner.
  • the program may also be arranged to perform a gain test arranged to compare the gain of the two imaging systems.
  • the gain referred to here is the increase in brightness of the output image from the NVD compared to the input image.
  • ideally, the gain of the two sides should be equal.
  • the same input image may be used for both sides of the NVD, and the same camera may be used to capture the output image from both sides.
  • the system is arranged to acquire an image with each side of the NVD, and then from those images to calculate a measure of the gain of each side of the NVD. The gains can then be compared, either by the system or by a user, to determine whether they are close enough to each other.
  • the program may be arranged to acquire a sequence of output images for one side of the NVG, for example the left side, which are all from the same input image which may for example be the resolution test image. It may then be arranged to calculate an average image from the sequence of images. It may then be arranged to calculate an average grey level for an area of the average image, for example a small central area of the average image. It may then be arranged to convert the average grey level value to an average luminance value. These steps may then be repeated for the other side of the NVG to determine an average luminance value for that side. It may then be arranged to compare the two luminance values, for example by calculating a difference between the two values as a percentage of one of them or of the average of the two.
  • This percentage may then be compared with a threshold percentage, and if it is below the threshold to determine that the two values are similar enough to pass the test, but if it is above the threshold to determine that they are too different in which case the test is failed.
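The gain comparison just described can be sketched as follows, using the percentage-of-the-mean option that the text offers (taking the difference as a percentage of one side is the alternative it also mentions).

```python
def gain_match(left_luminance, right_luminance, threshold_pct):
    """Compare the average luminance of the two sides of a binocular NVD
    (sketch). Returns the percentage difference, taken relative to the
    mean of the two values, and whether it falls within the threshold."""
    mean = (left_luminance + right_luminance) / 2
    diff_pct = abs(left_luminance - right_luminance) / mean * 100
    # Below the threshold: gains are similar enough and the test passes.
    return diff_pct, diff_pct <= threshold_pct
```

The boolean result corresponds to the pass/fail indication, and the percentage itself may be displayed instead, as the text notes.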
  • This pass/fail determination may be indicated to the user by any suitable indicator, for example on the screen 118, or the percentage value itself may be indicated.
  • the program may also be arranged to perform a gross distortion test to measure the degree of gross distortion of the image.
  • Gross distortion occurs in the inverter or ‘twister’ which is a twisted fibre optic bundle arranged to correct for inversion of the image caused by the objective lens of the NVD. Slight imperfection of the twister results in distortion of the final image which converts straight lines to a generally S-shaped line.
  • the test system may be arranged to generate an input image which includes a number of straight lines 110, for example in the form of a square grid 112 as shown in Figure 11.
  • the test program may be arranged to acquire a sequence of output images from the NVD using the camera.
  • the program may then be arranged to average those images and normalise them as described above. It may then be arranged to locate the grid lines 122 in the output image. As these may be distorted by gross distortion, as shown in an exaggerated form in Figure 12, the search may be arranged to find not just straight lines, but any lines which are straight or curved. Once the grid lines 122 have been located in the output image, the program may be arranged to calculate the distortion at a number of points along the lines, and convert the lines to data points on a distortion graph.
  • the program may be arranged to calculate the horizontal offset dx at points along each of the vertical lines 124 in the output image from the original position of the corresponding line 124a in the input image, and to calculate the similar vertical offset dy at points along each of the horizontal lines from their positions in the input image.
  • the distortion graph would then be a plot of horizontal offset dx as a function of vertical distance y along the line, or a plot of vertical offset dy as a function of horizontal distance x along the line. In each case the graph would have a maximum positive value of the offset and a maximum negative value of the offset, each of which can be identified.
  • the program may then be arranged to correct the data points for lens distortion which will be a known quantity for the optical system, and then to determine the magnitude of the gross distortion.
  • the magnitude may be defined in a number of ways, but is typically defined as the mean magnitude of the maximum positive and negative offsets dx between the distorted line and its original position. This can be determined for each of the lines and then averaged to determine an average value of gross distortion for the NVD. This value may then be compared with a threshold value, and if it exceeds the threshold the program may determine that the test has been failed, otherwise it may determine that it has passed.
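The gross distortion magnitude defined above, for a single line, can be sketched directly from its offset data points:

```python
def gross_distortion(offsets):
    """Gross distortion magnitude for one line (sketch): the mean of the
    magnitudes of the maximum positive and maximum negative offsets
    between the distorted line and its original position. The offsets
    are assumed already corrected for lens distortion."""
    max_pos = max(max(offsets), 0)
    max_neg = min(min(offsets), 0)
    return (abs(max_pos) + abs(max_neg)) / 2
```

Per the text, this value would be averaged over all the grid lines and the average compared with a threshold to produce the pass/fail result.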
  • the pass/fail result or the mean distortion value, or both, can be indicated to the user on the display 118. It will be appreciated that other straight features, such as edges, may be used instead of, or as well as, the lines described above.
  • the program may also be arranged to perform a discontinuity test. This may use the same input image as the gross distortion test, and the first steps of acquiring a sequence of images, averaging and normalising, locating the grid, converting to points on a distortion graph and correcting for lens distortion may be the same as described above for the gross distortion test. Then, the data points for each line may be scanned for any discontinuities. As there will be some noise in the image, a simple offset between two adjacent points on a line is not sufficient to indicate a discontinuity.
  • the program may therefore be arranged to identify two sets of points each forming a substantially continuous vertical line 124a, 124b, i.e.
  • the program may further be arranged to perform a magnification test on the NVD.
  • the output image scale should be equal to that of the input image, i.e. a magnification of 1:1, and this should be the same for both sides of the NVD if it is a binocular device.
  • the program may be arranged to acquire a sequence of images using an input image having large features, such as the circles in the plain white input image or the large bars in the resolution test image. The sequence of images may then be averaged and normalised as in other tests described above.
  • the program may then be arranged to locate one or more large features in the output image, such as the circles or large bars, and to compare one or more dimensions of those features with corresponding dimensions from a reference image or from reference values of the dimensions. It may then be arranged to calculate the magnification of the NVD, for example as a ratio or percentage, and compare the value of the magnification with a threshold value to determine whether the system passes or fails the test. The results may then be displayed on the screen 118, either as a pass/fail indication or as a value of the magnification, or both. This may be repeated for both sides of the NVD if appropriate.
  • the program may further be arranged to perform an alignment test to check that the two sides of a binocular NVD are aligned with each other.
  • This test may use the same input image as is used for a manual test of alignment, which is in fact two images, one for the left hand side and one for the right hand side of the NVD.
  • the test system may be arranged to generate these two images separately, one after the other.
  • One of the images may have a square box and the other may have a cross, with the cross and the square box being in corresponding places in the two images, so that, if the two sides of the NVD are aligned, the cross and the box will be in corresponding places in the output images from the two sides.
  • the apparatus may include a bridge piece which is arranged to be placed over two eyepieces of the NVD and to combine the images from them into a single image. If the test system only generates a single image, the NVD may be mounted on a slider so that the NVD can be moved between a first position in which the LHS of the NVD is located over the input image, and a second position in which the RHS of the NVD is located over the input image.
  • the program may be arranged to acquire a first sequence of images from the LHS of the NVD using the box as the input image, and then to average the images and normalise the averaged image, and to calculate the location of the box in the image.
  • the program may then be arranged to acquire a second sequence of images from the RHS of the NVD using the cross as the input image, and again to average and normalise and calculate the location of the cross in the image. It is then arranged to calculate whether the cross and box in the respective output images are aligned such that, if the images were overlaid, the cross would be within the box in both the horizontal (x) and vertical (y) directions. If it would, then the program may be arranged to display an indication that the test has been passed, but if not then the program may be arranged to display an indication that the test has been failed.
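The cross-within-box alignment check can be sketched as a simple geometric test on the two located features. Representing each feature by its centre and the box by its side length is an assumption for the example.

```python
def alignment_ok(cross_centre, box_centre, box_size):
    """Alignment check (sketch): would the cross fall inside the box, in
    both the horizontal (x) and vertical (y) directions, if the left and
    right output images were overlaid?

    cross_centre, box_centre: (x, y) positions located in the two output
    images; box_size: side length of the square box (assumed layout)."""
    cx, cy = cross_centre
    bx, by = box_centre
    half = box_size / 2
    return abs(cx - bx) <= half and abs(cy - by) <= half
```

A True result corresponds to the pass indication described in the text; False to the fail indication.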
  • the test system may be arranged to generate two output images simultaneously on spaced-apart screen areas so that each of the two sides of a binocular NVD can be used to image a respective one of the images.
  • the bridge piece may then output a single combined image of, for example, the square and the cross, with the image from the two sides superimposed on each other.
  • the program may then analyse the single combined image to determine the degree of misalignment between the two parts of the combined image, for example in two orthogonal directions, and determine from those measured misalignments whether the alignment test is passed or failed.
  • the program may further be arranged to perform an image stability test.
  • This is arranged to test for flicker in the image and comprises taking a number of images in a sequence with a high frame rate, for example in the range 100 to 200Hz, calculating an average grey level for all or part of each image in the sequence, and measuring the variation in the average grey level between the images.
  • the area of the image used may be a small part of the image, so as to keep the data acquisition rate at a manageable level, and may be in a central region of the image which is generally brighter.
  • the variation may be measured by calculating the variance of the average grey levels of the sequence of images.
  • the variance may be compared with a threshold value, and if it is above the threshold the stability test may be determined as failed, and if it is below, the test may be determined as passed.
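The image stability (flicker) test described above reduces to a variance calculation over per-frame average grey levels, which can be sketched as:

```python
def stability_test(frame_means, threshold):
    """Image stability (flicker) test sketch.

    frame_means: average grey level of all or part of each image in a
    high-frame-rate sequence. Returns the population variance of those
    averages and whether it is within the threshold (i.e. test passed)."""
    n = len(frame_means)
    mean = sum(frame_means) / n
    variance = sum((g - mean) ** 2 for g in frame_means) / n
    # Low variance between frames means a stable, flicker-free image.
    return variance, variance <= threshold
```

As the text notes, using only a small central region of each image keeps the data acquisition rate manageable at frame rates of 100 to 200 Hz.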
  • the pass/fail result may be indicated to the user on the screen 118.
  • the program may further be arranged to perform a signal to noise ratio test on the image.
  • the input image is a 0.2mm diameter spot with a luminance of 1x10⁻⁵ foot candles having the spectral content of 2856 ± 50K blackbody radiation.
  • the program is arranged to capture a sequence of output images from the NVD using that input image with a frequency of 10 frames per second.
  • the program is then arranged, for each image, to measure the average signal S0 from the camera over the 0.2mm diameter area of the bright spot, the standard deviation N0 of the signal S0 over the sequence of images, the average signal Sbkd from the camera over a 0.2mm diameter area outside the bright spot, which is taken to be the background signal, and the standard deviation Nbkd of the background signal Sbkd.
  • the program is then arranged to calculate the signal to noise ratio (SNR) of the NVD using the calculation:
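The patent's exact SNR formula is not reproduced in the text above. As a hedged sketch only, a common definition for image-intensifier SNR divides the background-subtracted spot signal by its temporal noise; this particular formula is an assumption, not the claimed calculation.

```python
def snr(s0, n0, s_bkd):
    """SNR sketch using a common image-intensifier definition,
    (S0 - Sbkd) / N0, as an assumption; the patent's exact formula is
    not reproduced in the text above.

    s0: average spot signal; n0: standard deviation of s0 over the
    image sequence; s_bkd: average background signal."""
    return (s0 - s_bkd) / n0
```

Whatever the exact formula, the measured quantities S0, N0, Sbkd and Nbkd defined in the preceding step are its inputs.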
  • test set 100 is arranged to generate two images simultaneously, and there are two cameras 106, one for mounting on each side of a binocular NVD, and both of which are connected to the tablet 112 at the same time.
  • This allows the simultaneous testing of both sides of the binocular device, and also means that, for the tests that require images from both sides, such as the alignment test and the magnification and gain tests, the input images for the two sides can be coordinated and the output images detected and processed simultaneously.
  • the processing may be carried out entirely within the test set 100a.
  • test set 100a may be arranged to generate all of the required input images in sequence, with one image for each side of the binocular device 104a generated simultaneously, and two cameras 106a, 106b, one for mounting on each side of the NVD, to capture the required output images for all, or any combination, of the tests described above to be performed in an automated cycle.
  • the test set 100a may include a touchscreen 118a which may provide any or all of the functions of the tablet 112 described above. The output to the user may then be in the form of an overall pass/fail result, or a separate result for each test, or more detailed data from each test as described above.
  • the test set 100a may include an active LCD screen 150a, which may be arranged over a light source 152a, and which may be controlled by the program so as to generate different images as required. These may include the plain white image (with or without circular regions marked), the square grid pattern, or the resolution test bar image, or any other images that may be used for testing the NVD 104a.
  • the light source 152a may comprise a lamp 154a located inside an integrating sphere 156a and a diffuser 158a.
  • One or more lenses 160a may be provided over the LCD screen 150a to provide some focussing of the images generated by the LCD screen. This image generation system allows high flexibility and automation of the various tests described above. It will be appreciated that other methods of generating the image such as an LED or OLED display or a micro-mirror array can be used.
  • the test system may advantageously be arranged as a self-contained portable kit 142.
  • the components of the test system, specifically image generation device 100b, image detection device 106b, and the processing system or computer device (in this case tablet computer 112b), are arranged and mounted within a casing 144 to provide an integrated system configured to receive a NVD for testing.
  • casing 144 is a portable, hand-held casing such as a briefcase.
  • Casing 144 may be closable for transport, for example casing 144 may comprise a lid 146.
  • casing 144 may have an open configuration for receiving the NVD for testing and a closed configuration for storage or transport.
  • Integrated test kit 142 may also comprise NVD mount 148 arranged to support NVD 104b and to locate NVD 104b between image generation device 100b and image detection device 106b.
  • Mount 148 is arranged so that NVD 104b is aligned and optically coupled with image generation device 100b and image detection device 106b when NVD 104b is received and supported by mount 148.
  • test system 142 is arranged such that ambient light is excluded by the coupling of NVD 104b to image generation device 100b and image detection device 106b.
  • the test system may comprise connectors (not shown) to couple NVD 104b to image generation device 100b and image detection device 106b.
  • connectors may be arranged so as to form an optical seal excluding ambient light when the NVD is coupled to image generation 100b and image detection 106b devices.
  • the connectors may comprise a flexible or resilient material such as rubber which abuts the NVD so as to form an optical seal.
  • One or more of image generation device 100b, image detection device 106b and NVD mount 148 may be slidable in a direction along the longitudinal axis of NVD 104b so as to allow the concatenation of image generation device 100b, image detection device 106b and NVD 104b once NVD 104b is received by mount 148.
  • image detection device 106b may be slidable to press against the output end of NVD 104b, for example the eyepiece of NVD 104b, so as to urge NVD 104b towards image generation device 100b and to form an optical seal between NVD 104b and image generation device 100b and NVD 104b and image detection device 106b.
  • Mount 148 may be arranged to support a monocular NVD, or it may be arranged to support a binocular NVD. In the latter case, the test system may be arranged to align one side of the NVD at a time with image generation device 100b and image detection device 106b.
  • mount 148 may be a static mount that is arranged to receive and align either the left or right side of a binocular NVD with image generation device 100b and image detection device 106b.
  • mount 148 may be arranged to receive both sides of the binocular NVD simultaneously and may be movable, for example slidable, between two positions in which either the left or the right side of the NVD is aligned with image generation device 100b and image detection device 106b.
  • the test system comprises two image generation devices and two image detection devices, one for each side of the NVD, and mount 148 is arranged to simultaneously align the left side of the binocular NVD with the left image generation and image detection devices and to align the right side of the binocular NVD with the right image generation and image detection devices.
  • the test system may be arranged to provide a recess or cavity 150 for receiving the NVD, mount 148 being arranged to locate NVD 104b within cavity 150.
  • Image generation device 100b may be located on a first side of cavity 150 and image detection device 106b may be located on a second side of cavity 150 opposite the first side.
  • the test system may also comprise a recess configured to receive the computer device, in this case tablet 112b.
  • the computer device may be provided separately from the integrated system and may be in data communication with image generation device 100b and image detection device 106b, preferably wirelessly.
  • the computer device may be a tablet computer or other mobile computer device such as a smart phone, and may interface with the integrated test system wirelessly.
  • the computer device may be integrated with the remainder of test system within the casing.
  • Test system 142 provides an integrated system in which the components of the system are pre-arranged ready to receive a NVD for testing. This minimises set-up time, produces repeatable results, and requires little skill or expertise to perform NVD testing. Furthermore, test system 142 is self-contained and portable, is easy to store and is robust due to the closable casing.

Abstract

A system for testing a night vision device comprises: detection means arranged to detect an image produced by the night vision device and to output image data encoding the image; image generating means arranged to generate an input image so that the image produced by the night vision device is an output image reproducing the input image; processing means arranged to analyse the image data to determine whether the image encoded in the image data meets at least one criterion, and to generate an output indicative of whether the at least one criterion is met; and a mount arranged to support the night vision device and to align the night vision device with the detection means and the image generating means. The detection means, the image generating means, the mount and optionally also the processing means are assembled together to form an integrated system configured to receive the night vision device for testing.

Description

Night Vision Device Testing
Field of the Invention
The present invention relates to night vision devices (NVDs), such as night vision goggles (NVGs), and in particular to the testing of NVDs.
Background to the Invention
Night vision devices, including binoculars and monoculars as well as goggles, are much used in military and other environments to provide improved vision in low light environments. They use well known image intensification methods to produce, from a low intensity image which the human eye cannot easily see, a higher intensity image which it can.
NVDs do not produce perfect images, and various methods are known for testing the quality of the images that they do produce. Common problems with the images include poor resolution, the presence of blemishes in the images, i.e. small features in the intensified image which do not correspond to anything in the original image, and also the presence of a hexagonal ‘honeycomb’ or ‘chicken wire’ pattern in the intensified image which arises due to the structure of the micro-channel plate used for image intensification and becomes visible under some conditions.
It is known to check for these features manually by looking through the NVD at either an array of blocks or bars of various sizes to identify the smallest which can be seen thereby to determine the resolution, and by looking for and counting blemishes, and by checking for visible ‘chicken wire’ patterns. However, these tests are inevitably somewhat subjective. There are also other problems with the NVD which are hard, or in some cases impossible, to detect visually, such as gross distortion, discontinuities, flicker, and misalignment and variation in magnification between the two sides of a binocular NVD.
Systems have therefore been developed to test NVDs by generating an input image and using a computer or other data processor to analyse the image output by the NVD and captured by a camera. However, such systems generally comprise a number of individual components, which must be carefully assembled together with the NVD, for example on a benchtop, and properly aligned in order to perform such testing. This adds inconvenience to the testing procedure and requires an operator who is skilled in assembling the system. Such systems are also prone to error due to alignment errors and the interference of ambient light caused by imperfect optical coupling between the various components of the testing system. Due to the large number of individual component parts, storage and transport of known testing systems is also complicated and can result in some of the components becoming damaged or lost in transit or storage. Known NVD testing systems are also limited to performing tests using static input images, which is not representative of the use of NVDs and limits the parameters of the NVD that can be tested. Known testing systems are also generally configured to perform only one type of test at a time, requiring an operator to specify exactly which test is to be performed and to initiate each different test in turn, thus increasing the time required to perform a comprehensive test of a NVD.
There is therefore a need for a new NVD testing system and methods of testing NVDs that overcome these and other disadvantages of known systems and testing methods.
Summary of the Invention
According to a first aspect of the invention, there is provided a system for testing a night vision device, the system comprising detection means arranged to detect an image produced by the night vision device and to output image data encoding the image; image generating means arranged to generate an input image so that the image produced by the night vision device is an output image reproducing the input image; processing means arranged to analyse the image data to determine whether the image encoded in the image data meets at least one criterion, and to generate an output indicative of whether the at least one criterion is met; and a mount arranged to support the night vision device and to align the night vision device with the detection means and the image generating means. The detection means, the image generating means, the mount and optionally also the processing means may be assembled together to form an integrated system configured to receive the night vision device for testing.
The system may be arranged such that the night vision device is optically coupled with the image generating means and the detection means when it is supported by the mount. The system may further comprise a casing. The detection means, the image generating means, the mount and optionally the processing means may be arranged within the casing so as to receive the night vision device for testing. The casing may be configured to be transported by hand. For example, the casing may be in the form of a briefcase or suitcase. The casing may be rigid.
The system may be arranged to form a cavity for receiving the night vision device. The cavity may be located between the detection means and the image generating means. The mount may be arranged to support the night vision device within the cavity.
The image generating means may comprise an active screen, such as a digital display, LCD, LED or OLED screen. The processing means may be configured to control the image displayed by the active screen.
The processing means may be configured to perform a series of tests in response to a single input command. Each of the tests may be configured to evaluate whether the image encoded by the image data meets a different criterion. The processing means may be configured to cause the image generating means to generate a series of images for performing the tests.
The detection means may comprise an objective lens arranged to focus the output image output by the night vision device for recording by the detection means. The system may be configured to autofocus the objective lens of the detection means based on a focus metric calculated by the processing means from the image data. In other words, the system may be configured to autofocus the output image of the night vision device for imaging by the detection means by adjusting the focus of the objective lens of the detection means based on a focus metric calculated by the processing means from the image data.
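Purely by way of illustration, one focus metric that could serve this purpose is the variance of a Laplacian filter response, which is larger for sharper images. The following sketch assumes grey-level image data supplied as a 2D list of intensities; the choice of metric and all names are assumptions of this sketch and are not prescribed by the system described above.

```python
def focus_metric(image):
    """Variance of a 3x3 Laplacian response over the image: higher = sharper.

    `image` is a 2D list of grey-level intensities (rows of numbers).
    This particular metric is an illustrative choice, not the claimed one.
    """
    h, w = len(image), len(image[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # discrete Laplacian: 4-neighbour sum minus 4x centre pixel
            lap = (image[y - 1][x] + image[y + 1][x]
                   + image[y][x - 1] + image[y][x + 1]
                   - 4 * image[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)
```

An autofocus loop would then adjust the objective lens so as to maximise this value over successive frames.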
The processing means may be arranged to cause the image generating means to generate an input image comprising a moving feature. The processing means may be arranged to calculate a spatial resolution based on the intensity profile defining an edge of the moving feature in the image encoded by the image data. The processing means may be arranged to cause the image generating means to generate at least one input image comprising a plurality of sets of equally spaced parallel bars, preferably on a plain background, each of the sets of parallel bars having a different predetermined spacing between the bars. The processing means may be arranged to measure, for each of the sets of parallel bars, the contrast between the bars and the spaces between the bars in the image encoded by the image data. The processing means may be arranged to calculate a modulation transfer function based on the measured contrast values for the plurality of sets of bars and the predetermined spacing of the bars.
The processing means may be arranged to identify the presence of at least one feature in the image. The at least one criterion may be the presence of a feature, or the presence of at least a predetermined number of features.
The system may further comprise a display screen. The system may be arranged to provide an indication on the display screen of whether the at least one feature is detected.
The system may be arranged to display the image on the display screen, and may be arranged to highlight the feature on the image.
The input image may be a plain image, for example a plain white or other pale colour image. The processing means may be arranged to identify features in the output image which are not present in the input image.
The processing means may be arranged to identify features meeting at least one feature criterion, and to determine the number of such features in at least an area of the image. The features may be parts of a hexagonal ‘chicken wire’ pattern, in which case the number of them that are identified will indicate the extent or strength of the pattern. Alternatively they may be blemishes, which can be identified by various criteria. The processing means may be arranged to control the display so as to highlight each of the features that meet the at least one criterion.
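As one illustrative sketch of such feature counting, blemishes against a plain bright input image could be counted as dark connected regions exceeding a minimum size. The threshold values, the 4-connectivity choice and all names below are assumptions of this sketch rather than part of the described system.

```python
def count_blemishes(image, threshold=64, min_pixels=2):
    """Count dark connected regions (candidate blemishes) in a bright field.

    `image` is a 2D list of grey levels; a pixel below `threshold` is
    treated as dark. Regions smaller than `min_pixels` are ignored as
    noise. Connectivity is 4-neighbour; all parameters are illustrative.
    """
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if seen[y][x] or image[y][x] >= threshold:
                continue
            # flood-fill one dark region and measure its size
            stack, size = [(y, x)], 0
            seen[y][x] = True
            while stack:
                cy, cx = stack.pop()
                size += 1
                for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                               (cy, cx - 1), (cy, cx + 1)):
                    if (0 <= ny < h and 0 <= nx < w
                            and not seen[ny][nx]
                            and image[ny][nx] < threshold):
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            if size >= min_pixels:
                count += 1
    return count
```

The coordinates of each counted region could equally be retained in order to highlight the blemishes on the display as described above.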
The input image may comprise a plurality of features of the same shape but of different sizes, or different orientations, or both, and the processing means may be arranged to determine which of the corresponding features in the output image meet the at least one criterion. For example a set of criteria may be used to define the feature, which may itself comprise a number of sub-features, such as a number of bars or lines. The size of the smallest feature that can be identified will be an indication of the resolution of the NVD.
The processing means may be arranged to measure a parameter of the output image and to calculate from the parameter the gain of the device.
The processing means may be arranged to measure a parameter of the image or the output image, and to calculate from the parameter the level of gross distortion produced by the device.
The processing means may be arranged to detect discontinuities produced by the device in the image or the output image.
The processing means may be arranged to measure the magnification of the device.
The processing means may be arranged to measure alignment between two sides of a binocular night vision device.
The processing means may be arranged to measure flicker in the image or the output image.
The processing means may be arranged to measure signal to noise ratio in the image or the output image.
The processing means may be arranged to measure a characteristic of a halo surrounding a feature in an input image.
The present invention further provides a system for aiding a user in focusing an optical device, the system being arranged to acquire a sequence of images output by the device, calculate a focus metric for each of the images, and generate a user-perceptible output indicative of the focus metric. The output may be arranged to vary in real time during adjustment of the focus of the device. The output may be a plot of the metric against time, for example a line plot or a bar graph. The system may be arranged to identify a maximum value of the metric as the device passes through a point of maximum focus, and then to detect when the value of the metric returns to that maximum value and in response to generate a further user-perceptible output.
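The peak-tracking behaviour of such a focus aid can be sketched as a small state machine fed with one metric value per frame: it records the best metric seen, notes when the user has swept past the optimum, and signals when the metric returns to within a tolerance of the peak. The class name, tolerance value and return convention are assumptions of this sketch only.

```python
class FocusAid:
    """Tracks a focus metric over frames and flags the return to best focus.

    The user sweeps the focus ring past its optimum; the aid records the
    peak metric seen and reports when the metric comes back to within
    `tolerance` of that peak. All names and thresholds are illustrative.
    """

    def __init__(self, tolerance=0.02):
        self.tolerance = tolerance
        self.peak = None
        self.passed_peak = False

    def update(self, metric):
        """Feed one per-frame metric; returns True once focus is restored."""
        if self.peak is None or metric > self.peak:
            self.peak = metric
            return False
        floor = self.peak * (1 - self.tolerance)
        # a clear drop below the peak means the optimum has been passed
        if metric < floor:
            self.passed_peak = True
        return self.passed_peak and metric >= floor
```

In use, each frame's focus metric would be passed to `update()`, with the True result triggering the further user-perceptible output described above.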
According to a second aspect of the invention, there is provided a method for testing the dynamic spatial resolution of a night vision device, the method comprising generating a moving input image for the night vision device, the moving input image comprising a moving feature; and calculating a dynamic spatial resolution based on the intensity profile defining an edge of the moving feature in an output image generated by the night vision device. The output image may be a static or still image (i.e. a single still frame) or may be a moving image (i.e. a sequence of frames).
The method may comprise calculating a spatial resolution for each of a plurality of output images generated by the night vision device at different times, and averaging the plurality of calculated spatial resolutions to give an average spatial resolution. The edge used to calculate the spatial resolution may be a straight edge. The straight edge may extend in a direction perpendicular to the direction of movement of the moving feature. The edge used to calculate the spatial resolution may be a leading or a trailing edge of the moving feature. The input image may comprise a plain background. The moving feature may be square or rectangular in shape. The moving feature may have a uniform intensity or shading across its area.
According to a third aspect of the invention, there is provided a method for measuring the modulation transfer function of a night vision device, the method comprising generating at least one input image for the night vision device, the at least one input image comprising a plurality of sets of equally spaced parallel bars, preferably on a plain background, each of the sets of parallel bars having a different predetermined spacing between the bars; obtaining at least one output image generated by the night vision device, the at least one output image reproducing the at least one input image; measuring for each of the sets of parallel bars in the at least one output image the contrast between the bars and the spaces between the bars; and calculating the modulation transfer function based on the measured contrast values for the plurality of sets of bars and the predetermined spacing of the bars.
The measured contrast values may be representative of the maximum difference in intensity between the spaces and the bars as measured from the output image. In other words, the measured contrast values may be the maximum measured contrast between the spaces and the bars.
Calculating the modulation transfer function may comprise normalising the measured contrast values with respect to the theoretical maximum contrast or the actual contrast in the input image between the bars and the spaces between the bars. The theoretical maximum contrast is the maximum contrast that could be recorded by the night vision device for bars that have an infinite thickness and which are infinitely spaced apart. In other words, the maximum theoretical contrast is the maximum measurable contrast between an infinitely sized area corresponding to the spaces and an infinitely sized area corresponding to the bars, i.e. as would be measured by a night vision device with a perfect spatial resolution.
The at least one input image may comprise a plurality of input images, each of the input images comprising at least one of the sets of parallel bars. For example, each of the input images may comprise one of the sets of parallel bars.
The measured contrast between the bars and the spaces between the bars for each of the sets of parallel bars may be an average of the contrast between the bars and the adjacent spaces between the bars in that set.
According to a fourth aspect of the invention, there is provided a night vision device testing system configured to perform any of the methods described above.
The system may comprise detection means arranged to detect an image produced by the night vision device and to output image data encoding the image; image generating means arranged to generate an input image so that the image produced by the night vision device is an output image reproducing the input image; and processing means arranged to analyse the image data.
The system and methods of the invention may further comprise any one or more features, in any combination, of the preferred embodiments which are shown in the accompanying drawings, as will now be described.
Brief Description of the Drawings
Figure 1 is a schematic diagram of a night vision device testing system, which may be used in accordance with the methods of the invention;
Figure 2 shows an example of a test image as displayed on the screen of the system of Figure 1;
Figure 3 shows an example of a further test image as displayed on the screen of the system of Figure 1;
Figure 4 is a chart showing the steps of a test procedure performed by the system of Figure 1;
Figure 5 is a display produced by the system of Figure 1 during the test procedure of Figure 4;
Figure 5a shows the variation in intensity over one of the bars of the image of Figure 2, as measured in the procedure of Figure 4;
Figures 6, 7, 8, 9 and 10 are charts showing the steps of further test procedures performed by the system of Figure 1;
Figure 11 shows a test image used in the test procedure of Figure 10;
Figure 12 shows an output image from the device, analysed in the test procedure of Figure 10;
Figure 13 is a chart showing the steps of a further test procedure performed by the system of Figure 1;
Figure 14 shows an output image from the device, analysed in the test procedure of Figure 13;
Figures 15, 16, 17, and 18 are charts showing the steps of further test procedures performed by the system of Figure 1;
Figure 19 is a schematic top view of a system according to a further embodiment of the invention; and
Figure 20 is a schematic section through the system of Figure 19.
Figure 21 shows a graph of the modulation transfer function (MTF) plotted as a function of spatial frequency of parallel test bars.
Figure 22 illustrates aspects relating to the modulation transfer function. The left-hand side of Figure 22 shows an input image and the relative intensity profile of the input image and the right-hand side of Figure 22 shows an output image generated by a night vision device reproducing the input image and the relative intensity profile of the output image.
Figure 23 shows an input image for a halo test.
Figure 24 shows a perspective view of an integrated portable testing system according to the invention.
Figure 25 shows a schematic top-down view of the system shown in Figure 24.
Description of the Preferred Embodiments
Referring to Figure 1, a test system comprises an image generation device 100 which includes a display screen and is arranged to display one or more images, and may comprise a mounting 102 arranged to support an NVD 104 so that the image displayed on the image generation device 100 is in the field of view of the NVD 104, and forms an input image for the NVD 104. The image generation device preferably comprises an active screen, such as a display screen capable of generating a number of different images. Examples of such screens include LCD, LED and OLED screens. The system further comprises an image detection device, such as a digital camera 106. This may be, for example, an IDS UI-1490SE, or other suitable machine vision camera. The system may further comprise a further mounting device or connector 108 which is arranged to connect the camera 106 against the eyepiece 110 of the NVD so as to detect and record the output image generated by the NVD 104, and to output image data encoding the image. For a binocular device, two such cameras may be provided, one mounted on each side of the NVD. A processing system or computer device, in this example a tablet 112, may be arranged to receive image data from the camera or cameras 106, and may include a processor 114 and memory 116 and a display screen 118. It may also be arranged to control various aspects of the operation of the camera or cameras 106, such as the frame rate and exposure time of the camera. It may also be arranged to control which of the detectors from the full array of the camera are used for each frame, so that, for example, each frame can include the maximum number of pixels corresponding to the total number of detectors in the camera, or a smaller number of pixels using only some of the detector array. This can allow smaller images to be acquired with a higher frame rate than is possible for the maximum image size.
The test system may be purely for monocular testing, and may generate only a single input image at one time. Alternatively it may be for binocular testing, and may therefore generate two images simultaneously, one for each side of a binocular NVD. Operation will be described below for a single monocular system, but for each test it will be appreciated that both sides of a binocular system may be tested in the same way, either simultaneously, or sequentially but without moving the NVD. A sliding mounting for a binocular NVD may also be incorporated in the system so that the two sides can be tested sequentially using the same input image.
The test system, in particular the image generation device 100, may be arranged to perform a number of functions, but for the purposes of the present invention, the relevant functions may include the display of various types of input image, one of which may be a plain image, for example a white image with a number of rings defining different regions of the image, against which blemishes and chicken wire artefacts can easily be detected, one of which may be a resolution checking image which may contain features of a range of sizes, for example as described in more detail below, and one of which may be a rectangular or square grid.
The tablet 112 may be arranged to run a software program or application which is arranged to analyse the image data from the camera 106 to test the NVD 104 to determine whether the data, and therefore the output image, meets at least one test criterion. The program may provide a number of functions which can be selected by a user, for example using icons 120 displayed on the screen 118 of the tablet. The embodiment shown provides all of the functions which will be described below, but other embodiments may offer only some of these, and may offer others.
Referring to Figure 2, as mentioned above, one of the input images that the test system may be arranged to display is a resolution testing image to test the resolution of the NVD. The image generation device 100 may generate an image, shown in Figure 2, which comprises a number of sets of bars 200. Each bar 200 is a solid rectangle or other suitable shape, of black or another suitable colour, and each set 200a, 200b may comprise three identical bars arranged parallel to each other with a small gap between them so that the whole set covers an approximately square area. The sets 200a, 200b vary in size, but for each size there may be a pair of two sets 200a, 200b, one with the bars arranged vertically and one with the bars arranged horizontally. The pairs may be arranged in groups of four pairs, and in Figure 2 two groups 202, 204 are shown, with the largest sets of bars in one group 204 being smaller than the smallest sets in the other group 202.
When the test program is running on the tablet 112, one of the icons 120 can be selected by a user to test the resolution of the NVD, and this is selected when the image generation device 100 is displaying the resolution testing image shown in Figure 2. The program may then be arranged, using image processing techniques such as edge detection and shape recognition, to detect as many sets 200a, 200b of bars 200 as it can, or to measure the resolution of the bars to determine which of them are imaged with at least a threshold level of resolution. For each set that it detects, or detects with sufficient resolution, it may be arranged to identify that set, for example by highlighting it on the screen; in the example shown this is done by displaying a box 206 around it. Obviously other methods of highlighting the identified sets can be used, for example by changing their colour or changing the background around them. Once the program has identified as many sets of bars as possible, the results of the test can be displayed in a number of ways. For example, the highlighted sets 200a, 200b can simply be displayed so that a user can determine which ones have been identified, and therefore determine the resolution of the NVD in a similar way to when the user is looking at the sets of bars and determining themselves which ones can be seen and which not. Alternatively, the program can be arranged to display details of the smallest set or sets, or the smallest group and all of the identified sets within that group, for example just indicating them by number. In the example shown, a display area 210 on the screen indicates that sets a, b and c from Group 2 are the smallest sets that have been identified. This has the advantage over known methods that the user does not have to decide which of the sets is clear enough to be distinguished and which not. An example of how this test can be performed is described in more detail below with reference to Figure 6.
The resolution test described above can be performed at several different locations in the field of view (FOV) of the NVD so as to evaluate how the resolution varies across the FOV. For example, the sets 200a, 200b of bars 200 used to determine the resolution may be located towards the edges of the FOV to determine the edge resolution, which often differs from the resolution in the central region of the NVD. Thus, the resolution test may be performed for a central region of the FOV and for at least one edge region of the FOV and the results of these tests output by the test system.
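The per-set pass/fail decision of such a resolution test can be sketched as follows: for each set of bars, an intensity profile is sampled across the bars in the output image, and the set counts as resolved when the measured bar/space contrast exceeds a fraction of the input contrast. The set labels, the 10% threshold and the profile format are assumptions of this sketch rather than the system's actual criteria.

```python
def resolved_sets(set_profiles, input_contrast=255.0, threshold=0.1):
    """Return the labels of bar sets imaged with usable contrast.

    `set_profiles` maps a set label to the intensity profile sampled
    across that set's bars in the output image. A set counts as resolved
    when its bar/space modulation exceeds `threshold` of the input
    contrast. Labels, threshold and profile format are illustrative.
    """
    resolved = []
    for label, profile in set_profiles.items():
        contrast = max(profile) - min(profile)
        if contrast / input_contrast > threshold:
            resolved.append(label)
    return resolved
```

The smallest set in the returned list would then indicate the resolution of the NVD, and its label could be shown in display area 210 as described above.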
The test system may also be arranged to measure the modulation transfer function (MTF) of the NVD. The MTF is another measure of spatial resolution and effectively describes how the measured contrast varies as a function of the spacing between adjacent features, for example as a function of the spacing between adjacent parallel bars such as those found in the resolution test images described above and shown in Figure 2. The MTF is a measure of the ability of a NVD to transfer contrast at a particular resolution (i.e. the spatial frequency of repeating features) from the input image to the output image. As the spacing between adjacent bars in the input image decreases, it becomes increasingly difficult for the lens to transfer the contrast between the dark areas of the bars and the light areas between the bars, and, as a result, the MTF decreases. This is illustrated by the plot shown in Figure 21. In this case, the MTF is expressed as the relative contrast (normalised relative to the actual contrast in the input image, or the maximum possible contrast measurable in the case of the adjacent bars being spaced infinitely far apart) between the measured intensity of the bars and the measured intensity of the gaps or spaces between the adjacent bars, expressed as a function of the spacing between the bars, or the spatial frequency of the bars, e.g. in lines per unit distance (e.g. lines per mm) or expressed as an angular spatial frequency (e.g. cy/mrad) relative to the observation point. The relative contrast is the maximum measured contrast between the bars and the spaces between the bars, as illustrated in Figure 22. For example, in the case of black bars 126 on a white background 128, the relative contrast is the difference in intensity between the maxima 130 of the peaks and the minima 132 of the troughs corresponding to adjacent spaces and bars, respectively.
As the spatial frequency of the bars increases the relative contrast between the bars and the spaces decreases.
In order to measure the MTF, the test system is arranged to display an input image comprising one or more sets of equally spaced parallel bars on a plain background. The width of the spaces between the bars should be the same as the width of the bars themselves. The bars may be black and the background may be white. For example, the test system may be arranged to display a resolution testing image, such as that shown in Figure 2, comprising a plurality of sets 200a, 200b of parallel bars 200. The bars in each of the sets are equally spaced apart with a known spacing or frequency, but the spacing between the bars in different sets is different. Alternatively, the test system may display the different sets of parallel bars in different images. For example, the test system may display a series of images on the image generating means, each of the images comprising one or more sets of bars having different bar spacings or spatial frequencies. The processing means is arranged to measure, for each of the sets of parallel bars, the contrast between the bars and the spaces between the bars in the image encoded by the image data output by the detection means. The processing means is further arranged to calculate the contrast between the minimum and maximum intensity of the areas corresponding to the bars and the areas corresponding to the spaces between the bars in each of the sets of parallel bars. The measured contrast values are then normalised relative to the maximum theoretical contrast between the bars and the spaces between the bars, and the processing means calculates the modulation transfer function based on the normalised measured contrast values and the known spacing between the bars in each of the sets of parallel bars. The test system may then display a plot of the MTF as a function of spatial frequency of features or line spacing and may output and display MTF values for one or a selection of different spatial frequencies or line spacings.
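The MTF calculation just described can be sketched as: for each bar set, take the min/max intensity of the output profile across its bars, normalise the difference by the input contrast, and record the result against that set's known spatial frequency. The function name, the dict-based interface and the 0–255 input contrast are assumptions of this sketch only.

```python
def measure_mtf(profiles_by_frequency, input_max=255.0, input_min=0.0):
    """Build an MTF curve from output intensity profiles of the bar sets.

    `profiles_by_frequency` maps each set's known spatial frequency to
    the intensity profile sampled across its bars in the output image.
    Each measured bar/space contrast is normalised by the input-image
    contrast, and the curve is returned sorted by spatial frequency.
    """
    input_contrast = input_max - input_min
    mtf = {}
    for frequency, profile in profiles_by_frequency.items():
        # measured contrast: peak (space) minus trough (bar) intensity
        measured = max(profile) - min(profile)
        mtf[frequency] = measured / input_contrast
    return dict(sorted(mtf.items()))
```

The returned curve, which should fall towards zero at high spatial frequency as in Figure 21, could then be plotted or compared against per-frequency thresholds for a pass/fail decision.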
The test system may be arranged to output a pass or fail indicator based on whether the MTF satisfies a predetermined MTF requirement. For example, in order to pass the MTF test the test system may require the MTF values at one or a plurality of different spatial frequencies to be above a predetermined threshold for that spatial frequency.
The test system may also be arranged to perform a dynamic spatial resolution test using a moving input image. For example, the test system may display a moving input image comprising a moving feature, e.g. a shaded shape on a plain background. The moving feature may travel across the input image at a predetermined speed and processing system 112 may be configured to calculate the dynamic spatial resolution of NVD 104 based on the sharpness of the intensity profile defining an edge of the moving feature in the image encoded by the image data. In particular, the edge may be a leading or trailing edge of the moving feature, which will become blurred due to the movement of the feature, the inherent static resolution of the NVD, and the response time of the NVD. Thus, the intensity profile (e.g. the sharpness) of the leading or trailing edge taken along the direction of movement of the feature across the input image provides a measure of the dynamic spatial resolution of the NVD. Preferably, the leading or trailing edge of the moving feature extends perpendicularly to the direction of movement of the feature in the input image. For example, if the moving feature is a rectangle or a square, the feature preferably moves across the image in a direction perpendicular to two opposing sides of the rectangle or square. The test system may be arranged to measure the dynamic spatial resolution at a plurality of discrete times as the feature moves across the image and to average the measured values to provide an average dynamic spatial resolution. Alternatively, the test system may be configured to continuously measure the dynamic spatial resolution for a predetermined period of time as the feature moves across the input image and to output an average value, for example an arithmetic mean, measured over the predetermined period.
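One common way to quantify the sharpness of such an edge profile, used here as an illustrative assumption, is its 10–90% rise distance: the distance over which the intensity climbs from 10% to 90% of its full swing, with a shorter rise meaning a sharper edge. The linear interpolation and the 10%/90% levels below are conventions of this sketch, not prescribed by the system.

```python
def edge_rise_distance(profile, lo=0.1, hi=0.9):
    """10-90% rise distance (in sample units) of an edge intensity profile.

    `profile` is sampled along the direction of motion across the leading
    or trailing edge of the moving feature; a shorter rise distance means
    a sharper edge and hence a higher dynamic spatial resolution.
    """
    i_min, i_max = min(profile), max(profile)
    span = i_max - i_min

    def crossing(level):
        # first sample position (linearly interpolated) at this level
        target = i_min + level * span
        for i in range(len(profile) - 1):
            a, b = profile[i], profile[i + 1]
            if (a - target) * (b - target) <= 0 and a != b:
                return i + (target - a) / (b - a)
        return None

    # abs() makes the measure work for both rising and falling edges
    return abs(crossing(hi) - crossing(lo))
```

Averaging this value over several frames, directions and feature speeds would give the averaged dynamic resolution figures described above.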
Advantageously, the test system may measure the dynamic resolution in perpendicular directions, preferably in the horizontal and vertical directions, by first generating a moving input image in which the feature moves in a first direction and then generating a moving input image in which the feature moves in a second direction perpendicular to the first and calculating the dynamic spatial resolution in both directions. The test system may also be configured to measure the dynamic spatial resolution for a plurality of moving speeds of the moving feature. For example, a first dynamic spatial resolution may be calculated with the moving feature moving at a first speed and a second dynamic spatial resolution may be calculated with the moving feature moving at a second speed across the input image. The test system may then display a plot of the dynamic spatial resolution as a function of moving speed of the feature. Measuring the dynamic spatial resolution is particularly useful because NVDs are generally used in dynamic situations and the FOV being imaged typically includes a large number of moving objects. It is therefore more relevant to the real world performance of the device to measure the dynamic spatial resolution than the static spatial resolution.
Referring to Figure 23, the test system may also be arranged to perform a halo measurement test. When a bright feature is imaged by a NVD a “halo” of intensity is seen surrounding the feature (identified by the white circles in Figure 23). These image halos are a common problem experienced with NVDs and it is therefore useful to measure them. In order to do so, the test system may be arranged to display an input image 134 comprising one or more light or bright features 136 against a plain dark background 138. For example, features 136 may be white circles and the background may be a plain black background. Processing system 112 is arranged to measure the diameter of halo(s) 140 surrounding the one or more features in the image encoded in the output image data, for example using a threshold brightness criterion or edge detection algorithms. The diameter of halo(s) 140 is then compared to the diameter of the corresponding feature 136 in the input image, which is predetermined and known by the test system which generates the input image, and the diameter, size or other parameter characterising the halo is then determined and output, for example the increase in diameter of the feature caused by the halo. The halo test can be performed for different contrast levels between features 136 and background 138 to provide a measure of how the halo varies as a function of contrast. The halo measurement can also be performed at different locations within the NVD field of view to determine how the halo varies within the FOV, or the results of the halo test at different locations can be averaged to provide an average (e.g. mean) halo parameter.
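A minimal sketch of the threshold-brightness approach, applied to a single row of pixels through the centre of one bright feature (the pixel values, threshold and feature diameter are hypothetical):

```python
def halo_diameter(row, threshold):
    """Given a row of pixel intensities through the centre of a bright
    feature, return the width (in pixels) of the region at or above
    'threshold'. Comparing this to the known input-feature diameter
    gives the growth in diameter caused by the halo."""
    above = [i for i, v in enumerate(row) if v >= threshold]
    return (above[-1] - above[0] + 1) if above else 0

feature_diameter = 4  # known from the generated input image (illustrative)
row = [5, 5, 40, 80, 200, 255, 255, 200, 80, 40, 5, 5]
d = halo_diameter(row, threshold=30)
print("halo growth:", d - feature_diameter, "pixels")
```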
Referring to Figure 3, in a further test, the test system is arranged to display the plain white test input image with circular regions defined on it as described above. This means that any blemishes or artefacts that the NVD superimposes on the input image can be easily identified. A completely plain image could be used instead. The tablet is then arranged to use image processing techniques to identify two specific features in the output image: hexagonal patterns, commonly referred to as ‘chicken wire’ or ‘honeycomb’ patterns, and blemishes. The chicken wire pattern 300 is very distinctive and the program is arranged to detect individual lines 302 including their position and orientation, and then to analyse them to determine whether they are part of ‘chicken wire’ patterning. In this case the tablet is arranged to do this by detecting the presence of, for example, groups of three lines 304a, 304b, 304c meeting at a point at angles of 120° to each other. Once a sufficient number of these has been detected, the program is arranged to determine that the ‘chicken wire’ pattern has been detected. The strength of the pattern can be determined by, for example, the number of instances of the required combination of lines being detected in the whole image, and the thickness or darkness of the lines compared to the background. The program can then compare the detected pattern against one or more threshold criteria and give a pass or fail indication. Alternatively, or in addition, it can be arranged to grade the severity of the chicken wire pattern, for example on a scale of 1 to 6. An example of how this can be done is described in more detail below with reference to Figure 7. The results can then be indicated to the user, for example in a results area 306 on the screen 118.
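The 120° vertex test for three detected lines might be sketched as follows; the angular tolerance is an illustrative assumption:

```python
def is_chicken_wire_vertex(angles_deg, tol=10.0):
    """Check whether three detected line directions (in degrees) meeting
    at a point are mutually separated by roughly 120 degrees, as at a
    vertex of a honeycomb ('chicken wire') pattern."""
    a = sorted(x % 360 for x in angles_deg)
    gaps = [a[1] - a[0], a[2] - a[1], 360 - (a[2] - a[0])]
    return all(abs(g - 120) <= tol for g in gaps)

print(is_chicken_wire_vertex([90, 210, 330]))  # ideal honeycomb vertex
print(is_chicken_wire_vertex([0, 90, 180]))    # not a honeycomb vertex
```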
In order to identify, categorize and count the blemishes 400, the program is arranged to identify, again using edge detection, shape recognition and other suitable algorithms, any features (other than the hexagonal pattern) which are present in the image, and to determine their size. Once the number and size of the blemishes has been determined, this can be displayed to the user, either by simply highlighting them so that the user can count them, or by giving a direct indication of the number in a dedicated area of the screen 118. Obviously a variety of criteria can be used to determine what should be counted as a blemish; for example, it might include only shapes over a particular area and/or length. Also, each detected blemish can either just be identified as a blemish, or it may be categorized, for example by size and/or shape, and the number of each category of blemish indicated to the user, either just as a number for each category indicated in the results area 306 on the screen, or by highlighting different categories in different colours. An example of how the test for hexagonal patterning is performed is described below with reference to Figure 8. While it is possible for the chicken wire detection and blemish counting to be performed for the whole of the image, it can be more useful for a separate analysis to be carried out for different parts of the image. For example, the image may be divided into a plurality of different areas 402, which may be indicated on the image, for example by concentric rings 403 included in the white test image as shown in Figure 3, or rectangles or squares or segments. The determination of the presence or absence and categorization of chicken wire patterning, and the number and nature of blemishes, is then performed for each area, and the results indicated separately for each area in the results area 306 of the screen.
These lines are used if the test is done by eye but, as will be described in more detail below, they can also be used for other automated tests performed by the system.
Referring back to Figure 1, the eyepiece 110 of the NVD is typically manually adjustable to adjust the focus of the output image produced by the NVD. The system may be arranged to perform a focus assist function which aids a user in focusing the NVD so that other tests can be performed on it. Referring to Figures 4 and 5, the program is arranged to perform a calculation of focus on an output image, for example generated using the resolution test image of Figure 2, and to repeat this for subsequent images over a test period while the user adjusts the focus, and indicate the result of the focus calculation, and how it varies over time, to the user on the screen 118. As shown in Figure 5, the result may be displayed in real time, for example on a plot of focus against time, which may have time on a horizontal axis and focus, as determined by a focus metric, on the vertical axis. The user can then watch the plot as the focus is adjusted to determine when the optimum focus is reached.
As shown in Figure 4 the program may be arranged to acquire an initial set-up image data set for an image of the test pattern with a reference exposure, and from that data set to calculate the best exposure settings. The program is then arranged to communicate those optimum settings to the camera which is then arranged to use them for the rest of the test. The program then selects a test frame rate which is as fast as possible, whilst ensuring that the contrast is sufficient for the focus of each image to be determined. This may comprise testing the contrast between the lightest and darkest parts of the set-up image and ensuring that it is above a predetermined threshold, which is sufficiently great for the focus measurement to work. Generally the greater the frame rate, the lower the contrast will be. Once the optimum test frame rate has been set, the program is arranged to acquire a series of test images at the test frame rate, and for each test image, to calculate the value of a focus metric and display the value on the screen as shown in Figure 5. The value of the metric for each frame may be displayed as soon as it is calculated so as to generate a real-time plot of the focus. As the value of the metric changes over time, the plot will rise and fall, and the user can monitor this while adjusting the focus to determine where the best focus is.
For example as shown in Figure 5, as the eyepiece 110 is adjusted the focus might increase towards the maximum, and then decline again, which will be seen as a rising then falling curve 500. From this the user can see approximately where the maximum value 502 of the focus metric occurs, and move the eyepiece back towards the maximum focus. Then as the focus passes through the maximum again at point 504 this can be seen and the focus adjusted back again to the maximum. Obviously in practice the user may move the focus through the optimum any number of times, but with practice the optimum focus can be found very quickly. The program may also monitor the variation in focus metric over time and may detect the maximum value of the focus metric, for example by identifying the top 502 of the curve as the focus moves through that point. It may then be arranged to store the maximum value, and generate an output to the user when that maximum value is again reached, for example as an audible feedback, or visual feedback on the screen 118. This can aid the user in determining when the optimum focus has been reached.
The focus metric may be calculated in a number of ways, but may for example comprise plotting the intensity or grey level of the output image of one of the bars 200a in the test image of Figure 2, which will generally have the shape shown in Figure 5a with a high level 510 outside the bar and a low level 512 within the bar. If the focus is good the plot will be a sharp, deep, rectangular dip, whereas if the focus is poor, the dip will be rounder and more shallow. Therefore the focus metric may be the difference in intensity between the two levels 510, 512, or may include a measure of the sharpness of the intensity dip.
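The simplest form of the metric described above, the depth of the intensity dip across one bar, can be sketched as follows; the profile values are hypothetical:

```python
def focus_metric(profile):
    """Simple focus metric for an intensity profile taken across one bar
    of the test image: the depth of the dip (level outside the bar minus
    level inside it). A sharp, deep dip scores higher than the rounded,
    shallow dip produced by a poorly focused image."""
    return max(profile) - min(profile)

in_focus = [250, 250, 20, 20, 20, 250, 250]      # deep rectangular dip
out_of_focus = [250, 200, 120, 100, 120, 200, 250]  # rounded shallow dip
print(focus_metric(in_focus), focus_metric(out_of_focus))
```

A sharpness term (e.g. the steepness of the dip's sides) could be added to the metric, as the text notes, but the depth alone already rises and falls with the focus adjustment.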
Instead of manually focussing the NVD, the test system may advantageously be arranged to automatically focus the objective lens of the image detection device, for example the objective lens of digital camera 106. This removes the requirement to focus the NVD by hand. Computer device 112 may be arranged so as to control the focus of the objective lens of image detection device 106 and to calculate a focus metric. Computer device 112 then autofocusses the objective lens of image detection device 106 based on the focus metric. If computer device 112 determines that the optimum focus lies outside of the focus range of the objective lens of image detection device 106, the test system may prompt the operator to adjust the NVD objective lens appropriately, for example by displaying a command on display screen 118, to bring the optimum focus within the focus range of the objective lens of image detection device 106. Computer device 112 then autofocusses the objective lens of the image detection device. Generally, the focus of the NVD objective lens should be set at infinity prior to the autofocussing procedure and the test system may therefore be arranged to instruct the operator to set the NVD objective lens to infinity before commencing the autofocussing procedure.
Once the focus of the NVD is optimized, then the system can perform a number of other tests.
Referring to Figure 6, the automated resolution test, described in general terms above, will now be described in more detail. Firstly a sequence of images of the test image of Figure 2 is acquired. The program may then be arranged to average the sequence of images, for example by calculating an average over all the images for the value of each pixel. It may then be arranged to normalise the average image to correct for any variation in brightness over the area of the image, which is typically brighter in the centre than towards the edges, thereby producing a sample image. It may then be arranged to locate one or more large features in the image, such as the largest group of bars, and determine the exact position within the image of those features. Using stored data, for example a stored copy of the test image or stored coordinates of all the features of the test image, the program may calculate the position, scale and location of the image, i.e. the exact position, scale and location within the area imaged by the camera, of all of the features of the test image. From this it may determine where within the sample image all of the features of the test image should be. The program may then test the contrast of a number of features in the sample image, and from the results provide an indication of the resolution of the NVD. The program may be arranged to measure the contrast of one of the bars 200 of the test image as it appears in the sample image by determining the brightness at a point in the centre of the bar, and the brightness of a point well outside the bar. This is similar to the focus test described above. As described above with reference to Figure 5a, if the resolution is good, the contrast will be high as the centre of the bar will be black and the area just outside the bar will be white. However, if the bar is not well resolved in the sample image, the centre of the bar may not be completely black as the image will be somewhat blurred.
Therefore the comparison between the two brightness levels gives a measure of the contrast for that bar. Generally the centre of larger bars will be darker than the centre of smaller bars, so the program may be arranged to set a threshold contrast, and then measure the contrast for each of the bars, and determine the smallest bars for which the contrast is above the threshold level. This result can then be indicated to the user as described above.
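The threshold search over bar sizes might be sketched as follows. The Michelson-style contrast formula, the threshold value and the bar data are illustrative assumptions; the disclosure only requires some comparison of the two brightness levels:

```python
def contrast(centre, outside):
    """Michelson-style contrast between the brightness at the centre of a
    bar and at a point well outside it (illustrative choice of formula)."""
    return (outside - centre) / (outside + centre)

def smallest_resolved(bars, threshold=0.5):
    """bars: list of (bar_width, centre_intensity, outside_intensity).
    Return the width of the smallest bar whose contrast still meets the
    threshold, or None if no bar is resolved."""
    resolved = [w for w, c, o in bars if contrast(c, o) >= threshold]
    return min(resolved) if resolved else None

# Hypothetical measurements: larger bars have darker centres.
bars = [(8, 10, 250), (4, 30, 250), (2, 90, 250), (1, 180, 240)]
print("smallest resolved bar width:", smallest_resolved(bars))
```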
Referring to Figure 7, the honeycomb pattern test may be performed as follows. The tablet may first acquire a single sample image of the white test image. If the test image includes circles or other lines defining areas of the test image, then the program may be arranged to locate those and remove them to form a plain sample image. The program may then be arranged to calculate the extent of the honeycomb pattern in the plain sample image, and to allocate the extent of the pattern to one of a plurality of categories, for example on a scale of 1 to 10. This may then be repeated for a number of separate test images. The scale value of the honeycomb pattern for the images may then be averaged, and the program may then determine whether the NVG passes or fails this test based on the average value. It is possible to average the sample images and then calculate the extent of the honeycomb pattern on the averaged image. However this is less effective as the averaging tends to remove the honeycomb pattern. The results can be displayed to the user on the screen 118 either as a simple pass/fail indication, or as a graphical slider on a scale so that the strength of the honeycomb pattern is indicated, either on a continuous scale or as one of a number of finite steps or categories, so that a user can tell whether the test is passed or failed, and how marginal the result is.
Referring to Figure 8, an example of how the blemish test can be performed by the program will now be described. The program may first be arranged to acquire a sequence of sample images of the plain test image, and then to average those images to form an averaged sample image, which may then be normalised as for the resolution test described above. The images may be acquired using high ‘gamma’ which allows features in the darker parts of the image to be relatively easily identified. The average sample image may then be converted to a binary image in which each pixel has one of only two values, which can be represented as black and white. This will generally result in a sample image which is mostly white with black blemishes of various sizes and shapes in it. All areas of black may then be identified by the program, which may be arranged to categorize them by size and shape, in a ‘blob labelling’ step. Any circles or other lines intentionally present in the plain test image may then be identified by their size and shape, and excluded from the calculation, or they may be left in, for example if the blemishes are to be identified to a user for counting. Once all the blemishes have been categorized, they may be identified by category, for example by defining a plurality of size categories, assigning a colour to each category, and highlighting all the blemishes in each category in the colour associated with that category. If the areas of the image are also displayed this enables a user to count the number of blemishes in each category in each area. Alternatively, or in addition, the program may be arranged to determine the number of blemishes, in total or in each category, in each area, and indicate this numerically.
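The ‘blob labelling’ step on the binary image can be sketched as a connected-component search; the tiny test image below is illustrative (1 = black blemish pixel, 0 = white background):

```python
def label_blobs(binary):
    """Label 4-connected black (1) regions in a binary image, given as a
    list of rows, and return a list of blob sizes in pixels. Sizes can
    then be bucketed into categories for colouring and counting."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    sizes = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1 and not seen[y][x]:
                stack, size = [(y, x)], 0          # flood fill one blob
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    size += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] == 1 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(size)
    return sizes

img = [[0, 1, 1, 0, 0],
       [0, 1, 0, 0, 1],
       [0, 0, 0, 0, 1],
       [1, 0, 0, 0, 0]]
print(sorted(label_blobs(img)))  # three blemishes of different sizes
```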
Alternatively, or in addition, the program may be arranged to compare the number of blemishes in each area with a reference threshold number for that area, and if one or more of the thresholds is exceeded to determine that the NVD fails the test, but if none of the thresholds is exceeded, to determine that the NVD has passed the test. The pass/fail test result may be indicated to the user on the screen in any suitable manner.
Where the NVD is a goggle or other binocular device having two imaging systems, one on each side, one for each eye of the user, the program may also be arranged to perform a gain test arranged to compare the gain of the two imaging systems. The gain referred to here is the increase in brightness of the output image from the NVD compared to the input image. Clearly it is beneficial for the gain of the two sides to be equal. For this comparison to be accurate, the same input image may be used for both sides of the NVD, and the same camera may be used to capture the output image from both sides. This may be achieved, for example, using a sliding mounting for the NVD that enables the NVD to be moved between two positions, in one of which one side of the NVD is aligned with the input image so that it will produce an output image for capture by the camera, and in the other of which the other side of the NVD will be so aligned. To perform the test, the system is arranged to acquire an image with each side of the NVD, and then from those images to calculate a measure of the gain of each side of the NVD. The gains can then be compared, either by the system or by a user, to determine whether they are close enough to each other.
Referring to Figure 9, to perform the gain test the program may be arranged to acquire a sequence of output images for one side of the NVG, for example the left side, which are all from the same input image which may for example be the resolution test image. It may then be arranged to calculate an average image from the sequence of images. It may then be arranged to calculate an average grey level for an area of the average image, for example a small central area of the average image. It may then be arranged to convert the average grey level value to an average luminance value. These steps may then be repeated for the other side of the NVG to determine an average luminance value for that side. It may then be arranged to compare the two luminance values, for example by calculating a difference between the two values as a percentage of one of them or of the average of the two. This percentage may then be compared with a threshold percentage, and if it is below the threshold to determine that the two values are similar enough to pass the test, but if it is above the threshold to determine that they are too different in which case the test is failed. This pass/fail determination may be indicated to the user by any suitable indicator, for example on the screen 118, or the percentage value itself may be indicated.
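The final comparison step can be sketched as follows; the luminance values and the 10% threshold are illustrative assumptions:

```python
def gain_match_percent(lum_left, lum_right):
    """Difference between the two sides' average luminance values,
    expressed as a percentage of their mean."""
    mean = (lum_left + lum_right) / 2
    return abs(lum_left - lum_right) / mean * 100

def gain_test(lum_left, lum_right, threshold_pct=10.0):
    """Pass if the two sides' gains agree to within the threshold."""
    return gain_match_percent(lum_left, lum_right) <= threshold_pct

print(gain_test(105.0, 100.0))  # small mismatch: pass
print(gain_test(130.0, 100.0))  # large mismatch: fail
```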
Referring to Figures 10 to 12 the program may also be arranged to perform a gross distortion test to measure the degree of gross distortion of the image. Gross distortion occurs in the inverter or ‘twister’ which is a twisted fibre optic bundle arranged to correct for inversion of the image caused by the objective lens of the NVD. Slight imperfection of the twister results in distortion of the final image which converts straight lines to a generally S-shaped line. To perform this test the test system may be arranged to generate an input image which includes a number of straight lines 110, for example in the form of a square grid 112 as shown in Figure 11. The test program may be arranged to acquire a sequence of output images from the NVD using the camera. It may then be arranged to average those images and normalise them as described above. It may then be arranged to locate the grid lines 122 in the output image. As these may be distorted by gross distortion, as shown in an exaggerated form in Figure 12, the search may be arranged to find not just straight lines, but any lines which are straight or curved. Once the grid lines 122 have been located in the output image, the program may be arranged to calculate the distortion at a number of points along the lines, and convert the lines to data points on a distortion graph. To calculate the distortion, the program may be arranged to calculate the horizontal offset dx at points along each of the vertical lines 124 in the output image from the original position of the corresponding line 124a in the input image, and to calculate a similar offset dy in the vertical direction for the horizontal lines from their positions in the input image. The distortion graph would then be a plot of horizontal offset dx as a function of vertical distance y along the line, or a plot of vertical offset dy as a function of horizontal distance x along the line.
In each case the graph would have a maximum positive value of the offset and a maximum negative value of the offset, each of which can be identified. The program may then be arranged to correct the data points for lens distortion which will be a known quantity for the optical system, and then to determine the magnitude of the gross distortion. The magnitude may be defined in a number of ways, but is typically defined as the mean magnitude of the maximum positive and negative offsets dx between the distorted line and its original position. This can be determined for each of the lines and then averaged to determine an average value of gross distortion for the NVD. This value may then be compared with a threshold value, and if it exceeds the threshold the program may determine that the test has been failed, otherwise it may determine that it has passed. The pass/fail result or the mean distortion value, or both can be indicated to the user on the display 118. It will be appreciated that other straight features, such as edges, may be used instead of, or as well as, the lines described above.
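The magnitude calculation described above reduces to a few lines of arithmetic; the offset profiles below are hypothetical (lens-distortion correction is assumed to have been applied already):

```python
def line_distortion(offsets):
    """Gross distortion for one grid line: the mean magnitude of the
    maximum positive and maximum negative offsets dx from the line's
    undistorted position."""
    return (max(offsets) + abs(min(offsets))) / 2

def gross_distortion(lines):
    """Average the per-line distortion values over all grid lines to give
    a single figure for the NVD."""
    values = [line_distortion(offsets) for offsets in lines]
    return sum(values) / len(values)

# Two S-shaped offset profiles (dx at sample points along each line).
lines = [[-0.4, -0.1, 0.0, 0.3, 0.6],
         [-0.2, 0.0, 0.1, 0.4, 0.2]]
print(round(gross_distortion(lines), 3))
```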
Referring to Figures 13 and 14, the program may also be arranged to perform a discontinuity test. This may use the same input image as the gross distortion test, and the first steps of acquiring a sequence of images, averaging and normalising, locating the grid, converting to points on a distortion graph and correcting for lens distortion may be the same as described above for the gross distortion test. Then, the data points for each line may be scanned for any discontinuities. As there will be some noise in the image, a simple offset between two adjacent points on a line is not sufficient to indicate a discontinuity. The program may therefore be arranged to identify two sets of points each forming a substantially continuous vertical line 124a, 124b, i.e. having equal x coordinates, with an offset dx between the x coordinate of one of the two lines and the x coordinate of the other, and with the two lines being offset from each other in the y direction so that shifting one of them in the x direction by the offset dx would form a single continuous line. Once these two lines 124a, 124b have been located, the value of the offset dx between them can be determined and this represents the magnitude of the discontinuity. This process may be carried out for all of the lines on the grid 122 and a total number of discontinuities with their magnitudes calculated. These magnitudes may be displayed to the user on the screen 118, or they may be analysed by the program against a set of criteria and a simple pass or fail result determined which may then be displayed. It will be appreciated that other straight features, such as edges, may be used instead of, or as well as, the lines described above.
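A noise-tolerant scan of this kind might look as follows for one nominally vertical line; the run length, noise band and sample coordinates are illustrative assumptions:

```python
def find_discontinuity(xs, noise=0.5, min_run=3):
    """Scan the x-coordinates of points along a nominally vertical line
    for a step: two runs of at least 'min_run' points, each internally
    constant to within 'noise', separated by a jump larger than 'noise'.
    Returns the step magnitude dx, or None if no discontinuity is found."""
    for i in range(min_run, len(xs) - min_run + 1):
        left, right = xs[i - min_run:i], xs[i:i + min_run]
        if (max(left) - min(left) <= noise and
                max(right) - min(right) <= noise):
            dx = sum(right) / min_run - sum(left) / min_run
            if abs(dx) > noise:
                return dx
    return None

# A line with a clear 3-pixel step half way down, plus a little noise.
print(find_discontinuity([10.0, 10.1, 9.9, 10.0, 13.0, 13.1, 12.9]))
```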
Referring to Figure 15, the program may further be arranged to perform a magnification test on the NVD. Ideally the output image scale should be equal to that of the input image, i.e. a magnification of 1:1, and this should be the same for both sides of the NVD if it is a binocular device. In order to perform this test the program may be arranged to acquire a sequence of images using an input image having large features, such as the circles in the plain white input image or the large bars in the resolution test image. The sequence of images may then be averaged and normalised as in other tests described above. The program may then be arranged to locate one or more large features in the output image, such as the circles or large bars, and to compare one or more dimensions of those features with corresponding dimensions from a reference image or from reference values of the dimensions. It may then be arranged to calculate the magnification of the NVD, for example as a ratio or percentage, and compare the value of the magnification with a threshold value to determine whether the system passes or fails the test. The results may then be displayed on the screen 118, either as a pass/fail indication or as a value of the magnification, or both. This may be repeated for both sides of the NVD if appropriate.
Referring to Figure 16, the program may further be arranged to perform an alignment test to check that the two sides of a binocular NVD are aligned with each other. This test may use the same input image as is used for a manual test of alignment, which is in fact two images, one for the left hand side and one for the right hand side of the NVD. The test system may be arranged to generate these two images separately, one after the other. One of the images may have a square box and the other may have a cross, with the cross and the square box being in corresponding places in the two images, so that, if the two sides of the NVD are aligned, the cross and the box will be in corresponding places in the output images from the two sides. The apparatus may include a bridge piece which is arranged to be placed over two eyepieces of the NVD and to combine the images from them into a single image. If the test system only generates a single image, the NVD may be mounted on a slider so that the NVD can be moved between a first position in which the LHS of the NVD is located over the input image, and a second position in which the RHS of the NVD is located over the input image. The program may be arranged to acquire a first sequence of images from the LHS of the NVD using the box as the input image, and then to average the images and normalise the averaged image, and to calculate the location of the box in the image. It may then be arranged to acquire a second sequence of images from the RHS of the NVD using the cross as the input image, and again to average and normalise and calculate the location of the cross in the image. It is then arranged to calculate whether the cross and box in the respective output images are aligned such that, if the images were overlaid, the cross would be within the box in both the horizontal (x) and vertical (y) directions.
If it would, then the program may be arranged to display an indication that the test has been passed, but if not then the program may be arranged to display an indication that the test has been failed. Alternatively, the test system may be arranged to generate two output images simultaneously on spaced-apart screen areas so that each of the two sides of a binocular NVD can be used to image a respective one of the images. The bridge piece may then output a single combined image of, for example, the square and the cross, with the image from the two sides superimposed on each other. The program may then analyse the single combined image to determine the degree of misalignment between the two parts of the combined image, for example in two orthogonal directions, and determine from those measured misalignments whether the alignment test is passed or failed.
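The overlay check, once the box and cross centres have been located, is a simple containment test; the coordinates and box size below are hypothetical:

```python
def alignment_passes(box_centre, box_size, cross_centre):
    """Pass if, when the two output images are overlaid, the cross centre
    falls inside the box in both the x and y directions."""
    bx, by = box_centre
    cx, cy = cross_centre
    half = box_size / 2
    return abs(cx - bx) <= half and abs(cy - by) <= half

print(alignment_passes((100.0, 100.0), 20.0, (105.0, 95.0)))   # inside: pass
print(alignment_passes((100.0, 100.0), 20.0, (120.0, 100.0)))  # outside: fail
```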
Referring to Figure 17, the program may further be arranged to perform an image stability test. This is arranged to test for flicker in the image and comprises taking a number of images in a sequence with a high frame rate, for example in the range 100 to 200 Hz, calculating an average grey level for all or part of each image in the sequence, and measuring the variation in the average grey level between the images. The area of the image used may be a small part of the image, so as to keep the data acquisition rate at a manageable level, and may be in a central region of the image which is generally brighter. The variation may be measured by calculating the variance of the average grey levels of the sequence of images. The variance may be compared with a threshold value, and if it is above the threshold the stability test may be determined as failed, and if it is below, the test may be determined as passed. The pass/fail result may be indicated to the user on the screen 118.
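The variance comparison can be sketched as follows; the grey-level sequences and the variance threshold are illustrative assumptions:

```python
def stability_test(avg_grey_levels, variance_threshold):
    """Population variance of the per-frame average grey level over a
    high-frame-rate sequence; flicker shows up as a variance above the
    threshold. Returns (variance, passed)."""
    n = len(avg_grey_levels)
    mean = sum(avg_grey_levels) / n
    variance = sum((g - mean) ** 2 for g in avg_grey_levels) / n
    return variance, variance <= variance_threshold

steady = [128, 129, 128, 127, 128, 128]
flicker = [128, 90, 150, 95, 160, 100]
print(stability_test(steady, variance_threshold=5.0))
print(stability_test(flicker, variance_threshold=5.0))
```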
Referring to Figure 18, the program may further be arranged to perform a signal to noise ratio test on the image. For this test, the input image is a 0.2mm diameter spot with a luminance of 1×10⁻⁵ foot candles having the spectral content of 2856±50K blackbody radiation. The program is arranged to capture a sequence of output images from the NVD using that input image with a frequency of 10 frames per second. The program is then arranged, for each image, to measure the average signal S0 from the camera over the 0.2mm diameter area of the bright spot, the standard deviation N0 of the signal S0 over the sequence of images, the average signal Sbkd from the camera over a 0.2mm diameter area outside the bright spot, which is taken to be the background signal, and the standard deviation Nbkd of the background signal Sbkd. The program is then arranged to calculate the signal to noise ratio (SNR) of the NVD using the calculation:
SNR = (S0 − Sbkd)/√(N0² − Nbkd²)
It will be understood that this corresponds to the standard MIL-SPEC SNR calculation as defined in MIL-PRF-49428(CR).
Once the SNR has been calculated, this is compared with a threshold value, and a pass/fail indication provided to the user.
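A minimal sketch of the SNR calculation and threshold comparison; the signal, noise and threshold values are hypothetical, not figures from the MIL specification:

```python
def snr(s0, n0, s_bkd, n_bkd):
    """Signal to noise ratio per the calculation above: background-
    subtracted signal divided by the spot noise with the background noise
    removed in quadrature."""
    return (s0 - s_bkd) / (n0 ** 2 - n_bkd ** 2) ** 0.5

# Hypothetical measured values and pass threshold.
value = snr(s0=520.0, n0=5.0, s_bkd=40.0, n_bkd=3.0)
print(round(value, 1), "PASS" if value >= 100.0 else "FAIL")
```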
While the system shown in Figure 1 is only arranged to test a single output image at one time, as mentioned above, in a modification to this the test set 100 is arranged to generate two images simultaneously, and there are two cameras 106, one for mounting on each side of a binocular NVD, and both of which are connected to the tablet 112 at the same time. This allows the simultaneous testing of both sides of the binocular device, and also means that, for the tests that require images from both sides, such as the alignment test and the magnification and gain tests, the input images for the two sides can be coordinated and the output images detected and processed simultaneously. Referring to Figure 19, in a further modification, rather than a separate tablet, the processing may be carried out entirely within the test set 100a. In this case the test set 100a may be arranged to generate all of the required input images in sequence, with one image for each side of the binocular device 104a generated simultaneously, and two cameras 106a, 106b, one for mounting on each side of the NVD, to capture the required output images for all, or any combination, of the tests described above to be performed in an automated cycle. The test set 100a may include a touchscreen 118a which may provide any or all of the functions of the tablet 112 described above. The output to the user may then be in the form of an overall pass/fail result, or a separate result for each test, or more detailed data from each test as described above.
Referring to Figure 20, the test set 100a may include an active LCD screen 150a, which may be arranged over a light source 152a, and which may be controlled by the program so as to generate different images as required. These may include the plain white image (with or without circular regions marked), the square grid pattern, or the resolution test bar image, or any other images that may be used for testing the NVD 104a. The light source 152a may comprise a lamp 154a located inside an integrating sphere 156a and a diffuser 158a. One or more lenses 160a may be provided over the LCD screen 150a to provide some focussing of the images generated by the LCD screen. This image generation system allows high flexibility and automation of the various tests described above. It will be appreciated that other methods of generating the image such as an LED or OLED display or a micro-mirror array can be used.
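The kind of resolution test bar image that the active screen might display can be generated programmatically. The following Python sketch (illustrative only: the function name, dimensions and numpy representation are assumptions, not part of the disclosure) builds an image containing several sets of equally spaced vertical bars, one band per spacing:

```python
import numpy as np

def bar_pattern(height, width, periods_px, bar_count=5):
    """Build a test image holding several sets of vertical parallel bars,
    one horizontal band per entry in periods_px (bar+space period, pixels).

    Returns a uint8 image: 255 = white space, 0 = dark bar.
    """
    img = np.full((height, width), 255, dtype=np.uint8)
    band_h = height // len(periods_px)
    x = np.arange(width)
    for i, period in enumerate(periods_px):
        bars = (x % period) < (period // 2)   # dark bar in first half-period
        bars[bar_count * period:] = False     # limit each set to bar_count bars
        img[i * band_h:(i + 1) * band_h, bars] = 0
    return img
```

The resulting array could then be written to the LCD (or LED, OLED or micro-mirror) display by whatever display interface the test set uses.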
Referring to Figures 24 and 25, the test system may advantageously be arranged as a self-contained portable kit 142. In this arrangement, the components of the test system, specifically image generation device 100b, image detection device 106b, and the processing system or computer device (in this case tablet computer 112b) are arranged and mounted within a casing 144 to provide an integrated system configured to receive a NVD for testing. Preferably, casing 144 is a portable, hand-held casing such as a briefcase. Casing 144 may be closable for transport, for example casing 144 may comprise a lid 146. In particular, casing 144 may have an open configuration for receiving the NVD for testing and a closed configuration for storage or transport. Integrated test kit 142 may also comprise NVD mount 148 arranged to support NVD 104b and to locate NVD 104b between image generation device 100b and image detection device 106b. Mount 148 is arranged so that NVD 104b is aligned and optically coupled with image generation device 100b and image detection device 106b when NVD 104b is received and supported by mount 148. In particular, test system 142 is arranged such that ambient light is excluded by the coupling of NVD 104b to image generation device 100b and image detection device 106b. To achieve this, the test system may comprise connectors (not shown) to couple NVD 104b to image generation device 100b and image detection device 106b. These connectors may be arranged so as to form an optical seal excluding ambient light when the NVD is coupled to image generation device 100b and image detection device 106b. For this purpose, the connectors may comprise a flexible or resilient material such as rubber which abuts the NVD so as to form an optical seal.
One or more of image generation device 100b, image detection device 106b and NVD mount 148 may be slidable in a direction along the longitudinal axis of NVD 104b so as to allow the concatenation of image generation device 100b, image detection device 106b and NVD 104b once NVD 104b is received by mount 148. For example, image detection device 106b may be slidable to press against the output end of NVD 104b, for example the eyepiece of NVD 104b, so as to urge NVD 104b towards image generation device 100b and to form an optical seal between NVD 104b and image generation device 100b and NVD 104b and image detection device 106b.
Mount 148 may be arranged to support a monocular NVD, or it may be arranged to support a binocular NVD. In the latter case, the test system may be arranged to align one side of the NVD at a time with image generation device 100b and image detection device 106b. For example, mount 148 may be a static mount that is arranged to receive and align either the left or right side of a binocular NVD with image generation device 100b and image detection device 106b. Alternatively, mount 148 may be arranged to receive both sides of the binocular NVD simultaneously and may be movable, for example slidable, between two positions in which either the left or the right side of the NVD is aligned with image generation device 100b and image detection device 106b. Another possibility is that the test system comprises two image generation devices and two image detection devices, one for each side of the NVD, and mount 148 is arranged to simultaneously align the left side of the binocular NVD with the left image generation and image detection devices and to align the right side of the binocular NVD with the right image generation and image detection devices.
The test system may be arranged to provide a recess or cavity 150 for receiving the NVD, mount 148 being arranged to locate NVD 104b within cavity 150. Image generation device 100b may be located on a first side of cavity 150 and image detection device 106b may be located on a second side of cavity 150 opposite the first side. The test system may also comprise a recess configured to receive the computer device, in this case tablet 112b. Alternatively, the computer device may be provided separately from the integrated system and may be in data communication with image generation device 100b and image detection device 106b, preferably wirelessly. For example, the computer device may be a tablet computer or other mobile computer device such as a smart phone, and may interface with the integrated test system wirelessly. Alternatively, the computer device may be integrated with the remainder of the test system within the casing.
Test system 142 provides an integrated system in which the components of the system are pre-arranged ready to receive a NVD for testing. This minimises set-up time, produces repeatable results, and requires little skill or expertise to perform NVD testing. Furthermore, test kit 142 is self-contained and portable, is easy to store and is robust due to the closable casing.

Claims

1. A system for testing a night vision device, the system comprising:
detection means arranged to detect an image produced by the night vision device and to output image data encoding the image;
image generating means arranged to generate an input image so that the image produced by the night vision device is an output image reproducing the input image;
processing means arranged to analyse the image data to determine whether the image encoded in the image data meets at least one criterion, and to generate an output indicative of whether the at least one criterion is met; and
a mount arranged to support the night vision device and to align the night vision device with the detection means and the image generating means;
wherein the detection means, the image generating means, the mount and optionally also the processing means are assembled together to form an integrated system configured to receive the night vision device for testing.
2. The system of claim 1, wherein the system is arranged such that the night vision device is optically coupled with the image generating means and the detection means when it is supported by the mount.
3. The system of claim 1 or claim 2 further comprising a casing, wherein the detection means, the image generating means, the mount and optionally the processing means are arranged within the casing so as to receive the night vision device for testing.
4. The system of claim 3, wherein the casing is configured to be transported by hand.
5. The system of any preceding claim, wherein the system is arranged to form a cavity for receiving the night vision device located between the detection means and the image generating means, the mount being arranged to support the night vision device within the cavity.
6. The system of any preceding claim, wherein the image generating means comprises an active screen, the processing means being configured to control the image displayed by the active screen.
7. The system of any preceding claim, wherein the processing means is configured to perform a series of tests in response to a single input command, each of the tests being configured to evaluate whether the image encoded by the image data meets a different criterion.
8. The system of claim 7, wherein the processing means is configured to cause the image generating means to generate a series of images for performing the tests.
9. The system of any preceding claim, wherein the detection means comprises an objective lens arranged to focus the output image from the night vision device, the system being configured to autofocus the objective lens of the detection means based on a focus metric calculated by the processing means from the image data.
10. The system of any preceding claim, wherein the processing means is arranged to:
cause the image generating means to generate an input image comprising a moving feature; and
calculate a spatial resolution based on the intensity profile defining an edge of the moving feature in the image encoded by the image data.
11. The system of any preceding claim, wherein the processing means is arranged to:
cause the image generating means to generate at least one input image comprising a plurality of sets of equally spaced parallel bars, each of the sets of parallel bars having a different predetermined spacing between the bars;
measure, for each of the sets of parallel bars, the contrast between the bars and the spaces between the bars in the image encoded by the image data; and
calculate a modulation transfer function based on the measured contrast values for the plurality of sets of bars and the predetermined spacing of the bars.
12. A method for testing the dynamic spatial resolution of a night vision device, the method comprising:
generating a moving input image for the night vision device, the moving input image comprising a moving feature; and
calculating a spatial resolution based on the intensity profile defining an edge of the moving feature in an output image generated by the night vision device.
13. The method of claim 12, wherein the method comprises calculating a spatial resolution for each of a plurality of output images generated by the night vision device at different times, and averaging the plurality of calculated spatial resolutions to give an average spatial resolution.
14. The method of claim 13, wherein the edge used to calculate the spatial resolution is a straight edge extending in a direction perpendicular to the direction of movement of the moving feature.
15. The method of one of claims 12 to 14, wherein the edge used to calculate the spatial resolution is a leading or a trailing edge of the moving feature.
16. The method of any one of claims 12 to 15, wherein the input image comprises a plain background and the moving feature has a uniform intensity.
17. A method for measuring the modulation transfer function of a night vision device, the method comprising:
generating at least one input image for the night vision device, the at least one input image comprising a plurality of sets of equally spaced parallel bars, each of the sets of parallel bars having a different predetermined spacing between the bars;
obtaining at least one output image generated by the night vision device, the at least one output image reproducing the at least one input image;
measuring for each of the sets of parallel bars in the at least one output image the contrast between the bars and the spaces between the bars; and
calculating the modulation transfer function based on the measured contrast values for the plurality of sets of bars and the predetermined spacing of the bars.
18. The method of claim 17, wherein the measured contrast values are representative of the maximum difference in intensity between the spaces and the bars as measured from the output image.
19. The method of claim 17 or claim 18, wherein calculating the modulation transfer function comprises normalising the measured contrast values with respect to the theoretical maximum contrast between the bars and the spaces between the bars.
20. The method of any one of claims 17 to 19, wherein the at least one input image comprises a plurality of input images, each of the input images comprising at least one of the sets of parallel bars.
21. The method of any one of claims 17 to 20, wherein the measured contrast between the bars and the spaces between the bars for each of the sets of parallel bars is an average of the contrast between the bars and the adjacent spaces between the bars in that set.
22. A night vision device testing system configured to perform the method of any one of claims 12 to 21.
23. The system of claim 22, wherein the system comprises detection means arranged to detect an image produced by the night vision device and to output image data encoding the image;
image generating means arranged to generate an input image so that the image produced by the night vision device is an output image reproducing the input image; and
processing means arranged to analyse the image data.
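The modulation transfer function measurement set out in claims 17 to 21 can be illustrated with a brief sketch. This is illustrative only: the function names are not from the disclosure, the spacing is assumed to equal the bar width (so one line pair spans twice the spacing), and contrast is taken as the maximum intensity difference across the bars, per claim 18:

```python
import numpy as np

def mtf_point(profile, full_contrast):
    """Normalised contrast of one bar set: the maximum difference in
    intensity across the bars, divided by the theoretical maximum
    contrast between the bars and the spaces."""
    profile = np.asarray(profile)
    return (profile.max() - profile.min()) / full_contrast

def mtf_curve(profiles, spacings_mm, full_contrast):
    """Return (spatial frequency in lp/mm, MTF) pairs, one per bar set.

    profiles    : intensity profiles sampled across each set of bars
                  in the output image
    spacings_mm : the predetermined bar spacing of each set (assumed
                  equal to the bar width, so period = 2 * spacing)
    """
    return [(1.0 / (2.0 * s), mtf_point(p, full_contrast))
            for p, s in zip(profiles, spacings_mm)]
```

Plotting the resulting pairs gives the MTF curve of the device under test, with contrast falling off as the bar spacing approaches the device's resolution limit.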
PCT/GB2020/050320 2019-02-15 2020-02-12 Night vision device testing WO2020165581A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2113029.9A GB2596009B (en) 2019-02-15 2020-02-12 Night vision device testing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB1902122.9A GB201902122D0 (en) 2019-02-15 2019-02-15 Night vision device testing
GB1902122.9 2019-02-15

Publications (1)

Publication Number Publication Date
WO2020165581A1 (en) 2020-08-20

Family

ID=65998455

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2020/050320 WO2020165581A1 (en) 2019-02-15 2020-02-12 Night vision device testing

Country Status (2)

Country Link
GB (2) GB201902122D0 (en)
WO (1) WO2020165581A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113432838A (en) * 2021-06-09 2021-09-24 北方夜视技术股份有限公司 Automatic testing system and method for signal-to-noise ratio and halo of low-light-level image intensifier
CN114184355A (en) * 2021-11-22 2022-03-15 河南中光学集团有限公司 Night and day glimmer product performance adjusting and detecting device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5140418A (en) * 1991-03-18 1992-08-18 The United States Of America As Represented By The Secretary Of The Army System for quantitatively evaluating imaging devices
US5608213A (en) * 1995-11-03 1997-03-04 The United States Of America As Represented By The Secretary Of The Air Force Spectral distribution emulation
US20080158374A1 (en) * 2006-12-31 2008-07-03 Sapia Mark A Systems and methods for quantitatively assessing the quality of an image produced by an imaging system
GB2532651A (en) * 2015-05-14 2016-05-25 Fenn Night Vision Ltd Night vision device testing


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
J. PARTEE ET AL: "Automated intensifier tube measuring system", PROCEEDINGS OF SPIE, vol. 6956, 3 April 2008 (2008-04-03), US, pages 695608, XP055279291, ISBN: 978-1-5106-1533-5, DOI: 10.1117/12.771384 *
KRZYSZTOF CHRZANOWSKI: "A COMPUTERIZED STATION FOR TESTING NIGHT VISION DEVICES", 30 January 2014 (2014-01-30), XP055279168, Retrieved from the Internet <URL:http://www.inframet.com/Literature/Optro 2014-2951823.pdf> [retrieved on 20160609] *


Also Published As

Publication number Publication date
GB201902122D0 (en) 2019-04-03
GB2596009B (en) 2023-10-11
GB202113029D0 (en) 2021-10-27
GB2596009A (en) 2021-12-15


Legal Events

- 121 — Ep: the EPO has been informed by WIPO that EP was designated in this application (ref document: 20707766; country: EP; kind code: A1)
- NENP — Non-entry into the national phase (ref country code: DE)
- ENP — Entry into the national phase (ref document: 202113029; country: GB; kind code: A; free format text: PCT FILING DATE = 20200212)
- 122 — Ep: PCT application non-entry in European phase (ref document: 20707766; country: EP; kind code: A1)