CN111306988A - Calibration data determination method and device - Google Patents


Info

Publication number
CN111306988A
Authority
CN
China
Prior art keywords
image
target
data
shooting
pixel position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811518418.6A
Other languages
Chinese (zh)
Other versions
CN111306988B (en)
Inventor
温俊阳
柴冯冯
胡晓明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikmicro Sensing Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201811518418.6A priority Critical patent/CN111306988B/en
Publication of CN111306988A publication Critical patent/CN111306988A/en
Application granted granted Critical
Publication of CN111306988B publication Critical patent/CN111306988B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41WEAPONS
    • F41GWEAPON SIGHTS; AIMING
    • F41G1/00Sighting devices
    • F41G1/06Rearsights

Landscapes

  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)

Abstract

The application provides a calibration data determination method and device. The method comprises: acquiring a plurality of data sets, wherein the target device obtains one data set each time it shoots at a designated target; each data set comprises a shooting distance and an image, the shooting distances in the data sets are different, the shooting distance is the distance between the target device and the designated target, and the image is an image of the designated target acquired by a sighting device arranged on the target device; traversing the data sets and marking the impact point in the image of each data set; and taking the shooting distance of a data set together with the position information of the impact point in its image as one set of calibration data. Because the calibration data are determined from pairs of shooting distances and images, the time-consuming and labor-intensive measurement of (X, Y) values is eliminated; the image coordinate position is obtained directly by marking the impact point in the image, without conversion from the error-prone (X, Y) measurements, so the accuracy of impact-point calibration, and hence of the calibration data, is improved.

Description

Calibration data determination method and device
Technical Field
The present application relates to the field of calibration technologies, and in particular, to a calibration data determining method and apparatus.
Background
At present, thermal imaging sights are widely used on firearms of many types. They capture an image of the target without depending on external light and estimate the bullet impact point from calibration data stored in the sight, so as to help the shooter hit the target. The accuracy of the calibration data therefore directly determines the accuracy of the estimated impact point, and in turn how accurately the target can be hit.
In the related art, calibration data are obtained by manually collecting multiple groups of shooting data and entering them into the thermal imaging sight, where each group comprises a shooting distance L and the horizontal distance X and vertical distance Y between the impact point and the target center. The sight then converts the X and Y of each group into an image coordinate position (u, v) through the optical imaging principle, and stores the converted (u, v) together with L as calibration data. However, collecting the shooting data is time-consuming and labor-intensive, and because the manually measured horizontal and vertical distances between the impact point and the target center carry large errors, the resulting calibration data have low accuracy.
Disclosure of Invention
In view of this, the present application provides a method and an apparatus for determining calibration data, so as to solve the problem of low accuracy of the calibration data obtained in the related art.
According to a first aspect of an embodiment of the present application, there is provided a calibration data determining method, including:
acquiring a plurality of data sets, wherein the target device obtains one data set each time it shoots at a designated target, each data set comprises a shooting distance and an image, the shooting distances in the data sets are different, the shooting distance is the distance between the target device and the designated target, and the image is an image of the designated target acquired by a sighting device arranged on the target device;
traversing each data set, and marking the impact points in the image of each traversed data set;
and taking the shooting distance of the data set and the position information of the impact point in the image as a set of calibration data.
According to a second aspect of embodiments of the present application, there is provided a calibration data determination apparatus, including:
a data acquisition module, configured to acquire a plurality of data sets, wherein the target device obtains one data set each time it shoots at a designated target, each data set comprises a shooting distance and an image, the shooting distances in the data sets are different, the shooting distance is the distance between the target device and the designated target, and the image is an image of the designated target acquired by a sighting device arranged on the target device;
a marking module, configured to traverse each data set and, for each traversed data set, mark the impact point in the image of the data set;
and a calibration data determination module, configured to take the shooting distance of the data set and the position information of the impact point in the image as one set of calibration data.
According to a third aspect of embodiments herein, there is provided a sight apparatus, the apparatus comprising a readable storage medium and a processor;
wherein the readable storage medium is configured to store machine executable instructions;
the processor is configured to read the machine executable instructions on the readable storage medium and execute the instructions to implement the steps of the method according to the first aspect.
By applying the embodiments of the application, a plurality of data sets are acquired (the target device obtains one data set each time it shoots at the designated target; each data set comprises a shooting distance and an image; the shooting distances in the data sets are different; the shooting distance is the distance between the target device and the designated target; and the image is an image of the designated target acquired by the sighting device). Each data set is then traversed in turn; for each traversed data set, the impact point is marked in its image, and the shooting distance of the data set together with the position information of the impact point in the image is taken as one set of calibration data.
Based on the above, the calibration data are determined by acquiring a plurality of data sets comprising shooting distances and images, eliminating the time-consuming and labor-intensive measurement and recording of (X, Y) values for the sight; the image coordinate position is obtained directly by marking the impact point in the image, without conversion from the error-prone (X, Y) measurements, so the accuracy of impact-point calibration, and hence of the calibration data, is improved.
Drawings
FIG. 1 is a diagram illustrating a ballistic calibration architecture according to an exemplary embodiment of the present application;
FIG. 2A is a flowchart illustrating an embodiment of a calibration data determination method according to an exemplary embodiment of the present application;
FIG. 2B is a schematic diagram of a cursor marking a bullet point according to the embodiment shown in FIG. 2A;
FIG. 2C is a schematic illustration of a calibration data used to predict a strike point according to the embodiment of FIG. 2A;
FIG. 3 is a hardware block diagram of a sight apparatus shown in accordance with an exemplary embodiment of the present application;
fig. 4 is a block diagram of an embodiment of a calibration data determining apparatus according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "at the time of", "when", or "in response to a determination", depending on the context.
As shown in fig. 1, the ballistic calibration setup comprises a sight mounted on a gun. After the gun is aimed at the target center, the shooting distance L between the target and the gun is measured. After a bullet fired by the gun passes through the target, the horizontal distance X and vertical distance Y between the impact point (i.e., the bullet hole) and the target center are measured; the measured (X, Y) is substituted into the conversion formula between the spatial coordinate system and the screen coordinate system to obtain the image coordinate position (u, v); finally, (u, v, L) is taken as one set of calibration data. Multiple sets of calibration data are obtained in turn by moving the target or the gun.
However, acquiring the raw data (X, Y, L) is difficult: (X, Y) in particular must be measured manually on the target, and because the target is small, the manual measurement error is large. In addition, mapping (X, Y) to image coordinates introduces a conversion error inside the sight and increases the computational load on the sight's chip.
To solve the above problems, the present application provides a calibration data determination method: a plurality of data sets are acquired (the target device obtains one data set each time it shoots at the designated target; each data set comprises a shooting distance and an image; the shooting distances in the data sets are different; the shooting distance is the distance between the target device and the designated target; and the image is an image of the designated target acquired by the sighting device); each data set is then traversed in turn; for each traversed data set, the impact point is marked in its image; and the shooting distance of the data set together with the position information of the impact point in the image is taken as one set of calibration data.
Based on the above, the calibration data are determined by acquiring a plurality of data sets comprising shooting distances and images, eliminating the time-consuming and labor-intensive measurement and recording of (X, Y) values for the sight; the image coordinate position is obtained directly by marking the impact point in the image, without conversion from the error-prone (X, Y) measurements, so the accuracy of impact-point calibration, and hence of the calibration data, is improved.
The technical solution of the present application will be described in detail with specific examples.
Fig. 2A is a flowchart of an embodiment of a calibration data determination method according to an exemplary embodiment of the present application. The method may be applied to a sighting device installed on a target device, or to another electronic device; in the latter case, the data sets collected by the sighting device are transmitted to the other electronic device, which determines the calibration data from the multiple data sets. In this embodiment, the sighting device installed on the target device is taken as the example. As shown in fig. 2A, the calibration data determination method includes the following steps:
step 201: the method comprises the steps of obtaining a plurality of data sets, wherein the target device designates a target once to obtain one data set, each data set comprises a shooting distance and an image, the shooting distances in the data sets are different, the shooting distance is the distance between the target device and the designated target, and the image is the image acquired by aiming device acquisition of the designated target.
In one embodiment, the target device fires at the same point of the designated target each time. If there is only one designated target, the number of impact points in each image equals the shot's order in the sequence, and the shooting distance increases or decreases monotonically with the shot order; for example, at the fifth shot, the image acquired at that time contains 5 impact points. If there are several designated targets, each image contains only 1 impact point.
The designated target may be a shooting target, and the target device fires at the target center each time.
In an embodiment, the shooting distance of each group may be obtained as follows: at each shot of the target device, the shooting distance between the target device and the designated target is obtained through a ranging module built into the sighting device.
The ranging module may be a laser ranging module, a pitch-angle ranging module, an ultrasonic ranging module, or the like.
In an exemplary scenario, for the ballistic calibration setup shown in fig. 1, assume there is one target and the gun is aimed at the target center throughout the calibration process. At the first shot, shooting distance 1 between the target and the gun is acquired through the ranging module; after the bullet passes through the target, the sighting device records the acquired target image, which contains 1 bullet hole (i.e., impact point). The gun or the target is then moved; at the second shot, shooting distance 2 between the target and the gun is acquired through the ranging module, and after the bullet passes through the target, the sighting device records the acquired target image, which contains 2 bullet holes. This continues until the Nth shot: as the shot order increases, the shooting distance between the target and the gun grows, and the image acquired at the Nth shot contains N bullet holes.
It should be noted that the sighting device usually displays the acquired image on the screen in real time based on the thermal imaging (i.e., infrared imaging) principle: the pixel values in the image are related to the temperatures of the imaged objects, and the higher the temperature, the larger the corresponding pixel value and the more conspicuous it appears in the image. After a bullet fired by the gun passes through the designated target, friction heat makes the bullet hole formed on the target hotter than the environment, and the temperature difference is significant, so the user can easily see the location of the bullet hole in the image on the screen. If for special reasons the bullet hole cannot be seen on the screen, a hot object (such as a lit cigarette end) or high-emissivity paint can be inserted into the bullet hole until the user can see its position; the sighting device then records the current image when the OK button is pressed, yielding a data set comprising the shooting distance and the image.
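The acquisition procedure above (one distance plus one image per shot) can be sketched as follows. This is a minimal illustration only; `rangefinder` and `camera` are hypothetical interfaces standing in for the sight's built-in ranging module and thermal imager, and are not named in the patent.

```python
from dataclasses import dataclass

@dataclass
class DataSet:
    distance_m: float  # shooting distance reported by the ranging module
    image: list        # thermal image recorded after the shot

def acquire_data_sets(rangefinder, camera, num_shots):
    """Collect one (distance, image) pair per shot, as in step 201.

    One DataSet is produced per shot; the distance is measured at the
    moment of the shot and the image is recorded once the bullet hole
    is visible on screen (e.g. when the user presses OK).
    """
    data_sets = []
    for _ in range(num_shots):
        distance = rangefinder.measure()
        # ... shot is fired; user confirms the bullet hole is visible ...
        image = camera.capture()
        data_sets.append(DataSet(distance, image))
    return data_sets
```

In practice the gun or target is moved between shots so the recorded distances differ, as the method requires.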
Step 202: and traversing each data set in turn, and marking the impact points in the image of each traversed data set.
In an embodiment, based on the thermal imaging principle described in step 201, the pixel position with the maximum temperature value may be detected in the image corresponding to the shooting distance and taken as the impact point, achieving automatic marking and saving manual operation.
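The maximum-temperature detection above can be sketched as a simple argmax over the image, optionally restricted to a pre-calibrated region (the `roi` parameter and function name here are illustrative, not from the patent):

```python
import numpy as np

def detect_impact_point(thermal_image, roi=None):
    """Return the (u, v) pixel coordinates of the hottest pixel.

    thermal_image: 2-D array of per-pixel temperature (or radiometric) values.
    roi: optional (top, bottom, left, right) bounds restricting the search to
         the pre-calibrated field-of-view region of the designated target,
         to avoid interference from other heat sources.
    """
    img = np.asarray(thermal_image, dtype=float)
    if roi is None:
        top, bottom, left, right = 0, img.shape[0], 0, img.shape[1]
    else:
        top, bottom, left, right = roi
    window = img[top:bottom, left:right]
    # argmax over the window, converted back to full-image coordinates
    v, u = np.unravel_index(np.argmax(window), window.shape)
    return (left + u, top + v)  # (column, row)
```

The bullet hole is hotter than its surroundings, so its pixel carries the largest value and the argmax lands on it.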
It should be noted that, after the impact point is determined, a preset cursor may be moved to the pixel position of the impact point for the user to check; if the user sees that the cursor deviates from the impact point on the screen, the cursor can be fine-tuned with the keys until it coincides with the impact point, and the cursor's position in the image is then taken as the impact point. The preset cursor may be a conspicuous cross cursor.
It should further be noted that, to avoid interference from other heat sources, the field-of-view region of the designated target in the image may be calibrated in advance, and the pixel position with the maximum temperature value is then detected only within that pre-calibrated region.
In an embodiment, since the user can easily see the bullet hole in the image on the screen, the user may instead mark the impact point entirely by hand: the cursor is moved with the keys on the sighting device until it coincides with the bullet hole, and the maximum-temperature detection is omitted. Fig. 2B shows the cursor marking the impact point on the screen of the sighting device; the cursor is moved with the keys until it completely coincides with the impact point.
Note that marking the impact point with a cursor in the image introduces an error at the pixel level, and one pixel corresponds to an actual size on the order of millimeters, so the introduced error is on the millimeter level; compared with the centimeter-level error of manual measurement in the prior art, the accuracy of this embodiment is higher.
In an embodiment, the data sets may be traversed in ascending order of shot order. When taking a detected pixel position as the impact point: if there is only one detected pixel position, that position is determined as the impact point; if there are several, one of them is selected as the impact point according to whether the shooting distance of the currently traversed data set is larger or smaller than that of the previously traversed data set, and the selected position must differ from the pixel position of the previously marked impact point.
As described in step 201, when there are several designated targets, each image contains only 1 impact point, so the image yields only one maximum-temperature pixel position, which can be taken as the impact point directly. When there is a single designated target, only the image of the first shot (shot order 1) contains 1 bullet hole and thus yields a single maximum-temperature pixel position; in later shots the number of bullet holes in the image equals the shot order, and if the interval between shots is short, the image may contain several maximum-temperature pixel positions. The selection then relies on the characteristic that the farther the shooting distance, the lower the impact point: if the shooting distance of the currently traversed data set is greater than that of the previously traversed one, the lowest pixel position (excluding the previously marked impact point) is selected as the impact point for the current shooting distance; if it is smaller, the topmost pixel position (excluding the previously marked impact point) is selected.
Step 203: and taking the shooting distance of the data set and the position information of the impact point in the image as a set of calibration data.
In an embodiment, after multiple sets of calibration data have been obtained, the sighting device obtains the shooting distance between the target device and the target during actual shooting, searches the stored calibration data with that distance, determines the actual impact point from the position information in the matching calibration data, and marks the actual impact point on the screen for the user to view.
If the stored calibration data contain the shooting distance, the position information of that entry is directly determined as the actual impact point. If not, the two calibration entries whose shooting distances bracket the current distance are obtained, new position information is computed by interpolating between the position information of those two entries, and the interpolated position is determined as the actual impact point.
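The lookup-or-interpolate rule above can be sketched as follows, assuming (as a simple reading of the text) plain linear interpolation between the two bracketing entries; the function name and the out-of-range behavior are illustrative choices, not specified by the patent:

```python
from bisect import bisect_left

def predict_impact_point(calibration, distance):
    """Return the (x, y) impact point for a shooting distance.

    calibration: list of (x, y, L) tuples sorted by ascending distance L.
    An exact entry is returned when L matches; otherwise the position is
    linearly interpolated between the two entries bracketing `distance`.
    """
    distances = [L for _, _, L in calibration]
    i = bisect_left(distances, distance)
    if i < len(distances) and distances[i] == distance:
        x, y, _ = calibration[i]
        return (x, y)
    if i == 0 or i == len(distances):
        raise ValueError("distance outside calibrated range")
    x0, y0, L0 = calibration[i - 1]
    x1, y1, L1 = calibration[i]
    t = (distance - L0) / (L1 - L0)  # fraction of the way between the two entries
    return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
```

With five calibrated distances L1 < … < L5 and a current distance between L3 and L4, this interpolates between the third and fourth entries, matching the exemplary scenario below.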
In an exemplary scenario, fig. 2C shows the impact points displayed on the screen of the sighting device. Fig. 2C contains five sets of calibration data, (X1′, Y1′, L1), (X2′, Y2′, L2), (X3′, Y3′, L3), (X4′, Y4′, L4), and (X5′, Y5′, L5), where L1 < L2 < L3 < L4 < L5. Assume the current shooting distance is L0; from fig. 2C, L3 < L0 < L4, so the two shooting distances adjacent to L0 are L3 and L4, and the position information in their calibration entries is (X3′, Y3′) and (X4′, Y4′). Interpolating between (X3′, Y3′) and (X4′, Y4′) yields the impact point, i.e., the gray point shown in fig. 2C.
In the embodiments of the application, a plurality of data sets are acquired (the target device obtains one data set each time it shoots at the designated target; each data set comprises a shooting distance and an image; the shooting distances in the data sets are different; the shooting distance is the distance between the target device and the designated target; and the image is an image of the designated target acquired by the sighting device). Each data set is then traversed in turn; for each traversed data set, the impact point is marked in its image, and the shooting distance of the data set together with the position information of the impact point in the image is taken as one set of calibration data.
Based on the above, the calibration data are determined by acquiring a plurality of data sets comprising shooting distances and images, eliminating the time-consuming and labor-intensive measurement and recording of (X, Y) values for the sight; the image coordinate position is obtained directly by marking the impact point in the image with the cursor, without conversion from the error-prone (X, Y) measurements, so the accuracy of impact-point calibration, and hence of the calibration data, is improved.
Fig. 3 is a hardware block diagram of a sighting device according to an exemplary embodiment of the present application. The sighting device includes: a communication interface 301, a processor 302, a machine-readable storage medium 303, and a bus 304, where the communication interface 301, the processor 302, and the machine-readable storage medium 303 communicate with each other via the bus 304. The processor 302 may execute the calibration data determination method described above by reading and executing, from the machine-readable storage medium 303, machine-executable instructions corresponding to the control logic of the method; the specifics of the method are described in the embodiments above and are not repeated here.
The machine-readable storage medium 303 referred to herein may be any electronic, magnetic, optical, or other physical storage device that can contain or store information such as executable instructions and data. For example, the machine-readable storage medium may be volatile memory, non-volatile memory, or a similar storage medium. In particular, the machine-readable storage medium 303 may be a RAM (Random Access Memory), a flash memory, a storage drive (e.g., a hard drive), any type of storage disk (e.g., an optical disk, a DVD), a similar storage medium, or a combination thereof.
Fig. 4 is a block diagram of an embodiment of a calibration data determining apparatus according to an exemplary embodiment of the present application, where the calibration data determining apparatus includes:
a data acquisition module 410, configured to acquire a plurality of data sets, wherein the target device obtains one data set each time it shoots at a designated target, each data set comprises a shooting distance and an image, the shooting distances in the data sets are different, the shooting distance is the distance between the target device and the designated target, and the image is an image of the designated target acquired by a sighting device arranged on the target device;
a marking module 420, configured to traverse each data set and, for each traversed data set, mark the impact point in the image of the data set;
and a calibration data determination module 430, configured to take the shooting distance of the data set and the position information of the impact point in the image as one set of calibration data.
In an optional implementation, the data acquisition module 410 is specifically configured to obtain, at each shot of the target device, the shooting distance between the target device and the designated target through a ranging module provided in the sighting device.
In an optional implementation, the marking module 420 is specifically configured to detect the pixel position with the maximum temperature value in the image, where the bullet hole formed on the designated target after a bullet fired by the target device passes through it is hotter than the environment, and to take the detected pixel position as the impact point.
In an optional implementation, when obtaining the pixel position with the maximum temperature value from the image, the marking module 420 is specifically configured to detect that pixel position within a preset region of the image, where the preset region is the pre-calibrated field-of-view region of the designated target in the image.
In an optional implementation, the data sets are traversed in ascending order of shot order, and when taking a detected pixel position as the impact point, the marking module 420 is specifically configured to: determine the pixel position as the impact point if there is only one detected pixel position; and, if there are several, select one of them as the impact point according to whether the shooting distance of the currently traversed data set is larger or smaller than that of the previously traversed data set, the selected position differing from the pixel position of the previously marked impact point.
The implementation of the functions and roles of each unit in the above apparatus is described in detail in the implementation of the corresponding steps of the above method and is not repeated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (11)

1. A method for determining calibration data, the method comprising:
acquiring a plurality of data sets, wherein one data set is obtained each time the target device fires at a designated target; each data set comprises a shooting distance and an image; the shooting distances of the data sets differ from one another; the shooting distance is the distance between the target device and the designated target; and the image is an image of the designated target captured by a sighting device mounted on the target device;
traversing each data set, and marking the impact points in the image of each traversed data set;
and taking, for each data set, the shooting distance of the data set and the position information of the impact point in its image as a set of calibration data.
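The three steps of claim 1 can be outlined as follows. This is a non-authoritative sketch, not the claimed implementation: `data_sets`, `mark_impact_point`, and `build_calibration_table` are illustrative names, and the impact-point marker is reduced to a plain maximum-temperature search.

```python
import numpy as np

def mark_impact_point(temp_image):
    # simplified marker: the pixel with the maximum temperature value
    r, c = np.unravel_index(np.argmax(temp_image), temp_image.shape)
    return (int(r), int(c))

def build_calibration_table(data_sets):
    """data_sets -- iterable of (shooting_distance, temp_image) pairs,
    one pair per shot, each shot taken at a different distance."""
    calibration = []
    for distance, image in data_sets:          # traverse every data set
        point = mark_impact_point(image)       # mark the impact point
        calibration.append((distance, point))  # one set of calibration data
    return calibration
```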
2. The method of claim 1, wherein the shooting distance of each data set is obtained by:
obtaining, at each shot of the target device, the shooting distance between the target device and the specified target through a distance measuring module arranged in the sighting device.
3. The method of claim 1, wherein marking the impact points in the image of the data set comprises:
detecting, from the image, the pixel position corresponding to the maximum temperature value, wherein the temperature of the bullet hole formed in the specified target by a bullet fired from the target device is higher than the ambient temperature;
and taking the detected pixel position as the impact point.
4. The method of claim 3, wherein obtaining the pixel position corresponding to the maximum temperature value from the image comprises:
detecting the pixel position corresponding to the maximum temperature value within a preset region of the image, wherein the preset region is the pre-calibrated field-of-view region of the specified target in the image.
5. The method of claim 3, wherein the data sets are traversed in ascending shooting order, and wherein taking the detected pixel position as the impact point comprises:
if only one pixel position is detected, determining that pixel position as the impact point;
if a plurality of pixel positions are detected, selecting one of them as the impact point according to whether the shooting distance of the currently traversed data set is larger or smaller than that of the previously traversed data set, wherein the selected pixel position is different from the pixel position of the previously marked impact point.
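One plausible reading of this selection rule (the claim does not fix the exact geometry) is that a longer shooting distance makes the bullet drop further, so the new hole should lie lower in the image than the previous one, and a shorter distance higher. The sketch below encodes that assumption; all names are hypothetical:

```python
def select_impact_point(candidates, prev_point, curr_distance, prev_distance):
    """Choose the new impact point when several hottest pixels are detected
    (e.g. earlier bullet holes that are still warm).

    candidates -- list of (row, col) pixel positions sharing the maximum
                  temperature value
    prev_point -- impact point marked for the previously traversed data set
    """
    if len(candidates) == 1:
        return candidates[0]
    # the newly marked impact point must differ from the previous one
    remaining = [p for p in candidates if p != prev_point] or candidates
    if curr_distance > prev_distance:
        # assumed: farther shot -> larger bullet drop -> lower in the image
        return max(remaining, key=lambda p: p[0])
    return min(remaining, key=lambda p: p[0])
```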
6. An apparatus for determining calibration data, the apparatus comprising:
the data acquisition module is configured to acquire a plurality of data sets, wherein one data set is obtained each time the target device fires at a designated target; each data set comprises a shooting distance and an image; the shooting distances of the data sets differ from one another; the shooting distance is the distance between the target device and the designated target; and the image is an image of the designated target captured by a sighting device mounted on the target device;
the marking module is configured to traverse each data set and mark the impact point in the image of each traversed data set;
and the calibration data determination module is configured to take, for each data set, the shooting distance of the data set and the position information of the impact point in its image as a set of calibration data.
7. The apparatus of claim 6,
the data acquisition module is specifically configured, when acquiring the shooting distance of each data set, to obtain, at each shot of the target device, the shooting distance between the target device and the specified target through a distance measuring module arranged in the sighting device.
8. The apparatus of claim 6, wherein the marking module is specifically configured to detect, from the image, the pixel position corresponding to the maximum temperature value, wherein the temperature of the bullet hole formed in the specified target by a bullet fired from the target device is higher than the ambient temperature, and to take the detected pixel position as the impact point.
9. The apparatus of claim 8, wherein, when acquiring the pixel position corresponding to the maximum temperature value from the image, the marking module is specifically configured to detect that pixel position within a preset region of the image, where the preset region is the pre-calibrated field-of-view region of the specified target in the image.
10. The apparatus of claim 8, wherein the data sets are traversed in ascending shooting order, and when taking the detected pixel position as the impact point, the marking module is specifically configured to: if only one pixel position is detected, determine that pixel position as the impact point; if a plurality of pixel positions are detected, select one of them as the impact point according to whether the shooting distance of the currently traversed data set is larger or smaller than that of the previously traversed data set, wherein the selected pixel position is different from the pixel position of the previously marked impact point.
11. A sighting device, characterized in that the device comprises a readable storage medium and a processor;
wherein the readable storage medium is configured to store machine executable instructions;
the processor configured to read the machine executable instructions on the readable storage medium and execute the instructions to implement the steps of the method of any one of claims 1-5.
CN201811518418.6A 2018-12-12 2018-12-12 Calibration data determination method and device Active CN111306988B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811518418.6A CN111306988B (en) 2018-12-12 2018-12-12 Calibration data determination method and device

Publications (2)

Publication Number Publication Date
CN111306988A true CN111306988A (en) 2020-06-19
CN111306988B CN111306988B (en) 2022-12-23

Family

ID=71150464

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811518418.6A Active CN111306988B (en) 2018-12-12 2018-12-12 Calibration data determination method and device

Country Status (1)

Country Link
CN (1) CN111306988B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113091512A (en) * 2021-04-07 2021-07-09 合肥英睿系统技术有限公司 Aiming method and device for shooting device
CN113310351A (en) * 2021-05-29 2021-08-27 北京波谱华光科技有限公司 Method and system for calibrating precision of electronic division and assembly meter
WO2024066077A1 (en) * 2022-09-26 2024-04-04 汉王科技股份有限公司 Calibration method and apparatus for target shooting device, and target shooting device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11108596A (en) * 1997-10-03 1999-04-23 Nippon Avionics Co Ltd Bullet trace detector, and shooting automatic scoring device
CN1546938A (en) * 2003-12-04 2004-11-17 华南理工大学 Ball firing system and method for identifying rapid fire shot hole thereof
CN2716792Y (en) * 2004-07-08 2005-08-10 上海亿湾特训练设备科技有限公司 A novel impact point coordinate positioning device
CN105300181A (en) * 2015-10-30 2016-02-03 北京艾克利特光电科技有限公司 Accurate photoelectric sighting device capable of prompting shooting in advance
US20160305744A1 (en) * 2013-12-10 2016-10-20 Republic Of Korea (Air Force Logistics Command 83T 83Th Information And Communication Maintenance De Laser shooting training system and method
CN107958205A (en) * 2017-10-31 2018-04-24 北京艾克利特光电科技有限公司 Gunnery training intelligent management system
CN108288290A (en) * 2017-12-07 2018-07-17 中国航空工业集团公司西安航空计算技术研究所 A kind of on-line automatic scaling method of target center being applied to intelligent sniping gun
CN108805210A (en) * 2018-06-14 2018-11-13 深圳深知未来智能有限公司 A kind of shell hole recognition methods based on deep learning

Similar Documents

Publication Publication Date Title
CN111306988B (en) Calibration data determination method and device
CN103328926B (en) There is the measurement mechanism automatically characterizing and change function
US4333106A (en) Method of measuring firing misses and firing miss-measuring installation for the performance of the method
US9612115B2 (en) Target-correlated electronic rangefinder
US11047648B2 (en) Firearm and/or firearm sight calibration and/or zeroing
CA2822387C (en) Geodesic measuring device comprising a thermographic camera
US10782096B2 (en) Skeet and bird tracker
TWI485630B (en) Sights, operational methods thereof, and computer program products thereof
US20200263956A1 (en) Firearm Training Apparatus And Methods
US8998085B2 (en) Optical device configured to determine a prey score of antlered prey
CN105026886B (en) Tracker unit and method in a tracker unit
CN109154486A (en) Bore sighting device and method
CN102221410A (en) Method for an IR-radiation -- based temperature measurement and IR-radiation -- based temperature measuring device
KR101200350B1 (en) A Shooting system
CN111397586A (en) Measuring system
CN113091512A (en) Aiming method and device for shooting device
KR101775153B1 (en) Target Training System and Analysis Method
RU2604909C9 (en) Method for assessment of firing efficiency of combat remote controlled module located on mobile object
CN107367201A (en) A kind of a wide range of multiple target shell fries drop point sound localization method
Liscio et al. The lead-in method for bullet impacts in metal panels
KR102011765B1 (en) Method and apparatus for aiming target
Xiao et al. Research on detection system of optical sights triaxial parallelism
CN104833268B (en) Small caliber piece dynamic tracking accuracy detecting device
US20160018196A1 (en) Target scoring system and method
KR102433858B1 (en) Apparatus and method for measuring distance of target and launching angle of guided projectile system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200710

Address after: 311501 building A1, No. 299, Qiushi Road, Tonglu Economic Development Zone, Tonglu County, Hangzhou City, Zhejiang Province

Applicant after: Hangzhou Haikang Micro Shadow Sensing Technology Co.,Ltd.

Address before: Hangzhou City, Zhejiang province 310051 Binjiang District Qianmo Road No. 555

Applicant before: Hangzhou Hikvision Digital Technology Co.,Ltd.

GR01 Patent grant