CN111306988B - Calibration data determination method and device - Google Patents

Calibration data determination method and device Download PDF

Info

Publication number
CN111306988B
CN111306988B (application CN201811518418.6A)
Authority
CN
China
Prior art keywords
image
target
data
shooting
pixel position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811518418.6A
Other languages
Chinese (zh)
Other versions
CN111306988A (en)
Inventor
温俊阳
柴冯冯
胡晓明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikmicro Sensing Technology Co Ltd
Original Assignee
Hangzhou Hikmicro Sensing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikmicro Sensing Technology Co Ltd filed Critical Hangzhou Hikmicro Sensing Technology Co Ltd
Priority to CN201811518418.6A priority Critical patent/CN111306988B/en
Publication of CN111306988A publication Critical patent/CN111306988A/en
Application granted granted Critical
Publication of CN111306988B publication Critical patent/CN111306988B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • F — MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 — WEAPONS
    • F41G — WEAPON SIGHTS; AIMING
    • F41G1/00 — Sighting devices
    • F41G1/06 — Rearsights

Abstract

The application provides a calibration data determination method and apparatus. The method includes: acquiring a plurality of data sets, wherein one data set is obtained each time a target device shoots at a designated target; each data set includes a shooting distance and an image; the shooting distances of the data sets differ from one another; the shooting distance is the distance between the target device and the designated target, and the image is an image of the designated target acquired by a sighting device mounted on the target device; traversing the data sets and, for each traversed data set, marking the impact point in its image; and taking the shooting distance of the data set together with the position information of the impact point in the image as a set of calibration data. Because the calibration data are determined from pairs of shooting distances and images, the time-consuming and labor-intensive measurement of (X, Y) values is eliminated. The image coordinate position is obtained directly by marking the impact point in the image, with no need to convert it from the comparatively error-prone (X, Y) measurements, so the accuracy of impact-point calibration, and hence of the calibration data, is improved.

Description

Calibration data determination method and device
Technical Field
The present application relates to the field of calibration technologies, and in particular, to a method and an apparatus for determining calibration data.
Background
At present, thermal imaging sights are widely used on firearms of various models. They capture images of targets without depending on external light and estimate bullet impact points from calibration data stored in the sight, assisting the shooter in hitting the target. The accuracy of the calibration data therefore directly affects the accuracy of the predicted impact point, and in turn the accuracy of the shot.
In the related art, calibration data are obtained by manually collecting multiple groups of shooting data and entering them into the thermal imaging sight, where each group of shooting data includes a shooting distance L and the horizontal distance X and vertical distance Y between the impact point and the target center. The thermal imaging sight then converts the X and Y of each group into an image coordinate position (u, v) through the optical imaging principle and stores the converted (u, v) together with L as calibration data. However, collecting the shooting data is time-consuming and laborious, and because the manual measurement errors of the horizontal and vertical distances between the impact point and the target center are large, the accuracy of the resulting calibration data is low.
Disclosure of Invention
In view of this, the present application provides a method and an apparatus for determining calibration data, so as to solve the problem of low accuracy of the calibration data obtained in the related art.
According to a first aspect of an embodiment of the present application, a calibration data determining method is provided, where the method includes:
acquiring a plurality of data sets, wherein one data set is obtained each time a target device shoots at a designated target, each data set includes a shooting distance and an image, the shooting distances of the data sets differ from one another, the shooting distance is the distance between the target device and the designated target, and the image is an image of the designated target acquired by a sighting device arranged on the target device;
traversing each data set and, for each traversed data set, marking the impact point in the image of that data set;
and taking the shooting distance of the data set and the position information of the impact point in the image as a set of calibration data.
According to a second aspect of embodiments of the present application, there is provided a calibration data determination apparatus, including:
a data acquisition module, configured to acquire a plurality of data sets, wherein one data set is obtained each time a target device shoots at a designated target, each data set includes a shooting distance and an image, the shooting distances of the data sets differ from one another, the shooting distance is the distance between the target device and the designated target, and the image is an image of the designated target acquired by a sighting device arranged on the target device;
a marking module, configured to traverse each data set and, for each traversed data set, mark the impact point in the image of that data set;
and a calibration data determining module, configured to take the shooting distance of the data set and the position information of the impact point in the image as a set of calibration data.
According to a third aspect of embodiments herein, there is provided a sight apparatus, the apparatus comprising a readable storage medium and a processor;
wherein the readable storage medium is configured to store machine executable instructions;
the processor is configured to read the machine executable instructions on the readable storage medium and execute the instructions to implement the steps of the method of the first aspect.
By applying the embodiments of the application, a plurality of data sets are obtained (one data set each time a target device shoots at a designated target; each data set includes a shooting distance and an image; the shooting distances of the data sets differ from one another; the shooting distance is the distance between the target device and the designated target; and the image is an image of the designated target acquired by the sighting device). Each data set is then traversed in turn; for each traversed data set, the impact point is marked in the image of that data set, and the shooting distance of the data set together with the position information of the impact point in the image is used as a set of calibration data.
On this basis, the calibration data are determined by acquiring multiple data sets of shooting distances and images, eliminating the time-consuming and laborious measurement of (X, Y) values and their entry into the sighting device; the image coordinate position is obtained directly by marking the impact point in the image, with no need to convert it from the comparatively error-prone (X, Y) measurements, so the accuracy of impact-point calibration, and hence of the calibration data, is improved.
Drawings
FIG. 1 is a ballistic calibration block diagram illustrating the present application according to an exemplary embodiment;
FIG. 2A is a flowchart illustrating an embodiment of a calibration data determination method according to an exemplary embodiment of the present application;
FIG. 2B is a schematic diagram of a cursor marking an impact point according to the embodiment shown in FIG. 2A;
FIG. 2C is a schematic diagram illustrating an exemplary method for predicting an impact point using calibration data according to the embodiment shown in FIG. 2A;
fig. 3 is a hardware block diagram of a sighting device according to an exemplary embodiment of the present application;
fig. 4 is a block diagram of an embodiment of a calibration data determining apparatus according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if," as used herein, may be interpreted as "when" or "upon" or "in response to determining," depending on the context.
As shown in fig. 1, a ballistic calibration structure diagram is shown, in which a sighting device is installed on a gun, after the gun aims at the center of a target, a shooting distance L between the target and the gun is measured, after a bullet shot by the gun penetrates the target, a horizontal distance X and a vertical distance Y from a bullet impact point (i.e., a bullet hole) to the center of the target are measured, then an image coordinate position (u, v) is obtained by substituting the measured (X, Y) into a conversion formula of a space coordinate system and a screen coordinate system, and finally (u, v, L) is used as a set of calibration data. And sequentially obtaining multiple groups of calibration data by moving the position of the target or the gun.
However, the acquisition of the initial data (X, Y, L) is difficult, especially the acquisition of (X, Y) requires manual measurement on the target, and the manual measurement error is large because the target is small. In addition, in the process of mapping (X, Y) to the image coordinates, there is a conversion error in the conversion inside the collimator, and the calculation load of the chip inside the collimator is increased.
To solve the above problem, the present application provides a calibration data determining method: a plurality of data sets are obtained (one data set each time the target device shoots at the designated target; each data set includes a shooting distance and an image; the shooting distances of the data sets differ from one another; the shooting distance is the distance between the target device and the designated target; and the image is an image of the designated target acquired by the sighting device); each data set is then traversed in turn; for each traversed data set, the impact point is marked in the image of that data set; and the shooting distance of the data set together with the position information of the impact point in the image is used as a set of calibration data.
On this basis, the calibration data are determined by acquiring multiple data sets of shooting distances and images, eliminating the time-consuming and laborious measurement of (X, Y) values and their entry into the sighting device; the image coordinate position is obtained directly by marking the impact point in the image, with no need to convert it from the comparatively error-prone (X, Y) measurements, so the accuracy of impact-point calibration, and hence of the calibration data, is improved.
The technical solution of the present application will be described in detail with specific examples.
Fig. 2A is a flowchart of an embodiment of a calibration data determining method according to an exemplary embodiment of the present application. The method may be applied to the sighting device installed on a target device, or to another electronic device; that is, the data sets collected by the sighting device may be transmitted to the other electronic device, which determines the calibration data from the multiple data sets. In the embodiments of the present application, the target device is taken as the example. As shown in fig. 2A, the calibration data determining method includes the following steps:
step 201: the method comprises the steps of obtaining a plurality of data sets, wherein the target device designates a target once to obtain one data set, each data set comprises a shooting distance and an image, the shooting distances in the data sets are different, the shooting distance is the distance between the target device and the designated target, and the image is the image acquired by aiming device acquisition of the designated target.
In one embodiment, the target device fires at the same point of the designated target each time. If there is only one designated target, the number of impact points in each image equals the corresponding shooting order, and the shooting distance increases or decreases monotonically with the shooting order; for example, if the shooting order is 5, there are 5 impact points in the image acquired at that time. If there are multiple designated targets, each image contains exactly one impact point.
The designated target may be a shooting target, and the target device fires at the target center each time.
In an embodiment, the shooting distance of each data set may be obtained as follows: each time the target device fires, the shooting distance between the target device and the designated target is obtained through a ranging module built into the sighting device.
The ranging module may be a laser ranging module, a pitch-angle ranging module, an ultrasonic ranging module, or the like.
In an exemplary scenario, such as the ballistic calibration configuration shown in fig. 1, assume there is one target and the firearm is aimed at the target center throughout the calibration process. For the first shot, shooting distance 1 between the target and the firearm is acquired through the ranging module; after the bullet fired by the firearm passes through the target, the sighting device records the acquired target image, which contains 1 bullet hole (i.e., impact point). The firearm or target is then moved; for the second shot, shooting distance 2 is acquired through the ranging module, and the recorded target image contains 2 bullet holes. This continues until the Nth shot: as the shooting order increases, the shooting distance between the target and the firearm grows, and the image acquired for the Nth shot contains N bullet holes.
It should be noted that the sighting device usually displays the acquired image on the screen in real time based on the thermal imaging (i.e., infrared imaging) principle: the pixel values in the image are related to the temperatures of the objects, and the higher the temperature, the larger the corresponding pixel value and the more conspicuous it appears in the image. After a bullet fired by the firearm passes through the designated target, frictional heating makes the bullet hole formed on the target hotter than the environment, and the temperature difference is significant, so the user can easily observe the location of the bullet hole in the image on the screen. If for some special reason the bullet hole cannot be observed on the screen, a heated object (such as a lit cigarette end) or high-emissivity paint can be applied to the bullet hole until the user can observe its position; the sighting device then records the current image when the OK button is pressed, yielding a data set comprising the shooting distance and the image.
Step 202: and traversing each data set in turn, and marking the impact points in the image of each traversed data set.
In an embodiment, based on the thermal imaging principle described in step 201, the pixel position corresponding to the maximum temperature value may be detected from the image corresponding to the shooting distance and taken as the impact point, achieving automatic marking and saving manual operation.
It should be noted that after the impact point is determined, a preset cursor may be moved to the pixel position of the impact point for the user to check; if the user observes a deviation between the cursor and the impact point on the screen, the cursor can be fine-tuned with the keys until it coincides exactly with the impact point, and the cursor's position in the image is then taken as the impact point. The preset cursor may be a conspicuous cross cursor.
It should be further noted that, to avoid interference from heat sources, the field-of-view region of the designated target in the image may be calibrated in advance, and the pixel position corresponding to the maximum temperature value is then detected only within that pre-calibrated region.
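The automatic marking described above can be sketched as follows — a minimal illustration in which the function name, the temperature-array convention, and the optional region bounds are assumptions, not part of the patent:

```python
import numpy as np

def detect_impact_point(thermal_image, roi=None):
    """Return the (row, col) pixel with the maximum temperature value.

    thermal_image: 2-D array of per-pixel temperature values.
    roi: optional (top, bottom, left, right) bounds restricting the search
         to the pre-calibrated field-of-view region of the designated target.
    """
    if roi is not None:
        top, bottom, left, right = roi
        region = thermal_image[top:bottom, left:right]
        r, c = np.unravel_index(np.argmax(region), region.shape)
        return (r + top, c + left)
    return np.unravel_index(np.argmax(thermal_image), thermal_image.shape)
```

Restricting the search to the pre-calibrated region is what keeps a hot object outside the target (e.g., a fire source) from being mistaken for the bullet hole.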
In an embodiment, since the user can easily observe the bullet hole in the image on the screen, the user may instead mark the impact point entirely by operating the keys on the sighting device to move the cursor until it coincides with the bullet hole, omitting the maximum-temperature detection. Fig. 2B is a schematic diagram of the cursor marking the impact point as displayed on the screen of the sighting device; the cursor is moved with the keys until it coincides exactly with the impact point.
It is worth noting that marking the impact point with a cursor in the image introduces an error at the pixel level, and one pixel corresponds to an actual size at the millimeter level, so the introduced error is at the millimeter level. Compared with the centimeter-level error introduced by manual measurement in the related art, the accuracy of this embodiment is higher.
In an embodiment, the data sets may be traversed in ascending order of shooting order. When taking a detected pixel position as the impact point: if there is only one detected pixel position, that position is determined as the impact point; if there are several, one of them is selected as the impact point according to the relative magnitudes of the shooting distance of the currently traversed data set and that of the previously traversed data set, and the selected pixel position must differ from the pixel positions of previously marked impact points.
From the description of step 201, when there are multiple designated targets, each image contains only 1 impact point, so there is only one pixel position with the maximum temperature value and it can be determined directly as the impact point. When there is one designated target, only the image of the first shot (shooting order 1) contains a single bullet hole; in subsequent shots the number of bullet holes in the acquired image equals the shooting order, and if the interval between shots is short, there may be several pixel positions sharing the maximum temperature value. Based on the characteristic that the longer the shooting distance, the lower the impact point: when several pixel positions are obtained, if the shooting distance of the currently traversed data set is greater than that of the previously traversed data set, the lowest pixel position not yet marked as an impact point is selected as the impact point corresponding to the current shooting distance; if the current shooting distance is smaller than the previous one, the highest unmarked pixel position is selected.
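The selection rule above — lowest unmarked candidate when the distance increased, highest when it decreased — can be sketched as follows (names and data shapes are illustrative assumptions; image rows are taken to grow downward, so a larger row index means a lower point on the screen):

```python
def select_impact_point(candidates, distance_now, distance_prev, marked):
    """Pick one impact point from several max-temperature candidates.

    candidates: (row, col) pixel positions sharing the maximum temperature.
    marked: pixel positions already chosen for earlier data sets.
    """
    unmarked = [p for p in candidates if p not in marked]
    if len(unmarked) == 1:
        return unmarked[0]
    if distance_now > distance_prev:
        # Longer shooting distance -> lower impact point on the target.
        return max(unmarked, key=lambda p: p[0])
    return min(unmarked, key=lambda p: p[0])
```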
Step 203: and taking the shooting distance of the data set and the position information of the impact point in the image as a set of calibration data.
In an embodiment, after multiple sets of calibration data have been obtained, when a shot is actually fired the sighting device obtains the shooting distance between the target device and the target, searches the stored calibration data with this shooting distance, obtains the calibration data corresponding to it, determines the actual impact point from the position information in the retrieved calibration data, and marks it on the screen for the user.
If the stored calibration data do not contain the shooting distance, the calibration data of the two stored shooting distances adjacent to it (one smaller, one larger) are obtained, new position information is computed by interpolating between the position information of these two sets, and the new position information is determined as the actual impact point.
In an exemplary scenario, fig. 2C is a schematic diagram of the impact point displayed on the screen of the sighting device. Fig. 2C includes five sets of calibration data: (X₁′, Y₁′, L₁), (X₂′, Y₂′, L₂), (X₃′, Y₃′, L₃), (X₄′, Y₄′, L₄), (X₅′, Y₅′, L₅), where L₁ < L₂ < L₃ < L₄ < L₅. Suppose the current shooting distance is L₀; from fig. 2C it can be seen that L₃ < L₀ < L₄, so the two shooting distances adjacent to L₀ are L₃ and L₄, the position information in their calibration data is (X₃′, Y₃′) and (X₄′, Y₄′), and interpolating between (X₃′, Y₃′) and (X₄′, Y₄′) yields the impact point, i.e., the gray point shown in fig. 2C.
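The lookup-or-interpolate logic can be sketched as follows — simple linear interpolation, which is an assumption, since the patent does not specify the interpolation formula:

```python
def predict_impact_point(calibration, distance):
    """Look up or interpolate the impact point for a shooting distance.

    calibration: (x, y, L) tuples sorted by ascending shooting distance L.
    """
    for x, y, L in calibration:
        if L == distance:
            return (x, y)  # exact match: use the stored position
    for (x0, y0, L0), (x1, y1, L1) in zip(calibration, calibration[1:]):
        if L0 < distance < L1:
            t = (distance - L0) / (L1 - L0)
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
    raise ValueError("shooting distance outside the calibrated range")
```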
In the embodiments of the application, a plurality of data sets are obtained (one data set each time the target device shoots at the designated target; each data set includes a shooting distance and an image; the shooting distances of the data sets differ from one another; the shooting distance is the distance between the target device and the designated target; and the image is an image of the designated target acquired by the sighting device). Each data set is then traversed in turn; for each traversed data set, the impact point is marked in the image of that data set, and the shooting distance of the data set together with the position information of the impact point in the image is used as a set of calibration data.
On this basis, the calibration data are determined by acquiring multiple data sets of shooting distances and images, eliminating the time-consuming and laborious measurement of (X, Y) values and their entry into the sighting device; the image coordinate position is obtained directly by marking the impact point in the image with the cursor, with no need to convert it from the comparatively error-prone (X, Y) measurements, so the accuracy of impact-point calibration, and hence of the calibration data, is improved.
Fig. 3 is a hardware block diagram of a sighting device according to an exemplary embodiment of the present application. The sighting device includes: a communication interface 301, a processor 302, a machine-readable storage medium 303, and a bus 304, where the communication interface 301, the processor 302, and the machine-readable storage medium 303 communicate with one another via the bus 304. The processor 302 may execute the calibration data determination method described above by reading and executing, from the machine-readable storage medium 303, the machine-executable instructions corresponding to the control logic of the method; for the specifics of the method, refer to the above embodiments, which are not repeated here.
The machine-readable storage medium 303 referred to herein may be any electronic, magnetic, optical, or other physical storage device that can contain or store information such as executable instructions and data. For example, the machine-readable storage medium may be volatile memory, non-volatile memory, or a similar storage medium. Specifically, the machine-readable storage medium 303 may be a RAM (Random Access Memory), a flash memory, a storage drive (e.g., a hard drive), any type of storage disk (e.g., an optical disk, a DVD, etc.), a similar storage medium, or a combination thereof.
Fig. 4 is a block diagram of an embodiment of a calibration data determining apparatus according to an exemplary embodiment of the present application, where the calibration data determining apparatus includes:
a data obtaining module 410, configured to obtain a plurality of data sets, wherein one data set is obtained each time the target device shoots at a designated target, each data set includes a shooting distance and an image, the shooting distances of the data sets differ from one another, the shooting distance is the distance between the target device and the designated target, and the image is an image of the designated target acquired by a sighting device arranged on the target device;
a marking module 420, configured to traverse each data set and, for each traversed data set, mark the impact point in the image of that data set;
and a calibration data determining module 430, configured to take the shooting distance of the data set and the position information of the impact point in the image as a set of calibration data.
In an optional implementation, in obtaining the shooting distance of each data set, the data obtaining module 410 is specifically configured to obtain, through a ranging module provided in the sighting device, the shooting distance between the target device and the designated target at each shot of the target device.
In an optional implementation, the marking module 420 is specifically configured to detect the pixel position corresponding to the maximum temperature value in the image, where the temperature of the bullet hole formed on the designated target after a bullet fired by the target device passes through it is higher than the ambient temperature, and to take the detected pixel position as the impact point.
In an optional implementation, in acquiring the pixel position corresponding to the maximum temperature value from the image, the marking module 420 is specifically configured to detect that pixel position within a preset area of the image, the preset area being the pre-calibrated field-of-view region of the designated target in the image.
In an optional implementation, the data sets are traversed in ascending order of shooting order, and in taking a detected pixel position as the impact point, the marking module 420 is specifically configured to: if there is only one detected pixel position, determine that position as the impact point; if there are several, select one of them as the impact point according to the relative magnitudes of the shooting distance of the currently traversed data set and that of the previously traversed data set, the selected pixel position differing from the pixel positions of previously marked impact points.
The implementation process of the functions and actions of each unit in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement without inventive effort.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (7)

1. A method for determining calibration data, the method comprising:
acquiring a plurality of data groups, wherein each time a target device shoots at a designated target, one data group is obtained, each data group comprises a shooting distance and an image, the shooting distances of the data groups differ from one another, the shooting distance is the distance between the target device and the designated target, and the image is an image of the designated target captured by a thermal imaging sighting device mounted on the target device;
traversing the data groups in ascending shooting order, and for each traversed data group, detecting the pixel position corresponding to the maximum temperature value in the image, wherein the temperature of a bullet hole formed in the designated target after a bullet fired by the target device passes through it is higher than the ambient temperature;
if only one pixel position is detected, determining that pixel position as the impact point; if a plurality of pixel positions are detected, selecting one of the plurality of pixel positions as the impact point according to the magnitude relationship between the shooting distance of the currently traversed data group and the shooting distance of the previously traversed data group, wherein the selected pixel position is different from the pixel position of the most recently marked impact point;
and taking the shooting distance of the data group and the position information of the impact point in the image as a set of calibration data.
2. The method of claim 1, wherein the shooting distance of each data group is obtained by:
each time the target device shoots, obtaining the shooting distance between the target device and the designated target through a ranging module built into the sighting device.
3. The method of claim 1, wherein obtaining the pixel position corresponding to the maximum temperature value from the image comprises:
and detecting a pixel position corresponding to the maximum temperature value from a preset area of the image, wherein the preset area is a field of view area of the specified target in the image calibrated in advance.
4. An apparatus for determining calibration data, the apparatus comprising:
a data acquisition module, configured to acquire a plurality of data groups, wherein each time a target device shoots at a designated target, one data group is obtained, each data group comprises a shooting distance and an image, the shooting distances of the data groups differ from one another, the shooting distance is the distance between the target device and the designated target, and the image is an image of the designated target captured by a thermal imaging sighting device mounted on the target device;
a marking module, configured to traverse the data groups in ascending shooting order and, for each traversed data group, detect the pixel position corresponding to the maximum temperature value in the image, wherein the temperature of a bullet hole formed in the designated target after a bullet fired by the target device passes through it is higher than the ambient temperature; if only one pixel position is detected, determine that pixel position as the impact point; if a plurality of pixel positions are detected, select one of the plurality of pixel positions as the impact point according to the magnitude relationship between the shooting distance of the currently traversed data group and the shooting distance of the previously traversed data group, wherein the selected pixel position is different from the pixel position of the most recently marked impact point;
and a calibration data determining module, configured to take the shooting distance of the data group and the position information of the impact point in the image as a set of calibration data.
5. The apparatus of claim 4, wherein, in obtaining the shooting distance of each data group, the data acquisition module is specifically configured to obtain the shooting distance between the target device and the designated target through a ranging module built into the sighting device each time the target device shoots.
6. The apparatus of claim 4, wherein, in obtaining the pixel position corresponding to the maximum temperature value from the image, the marking module is specifically configured to detect the pixel position corresponding to the maximum temperature value from a preset region of the image, wherein the preset region is a pre-calibrated field-of-view region of the designated target in the image.
7. A sighting device, comprising a readable storage medium and a processor;
wherein the readable storage medium is configured to store machine-executable instructions;
the processor is configured to read the machine-executable instructions from the readable storage medium and execute the instructions to implement the steps of the method of any one of claims 1-3.
CN201811518418.6A 2018-12-12 2018-12-12 Calibration data determination method and device Active CN111306988B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811518418.6A CN111306988B (en) 2018-12-12 2018-12-12 Calibration data determination method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811518418.6A CN111306988B (en) 2018-12-12 2018-12-12 Calibration data determination method and device

Publications (2)

Publication Number Publication Date
CN111306988A CN111306988A (en) 2020-06-19
CN111306988B true CN111306988B (en) 2022-12-23

Family

ID=71150464

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811518418.6A Active CN111306988B (en) 2018-12-12 2018-12-12 Calibration data determination method and device

Country Status (1)

Country Link
CN (1) CN111306988B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113091512B (en) * 2021-04-07 2023-06-02 合肥英睿系统技术有限公司 Shooting device aiming method and device
CN113310351B (en) * 2021-05-29 2021-12-10 北京波谱华光科技有限公司 Method and system for calibrating precision of electronic division and assembly meter
CN116625256A (en) * 2022-09-26 2023-08-22 汉王科技股份有限公司 Calibration method and device of targeting equipment and targeting equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11108596A (en) * 1997-10-03 1999-04-23 Nippon Avionics Co Ltd Bullet trace detector, and shooting automatic scoring device
CN1546938A (en) * 2003-12-04 2004-11-17 华南理工大学 Ball firing system and method for identifying rapid fire shot hole thereof
CN105300181A (en) * 2015-10-30 2016-02-03 北京艾克利特光电科技有限公司 Accurate photoelectric sighting device capable of prompting shooting in advance
CN107958205A (en) * 2017-10-31 2018-04-24 北京艾克利特光电科技有限公司 Gunnery training intelligent management system
CN108288290A (en) * 2017-12-07 2018-07-17 中国航空工业集团公司西安航空计算技术研究所 A kind of on-line automatic scaling method of target center being applied to intelligent sniping gun
CN108805210A (en) * 2018-06-14 2018-11-13 深圳深知未来智能有限公司 A kind of shell hole recognition methods based on deep learning

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2716792Y (en) * 2004-07-08 2005-08-10 上海亿湾特训练设备科技有限公司 A novel impact point coordinate positioning device
KR101603281B1 (en) * 2013-12-10 2016-03-28 대한민국 Firearm laser training system and method thereof


Also Published As

Publication number Publication date
CN111306988A (en) 2020-06-19

Similar Documents

Publication Publication Date Title
CN111306988B (en) Calibration data determination method and device
US4333106A (en) Method of measuring firing misses and firing miss-measuring installation for the performance of the method
US20200263956A1 (en) Firearm Training Apparatus And Methods
US9612115B2 (en) Target-correlated electronic rangefinder
US10782096B2 (en) Skeet and bird tracker
US11047648B2 (en) Firearm and/or firearm sight calibration and/or zeroing
TWI485630B (en) Sights, operational methods thereof, and computer program products thereof
US20130278759A1 (en) Geodesic measuring device comprising a thermographic camera
US6260466B1 (en) Target aiming system
US8998085B2 (en) Optical device configured to determine a prey score of antlered prey
CN109154486A (en) Bore sighting device and method
CN102221410A (en) Method for an IR-radiation -- based temperature measurement and IR-radiation -- based temperature measuring device
KR101200350B1 (en) A Shooting system
CN113091512A (en) Aiming method and device for shooting device
RU2604909C9 (en) Method for assessment of firing efficiency of combat remote controlled module located on mobile object
Liscio et al. The lead-in method for bullet impacts in metal panels
KR102011765B1 (en) Method and apparatus for aiming target
CN115345377A (en) Position prediction method and device, electronic equipment and storage medium
CN104833268B (en) Small caliber piece dynamic tracking accuracy detecting device
US20160018196A1 (en) Target scoring system and method
KR102433858B1 (en) Apparatus and method for measuring distance of target and launching angle of guided projectile system
KR101175752B1 (en) Bayesian rule-based target decision system for strapdown dual mode imaging seeker using location weighted or region growing moment modeling
WO2020006095A1 (en) Analysis of skeet target breakage
KR102183374B1 (en) Apparatus for Analyzing Marksmanship and Driving Method Thereof, and Computer Readable Recording Medium
KR102151340B1 (en) impact point detection method of shooting system with bullet ball pellet

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200710

Address after: 311501 building A1, No. 299, Qiushi Road, Tonglu Economic Development Zone, Tonglu County, Hangzhou City, Zhejiang Province

Applicant after: Hangzhou Hikmicro Sensing Technology Co., Ltd.

Address before: Hangzhou City, Zhejiang province 310051 Binjiang District Qianmo Road No. 555

Applicant before: Hangzhou Hikvision Digital Technology Co.,Ltd.

GR01 Patent grant