CN114820726A - Calibration method, calibration device, electronic equipment and storage medium


Info

Publication number
CN114820726A
Authority
CN
China
Prior art keywords
image
calibrated
target object
reference object
error information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210431653.XA
Other languages
Chinese (zh)
Inventor
陈永泽
霍紫健
刘威云
谢林峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Hengtian Weiyan Technology Co ltd
Original Assignee
Shenzhen Hengtian Weiyan Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Hengtian Weiyan Technology Co ltd filed Critical Shenzhen Hengtian Weiyan Technology Co ltd
Priority to CN202210431653.XA priority Critical patent/CN114820726A/en
Publication of CN114820726A publication Critical patent/CN114820726A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present disclosure provides a calibration method, a calibration device, an electronic device, and a storage medium, wherein the calibration method includes: acquiring an image to be calibrated, where the image to be calibrated is an image captured through an eyepiece of the range finder and a reference object is arranged in the eyepiece; performing image processing on the image to be calibrated to obtain error information between a target object and the reference object in the image to be calibrated; and calibrating the range finder according to the error information. Because the error information between the target object and the reference object is obtained by processing the image acquired from the eyepiece and the range finder is calibrated according to that error information, manual calibration is not needed, which improves the calibration efficiency of the range finder and, in turn, the production efficiency of the laser range finder.

Description

Calibration method, calibration device, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of calibration, in particular to a calibration method, a calibration device, electronic equipment and a storage medium.
Background
Miniaturized, portable laser range finders are a development trend for civilian laser range finders. Because they are easy to carry, handheld laser range finders are widely used in engineering projects such as construction and traffic, and also in various ranging scenarios in leisure activities such as golf and hunting. However, most existing laser range finders are calibrated manually before leaving the factory: the range finder is placed on a well-adjusted jig, and an operator looks through the eyepiece to check whether the target point and the aiming center coincide.
Disclosure of Invention
The embodiments of the invention provide a calibration method, a calibration device, an electronic device, and a storage medium, aiming to solve the problem that manual calibration requires considerable labor, which makes the calibration of a laser range finder inefficient and in turn lowers its production efficiency. Image processing is performed on the image to be calibrated acquired from the eyepiece to obtain the error information between the target object and the reference object in the image, and the range finder is calibrated according to this error information; manual calibration is not needed, so the calibration efficiency of the range finder is improved and the production efficiency of the laser range finder is further improved.
In a first aspect, an embodiment of the present invention provides a calibration method, where the method includes:
acquiring an image to be calibrated, wherein the image to be calibrated is an image obtained by an eyepiece in the range finder, and a reference object is arranged in the eyepiece;
performing image processing on the image to be calibrated to obtain error information of a target object and the reference object in the image to be calibrated;
and calibrating the range finder according to the error information.
Further, the step of acquiring the image to be calibrated includes:
mounting the distance measuring instrument on the adjusted jig;
and carrying out image acquisition on an eyepiece of the range finder through image equipment to obtain the image to be calibrated.
Further, the step of performing image processing on the image to be calibrated to obtain error information between the target object and the reference object in the image to be calibrated includes:
preprocessing the image to be calibrated;
and identifying the preprocessed image to be calibrated through a preset image identification model to obtain error information of the target object and the reference object in the image to be calibrated.
Further, the image recognition model includes a public network, a first recognition network and a second recognition network, an input of the first recognition network and an input of the second recognition network are both connected with an output of the public network, and the step of performing recognition processing on the preprocessed image to be calibrated through a preset image recognition model to obtain error information of the target object and the reference object in the image to be calibrated includes:
processing the preprocessed image to be calibrated through a public network to obtain a first characteristic diagram;
processing the first characteristic diagram through a first identification network to obtain an identification result of a target object, and processing the first characteristic diagram through a second identification network to obtain an identification result of a reference object;
and calculating to obtain error information of the target object and the reference object in the image to be calibrated according to the identification result of the target object and the identification result of the reference object.
Further, the step of calculating error information between the target object and the reference object in the image to be calibrated according to the recognition result of the target object and the recognition result of the reference object includes:
calculating to obtain the distance from the center point of the target object frame to the center point of the reference object frame according to the information of the target object frame and the information of the reference object frame;
and obtaining error information of the target object and the reference object in the image to be calibrated according to the distance from the center point of the target object frame to the center point of the reference object frame.
Further, the step of calculating error information between the target object and the reference object in the image to be calibrated according to the recognition result of the target object and the recognition result of the reference object includes:
calculating to obtain the intersection ratio of the target object frame and the reference object frame according to the target object frame information and the reference object frame information;
and calculating to obtain error information of the target object and the reference object in the image to be calibrated according to the intersection ratio of the target object frame and the reference object frame.
Further, the step of calibrating the range finder according to the error information includes:
generating an adjusting instruction according to the error information;
and adjusting the jig through the adjusting instruction so that the jig calibrates the distance measuring instrument.
In a second aspect, there is provided a calibration device, the device comprising:
an acquisition module, a processing module and a calibration module, wherein the acquisition module is used for acquiring an image to be calibrated, the image to be calibrated is an image obtained through an eyepiece of the range finder, and a reference object is arranged in the eyepiece;
the processing module is used for carrying out image processing on the image to be calibrated to obtain error information of a target object and the reference object in the image to be calibrated;
and the calibration module is used for calibrating the range finder according to the error information.
In a third aspect, an electronic device is provided, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps in the calibration method according to any of the embodiments of the invention when executing the computer program.
In a fourth aspect, a computer readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, realizes the steps in the calibration method according to any one of the embodiments of the present invention.
In the embodiments of the invention, an image to be calibrated is acquired, where the image to be calibrated is an image obtained through an eyepiece of the range finder and a reference object is arranged in the eyepiece; image processing is performed on the image to be calibrated to obtain error information between a target object and the reference object in the image to be calibrated; and the range finder is calibrated according to the error information. Because the error information is obtained by processing the image acquired from the eyepiece and the range finder is calibrated accordingly, manual calibration is not needed, which improves the calibration efficiency of the range finder and further improves the production efficiency of the laser range finder.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings described below show only some embodiments of the present invention, and that those skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a flow chart of a calibration method.
Fig. 2 is a schematic structural diagram of a calibration device.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be illustrative of the present invention and should not be construed as limiting the present invention, and all other embodiments that can be obtained by one skilled in the art based on the embodiments of the present invention without inventive efforts shall fall within the scope of protection of the present invention.
In the description of the present invention, it is to be understood that the terms "central," "longitudinal," "lateral," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," "clockwise," "counterclockwise," "axial," "circumferential," "radial," and the like are used in the orientations and positional relationships indicated in the drawings for convenience in describing the present invention and to simplify the description, and are not intended to indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and are therefore not to be considered limiting of the present invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the present invention, unless otherwise expressly specified or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and may, for example, indicate a fixed connection, a detachable connection, or an integral connection; a mechanical or electrical connection; a direct connection, an indirect connection through an intermediate medium, or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In the present invention, unless otherwise expressly stated or limited, "above" or "below" a first feature means that the first and second features are in direct contact, or that the first and second features are not in direct contact but are in contact with each other via another feature therebetween. Also, the first feature being "on," "above" and "over" the second feature includes the first feature being directly on and obliquely above the second feature, or merely indicating that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature includes the first feature being directly under and obliquely below the second feature, or simply meaning that the first feature is at a lesser elevation than the second feature.
Referring to fig. 1, fig. 1 shows a calibration method provided by the present application, the method including:
101. Acquiring an image to be calibrated.
In the embodiment of the present invention, the image to be calibrated is an image obtained through the eyepiece of the range finder, and a reference object is arranged in the eyepiece.
Specifically, the range finder may include an objective lens and an eyepiece, where the objective lens is arranged at the front end of the range finder and the eyepiece is arranged at the rear end of the range finder, so that a user can observe the region where the target to be measured is located through the eyepiece.
More specifically, the range finder includes a housing and a core, and the core includes the objective lens and the eyepiece. Before the range finder leaves the factory, the core needs to be calibrated to guarantee the accuracy of the range finder.
The image to be calibrated may be an image captured by aligning a camera with the eyepiece; a reference object is arranged in the eyepiece, and the reference object may be the aiming center.
In a possible embodiment, the camera is a dedicated camera that captures images using the eyepiece of the range finder as its lens, so that the image to be calibrated is consistent with the picture a user would observe through the eyepiece.
102. Performing image processing on the image to be calibrated to obtain error information between the target object and the reference object in the image to be calibrated.
In the embodiment of the present invention, the image to be calibrated further includes a target object, and the target object may be a target, so that both the aiming center of the eyepiece and the target may be included in the image to be calibrated.
The aiming center of the eyepiece marks the position at which the range finder is currently aimed, and the distance between the range finder and the center of the target is preset.
The image processing on the image to be calibrated may include performing target recognition on the image to be calibrated to identify target object information and reference object information, and then calculating the error information between the target object and the reference object according to the target object information and the reference object information.
103. Calibrating the range finder according to the error information.
In the embodiment of the present invention, the calibration includes calibrating the eyepiece and calibrating the objective lens. The eyepiece and the objective lens of the range finder can be calibrated repeatedly by following steps 101 and 102, so that the aiming center becomes aligned with the target center.
In the embodiment of the invention, an image to be calibrated is acquired, where the image to be calibrated is an image obtained through the eyepiece of the range finder and a reference object is arranged in the eyepiece; image processing is performed on the image to be calibrated to obtain error information between a target object and the reference object in the image to be calibrated; and the range finder is calibrated according to the error information. Because the error information is obtained by processing the image acquired from the eyepiece and the range finder is calibrated accordingly, manual calibration is not needed, which improves the calibration efficiency of the range finder and further improves the production efficiency of the laser range finder.
Optionally, in the step of acquiring the image to be calibrated, the range finder may be mounted on an adjusted jig, and image acquisition may be performed on the eyepiece of the range finder by an imaging device to obtain the image to be calibrated.
In the embodiment of the invention, the jig is an adjusted jig: the jig is adjusted according to the model parameters of the range finder, and range finders with different model parameters correspond to different jig adjustment ranges.
The jig can be adjusted using a range finder that has already been calibrated manually, so that when the manually calibrated range finder is fixed by the jig, its aiming center coincides with the target center. The manually calibrated range finder and the range finder to be calibrated have the same model parameters.
Further, a dedicated imaging device is arranged on the jig. The dedicated imaging device may be a dedicated camera that uses the eyepiece of the range finder to be calibrated as its lens: when the range finder is mounted on the jig, the lens part of the camera is simultaneously mounted on the eyepiece of the range finder to be calibrated, so that the eyepiece serves as the lens of the camera. This ensures that the image acquired by the camera is consistent with the picture a user would observe through the eyepiece.
When calibration is performed, image information in the eyepiece is acquired by the imaging device to obtain the image to be calibrated. The image to be calibrated includes at least a reference object, which may be the aiming object in the eyepiece.
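As a minimal illustration of this acquisition step, the following Python sketch grabs one frame from a camera mounted on the eyepiece. The patent does not name a library or camera interface; OpenCV and the camera index used here are assumptions.

```python
import cv2


def acquire_calibration_image(camera_index: int = 0):
    """Grab a single frame from the camera mounted on the eyepiece.

    The camera index is a jig-specific assumption; the patent only states
    that an imaging device on the jig captures the eyepiece view.
    """
    cap = cv2.VideoCapture(camera_index)
    if not cap.isOpened():
        raise RuntimeError("Cannot open the eyepiece camera")
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("Failed to capture the image to be calibrated")
    return frame  # BGR image containing at least the reference object
```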
Optionally, in the step of performing image processing on the image to be calibrated to obtain the error information between the target object and the reference object in the image to be calibrated, the image to be calibrated may first be preprocessed, and the preprocessed image to be calibrated may then be recognized by a preset image recognition model to obtain the error information between the target object and the reference object in the image to be calibrated.
In the embodiment of the invention, a preset image recognition model may be used to process the image to be calibrated. The preset image recognition model may also be called a pre-trained image recognition model, and the image recognition model used to recognize the image to be calibrated may be based on a convolutional neural network.
Further, the image to be calibrated may be preprocessed; the preprocessing may include denoising, normalization, scaling and cropping, so that the image to be calibrated conforms to the input size of the image recognition model.
After preprocessing, the image to be calibrated is input into the preset image recognition model. Because the image recognition model has been trained in advance and has learned the error information between the target object and the reference object during training, it can automatically output the error information between the target object and the reference object in the image to be calibrated.
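A minimal sketch of such a preprocessing step is shown below. The target input size (416×416), the normalization to [0, 1], and the use of OpenCV/NumPy are assumptions; the patent only lists denoising, normalization, scaling and cropping as possible operations.

```python
import cv2
import numpy as np


def preprocess(image: np.ndarray, input_size: int = 416) -> np.ndarray:
    """Denoise, crop to a square, resize and normalize the eyepiece image."""
    # Light denoising (a Gaussian blur is one possible choice).
    denoised = cv2.GaussianBlur(image, (3, 3), 0)

    # Center-crop to a square so the aiming center stays in the middle.
    h, w = denoised.shape[:2]
    side = min(h, w)
    y0, x0 = (h - side) // 2, (w - side) // 2
    cropped = denoised[y0:y0 + side, x0:x0 + side]

    # Scale to the model's input size and normalize to [0, 1].
    resized = cv2.resize(cropped, (input_size, input_size))
    return resized.astype(np.float32) / 255.0
```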
Recognizing the image to be calibrated with the preset image recognition model to obtain the error information between the target object and the reference object can improve the calibration efficiency of the range finder, and further improve the production efficiency of the laser range finder.
Optionally, the image recognition model includes a public network, a first recognition network and a second recognition network, where the input of the first recognition network and the input of the second recognition network are both connected to the output of the public network. In the step of recognizing the preprocessed image to be calibrated with the preset image recognition model to obtain the error information between the target object and the reference object, the preprocessed image to be calibrated may be processed by the public network to obtain a first feature map; the first feature map may be processed by the first recognition network to obtain a recognition result of the target object, and by the second recognition network to obtain a recognition result of the reference object; and the error information between the target object and the reference object in the image to be calibrated may be calculated according to the recognition result of the target object and the recognition result of the reference object.
In the embodiment of the invention, the image recognition model includes three networks. The public network may be a shallow neural network used to extract general features of the target object and the reference object: the image to be calibrated is processed by the public network to obtain a first feature map in which the low-level features of the target object and the reference object are embedded. The high-level features of the target object are then extracted from the first feature map by the first recognition network, and the high-level features of the reference object are extracted from the first feature map by the second recognition network.
The first recognition network and the second recognition network may be deep neural networks. It should be noted that the first recognition network and the second recognition network may have the same or different structures; even when their structures are the same, their network parameters differ.
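The following PyTorch sketch illustrates one way such a shared-backbone, two-head detector could be organized. The layer sizes, the backbone depth and the five-value box output (x, y, w, h, c) per head are illustrative assumptions; the patent only specifies a public (shared) network feeding two recognition networks.

```python
import torch
import torch.nn as nn


class PublicNetwork(nn.Module):
    """Shallow shared network that extracts general features (the first feature map)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )

    def forward(self, x):
        return self.features(x)  # first feature map


class RecognitionHead(nn.Module):
    """Deeper head that regresses one frame (x, y, w, h) and a category confidence c."""
    def __init__(self, in_channels: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, 5),  # (x, y, w, h, confidence logit)
        )

    def forward(self, feat):
        out = self.net(feat)
        box = out[:, :4]
        conf = torch.sigmoid(out[:, 4:])  # category confidence in [0, 1]
        return torch.cat([box, conf], dim=1)


class CalibrationRecognitionModel(nn.Module):
    """Public network shared by a target-object head and a reference-object head."""
    def __init__(self):
        super().__init__()
        self.public = PublicNetwork()
        self.target_head = RecognitionHead()
        self.reference_head = RecognitionHead()

    def forward(self, x):
        feat = self.public(x)
        return self.target_head(feat), self.reference_head(feat)
```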
Before the image recognition model is trained, a data set is constructed and the image recognition model is trained on this data set. The data set includes sample images of the range finder, acquired in the same way as the image to be calibrated, and the target object, the reference object and the error information are annotated in each sample image as training supervision.
During training of the image recognition model, a first loss function of the first recognition network and a second loss function of the second recognition network are calculated; the parameters of the first recognition network are adjusted by back-propagating the first loss, the parameters of the second recognition network are adjusted by back-propagating the second loss, and the parameters of the public network are adjusted by back-propagating the total of the first loss and the second loss.
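A minimal sketch of one training step under this scheme follows, continuing the hypothetical model above. The smooth-L1 box loss is an assumption (a full implementation would also supervise the confidences); the key point is that each head receives only its own loss while the shared public network receives both.

```python
import torch.nn.functional as F


def train_step(model, optimizer, images, target_boxes, reference_boxes):
    """One training step: per-head losses, summed gradient for the public network."""
    optimizer.zero_grad()
    target_pred, reference_pred = model(images)

    # First loss supervises the target-object head, second loss the reference-object head.
    loss_target = F.smooth_l1_loss(target_pred[:, :4], target_boxes)
    loss_reference = F.smooth_l1_loss(reference_pred[:, :4], reference_boxes)

    # Back-propagating the sum updates both heads and the shared public network.
    total_loss = loss_target + loss_reference
    total_loss.backward()
    optimizer.step()
    return loss_target.item(), loss_reference.item()
```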
In a possible embodiment, the image recognition model further includes an error processing network whose inputs are the outputs of the first recognition network and the second recognition network; that is, the recognition result of the target object and the recognition result of the reference object are input into the error processing network, which processes them to obtain the error information between the target object and the reference object. The error processing network may be a fully convolutional neural network, and its parameters are also obtained through training.
Optionally, the recognition result of the target object includes target object frame information and the recognition result of the reference object includes reference object frame information. In the step of calculating the error information between the target object and the reference object in the image to be calibrated according to the recognition result of the target object and the recognition result of the reference object, the distance from the center point of the target object frame to the center point of the reference object frame may be calculated according to the target object frame information and the reference object frame information, and the error information between the target object and the reference object in the image to be calibrated may be obtained according to this distance.
In the embodiment of the present invention, the recognition result of the target object includes target object frame information, and the target object frame information may be five-tuple information such as (x1, y1, w1, h1, c1), where x1 and y1 are coordinates of a center point of the target object frame, w1 is a width of the target object frame, h1 is a height of the target object frame, and c1 is a category confidence of the target object frame.
Similarly, the recognition result of the reference object includes reference object frame information, and the reference object frame information may be five-tuple information such as (x2, y2, w2, h2, c2), where x2 and y2 are coordinates of a center point of the reference object frame, w2 is a width of the reference object frame, h2 is a height of the reference object frame, and c2 is a category confidence of the reference object frame.
Here, c1 and c2 are both values in the interval [0, 1].
The error information between the target object and the reference object may be the distance from the center point of the target object frame to the center point of the reference object frame. Specifically, the Euclidean distance between the center point (x1, y1) of the target object frame and the center point (x2, y2) of the reference object frame may be calculated. The smaller the distance between the two center points, the smaller the deviation between the aiming center and the target center, and the smaller the error.
In one possible embodiment, the error information may be calculated by the following equation after obtaining the distance from the center point of the target object frame to the center point of the reference object frame:
[Equation not reproduced in the source text: the error s is computed from the center-point distance D and the category confidences c1 and c2.]
where D is the distance from the center point of the target object frame to the center point of the reference object frame, and s is the error. It can be seen that, for a fixed distance D, the higher the category confidence c1 of the target object frame and the category confidence c2 of the reference object frame, the smaller the error; conversely, the lower the confidences, the larger the error.
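Because the equation itself is only available as an image in the source, the sketch below assumes a simple form, s = D / (c1 * c2), which matches the stated behaviour (the error grows with the distance and shrinks as the confidences approach 1).

```python
import math


def center_distance_error(target_box, reference_box):
    """Error from the Euclidean distance between frame centers, weighted by confidences.

    Each frame is a five-tuple (x, y, w, h, c). The exact formula in the patent is
    not reproduced here; s = D / (c1 * c2) is an assumed form consistent with the
    described monotonic behaviour.
    """
    x1, y1, _, _, c1 = target_box
    x2, y2, _, _, c2 = reference_box
    d = math.hypot(x1 - x2, y1 - y2)  # Euclidean distance between center points
    return d / (c1 * c2)
```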
Optionally, the recognition result of the target object includes target object frame information and the recognition result of the reference object includes reference object frame information. In the step of calculating the error information between the target object and the reference object in the image to be calibrated according to the recognition result of the target object and the recognition result of the reference object, the intersection ratio (intersection-over-union) of the target object frame and the reference object frame may be calculated according to the target object frame information and the reference object frame information, and the error information between the target object and the reference object in the image to be calibrated may be calculated according to this intersection ratio.
In the embodiment of the present invention, the recognition result of the target object includes target object frame information, and the target object frame information may be five-tuple information such as (x1, y1, w1, h1, c1), where x1 and y1 are coordinates of a center point of the target object frame, w1 is a width of the target object frame, h1 is a height of the target object frame, and c1 is a category confidence of the target object frame.
Similarly, the recognition result of the reference object includes reference object frame information, and the reference object frame information may be five-tuple information such as (x2, y2, w2, h2, c2), where x2 and y2 are coordinates of a center point of the reference object frame, w2 is a width of the reference object frame, h2 is a height of the reference object frame, and c2 is a category confidence of the reference object frame.
The intersection area of the target object frame and the reference object frame may be divided by their union area to obtain the intersection ratio of the target object frame and the reference object frame. Specifically, the following formula can be used:
r = area(A ∩ B) / area(A ∪ B)
where r is the intersection ratio of the target object frame and the reference object frame, A is the target object frame, and B is the reference object frame. The minimum intersection ratio is 0, which means the target object frame and the reference object frame have no overlapping part; the maximum intersection ratio is 1, which means the two frames have the same size and completely overlap.
In one possible embodiment, after obtaining the intersection ratio of the target object frame and the reference object frame, the error information may be calculated by the following equation:
[Equation not reproduced in the source text: the error s is computed from the intersection ratio r and the category confidences c1 and c2.]
it can be seen that the higher the class confidence c1 of the target object frame and the class confidence c2 of the reference object frame, the smaller the error. Otherwise, the larger the error.
Optionally, in the step of calibrating the range finder according to the error information, an adjustment instruction may be generated according to the error information, and the jig may be adjusted by the adjustment instruction so that the jig calibrates the range finder.
In the embodiment of the invention, an adjustment instruction may be generated according to the error information; the larger the error, the larger the adjustment range of the adjustment instruction. The jig is controlled according to the adjustment instruction to calibrate the objective lens or the eyepiece of the range finder.
After one calibration, the next calibration is performed by the above calibration method, until the error information is smaller than a preset value for two or more consecutive rounds; the calibration of the range finder is then finished and a calibrated range finder is obtained.
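Putting the steps together, the following sketch shows the overall calibration loop. The helpers acquire_calibration_image, preprocess and center_distance_error refer to the hypothetical sketches above, recognize stands for running the recognition model on one image, and the error threshold, the number of required consecutive passes and the jig-control callback are assumptions (the patent does not specify a jig interface).

```python
def calibrate_range_finder(model, recognize, adjust_jig, error_threshold: float = 1.0,
                           required_passes: int = 2, max_iterations: int = 50) -> bool:
    """Iteratively adjust the jig until the error stays below the threshold
    for two (or more) consecutive measurements."""
    consecutive_passes = 0
    for _ in range(max_iterations):
        image = preprocess(acquire_calibration_image())
        target_box, reference_box = recognize(model, image)  # hypothetical inference helper
        error = center_distance_error(target_box, reference_box)

        if error < error_threshold:
            consecutive_passes += 1
            if consecutive_passes >= required_passes:
                return True  # calibrated
        else:
            consecutive_passes = 0
            adjust_jig(error)  # adjustment magnitude grows with the error

    return False  # calibration did not converge within the iteration budget
```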
Referring to fig. 2, fig. 2 is a calibration apparatus provided in the present application, the apparatus including:
an obtaining module 201, configured to obtain an image to be calibrated, where the image to be calibrated is an image obtained by an eyepiece in the range finder, and a reference object is arranged in the eyepiece;
the processing module 202 is configured to perform image processing on the image to be calibrated to obtain error information between a target object and the reference object in the image to be calibrated;
and the calibration module 203 is configured to calibrate the range finder according to the error information.
Further, the obtaining module 201 includes:
the mounting submodule is used for mounting the distance measuring instrument on the adjusted jig;
and the acquisition submodule is used for acquiring images of an eyepiece of the range finder through image equipment to obtain the image to be calibrated.
Further, the processing module 202 includes:
the first processing submodule is used for preprocessing the image to be calibrated;
and the second processing submodule is used for identifying the preprocessed image to be calibrated through a preset image identification model to obtain error information of the target object and the reference object in the image to be calibrated.
Further, the image recognition model includes a public network, a first recognition network and a second recognition network, an input of the first recognition network and an input of the second recognition network are both connected to an output of the public network, and the second processing sub-module includes:
the first processing unit is used for processing the preprocessed image to be calibrated through a public network to obtain a first characteristic diagram;
the second processing unit is used for processing the first characteristic diagram through a first identification network to obtain an identification result of a target object, and processing the first characteristic diagram through a second identification network to obtain an identification result of a reference object;
and the calculating unit is used for calculating and obtaining the error information of the target object and the reference object in the image to be calibrated according to the identification result of the target object and the identification result of the reference object.
Further, the recognition result of the target object includes target object frame information, the recognition result of the reference object includes reference object frame information, and the calculation unit includes:
the first calculating subunit is configured to calculate, according to the target object frame information and the reference object frame information, a distance from a center point of the target object frame to a center point of the reference object frame;
and the processing subunit is configured to obtain error information of the target object and the reference object in the image to be calibrated according to a distance between the center point of the target object frame and the center point of the reference object frame.
Further, the recognition result of the target object includes target object frame information, the recognition result of the reference object includes reference object frame information, and the calculation unit further includes:
the second calculation subunit is configured to calculate, according to the target object frame information and the reference object frame information, the intersection ratio of the target object frame and the reference object frame;
and the third calculation subunit is used for calculating and obtaining error information of the target object and the reference object in the image to be calibrated according to the intersection ratio of the target object frame and the reference object frame.
Further, the calibration module 203 includes:
the generating submodule is used for generating an adjusting instruction according to the error information;
and the calibration submodule is used for adjusting the jig through the adjusting instruction so as to enable the jig to calibrate the distance measuring instrument.
Embodiments of the present invention also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the calibration methods described in the above method embodiments.
Embodiments of the present invention also provide an electronic device including a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform part or all of the steps of any one of the calibration methods as recited in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are exemplary embodiments and that the acts and modules illustrated are not necessarily required to practice the invention.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
The integrated unit, if implemented in the form of a software program module and sold or used as a stand-alone product, may be stored in a computer readable memory. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned memory comprises: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash Memory disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
The above embodiments of the present invention are described in detail, and the principle and the implementation of the present invention are explained by applying specific embodiments, and the above description of the embodiments is only used to help understanding the method of the present invention and the core idea thereof; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (10)

1. A calibration method for range finder calibration, the method comprising:
acquiring an image to be calibrated, wherein the image to be calibrated is an image obtained by an eyepiece in the range finder, and a reference object is arranged in the eyepiece;
performing image processing on the image to be calibrated to obtain error information of a target object and the reference object in the image to be calibrated;
and calibrating the range finder according to the error information.
2. The method of claim 1, wherein the step of acquiring an image to be calibrated comprises:
mounting the distance measuring instrument on the adjusted jig;
and carrying out image acquisition on an eyepiece of the range finder through image equipment to obtain the image to be calibrated.
3. The method according to claim 2, wherein the step of performing image processing on the image to be calibrated to obtain the error information between the target object and the reference object in the image to be calibrated comprises:
preprocessing the image to be calibrated;
and identifying the preprocessed image to be calibrated through a preset image identification model to obtain error information of the target object and the reference object in the image to be calibrated.
4. The method according to claim 3, wherein the image recognition model includes a public network, a first recognition network and a second recognition network, an input of the first recognition network and an input of the second recognition network are both connected to an output of the public network, and the step of performing recognition processing on the pre-processed image to be calibrated through a preset image recognition model to obtain error information of the target object and the reference object in the image to be calibrated includes:
processing the preprocessed image to be calibrated through a public network to obtain a first characteristic diagram;
processing the first characteristic diagram through a first identification network to obtain an identification result of a target object, and processing the first characteristic diagram through a second identification network to obtain an identification result of a reference object;
and calculating to obtain error information of the target object and the reference object in the image to be calibrated according to the identification result of the target object and the identification result of the reference object.
5. The method according to claim 4, wherein the recognition result of the target object includes target object frame information, the recognition result of the reference object includes reference object frame information, and the step of calculating error information of the target object and the reference object in the image to be calibrated according to the recognition result of the target object and the recognition result of the reference object includes:
calculating to obtain the distance from the center point of the target object frame to the center point of the reference object frame according to the information of the target object frame and the information of the reference object frame;
and obtaining error information of the target object and the reference object in the image to be calibrated according to the distance from the center point of the target object frame to the center point of the reference object frame.
6. The method according to claim 4, wherein the recognition result of the target object includes target object frame information, the recognition result of the reference object includes reference object frame information, and the step of calculating error information between the target object and the reference object in the image to be calibrated according to the recognition result of the target object and the recognition result of the reference object includes:
calculating to obtain the intersection ratio of the target object frame and the reference object frame according to the target object frame information and the reference object frame information;
and calculating to obtain error information of the target object and the reference object in the image to be calibrated according to the intersection ratio of the target object frame and the reference object frame.
7. The method of claim 4, wherein calibrating the rangefinder based on the error information comprises:
generating an adjusting instruction according to the error information;
and adjusting the jig through the adjusting instruction so that the jig calibrates the distance measuring instrument.
8. A calibration device, the device comprising:
an acquisition module, a processing module and a calibration module, wherein the acquisition module is used for acquiring an image to be calibrated, the image to be calibrated is an image obtained through an eyepiece of the range finder, and a reference object is arranged in the eyepiece;
the processing module is used for carrying out image processing on the image to be calibrated to obtain error information of a target object and the reference object in the image to be calibrated;
and the calibration module is used for calibrating the range finder according to the error information.
9. An electronic device, comprising: memory, processor and computer program stored on the memory and executable on the processor, the processor implementing the steps in the calibration method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, characterized in that a computer program is stored thereon, which computer program, when being executed by a processor, carries out the steps in the calibration method according to any one of claims 1 to 7.
CN202210431653.XA 2022-04-22 2022-04-22 Calibration method, calibration device, electronic equipment and storage medium Pending CN114820726A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210431653.XA CN114820726A (en) 2022-04-22 2022-04-22 Calibration method, calibration device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210431653.XA CN114820726A (en) 2022-04-22 2022-04-22 Calibration method, calibration device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114820726A true CN114820726A (en) 2022-07-29

Family

ID=82507843

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210431653.XA Pending CN114820726A (en) 2022-04-22 2022-04-22 Calibration method, calibration device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114820726A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination